Mastering Anthropic Claude 3 with Dify and AWS Bedrock

This article covers how to use Anthropic Claude 3 more reliably, along with some interesting AI-native tools built on these currently strong-performing models.

Introduction

After more than half a month with Anthropic Claude Pro, which went live at the beginning of the month, I have found that the Claude 3 series models (Opus, Sonnet, Haiku) do perform quite well, with noticeable improvements in instruction adherence and response speed.

(Figure: Anthropic Claude Pro premium experience)

However, as with ChatGPT Plus last year, I lent my personal account to friends so they could try it. Logging in from different clients (with the accompanying IP switching) easily triggers risk control: at best you wait an indeterminate amount of time for login confirmation, and in worse cases the account is banned and a support ticket is needed to lift the restriction.

Recently, Amazon AWS's Bedrock foundation model (FM) platform added the Anthropic Claude 3 models[1]. Over the weekend, I spent some time adapting Dify to support the model mentioned in the previous article[2], and now, with just a few commands, we can use Docker to quickly spin up a set of Claude 3-based AI tools, such as a Claude 3 chat, Claude 3 AI apps, and convenience helpers like a Claude 3 bot.

(Figure: Quickly build AI applications like this)

The related code changes for Dify have been submitted to the official Dify repository[3], but it looks like it will take some time for these adjustments to land upstream. If you are eager to try them now, you can get the configurations from my repository, or use it to build the Docker images yourself: soulteary/dify[4].

Of course, if you prefer something more ready-made, you can download the Docker image from the cloud disk and set up your own "Claude 3 AI application" in minutes.

Let’s start with the preparations.

Preparation Work

There are three items to prepare:

1. A device environment that can run Docker applications (a local machine or a cloud host is fine).
2. Download my pre-built Dify application image, or build it manually from the source code repository I provide.
3. Apply for permission to use the Anthropic Claude 3 model on Amazon AWS Bedrock.

Preparation Work: Docker Runtime Environment

With Docker, we can quickly obtain a clean, reproducible, and consistent environment at very little extra resource cost.

Regardless of whether your device or cloud host has a graphics card, you can set up the basic environment for your operating system by referring to these two articles: “Deep Learning Environment Based on Docker: Windows Edition[5]” and “Deep Learning Environment Based on Docker: Getting Started[6]”. Of course, once Docker is installed you can do many other interesting things as well, as in the dozens of earlier articles on Docker practices[7], which I will not repeat here.
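If you are not sure whether your environment is ready, a quick sanity check with standard Docker commands (nothing specific to this article) looks like this:

# confirm the Docker engine and the Compose plugin are installed
docker --version
docker compose version

# optionally, confirm that containers can actually run
docker run --rm hello-world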

If you already have an environment that can run Docker, we can prepare the next item.

Preparation Work: Download or Build the Modified Dify Application Image

Later, I will upload the built image to the cloud disk, and you can find the resource download address in the comments section of the column. Once the image is downloaded, you can use the following command to load the image:

docker load -i dify-api-claude3.tar
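
After loading, you can confirm the image is available locally; the image name below is the tag used in this article, so adjust it if you re-tagged the image:

# list local images matching the repository name
docker images soulteary/dify-api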

If you prefer to build from scratch, you can execute the following commands:

git clone https://github.com/soulteary/dify.git
cd dify/api
git checkout wow/so-lazy
docker build -t soulteary/dify-api:claude3 .

Wait for the commands to finish executing, and the image preparation work will be complete.

Preparation Work: Apply for AWS Bedrock’s Claude3 Usage Permission

We won’t spend time detailing the simple operations of account registration and other “Next” clicks.

When we log in to “AWS Bedrock” and open the model access management page[8], we may find that, by default, we do not yet have permission to use the “Anthropic Claude 3” models. In that case, simply submit a request to enable access.

(Figure: By default, you may not have Claude 3 model permission)

Once permission is granted, you will see “All Set” on the model permission page.

(Figure: Obtained Claude 3 model permission)

Visit the “User Credentials Management[9]” page, create a credential, and download the credentials file. With that, our preparation work is complete.
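
For reference, the downloaded credentials file contains an access key ID and a secret access key. If you also want to use them from the AWS CLI (for example, for the optional sanity checks later in this article), you can put them into the standard AWS configuration files; the values below are placeholders:

# ~/.aws/credentials (placeholder values)
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# ~/.aws/config
[default]
region = us-east-1

Dify itself only needs the access key ID, the secret access key, and the region, which we will enter into its settings later.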

Practice Begins

This also includes three steps:

1. Start Dify with Docker and complete the basic application configuration, so that we can manage and configure the AI applications we build later from the browser.
2. Configure Claude 3 (or other model resources) in Dify, so that our applications can call these models for content generation.
3. Use Dify to tinker with some common applications, such as a ChatBot and a simple knowledge base.

Quickly Complete Dify Configuration and Start

To quickly start Dify using Docker, we only need to prepare two files: the Docker orchestration file docker-compose.yml and the Nginx configuration file nginx.conf.

version: '3.1'
services:
  # API service
  api:
    image: soulteary/dify-api:claude3
    restart: always
    environment:
      # Startup mode, 'api' starts the API server.
      MODE: api
      # The log level for the application. Supported values are `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`
      LOG_LEVEL: INFO
      # A secret key that is used for securely signing the session cookie and encrypting sensitive information on the database. You can generate a strong key using `openssl rand -base64 42`.
      SECRET_KEY: sk-9f73s3ljTXVcMT3Blb3ljTqtsKiGHXVcMT3BlbkFJLK7U
      # The base URL of console application web frontend, refers to the Console base URL of WEB service if console domain is
      # different from api or web app domain.
      # example: http://cloud.dify.ai
      CONSOLE_WEB_URL: ''
      # Password for admin user initialization.
      # If left unset, admin user will not be prompted for a password when creating the initial admin account.
      INIT_PASSWORD: ''
      # The base URL of console application api server, refers to the Console base URL of WEB service if console domain is
      # different from api or web app domain.
      # example: http://cloud.dify.ai
      CONSOLE_API_URL: ''
      # The URL prefix for Service API endpoints, refers to the base URL of the current API service if api domain is
      # different from console domain.
      # example: http://api.dify.ai
      SERVICE_API_URL: ''
      # The URL prefix for Web APP frontend, refers to the Web App base URL of WEB service if web app domain is different from
      # console or api domain.
      # example: http://udify.app
      APP_WEB_URL: ''
      # File preview or download URL prefix.
      # Used to display file preview or download URLs to the front-end, or as multi-modal inputs;
      # the URL is signed and has an expiration time.
      FILES_URL: ''
      # When enabled, migrations will be executed prior to application startup and the application will start after the migrations have completed.
      MIGRATION_ENABLED: 'true'
      # The configurations of postgres database connection.
      # It is consistent with the configuration in the 'db' service below.
      DB_USERNAME: postgres
      DB_PASSWORD: difyai123456
      DB_HOST: db
      DB_PORT: 5432
      DB_DATABASE: dify
      # The configurations of redis connection.
      # It is consistent with the configuration in the 'redis' service below.
      REDIS_HOST: redis
      REDIS_PORT: 6379
      REDIS_USERNAME: ''
      REDIS_PASSWORD: difyai123456
      REDIS_USE_SSL: 'false'
      # use redis db 0 for redis cache
      REDIS_DB: 0
      # The configurations of celery broker.
      # Use redis as the broker, and redis db 1 for celery broker.
      CELERY_BROKER_URL: redis://:difyai123456@redis:6379/1
      # Specifies the allowed origins for cross-origin requests to the Web API, e.g. https://dify.app or * for all origins.
      WEB_API_CORS_ALLOW_ORIGINS: '*'
      # Specifies the allowed origins for cross-origin requests to the console API, e.g. https://cloud.dify.ai or * for all origins.
      CONSOLE_CORS_ALLOW_ORIGINS: '*'
      # The type of storage to use for storing user files. Supported values are `local` and `s3`, Default: `local`
      STORAGE_TYPE: local
      # The path to the local storage directory, either relative to the root path of the API service codes or an absolute path. Default: `storage` or `/home/john/storage`.
      # only available when STORAGE_TYPE is `local`.
      STORAGE_LOCAL_PATH: storage
      # The S3 storage configurations, only available when STORAGE_TYPE is `s3`.
      S3_ENDPOINT: 'https://xxx.r2.cloudflarestorage.com'
      S3_BUCKET_NAME: 'difyai'
      S3_ACCESS_KEY: 'ak-difyai'
      S3_SECRET_KEY: 'sk-difyai'
      S3_REGION: 'us-east-1'
      # The type of vector store to use. Supported values are `weaviate`, `qdrant`, `milvus`.
      VECTOR_STORE: weaviate
      # The Weaviate endpoint URL. Only available when VECTOR_STORE is `weaviate`.
      WEAVIATE_ENDPOINT: http://weaviate:8080
      # The Weaviate API key.
      WEAVIATE_API_KEY: WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih
      # Mail configuration, support: resend, smtp
      MAIL_TYPE: ''
      # default send from email address, if not specified
      MAIL_DEFAULT_SEND_FROM: 'YOUR EMAIL FROM (eg: no-reply <[email protected]>)'
      SMTP_SERVER: ''
      SMTP_PORT: 587
      SMTP_USERNAME: ''
      SMTP_PASSWORD: ''
      SMTP_USE_TLS: 'true'
      # the api-key for resend (https://resend.com)
      RESEND_API_KEY: ''
      RESEND_API_URL: https://api.resend.com
      # The DSN for Sentry error reporting. If not set, Sentry error reporting will be disabled.
      SENTRY_DSN: ''
      # The sample rate for Sentry events. Default: `1.0`
      SENTRY_TRACES_SAMPLE_RATE: 1.0
      # The sample rate for Sentry profiles. Default: `1.0`
      SENTRY_PROFILES_SAMPLE_RATE: 1.0
    depends_on:
      - db
      - redis
    volumes:
      # Mount the storage directory to the container, for storing user files.
      - ./volumes/app/storage:/app/api/storage

  # worker service
  # The Celery worker for processing the queue.
  worker:
    image: soulteary/dify-api:claude3
    restart: always
    environment:
      # Startup mode, 'worker' starts the Celery worker for processing the queue.
      MODE: worker

      # --- All the configurations below are the same as those in the 'api' service. ---

      # The log level for the application. Supported values are `DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`
      LOG_LEVEL: INFO
      # A secret key that is used for securely signing the session cookie and encrypting sensitive information on the database. You can generate a strong key using `openssl rand -base64 42`.
      # same as the API service
      SECRET_KEY: sk-9f73s3ljTXVcMT3Blb3ljTqtsKiGHXVcMT3BlbkFJLK7U
      # The configurations of postgres database connection.
      # It is consistent with the configuration in the 'db' service below.
      DB_USERNAME: postgres
      DB_PASSWORD: difyai123456
      DB_HOST: db
      DB_PORT: 5432
      DB_DATABASE: dify
      # The configurations of redis cache connection.
      REDIS_HOST: redis
      REDIS_PORT: 6379
      REDIS_USERNAME: ''
      REDIS_PASSWORD: difyai123456
      REDIS_DB: 0
      REDIS_USE_SSL: 'false'
      # The configurations of celery broker.
      CELERY_BROKER_URL: redis://:difyai123456@redis:6379/1
      # The type of storage to use for storing user files. Supported values are `local` and `s3`, Default: `local`
      STORAGE_TYPE: local
      STORAGE_LOCAL_PATH: storage
      # The type of vector store to use. Supported values are `weaviate`, `qdrant`, `milvus`.
      VECTOR_STORE: weaviate
      # The Weaviate endpoint URL. Only available when VECTOR_STORE is `weaviate`.
      WEAVIATE_ENDPOINT: http://weaviate:8080
      # The Weaviate API key.
      WEAVIATE_API_KEY: WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih
      # Mail configuration, support: resend
      MAIL_TYPE: ''
      # default send from email address, if not specified
      MAIL_DEFAULT_SEND_FROM: 'YOUR EMAIL FROM (eg: no-reply <[email protected]>)'
      # the api-key for resend (https://resend.com)
      RESEND_API_KEY: ''
      RESEND_API_URL: https://api.resend.com
    depends_on:
      - db
      - redis
    volumes:
      # Mount the storage directory to the container, for storing user files.
      - ./volumes/app/storage:/app/api/storage

  # Frontend web application.
  web:
    image: langgenius/dify-web:0.5.9
    restart: always
    environment:
      EDITION: SELF_HOSTED
      # The base URL of console application api server, refers to the Console base URL of WEB service if console domain is
      # different from api or web app domain.
      # example: http://cloud.dify.ai
      CONSOLE_API_URL: ''
      # The URL for Web APP api server, refers to the Web App base URL of WEB service if web app domain is different from
      # console or api domain.
      # example: http://udify.app
      APP_API_URL: ''
      # The DSN for Sentry error reporting. If not set, Sentry error reporting will be disabled.
      SENTRY_DSN: ''

  # The postgres database.
  db:
    image: postgres:15-alpine
    restart: always
    environment:
      PGUSER: postgres
      # The password for the default postgres user.
      POSTGRES_PASSWORD: difyai123456
      # The name of the default postgres database.
      POSTGRES_DB: dify
      # postgres data directory
      PGDATA: /var/lib/postgresql/data/pgdata
    volumes:
      - ./volumes/db/data:/var/lib/postgresql/data
    healthcheck:
      test: [ "CMD", "pg_isready" ]
      interval: 1s
      timeout: 3s
      retries: 30

  # The redis cache.
  redis:
    image: redis:6-alpine
    restart: always
    volumes:
      # Mount the redis data directory to the container.
      - ./volumes/redis/data:/data
    # Set the redis password when starting the redis server.
    command: redis-server --requirepass difyai123456
    healthcheck:
      test: [ "CMD", "redis-cli", "ping" ]

  # The Weaviate vector store.
  weaviate:
    image: semitechnologies/weaviate:1.19.0
    restart: always
    volumes:
      # Mount the Weaviate data directory to the container.
      - ./volumes/weaviate:/var/lib/weaviate
    environment:
      # The Weaviate configurations
      # You can refer to the [Weaviate](https://weaviate.io/developers/weaviate/config-refs/env-vars) documentation for more information.
      QUERY_DEFAULTS_LIMIT: 25
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: 'false'
      PERSISTENCE_DATA_PATH: '/var/lib/weaviate'
      DEFAULT_VECTORIZER_MODULE: 'none'
      CLUSTER_HOSTNAME: 'node1'
      AUTHENTICATION_APIKEY_ENABLED: 'true'
      AUTHENTICATION_APIKEY_ALLOWED_KEYS: 'WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih'
      AUTHENTICATION_APIKEY_USERS: '[email protected]'
      AUTHORIZATION_ADMINLIST_ENABLED: 'true'
      AUTHORIZATION_ADMINLIST_USERS: '[email protected]'

  # The nginx reverse proxy.
  # used for reverse proxying the API service and Web service.
  nginx:
    image: nginx:latest
    restart: always
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf
    depends_on:
      - api
      - web
    ports:
      - "80:80"

The above is the content of the docker-compose.yml file, saved as soulteary/dify/docker/docker-compose.yml[10]. In the default configuration, Nginx connects the various services and provides access on port 80; you can adjust the configuration to your own needs. If you are a Traefik user, you can refer to this configuration file instead: soulteary/dify/docker/docker-compose.traefik.yml[11]. With the Docker orchestration file in place, about 50% of this stage's work is done.
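
For example, if port 80 is already taken on your machine, one small adjustment (assuming you want to expose Dify on port 8080 instead) is to change the ports mapping of the nginx service in docker-compose.yml:

  nginx:
    image: nginx:latest
    ports:
      # map host port 8080 to port 80 inside the container
      - "8080:80"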

Let’s continue to complete the remaining 50% of the work. In the same directory as docker-compose.yml, create a file named nginx.conf (soulteary/dify/docker/nginx.conf[12]):

user  nginx;
worker_processes  auto;

error_log  /var/log/nginx/error.log notice;
pid        /var/run/nginx.pid;


events {
    worker_connections  1024;
}


http {
    include       /etc/nginx/mime.types;
    default_type  application/octet-stream;

    log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
                      '$status $body_bytes_sent "$http_referer" '
                      '"$http_user_agent" "$http_x_forwarded_for"';

    access_log  /var/log/nginx/access.log  main;

    sendfile        on;
    keepalive_timeout  65;
    client_max_body_size 15M;

    server {
        listen 80;
        server_name _;

        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        proxy_buffering off;
        proxy_read_timeout 3600s;
        proxy_send_timeout 3600s;

        location @backend {
            proxy_pass http://api:5001;
        }

        location @frontend {
            proxy_pass http://web:3000;
        }

        location /console/api {
            try_files $uri $uri/ @backend;
        }

        location /api {
            try_files $uri $uri/ @backend;
        }

        location /v1 {
            try_files $uri $uri/ @backend;
        }

        location /files {
            try_files $uri $uri/ @backend;
        }

        location / {
            try_files $uri $uri/ @frontend;
        }
    }
}

After creating and saving the above two configuration files, execute docker compose up -d, wait a moment for the services to start, and then we can access Dify in the browser.

# docker compose up -d
[+] Running 7/7
 ✔ Container docker-weaviate-1  Started    0.6s
 ✔ Container docker-web-1       Started    0.6s
 ✔ Container docker-db-1        Started    0.8s
 ✔ Container docker-redis-1     Started    0.7s
 ✔ Container docker-worker-1    Started    1.0s
 ✔ Container docker-api-1       Started    1.0s
 ✔ Container docker-nginx-1     Started
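
If any container fails to come up, you can check the service status and follow the API logs with standard Docker Compose commands, run from the directory containing docker-compose.yml:

# show the status of every service defined in docker-compose.yml
docker compose ps

# follow the logs of the API service
docker compose logs -f api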

Once the containers are running, open the IP address and port of the machine running Docker in the browser, for example http://IP:80 (or http://127.0.0.1 if running locally), and you will see the Dify admin account setup interface.

(Figure: First use requires setting up an admin account)

After setting up the admin account and logging in, we can access the Dify control panel, where we can freely create AI applications based on prompts.

(Figure: Default interface of the Dify control panel)

Of course, before building AI applications, we need to perform one more step: configuring the available model resources.

Configuring Claude 3 and Other Model Services

Click the user avatar in the upper right corner of the interface, choose “Settings” from the dropdown menu, and in the pop-up window select the “Model Provider” menu on the left to see all the model providers that Dify supports.

(Figure: Open the “Model Provider” settings)
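
Before filling the AWS access key, secret key, and region into Dify's Bedrock provider settings, you may want to confirm that your credentials can actually reach Claude 3. Here is a minimal smoke test with the AWS CLI, assuming the us-east-1 region and the Claude 3 Sonnet model ID; adjust them to whatever you enabled:

# list the Anthropic models visible to your account on Bedrock
aws bedrock list-foundation-models --by-provider anthropic --region us-east-1

# send a one-off message to Claude 3 Sonnet and write the reply to output.json
aws bedrock-runtime invoke-model \
  --region us-east-1 \
  --model-id anthropic.claude-3-sonnet-20240229-v1:0 \
  --cli-binary-format raw-in-base64-out \
  --body '{"anthropic_version":"bedrock-2023-05-31","max_tokens":256,"messages":[{"role":"user","content":[{"type":"text","text":"Hello, Claude"}]}]}' \
  output.json

cat output.json

If both commands succeed, the same access key, secret key, and region can be entered into Dify's Bedrock configuration.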
