The Model Context Protocol (MCP) is an open protocol that enables seamless integration of LLM applications with external data sources and tools. Whether building an AI-based IDE, enhancing chat interfaces, or creating custom AI workflows, MCP provides a standardized way to connect LLMs with the required context.
Protocol specification: https://spec.modelcontextprotocol.io
1. Clone the repository
git clone https://github.com/ChatGPTNextWeb/NextChat.git && cd NextChat
2. Build the image
docker build -t nextchat .
3. Start the container
docker run -d -p 8080:3000 -e ENABLE_MCP="true" nextchat
4. Access the website
Visit http://localhost:8080 to open the NextChat interface.

5. Configure MCP
Click the “MCP” button in the left sidebar to open the MCP Market, then add the services you need.
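The Docker steps above can be collected into a single sketch. The container name `nextchat-mcp` and the log check are illustrative additions, not part of the original instructions:

```shell
# Build the image from the repository root
docker build -t nextchat .

# Run detached, mapping host port 8080 to the app's port 3000
# and enabling the MCP feature via an environment variable
docker run -d \
  --name nextchat-mcp \
  -p 8080:3000 \
  -e ENABLE_MCP="true" \
  nextchat

# Follow the logs to confirm the server started
docker logs -f nextchat-mcp
```

Naming the container makes later management (`docker stop nextchat-mcp`, `docker logs`) easier than looking up the generated ID.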

Local development

git clone https://github.com/ChatGPTNextWeb/NextChat.git && cd NextChat
cp .env.template .env
# Linux/Unix
sed -i 's/ENABLE_MCP=/ENABLE_MCP=true/' .env
# macOS
sed -i '' 's/ENABLE_MCP=/ENABLE_MCP=true/' .env
yarn install && yarn dev
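The two `sed` invocations above differ only in the `-i` flag, whose syntax varies between GNU sed (Linux) and BSD sed (macOS). A portable sketch that sidesteps in-place editing entirely; the `set_env_flag` helper name is my own, not from the original:

```shell
# Set a KEY= flag in a dotenv-style file without relying on
# platform-specific `sed -i` behavior: write to a temp file, then move it.
set_env_flag() {
  file="$1"; key="$2"; value="$3"
  tmp="${file}.tmp"
  sed "s/^${key}=.*/${key}=${value}/" "$file" > "$tmp" && mv "$tmp" "$file"
}

# Example: flip ENABLE_MCP= to ENABLE_MCP=true in .env
printf 'ENABLE_MCP=\nOTHER=1\n' > .env
set_env_flag .env ENABLE_MCP true
grep '^ENABLE_MCP=' .env   # -> ENABLE_MCP=true
```

Writing to a temp file and renaming works identically on both platforms, so the same script can be shared across development machines.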

Notes
1. MCP services run on the server. To keep the system secure, do not enable the MCP feature in publicly accessible environments.
2. The prompts are still being improved, so the AI model may not always understand user requests accurately or construct MCP requests correctly.
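For note 1, one way to limit exposure at the Docker level is to bind the published port to the loopback interface only. This is a hedged variant of the `docker run` command from the deployment steps, using Docker's standard `-p host-ip:host-port:container-port` form:

```shell
# Publish the port on 127.0.0.1 only, so the MCP-enabled instance
# is reachable from the local machine but not from the network
docker run -d -p 127.0.0.1:8080:3000 -e ENABLE_MCP="true" nextchat
```

This does not replace proper network hardening, but it prevents the plain `-p 8080:3000` default of listening on all interfaces.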