Experience Mistral AI in Roo Cline with VS Code

Mistral AI

Previously, we briefly introduced Mistral AI; for a detailed introduction, see: Introduction to Mistral AI. Mistral AI offers free-tier access to both its chat API and the Codestral API, so today we will integrate Mistral AI with Roo Cline.

Current Version

Roo Cline 2.2.42

Limitations

Currently, Roo Cline does not support direct integration with a Mistral API Key: using the 【OpenAI Compatible】 method results in a 404 error. One workaround is to route requests through an intermediary proxy.


LiteLLM

Introduction

LiteLLM is a powerful tool for calling LLM (Large Language Model) APIs, designed to simplify interaction between developers and the APIs of many LLM providers. In simple terms, it lets you call all LLM APIs using the OpenAI request format. LiteLLM ships as a Python package that is easy to install, and its proxy can be started with a single command.
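To make "the OpenAI format" concrete, here is a minimal sketch of the request body such a proxy accepts; the model name and prompt are only examples:

```shell
# Write an OpenAI-format chat completion request body to a file (sketch).
cat > request.json <<'EOF'
{
  "model": "mistral/mistral-large-latest",
  "messages": [
    {"role": "user", "content": "Write a hello world program in Python."}
  ]
}
EOF

# Once the LiteLLM proxy is running (see the later sections), a request
# could be sent with, e.g.:
# curl http://0.0.0.0:4000/v1/chat/completions \
#   -H "Content-Type: application/json" -d @request.json
```

Whichever provider ultimately serves the model, the client-side request keeps this same shape; LiteLLM handles the translation.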

Installation

LiteLLM is a Python library and can be installed directly with pip.

pip install litellm
pip install 'litellm[proxy]'  # proxy extras; if you only need the proxy, installing this alone may suffice

If installation is slow, you can use a mirror to speed it up.

pip install litellm -i http://mirrors.aliyun.com/pypi/simple/ --trusted-host mirrors.aliyun.com
pip install 'litellm[proxy]' -i http://mirrors.aliyun.com/pypi/simple/ --trusted-host mirrors.aliyun.com

Export Environment

By default, LiteLLM reads the required API keys from environment variables, so you can export them directly.

export MISTRAL_API_KEY=your_Mistral_API_Key      # key for the general chat models
export CODESTRAL_API_KEY=your_Codestral_API_Key  # key for Codestral code completion
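Before launching the proxy, it can save debugging time to confirm the variables are actually visible to the shell; a small sketch (the placeholder defaults are only there to keep the snippet self-contained):

```shell
# Use placeholder defaults so the snippet runs standalone;
# replace them with your real keys in practice.
export MISTRAL_API_KEY="${MISTRAL_API_KEY:-your-mistral-key}"
export CODESTRAL_API_KEY="${CODESTRAL_API_KEY:-your-codestral-key}"

# ${VAR:?msg} aborts the shell with an error if VAR is unset or empty.
: "${MISTRAL_API_KEY:?MISTRAL_API_KEY is not set}"
: "${CODESTRAL_API_KEY:?CODESTRAL_API_KEY is not set}"

# Print only the length, never the key itself.
echo "MISTRAL_API_KEY length: ${#MISTRAL_API_KEY}"
```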

Starting the Model

# General-purpose chat model
litellm --model mistral/mistral-large-latest
# Code completion model
litellm --model codestral/codestral-latest
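`litellm --model` serves one model per process. If you would rather expose both models behind a single proxy, LiteLLM also accepts a config file; a sketch (the aliases `mistral-large` and `codestral` are names chosen here, not fixed by LiteLLM):

```shell
# Generate a LiteLLM config exposing both models through one proxy.
cat > litellm_config.yaml <<'EOF'
model_list:
  - model_name: mistral-large            # alias that clients request
    litellm_params:
      model: mistral/mistral-large-latest
  - model_name: codestral
    litellm_params:
      model: codestral/codestral-latest
EOF

# Start the proxy with the config (requires the API keys exported earlier):
# litellm --config litellm_config.yaml --port 4000
```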


Integrating Mistral AI with Roo Cline

Installing Roo Cline

For installation methods, refer to: 【VS Code】Is Roo Cline + DeepSeek More Useful?

Obtaining Mistral AI API Key

The API Key needs to be kept safe, as it cannot be viewed later.

Mistral AI API Key retrieval address: https://console.mistral.ai/api-keys/

Click 【More】 at the top and select 【Keys】 to open the API Key list.


Click 【Create new key】 to create a new API Key.


Enter the API Key name and an optional expiration time. After creation, copy the API Key and keep it safe, as it cannot be viewed again later.


Roo Cline Configuration

In the 【API Provider】 list, select 【OpenAI Compatible】. In the 【Base URL】 field, enter the address of the service started by LiteLLM, which defaults to http://0.0.0.0:4000. The 【API Key】 field can be filled with any value, since LiteLLM holds the real keys. In the 【Model】 field, enter the model proxied by LiteLLM, codestral/codestral-latest. Finally, click 【Done】 at the top right to complete the configuration.


After the configuration is complete, you can use Roo Cline to call Mistral AI for chatting.

