Cohere’s Open Source 35B Model Surpasses Mixtral in RAG and Tool Capabilities
Blog post: https://txt.cohere.com/command-r/
Model weights: https://huggingface.co/CohereForAI/c4ai-command-r-v01

1. RAG Performance: Across multiple datasets it far exceeds the Mixtral MoE model, and when paired with Cohere's own embeddings and reranking it significantly outperforms other open-source models.
2. Tool Capabilities: Slightly better than Mixtral, and significantly ahead of GPT-3.5.
3. Multilingual Capabilities: Supports English, French, Spanish, Italian, German, Brazilian Portuguese, Japanese, Korean, …