What Is the Model Context Protocol (MCP)?

Anthropic has open-sourced a new protocol, MCP (Model Context Protocol), which aims to address the pain points of connecting data sources to LLM applications. Its goal is to help cutting-edge models generate better, more relevant responses. You no longer need to write custom integration code for each data source; MCP handles it all … Read more

Understanding the Model Context Protocol (MCP)

Before reading the content below, you should be familiar with Ollama and Function Calling. MCP is an open protocol that standardizes how applications provide context to large language models (LLMs). It also offers a standardized way to connect AI models to various data sources and tools. The protocol was defined by Anthropic, and one of … Read more
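
As a taste of the kind of standardization the full article describes, here is a minimal sketch of an MCP tool server. It assumes the official MCP Python SDK (the `mcp` package) and its FastMCP helper; the `get_weather` tool and its reply are purely illustrative.

```python
# Minimal MCP server sketch, assuming the official MCP Python SDK (pip install mcp).
# The tool name and its canned reply are illustrative placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def get_weather(city: str) -> str:
    """Return a placeholder weather report for the given city."""
    return f"The weather in {city} is sunny."

if __name__ == "__main__":
    # Serve over stdio so an MCP-capable client (e.g. Claude Desktop) can
    # launch this process, discover the tool, and call it in a standard way.
    mcp.run()
```

The point is that any MCP-aware client can discover and call this tool without bespoke integration code on either side.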

Comprehensive Guide to Fine-Tuning Qwen7b

Note: this may be the easiest-to-understand, easiest-to-run example of efficient fine-tuning for a range of open-source LLMs, supporting both multi-turn and single-turn dialogue datasets. We constructed a toy dataset of three dialogue rounds that changes the model's self-identity and fine-tune with the QLoRA algorithm, which can complete fine-tuning in just … Read more
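
To make the QLoRA setup concrete, here is a minimal sketch using Hugging Face transformers, peft, and bitsandbytes. The checkpoint name, LoRA rank, and target modules are assumptions for illustration, not the article's exact configuration.

```python
# QLoRA-style setup sketch: load the base model in 4-bit, then attach small
# trainable LoRA adapters so fine-tuning fits on a single consumer GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_name = "Qwen/Qwen-7B-Chat"  # assumed checkpoint

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                       # quantize the frozen base weights to 4-bit
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name, quantization_config=bnb_config, trust_remote_code=True
)

lora_config = LoraConfig(
    r=8, lora_alpha=32, lora_dropout=0.05,
    target_modules=["c_attn"],               # assumed attention projection for Qwen-7B
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)   # only the LoRA adapters will be trained
model.print_trainable_parameters()
```

From here, the wrapped model can be handed to a standard Trainer loop on the dialogue dataset.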

Ollama: Run Local Large Language Models Effortlessly

Project introduction: Ollama is a project focused on deploying and running large language models locally, such as Llama 2 and Mistral. The project is licensed under the MIT License and is written primarily in Go, with components in C, Shell, TypeScript, C++, and PowerShell. With over 33.5k stars and 2.2k … Read more

Quickly Deploy Local Open Source Large Language Models Using Ollama

If you are exploring for the first time how to test open-source large language models (LLMs) for generative AI, the overwhelming amount of information can be daunting. Information from various sources on the internet is fragmented, making it difficult to start a project quickly. The goal of this article … Read more
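
As a quick sanity check once Ollama is running, the sketch below calls its local REST API. It assumes the default endpoint on port 11434 and that a model such as llama2 has already been pulled with `ollama pull llama2`.

```python
# Smoke test against a locally running Ollama server.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama2",
        "prompt": "Explain what a local LLM is in one sentence.",
        "stream": False,   # request a single JSON response instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```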

Introduction to Using LM Studio for Local LLM Applications

LM Studio is the simplest way to run local open-source large language models. It is plug-and-play, requires no coding, and has a polished interface. Today, I will introduce this application. 1. What Can LM Studio Do? 🤖 Run LLMs completely offline on a laptop 👾 Use models via the in-app chat UI or … Read more
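
Beyond the chat UI, LM Studio can expose the loaded model through an OpenAI-compatible local server. The sketch below assumes that server is enabled on its default address (http://localhost:1234/v1) and uses the openai Python client pointed at it; details may vary by LM Studio version.

```python
# Talk to LM Studio's OpenAI-compatible local server (assumed default port 1234).
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")  # key is ignored locally

completion = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio serves whichever model is currently loaded
    messages=[{"role": "user", "content": "Say hello from a local LLM."}],
)
print(completion.choices[0].message.content)
```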

Building A Secure Personal/Enterprise Knowledge Base with Ollama and WebUI

I have an AI assistant named “Lao Liu”. Why? Because it sometimes delivers nonsense with a straight face. That’s right, this is a known drawback of large models: “hallucination”. Pairing an LLM with a knowledge base is one way to mitigate the hallucination problem. At the same time, enterprises must consider information security, so a privately owned knowledge base obviously … Read more

Customize Your Large Language Model with Ollama

In the previous article, I shared how to run Google’s Gemma LLM locally using Ollama. If you haven’t read that article, you can follow the link below to review it. Today, I’ll show how to customize your own LLM using the Modelfile mechanism provided by Ollama, again demonstrating with Gemma 7B. Google’s open-source Gemma, … Read more
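
For readers who want a preview of the Modelfile mechanism, here is a small sketch that writes a Modelfile on top of the gemma:7b base model and registers it with `ollama create`. The custom model name and system prompt are made up for illustration.

```python
# Write an Ollama Modelfile and register a customized model from it.
import subprocess
from pathlib import Path

modelfile = """\
FROM gemma:7b
PARAMETER temperature 0.7
SYSTEM You are a concise assistant that always answers in bullet points.
"""

Path("Modelfile").write_text(modelfile, encoding="utf-8")

# Equivalent to running `ollama create my-gemma -f Modelfile` in a shell.
subprocess.run(["ollama", "create", "my-gemma", "-f", "Modelfile"], check=True)
# The customized model can then be started with `ollama run my-gemma`.
```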

Building a Personal Knowledge Base Using Ollama, Docker, and Anything LLM

Previously, I introduced the NVIDIA open-source AI tool Chat With RTX that can be used on PCs: [Tool] AI tools that can be installed and used on personal computers: Chat With RTX. However, when using it, I found that although it ingested a lot of information, it could only reference information from one piece … Read more