Using LlamaIndex Agent to Call Multiple Tool Functions

Overview: This article introduces how to use LlamaIndex’s Agent to call multiple custom tool functions. As with the previous articles in this series, it does not use the OpenAI API and relies entirely on a locally hosted large model. The goal of this article is simple: to save the … Read more
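To make the idea concrete, here is a minimal sketch (not the article’s exact code) of handing two custom tool functions to a ReAct agent backed by a local model served through Ollama. The model name and the two functions are placeholders, and the agent API shown is the classic llama-index-core interface, which may differ in newer releases.

```python
# Minimal sketch: two custom functions exposed as tools to a ReAct agent
# that runs on a local model served through Ollama.
from llama_index.core.tools import FunctionTool
from llama_index.core.agent import ReActAgent
from llama_index.llms.ollama import Ollama

def multiply(a: float, b: float) -> float:
    """Multiply two numbers and return the result."""
    return a * b

def add(a: float, b: float) -> float:
    """Add two numbers and return the result."""
    return a + b

# Wrap plain Python functions as tools the agent can decide to call.
tools = [FunctionTool.from_defaults(fn=multiply), FunctionTool.from_defaults(fn=add)]

# Any locally served model works here; "llama3" is just a placeholder name.
llm = Ollama(model="llama3", request_timeout=120.0)

agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)
print(agent.chat("What is (3 + 4) * 5? Use the tools."))
```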

Prompt Engineering in LlamaIndex

Prompts are the fundamental input that gives an LLM its expressive power. LlamaIndex uses prompts to build indexes, perform insertions, retrieve during queries, and synthesize final answers. LlamaIndex provides a set of out-of-the-box default prompt templates: https://github.com/run-llama/llama_index/blob/main/llama-index-core/llama_index/core/prompts/default_prompts.py In addition, there are prompts written specifically for chat models such as gpt-3.5-turbo: https://github.com/run-llama/llama_index/blob/main/llama-index-core/llama_index/core/prompts/chat_prompts.py Custom Prompts: Users can also provide their … Read more
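For a taste of what a custom prompt looks like, here is a minimal sketch using PromptTemplate; the template text below is illustrative, not one of the default templates linked above.

```python
# Minimal sketch of a custom prompt: define a template with placeholders
# and fill it in at query time.
from llama_index.core import PromptTemplate

qa_prompt = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context and no prior knowledge, answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)

# Render the template into a plain string that can be sent to the LLM.
print(qa_prompt.format(context_str="LlamaIndex is an LLM app framework.",
                       query_str="What is LlamaIndex?"))
```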

Embedding Models in LlamaIndex

You may have heard of the concept of word embedding, which represents semantics using numerical vectors. The closer the numerical vectors are, the more similar the corresponding statements or words are in meaning. LlamaIndex also uses embeddings to represent documents. The embedding model takes text as input and returns a long string of numbers that … Read more
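For a concrete picture, here is a minimal sketch that turns a sentence into its embedding vector using a local HuggingFace model; the model name is only an example and assumes the llama-index-embeddings-huggingface package is installed.

```python
# Minimal sketch: convert a sentence into its embedding vector with a
# locally downloaded HuggingFace model.
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")

vector = embed_model.get_text_embedding("LlamaIndex represents documents as embeddings.")
print(len(vector))   # dimensionality of the embedding
print(vector[:5])    # the first few of those numbers
```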

Implementing RAG Queries in LlamaIndex Agent

Overview: This article explains how to integrate a RAG query engine into an Agent, enabling the Agent to use external knowledge bases for data queries and thereby extending its capabilities. This approach is useful in many scenarios; for instance, we often need to query or compute a specific metric first, … Read more
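The core pattern is wrapping a query engine as a tool the agent can call. Here is a minimal sketch, assuming documents live in a local ./data folder and an LLM has already been configured via Settings; the tool name and description are placeholders.

```python
# Minimal sketch: build a vector index over local documents, expose its
# query engine as a tool, and hand that tool to an agent.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.tools import QueryEngineTool
from llama_index.core.agent import ReActAgent

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

rag_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine,
    name="knowledge_base",
    description="Answers questions using the local document collection.",
)

# The agent uses the LLM configured in Settings unless one is passed in.
agent = ReActAgent.from_tools([rag_tool], verbose=True)
print(agent.chat("Look up the relevant metric in the knowledge base first."))
```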

Using External Tools in Agent with Llamaindex

Overview: Agents commonly call multiple external tools to accomplish different tasks. This article introduces how to use external tools through LlamaIndex; the tools themselves are also provided by LlamaIndex. The framework offers a range of external tools that can greatly expand an Agent’s capabilities. These tools can be downloaded from … Read more
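As an illustration of the pattern, here is a minimal sketch using one of the tools published on LlamaHub; it assumes the Wikipedia tool package (llama-index-tools-wikipedia) is installed and an LLM is configured via Settings.

```python
# Minimal sketch: load a prebuilt tool spec from LlamaHub and give its
# tools to an agent.  Install first: pip install llama-index-tools-wikipedia
from llama_index.core.agent import ReActAgent
from llama_index.tools.wikipedia import WikipediaToolSpec

# A ToolSpec bundles related functions; to_tool_list() turns them into
# individual tools the agent can call.
wiki_tools = WikipediaToolSpec().to_tool_list()

agent = ReActAgent.from_tools(wiki_tools, verbose=True)  # uses Settings.llm
print(agent.chat("Summarize the Wikipedia article on retrieval-augmented generation."))
```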

LlamaIndex and RAG Evaluation Tools Overview

LlamaIndex is an LLM (Large Language Model) application development framework that many developers prefer for building RAG (Retrieval-Augmented Generation) applications. During RAG development, we often need to evaluate the relevant data in order to tune and optimize the application. As RAG technology has developed, a number of excellent evaluation tools have emerged, which … Read more
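To show what evaluation looks like in code, here is a minimal sketch using one of LlamaIndex’s built-in evaluators, FaithfulnessEvaluator, which checks whether an answer is supported by the retrieved context; the index, query, and ./data folder are placeholders.

```python
# Minimal sketch: evaluate whether a query engine's answer is faithful to
# the context it retrieved.  Assumes an LLM is configured in Settings.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.evaluation import FaithfulnessEvaluator

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("./data").load_data())
query_engine = index.as_query_engine()

query = "What does the document say about workflows?"
response = query_engine.query(query)

evaluator = FaithfulnessEvaluator()  # uses the LLM from Settings as the judge
result = evaluator.evaluate_response(query=query, response=response)
print(result.passing, result.feedback)
```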

Detailed Explanation of LlamaIndex Workflows: Key to Improving Data Processing Efficiency

LlamaIndex, as a powerful framework, provides a solid foundation for building data pipelines that connect with large language models (LLMs). It implements a modular approach to query execution through structured workflows, simplifying solutions to complex problems. Today, let’s discuss the workflows of LlamaIndex. 1. Basics of LlamaIndex Workflows … Read more
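For orientation, here is a minimal sketch of a workflow with a single step: the step receives the StartEvent produced by run() and returns a StopEvent carrying the result. The class and field names are illustrative.

```python
# Minimal sketch of a LlamaIndex workflow: one async step from StartEvent
# to StopEvent.
import asyncio
from llama_index.core.workflow import Workflow, StartEvent, StopEvent, step

class EchoWorkflow(Workflow):
    @step
    async def echo(self, ev: StartEvent) -> StopEvent:
        # Keyword arguments passed to run() arrive on the StartEvent.
        return StopEvent(result=f"processed: {ev.message}")

async def main():
    result = await EchoWorkflow(timeout=10).run(message="hello workflows")
    print(result)

asyncio.run(main())
```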

Full-Stack Chatbot Template for Multi-Document Analysis on LlamaIndex

Project Introduction: The easiest way to start using LlamaIndex is with create-llama. This CLI tool lets you quickly scaffold a new LlamaIndex application and sets everything up for you. Quick Start: Run npx create-llama@latest to get started, or refer to the options below for more choices. After generating the application, run npm run dev … Read more

Using Large Language Models in LlamaIndex

One of the first steps to consider when building any LLM application over your data is choosing the right LLM. LLMs are a core component of LlamaIndex: they can be used as standalone modules or plugged into other core LlamaIndex modules (indexes, retrievers, query engines). They are generally used during the response synthesis step after … Read more
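Here is a minimal sketch of both usage styles: calling an LLM standalone via complete(), and registering it globally through Settings so that indexes and query engines built afterwards use it. The model name and parameters are arbitrary choices.

```python
# Minimal sketch: use an LLM on its own, then register it globally so
# other LlamaIndex modules pick it up during response synthesis.
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-3.5-turbo", temperature=0.1)

# Standalone use: plain text completion.
print(llm.complete("Finish this sentence: LlamaIndex is"))

# Plugged into other modules: query engines built after this line will
# use this LLM to synthesize answers.
Settings.llm = llm
```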

Getting Started with LlamaIndex

First, we need to be clear that two types of models are required: an LLM, the large model responsible for generating content, and an embedding model, responsible for producing embeddings that represent text semantics as vectors. Set Up the OpenAI API Key: By default, LlamaIndex uses OpenAI’s LLM and embedding models, so we first need … Read more
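Putting the pieces together, here is a minimal getting-started sketch: set the OpenAI API key, load local files, build an index, and ask a question. The ./data folder and the key value are placeholders.

```python
# Minimal sketch of the getting-started flow with the default OpenAI models.
import os
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

os.environ["OPENAI_API_KEY"] = "sk-..."  # replace with your own key

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)  # OpenAI embeddings by default

query_engine = index.as_query_engine()              # OpenAI LLM by default
print(query_engine.query("What are these documents about?"))
```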