LangChain: The Trendiest Web Framework of 2023


Author: Richard MacManus
Translator: Ming Zhi Shan
Editor: Tina

LangChain is a programming framework that helps developers use large language models (LLMs) in applications. Like everything in generative AI, the project is developing rapidly. It started as a Python tool in October 2022, added TypeScript support in February this year, and by April supported multiple JavaScript environments, including Node.js, browsers, Cloudflare Workers, Vercel/Next.js, Deno, and Supabase Edge Functions.

So, what aspects do JavaScript developers need to understand about LangChain and how to use LLMs? In this article, we will answer this question by analyzing two recent talks by LangChain’s author, Harrison Chase.

LangChain started as an open-source project and quickly transformed into a startup after gaining significant attention on GitHub. Harrison Chase was still a student at Harvard in 2017; now he is the CEO of one of Silicon Valley's hottest startups, a remarkably rapid leap. Earlier, Microsoft CTO Kevin Scott praised Chase during his Build keynote.

The Popularity of Chat Apps

Unsurprisingly, the main application of LangChain currently is to build chat-based applications on top of LLMs (especially ChatGPT). As Tyler McGinnis jokingly remarked on bytes.dev about LangChain: “You can never have too many chat interfaces.”

In an interview earlier this year, Chase mentioned that one of the best use cases for LangChain right now is "document-based chatting." LangChain also provides other features to enhance the chat experience, such as streaming, which allows LLM output to be returned word by word instead of all at once.
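The streaming idea can be sketched with a plain async generator. Note that `fakeLLMStream` below is a hypothetical stand-in for a real streaming LLM call, not a LangChain API:

```typescript
// Sketch of token-by-token streaming. fakeLLMStream is a stand-in
// for a real streaming model call, not part of the LangChain API.
async function* fakeLLMStream(prompt: string): AsyncGenerator<string> {
  // Pretend the model answers with these tokens, one at a time.
  const tokens = ["LangChain", " ", "streams", " ", "tokens", "."];
  for (const token of tokens) {
    yield token;
  }
}

async function collectStream(prompt: string): Promise<string> {
  let answer = "";
  for await (const token of fakeLLMStream(prompt)) {
    // In a chat UI you would render each token as it arrives,
    // rather than waiting for the full response.
    answer += token;
  }
  return answer;
}
```

The consumer iterates with `for await`, so a UI can show partial output immediately while the full answer is still being generated.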

However, Chase also pointed out that other types of interfaces are developing rapidly.

“In the long run, there may be better experiences than chat applications. But I think for now, it’s a way we can quickly get started without needing to do a lot of extra work. Can we say that chat applications will definitely be the best experience in the next six months? Probably not, but I believe what can currently bring value is likely to be chat applications.”

After all, developing applications based on LLMs is a new technology, and startups in this field (like LangChain) have been working hard to launch tools that help understand issues related to LLMs. For example, prompt engineering still mainly relies on developers’ intuition to judge which prompts will work better. However, LangChain has introduced features like “tracing” this year to help address this issue.

Agents

One feature recently introduced by LangChain is “custom agents,” which Chase mentioned at the LLM bootcamp in San Francisco in April this year. He defined agents as a way to “use language models as reasoning engines” to determine how to interact with the external world based on user input.


Harrison Chase at the LLM Bootcamp

He provided an example of interacting with a SQL database. Typically, he said, you use a language model to convert a natural-language question into a SQL query, execute the query, and pass the results back to the language model, asking it to synthesize an answer to the original question. The result is what Chase calls a "natural language wrapper for SQL databases."
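The three-step flow Chase describes can be sketched as a small pipeline. Here `llm` and `runQuery` are hypothetical stubs standing in for a model call and a database client; this is an illustration of the flow, not LangChain's actual SQL chain:

```typescript
// Sketch of the "natural language wrapper for SQL" flow:
// question -> SQL (via LLM) -> rows (via DB) -> answer (via LLM).
type LLM = (prompt: string) => string;

function askDatabase(
  question: string,
  llm: LLM,
  runQuery: (sql: string) => string[]
): string {
  // 1. Ask the model to translate the question into SQL.
  const sql = llm(`Write a SQL query for: ${question}`);
  // 2. Execute the query against the database.
  const rows = runQuery(sql);
  // 3. Ask the model to synthesize an answer from the rows
  //    and the original question.
  return llm(`Question: ${question}\nRows: ${rows.join(", ")}\nAnswer:`);
}
```

In a real application the `llm` function would be an API call to a model and `runQuery` a call to a database driver; the shape of the pipeline stays the same.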

The role of the agent is to handle what Chase refers to as “edge cases,” which are ambiguous outputs that the LLM might produce at any time in the above example.

He explained: “You can use the agent to choose which tool to use and the input for that tool. Then you execute it, get the results, and feed the results back into the language model. You continue this process until you meet the stopping condition.”
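The loop Chase describes can be sketched as follows. The `step` function is a hypothetical stand-in for an LLM call that decides the next action; none of these names are LangChain APIs:

```typescript
// Sketch of the agent loop: the model picks a tool and its input,
// we run the tool, feed the observation back, and repeat until a
// stopping condition (the model decides to finish, or we hit a cap).
type AgentStep =
  | { kind: "act"; tool: string; input: string }
  | { kind: "finish"; answer: string };

function runAgent(
  step: (scratchpad: string) => AgentStep, // stand-in for the LLM
  tools: Record<string, (input: string) => string>,
  maxIterations = 5
): string {
  let scratchpad = "";
  for (let i = 0; i < maxIterations; i++) {
    const decision = step(scratchpad);
    if (decision.kind === "finish") {
      return decision.answer; // stopping condition reached
    }
    // Execute the chosen tool and append the observation so the
    // model sees it on the next iteration.
    const observation = tools[decision.tool](decision.input);
    scratchpad += `Tool ${decision.tool} -> ${observation}\n`;
  }
  return "Stopped: iteration limit reached"; // guard against runaway loops
}
```

The iteration cap matters in practice: since the model itself decides when to stop, a production agent needs a hard limit to avoid looping indefinitely.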


Implementing Agents

Chase calls this type of agent "ReAct," which has no relation to the JavaScript framework React; the name stands for "Reason + Act." Chase stated that this approach produces "higher quality and more reliable results" than other forms of prompt engineering.


ReAct (Not React)

Chase acknowledged that agents also face “many challenges,” and that “most agents are not yet ready for production.”

Memory Issues

Some of the issues he listed seem to be basic computational problems, but they are more challenging in LLMs. For instance, LLMs typically lack long-term memory. As noted in the Pinecone tutorial, “By default, LLMs are stateless—this means that each query is processed independently of other queries.”

LangChain helps developers here by adding components such as memory to LLM pipelines. For JavaScript and TypeScript, LangChain offers two memory-related methods: loadMemoryVariables and saveContext. The first "is used to retrieve data from memory (which can also use the current input value), and the second method is used to save data in memory."
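A minimal sketch of the shape of this two-method interface, assuming a simple conversation buffer. This mirrors the method names described above but is not LangChain's implementation:

```typescript
// Minimal sketch of a conversation-buffer memory with the two
// methods described above. Not LangChain's actual class; just an
// illustration of the load/save contract.
class SimpleBufferMemory {
  private history: string[] = [];

  // Retrieve stored data from memory, e.g. to inject into the
  // next prompt as a "history" variable.
  async loadMemoryVariables(): Promise<{ history: string }> {
    return { history: this.history.join("\n") };
  }

  // Save one input/output exchange into memory.
  async saveContext(
    input: { input: string },
    output: { output: string }
  ): Promise<void> {
    this.history.push(`Human: ${input.input}`, `AI: ${output.output}`);
  }
}
```

Each turn of a chat would call saveContext after the model responds, and loadMemoryVariables before building the next prompt, giving an otherwise stateless LLM the appearance of memory.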

Another form of agent mentioned by Chase is Auto-GPT, a program that can configure and deploy autonomous AI agents.

He said: "Auto-GPT provides long-term memory for interactions between agents and tools, using a retriever backed by a vector store (vector database)."

A New LAMP Tech Stack?

Clearly, there is still much to be done in building applications based on LLMs. In the Build keynote, Microsoft positioned LangChain as part of its “Copilot tech stack’s orchestration layer.” In Microsoft’s system, orchestration includes prompt engineering and so-called “meta prompts.”

Microsoft has launched its own tool, Semantic Kernel, which has similar functionalities to LangChain. Microsoft also released a tool called Prompt Flow, which Microsoft’s CTO Kevin Scott referred to as “another orchestration mechanism integrating LangChain and Semantic Kernel.”

It is worth noting that the “Chain” in LangChain indicates that it can interoperate with other tools—not just various LLMs but also other development frameworks. In May of this year, Cloudflare announced that its Workers framework supports LangChain.

There has even been an acronym about LangChain: OPL, which stands for OpenAI, Pinecone, and LangChain. It may have been inspired by LAMP (Linux, Apache, MySQL, PHP/Perl/Python), a key tech stack from the 1990s that powered Web 2.0. We don’t know if OPL will become a technical term—certainly, its components are not all open-source—but in any case, it is a good sign that LangChain has become an important part of many developers’ personal tech stacks.

Disclaimer: This article is translated by InfoQ and reproduction without permission is prohibited.

Original link:

https://thenewstack.io/langchain-the-trendiest-web-framework-of-2023-thanks-to-ai/
