Comparisons with 🦜🔗LangChain Agent
Just two days ago, 🤗Hugging Face released Transformers Agent, an agent that uses natural language to choose a tool from a curated collection of tools and accomplish various tasks. Does it sound familiar? Yes, it does, because it's a lot like 🦜🔗LangChain Tools and Agents. In this blog post, I'll cover what Transformers Agent is and how it compares with 🦜🔗LangChain Agent.
You can check out the code in this Colab notebook (provided by Hugging Face).
In brief, it provides a natural language API on top of transformers: we define a set of curated tools and design an agent to interpret natural language and use these tools.
I can imagine engineers at Hugging Face thinking: We have so many amazing models hosted on Hugging Face. Can we integrate those with LLMs? Can we use LLMs to decide which model to use, write code, run code, and generate results? Essentially, nobody needs to learn all the complicated task-specific models anymore. Just give it a task, and LLMs (agents) will do everything for us.
Here are the steps:
- Instruction: the prompt users provide
- Prompt: a prompt template with the specific instruction added, listing multiple tools to use.
- Tools: a curated list of transformer models, e.g., Flan-T5 for question answering.
- Agent: an LLM that interprets the query, decides which tools to use, and generates code to perform the task with the tools.
- Restricted Python interpreter: executes the generated Python code.
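The steps above can be sketched as a single loop. This is a minimal, self-contained sketch with a stubbed LLM; `build_prompt`, `fake_llm`, `run_agent`, and the tool entries are illustrative names, not the actual Transformers Agent API:

```python
def build_prompt(instruction, tools):
    # The real prompt template also includes few-shot examples.
    tool_list = "\n".join(f"- {name}: {desc}" for name, (desc, _) in tools.items())
    return f"Tools available:\n{tool_list}\n\nTask: {instruction}\nCode:"

def fake_llm(prompt):
    # Stand-in for StarCoder/OpenAI: returns Python code that calls a tool.
    return 'result = image_generator("a boat in the water")'

def run_agent(instruction, tools):
    code = fake_llm(build_prompt(instruction, tools))
    # Restricted interpreter: only the curated tools are in scope.
    namespace = {name: fn for name, (_, fn) in tools.items()}
    exec(code, {"__builtins__": {}}, namespace)
    return namespace["result"]

tools = {"image_generator": ("generates an image from text", lambda p: f"<image: {p}>")}
print(run_agent("Draw a boat", tools))  # -> <image: a boat in the water>
```

The key point this illustrates: the LLM never runs the models itself; it only writes code, and the restricted interpreter executes that code with the curated tools in scope.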
Step 1: Instantiate an agent.
Step 1 is to instantiate an agent. An agent is simply an LLM, which can be an OpenAI model, a StarCoder model, or an OpenAssistant model.
The OpenAI model needs an OpenAI API key, and usage is not free. We load the StarCoder and OpenAssistant models from the Hugging Face Hub, which requires a Hugging Face Hub API key and is free to use.
from transformers import HfAgent, OpenAiAgent
from huggingface_hub import login
# OpenAI
agent = OpenAiAgent(model="text-davinci-003", api_key="")
# Log in to the Hugging Face Hub
login("")
# StarCoder
agent = HfAgent("https://api-inference.huggingface.co/models/bigcode/starcoder")
# OpenAssistant
agent = HfAgent(url_endpoint="https://api-inference.huggingface.co/models/OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5")
Step 2: Run the agent.
agent.run
is a single execution method that automatically selects the tool for the task, e.g., selecting the image generation tool to create an image.
agent.chat
keeps the chat history. For example, here it knows we generated an image earlier, and it can then transform that image.
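The difference between the two methods can be illustrated with a toy agent (`ToyAgent` is a hypothetical sketch, not the real Transformers `Agent` class): `run` starts from a clean slate on every call, while `chat` carries the history forward.

```python
class ToyAgent:
    """Illustrative only: run() is stateless, chat() accumulates history."""

    def __init__(self):
        self.history = []

    def _answer(self, prompt, history):
        # Stand-in for the LLM: reports how much context it saw.
        return f"answer to {prompt!r} given {len(history)} earlier turns"

    def run(self, prompt):
        # Single execution: no memory of previous calls.
        return self._answer(prompt, history=[])

    def chat(self, prompt):
        # Keeps chat history, so follow-ups can refer to earlier results.
        reply = self._answer(prompt, self.history)
        self.history.append((prompt, reply))
        return reply

agent = ToyAgent()
agent.chat("Generate an image of rivers and lakes")
print(agent.chat("Transform the image into a painting"))
# The second chat() call sees 1 earlier turn; run() would see none.
```

This is why the follow-up "transform the image" only works with `chat`: the earlier generation is still in the history.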
Transformers Agent is still experimental. It has a much smaller scope and is less flexible. The main focus of Transformers Agent right now is using Transformer models and executing Python code, whereas LangChain Agent does "almost" everything. Let me break it down to compare the different components of Transformers and LangChain Agents:
Tools
- 🤗Hugging Face Transformers Agent has an amazing list of tools, each powered by transformer models. These tools offer three significant benefits: 1) Although Transformers Agent can only interact with a few tools currently, it has the potential to communicate with over 100,000 Hugging Face models, with full multimodal capabilities encompassing text, images, video, audio, and documents; 2) Since these models are purpose-built for specific tasks, using them can be more straightforward and yield more accurate results than relying solely on LLMs. For example, instead of designing prompts for the LLM to perform text classification, we can simply deploy BART, which is designed for text classification; 3) These tools unlock capabilities that LLMs alone can't accomplish. Take BLIP, for example, which enables us to generate image captions, a task beyond the scope of LLMs.
- 🦜🔗LangChain tools are all external APIs, such as Google Search and the Python REPL. In fact, LangChain supports Hugging Face Tools via the
load_huggingface_tool
function. LangChain can potentially do a lot of the things Transformers Agent can do already. On the other hand, Transformers Agent can potentially incorporate all the LangChain tools as well.
- In both cases, each tool is just a Python file. You can find the files of 🤗Hugging Face Transformers Agent tools here and 🦜🔗LangChain tools here. As you can see, each Python file contains one class defining one tool.
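The one-class-per-file pattern can be sketched like this (`TextClassificationTool` is a hypothetical example; the real base classes and attributes differ between the two libraries):

```python
class TextClassificationTool:
    """Illustrative tool: one class per Python file, with a name and a
    description the agent's LLM reads to decide when to call it."""

    name = "text_classifier"
    description = "Classifies text as positive or negative."

    def __call__(self, text: str) -> str:
        # A real tool would call a purpose-built model such as BART here;
        # a trivial keyword check stands in for it.
        positive_words = {"good", "great", "amazing", "love"}
        words = set(text.lower().split())
        return "positive" if words & positive_words else "negative"

tool = TextClassificationTool()
print(tool.name, "->", tool("This library is amazing"))  # text_classifier -> positive
```

The `description` string is what matters most in both frameworks: it is the only thing the agent's LLM sees when deciding which tool fits the task.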
Agent
- 🤗Hugging Face Transformers Agent uses this prompt template to determine which tool to use based on the tool's description. It asks the LLM to provide an explanation, and it includes a few few-shot learning examples in the prompt.
- 🦜🔗LangChain by default uses the ReAct framework to determine which tool to use based on the tool's description. The ReAct framework is described in this paper. It doesn't only act on a decision but also provides thoughts and reasoning, which is similar to the explanations Transformers Agent uses. In addition, 🦜🔗LangChain has four agent types.
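A ReAct-style step interleaves a thought, an action (a tool call), and an observation fed back into the prompt. Here is a minimal sketch with a scripted LLM; the parsing format is simplified from the real LangChain prompt, and `scripted_llm` just plays back canned responses:

```python
def scripted_llm(prompt):
    # Stand-in for the LLM: emits a thought and an action in ReAct format,
    # then a final answer once it has seen an observation.
    if "Observation" not in prompt:
        return "Thought: I should look this up.\nAction: search[LangChain agents]"
    return "Thought: I have the answer.\nFinal Answer: LangChain has 4 agent types."

def react_loop(question, tools, max_steps=3):
    prompt = f"Question: {question}"
    for _ in range(max_steps):
        output = scripted_llm(prompt)
        if "Final Answer:" in output:
            return output.split("Final Answer:")[1].strip()
        # Parse "Action: tool[input]" and run the matching tool.
        action = output.split("Action:")[1].strip()
        tool_name, tool_input = action.split("[", 1)
        observation = tools[tool_name.strip()](tool_input.rstrip("]"))
        prompt += f"\n{output}\nObservation: {observation}"
    raise RuntimeError("no final answer within max_steps")

tools = {"search": lambda q: f"results for {q}"}
print(react_loop("How many agent types does LangChain have?", tools))
```

Note how the thought text is never executed; only the `Action:` line drives a tool call, which is the "act" half that distinguishes ReAct from plain chain-of-thought prompting.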
Custom Agent
Creating a custom agent is not too difficult in either case:
- See the Hugging Face Transformers Agent example towards the end of this Colab.
- See the LangChain example here.
“Code-execution”
- 🤗Hugging Face Transformers Agent includes "code-execution" as one of the steps after the LLM selects the tools and generates the code. This restricts the Transformers Agent's goal to executing Python code.
- 🦜🔗LangChain includes "code-execution" as one of its tools, which means that executing code is not necessarily the last step of the whole process. This provides a lot more flexibility in what the task goal is: it could be executing Python code, or it could be something else, like doing a Google Search and returning search results.
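The LangChain side of this difference can be sketched as a Python-REPL tool that is just one entry in the toolbox, whose output can feed later steps. This is illustrative only, not the actual LangChain `PythonREPLTool` implementation:

```python
def python_repl_tool(code: str) -> str:
    """Illustrative: executes Python and returns the value bound to `result`,
    so the agent can feed it into a later step instead of stopping here."""
    namespace = {}
    exec(code, namespace)
    return str(namespace.get("result", ""))

toolbox = {
    "python_repl": python_repl_tool,
    "search": lambda query: f"search results for {query!r}",
}

# The agent can chain tools: run code first, then use its output in a search.
interim = toolbox["python_repl"]("result = sum(range(10))")
final = toolbox["search"](f"facts about the number {interim}")
print(final)  # search results for 'facts about the number 45'
```

Because code execution is just another tool here, its output is an intermediate observation, not the terminal result, which is exactly the flexibility the bullet above describes.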
In this blog post, we explored the functionality of 🤗Hugging Face Transformers Agent and compared it to 🦜🔗LangChain Agents. I look forward to witnessing further developments and advancements in Transformers Agent.
. . .
By Sophia Yang on May 12, 2023
Sophia Yang is a Senior Data Scientist. Connect with me on LinkedIn, Twitter, and YouTube and join the DS/ML Book Club ❤️