Summarize the following. Use a bulleted list for all key points. Do not include introductory material. If code is explained, extract the code from the speech-to-text transcript into a code block. If you can identify any models, tokenizers, embeddings, or other files mentioned, create a list in the footer.
Title: “LangChain Quickstart Guide | Part 2”
Transcript: “Hello guys, welcome back. This is the second part of the quickstart guide of LangChain. If you just landed on this video, I recommend you watch the previous one, because this is a continuation of it. In the first quickstart guide we went through building a language model application with LLMs, but in this video we will be going through building a language model application with chat models. So in the beginning I will explain why there is a need for chat models, I will show you what the difference between LLMs and chat models is, and then we will go through the code. Let's get started. What are chat models? As it is said here, you can use chat models instead of LLMs, meaning that chat models are a variation of language models; chat models use large language models under the hood. But instead of me going through all the things in this documentation, instead of me explaining all the things, let's ask the chat feature. In LangChain there is already a chat feature integrated: if you click this icon in the bottom right corner, it opens a chat feature. I have already asked the question, but what is this? This is a chat feature with which we can interact with a particular document in LangChain. Just to show you it is not for random things, but it works because this is a large language model, I asked 'what is carrot', and it says 'sorry, I couldn't find any information in the documentation about that; expect the answer to be less accurate' and then it provides the answer. You get the idea of how this is implemented: the main focus is to get the information from the documentation itself. And I asked here, 'what is the difference between large language models and chat models, explain it for a 15-year-old'. This is how you can interact with the chat models, or the chat, so that it gives a very easy answer. So it says here that LLMs and chat models are both types of models used in LangChain, but they have different purposes. LLMs, or
large language models, take a text string as input and return a text string as output; this is what we went through in the first video. They are used for tasks like language translation and text generation. Chat models, on the other hand, are designed specifically for conversational interfaces, so they take chat messages as input and output chat messages. It also provided the verified sources; that is all from the LangChain documentation. And I just asked here, 'now explain the same thing for a developer', and it explains the same thing but a little more advanced. You can go through and play around with this, but the reason I am explaining this is that the documentation can be quite lengthy, and if you want to know something quickly you can just use the chat feature so that it condenses all the information in that web page or documentation. So now let's move to the code. As in the previous video, all the components are almost the same, so I will just walk you through them. First we need to install the packages, so you can run the command, and we need openai because LangChain does not do it by itself; we need to integrate it with one or more model providers, and I'm using OpenAI in this case. If you don't already have an account, please go through this link and create the account. It is not free, but the first time they may provide you some free credits. Then you need to provide the API key in this format: just replace this string here and you are good to go. Now let's go with the first one, getting message completions from a chat model. What we can do is import different classes from LangChain. So in the beginning, from langchain.chat_models I import ChatOpenAI, and then from langchain.schema I am importing AIMessage, HumanMessage, and SystemMessage. There are different types of messages currently supported in LangChain, but mostly we will be using the human, AI, and system messages. So I can just run this command, and now what I can do is provide some content into that chat. Here I am just initializing the object, and here I am passing 'translate this sentence from English to Nepali: I love programming'. So if I run this, it must give me the output translating that particular sentence to Nepali. Let's see... yeah, okay, it gives the Nepali translation of 'I love programming'. So that is how it works. As I said before, we can also pass multiple messages, a system message and a human message, and if you run this command it must give the same output, but just to demonstrate that it also takes the system message: you can say, okay, you are a helpful assistant that translates English to Nepali, or all the different things. Based on the system message you can restrict the chat to output what you want. And the good part here is that you can go one step further and generate completions for multiple sets of messages using the generate method. So here is batch_messages; it's the same thing as before, I provided two different messages and I'm passing them into the chat object with the generate method. So if I run this, it will produce the same thing: first it will translate 'I love programming' to Nepali, and then 'I love artificial intelligence' to Nepali. Why am I using Nepali? Because I'm Nepali and I speak Nepali, so I just want to see how it outputs. See here, it translates 'I love programming', that's fine, and this one is the translation of 'I love artificial intelligence', okay, that's fine. So you see the idea of how it translates. And the good part here with the chat models is that from the output we get, we can even extract specific things like token usage. It's already being printed here, but
with the result, this is what we output: result.llm_output, and we can pass what we want to extract. So if I run this, it will provide the token usage for us: prompt tokens is 77, completion tokens is 54, and the total tokens is 131. So this is how we can go through this. Next is the chat prompt template. Instead of hardcoding the text we want to ask, similar to LLMs, we can use templating by using the message prompt templates. What we can do here: first I import the necessary classes and initialize them. Then I create a template here; as before, it says 'you are a helpful assistant that translates', and now I am passing the input variable here, then 'to', and the output variable. I provide the system message prompt here: I set system_message_prompt, and with SystemMessagePromptTemplate.from_template I pass that particular template. Similar to this, I create a human template, there is a string I need to pass, and with HumanMessagePromptTemplate.from_template I pass that particular human template. With these two templates created, we can now create the chat prompt template with from_messages, and we can pass the two, the system and human templates. This is quite straightforward: you have the system message, then you have the human message, and you pass them into the chat prompt template. And when you pass this, we now need to get our chat completion from the formatted messages. So what we can do here, we can just say chat_prompt.format_prompt, and input_language: what is the input language we want to provide? You can provide anything; here I am providing English, so this 'English' will go into this input_language. The output language is Nepali; it will go into this output_language. And 'I love programming' is the text; it will go into the human template's text. Then I say to_messages, and if I run this cell, it will do the same thing that I did in the first place, but here we are not hardcoding, we are just using
the prompts. Now let's go with chains and chat models. In a real application we need to chain different components together, similar to what we did with the LLMs in the previous video. Extending the previous example, we can construct an LLMChain which takes user input, formats it with a prompt template, and then passes the formatted response to the large language model; that is what the chain does. We can combine different chains, or we have the prompt and the model and all these things we can chain together. As in the previous example, we are just importing the necessary classes here, and then all the things are the same until this part, but now there is the part of the chain. Here we have the chain, we have LLMChain, and we provide the first input, llm equals chat, that is the model, and there is the prompt, the chat prompt. Then we are just running the chain, passing English so it goes to the input language, Nepali goes to the output language, and 'I love programming' goes to the text, as before. If we run this, it will provide the answer. So you get the idea: the first and second steps we did are now combined together in this third step using chains. Okay, now comes the interesting part, agents and chat models. So far what we did was run the chains in a predetermined order, but agents can also be used with chat models. You can initialize one using the agent type; here we are going to use chat-zero-shot-react-description as the agent type. As I said in the previous video also, in agents there are actually three crucial parts: there are tools, there are chat models, and there are agents. If you want to know more, you can go through this link, but I will create a separate video all about these topics only. In this video, what I'm going to do is a Google search, so we need to have the API key. You can go through serpapi.com, create an account and get the API key, and you can pass it in this
string here. I have already done this, so now I'll go to the implementation part. Here are all the classes that I'm importing. First I am just loading the chat model, and then here is the OpenAI model itself, the large language model we need to pass. Then there are the tools: we have load_tools, and you see how it is also chaining here, but first we pass the serpapi tool, then we have llm-math to do some mathematics for us also, and then we pass the large language model. Finally, let's initialize an agent with the tools, the language model, and the type of agent we want to use. So here is the agent: we initialize the agent with the tools we created here, the chat we initialized here, and the agent type. As I said before, we are using chat-zero-shot-react-description in this case, and verbose equals true because we want to see what is going on behind the scenes. And here I am actually passing two questions: 'who is the president of Finland', so it needs to do the Google search, and here is the second question following, 'what is 2 plus 2', meaning that it needs to do some mathematics also. So now if I run this cell, let's see the output. It is saying here 'entering new AgentExecutor chain'. I haven't shown it what to do; it is going to do it by itself. Here it says the question is 'who is the president of Finland, what is two plus two', and the thought process is going here: the first question requires a search, while the second question requires the calculator. You see how it is doing it itself. And here the action is Search, the action input is 'president of Finland', and it does the search and it finds, okay, who the president is. And for the second query, 'I need to use the calculator tool': the action is Calculator, the action input is 'two plus two'. You get the idea here, and it says the answer is 4, and it provides the final answer combining both. So you get how we can integrate the chat models and do the
Google search. By the way, if you are using ChatGPT, you get the idea that in the free version, although now there are plugins which do the Google search and all different kinds of things, it still does not do these things. But here, in a few lines of code, we can do a Google search, we can do some mathematics, and all these things very easily with LangChain. Now comes the memory part: add state to chains and agents. So far, the chains and the agents we were going through were stateless, but you can use memory with chains and agents initialized with chat models. The main difference between this and memory for LLMs is that rather than trying to condense all the previous messages into a string, we can keep them as their own unique memory objects. In the first part we are just importing all the necessary classes from LangChain, and here we create a prompt: ChatPromptTemplate.from_messages, and here, as previously, a system message and a human message. There is something called the system message: 'the following is a friendly conversation' and all these things you can pass for the chat model. And there is the history and there is the input: as you see here, we are passing the history, so here the MessagesPlaceholder is the history and the human message is the input, of course. And here we have just the large language model; we are using ChatOpenAI, of course. And there is the memory part, where we are saying ConversationBufferMemory, and what we are saying here is return_messages equals true,
and the conversation itself is a ConversationChain. Now you see the main thing here: there is a chain, but it is the ConversationChain. We are passing the memory as we just initialized here, we are passing the prompt as we initialized here, and we are passing the large language model. So you see, we are passing three different components together, and now we can say conversation.predict, and we give the input here, 'hi there'. Let's run this cell. Okay: 'hello, how can I assist you today?'. So it gives the message. Now if I run the next one, 'I'm doing well', this is a random question, 'just having a conversation with an AI'. Now you will see that I am not passing anything, but if you notice in the earlier video or earlier part of quic”
– This is the second part of the LangChain quickstart guide.
– The first part covered building a language model application with LLMs.
– This part focuses on building a language model application with chat models.
– Chat models are a variation of language models used for conversational interfaces.
– The chat feature in LangChain allows interaction with the documentation using chat models.
– Chat models take chat messages as input and output chat messages.
– LLMs and chat models have different purposes: LLMs are used for tasks like language translation and text generation, while chat models are designed for conversational interfaces.
– The chat feature condenses information from the documentation for quick answers.
– Code examples demonstrate getting message completions from a chat model and generating completions for multiple sets of messages.
– The chat prompt template allows for templating the text instead of hardcoding it.
– Chains combine chat models with other components, allowing for the construction of an LLMChain.
– Agents can be used with chat models; the example uses the chat-zero-shot-react-description agent type.
– The example also demonstrates using the agent to perform a Google search and do mathematics.
– Memory can be added to chains and agents, allowing for stateful conversations.
– The conversation chain combines memory, a prompt, and the large language model for a conversation.
– An example conversation with the AI is shown using the conversation chain.
continue
– The conversation chain example demonstrates passing memory, a prompt, and the large language model to create a stateful conversation.
– The code shows initializing the conversation prompt template and setting the conversation buffer memory.
– The conversation chain predicts the output based on the input.
– The example shows a conversation with the AI by providing inputs and receiving responses.
List of files/models/tokenizers/embeddings mentioned (if any):
– LLMs (large language models)
– Chat models
– OpenAI (model provider)
– LangChain (framework)
– langchain.chat_models (module)
– langchain.schema (module)
– AIMessage (class)
– HumanMessage (class)
– SystemMessage (class)
– SerpAPI (search API provider)
– llm-math (tool)
– ChatOpenAI (model)
– ConversationChain (chain class)
– ConversationBufferMemory (memory class)