When building conversational AI applications, one of the key challenges is maintaining context throughout the conversation. This is where LangChain's memory systems come into play. They allow your application to remember previous interactions and use that information to generate more relevant and coherent responses.
Let's dive into the different types of memory systems available in LangChain and how to implement them using Python.
Buffer Memory is the simplest form of memory in LangChain. It stores the entire conversation history verbatim in a buffer.
```python
from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory()
memory.save_context({"input": "Hi"}, {"output": "Hello! How can I help you?"})
memory.save_context({"input": "What's the weather like?"}, {"output": "It's sunny today!"})

print(memory.load_memory_variables({}))
```
This will output the entire conversation history.
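To see what buffer memory is doing without an API key or LangChain installed, the same idea can be sketched in plain Python: keep every turn and render the transcript as one history string. This is a conceptual sketch, not LangChain's actual implementation:

```python
class ToyBufferMemory:
    """Minimal stand-in for buffer memory: keep every turn verbatim."""

    def __init__(self):
        self.turns = []  # list of (speaker, text) pairs

    def save_context(self, user_input, ai_output):
        self.turns.append(("Human", user_input))
        self.turns.append(("AI", ai_output))

    def load_history(self):
        # Render the full transcript as a single string, like the `history` variable
        return "\n".join(f"{speaker}: {text}" for speaker, text in self.turns)

memory = ToyBufferMemory()
memory.save_context("Hi", "Hello! How can I help you?")
memory.save_context("What's the weather like?", "It's sunny today!")
print(memory.load_history())
```

Because nothing is ever discarded, the history string grows linearly with the conversation, which is exactly why the windowed and token-limited variants below exist.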
Summary Memory maintains a running summary of the conversation, which can be useful for longer interactions.
```python
from langchain.memory import ConversationSummaryMemory
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)
memory = ConversationSummaryMemory(llm=llm)
memory.save_context({"input": "Hi"}, {"output": "Hello! How can I help you?"})
memory.save_context({"input": "Can you tell me about AI?"}, {"output": "Certainly! AI stands for Artificial Intelligence..."})

print(memory.load_memory_variables({}))
```
This will output a summary of the conversation so far.
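Conceptually, summary memory folds each new exchange into a running summary via an LLM call. The control flow can be sketched with the LLM replaced by a placeholder `summarize` function (hypothetical, used here only to show the shape of the update loop):

```python
def summarize(previous_summary, new_exchange):
    # Placeholder for the LLM summarization call; here we just append and truncate
    combined = (previous_summary + " " + new_exchange).strip()
    return combined[:200]  # keep the summary bounded in size

class ToySummaryMemory:
    """Stand-in for summary memory: state is a single running summary string."""

    def __init__(self):
        self.summary = ""

    def save_context(self, user_input, ai_output):
        exchange = f"Human said: {user_input}. AI said: {ai_output}."
        self.summary = summarize(self.summary, exchange)

memory = ToySummaryMemory()
memory.save_context("Hi", "Hello! How can I help you?")
memory.save_context("Can you tell me about AI?", "Certainly! AI stands for Artificial Intelligence...")
print(memory.summary)
```

The key property is that memory size stays roughly constant no matter how long the conversation runs, at the cost of one extra LLM call per exchange in the real class.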
Entity Memory tracks facts about specific entities (people, places, things) mentioned in the conversation.
```python
from langchain.memory import ConversationEntityMemory
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)
memory = ConversationEntityMemory(llm=llm)
memory.save_context({"input": "My name is Alice and I like cats"}, {"output": "Nice to meet you, Alice!"})

print(memory.load_memory_variables({"input": "What do I like?"}))
```
This will output information about the entity "Alice" and her preferences.
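The entity-tracking idea can be illustrated without an LLM: maintain a dict mapping entity names to the statements that mention them. This is a toy sketch; the real class uses the LLM to extract entities and summarize facts about them, while the naive regex below just grabs capitalized words (and picks up noise like "My"):

```python
import re

class ToyEntityMemory:
    """Stand-in for entity memory: a dict from entity name to mentions."""

    def __init__(self):
        self.entities = {}  # entity name -> list of utterances mentioning it

    def save_context(self, user_input, ai_output):
        # Naive extraction: treat capitalized words as entity names
        for name in re.findall(r"\b[A-Z][a-z]+\b", user_input):
            self.entities.setdefault(name, []).append(user_input)

    def lookup(self, name):
        return self.entities.get(name, [])

memory = ToyEntityMemory()
memory.save_context("My name is Alice and I like cats", "Nice to meet you, Alice!")
print(memory.lookup("Alice"))
```

The point of the structure is that later queries can retrieve facts keyed by entity, rather than replaying the whole transcript.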
Effective chat history management is crucial for creating natural and context-aware conversations. Here are some best practices:
Limit the buffer to a sliding window of the most recent exchanges:

```python
from langchain.memory import ConversationBufferWindowMemory

memory = ConversationBufferWindowMemory(k=5)  # keep only the last 5 exchanges
```
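The windowing behavior is easy to picture with the standard library's `collections.deque`: a bounded queue that silently discards the oldest entries as new ones arrive (a stdlib sketch of the same idea, not LangChain's implementation):

```python
from collections import deque

k = 5
window = deque(maxlen=2 * k)  # each exchange adds 2 messages (human + AI)

for i in range(8):  # simulate 8 exchanges
    window.append(("Human", f"message {i}"))
    window.append(("AI", f"reply {i}"))

# Only the last 5 exchanges (10 messages) remain; exchanges 0-2 were dropped
print(len(window))   # 10
print(window[0])     # ("Human", "message 3")
```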
Cap memory by token count instead of message count:

```python
from langchain.memory import ConversationTokenBufferMemory
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)
memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=1000)
```
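Token-based trimming drops the oldest messages until the transcript fits a budget. Here is a naive sketch using whitespace word count as a stand-in for real tokenization (the actual class counts tokens with the LLM's tokenizer):

```python
def trim_to_token_limit(messages, max_tokens):
    """Drop oldest messages until the rough token count fits the budget."""
    count = lambda text: len(text.split())  # crude stand-in for a tokenizer
    trimmed = list(messages)
    while trimmed and sum(count(m) for m in trimmed) > max_tokens:
        trimmed.pop(0)  # discard the oldest message first
    return trimmed

history = ["hello there friend", "hi how are you today", "tell me about python"]
print(trim_to_token_limit(history, max_tokens=9))
```

Trimming from the front preserves the most recent context, which is usually what the model needs to continue the conversation coherently.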
Expire messages after a fixed time. Note that LangChain memory classes are Pydantic models, so extra state is declared as class fields rather than set in a custom `__init__`; each `save_context` call adds two messages (human and AI), so two timestamps are recorded per exchange:

```python
import time

from langchain.memory import ConversationBufferMemory

class TimedConversationBufferMemory(ConversationBufferMemory):
    """Buffer memory that drops messages older than `expiration_time` seconds."""

    expiration_time: float = 3600.0
    timestamps: list = []

    def save_context(self, inputs, outputs):
        super().save_context(inputs, outputs)
        # One exchange adds two messages, so record two timestamps to stay aligned
        self.timestamps.extend([time.time()] * 2)

    def load_memory_variables(self, inputs):
        cutoff = time.time() - self.expiration_time
        kept = [
            (msg, ts)
            for msg, ts in zip(self.chat_memory.messages, self.timestamps)
            if ts >= cutoff
        ]
        # Filter messages and timestamps together so they never drift apart
        self.chat_memory.messages = [msg for msg, _ in kept]
        self.timestamps = [ts for _, ts in kept]
        return super().load_memory_variables(inputs)

memory = TimedConversationBufferMemory(expiration_time=1800)  # 30 minutes
```
Persist the conversation to disk so it survives restarts. `BaseMessage` objects are not directly JSON-serializable, so convert them with `messages_to_dict` / `messages_from_dict`:

```python
import json

from langchain.memory import ConversationBufferMemory
from langchain.schema import messages_from_dict, messages_to_dict

class PersistentConversationBufferMemory(ConversationBufferMemory):
    """Buffer memory that persists the transcript to a JSON file."""

    file_path: str = "conversation_history.json"

    def save_context(self, inputs, outputs):
        super().save_context(inputs, outputs)
        # Convert message objects to plain dicts before serializing
        with open(self.file_path, "w") as f:
            json.dump(messages_to_dict(self.chat_memory.messages), f)

    def load_memory_variables(self, inputs):
        try:
            with open(self.file_path) as f:
                self.chat_memory.messages = messages_from_dict(json.load(f))
        except FileNotFoundError:
            pass  # no history saved yet
        return super().load_memory_variables(inputs)

memory = PersistentConversationBufferMemory(file_path="conversation_history.json")
```
By implementing these memory systems and chat history management techniques, you can create more engaging and context-aware conversational AI applications using LangChain and Python. Experiment with different memory types and management strategies to find the best fit for your specific use case.
15/11/2024 | Python