Building Chatbots That Actually Remember You: A LangChain Guide

Most chatbots are frustrating. You tell them your name, and two messages later, they’re asking for it again. You explain your problem, only to repeat yourself when transferred to another agent.

Here’s the good news: with LangChain, we can build chatbots that:

  • Remember details across conversations
  • Pull real-time data when needed
  • Feel more like talking to a helpful human than a scripted bot

I’ll walk you through creating one from scratch, using practical examples you can adapt immediately.

What Makes a Chatbot Actually Useful?

Before we dive into code, let’s establish what separates a “good” chatbot from the clunky ones we all hate:

  1. Memory
    • Remembers your name, preferences, and past questions
    • No more “As I mentioned earlier…” frustrations
  2. Real-Time Capabilities
    • Checks live inventory, weather, or schedules when needed
    • Doesn’t just rely on static responses
  3. Natural Flow
    • Handles topic changes gracefully
    • Knows when to fetch data vs. continue a conversation

Building Blocks for Our Smarter Chatbot

We’ll use three key LangChain features:

  1. Conversation Memory
    • ConversationBufferMemory: Perfect for short chats where every detail matters
    • ConversationSummaryMemory: Better for long conversations (condenses history)
  2. Custom Tools
    • Connect to APIs (weather, databases, etc.)
    • Execute Python functions on demand
  3. Adaptive Response Handling
    • Decides when to use memory vs. fetch fresh data

1. Setting Up the Basics

python

from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain

# Initialize with gpt-3.5-turbo-instruct for cost efficiency
llm = OpenAI(model="gpt-3.5-turbo-instruct", temperature=0.5)  # Lower temperature for more consistent replies

# This will store our chat history
memory = ConversationBufferMemory()

# Create our conversational agent
chatbot = ConversationChain(
    llm=llm,
    memory=memory,
    verbose=True  # Shows the chain's reasoning (helpful for debugging)
)

Test It Out:

python

print(chatbot.run("Hi, I'm Alex!"))
# Output: "Hello Alex! How can I assist you today?"

print(chatbot.run("What's my name?"))
# Output: "Your name is Alex!"

Already better than most customer service bots!

2. Adding Real-Time Data: Weather Lookup

Let’s make our bot useful by integrating live weather data.

python

import requests  # Used once you swap in a real weather API

from langchain.tools import Tool

def get_weather(city):
    """Mock function; replace with a real API call."""
    weather_data = {
        "New York": "72°F and sunny",
        "Tokyo": "68°F with light rain",
        "London": "58°F and cloudy"
    }
    return weather_data.get(city, "Weather data unavailable")

# Register as a LangChain tool
weather_tool = Tool(
    name="WeatherLookup",
    func=get_weather,
    description="Get current weather for a city"
)
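
The mock above keeps the example self-contained. If you want live data, a version built on OpenWeatherMap's current-weather endpoint might look roughly like this; the API key placeholder and the error handling are my assumptions, not part of the original example:

python

import requests

def get_weather_live(city, api_key="YOUR_OPENWEATHER_KEY"):  # hypothetical placeholder key
    """Sketch of a live lookup against OpenWeatherMap's current-weather endpoint."""
    url = "https://api.openweathermap.org/data/2.5/weather"
    params = {"q": city, "appid": api_key, "units": "imperial"}
    try:
        data = requests.get(url, params=params, timeout=5).json()
        return f"{data['main']['temp']}°F and {data['weather'][0]['description']}"
    except Exception:
        return "Weather data unavailable"

Once it works, pass it as the Tool's func in place of the mock.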

Create an Agent That Uses Tools:

python

from langchain.agents import initialize_agent

agent = initialize_agent(
    tools=[weather_tool],
    llm=llm,
    agent="zero-shot-react-description",
    verbose=True
)
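
A quick sanity check (the exact wording of the answer will vary with the model, but with verbose=True you should see the agent decide to call WeatherLookup):

python

print(agent.run("What's the weather in London?"))
# Expected, given our mock data: something like "58°F and cloudy"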

3. Combining Memory and Tools

Here’s the magic—making our bot smart enough to know when to remember and when to fetch fresh data:

python

def handle_query(user_input):
    # Simple routing: weather questions go to the tool-using agent,
    # everything else stays with the memory-backed conversation chain
    if "weather" in user_input.lower():
        response = agent.run(user_input)
        # Save the exchange so the conversation chain can recall it later
        memory.save_context({"input": user_input}, {"output": response})
        return response
    else:
        return chatbot.run(user_input)

# Test it
print(handle_query("What's the weather in Tokyo?"))
# Output: "The current weather in Tokyo is 68°F with light rain"

print(handle_query("What did I ask earlier?"))
# Output: "You asked about the weather in Tokyo"
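
Keyword routing is fine for a demo, but it gets brittle as you add tools. A heavier-duty option is to let a single agent own both the tools and the memory. The sketch below is a variation on the setup above rather than part of the original walkthrough; the chat_history memory key is what LangChain's conversational agent type expects by default:

python

from langchain.agents import initialize_agent
from langchain.memory import ConversationBufferMemory

# One agent that both remembers the conversation and decides when to call tools
chat_memory = ConversationBufferMemory(memory_key="chat_history")
smart_agent = initialize_agent(
    tools=[weather_tool],
    llm=llm,
    agent="conversational-react-description",
    memory=chat_memory,
    verbose=True
)

print(smart_agent.run("What's the weather in Tokyo?"))
print(smart_agent.run("What did I just ask you?"))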

Real-World Example: Travel Assistant Bot

Let’s build something practical—a bot that helps plan trips while remembering preferences.

Sample Conversation:

text

User: I’m planning a trip to Japan.
Bot: Great! Tokyo, Kyoto, or somewhere else?
User: Tokyo for 5 days.
Bot: I’ll find hotels in Tokyo for 5 days. By the way, current weather there is 68°F with rain—pack an umbrella!
User: Actually, make it 7 days.
Bot: Updated to 7 days. Would you like recommendations for museums or nightlife?

Key Features Demonstrated:

  • Remembers trip details (location, duration)
  • Pulls live weather data
  • Adapts to changes naturally
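
Wiring this up is mostly a matter of adding more tools to the conversational agent pattern from earlier. A minimal sketch, with a hypothetical find_hotels function standing in for a real booking API:

python

def find_hotels(query):
    """Hypothetical stand-in for a real hotel-search API."""
    return f"Found 3 well-reviewed hotels matching: {query}"

hotel_tool = Tool(
    name="HotelSearch",
    func=find_hotels,
    description="Find hotels given a city and number of nights"
)

travel_memory = ConversationBufferMemory(memory_key="chat_history")
travel_agent = initialize_agent(
    tools=[weather_tool, hotel_tool],
    llm=llm,
    agent="conversational-react-description",
    memory=travel_memory,
    verbose=True
)

print(travel_agent.run("I'm planning a trip to Tokyo for 5 days."))
print(travel_agent.run("Actually, make it 7 days."))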

Pro Tips for Production

1. Balance Memory Length

For short chats (customer support): Use ConversationBufferMemory

For long conversations (therapy bots): Use ConversationSummaryMemory
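
Swapping memory types is a one-line change. A sketch of the summary-based option (it uses the LLM itself to condense older turns, so it adds a small per-message cost):

python

from langchain.memory import ConversationSummaryMemory

summary_memory = ConversationSummaryMemory(llm=llm)  # the LLM maintains a running summary
long_chatbot = ConversationChain(llm=llm, memory=summary_memory, verbose=True)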

2. Add Error Handling

python

try:
    response = agent.run(user_input)
except Exception as e:
    # Log the error for debugging, but show the user a friendly fallback
    print(f"Agent error: {e}")
    response = "Sorry, I encountered an error. Let's try something else."

3. Make It Personal

Store user preferences in a database as soon as they surface in conversation. For example, when someone says “I’m allergic to shellfish,” save it to their profile so later sessions can use it.
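
In production this belongs in a real database keyed by user; for illustration, here's a minimal in-memory sketch. The profiles dict and the keyword check are my assumptions, not LangChain features:

python

profiles = {}  # user_id -> list of saved preferences (swap for a real database)

def save_preference(user_id, statement):
    """Naive example: store anything that sounds like a preference or restriction."""
    if any(kw in statement.lower() for kw in ["allergic", "prefer", "favorite", "hate"]):
        profiles.setdefault(user_id, []).append(statement)

save_preference("alex", "I'm allergic to shellfish")
# profiles == {"alex": ["I'm allergic to shellfish"]}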

Why This Approach Wins

Most chatbot frameworks force you to choose between:

  • Simple Q&A (no memory)
  • Complex dialog trees (hard to maintain)

LangChain gives you:

  • Human-like memory without complexity
  • Real-time data when needed
  • Natural conversations that adapt

The best part? You’re not locked into one service. Swap OpenAI for Anthropic, add a PostgreSQL memory store, or integrate with your CRM—all without rewriting your core logic.

Your Next Steps:
  1. Clone this working example
  2. Replace our mock weather function with the OpenWeather API
  3. Add a “favorite cities” feature that remembers preferences

Remember—the difference between an annoying bot and a helpful assistant often comes down to just one thing: It remembers.
