LangChain Chatbot Project Structure
Conversational chatbot with memory, message history, and UI-ready architecture.
Project Directory
myproject/
├── main.py                 # Chat loop entry
├── app/
│   ├── __init__.py
│   ├── config.py
│   ├── chat/               # Chat logic
│   │   ├── __init__.py
│   │   ├── chain.py        # Conversational chain
│   │   └── prompts.py      # System prompt
│   ├── memory/             # Conversation memory
│   │   ├── __init__.py
│   │   ├── buffer.py       # Buffer memory
│   │   ├── summary.py      # Summary memory
│   │   └── history.py      # Message history store
│   └── tools/              # Optional tools
│       ├── __init__.py
│       └── web_search.py
├── ui/                     # Optional frontend
│   ├── streamlit_app.py    # Streamlit UI
│   └── gradio_app.py       # Gradio UI
├── requirements.txt
├── .env.example
└── .gitignore
Why This Structure?
A focused structure for chatbots, with conversation memory handled explicitly rather than bolted on. The memory/ folder lets you swap between memory strategies (buffer, summary, persistent history), and the optional ui/ folder provides Streamlit and Gradio templates for quick demos.
Key Directories
- app/chat/chain.py - Conversational chain with memory
- app/memory/ - Buffer, summary, or hybrid memory
- app/memory/history.py - Persistent message storage
- ui/ - Quick demo UIs (Streamlit, Gradio)
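To make the buffer-memory strategy concrete, here is a minimal, dependency-free sketch of the idea behind app/memory/buffer.py: keep every turn verbatim and replay the whole transcript as context. The class and method names (BufferMemory, add_turn, load) are illustrative, not LangChain APIs.

```python
from dataclasses import dataclass, field

@dataclass
class BufferMemory:
    # Full transcript, oldest first; nothing is dropped or summarized.
    turns: list = field(default_factory=list)

    def add_turn(self, role: str, content: str) -> None:
        # Append one message from either side of the conversation.
        self.turns.append({"role": role, "content": content})

    def load(self) -> list:
        # Return a copy of the whole history to prepend to the next prompt.
        return list(self.turns)

mem = BufferMemory()
mem.add_turn("user", "Hi, I'm Ada.")
mem.add_turn("assistant", "Hello Ada!")
```

Buffer memory is the simplest strategy but grows without bound, which is why summary.py exists alongside it.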
Memory Integration
from langchain.memory import ConversationBufferMemory
from langchain_core.runnables.history import (
RunnableWithMessageHistory
)
chain_with_history = RunnableWithMessageHistory(
chain, get_session_history,
input_messages_key="input",
history_messages_key="history"
)
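The wrapper above assumes a `get_session_history` callable that maps a session id to that session's history. In a real LangChain app it would return a chat message history object such as `InMemoryChatMessageHistory`; the dependency-free sketch below (a stand-in for app/memory/history.py) uses a plain dict of message lists to show the shape of the contract.

```python
# Per-session store: one mutable history list per session id.
_store: dict[str, list] = {}

def get_session_history(session_id: str) -> list:
    # Create the history on first use, then return the SAME object
    # for every later call with that session id.
    if session_id not in _store:
        _store[session_id] = []
    return _store[session_id]

h = get_session_history("abc")
h.append({"role": "user", "content": "hello"})
```

The key property is identity: repeated calls for the same session must hand back the same history object, so turns appended by the chain accumulate across invocations.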
When To Use This
- Customer support chatbots
- Personal assistant applications
- Interactive demos and prototypes
- Any multi-turn conversation use case
Trade-offs
- Memory growth - long conversations need summarization or trimming
- Context limits - model token limits constrain how much history fits in a prompt
- Simple scope - complex workflows call for a multi-agent architecture instead
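The first two trade-offs are usually handled together: before each model call, trim (or summarize) the history so it fits the context window. A dependency-free sketch of a keep-last-N trimmer is below; a token-based budget (for example, LangChain's `trim_messages` helper) is more precise, but turn count keeps the idea simple. The function name and message shape here are illustrative.

```python
def trim_history(messages: list, max_turns: int = 6) -> list:
    # Always keep the system message(s), then only the most recent
    # turns, dropping older context to respect the model's limits.
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-max_turns:]

history = [{"role": "system", "content": "Be concise."}]
history += [{"role": "user", "content": f"msg {i}"} for i in range(10)]
trimmed = trim_history(history, max_turns=4)
```

Summary memory (app/memory/summary.py) takes the complementary approach: instead of dropping old turns, it compresses them into a running summary that stays in the prompt.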