LangChain: Code Explainer
Explains code using retrieval over your codebase and answers "how" and "why" questions.
Setup time: ~15 min
Model: GPT-4o
Cost: ~$0.15/day
Last updated: Mar 16, 2026
Template
agent.py
# Install: pip install langchain langchain-openai langchain-community chromadb
from langchain_community.document_loaders import DirectoryLoader, TextLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
from langchain_community.vectorstores import Chroma
from langchain.chains import RetrievalQA
from langchain.prompts import PromptTemplate
import os
# --- Config ---
CODE_DIR = "./src"
CHROMA_DIR = "./chroma_code"
# --- Load source code ---
# Python glob patterns don't expand brace sets like {py,ts},
# so load each extension separately.
EXTENSIONS = ("py", "ts", "js", "go", "rs")
docs = []
for ext in EXTENSIONS:
    loader = DirectoryLoader(
        CODE_DIR,
        glob=f"**/*.{ext}",
        loader_cls=TextLoader,
        loader_kwargs={"autodetect_encoding": True},
    )
    docs.extend(loader.load())
splitter = RecursiveCharacterTextSplitter(
    chunk_size=1500,
    chunk_overlap=200,
    separators=["\nclass ", "\ndef ", "\nfunction ", "\n\n", "\n"],
)
chunks = splitter.split_documents(docs)
# --- Build vector store ---
embeddings = OpenAIEmbeddings()
vectorstore = Chroma.from_documents(chunks, embeddings, persist_directory=CHROMA_DIR)
retriever = vectorstore.as_retriever(search_kwargs={"k": 5})
# --- Code-aware prompt ---
prompt = PromptTemplate(
    template="""You are a senior developer explaining code to a teammate.
Use the code context below to answer the question.
Explain the logic, point out important patterns, and mention any gotchas.
If the answer is not in the context, say so.

Code context:
{context}

Question: {question}

Explanation:""",
    input_variables=["context", "question"],
)
llm = ChatOpenAI(model="gpt-4o", temperature=0)
qa_chain = RetrievalQA.from_chain_type(
    llm=llm,
    retriever=retriever,
    chain_type_kwargs={"prompt": prompt},
    return_source_documents=True,
)
if __name__ == "__main__":
    print("Code Explainer ready. Ask about any part of the codebase.")
    while True:
        query = input("\nQuestion: ")
        if query.strip().lower() == "quit":
            break
        result = qa_chain.invoke({"query": query})
        print(f"\n{result['result']}")
        files = sorted({d.metadata["source"] for d in result["source_documents"]})
        print(f"\nRelevant files: {', '.join(files)}")

Setup
1. Copy the agent.py content above.
2. Create a Python virtual environment and install dependencies.
3. Set your OPENAI_API_KEY environment variable.
4. Run: python agent.py
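A note on file loading: Python's glob-style matching (which DirectoryLoader relies on) does not expand brace sets such as `{py,ts,go}`, so a single pattern like `**/*.{py,ts,js,go,rs}` silently matches nothing; matching one extension at a time is the safe approach. A quick stdlib check with hypothetical file names:

```python
from fnmatch import fnmatch

# Brace sets are not part of fnmatch/glob syntax: the braces are
# treated literally, so the first pattern matches nothing.
names = ["app.py", "index.ts", "main.go"]

brace_hits = [n for n in names if fnmatch(n, "*.{py,ts,go}")]
per_ext_hits = [n for ext in ("py", "ts", "go")
                for n in names if fnmatch(n, f"*.{ext}")]

print(brace_hits)    # []
print(per_ext_hits)  # ['app.py', 'index.ts', 'main.go']
```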
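The agent splits files into 1500-character chunks with 200 characters of overlap before embedding them. The overlap keeps code that straddles a chunk boundary visible in both neighboring chunks, so a retrieval hit near a boundary still carries its surrounding context. A minimal sketch of overlap-based chunking in plain Python (not LangChain's actual algorithm, which also honors the separator list):

```python
def chunk_with_overlap(text: str, size: int, overlap: int) -> list[str]:
    """Naive fixed-size chunking where each chunk repeats the last
    `overlap` characters of the previous chunk."""
    step = size - overlap
    return [text[i:i + size]
            for i in range(0, max(len(text) - overlap, 1), step)]

print(chunk_with_overlap("abcdefghij", size=4, overlap=2))
# ['abcd', 'cdef', 'efgh', 'ghij']
```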
Run with LangChain
This is a Python script using LangChain. Set up a virtual environment and install dependencies.
# 1. Create a virtual environment
python -m venv venv && source venv/bin/activate
# 2. Install dependencies
pip install langchain langchain-openai langchain-community chromadb
# 3. Set your API key
export OPENAI_API_KEY="sk-..."
# 4. Save the agent.py from above and run it
python agent.py

Version History
v1.0.0 · Initial release · Mar 16, 2026
Framework
LangChain

Requirements
Python 3.10+
OpenAI API key
Estimated cost
~$0.15/day
on the GPT-4o model
File type
agent.py
Version
v1.0.0
Updated Mar 16, 2026
You might also like
HEARTBEAT.md
GitHub PR Reviewer
Auto-review new PRs during business hours. Post summaries, flag issues.
~$0.50/day · ~5 min setup
HEARTBEAT.md
CI Monitor
Watch GitHub Actions. Alert on failures with the relevant logs.
~$0.20/day · ~3 min setup
HEARTBEAT.md
Deploy Tracker
Track deployments across repos. Daily changelog of what shipped.
~$0.15/day · ~5 min setup