A simple LangGraph agent quickstart template that demonstrates the basic structure of building an AI agent with the LangGraph framework.
- 🚀 Quick Start: Get started with LangGraph in minutes
- 🔧 Modular Design: Clean separation of concerns with dedicated modules for graph, state, tools, and prompts
- 🛠️ Tool Integration: Built-in example tools (current time getter)
- 🎯 ReAct Agent: Uses LangGraph's prebuilt ReAct agent pattern
- 📦 Modern Package Management: Uses `uv` for fast and reliable dependency management
- 🔑 Flexible API Support: Compatible with the OpenAI API and the DeepSeek API
```
src/
├── agent/
│   ├── __init__.py
│   ├── graph.py      # Main LangGraph graph definition
│   ├── state.py      # Agent state management
│   ├── tools.py      # Custom tools and functions
│   ├── prompts.py    # System prompts and templates
│   └── utils.py      # Utility functions for agent creation
├── langgraph.json    # LangGraph configuration
└── pyproject.toml    # Project dependencies and metadata
```
- Python >= 3.13
- `uv` package manager
- OpenAI API key or DeepSeek API key
- Clone the repository

  ```bash
  git clone https://github.com/hfyydd/a_simple_agent_quickstart.git
  cd a_simple_agent_quickstart
  ```

- Install dependencies using `uv`

  ```bash
  uv sync
  ```

- Set up environment variables

  ```bash
  # Copy the example environment file
  cp .env.example .env
  ```

  Then edit `.env` and add your API keys:

  ```bash
  # For OpenAI:
  OPENAI_API_KEY=your_openai_api_key_here
  MODEL_NAME=gpt-4

  # For DeepSeek:
  DEEPSEEK_API_KEY=your_deepseek_api_key_here
  DEEPSEEK_BASE_URL=https://api.deepseek.com
  DEEPSEEK_MODEL=deepseek-chat
  ```
Run the agent in development mode with the LangGraph CLI:

```bash
uv run langgraph dev
```

This starts the development server; you can then interact with your agent through the LangGraph Studio interface.
You can also use the agent directly in your Python code:
```python
import asyncio

from src.agent.graph import graph

async def main():
    # Run the agent with a single user message
    result = await graph.ainvoke({
        "messages": [{"role": "user", "content": "What time is it?"}]
    })
    # State messages are LangChain message objects, so read `.content`
    print(result["messages"][-1].content)

asyncio.run(main())
```
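Compiled graphs also support streaming. A minimal sketch using `astream` with `stream_mode="values"`, which yields the full state after each step (run it inside an async function, as above):

```python
# Inside an async function, as in main() above.
async for state in graph.astream(
    {"messages": [{"role": "user", "content": "What time is it?"}]},
    stream_mode="values",
):
    # Print the most recent message after each step of the graph
    state["messages"][-1].pretty_print()
```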
`src/agent/state.py` defines the agent's state structure using TypedDict:

```python
class State(TypedDict):
    messages: Annotated[list, add_messages]
```
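The `add_messages` reducer controls how updates to `messages` are merged: messages returned by a node are appended to the existing history (or replace a message with the same id) rather than overwriting the list. A small, self-contained illustration:

```python
from langchain_core.messages import AIMessage, HumanMessage
from langgraph.graph.message import add_messages

history = [HumanMessage(content="What time is it?")]
update = [AIMessage(content="It is 12:00 UTC.")]

# This is what happens under the hood when a node returns {"messages": [...]}
merged = add_messages(history, update)
print([m.content for m in merged])  # ['What time is it?', 'It is 12:00 UTC.']
```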
`src/agent/graph.py` contains the main LangGraph graph with:

- Node definitions (`chatbot_node`)
- Edge connections
- Graph compilation
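For orientation, here is a minimal sketch of how a graph like this is typically wired up; the node body below is a placeholder, not the repository's actual implementation:

```python
from langgraph.graph import StateGraph, START, END

from src.agent.state import State

async def chatbot_node(state: State):
    # Placeholder: in the template, this is where the ReAct agent / LLM is invoked
    return {"messages": []}

builder = StateGraph(State)
builder.add_node("chatbot", chatbot_node)
builder.add_edge(START, "chatbot")   # START/END are LangGraph's built-in entry/exit markers
builder.add_edge("chatbot", END)

# Compile into the runnable `graph` object imported elsewhere in this README
graph = builder.compile()
```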
`src/agent/tools.py` defines the custom tools available to the agent:

- `get_current_time`: Returns the current timestamp
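The repository's implementation isn't reproduced in this README, but a tool like this is typically just a decorated function; a sketch, assuming the `@tool` decorator from `langchain_core.tools`:

```python
from datetime import datetime, timezone

from langchain_core.tools import tool

@tool
def get_current_time() -> str:
    """Return the current UTC time as an ISO 8601 timestamp."""
    return datetime.now(timezone.utc).isoformat()
```

Note that the docstring doubles as the tool description the model uses when deciding whether to call the tool.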
`src/agent/prompts.py` contains the system prompts and templates for agent behavior customization.
`src/agent/utils.py` provides utility functions for agent creation and configuration.
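The helper itself isn't shown in this README, but given the `create_agent("chatbot", llm, [...], system_prompt)` call used below, a thin wrapper around LangGraph's prebuilt ReAct agent would fit. A sketch under that assumption:

```python
from langgraph.prebuilt import create_react_agent

def create_agent(name: str, llm, tools: list, system_prompt: str):
    """Sketch: build a prebuilt ReAct agent from an LLM, tools, and a system prompt.

    `name` is accepted to mirror the call in graph.py; how the template uses it
    is not shown here.
    """
    # `prompt=` is the system-prompt hook in recent langgraph releases
    # (older releases used `state_modifier=` instead).
    return create_react_agent(llm, tools, prompt=system_prompt)
```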
To add a new tool to the agent:

- Define your tool in `src/agent/tools.py`:

  ```python
  from langchain_core.tools import tool

  @tool
  def my_custom_tool(input: str) -> str:
      """Description of what this tool does."""
      # Your tool logic here
      result = f"Processed: {input}"
      return result
  ```
- Import and add it to the tools list in `graph.py`:

  ```python
  from src.agent.tools import get_current_time, my_custom_tool

  agent = create_agent("chatbot", llm, [get_current_time, my_custom_tool], system_prompt)
  ```
Edit `src/agent/prompts.py` to customize the agent's behavior:

```python
system_prompt = """
Your custom system prompt here.
Define the agent's role, capabilities, and behavior.
"""
```
Extend the graph by adding new nodes in `graph.py`:

```python
async def my_new_node(state: State):
    # Your node logic here; return new messages to append to the state
    new_message = {"role": "assistant", "content": "..."}
    return {"messages": [new_message]}

builder.add_node("my_node", my_new_node)
builder.add_edge("chatbot", "my_node")
```
The agent supports multiple LLM providers:
- OpenAI: Set `OPENAI_API_KEY` and `MODEL_NAME`
- DeepSeek: Set `DEEPSEEK_API_KEY`, `DEEPSEEK_BASE_URL`, and `DEEPSEEK_MODEL`
- Other OpenAI-compatible APIs: Configure the base URL and API key accordingly
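How the template maps these variables onto a model object isn't shown in this README; one common pattern, assuming the `langchain_openai` package (DeepSeek exposes an OpenAI-compatible endpoint, so the same client works with a custom `base_url`):

```python
import os

from langchain_openai import ChatOpenAI

# Sketch: prefer DeepSeek when its key is set, otherwise fall back to OpenAI.
if os.getenv("DEEPSEEK_API_KEY"):
    llm = ChatOpenAI(
        model=os.getenv("DEEPSEEK_MODEL", "deepseek-chat"),
        api_key=os.getenv("DEEPSEEK_API_KEY"),
        base_url=os.getenv("DEEPSEEK_BASE_URL", "https://api.deepseek.com"),
    )
else:
    llm = ChatOpenAI(
        model=os.getenv("MODEL_NAME", "gpt-4"),
        api_key=os.getenv("OPENAI_API_KEY"),
    )
```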
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
This project is open source and available under the MIT License.
If you encounter any issues or have questions, please:
- Check the LangGraph documentation
- Open an issue in this repository
- Join the LangChain community discussions