langchain-mcp-tools-py-usage
Simple MCP Client Using LangChain / Python 
This simple Model Context Protocol (MCP) client demonstrates the use of MCP server tools by a LangChain ReAct agent.

It leverages the utility function `convert_mcp_to_langchain_tools()` from `langchain_mcp_tools`. This function handles the parallel initialization of multiple specified MCP servers and converts their available tools into a list of LangChain-compatible tools (`List[BaseTool]`).

Anthropic's `claude-3-5-sonnet-latest` is used as the LLM. For convenience, code for OpenAI's `gpt-4o` is also included and commented out.
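As a rough sketch of how these pieces fit together, the snippet below wires MCP server tools into a LangGraph ReAct agent. The server configuration (a filesystem MCP server launched via `npx`) and the two-value return of `convert_mcp_to_langchain_tools()` (tools plus an async cleanup callback) are assumptions for illustration; see the actual source for the exact setup.

```python
import asyncio

from dotenv import load_dotenv
from langchain_anthropic import ChatAnthropic
from langgraph.prebuilt import create_react_agent

from langchain_mcp_tools import convert_mcp_to_langchain_tools


async def main() -> None:
    load_dotenv()  # read ANTHROPIC_API_KEY (or OPENAI_API_KEY) from .env

    # Example MCP server configuration (assumed for illustration):
    # a filesystem server run via npx, scoped to the current directory.
    mcp_servers = {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
        },
    }

    # Initialize the configured MCP servers in parallel and collect their
    # tools as LangChain-compatible tools, plus a cleanup callback.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    try:
        llm = ChatAnthropic(model="claude-3-5-sonnet-latest")
        # llm = ChatOpenAI(model="gpt-4o")  # OpenAI alternative

        agent = create_react_agent(llm, tools)
        result = await agent.ainvoke(
            {"messages": [("user", "List the files in the current directory")]}
        )
        print(result["messages"][-1].content)
    finally:
        await cleanup()  # shut down the MCP server subprocesses


if __name__ == "__main__":
    asyncio.run(main())
```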
A bit more realistic (conversational) MCP client is available here.
A TypeScript equivalent of this MCP client is available here.
Prerequisites
- Python 3.11+
- [optional] `uv` (`uvx`) installed to run Python package-based MCP servers
- [optional] npm 7+ (`npx`) to run Node.js package-based MCP servers
- API key from Anthropic (or OpenAI)
Usage
- Install dependencies: `make install`
- Set up the API key: `cp .env.template .env`
  - Update `.env` as needed (see the example after these steps). `.gitignore` is configured to ignore `.env` to prevent accidental commits of the credentials.
- Run the app: `make start`
  (It takes a while on the first run.)
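For reference, a minimal `.env` might look like the sketch below. This assumes the template uses the `ANTHROPIC_API_KEY` / `OPENAI_API_KEY` variable names conventionally read by the LangChain Anthropic and OpenAI integrations; check `.env.template` for the actual names.

```
# Required when using Anthropic's claude-3-5-sonnet-latest
ANTHROPIC_API_KEY=sk-ant-your-key-here
# Only needed if the commented-out gpt-4o code is enabled
OPENAI_API_KEY=sk-your-key-here
```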