mcp-client-langchain-py

Created 1/12/2025 by hideya

Categories: langchain, mcp, mcp-client, modelcontextprotocol, python, tool-calling
Language: Python
Stars: 6
Forks: 2

MCP Client Using LangChain / Python

License: MIT

This simple Model Context Protocol (MCP) client demonstrates the use of MCP server tools by a LangChain ReAct agent.

It leverages the utility function convert_mcp_to_langchain_tools() from langchain_mcp_tools.
This function initializes the specified MCP servers in parallel and converts their available tools into a list of LangChain-compatible tools (List[BaseTool]).
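
For illustration, the overall flow might look like the sketch below. The exact signature of convert_mcp_to_langchain_tools(), including the returned cleanup callback, is assumed here, so consult the langchain_mcp_tools documentation for the authoritative API:

```python
# A minimal sketch (assumed API; not the app's actual code) of wiring
# MCP server tools into a LangChain/LangGraph ReAct agent.
# Assumes ANTHROPIC_API_KEY is set in the environment.
import asyncio

from langchain_anthropic import ChatAnthropic
from langchain_mcp_tools import convert_mcp_to_langchain_tools
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    # MCP server definitions, in the same per-server shape used by
    # llm_mcp_config.json5 / Claude for Desktop.
    mcp_servers = {
        "fetch": {
            "command": "uvx",
            "args": ["mcp-server-fetch"],
        },
    }

    # Starts the listed servers in parallel and wraps their tools as
    # LangChain-compatible tools; `cleanup` (assumed) closes the sessions.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)

    try:
        llm = ChatAnthropic(model="claude-3-5-haiku-latest")  # illustrative model
        agent = create_react_agent(llm, tools)
        result = await agent.ainvoke(
            {"messages": [("user", "Fetch https://example.com and summarize it")]}
        )
        print(result["messages"][-1].content)
    finally:
        await cleanup()


if __name__ == "__main__":
    asyncio.run(main())
```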

LLMs from Anthropic, OpenAI and Groq are currently supported.

A TypeScript version of this MCP client is available here.

Prerequisites

  • Python 3.11+
  • [optional] uv (uvx) installed to run Python package-based MCP servers
  • [optional] npm 7+ (npx) to run Node.js package-based MCP servers
  • API keys from Anthropic, OpenAI, and/or Groq as needed

Setup

  1. Install dependencies:

    make install
    
  2. Set up API keys:

    cp .env.template .env
    
    • Update .env as needed.
    • .gitignore is configured to ignore .env to prevent accidental commits of the credentials.
  3. Configure LLM and MCP server settings in llm_mcp_config.json5 as needed (a sample sketch follows this list).

    • The configuration file format for MCP servers follows the same structure as [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user), with one difference: the key name mcpServers has been changed to mcp_servers to follow the snake_case convention commonly used in JSON configuration files.

    • The file format is JSON5, where comments and trailing commas are allowed.

    • The format is further extended to replace ${...} notations with the values of corresponding environment variables.

    • Keep all the credentials and private info in the .env file and refer to them with ${...} notation as needed.
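
To make the format concrete, here is a hedged sketch of what llm_mcp_config.json5 might look like. The mcp_servers block follows the Claude for Desktop structure described above, while the other key names (llm, model, example_queries) are illustrative guesses, so check the template shipped with the repository for the actual schema:

```json5
// Hypothetical llm_mcp_config.json5 -- keys other than "mcp_servers" and the
// per-server command/args/env fields are illustrative, not the actual schema.
{
  "llm": {
    "model": "claude-3-5-haiku-latest",  // which LLM to use (illustrative key)
  },

  // Queries offered when you just press Enter at the prompt (illustrative key).
  "example_queries": [
    "Summarize the file ./README.md",
  ],

  "mcp_servers": {
    // Same per-server shape as Claude for Desktop's "mcpServers" entries.
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"],
    },

    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        // ${...} is replaced with the environment variable's value,
        // so the key itself stays in .env (e.g. BRAVE_API_KEY=...).
        "BRAVE_API_KEY": "${BRAVE_API_KEY}",
      },
    },
  },
}
```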

Usage

Run the app:

make start

It takes a while on the first run.

Run in verbose mode:

make start-v

See command-line options:

make start-h

At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations.

Example queries can be configured in llm_mcp_config.json5.

Last updated: 3/31/2025

