mcp-fetch
Language: Python · Stars: 1 · Forks: 2
Fetch MCP Server
A Model Context Protocol server that provides web content fetching capabilities. This server enables LLMs to retrieve and process content from web pages, converting HTML to markdown for easier consumption.
The fetch tool will truncate the response, but by using the start_index argument you can specify where to start the content extraction. This lets models read a webpage in chunks until they find the information they need; a client-side sketch of this pattern follows the parameter list below.
Available Tools
fetch
- Fetches a URL from the internet and extracts its contents as markdown.
  - url (string, required): URL to fetch
  - max_length (integer, optional): Maximum number of characters to return (default: 5000)
  - start_index (integer, optional): Start content from this character index (default: 0)
  - raw (boolean, optional): Get raw content without markdown conversion (default: false)
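As an illustration of chunked reading with start_index, here is a minimal client-side sketch using the MCP Python SDK's stdio client. The server command assumes the uvx setup described under Installation, and the URL and chunk cap are placeholder assumptions, not part of this server.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the fetch server over stdio via uvx (see Installation below).
server = StdioServerParameters(command="uvx", args=["mcp-server-fetch"])


async def read_in_chunks(url: str, chunk_size: int = 5000, max_chunks: int = 4) -> str:
    """Read a page piece by piece by advancing start_index on each call."""
    parts: list[str] = []
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            for i in range(max_chunks):
                result = await session.call_tool(
                    "fetch",
                    {"url": url, "max_length": chunk_size, "start_index": i * chunk_size},
                )
                # Collect the text content returned for this chunk.
                chunk = "".join(c.text for c in result.content if c.type == "text")
                if not chunk:
                    break  # nothing more to read
                parts.append(chunk)
    return "".join(parts)


if __name__ == "__main__":
    # Placeholder URL used only for illustration.
    print(asyncio.run(read_in_chunks("https://example.com")))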
Prompts
- fetch
  - Fetch a URL and extract its contents as markdown
  - Arguments:
    - url (string, required): URL to fetch
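If a client drives the server through prompts rather than tool calls, the same Python session sketched above can request this prompt. This snippet assumes the SDK's get_prompt method and reuses an initialized session from the earlier example:

```python
# Reusing an initialized `session` from the earlier sketch; the URL is a placeholder.
prompt = await session.get_prompt("fetch", arguments={"url": "https://example.com"})
for message in prompt.messages:
    print(message.role, message.content)
```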
Installation
Optionally: install Node.js; this causes the fetch server to use a different, more robust HTML simplifier.
Using uv (recommended)
When using uv no specific installation is needed. We will use uvx to directly run mcp-server-fetch.
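For example, a Claude Desktop-style mcpServers entry for this setup launches the server through uvx. This is a sketch of the usual shape; the exact configuration file and surrounding keys depend on the client:

```json
"mcpServers": {
  "fetch": {
    "command": "uvx",
    "args": ["mcp-server-fetch"]
  }
}
```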
Using PIP
Alternatively you can install mcp-server-fetch via pip:

pip install mcp-server-fetch
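After a pip installation, the server is typically started as a Python module (python -m mcp_server_fetch), and the corresponding client entry swaps uvx for the installed interpreter. A sketch, assuming the package exposes the mcp_server_fetch module:

```json
"mcpServers": {
  "fetch": {
    "command": "python",
    "args": ["-m", "mcp_server_fetch"]
  }
}
```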