
mcp-server-collector

Created Oct 19, 2025 by chatmcp · Language: Python · Stars: 18 · Forks: 4

README

mcp-server-collector MCP server

An MCP Server used to collect MCP Servers from across the internet.

Components

Resources

No resources yet.

Prompts

No prompts yet.

Tools

The server implements three tools (a client-side invocation sketch follows the list):

  • extract-mcp-servers-from-url: Extracts MCP Servers from a given URL.
    • Takes "url" as a required string argument
  • extract-mcp-servers-from-content: Extracts MCP Servers from given content.
    • Takes "content" as a required string argument
  • submit-mcp-server: Submits an MCP Server to an MCP Server Directory such as mcp.so.
    • Takes "url" as a required string argument and "avatar_url" as an optional string argument

Configuration

A .env file must be set up with the following variables:

OPENAI_API_KEY="sk-xxx"
OPENAI_BASE_URL="https://api.openai.com/v1"
OPENAI_MODEL="gpt-4o-mini"

MCP_SERVER_SUBMIT_URL="https://mcp.so/api/submit-project"
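
For reference, these variables can be read at runtime with python-dotenv; this is only a sketch of the usual pattern, not necessarily how this server loads its configuration.

import os

from dotenv import load_dotenv  # provided by the python-dotenv package

# Load key=value pairs from a .env file in the working directory into the environment.
load_dotenv()

openai_api_key = os.environ["OPENAI_API_KEY"]
openai_base_url = os.getenv("OPENAI_BASE_URL", "https://api.openai.com/v1")
openai_model = os.getenv("OPENAI_MODEL", "gpt-4o-mini")
submit_url = os.getenv("MCP_SERVER_SUBMIT_URL", "https://mcp.so/api/submit-project")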

Quickstart

Install

Claude Desktop

On MacOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json

Development/Unpublished Servers Configuration

  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "mcp-server-collector": {
      "command": "uv",
      "args": [
        "--directory",
        "path-to/mcp-server-collector",
        "run",
        "mcp-server-collector"
      ],
      "env": {
        "OPENAI_API_KEY": "sk-xxx",
        "OPENAI_BASE_URL": "https://api.openai.com/v1",
        "OPENAI_MODEL": "gpt-4o-mini",
        "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
      }
    }
  }

Published Servers Configuration

  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "mcp-server-collector": {
      "command": "uvx",
      "args": [
        "mcp-server-collector"
      ],
      "env": {
        "OPENAI_API_KEY": "sk-xxx",
        "OPENAI_BASE_URL": "https://api.openai.com/v1",
        "OPENAI_MODEL": "gpt-4o-mini",
        "MCP_SERVER_SUBMIT_URL": "https://mcp.so/api/submit-project"
      }
    }
  }

Development

Building and Publishing

To prepare the package for distribution:

  1. Sync dependencies and update the lockfile:
uv sync
  2. Build package distributions:
uv build

This will create source and wheel distributions in the dist/ directory.

  3. Publish to PyPI:
uv publish

Note: You'll need to set PyPI credentials via environment variables or command flags (see the example after this list):

  • Token: --token or UV_PUBLISH_TOKEN
  • Or username/password: --username/UV_PUBLISH_USERNAME and --password/UV_PUBLISH_PASSWORD
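
For example, publishing with a token (the value shown is a placeholder):

UV_PUBLISH_TOKEN="pypi-xxxxxxxx" uv publish

or, equivalently:

uv publish --token pypi-xxxxxxxx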

Debugging

Since MCP servers run over stdio, debugging can be challenging. For the best debugging experience, we strongly recommend using the MCP Inspector.

You can launch the MCP Inspector via npm with this command:

npx @modelcontextprotocol/inspector uv --directory path-to/mcp-server-collector run mcp-server-collector

Upon launching, the Inspector will display a URL that you can access in your browser to begin debugging.
