Higress AI-Search MCP Server

Overview

A Model Context Protocol (MCP) server that provides an AI search tool to enhance AI model responses with real-time search results from various search engines through the Higress ai-search feature.

Demo

Cline

https://github.com/user-attachments/assets/60a06d99-a46c-40fc-b156-793e395542bb

Claude Desktop

https://github.com/user-attachments/assets/5c9e639f-c21c-4738-ad71-1a88cc0bcb46

Features

  • Internet Search: Google, Bing, Quark - for general web information
  • Academic Search: Arxiv - for scientific papers and research
  • Internal Knowledge Search - for documents in internal knowledge bases described via INTERNAL_KNOWLEDGE_BASES

Prerequisites

Configuration

The server can be configured using environment variables:

  • HIGRESS_URL (optional): URL for the Higress service (default: http://localhost:8080/v1/chat/completions).
  • MODEL (required): LLM model to use for generating responses.
  • INTERNAL_KNOWLEDGE_BASES (optional): Description of internal knowledge bases.
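
As a rough illustration of how these variables are typically consumed, here is a minimal Python sketch (not the server's actual startup code; the names follow the list above):

import os

# HIGRESS_URL falls back to the documented default when unset.
higress_url = os.environ.get("HIGRESS_URL", "http://localhost:8080/v1/chat/completions")

# MODEL is required, so fail fast with a clear error if it is missing.
model = os.environ.get("MODEL")
if not model:
    raise RuntimeError("MODEL environment variable is required")

# INTERNAL_KNOWLEDGE_BASES is optional; an empty string means no internal knowledge search.
internal_kbs = os.environ.get("INTERNAL_KNOWLEDGE_BASES", "")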

Option 1: Using uvx

Using uvx automatically installs the package from PyPI; there is no need to clone the repository locally.

{
  "mcpServers": {
    "higress-ai-search-mcp-server": {
      "command": "uvx",
      "args": [
        "higress-ai-search-mcp-server"
      ],
      "env": {
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo",

            
        
            
                        "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
      }
    }
  }
}
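
Add this block to your MCP client's configuration, for example Claude Desktop's claude_desktop_config.json or Cline's MCP server settings, as shown in the demos above.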

Option 2: Using uv with local development

Using uv requires cloning the repository locally and specifying the path to the source code.

{
  "mcpServers": {
    "higress-ai-search-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "path/to/src/higress-ai-search-mcp-server",
        "run",
        "higress-ai-search-mcp-server"
      ],
      "env": {
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo",
        "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
      }
    }
  }
}
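
For reference, the search tool ultimately talks to the OpenAI-compatible chat-completions endpoint exposed by Higress (the default HIGRESS_URL above). The snippet below is only a hand-written sketch of such a request, using the requests library and the qwen-turbo model from the examples; it is not the server's internal code:

import os
import requests

higress_url = os.environ.get("HIGRESS_URL", "http://localhost:8080/v1/chat/completions")

payload = {
    "model": os.environ.get("MODEL", "qwen-turbo"),
    "messages": [
        {"role": "user", "content": "What is the latest Higress release?"}
    ],
}

# The Higress ai-search plugin enriches the request with real-time search results
# before the LLM answers, so the response reflects fresh web content.
response = requests.post(higress_url, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])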

License

This project is licensed under the MIT License - see the LICENSE file for details.

