# The Dockerfile

## Dockerfile (new file)
```dockerfile
# syntax=docker/dockerfile:1.7
FROM python:3.12-slim

ENV PYTHONDONTWRITEBYTECODE=1 \
    PYTHONUNBUFFERED=1 \
    UV_PROJECT_ENVIRONMENT=.venv

ENV DEBIAN_FRONTEND=noninteractive

WORKDIR /app

# Install system tools + uv package manager
RUN apt-get update && apt-get install -y --no-install-recommends curl ca-certificates \
    && rm -rf /var/lib/apt/lists/* \
    && curl -LsSf https://astral.sh/uv/install.sh | sh \
    && ln -s /root/.local/bin/uv /usr/local/bin/uv

# Copy dependency files first to leverage Docker layer caching
COPY pyproject.toml uv.lock requirements.txt ./

# Install dependencies in .venv with uv (cached layer)
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-dev

# Copy the rest of the source code
COPY . .

# Add virtual environment to PATH
ENV PATH="/app/.venv/bin:${PATH}"

# Environment variable to be set at runtime
# ENV SERPER_API_KEY=your_key

# Default command to start MCP server via stdio
CMD ["uv","run","--with","mcp","mcp","run","server.py"]
```

## README.md (updated)

# 📚 MCP Documentation Server (Version 1)

This project is an **MCP (Model Context Protocol) Server** designed to help fetch and search the latest documentation for libraries such as **`LangChain`**, **`OpenAI`**, and **`LlamaIndex`**.
It allows you to query documentation sources and retrieve clean, plain text using a combination of the **`Google Serper API`** for search and **`BeautifulSoup`** for scraping.

⚡ This is **Version 1 (In Progress)** — future versions will expand with more features and libraries.

---

## 🚀 Features

- 🔍 Search official docs for supported libraries:
  - [LangChain](https://python.langchain.com/docs)
  - [OpenAI](https://platform.openai.com/docs)
  - [LlamaIndex](https://docs.llamaindex.ai/en/stable)
- 🌐 Fetch and clean documentation text directly from web pages.
- ⚙️ Exposed as an **MCP tool** via `server.py` (see the sketch after this list).
- ⏳ Handles timeouts gracefully to avoid breaking workflows.
- 🔑 Configurable using `.env` with your **Serper API Key**.
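
For orientation, here is a minimal sketch of how `server.py` could register that tool using `FastMCP` from the MCP Python SDK. The tool name and parameters (`get_docs`, `query`, `library`) come from this README; the `get_docs_impl` helper imported from `tools/get_docs.py` is a guess at the project layout, so treat this as illustrative rather than the actual source.

```python
# Illustrative sketch of server.py, not the project's actual code.
from mcp.server.fastmcp import FastMCP

from tools.get_docs import get_docs_impl  # assumed helper name in tools/get_docs.py

mcp = FastMCP("documentation")

@mcp.tool()
def get_docs(query: str, library: str) -> str:
    """Search the latest documentation for a supported library and return plain text."""
    return get_docs_impl(query=query, library=library)

if __name__ == "__main__":
    mcp.run(transport="stdio")
```

A server object named `mcp` like this is what the `mcp run server.py` command in the Dockerfile's CMD (and in the run instructions below) expects to find and serve over stdio.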

---

## 📂 Project Structure

```bash
documentation/
│── services/
│   │── scrape.py
│   │── search.py
│
│── tools/
│   │── get_docs.py
│
│── config.py
│── server.py
│── pyproject.toml
│── .env.example
│── .gitignore
```

---

## ⚡ Setup & Installation

This project uses [**uv**](https://docs.astral.sh/uv/) for dependency management.

### 1️⃣ Clone the repository

```bash
git clone <your-repo-url>
cd documentation
```

### 2️⃣ Create a virtual environment

```bash
uv venv
```

### 3️⃣ Activate the virtual environment

Linux/macOS:

```bash
source .venv/bin/activate
```

Windows (PowerShell):

```powershell
.venv\Scripts\Activate
```

### 4️⃣ Install dependencies

```bash
uv sync
```

---

## 🔑 Environment Variables

Copy `.env.example` → `.env` and set your Serper API key:

```bash
SERPER_API_KEY="your_api_key_here"
```

You can get an API key from [Serper.dev](https://serper.dev/).
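
The project structure above includes a `config.py`, which presumably loads this key at startup. A minimal sketch of what that could look like, assuming `python-dotenv` is used to read the `.env` file (the real `config.py` may differ):

```python
# config.py (illustrative sketch, not the project's actual code)
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # read variables from .env into the process environment

SERPER_API_KEY = os.getenv("SERPER_API_KEY")
if not SERPER_API_KEY:
    raise RuntimeError("SERPER_API_KEY is not set; copy .env.example to .env first.")
```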

---

## ▶️ Running the MCP Server

Start the server with:

```bash
uv run --with mcp mcp run server.py
```

This runs the MCP server over stdio, making it available as a tool to any MCP-compatible client.
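
If you want to see the stdio transport in action outside of Claude Desktop, the snippet below is a hypothetical client built with the `mcp` Python SDK: it spawns the same command as a subprocess, initializes a session, and calls the `get_docs` tool. It is only a sketch for local experimentation; MCP-compatible clients normally handle this wiring for you.

```python
# Illustrative only: call the server's get_docs tool from a Python MCP client.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uv",
    args=["run", "--with", "mcp", "mcp", "run", "server.py"],
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "get_docs", {"query": "embedding models", "library": "openai"}
            )
            print(result.content)

asyncio.run(main())
```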

---

## 🛠 How It Works

1. The `get_docs` MCP tool:
   - Input: `query` + `library`
   - Example: `query="embedding models"`, `library="openai"`
   - Process:
     - Builds a site-specific query (`site:platform.openai.com/docs embedding models`)
     - Fetches results using the Serper API
     - Scrapes and cleans text from the pages found
   - Output: plain-text documentation snippets
2. The tool can be integrated into other MCP-compatible apps for automated documentation search & retrieval.
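
To make these steps concrete, here is a rough sketch of the search-and-scrape pipeline (the `get_docs_impl` helper assumed in the earlier `server.py` sketch). The Serper endpoint, the `X-API-KEY` header, and the docs URLs follow what this README describes; the function names (`search_web`, `scrape_page`) and the choice of `httpx` are assumptions, so the real `services/search.py` and `services/scrape.py` may look different.

```python
# Illustrative sketch of the get_docs pipeline, not the project's actual code.
import os

import httpx
from bs4 import BeautifulSoup

# Assumed mapping from library name to its documentation site (from this README).
DOCS_URLS = {
    "langchain": "python.langchain.com/docs",
    "openai": "platform.openai.com/docs",
    "llamaindex": "docs.llamaindex.ai/en/stable",
}

def search_web(query: str, num_results: int = 3) -> list[str]:
    """Ask the Serper API for results and return the organic result URLs."""
    resp = httpx.post(
        "https://google.serper.dev/search",
        headers={"X-API-KEY": os.environ["SERPER_API_KEY"]},
        json={"q": query, "num": num_results},
        timeout=20.0,
    )
    resp.raise_for_status()
    return [item["link"] for item in resp.json().get("organic", [])]

def scrape_page(url: str) -> str:
    """Fetch a page and reduce it to plain text with BeautifulSoup."""
    try:
        resp = httpx.get(url, timeout=20.0, follow_redirects=True)
        resp.raise_for_status()
    except httpx.HTTPError:
        return ""  # a page that times out or fails is skipped, not fatal
    return BeautifulSoup(resp.text, "html.parser").get_text(" ", strip=True)

def get_docs_impl(query: str, library: str) -> str:
    """Build a site-restricted query, search, scrape, and join the results."""
    site_query = f"site:{DOCS_URLS[library]} {query}"
    pages = [scrape_page(url) for url in search_web(site_query)]
    return "\n\n".join(text for text in pages if text) or "No results found."
```

Returning an empty string for a page that fails or times out (instead of raising) is one way to realize the "handles timeouts gracefully" behavior from the feature list: a failed page simply contributes nothing to the final text.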

---

## 📌 Notes

- This is Version 1 — focused on core functionality.
- Future improvements:
  - Support for more libraries/frameworks.
  - Improved result filtering & ranking.
  - Error handling & logging improvements.
  - A possible caching layer for repeated queries.

---

## 🐳 Docker Support

This project also provides a Dockerfile so you can containerize the MCP Documentation Server.

- Why Docker?
  - Consistency: run the server with the same environment on any machine.
  - Isolation: dependencies are fully contained; no need to pollute your local Python environment.
  - Portability: easy to share or deploy on servers/CI pipelines.
  - Caching: faster rebuilds thanks to Docker layer caching.

- When do you need it?
  - If you want to run the MCP server without installing uv or Python locally.
  - If you plan to deploy on a server or inside CI/CD pipelines.
  - If you want a reproducible environment across your team.

- How to use it?

  1. Build the image:

     ```bash
     docker build -t docs-mcp:dev -f Dockerfile .
     ```

  2. Run it with your `.env` file (containing `SERPER_API_KEY`):

     ```bash
     docker run --rm -it --env-file .env docs-mcp:dev
     ```

That’s it — the server will start inside the container and be available to any MCP-compatible client (e.g. Claude Desktop).

---

## 💻 Running Locally with Claude Desktop

You can also connect this MCP Documentation Server directly to Claude Desktop on your machine.

Steps:

1. Install Claude Desktop
   - Make sure Claude Desktop is already installed on your system.

2. Enable Developer Mode
   - Open Claude Desktop.
   - Go to Developer Mode.
   - Click Add Configuration.

   This will generate a local file called:

   ```text
   claude_desktop_config.json
   ```

3. Edit the Configuration File

   Open `claude_desktop_config.json` and add a new entry like this (adjust the paths to match your machine):

   ```json
   {
     "mcpServers": {
       "documentation": {
         "command": "C:\\Users\\Alawakey\\Desktop\\MCP Server\\documentation\\.venv\\Scripts\\python.exe",
         "args": ["C:\\Users\\Alawakey\\Desktop\\MCP Server\\documentation\\server.py"],
         "cwd": "C:\\Users\\Alawakey\\Desktop\\MCP Server\\documentation"
       }
     }
   }
   ```

   - `command`: path to your Python executable inside the virtual environment.
   - `args`: path to the `server.py` file.
   - `cwd`: working directory of your project.

4. Restart Claude Desktop
   - Fully close Claude Desktop and start it again.

5. Verify the MCP Server
   - In Claude Desktop, go to Search and Tools (near the text input).
   - You should see your MCP server listed under the name you provided (`documentation` in the example above).

6. Try some queries
   - Example prompts:

   ```text
   how do i implement a chroma db in langchain?
   ```

   ```text
   Use get_docs with query="ChatOpenAI" and library="openai"
   ```

   Claude will ask for permission the first time you use the tool. After granting it, the `get_docs` tool will fetch results from the documentation.

---