# pyserver

**Repository Path**: mr_jianlong/pyserver

## Basic Information

- **Project Name**: pyserver
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2026-03-26
- **Last Updated**: 2026-03-26

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# PyServer - DeepSeek API Proxy

FastAPI backend server that provides OpenAI-style HTTP interfaces for the DeepSeek AI API.

## Features

- 🚀 FastAPI with automatic OpenAPI documentation
- 🔌 DeepSeek API integration
- 🔒 Environment-based configuration
- 📝 Structured logging
- 🌐 CORS support
- 🐳 Poetry for dependency management

## Prerequisites

- Python 3.11+
- Conda (for environment management)
- Poetry (for dependency management)
- DeepSeek API key

## Installation

### 1. Clone the repository

```bash
git clone https://gitee.com/mr_jianlong/pyserver.git
cd pyserver
```

### 2. Create and activate a Conda environment

```bash
conda create -n py311 python=3.11
conda activate py311
```

### 3. Install Poetry

```bash
pip install poetry
```

### 4. Install dependencies

```bash
poetry install
```

### 5. Configure environment variables

Copy the example environment file and fill in your DeepSeek API key:

```bash
cp .env.example .env
```

Edit the `.env` file:

```env
DEEPSEEK_API_KEY=your_deepseek_api_key_here
DEEPSEEK_API_BASE_URL=https://api.deepseek.com
DEEPSEEK_API_MODEL=deepseek-chat
SERVER_HOST=0.0.0.0
SERVER_PORT=8000
SERVER_RELOAD=true
LOG_LEVEL=INFO
```

**Note:** You need a valid DeepSeek API key from the [DeepSeek Platform](https://platform.deepseek.com/).
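The server reads these variables at startup. As a minimal sketch of how such environment-based configuration can be loaded (stdlib only; the key names mirror `.env.example` above, but the `load_config` helper and its defaults are illustrative — the actual project may use a settings library instead):

```python
import os

def load_config() -> dict:
    """Read server configuration from environment variables,
    falling back to the defaults shown in .env.example."""
    return {
        "api_key": os.environ["DEEPSEEK_API_KEY"],  # required; raises KeyError if unset
        "base_url": os.environ.get("DEEPSEEK_API_BASE_URL", "https://api.deepseek.com"),
        "model": os.environ.get("DEEPSEEK_API_MODEL", "deepseek-chat"),
        "host": os.environ.get("SERVER_HOST", "0.0.0.0"),
        "port": int(os.environ.get("SERVER_PORT", "8000")),
        "reload": os.environ.get("SERVER_RELOAD", "true").lower() == "true",
        "log_level": os.environ.get("LOG_LEVEL", "INFO"),
    }
```

Failing fast on a missing `DEEPSEEK_API_KEY` (rather than defaulting it) surfaces misconfiguration at startup instead of on the first proxied request.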
## Running the Server

### Development mode

```bash
poetry run python -m app.main
```

Or using uvicorn directly:

```bash
poetry run uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
```

### Production mode

```bash
poetry run uvicorn app.main:app --host 0.0.0.0 --port 8000 --workers 4
```

## API Documentation

Once the server is running, visit:

- OpenAPI Docs: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- OpenAPI JSON: http://localhost:8000/openapi.json

## API Endpoints

### DeepSeek-style API

#### Chat Completion (Non-Streaming)

**POST** `/api/v1/chat/completions`

Request body:

```json
{
  "messages": [
    {
      "role": "user",
      "content": "Hello, how are you?"
    }
  ],
  "temperature": 0.7,
  "max_tokens": 1000,
  "stream": false
}
```

Response:

```json
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "deepseek-chat",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "I'm doing well, thank you for asking!"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 12,
    "total_tokens": 21
  }
}
```

#### Chat Completion (Streaming)

**POST** `/api/v1/chat/completions` with `stream: true`

Request body:

```json
{
  "messages": [
    {
      "role": "user",
      "content": "Hello, how are you?"
    }
  ],
  "temperature": 0.7,
  "max_tokens": 1000,
  "stream": true
}
```

Response (Server-Sent Events stream):

```
data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"deepseek-chat","choices":[{"index":0,"delta":{"role":"assistant","content":"I"},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"deepseek-chat","choices":[{"index":0,"delta":{"content":"'m"},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"deepseek-chat","choices":[{"index":0,"delta":{"content":" doing"},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"deepseek-chat","choices":[{"index":0,"delta":{"content":" well"},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"deepseek-chat","choices":[{"index":0,"delta":{"content":"!"},"finish_reason":"stop"}]}

data: [DONE]
```

#### Explicit Streaming Endpoint

**POST** `/api/v1/chat/completions/stream`

Same request body as above (the `stream` parameter is ignored; the response always streams).

### Anthropic-style API

#### Create Message (Non-Streaming)

**POST** `/api/v1/anthropic/v1/messages`

Request body:

```json
{
  "model": "claude-3-5-sonnet-20241022",
  "messages": [
    {
      "role": "user",
      "content": "Hello, Claude!"
    }
  ],
  "max_tokens": 1024,
  "temperature": 1.0,
  "stream": false
}
```

Response:

```json
{
  "id": "msg_01X2Y3Z4",
  "type": "message",
  "role": "assistant",
  "content": [
    {
      "type": "text",
      "text": "Hello! How can I help you today?"
    }
  ],
  "model": "claude-3-5-sonnet-20241022",
  "stop_reason": "end_turn",
  "stop_sequence": null,
  "usage": {
    "input_tokens": 10,
    "output_tokens": 12
  }
}
```

#### Stream Message

**POST** `/api/v1/anthropic/v1/messages/stream`

Request body (same as non-streaming, with `stream: true`):

```json
{
  "model": "claude-3-5-sonnet-20241022",
  "messages": [
    {
      "role": "user",
      "content": "Hello, Claude!"
    }
  ],
  "max_tokens": 1024,
  "temperature": 1.0,
  "stream": true
}
```

Response (Server-Sent Events):

```
event: {"type": "message_start", "message": {"id": "msg_01X2Y3Z4", "type": "message", "role": "assistant", "content": [{"type": "text", "text": ""}], "model": "claude-3-5-sonnet-20241022", "usage": {"input_tokens": 10, "output_tokens": 0}}}

event: {"type": "content_block_delta", "index": 0, "delta": {"type": "text_delta", "text": "Hello"}}

event: {"type": "content_block_delta", "index": 0, "delta": {"type": "text_delta", "text": "!"}}

event: {"type": "message_stop"}
```

#### List Models

**GET** `/api/v1/anthropic/v1/models`

Response:

```json
{
  "data": [
    {
      "id": "claude-3-5-sonnet-20241022",
      "object": "model",
      "created": 1698962400,
      "owned_by": "anthropic",
      "description": "Claude 3.5 Sonnet (October 2024)"
    },
    {
      "id": "claude-3-opus-20240229",
      "object": "model",
      "created": 1698962400,
      "owned_by": "anthropic",
      "description": "Claude 3 Opus (February 2024)"
    },
    {
      "id": "claude-3-haiku-20240307",
      "object": "model",
      "created": 1698962400,
      "owned_by": "anthropic",
      "description": "Claude 3 Haiku (March 2024)"
    }
  ],
  "object": "list"
}
```

#### Legacy Completion (Backward Compatibility)

**POST** `/api/v1/anthropic/v1/complete`

Request body:

```json
{
  "model": "claude-3-5-sonnet-20241022",
  "prompt": "\n\nHuman: Hello, Claude!\n\nAssistant:",
  "max_tokens_to_sample": 1024,
  "temperature": 1.0,
  "stream": false
}
```
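The DeepSeek-style streaming responses above are plain SSE `data:` lines that a client reassembles into the final reply. A minimal stdlib-only sketch of that accumulation step (the `accumulate_sse` helper is illustrative, not part of the project):

```python
import json

def accumulate_sse(lines):
    """Fold OpenAI-style SSE 'data:' lines into the assistant's full reply.

    Stops at the [DONE] sentinel and ignores non-data lines
    (blank keep-alives, comments, event names)."""
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        parts.append(delta.get("content", ""))
    return "".join(parts)
```

Applied to the five example chunks above, the deltas `"I"`, `"'m"`, `" doing"`, `" well"`, `"!"` concatenate back into `"I'm doing well!"`.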
### Client Usage Examples

#### DeepSeek-style API

##### Python with requests (non-streaming):

```python
import requests

response = requests.post(
    "http://localhost:8000/api/v1/chat/completions",
    json={
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": False
    }
)
print(response.json())
```

##### Python with requests (streaming):

```python
import requests
import json

response = requests.post(
    "http://localhost:8000/api/v1/chat/completions",
    json={
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": True
    },
    stream=True
)

for line in response.iter_lines():
    if line:
        line = line.decode('utf-8')
        if line.startswith('data: '):
            data = line[6:]
            if data == '[DONE]':
                break
            try:
                chunk = json.loads(data)
                if 'choices' in chunk and chunk['choices']:
                    delta = chunk['choices'][0].get('delta', {})
                    if 'content' in delta:
                        print(delta['content'], end='', flush=True)
            except json.JSONDecodeError:
                pass
```

##### JavaScript/TypeScript (streaming):

```javascript
async function streamChat() {
  const response = await fetch('http://localhost:8000/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      messages: [{ role: 'user', content: 'Hello' }],
      stream: true
    })
  });

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    const chunk = decoder.decode(value);
    const lines = chunk.split('\n');

    for (const line of lines) {
      if (line.startsWith('data: ')) {
        const data = line.slice(6);
        if (data === '[DONE]') {
          return;
        }
        try {
          const parsed = JSON.parse(data);
          if (parsed.choices?.[0]?.delta?.content) {
            console.log(parsed.choices[0].delta.content);
          }
        } catch (e) {
          console.error('Parse error:', e);
        }
      }
    }
  }
}
```

#### Anthropic-style API

##### Python with requests (non-streaming):

```python
import requests

response = requests.post(
    "http://localhost:8000/api/v1/anthropic/v1/messages",
    headers={
        "anthropic-version": "2024-10-22",
        "Content-Type": "application/json"
    },
    json={
        "model": "claude-3-5-sonnet-20241022",
        "messages": [{"role": "user", "content": "Hello, Claude!"}],
        "max_tokens": 1024,
        "stream": False
    }
)
print(response.json())
```

##### Python with requests (streaming):

```python
import requests
import json

response = requests.post(
    "http://localhost:8000/api/v1/anthropic/v1/messages/stream",
    headers={
        "anthropic-version": "2024-10-22",
        "Content-Type": "application/json",
        "Accept": "text/event-stream"
    },
    json={
        "model": "claude-3-5-sonnet-20241022",
        "messages": [{"role": "user", "content": "Hello, Claude!"}],
        "max_tokens": 1024,
        "stream": True
    },
    stream=True
)

for line in response.iter_lines():
    if line:
        line = line.decode('utf-8')
        if line.startswith('event: '):
            data = line[7:]  # Remove "event: " prefix
            try:
                event = json.loads(data)
                if event.get("type") == "content_block_delta":
                    if event.get("delta", {}).get("type") == "text_delta":
                        print(event["delta"]["text"], end='', flush=True)
            except json.JSONDecodeError:
                pass
```

##### JavaScript/TypeScript (Anthropic streaming):

```javascript
async function streamAnthropic() {
  const response = await fetch('http://localhost:8000/api/v1/anthropic/v1/messages/stream', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'anthropic-version': '2024-10-22',
      'Accept': 'text/event-stream'
    },
    body: JSON.stringify({
      model: 'claude-3-5-sonnet-20241022',
      messages: [{
        role: 'user',
        content: 'Hello, Claude!'
      }],
      max_tokens: 1024,
      stream: true
    })
  });

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    const chunk = decoder.decode(value);
    const lines = chunk.split('\n');

    for (const line of lines) {
      if (line.startsWith('event: ')) {
        const data = line.slice(7);
        try {
          const event = JSON.parse(data);
          if (event.type === 'content_block_delta' && event.delta?.type === 'text_delta') {
            console.log(event.delta.text);
          }
        } catch (e) {
          console.error('Parse error:', e);
        }
      }
    }
  }
}
```

##### Using the official Anthropic SDK (compatible):

```python
import anthropic

# Point the client at the proxy server
client = anthropic.Anthropic(
    api_key="your-api-key",  # Not used by the proxy, but required by the SDK
    base_url="http://localhost:8000/api/v1/anthropic"
)

# Non-streaming
message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude!"}
    ]
)
print(message.content)

# Streaming
with client.messages.stream(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude!"}
    ]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```

### Health Checks

- **GET** `/` - Root endpoint
- **GET** `/health` - Health check
- **GET** `/api/v1/chat/health` - Chat service health

## Project Structure

```
pyserver/
├── app/
│   ├── api/
│   │   └── v1/
│   │       ├── endpoints/
│   │       │   └── chat.py
│   │       └── router.py
│   ├── core/
│   │   └── config.py
│   ├── models/
│   │   └── chat.py
│   ├── services/
│   │   └── deepseek_service.py
│   └── main.py
├── .env.example
├── .gitignore
├── pyproject.toml
├── poetry.lock
└── README.md
```

## Development

### Adding new endpoints

1. Create a new endpoint file in `app/api/v1/endpoints/`
2. Register its router in `app/api/v1/router.py`
3. Update models in `app/models/` if needed

### Running tests

```bash
poetry run pytest
```

### Code formatting

```bash
poetry run black .
poetry run isort .
```

## Deployment

### Docker

```bash
docker build -t pyserver .
docker run -p 8000:8000 --env-file .env pyserver
```

### Systemd Service

Create `/etc/systemd/system/pyserver.service`:

```ini
[Unit]
Description=PyServer DeepSeek API
After=network.target

[Service]
Type=simple
User=www-data
WorkingDirectory=/path/to/pyserver
Environment="PATH=/path/to/conda/envs/py311/bin"
ExecStart=/path/to/conda/envs/py311/bin/uvicorn app.main:app --host 0.0.0.0 --port 8000 --workers 4
Restart=always

[Install]
WantedBy=multi-user.target
```

## License

MIT

## Support

For issues and feature requests, please open an issue on the Git repository.
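After deploying (Docker or systemd), the `/health` endpoint documented above gives a quick smoke test that the service came up. A stdlib-only sketch of such a probe (the `check_health` helper is illustrative, not part of the repository):

```python
import urllib.request

def check_health(base_url="http://localhost:8000", timeout=5.0):
    """Return True if GET {base_url}/health answers with HTTP 200,
    False on any HTTP or connection error."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # covers URLError, timeouts, connection refused
        return False
```

The same probe can back a Docker `HEALTHCHECK` instruction or a systemd `ExecStartPost` readiness check.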