A containerized AI chat API service built with Express.js and Docker, providing seamless access to multiple AI models through a unified interface.
- Multiple AI Models: Access to various AI models (Kimi, GLM, DeepSeek, GPT-OSS, Qwen)
- Streaming Support: Real-time streaming responses for better user experience
- Dockerized: Easy deployment with Docker and Docker Compose
- Production Ready: Health checks, logging, and monitoring
- Security: Bearer token authentication and secure configuration
- Scalable: Built with scalability in mind
- Docker
- Docker Compose
- Node.js 18+ (for development)
- Clone the repository
```bash
git clone https://github.com/TeamKillerX/Ryzenth-Docker-API
cd Ryzenth-Docker-API
```

- Create environment file

```bash
cp .env.example .env
```

Edit `.env` with your configuration:

```env
AUTHORIZATION_KEY=admin1234
ENABLE_MIDDLEWARE_DEBUG=true
APP_HOST_PORT=7860
```

- Start services

```bash
docker-compose up -d
```

- Verify deployment

```bash
curl http://localhost:7980/health
```

Alternatively, run the image directly with Docker:

```bash
docker run -d \
  -p 7980:7860 \
  -e AUTHORIZATION_KEY="admin1234" \
  --name ryzenth-api \
  rendyprojects/chat-api:stable
```

| Variable | Description | Default |
|---|---|---|
| `ENDPOINT_API_URL` | Ryzenth API endpoint | https://api.ryzenths.dpdns.org/api/chat |
| `RYZENTH_API_KEY` | Ryzenth service API key | Required |
| `AUTHORIZATION_KEY` | Your secure authentication key | Required |
| `APP_HOST_PORT` | Application port | 7860 |
| `NODE_ENV` | Environment mode | production |
| `LOG_LEVEL` | Logging level | info |
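In application code, the variables above can be read with the table's defaults applied. The sketch below is illustrative only (the key names come from the table; the helper name and the exact default handling are assumptions, not the service's actual implementation):

```python
import os

def load_config() -> dict:
    """Read service configuration from environment variables.

    Defaults mirror the configuration table; keys marked Required
    raise an error when unset.
    """
    config = {
        "endpoint_api_url": os.environ.get(
            "ENDPOINT_API_URL", "https://api.ryzenths.dpdns.org/api/chat"
        ),
        "app_host_port": int(os.environ.get("APP_HOST_PORT", "7860")),
        "node_env": os.environ.get("NODE_ENV", "production"),
        "log_level": os.environ.get("LOG_LEVEL", "info"),
    }
    for required in ("RYZENTH_API_KEY", "AUTHORIZATION_KEY"):
        value = os.environ.get(required)
        if not value:
            raise RuntimeError(f"Missing required environment variable: {required}")
        config[required.lower()] = value
    return config
```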
- ryzenth-chat-api: Main API service
- nginx-proxy: Reverse proxy (optional)
```python
import requests

url = "http://localhost:7980/api/v1/chat/completions"
headers = {
    "Content-Type": "application/json",
    "authorization": "Bearer your_secure_auth_key"
}
payload = {
    "model": "kimi-k2",
    "stream": False,
    "messages": [
        {"role": "user", "content": "What AI models are available?"}
    ]
}

response = requests.post(url, headers=headers, json=payload)
print(response.json())
```

Streaming example:

```python
import requests

url = "http://localhost:7980/api/v1/chat/completions"
headers = {
    "Content-Type": "application/json",
    "authorization": "Bearer your_secure_auth_key"
}
payload = {
    "model": "gpt-oss-stable",
    "stream": True,
    "messages": [
        {"role": "user", "content": "Explain AI in simple terms"}
    ]
}

response = requests.post(url, headers=headers, json=payload, stream=True)
for line in response.iter_lines():
    if line:
        print(line.decode('utf-8'))
```

Regular Request:
```bash
curl -X POST http://localhost:7980/api/v1/chat/completions \
  -H "authorization: Bearer your_secure_auth_key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-v3.1",
    "stream": false,
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'
```

Streaming Request:
```bash
curl -X POST http://localhost:7980/api/v1/chat/completions \
  -H "authorization: Bearer your_secure_auth_key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "kimi-k2",
    "stream": true,
    "messages": [
      {"role": "user", "content": "Tell me a story"}
    ]
  }'
```

Available models:

- `kimi-k2`
- `glm-4.6`
- `deepseek-v3.1`
- `gpt-oss-dev`
- `gpt-oss-stable`
- `qwen3-coder`
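When `stream` is true, the raw lines printed by the streaming example above can be parsed into plain text. The sketch below assumes OpenAI-style server-sent-events chunks (`data: {json}` lines with a terminal `data: [DONE]` sentinel) — the exact wire format is not documented here, so treat this as an assumption:

```python
import json
from typing import Optional

def extract_content(raw_line: str) -> Optional[str]:
    """Pull the assistant text out of one streamed line, if any.

    Assumes OpenAI-style SSE chunks: `data: {json}` carrying a
    `choices[0].delta.content` field, ended by `data: [DONE]`.
    Returns None for empty, sentinel, or non-content lines.
    """
    line = raw_line.strip()
    if not line.startswith("data:"):
        return None
    data = line[len("data:"):].strip()
    if data == "[DONE]":
        return None
    chunk = json.loads(data)
    delta = chunk.get("choices", [{}])[0].get("delta", {})
    return delta.get("content")
```

Joining every non-`None` result over `response.iter_lines()` reconstructs the full reply.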
```bash
curl http://localhost:7980/health
```

Response:

```json
{
  "status": "healthy",
  "service": "Ryzenth Chat API",
  "timestamp": "2025-01-15T10:30:00.000Z",
  "version": "1.0"
}
```

Follow the service logs:

```bash
docker-compose logs -f ryzenth-chat-api
```

- Set up reverse proxy (nginx configuration included)
- Configure SSL certificates
- Set up monitoring (logs, metrics)
- Configure backup strategies
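After deployment, the health endpoint can be polled until the container reports ready. This is a minimal sketch (the URL and the `status` field follow the health-check example above; the function name and retry policy are illustrative):

```python
import json
import time
import urllib.error
import urllib.request

def wait_until_healthy(url: str = "http://localhost:7980/health",
                       attempts: int = 10, delay: float = 2.0) -> bool:
    """Poll the health endpoint until it reports healthy or attempts run out."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                body = json.loads(resp.read().decode("utf-8"))
                if body.get("status") == "healthy":
                    return True
        except (urllib.error.URLError, ValueError):
            pass  # service not up yet or bad response; retry after a delay
        time.sleep(delay)
    return False
```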
```bash
# Clone and set up
git clone https://github.com/TeamKillerX/Ryzenth-Docker-API
cd Ryzenth-Docker-API

# Build Docker image
docker build -t ryzenth-api .
```

- Bearer token authentication required
- Environment-based configuration
- Input validation and sanitization
- Rate limiting capabilities (with Redis)
- Secure headers and CORS configuration
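Conceptually, the Bearer check compares the `authorization` header against the configured key. The following is an illustrative Python sketch of that idea, not the service's actual Express.js middleware:

```python
import hmac
from typing import Optional

def is_authorized(auth_header: Optional[str], expected_key: str) -> bool:
    """Validate an `authorization: Bearer <key>` header value.

    Uses a constant-time comparison to avoid timing side channels.
    """
    if not auth_header or not auth_header.startswith("Bearer "):
        return False
    token = auth_header[len("Bearer "):]
    return hmac.compare_digest(token, expected_key)
```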
Success response:

```json
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "kimi-k2",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 9,
    "completion_tokens": 12,
    "total_tokens": 21
  }
}
```

Error response:

```json
{
  "error": "Unauthorized: Invalid API key",
  "status": 401,
  "message": "The provided API key is not valid"
}
```

- Connection refused
  - Check if Docker containers are running: `docker-compose ps`
  - Verify port availability
- Authentication errors
  - Verify `AUTHORIZATION_KEY` in environment
  - Check Bearer token in request headers
- Model not available
  - Check available models list
  - Verify model name spelling
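A client can branch on the success and error shapes shown earlier. This sketch uses the field names from those examples (the helper name is hypothetical):

```python
def handle_response(status_code: int, body: dict) -> str:
    """Return assistant text on success; raise on the documented error shape."""
    if status_code == 401:
        # Error body: {"error": ..., "status": 401, "message": ...}
        raise PermissionError(body.get("message", "Unauthorized"))
    if status_code != 200:
        raise RuntimeError(body.get("error", f"HTTP {status_code}"))
    # Success body: choices[0].message.content holds the reply
    return body["choices"][0]["message"]["content"]
```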
```bash
# View all logs
docker-compose logs

# View specific service logs
docker-compose logs ryzenth-chat-api

# Follow logs in real-time
docker-compose logs -f ryzenth-chat-api
```

- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
This project is licensed under the GNU Affero General Public License v3.0 - see the LICENSE file for details.
- Developer: Randy W (@xtdevs, @xtsea)
- Credits: @xpushz on Telegram
- Organization: TeamKillerX
- Channel: @RendyProjects on Telegram
- GitHub Issues: Create an issue
- Telegram Channel: @RendyProjects
- Initial release
- Docker containerization
- Multiple AI model support
- Streaming capabilities
- Production-ready configuration
⭐ Star this repo if you find it useful!