mirror of https://github.com/samanhappy/mcphub.git
synced 2025-12-31 20:00:00 -05:00
Compare commits: 9 commits, v0.11.9 ... copilot/im
| SHA1 |
|---|
| b67111219d |
| f280cd593d |
| d25a85a1bf |
| 1b752827a5 |
| 8ae542bdab |
| 88ce94b988 |
| 7cc330e721 |
| ab338e80a7 |
| b00e1c81fc |
272 .github/copilot-instructions.md (vendored)
@@ -1,272 +0,0 @@

# MCPHub Coding Instructions

**ALWAYS follow these instructions first and only fall back to additional search and context gathering if the information here is incomplete or found to be in error.**

## Project Overview

MCPHub is a TypeScript/Node.js MCP (Model Context Protocol) server management hub that provides unified access through HTTP endpoints. It serves as a centralized dashboard for managing multiple MCP servers with real-time monitoring, authentication, and flexible routing.

**Core Components:**

- **Backend**: Express.js + TypeScript + ESM (`src/server.ts`)
- **Frontend**: React/Vite + Tailwind CSS (`frontend/`)
- **MCP Integration**: Connects multiple MCP servers (`src/services/mcpService.ts`)
- **Authentication**: JWT-based with bcrypt password hashing
- **Configuration**: JSON-based MCP server definitions (`mcp_settings.json`)
- **Documentation**: API docs and usage instructions (`docs/`)
## Working Effectively

### Bootstrap and Setup (CRITICAL - Follow Exact Steps)

```bash
# Install pnpm if not available
npm install -g pnpm

# Install dependencies - takes ~30 seconds
pnpm install

# Setup environment (optional)
cp .env.example .env

# Build and test to verify setup
pnpm lint # ~3 seconds - NEVER CANCEL
pnpm backend:build # ~5 seconds - NEVER CANCEL
pnpm test:ci # ~16 seconds - NEVER CANCEL. Set timeout to 60+ seconds
pnpm frontend:build # ~5 seconds - NEVER CANCEL
pnpm build # ~10 seconds total - NEVER CANCEL. Set timeout to 60+ seconds
```

**CRITICAL TIMING**: These commands are fast but NEVER CANCEL them. Always wait for completion.
### Development Environment

```bash
# Start both backend and frontend (recommended for most development)
pnpm dev # Backend on :3001, Frontend on :5173

# OR start separately (required on Windows, optional on Linux/macOS)
# Terminal 1: Backend only
pnpm backend:dev # Runs on port 3000 (or PORT env var)

# Terminal 2: Frontend only
pnpm frontend:dev # Runs on port 5173, proxies API to backend
```

**NEVER CANCEL**: Development servers may take 10-15 seconds to fully initialize all MCP servers.
### Build Commands (Production)

```bash
# Full production build - takes ~10 seconds total
pnpm build # NEVER CANCEL - Set timeout to 60+ seconds

# Individual builds
pnpm backend:build # TypeScript compilation - ~5 seconds
pnpm frontend:build # Vite build - ~5 seconds

# Start production server
pnpm start # Requires dist/ and frontend/dist/ to exist
```
### Testing and Validation

```bash
# Run all tests - takes ~16 seconds with 73 tests
pnpm test:ci # NEVER CANCEL - Set timeout to 60+ seconds

# Development testing
pnpm test # Interactive mode
pnpm test:watch # Watch mode for development
pnpm test:coverage # With coverage report

# Code quality
pnpm lint # ESLint - ~3 seconds
pnpm format # Prettier formatting - ~3 seconds
```

**CRITICAL**: All tests MUST pass before committing. Do not modify tests to make them pass unless specifically required for your changes.
## Manual Validation Requirements

**ALWAYS perform these validation steps after making changes:**

### 1. Basic Application Functionality

```bash
# Start the application
pnpm dev

# Verify backend responds (in another terminal)
curl http://localhost:3000/api/health
# Expected: Should return health status

# Verify frontend serves
curl -I http://localhost:3000/
# Expected: HTTP 200 OK with HTML content
```
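Because the backend can take 10-15 seconds to finish connecting MCP servers, a one-shot `curl` issued right after startup may fail even when everything is healthy. A small retry wrapper makes the check more robust — this is a sketch, not a repo script: `retry_until` is a hypothetical helper name, and the health URL assumes the default port 3000 mentioned above.

```shell
# retry_until: run a command up to a given number of attempts,
# sleeping between tries; succeed as soon as the command does.
retry_until() {
  local attempts=$1 delay=$2
  shift 2
  local i
  for ((i = 1; i <= attempts; i++)); do
    "$@" && return 0
    sleep "$delay"
  done
  return 1
}

# Example (hypothetical): wait up to ~20 seconds for the health endpoint
# retry_until 10 2 curl -fsS http://localhost:3000/api/health
```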
### 2. MCP Server Integration Test

```bash
# Check MCP servers are loading (look for log messages)
# Expected log output should include:
# - "Successfully connected client for server: [name]"
# - "Successfully listed [N] tools for server: [name]"
# - Some servers may fail due to missing API keys (normal in dev)
```

### 3. Build Verification

```bash
# Verify production build works
pnpm build
node scripts/verify-dist.js
# Expected: "✅ Verification passed! Frontend and backend dist files are present."
```

**NEVER skip these validation steps**. If any fail, debug and fix before proceeding.
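The authoritative checks live in `scripts/verify-dist.js`; as a rough shell sketch of the same idea (the path list below is an assumption based on the build outputs named in this guide, not a copy of that script):

```shell
# verify_paths: report any missing build artifacts;
# returns non-zero if at least one path is absent.
verify_paths() {
  local status=0 p
  for p in "$@"; do
    if [ ! -e "$p" ]; then
      echo "missing: $p" >&2
      status=1
    fi
  done
  return $status
}

# Example (assumed output dirs): verify_paths dist frontend/dist
```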
## Project Structure and Key Files

### Critical Backend Files

- `src/index.ts` - Application entry point
- `src/server.ts` - Express server setup and middleware
- `src/services/mcpService.ts` - **Core MCP server management logic**
- `src/config/index.ts` - Configuration management
- `src/routes/` - HTTP route definitions
- `src/controllers/` - HTTP request handlers
- `src/dao/` - Data access layer (supports JSON file & PostgreSQL)
- `src/db/` - TypeORM entities & repositories (for PostgreSQL mode)
- `src/types/index.ts` - TypeScript type definitions

### DAO Layer (Dual Data Source)

MCPHub supports **JSON file** (default) and **PostgreSQL** storage:

- Set `USE_DB=true` + `DB_URL=postgresql://...` to use database
- When modifying data structures, update: `src/types/`, `src/dao/`, `src/db/entities/`, `src/db/repositories/`, `src/utils/migration.ts`
- See `AGENTS.md` for detailed DAO modification checklist
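A minimal environment for PostgreSQL mode might look like the following sketch. The connection string is a placeholder assumption — substitute your own user, password, host, and database name.

```shell
# Enable PostgreSQL-backed storage (values below are placeholders)
USE_DB=true
DB_URL=postgresql://user:password@localhost:5432/mcphub
```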
### Critical Frontend Files

- `frontend/src/` - React application source
- `frontend/src/pages/` - Page components (development entry point)
- `frontend/src/components/` - Reusable UI components
- `frontend/src/utils/fetchInterceptor.js` - Backend API interaction

### Configuration Files

- `mcp_settings.json` - **MCP server definitions and user accounts**
- `package.json` - Dependencies and scripts
- `tsconfig.json` - TypeScript configuration
- `jest.config.cjs` - Test configuration
- `.eslintrc.json` - Linting rules

### Docker and Deployment

- `Dockerfile` - Multi-stage build with Python base + Node.js
- `entrypoint.sh` - Docker startup script
- `bin/cli.js` - NPM package CLI entry point
## Development Process and Conventions

### Code Style Requirements

- **ESM modules**: Always use `.js` extensions in imports, not `.ts`
- **English only**: All code comments must be written in English
- **TypeScript strict**: Follow strict type checking rules
- **Import style**: `import { something } from './file.js'` (note .js extension)

### Key Configuration Notes

- **MCP servers**: Defined in `mcp_settings.json` with command/args
- **Endpoints**: `/mcp/{group|server}` and `/mcp/$smart` for routing
- **i18n**: Frontend uses react-i18next with files in `locales/` folder
- **Authentication**: JWT tokens with bcrypt password hashing
- **Default credentials**: admin/admin123 (configured in mcp_settings.json)
### Development Entry Points

- **Add MCP server**: Modify `mcp_settings.json` and restart
- **New API endpoint**: Add route in `src/routes/`, controller in `src/controllers/`
- **Frontend feature**: Start from `frontend/src/pages/` or `frontend/src/components/`
- **Add tests**: Follow patterns in `tests/` directory

### Common Development Tasks

#### Adding a new MCP server:

1. Add server definition to `mcp_settings.json`
2. Restart backend to load new server
3. Check logs for successful connection
4. Test via dashboard or API endpoints
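For step 1, a server entry follows the command/args shape described in this guide. The sketch below is a hypothetical example — the server name and package are placeholders, so copy the structure of an existing entry in `mcp_settings.json` rather than this sketch:

```json
{
  "mcpServers": {
    "example-server": {
      "command": "npx",
      "args": ["-y", "example-mcp-package"]
    }
  }
}
```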
#### API development:

1. Define route in `src/routes/`
2. Implement controller in `src/controllers/`
3. Add types in `src/types/index.ts` if needed
4. Write tests in `tests/controllers/`

#### Frontend development:

1. Create/modify components in `frontend/src/components/`
2. Add pages in `frontend/src/pages/`
3. Update routing if needed
4. Test in development mode with `pnpm frontend:dev`

#### Documentation:

1. Update or add docs in `docs/` folder
2. Ensure README.md reflects any major changes
## Validation and CI Requirements

### Before Committing - ALWAYS Run:

```bash
pnpm lint # Must pass - ~3 seconds
pnpm backend:build # Must compile - ~5 seconds
pnpm test:ci # All tests must pass - ~16 seconds
pnpm build # Full build must work - ~10 seconds
```

**CRITICAL**: CI will fail if any of these commands fail. Fix issues locally first.
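These four commands can be chained so the gate stops at the first failing step and reports which one broke. This is a sketch — `run_gate` is a hypothetical helper, not a script that exists in the repo:

```shell
# run_gate: run each check in order, stopping at and reporting the first failure.
run_gate() {
  local step
  for step in "$@"; do
    echo "==> $step"
    if ! sh -c "$step"; then
      echo "FAILED: $step" >&2
      return 1
    fi
  done
  echo "All checks passed"
}

# Example: run_gate "pnpm lint" "pnpm backend:build" "pnpm test:ci" "pnpm build"
```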
### CI Pipeline (.github/workflows/ci.yml)

- Runs on Node.js 20.x
- Tests: linting, type checking, unit tests with coverage
- **NEVER CANCEL**: CI builds may take 2-3 minutes total
## Troubleshooting

### Common Issues

- **"uvx command not found"**: Some MCP servers require `uvx` (Python package manager) - this is expected in development
- **Port already in use**: Change PORT environment variable or kill existing processes
- **Frontend not loading**: Ensure frontend was built with `pnpm frontend:build`
- **MCP server connection failed**: Check server command/args in `mcp_settings.json`

### Build Failures

- **TypeScript errors**: Run `pnpm backend:build` to see compilation errors
- **Test failures**: Run `pnpm test:verbose` for detailed test output
- **Lint errors**: Run `pnpm lint` and fix reported issues

### Development Issues

- **Backend not starting**: Check for port conflicts, verify `mcp_settings.json` syntax
- **Frontend proxy errors**: Ensure backend is running before starting frontend
- **Hot reload not working**: Restart development server
## Performance Notes

- **Install time**: pnpm install takes ~30 seconds
- **Build time**: Full build takes ~10 seconds
- **Test time**: Complete test suite takes ~16 seconds
- **Startup time**: Backend initialization takes 10-15 seconds (MCP server connections)

**Remember**: NEVER CANCEL any build or test commands. Always wait for completion even if they seem slow.
374 AGENTS.md

@@ -1,26 +1,214 @@
# MCPHub Development Guide & Agent Instructions

**ALWAYS follow these instructions first and only fall back to additional search and context gathering if the information here is incomplete or found to be in error.**

This document serves as the primary reference for all contributors and AI agents working on `@samanhappy/mcphub`. It provides comprehensive guidance on code organization, development workflow, and project conventions.

## Project Overview

MCPHub is a TypeScript/Node.js MCP (Model Context Protocol) server management hub that provides unified access through HTTP endpoints. It serves as a centralized dashboard for managing multiple MCP servers with real-time monitoring, authentication, and flexible routing.

**Core Components:**

- **Backend**: Express.js + TypeScript + ESM (`src/server.ts`)
- **Frontend**: React/Vite + Tailwind CSS (`frontend/`)
- **MCP Integration**: Connects multiple MCP servers (`src/services/mcpService.ts`)
- **Authentication**: JWT-based with bcrypt password hashing
- **Configuration**: JSON-based MCP server definitions (`mcp_settings.json`)
- **Documentation**: API docs and usage instructions (`docs/`)
## Bootstrap and Setup (CRITICAL - Follow Exact Steps)

```bash
# Install pnpm if not available
npm install -g pnpm

# Install dependencies - takes ~30 seconds
pnpm install

# Setup environment (optional)
cp .env.example .env

# Build and test to verify setup
pnpm lint # ~3 seconds - NEVER CANCEL
pnpm backend:build # ~5 seconds - NEVER CANCEL
pnpm test:ci # ~16 seconds - NEVER CANCEL. Set timeout to 60+ seconds
pnpm frontend:build # ~5 seconds - NEVER CANCEL
pnpm build # ~10 seconds total - NEVER CANCEL. Set timeout to 60+ seconds
```

**CRITICAL TIMING**: These commands are fast but NEVER CANCEL them. Always wait for completion.
## Manual Validation Requirements

**ALWAYS perform these validation steps after making changes:**

### 1. Basic Application Functionality

```bash
# Start the application
pnpm dev

# Verify backend responds (in another terminal)
curl http://localhost:3000/api/health
# Expected: Should return health status

# Verify frontend serves
curl -I http://localhost:3000/
# Expected: HTTP 200 OK with HTML content
```

### 2. MCP Server Integration Test

```bash
# Check MCP servers are loading (look for log messages)
# Expected log output should include:
# - "Successfully connected client for server: [name]"
# - "Successfully listed [N] tools for server: [name]"
# - Some servers may fail due to missing API keys (normal in dev)
```

### 3. Build Verification

```bash
# Verify production build works
pnpm build
node scripts/verify-dist.js
# Expected: "✅ Verification passed! Frontend and backend dist files are present."
```

**NEVER skip these validation steps**. If any fail, debug and fix before proceeding.
## Project Structure & Module Organization

### Critical Backend Files

- `src/index.ts` - Application entry point
- `src/server.ts` - Express server setup and middleware (orchestrating HTTP bootstrap)
- `src/services/mcpService.ts` - **Core MCP server management logic**
- `src/config/index.ts` - Configuration management
- `src/routes/` - HTTP route definitions
- `src/controllers/` - HTTP request handlers
- `src/dao/` - Data access layer (supports JSON file & PostgreSQL)
- `src/db/` - TypeORM entities & repositories (for PostgreSQL mode)
- `src/types/index.ts` - TypeScript type definitions and shared DTOs
- `src/utils/` - Utility functions and helpers
### Critical Frontend Files

- `frontend/src/` - React application source (Vite + React dashboard)
- `frontend/src/pages/` - Page components (development entry point)
- `frontend/src/components/` - Reusable UI components
- `frontend/src/utils/fetchInterceptor.js` - Backend API interaction
- `frontend/public/` - Static assets

### Configuration Files

- `mcp_settings.json` - **MCP server definitions and user accounts**
- `package.json` - Dependencies and scripts
- `tsconfig.json` - TypeScript configuration
- `jest.config.cjs` - Test configuration
- `.eslintrc.json` - Linting rules

### Test Organization

- Jest-aware test code is split between colocated specs (`src/**/*.{test,spec}.ts`) and higher-level suites in `tests/`
- Use `tests/utils/` helpers when exercising the CLI or SSE flows
- Mirror production directory names when adding new suites
- End filenames with `.test.ts` or `.spec.ts` for automatic discovery

### Build Artifacts

- `dist/` - Backend build output (TypeScript compilation)
- `frontend/dist/` - Frontend build output (Vite bundle)
- `coverage/` - Test coverage reports
- **Never edit these manually**

### Localization

- Translations sit in `locales/` (en.json, fr.json, tr.json, zh.json)
- Frontend uses react-i18next

### Docker and Deployment

- `Dockerfile` - Multi-stage build with Python base + Node.js
- `entrypoint.sh` - Docker startup script
- `bin/cli.js` - NPM package CLI entry point
## Build, Test, and Development Commands

### Development Environment

```bash
# Start both backend and frontend (recommended for most development)
pnpm dev # Backend on :3001, Frontend on :5173

# OR start separately (required on Windows, optional on Linux/macOS)
# Terminal 1: Backend only
pnpm backend:dev # Runs on port 3000 (or PORT env var)

# Terminal 2: Frontend only
pnpm frontend:dev # Runs on port 5173, proxies API to backend

# Frontend preview (production build)
pnpm frontend:preview # Preview production build
```

**NEVER CANCEL**: Development servers may take 10-15 seconds to fully initialize all MCP servers.
### Production Build

```bash
# Full production build - takes ~10 seconds total
pnpm build # NEVER CANCEL - Set timeout to 60+ seconds

# Individual builds
pnpm backend:build # TypeScript compilation to dist/ - ~5 seconds
pnpm frontend:build # Vite build to frontend/dist/ - ~5 seconds

# Start production server
pnpm start # Requires dist/ and frontend/dist/ to exist
```

Run `pnpm build` before release or publishing.
### Testing and Validation

```bash
# Run all tests - takes ~16 seconds with 73 tests
pnpm test:ci # NEVER CANCEL - Set timeout to 60+ seconds

# Development testing
pnpm test # Interactive mode
pnpm test:watch # Watch mode for development
pnpm test:coverage # With coverage report

# Code quality
pnpm lint # ESLint - ~3 seconds
pnpm format # Prettier formatting - ~3 seconds
```

**CRITICAL**: All tests MUST pass before committing. Do not modify tests to make them pass unless specifically required for your changes.

### Performance Notes

- **Install time**: pnpm install takes ~30 seconds
- **Build time**: Full build takes ~10 seconds
- **Test time**: Complete test suite takes ~16 seconds
- **Startup time**: Backend initialization takes 10-15 seconds (MCP server connections)
## Coding Style & Naming Conventions

- **TypeScript everywhere**: Default to 2-space indentation and single quotes, letting Prettier settle formatting
- **ESM modules**: Always use `.js` extensions in imports, not `.ts` (e.g., `import { something } from './file.js'`)
- **English only**: All code comments must be written in English
- **TypeScript strict**: Follow strict type checking rules
- **Naming conventions**:
  - Services and data access layers: Use suffixes (`UserService`, `AuthDao`)
  - React components and files: `PascalCase`
  - Utility modules: `camelCase`
- **Types and DTOs**: Keep in `src/types` to avoid duplication; re-export through index files only when it clarifies imports
- **ESLint configuration**: Assumes ES modules
## Testing Guidelines

@@ -28,12 +216,86 @@ These notes align current contributors around the code layout, daily commands, a

- Mirror production directory names when adding new suites and end filenames with `.test.ts` or `.spec.ts` for automatic discovery.
- Aim to maintain or raise coverage when touching critical flows (auth, OAuth, SSE); add integration tests under `tests/integration/` when touching cross-service logic.
## Key Configuration Notes

- **MCP servers**: Defined in `mcp_settings.json` with command/args
- **Endpoints**: `/mcp/{group|server}` and `/mcp/$smart` for routing
- **i18n**: Frontend uses react-i18next with files in `locales/` folder
- **Authentication**: JWT tokens with bcrypt password hashing
- **Default credentials**: admin/admin123 (configured in mcp_settings.json)

## Development Entry Points

### Adding a new MCP server

1. Add server definition to `mcp_settings.json`
2. Restart backend to load new server
3. Check logs for successful connection
4. Test via dashboard or API endpoints

### API development

1. Define route in `src/routes/`
2. Implement controller in `src/controllers/`
3. Add types in `src/types/index.ts` if needed
4. Write tests in `tests/controllers/`

### Frontend development

1. Create/modify components in `frontend/src/components/`
2. Add pages in `frontend/src/pages/`
3. Update routing if needed
4. Test in development mode with `pnpm frontend:dev`

### Documentation

1. Update or add docs in `docs/` folder
2. Ensure README.md reflects any major changes
## Commit & Pull Request Guidelines

- Follow the existing Conventional Commit pattern (`feat:`, `fix:`, `chore:`, etc.) with imperative, present-tense summaries and optional multi-line context.
- Each PR should describe the behavior change, list testing performed, and link issues; include before/after screenshots or GIFs for frontend tweaks.
- Re-run `pnpm build` and `pnpm test` before requesting review, and ensure generated artifacts stay out of the diff.
### Before Committing - ALWAYS Run

```bash
pnpm lint # Must pass - ~3 seconds
pnpm backend:build # Must compile - ~5 seconds
pnpm test:ci # All tests must pass - ~16 seconds
pnpm build # Full build must work - ~10 seconds
```

**CRITICAL**: CI will fail if any of these commands fail. Fix issues locally first.

### CI Pipeline (.github/workflows/ci.yml)

- Runs on Node.js 20.x
- Tests: linting, type checking, unit tests with coverage
- **NEVER CANCEL**: CI builds may take 2-3 minutes total

## Troubleshooting

### Common Issues

- **"uvx command not found"**: Some MCP servers require `uvx` (Python package manager) - this is expected in development
- **Port already in use**: Change PORT environment variable or kill existing processes
- **Frontend not loading**: Ensure frontend was built with `pnpm frontend:build`
- **MCP server connection failed**: Check server command/args in `mcp_settings.json`

### Build Failures

- **TypeScript errors**: Run `pnpm backend:build` to see compilation errors
- **Test failures**: Run `pnpm test:verbose` for detailed test output
- **Lint errors**: Run `pnpm lint` and fix reported issues

### Development Issues

- **Backend not starting**: Check for port conflicts, verify `mcp_settings.json` syntax
- **Frontend proxy errors**: Ensure backend is running before starting frontend
- **Hot reload not working**: Restart development server
## DAO Layer & Dual Data Source

MCPHub supports **JSON file** (default) and **PostgreSQL** storage. Set `USE_DB=true` + `DB_URL` to switch.

@@ -64,15 +326,99 @@ When adding/changing fields, update **ALL** these files:
### Data Type Mapping

| Model          | DAO               | DB Entity      | JSON Path                 |
| -------------- | ----------------- | -------------- | ------------------------- |
| `IUser`        | `UserDao`         | `User`         | `settings.users[]`        |
| `ServerConfig` | `ServerDao`       | `Server`       | `settings.mcpServers{}`   |
| `IGroup`       | `GroupDao`        | `Group`        | `settings.groups[]`       |
| `SystemConfig` | `SystemConfigDao` | `SystemConfig` | `settings.systemConfig`   |
| `UserConfig`   | `UserConfigDao`   | `UserConfig`   | `settings.userConfigs{}`  |
| `BearerKey`    | `BearerKeyDao`    | `BearerKey`    | `settings.bearerKeys[]`   |
| `IOAuthClient` | `OAuthClientDao`  | `OAuthClient`  | `settings.oauthClients[]` |
| `IOAuthToken`  | `OAuthTokenDao`   | `OAuthToken`   | `settings.oauthTokens[]`  |
### Common Pitfalls
|
### Common Pitfalls
|
||||||
|
|
||||||
- Forgetting migration script → fields won't migrate to DB
|
- Forgetting migration script → fields won't migrate to DB
|
||||||
- Optional fields need `nullable: true` in entity
|
- Optional fields need `nullable: true` in entity
|
||||||
- Complex objects need `simple-json` column type
|
- Complex objects need `simple-json` column type
|
||||||
|
|
||||||
|
+## Auto-Evolution Guidelines for AI Agents
+
+**This section provides guidelines for AI agents to automatically maintain and improve this document.**
+
+### When to Update AGENTS.md
+
+AI agents MUST update this document in the following situations:
+
+#### 1. Code-Documentation Mismatch Detected
+
+When executing tasks, if you discover that:
+
+- The actual code structure differs from descriptions in this document
+- File paths, imports, or module organization has changed
+- New critical files or directories exist that aren't documented
+- Documented files or patterns no longer exist
+
+**Action**: Immediately update the relevant section to reflect the current codebase state.
+
+**Example scenarios**:
+
+- A controller is now in `src/api/controllers/` instead of `src/controllers/`
+- New middleware files exist that should be documented
+- The DAO implementation has been refactored with a different structure
+- Build output directories have changed
+
+#### 2. User Preferences and Requirements
+
+During conversation, if the user expresses:
+
+- **Coding preferences**: Indentation style, naming conventions, code organization patterns
+- **Workflow requirements**: Required validation steps, commit procedures, testing expectations
+- **Tool preferences**: Preferred libraries, frameworks, or development tools
+- **Quality standards**: Code review criteria, documentation requirements, error handling patterns
+- **Development principles**: Architecture decisions, design patterns, best practices
+
+**Action**: Add or update the relevant section to capture these preferences for future reference.
+
+**Example scenarios**:
+
+- User prefers async/await over promises → Update coding style section
+- User requires specific test coverage thresholds → Update testing guidelines
+- User has strong opinions about error handling → Add to development process section
+- User establishes new deployment procedures → Update deployment section
+
+### How to Update AGENTS.md
+
+1. **Identify the Section**: Determine which section needs updating based on the type of change
+2. **Make Precise Changes**: Update only the relevant content, maintaining the document structure
+3. **Preserve Format**: Keep the existing markdown formatting and organization
+4. **Add Context**: If adding new content, ensure it fits logically within existing sections
+5. **Verify Accuracy**: After updating, ensure the new information is accurate and complete
+
+### Update Principles
+
+- **Accuracy First**: Documentation must reflect the actual current state
+- **Clarity**: Use clear, concise language; avoid ambiguity
+- **Completeness**: Include sufficient detail for agents to work effectively
+- **Consistency**: Maintain consistent terminology and formatting throughout
+- **Actionability**: Focus on concrete, actionable guidance rather than vague descriptions
+
+### Self-Correction Process
+
+Before completing any task:
+
+1. Review relevant sections of AGENTS.md
+2. During execution, note any discrepancies between documentation and reality
+3. Update AGENTS.md to correct discrepancies
+4. Verify the update doesn't conflict with other sections
+5. Proceed with the original task using the updated information
+
+### Meta-Update Rule
+
+If this auto-evolution section itself needs improvement based on experience:
+
+- Update it to better serve future agents
+- Add new scenarios or principles as they emerge
+- Refine the update process based on what works well
+
+**Remember**: This document is a living guide. Keeping it accurate and current is as important as following it.
@@ -28,7 +28,8 @@
         "features/server-management",
         "features/group-management",
         "features/smart-routing",
-        "features/oauth"
+        "features/oauth",
+        "features/output-compression"
       ]
     },
     {
109 docs/features/output-compression.mdx (new file)
@@ -0,0 +1,109 @@
+---
+title: 'Output Compression'
+description: 'Reduce token consumption by compressing MCP tool outputs'
+---
+
+# Output Compression
+
+MCPHub provides an AI-powered compression mechanism to reduce token consumption from MCP tool outputs. This feature is particularly useful when dealing with large outputs that can significantly impact system efficiency and scalability.
+
+## Overview
+
+The compression feature uses a lightweight AI model (by default, `gpt-4o-mini`) to intelligently compress MCP tool outputs while preserving all essential information. This can help:
+
+- **Reduce token overhead** by compressing verbose tool information
+- **Lower operational costs** associated with token consumption
+- **Improve performance** for downstream processing
+- **Better resource utilization** in resource-constrained environments
+
+## Configuration
+
+Add the compression configuration to your `systemConfig` section in `mcp_settings.json`:
+
+```json
+{
+  "systemConfig": {
+    "compression": {
+      "enabled": true,
+      "model": "gpt-4o-mini",
+      "maxInputTokens": 100000,
+      "targetReductionRatio": 0.5
+    }
+  }
+}
+```
+
+### Configuration Options
+
+| Option | Type | Default | Description |
+|--------|------|---------|-------------|
+| `enabled` | boolean | `false` | Enable or disable output compression |
+| `model` | string | `"gpt-4o-mini"` | AI model to use for compression |
+| `maxInputTokens` | number | `100000` | Maximum input tokens for compression |
+| `targetReductionRatio` | number | `0.5` | Target size reduction ratio (0.0-1.0) |
+
+## Requirements
+
+Output compression requires:
+
+1. An OpenAI API key configured in the smart routing settings
+2. The compression feature must be explicitly enabled
+
+### Setting up OpenAI API Key
+
+Configure your OpenAI API key using environment variables or system configuration:
+
+**Environment Variable:**
+
+```bash
+export OPENAI_API_KEY=your-api-key
+```
+
+**Or in systemConfig:**
+
+```json
+{
+  "systemConfig": {
+    "smartRouting": {
+      "openaiApiKey": "your-api-key",
+      "openaiApiBaseUrl": "https://api.openai.com/v1"
+    }
+  }
+}
+```
+
+## How It Works
+
+1. **Content Size Check**: When a tool call completes, the compression service checks if the output is large enough to benefit from compression (threshold is 10% of `maxInputTokens` or 1000 tokens, whichever is smaller)
+
+2. **AI Compression**: If the content exceeds the threshold, it's sent to the configured AI model with instructions to compress while preserving essential information
+
+3. **Size Validation**: The compressed result is compared with the original; if compression didn't reduce the size, the original content is used
+
+4. **Error Handling**: If compression fails for any reason, the original content is returned unchanged
+
+## Fallback Mechanism
+
+The compression feature includes graceful degradation for several scenarios:
+
+- **Compression disabled**: Original content is returned
+- **No API key**: Original content is returned with a warning
+- **Small content**: Content below threshold is not compressed
+- **API errors**: Original content is returned on any API failure
+- **Error responses**: Tool error responses are never compressed
+- **Non-text content**: Images and other media types are preserved as-is
+
+## Best Practices
+
+1. **Start with defaults**: The default configuration provides a good balance between compression and quality
+
+2. **Monitor results**: Review compressed outputs to ensure important information isn't lost
+
+3. **Adjust threshold**: If you have consistently large outputs, consider lowering `targetReductionRatio` for more aggressive compression
+
+4. **Use efficient models**: The default `gpt-4o-mini` provides a good balance of cost and quality; switch to `gpt-4o` if you need higher quality compression
+
+## Limitations
+
+- Compression adds latency due to the AI API call
+- API costs apply for each compression operation
+- Very short outputs won't be compressed (below threshold)
+- Binary/non-text content is not compressed
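The content-size check and disabled-config fallback described in the new docs above can be sketched as a standalone function. This is a hypothetical illustration, not MCPHub's actual implementation: `estimateTokens` and its ~4-characters-per-token heuristic are assumptions made purely for the sketch.

```typescript
interface CompressionConfig {
  enabled: boolean;
  maxInputTokens: number; // default 100000 per the docs above
}

// Rough token estimate; the 4-characters-per-token ratio is an assumption
// used only for this illustration.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Compress only when enabled and the content exceeds 10% of maxInputTokens
// or 1000 tokens, whichever is smaller.
function shouldCompress(config: CompressionConfig, text: string): boolean {
  if (!config.enabled) return false;
  const threshold = Math.min(config.maxInputTokens * 0.1, 1000);
  return estimateTokens(text) > threshold;
}
```

With the default `maxInputTokens` of 100000, the effective threshold is 1000 tokens, so short outputs pass through untouched.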
@@ -18,7 +18,17 @@ const EditServerForm = ({ server, onEdit, onCancel }: EditServerFormProps) => {
     try {
       setError(null);
       const encodedServerName = encodeURIComponent(server.name);
-      const result = await apiPut(`/servers/${encodedServerName}`, payload);
+
+      // Check if name is being changed
+      const isRenaming = payload.name && payload.name !== server.name;
+
+      // Build the request body
+      const requestBody = {
+        config: payload.config,
+        ...(isRenaming ? { newName: payload.name } : {}),
+      };
+
+      const result = await apiPut(`/servers/${encodedServerName}`, requestBody);
+
       if (!result.success) {
         // Use specific error message from the response if available
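The request-body logic added in the `EditServerForm` hunk above can be isolated as a small pure function. This is a hypothetical sketch for illustration only; the function name and `UpdatePayload` shape are assumptions, not code from the repository.

```typescript
interface UpdatePayload {
  name?: string;
  config: Record<string, unknown>;
}

// Include `newName` only when the user actually changed the server name,
// so plain config updates keep addressing the server by its current name.
function buildUpdateRequestBody(
  currentName: string,
  payload: UpdatePayload,
): { config: Record<string, unknown>; newName?: string } {
  const isRenaming = Boolean(payload.name) && payload.name !== currentName;
  return {
    config: payload.config,
    ...(isRenaming ? { newName: payload.name } : {}),
  };
}
```

Keeping `newName` out of the body for non-renames means the backend's `isRenaming` check stays false for ordinary edits.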
@@ -429,7 +429,6 @@ const ServerForm = ({
                   className="shadow appearance-none border rounded w-full py-2 px-3 text-gray-700 leading-tight focus:outline-none focus:shadow-outline form-input"
                   placeholder="e.g.: time-mcp"
                   required
-                  disabled={isEdit}
                 />
               </div>
@@ -25,7 +25,7 @@ interface BearerKeyRowProps {
     name: string;
     token: string;
     enabled: boolean;
-    accessType: 'all' | 'groups' | 'servers';
+    accessType: 'all' | 'groups' | 'servers' | 'custom';
     allowedGroups: string;
     allowedServers: string;
   },
@@ -47,7 +47,7 @@ const BearerKeyRow: React.FC<BearerKeyRowProps> = ({
   const [name, setName] = useState(keyData.name);
   const [token, setToken] = useState(keyData.token);
   const [enabled, setEnabled] = useState<boolean>(keyData.enabled);
-  const [accessType, setAccessType] = useState<'all' | 'groups' | 'servers'>(
+  const [accessType, setAccessType] = useState<'all' | 'groups' | 'servers' | 'custom'>(
     keyData.accessType || 'all',
   );
   const [selectedGroups, setSelectedGroups] = useState<string[]>(keyData.allowedGroups || []);
@@ -105,6 +105,13 @@ const BearerKeyRow: React.FC<BearerKeyRowProps> = ({
       );
       return;
    }
+    if (accessType === 'custom' && selectedGroups.length === 0 && selectedServers.length === 0) {
+      showToast(
+        t('settings.selectAtLeastOneGroupOrServer') || 'Please select at least one group or server',
+        'error',
+      );
+      return;
+    }
+
     setSaving(true);
     try {
@@ -135,6 +142,31 @@ const BearerKeyRow: React.FC<BearerKeyRowProps> = ({
   };

   const isGroupsMode = accessType === 'groups';
+  const isCustomMode = accessType === 'custom';
+
+  // Helper function to format access type display text
+  const formatAccessTypeDisplay = (key: BearerKey): string => {
+    if (key.accessType === 'all') {
+      return t('settings.bearerKeyAccessAll') || 'All Resources';
+    }
+    if (key.accessType === 'groups') {
+      return `${t('settings.bearerKeyAccessGroups') || 'Groups'}: ${key.allowedGroups}`;
+    }
+    if (key.accessType === 'servers') {
+      return `${t('settings.bearerKeyAccessServers') || 'Servers'}: ${key.allowedServers}`;
+    }
+    if (key.accessType === 'custom') {
+      const parts: string[] = [];
+      if (key.allowedGroups && key.allowedGroups.length > 0) {
+        parts.push(`${t('settings.bearerKeyAccessGroups') || 'Groups'}: ${key.allowedGroups}`);
+      }
+      if (key.allowedServers && key.allowedServers.length > 0) {
+        parts.push(`${t('settings.bearerKeyAccessServers') || 'Servers'}: ${key.allowedServers}`);
+      }
+      return `${t('settings.bearerKeyAccessCustom') || 'Custom'}: ${parts.join('; ')}`;
+    }
+    return '';
+  };

   if (isEditing) {
     return (
@@ -194,7 +226,9 @@ const BearerKeyRow: React.FC<BearerKeyRowProps> = ({
             <select
               className="block w-full py-2 px-3 border border-gray-300 bg-white rounded-md shadow-sm focus:outline-none focus:ring-blue-500 focus:border-blue-500 sm:text-sm form-select transition-shadow duration-200"
               value={accessType}
-              onChange={(e) => setAccessType(e.target.value as 'all' | 'groups' | 'servers')}
+              onChange={(e) =>
+                setAccessType(e.target.value as 'all' | 'groups' | 'servers' | 'custom')
+              }
               disabled={loading}
             >
               <option value="all">{t('settings.bearerKeyAccessAll') || 'All Resources'}</option>
@@ -204,9 +238,14 @@ const BearerKeyRow: React.FC<BearerKeyRowProps> = ({
               <option value="servers">
                 {t('settings.bearerKeyAccessServers') || 'Specific Servers'}
               </option>
+              <option value="custom">
+                {t('settings.bearerKeyAccessCustom') || 'Custom (Groups & Servers)'}
+              </option>
             </select>
           </div>

+          {/* Show single selector for groups or servers mode */}
+          {!isCustomMode && (
           <div className="flex-1 min-w-[200px]">
             <label
               className={`block text-sm font-medium mb-1 ${accessType === 'all' ? 'text-gray-400' : 'text-gray-700'}`}
|
|||||||
disabled={loading || accessType === 'all'}
|
disabled={loading || accessType === 'all'}
|
||||||
/>
|
/>
|
||||||
</div>
|
</div>
|
||||||
|
)}
|
||||||
|
|
||||||
|
{/* Show both selectors for custom mode */}
|
||||||
|
{isCustomMode && (
|
||||||
|
<>
|
||||||
|
<div className="flex-1 min-w-[200px]">
|
||||||
|
<label className="block text-sm font-medium text-gray-700 mb-1">
|
||||||
|
{t('settings.bearerKeyAllowedGroups') || 'Allowed groups'}
|
||||||
|
</label>
|
||||||
|
<MultiSelect
|
||||||
|
options={availableGroups}
|
||||||
|
selected={selectedGroups}
|
||||||
|
onChange={setSelectedGroups}
|
||||||
|
placeholder={t('settings.selectGroups') || 'Select groups...'}
|
||||||
|
disabled={loading}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
<div className="flex-1 min-w-[200px]">
|
||||||
|
<label className="block text-sm font-medium text-gray-700 mb-1">
|
||||||
|
{t('settings.bearerKeyAllowedServers') || 'Allowed servers'}
|
||||||
|
</label>
|
||||||
|
<MultiSelect
|
||||||
|
options={availableServers}
|
||||||
|
selected={selectedServers}
|
||||||
|
onChange={setSelectedServers}
|
||||||
|
placeholder={t('settings.selectServers') || 'Select servers...'}
|
||||||
|
disabled={loading}
|
||||||
|
/>
|
||||||
|
</div>
|
||||||
|
</>
|
||||||
|
)}
|
||||||
|
|
||||||
<div className="flex justify-end gap-2">
|
<div className="flex justify-end gap-2">
|
||||||
<button
|
<button
|
||||||
@@ -281,11 +351,7 @@ const BearerKeyRow: React.FC<BearerKeyRowProps> = ({
         </span>
       </td>
       <td className="px-6 py-4 whitespace-nowrap text-sm text-gray-500">
-        {keyData.accessType === 'all'
-          ? t('settings.bearerKeyAccessAll') || 'All Resources'
-          : keyData.accessType === 'groups'
-            ? `${t('settings.bearerKeyAccessGroups') || 'Groups'}: ${keyData.allowedGroups}`
-            : `${t('settings.bearerKeyAccessServers') || 'Servers'}: ${keyData.allowedServers}`}
+        {formatAccessTypeDisplay(keyData)}
       </td>
       <td className="px-6 py-4 whitespace-nowrap text-right text-sm font-medium">
         <button
@@ -737,7 +803,7 @@ const SettingsPage: React.FC = () => {
     name: string;
     token: string;
    enabled: boolean;
-    accessType: 'all' | 'groups' | 'servers';
+    accessType: 'all' | 'groups' | 'servers' | 'custom';
     allowedGroups: string;
     allowedServers: string;
   }>({
@@ -765,10 +831,10 @@ const SettingsPage: React.FC = () => {

   // Reset selected arrays when accessType changes
   useEffect(() => {
-    if (newBearerKey.accessType !== 'groups') {
+    if (newBearerKey.accessType !== 'groups' && newBearerKey.accessType !== 'custom') {
       setNewSelectedGroups([]);
     }
-    if (newBearerKey.accessType !== 'servers') {
+    if (newBearerKey.accessType !== 'servers' && newBearerKey.accessType !== 'custom') {
       setNewSelectedServers([]);
     }
   }, [newBearerKey.accessType]);
@@ -866,6 +932,17 @@ const SettingsPage: React.FC = () => {
       );
       return;
     }
+    if (
+      newBearerKey.accessType === 'custom' &&
+      newSelectedGroups.length === 0 &&
+      newSelectedServers.length === 0
+    ) {
+      showToast(
+        t('settings.selectAtLeastOneGroupOrServer') || 'Please select at least one group or server',
+        'error',
+      );
+      return;
+    }
+
     await createBearerKey({
       name: newBearerKey.name,
@@ -873,11 +950,13 @@ const SettingsPage: React.FC = () => {
       enabled: newBearerKey.enabled,
       accessType: newBearerKey.accessType,
       allowedGroups:
-        newBearerKey.accessType === 'groups' && newSelectedGroups.length > 0
+        (newBearerKey.accessType === 'groups' || newBearerKey.accessType === 'custom') &&
+        newSelectedGroups.length > 0
           ? newSelectedGroups
           : undefined,
       allowedServers:
-        newBearerKey.accessType === 'servers' && newSelectedServers.length > 0
+        (newBearerKey.accessType === 'servers' || newBearerKey.accessType === 'custom') &&
+        newSelectedServers.length > 0
           ? newSelectedServers
           : undefined,
     } as any);
@@ -901,7 +980,7 @@ const SettingsPage: React.FC = () => {
       name: string;
       token: string;
       enabled: boolean;
-      accessType: 'all' | 'groups' | 'servers';
+      accessType: 'all' | 'groups' | 'servers' | 'custom';
       allowedGroups: string;
       allowedServers: string;
     },
@@ -1128,7 +1207,7 @@ const SettingsPage: React.FC = () => {
                 onChange={(e) =>
                   setNewBearerKey((prev) => ({
                     ...prev,
-                    accessType: e.target.value as 'all' | 'groups' | 'servers',
+                    accessType: e.target.value as 'all' | 'groups' | 'servers' | 'custom',
                   }))
                 }
                 disabled={loading}
@@ -1142,9 +1221,13 @@ const SettingsPage: React.FC = () => {
                 <option value="servers">
                   {t('settings.bearerKeyAccessServers') || 'Specific Servers'}
                 </option>
+                <option value="custom">
+                  {t('settings.bearerKeyAccessCustom') || 'Custom (Groups & Servers)'}
+                </option>
               </select>
             </div>

+            {newBearerKey.accessType !== 'custom' && (
             <div className="flex-1 min-w-[200px]">
               <label
                 className={`block text-sm font-medium mb-1 ${newBearerKey.accessType === 'all' ? 'text-gray-400' : 'text-gray-700'}`}
@@ -1177,6 +1260,36 @@ const SettingsPage: React.FC = () => {
                 disabled={loading || newBearerKey.accessType === 'all'}
               />
             </div>
+            )}
+
+            {newBearerKey.accessType === 'custom' && (
+              <>
+                <div className="flex-1 min-w-[200px]">
+                  <label className="block text-sm font-medium text-gray-700 mb-1">
+                    {t('settings.bearerKeyAllowedGroups') || 'Allowed groups'}
+                  </label>
+                  <MultiSelect
+                    options={availableGroups}
+                    selected={newSelectedGroups}
+                    onChange={setNewSelectedGroups}
+                    placeholder={t('settings.selectGroups') || 'Select groups...'}
+                    disabled={loading}
+                  />
+                </div>
+                <div className="flex-1 min-w-[200px]">
+                  <label className="block text-sm font-medium text-gray-700 mb-1">
+                    {t('settings.bearerKeyAllowedServers') || 'Allowed servers'}
+                  </label>
+                  <MultiSelect
+                    options={availableServers}
+                    selected={newSelectedServers}
+                    onChange={setNewSelectedServers}
+                    placeholder={t('settings.selectServers') || 'Select servers...'}
+                    disabled={loading}
+                  />
+                </div>
+              </>
+            )}

             <div className="flex justify-end gap-2">
               <button
@@ -310,7 +310,7 @@ export interface ApiResponse<T = any> {
 }

 // Bearer authentication key configuration (frontend view model)
-export type BearerKeyAccessType = 'all' | 'groups' | 'servers';
+export type BearerKeyAccessType = 'all' | 'groups' | 'servers' | 'custom';

 export interface BearerKey {
   id: string;
@@ -568,6 +568,7 @@
     "bearerKeyAccessAll": "All",
     "bearerKeyAccessGroups": "Groups",
     "bearerKeyAccessServers": "Servers",
+    "bearerKeyAccessCustom": "Custom",
     "bearerKeyAllowedGroups": "Allowed groups",
     "bearerKeyAllowedServers": "Allowed servers",
     "addBearerKey": "Add key",
@@ -569,6 +569,7 @@
     "bearerKeyAccessAll": "Toutes",
     "bearerKeyAccessGroups": "Groupes",
     "bearerKeyAccessServers": "Serveurs",
+    "bearerKeyAccessCustom": "Personnalisée",
     "bearerKeyAllowedGroups": "Groupes autorisés",
     "bearerKeyAllowedServers": "Serveurs autorisés",
     "addBearerKey": "Ajouter une clé",
@@ -569,6 +569,7 @@
     "bearerKeyAccessAll": "Tümü",
     "bearerKeyAccessGroups": "Gruplar",
     "bearerKeyAccessServers": "Sunucular",
+    "bearerKeyAccessCustom": "Özel",
     "bearerKeyAllowedGroups": "İzin verilen gruplar",
     "bearerKeyAllowedServers": "İzin verilen sunucular",
     "addBearerKey": "Anahtar ekle",
@@ -570,6 +570,7 @@
     "bearerKeyAccessAll": "全部",
     "bearerKeyAccessGroups": "指定分组",
     "bearerKeyAccessServers": "指定服务器",
+    "bearerKeyAccessCustom": "自定义",
     "bearerKeyAllowedGroups": "允许访问的分组",
     "bearerKeyAllowedServers": "允许访问的服务器",
     "addBearerKey": "新增密钥",
@@ -63,5 +63,6 @@
       "requiresAuthentication": false
     }
   }
-  }
+  },
+  "bearerKeys": []
 }
@@ -57,7 +57,7 @@ export const createBearerKey = async (req: Request, res: Response): Promise<void
     return;
   }

-  if (!accessType || !['all', 'groups', 'servers'].includes(accessType)) {
+  if (!accessType || !['all', 'groups', 'servers', 'custom'].includes(accessType)) {
     res.status(400).json({ success: false, message: 'Invalid accessType' });
     return;
   }
@@ -104,7 +104,7 @@ export const updateBearerKey = async (req: Request, res: Response): Promise<void
   if (token !== undefined) updates.token = token;
   if (enabled !== undefined) updates.enabled = enabled;
   if (accessType !== undefined) {
-    if (!['all', 'groups', 'servers'].includes(accessType)) {
+    if (!['all', 'groups', 'servers', 'custom'].includes(accessType)) {
       res.status(400).json({ success: false, message: 'Invalid accessType' });
       return;
     }
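Taken together, the form-side and controller-side checks in the hunks above amount to one validation rule for bearer keys: `custom` must reference at least one group or server, and unknown access types are rejected. The helper below is a standalone sketch of that rule; its name and return convention are assumptions, not code from the repository.

```typescript
// Returns an error message, or null when the selection is valid.
// A 'custom' key must reference at least one group or server; the other
// modes are constrained by their own single selector in the UI.
function validateBearerKeyAccess(
  accessType: string,
  groups: string[],
  servers: string[],
): string | null {
  if (!['all', 'groups', 'servers', 'custom'].includes(accessType)) {
    return 'Invalid accessType';
  }
  if (accessType === 'custom' && groups.length === 0 && servers.length === 0) {
    return 'Please select at least one group or server';
  }
  return null;
}
```

Centralizing the check like this would keep the frontend toast and the controller's 400 response in agreement.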
@@ -423,7 +423,7 @@ export const deleteServer = async (req: Request, res: Response): Promise<void> =
 export const updateServer = async (req: Request, res: Response): Promise<void> => {
   try {
     const { name } = req.params;
-    const { config } = req.body;
+    const { config, newName } = req.body;
     if (!name) {
       res.status(400).json({
         success: false,
@@ -510,12 +510,52 @@ export const updateServer = async (req: Request, res: Response): Promise<void> =
       config.owner = currentUser?.username || 'admin';
     }

-    const result = await addOrUpdateServer(name, config, true); // Allow override for updates
+    // Check if server name is being changed
+    const isRenaming = newName && newName !== name;
+
+    // If renaming, validate the new name and update references
+    if (isRenaming) {
+      const serverDao = getServerDao();
+
+      // Check if new name already exists
+      if (await serverDao.exists(newName)) {
+        res.status(400).json({
+          success: false,
+          message: `Server name '${newName}' already exists`,
+        });
+        return;
+      }
+
+      // Rename the server
+      const renamed = await serverDao.rename(name, newName);
+      if (!renamed) {
+        res.status(404).json({
+          success: false,
+          message: 'Server not found',
+        });
+        return;
+      }
+
+      // Update references in groups
+      const groupDao = getGroupDao();
+      await groupDao.updateServerName(name, newName);
+
+      // Update references in bearer keys
+      const bearerKeyDao = getBearerKeyDao();
+      await bearerKeyDao.updateServerName(name, newName);
+    }
+
+    // Use the final server name (new name if renaming, otherwise original name)
+    const finalName = isRenaming ? newName : name;
+
+    const result = await addOrUpdateServer(finalName, config, true); // Allow override for updates
     if (result.success) {
-      notifyToolChanged(name);
+      notifyToolChanged(finalName);
       res.json({
         success: true,
-        message: 'Server updated successfully',
+        message: isRenaming
+          ? `Server renamed and updated successfully`
+          : 'Server updated successfully',
       });
     } else {
       res.status(404).json({
@@ -524,9 +564,10 @@ export const updateServer = async (req: Request, res: Response): Promise<void> =
|
|||||||
});
|
});
|
||||||
}
|
}
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
|
console.error('Failed to update server:', error);
|
||||||
res.status(500).json({
|
res.status(500).json({
|
||||||
success: false,
|
success: false,
|
||||||
message: 'Internal server error',
|
message: error instanceof Error ? error.message : 'Internal server error',
|
||||||
});
|
});
|
||||||
}
|
}
|
||||||
};
|
};
|
||||||
|
|||||||
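The rename path above follows a fixed order: reject duplicate names, rename the server, then propagate the new name to groups and bearer keys. A minimal sketch of that flow, assuming a hypothetical `InMemoryServerStore` in place of the real DAOs (`getServerDao` etc.):

```typescript
// In-memory stand-in for the server DAO (hypothetical, for illustration only).
class InMemoryServerStore {
  constructor(private names: Set<string>) {}
  exists(name: string): boolean {
    return this.names.has(name);
  }
  rename(oldName: string, newName: string): boolean {
    if (!this.names.has(oldName)) return false;
    this.names.delete(oldName);
    this.names.add(newName);
    return true;
  }
}

interface RenameResult {
  ok: boolean;
  message: string;
}

// Mirrors the controller: duplicate check first, then rename; on success the
// caller would go on to update references in groups and bearer keys.
function renameServer(store: InMemoryServerStore, name: string, newName: string): RenameResult {
  if (store.exists(newName)) {
    return { ok: false, message: `Server name '${newName}' already exists` };
  }
  if (!store.rename(name, newName)) {
    return { ok: false, message: 'Server not found' };
  }
  return { ok: true, message: 'Server renamed and updated successfully' };
}
```

Doing the duplicate check before touching any state keeps the 400 path side-effect free.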
@@ -13,6 +13,10 @@ export interface BearerKeyDao {
  create(data: Omit<BearerKey, 'id'>): Promise<BearerKey>;
  update(id: string, data: Partial<Omit<BearerKey, 'id'>>): Promise<BearerKey | null>;
  delete(id: string): Promise<boolean>;

  /**
   * Update server name in all bearer keys (when server is renamed)
   */
  updateServerName(oldName: string, newName: string): Promise<number>;
}

/**
@@ -122,4 +126,34 @@ export class BearerKeyDaoImpl extends JsonFileBaseDao implements BearerKeyDao {
    await this.saveKeys(next);
    return true;
  }

  async updateServerName(oldName: string, newName: string): Promise<number> {
    const keys = await this.loadKeysWithMigration();
    let updatedCount = 0;

    for (const key of keys) {
      let updated = false;

      if (key.allowedServers && key.allowedServers.length > 0) {
        const newServers = key.allowedServers.map((server) => {
          if (server === oldName) {
            updated = true;
            return newName;
          }
          return server;
        });

        if (updated) {
          key.allowedServers = newServers;
          updatedCount++;
        }
      }
    }

    if (updatedCount > 0) {
      await this.saveKeys(keys);
    }

    return updatedCount;
  }
}
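The per-key update inside `updateServerName` is a pure transformation, which makes it easy to reason about: replace each occurrence of the old name and report whether anything changed, so the caller can count dirty keys and persist only when needed. A standalone sketch (the helper name is ours, not from the codebase):

```typescript
// Replace oldName with newName in an allowedServers list, reporting whether
// the list actually changed. Undefined/empty lists pass through untouched.
function renameInAllowedServers(
  allowedServers: string[] | undefined,
  oldName: string,
  newName: string,
): { servers: string[] | undefined; updated: boolean } {
  if (!allowedServers || allowedServers.length === 0) {
    return { servers: allowedServers, updated: false };
  }
  let updated = false;
  const servers = allowedServers.map((server) => {
    if (server === oldName) {
      updated = true;
      return newName;
    }
    return server;
  });
  return { servers, updated };
}
```

Returning the `updated` flag is what lets the JSON implementation save the file once at the end instead of per key.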
@@ -74,4 +74,30 @@ export class BearerKeyDaoDbImpl implements BearerKeyDao {
  async delete(id: string): Promise<boolean> {
    return await this.repository.delete(id);
  }

  async updateServerName(oldName: string, newName: string): Promise<number> {
    const allKeys = await this.repository.findAll();
    let updatedCount = 0;

    for (const key of allKeys) {
      let updated = false;

      if (key.allowedServers && key.allowedServers.length > 0) {
        const newServers = key.allowedServers.map((server) => {
          if (server === oldName) {
            updated = true;
            return newName;
          }
          return server;
        });

        if (updated) {
          await this.repository.update(key.id, { allowedServers: newServers });
          updatedCount++;
        }
      }
    }

    return updatedCount;
  }
}
@@ -36,6 +36,11 @@ export interface GroupDao extends BaseDao<IGroup, string> {
   * Find group by name
   */
  findByName(name: string): Promise<IGroup | null>;

  /**
   * Update server name in all groups (when server is renamed)
   */
  updateServerName(oldName: string, newName: string): Promise<number>;
}

/**
@@ -218,4 +223,39 @@ export class GroupDaoImpl extends JsonFileBaseDao implements GroupDao {
    const groups = await this.getAll();
    return groups.find((group) => group.name === name) || null;
  }

  async updateServerName(oldName: string, newName: string): Promise<number> {
    const groups = await this.getAll();
    let updatedCount = 0;

    for (const group of groups) {
      let updated = false;
      const newServers = group.servers.map((server) => {
        if (typeof server === 'string') {
          if (server === oldName) {
            updated = true;
            return newName;
          }
          return server;
        } else {
          if (server.name === oldName) {
            updated = true;
            return { ...server, name: newName };
          }
          return server;
        }
      }) as IGroup['servers'];

      if (updated) {
        group.servers = newServers;
        updatedCount++;
      }
    }

    if (updatedCount > 0) {
      await this.saveAll(groups);
    }

    return updatedCount;
  }
}
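Groups store their members either as plain server names or as objects carrying a `name` field, which is why the rename above branches on `typeof server`. A pure sketch of that branch, assuming a trimmed-down `GroupServer` union in place of `IGroup['servers']`:

```typescript
// Simplified stand-in for the group server entry shape (assumption: real
// entries may carry more fields, which the spread preserves).
type GroupServer = string | { name: string; tools?: string[] };

function renameGroupServers(
  servers: GroupServer[],
  oldName: string,
  newName: string,
): { servers: GroupServer[]; updated: boolean } {
  let updated = false;
  const next = servers.map((server) => {
    if (typeof server === 'string') {
      if (server === oldName) {
        updated = true;
        return newName;
      }
      return server;
    }
    // Object form: rename without dropping sibling fields like tools.
    if (server.name === oldName) {
      updated = true;
      return { ...server, name: newName };
    }
    return server;
  });
  return { servers: next, updated };
}
```

The spread on the object branch matters: replacing the entry wholesale would silently drop a group's per-server tool selection.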
@@ -151,4 +151,35 @@ export class GroupDaoDbImpl implements GroupDao {
      owner: group.owner,
    };
  }

  async updateServerName(oldName: string, newName: string): Promise<number> {
    const allGroups = await this.repository.findAll();
    let updatedCount = 0;

    for (const group of allGroups) {
      let updated = false;
      const newServers = group.servers.map((server) => {
        if (typeof server === 'string') {
          if (server === oldName) {
            updated = true;
            return newName;
          }
          return server;
        } else {
          if (server.name === oldName) {
            updated = true;
            return { ...server, name: newName };
          }
          return server;
        }
      });

      if (updated) {
        await this.update(group.id, { servers: newServers as any });
        updatedCount++;
      }
    }

    return updatedCount;
  }
}
@@ -41,6 +41,11 @@ export interface ServerDao extends BaseDao<ServerConfigWithName, string> {
    name: string,
    prompts: Record<string, { enabled: boolean; description?: string }>,
  ): Promise<boolean>;

  /**
   * Rename a server (change its name/key)
   */
  rename(oldName: string, newName: string): Promise<boolean>;
}

/**
@@ -95,7 +100,8 @@ export class ServerDaoImpl extends JsonFileBaseDao implements ServerDao {
    return {
      ...existing,
      ...updates,
      // Keep the existing name unless explicitly updating via rename
      name: updates.name ?? existing.name,
    };
  }

@@ -141,9 +147,7 @@ export class ServerDaoImpl extends JsonFileBaseDao implements ServerDao {
      return null;
    }

    const updatedServer = this.updateEntity(servers[index], updates);
    servers[index] = updatedServer;

    await this.saveAll(servers);
@@ -207,4 +211,22 @@ export class ServerDaoImpl extends JsonFileBaseDao implements ServerDao {
    const result = await this.update(name, { prompts });
    return result !== null;
  }

  async rename(oldName: string, newName: string): Promise<boolean> {
    const servers = await this.getAll();
    const index = servers.findIndex((server) => server.name === oldName);

    if (index === -1) {
      return false;
    }

    // Check if newName already exists
    if (servers.find((server) => server.name === newName)) {
      throw new Error(`Server ${newName} already exists`);
    }

    servers[index] = { ...servers[index], name: newName };
    await this.saveAll(servers);
    return true;
  }
}
@@ -115,6 +115,15 @@ export class ServerDaoDbImpl implements ServerDao {
    return result !== null;
  }

  async rename(oldName: string, newName: string): Promise<boolean> {
    // Check if newName already exists
    if (await this.repository.exists(newName)) {
      throw new Error(`Server ${newName} already exists`);
    }

    return await this.repository.rename(oldName, newName);
  }

  private mapToServerConfig(server: {
    name: string;
    type?: string;
@@ -25,7 +25,7 @@ export class BearerKey {
  enabled: boolean;

  @Column({ type: 'varchar', length: 20, default: 'all' })
  accessType: 'all' | 'groups' | 'servers' | 'custom';

  @Column({ type: 'simple-json', nullable: true })
  allowedGroups?: string[];
@@ -33,6 +33,9 @@ export class SystemConfig {
  @Column({ type: 'boolean', nullable: true })
  enableSessionRebuild?: boolean;

  @Column({ type: 'simple-json', nullable: true })
  compression?: Record<string, any>;

  @CreateDateColumn({ name: 'created_at', type: 'timestamp' })
  createdAt: Date;
@@ -89,6 +89,19 @@ export class ServerRepository {
  async setEnabled(name: string, enabled: boolean): Promise<Server | null> {
    return await this.update(name, { enabled });
  }

  /**
   * Rename a server
   */
  async rename(oldName: string, newName: string): Promise<boolean> {
    const server = await this.findByName(oldName);
    if (!server) {
      return false;
    }
    server.name = newName;
    await this.repository.save(server);
    return true;
  }
}

export default ServerRepository;
@@ -32,6 +32,7 @@ export class SystemConfigRepository {
        oauth: {},
        oauthServer: {},
        enableSessionRebuild: false,
        compression: {},
      });
      config = await this.repository.save(config);
    }
266 src/services/compressionService.ts (new file)
@@ -0,0 +1,266 @@
import OpenAI from 'openai';
import { getSmartRoutingConfig, SmartRoutingConfig } from '../utils/smartRouting.js';
import { getSystemConfigDao } from '../dao/index.js';

/**
 * Compression configuration interface
 */
export interface CompressionConfig {
  enabled: boolean;
  model?: string;
  maxInputTokens?: number;
  targetReductionRatio?: number;
}

/**
 * Default compression configuration
 */
const DEFAULT_COMPRESSION_CONFIG: CompressionConfig = {
  enabled: false,
  model: 'gpt-4o-mini',
  maxInputTokens: 100000,
  targetReductionRatio: 0.5,
};

/**
 * Get compression configuration from system settings
 */
export async function getCompressionConfig(): Promise<CompressionConfig> {
  try {
    const systemConfigDao = getSystemConfigDao();
    const systemConfig = await systemConfigDao.get();
    const compressionSettings = systemConfig?.compression || {};

    return {
      enabled: compressionSettings.enabled ?? DEFAULT_COMPRESSION_CONFIG.enabled,
      model: compressionSettings.model ?? DEFAULT_COMPRESSION_CONFIG.model,
      maxInputTokens: compressionSettings.maxInputTokens ?? DEFAULT_COMPRESSION_CONFIG.maxInputTokens,
      targetReductionRatio:
        compressionSettings.targetReductionRatio ?? DEFAULT_COMPRESSION_CONFIG.targetReductionRatio,
    };
  } catch (error) {
    console.warn('Failed to get compression config, using defaults:', error);
    return DEFAULT_COMPRESSION_CONFIG;
  }
}

/**
 * Check if compression is available and enabled
 */
export async function isCompressionEnabled(): Promise<boolean> {
  const config = await getCompressionConfig();
  if (!config.enabled) {
    return false;
  }

  // Check if we have OpenAI API key configured (via smart routing config)
  const smartRoutingConfig = await getSmartRoutingConfig();
  return !!smartRoutingConfig.openaiApiKey;
}

/**
 * Get OpenAI client for compression
 */
async function getOpenAIClient(smartRoutingConfig: SmartRoutingConfig): Promise<OpenAI | null> {
  if (!smartRoutingConfig.openaiApiKey) {
    return null;
  }

  return new OpenAI({
    apiKey: smartRoutingConfig.openaiApiKey,
    baseURL: smartRoutingConfig.openaiApiBaseUrl || 'https://api.openai.com/v1',
  });
}

/**
 * Estimate token count for a string (rough approximation)
 * Uses ~4 characters per token as a rough estimate
 */
export function estimateTokenCount(text: string): number {
  return Math.ceil(text.length / 4);
}

/**
 * Check if content should be compressed based on token count
 */
export function shouldCompress(content: string, maxInputTokens: number): boolean {
  const estimatedTokens = estimateTokenCount(content);
  // Only compress if content is larger than a reasonable threshold
  const compressionThreshold = Math.min(maxInputTokens * 0.1, 1000);
  return estimatedTokens > compressionThreshold;
}

/**
 * Compress MCP tool output using AI
 *
 * @param content The MCP tool output content to compress
 * @param context Optional context about the tool that generated this output
 * @returns Compressed content or original content if compression fails/is disabled
 */
export async function compressOutput(
  content: string,
  context?: {
    toolName?: string;
    serverName?: string;
  },
): Promise<{ compressed: string; originalLength: number; compressedLength: number; wasCompressed: boolean }> {
  const originalLength = content.length;

  // Check if compression is enabled
  const compressionConfig = await getCompressionConfig();
  if (!compressionConfig.enabled) {
    return {
      compressed: content,
      originalLength,
      compressedLength: originalLength,
      wasCompressed: false,
    };
  }

  // Check if content should be compressed
  if (!shouldCompress(content, compressionConfig.maxInputTokens || 100000)) {
    return {
      compressed: content,
      originalLength,
      compressedLength: originalLength,
      wasCompressed: false,
    };
  }

  try {
    const smartRoutingConfig = await getSmartRoutingConfig();
    const openai = await getOpenAIClient(smartRoutingConfig);

    if (!openai) {
      console.warn('Compression enabled but OpenAI API key not configured');
      return {
        compressed: content,
        originalLength,
        compressedLength: originalLength,
        wasCompressed: false,
      };
    }

    const targetRatio = compressionConfig.targetReductionRatio || 0.5;
    const toolContext = context?.toolName ? `from tool "${context.toolName}"` : '';
    const serverContext = context?.serverName ? `on server "${context.serverName}"` : '';

    const systemPrompt = `You are a data compression assistant. Your task is to compress MCP (Model Context Protocol) tool outputs while preserving all essential information.

Guidelines:
- Remove redundant information, formatting, and verbose descriptions
- Preserve all data values, identifiers, and critical information
- Keep error messages and status information intact
- Maintain structured data (JSON, arrays) in a compact but readable format
- Target approximately ${Math.round(targetRatio * 100)}% reduction in size
- If the content cannot be meaningfully compressed, return it as-is

The output is ${toolContext} ${serverContext}.`;

    const userPrompt = `Compress the following MCP tool output while preserving all essential information:

${content}`;

    const response = await openai.chat.completions.create({
      model: compressionConfig.model || 'gpt-4o-mini',
      messages: [
        { role: 'system', content: systemPrompt },
        { role: 'user', content: userPrompt },
      ],
      temperature: 0.1,
      max_tokens: Math.ceil(estimateTokenCount(content) * targetRatio * 1.5),
    });

    const compressedContent = response.choices[0]?.message?.content;

    if (!compressedContent) {
      console.warn('Compression returned empty result, using original content');
      return {
        compressed: content,
        originalLength,
        compressedLength: originalLength,
        wasCompressed: false,
      };
    }

    const compressedLength = compressedContent.length;

    // Only use compressed version if it's actually smaller
    if (compressedLength >= originalLength) {
      console.log('Compression did not reduce size, using original content');
      return {
        compressed: content,
        originalLength,
        compressedLength: originalLength,
        wasCompressed: false,
      };
    }

    const reductionPercent = (((originalLength - compressedLength) / originalLength) * 100).toFixed(1);
    console.log(`Compressed output: ${originalLength} -> ${compressedLength} chars (${reductionPercent}% reduction)`);

    return {
      compressed: compressedContent,
      originalLength,
      compressedLength,
      wasCompressed: true,
    };
  } catch (error) {
    console.error('Compression failed, using original content:', error);
    return {
      compressed: content,
      originalLength,
      compressedLength: originalLength,
      wasCompressed: false,
    };
  }
}

/**
 * Compress tool call result content
 * This handles the MCP tool result format with content array
 */
export async function compressToolResult(
  result: any,
  context?: {
    toolName?: string;
    serverName?: string;
  },
): Promise<any> {
  // Check if compression is enabled first
  const compressionEnabled = await isCompressionEnabled();
  if (!compressionEnabled) {
    return result;
  }

  // Handle error results - don't compress error messages
  if (result?.isError) {
    return result;
  }

  // Handle content array format
  if (!result?.content || !Array.isArray(result.content)) {
    return result;
  }

  const compressedContent = await Promise.all(
    result.content.map(async (item: any) => {
      // Only compress text content
      if (item?.type !== 'text' || !item?.text) {
        return item;
      }

      const compressionResult = await compressOutput(item.text, context);

      return {
        ...item,
        text: compressionResult.compressed,
      };
    }),
  );

  return {
    ...result,
    content: compressedContent,
  };
}
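The gating logic in the new service is pure and can be exercised without an API key: roughly 4 characters per token, and compression only kicks in once the content exceeds min(10% of the token budget, 1000 tokens). The two helpers reproduced standalone for illustration:

```typescript
// ~4 characters per token, rounded up (same heuristic as estimateTokenCount).
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Compress only when the estimated size clears the threshold:
// min(10% of maxInputTokens, 1000 tokens).
function shouldCompressText(content: string, maxInputTokens: number): boolean {
  const threshold = Math.min(maxInputTokens * 0.1, 1000);
  return estimateTokens(content) > threshold;
}
```

With the default 100000-token budget the cap of 1000 tokens wins, so anything under roughly 4000 characters is passed through uncompressed, avoiding a model round-trip for small outputs.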
@@ -48,7 +48,9 @@ export const setupClientKeepAlive = async (
        await (serverInfo.client as any).ping();
        console.log(`Keep-alive ping successful for server: ${serverInfo.name}`);
      } else {
        await serverInfo.client
          .listTools({}, { ...(serverInfo.options || {}), timeout: 5000 })
          .catch(() => void 0);
      }
    }
  } catch (error) {
@@ -27,6 +27,7 @@ import { getDataService } from './services.js';
import { getServerDao, getSystemConfigDao, ServerConfigWithName } from '../dao/index.js';
import { initializeAllOAuthClients } from './oauthService.js';
import { createOAuthProvider } from './mcpOAuthProvider.js';
import { compressToolResult } from './compressionService.js';

const servers: { [sessionId: string]: Server } = {};

@@ -1260,7 +1261,7 @@ export const handleCallToolRequest = async (request: any, extra: any) => {
      const result = await openApiClient.callTool(cleanToolName, finalArgs, passthroughHeaders);

      console.log(`OpenAPI tool invocation result: ${JSON.stringify(result)}`);
      const openApiResult = {
        content: [
          {
            type: 'text',
@@ -1268,6 +1269,10 @@ export const handleCallToolRequest = async (request: any, extra: any) => {
          },
        ],
      };
      return compressToolResult(openApiResult, {
        toolName: cleanToolName,
        serverName: targetServerInfo.name,
      });
    }

    // Call the tool on the target server (MCP servers)
@@ -1297,7 +1302,10 @@ export const handleCallToolRequest = async (request: any, extra: any) => {
      );

      console.log(`Tool invocation result: ${JSON.stringify(result)}`);
      return compressToolResult(result, {
        toolName,
        serverName: targetServerInfo.name,
      });
    }

    // Regular tool handling
@@ -1356,7 +1364,7 @@ export const handleCallToolRequest = async (request: any, extra: any) => {
      );

      console.log(`OpenAPI tool invocation result: ${JSON.stringify(result)}`);
      const openApiResult = {
        content: [
          {
            type: 'text',
@@ -1364,6 +1372,10 @@ export const handleCallToolRequest = async (request: any, extra: any) => {
          },
        ],
      };
      return compressToolResult(openApiResult, {
        toolName: cleanToolName,
        serverName: serverInfo.name,
      });
    }

    // Handle MCP servers
@@ -1374,6 +1386,7 @@ export const handleCallToolRequest = async (request: any, extra: any) => {

    const separator = getNameSeparator();
    const prefix = `${serverInfo.name}${separator}`;
    const originalToolName = request.params.name;
    request.params.name = request.params.name.startsWith(prefix)
      ? request.params.name.substring(prefix.length)
      : request.params.name;
@@ -1383,7 +1396,10 @@ export const handleCallToolRequest = async (request: any, extra: any) => {
      serverInfo.options || {},
    );
    console.log(`Tool call result: ${JSON.stringify(result)}`);
    return compressToolResult(result, {
      toolName: originalToolName,
      serverName: serverInfo.name,
    });
  } catch (error) {
    console.error(`Error handling CallToolRequest: ${error}`);
    return {
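The prefix-stripping step in the MCP-server branch is worth isolating: tool names arrive as `<server><separator><tool>`, the hub removes the prefix before calling upstream, and the diff now captures the original name first so compression context can still report it. A sketch with the separator passed explicitly:

```typescript
// Strip a `<serverName><separator>` prefix from a hub-qualified tool name;
// names without the prefix pass through unchanged.
function stripServerPrefix(toolName: string, serverName: string, separator: string): string {
  const prefix = `${serverName}${separator}`;
  return toolName.startsWith(prefix) ? toolName.substring(prefix.length) : toolName;
}
```

Capturing `originalToolName` before this mutation is the point of the added line in the hunk: `request.params.name` is rewritten in place, so the qualified name would otherwise be lost.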
@@ -88,6 +88,29 @@ const isBearerKeyAllowedForRequest = async (req: Request, key: BearerKey): Promi
      return groupServerNames.some((name) => allowedServers.includes(name));
    }

    if (key.accessType === 'custom') {
      // For custom-scoped keys, check if the group is allowed OR if any server in the group is allowed
      const allowedGroups = key.allowedGroups || [];
      const allowedServers = key.allowedServers || [];

      // Check if the group itself is allowed
      const groupAllowed =
        allowedGroups.includes(matchedGroup.name) || allowedGroups.includes(matchedGroup.id);
      if (groupAllowed) {
        return true;
      }

      // Check if any server in the group is allowed
      if (allowedServers.length > 0 && Array.isArray(matchedGroup.servers)) {
        const groupServerNames = matchedGroup.servers.map((server) =>
          typeof server === 'string' ? server : server.name,
        );
        return groupServerNames.some((name) => allowedServers.includes(name));
      }

      return false;
    }

    // Unknown accessType with matched group
    return false;
  }
@@ -102,8 +125,8 @@ const isBearerKeyAllowedForRequest = async (req: Request, key: BearerKey): Promi
      return false;
    }

    if (key.accessType === 'servers' || key.accessType === 'custom') {
      // For server-scoped or custom-scoped keys, check if the server is in allowedServers
      const allowedServers = key.allowedServers || [];
      return allowedServers.includes(matchedServer.name);
    }
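The new `custom` scope grants group access when either the group itself is allowed or any server inside it is. A pure sketch of that predicate, assuming a trimmed-down view of `BearerKey` and the group shape (field names match the middleware; the helper name is ours):

```typescript
// Trimmed-down key view: only the fields the custom check consults.
interface CustomKey {
  allowedGroups?: string[];
  allowedServers?: string[];
}

function customKeyAllowsGroup(
  key: CustomKey,
  group: { id: string; name: string; servers: Array<string | { name: string }> },
): boolean {
  const allowedGroups = key.allowedGroups || [];
  const allowedServers = key.allowedServers || [];

  // The group itself may be allowed by name or by id.
  if (allowedGroups.includes(group.name) || allowedGroups.includes(group.id)) {
    return true;
  }

  // Otherwise, any allowed server that is a member of the group grants access.
  if (allowedServers.length > 0) {
    const names = group.servers.map((s) => (typeof s === 'string' ? s : s.name));
    return names.some((n) => allowedServers.includes(n));
  }

  return false;
}
```

Note the default is deny: a custom key with neither list populated matches nothing.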
@@ -1,7 +1,7 @@
import { getRepositoryFactory } from '../db/index.js';
import { VectorEmbeddingRepository } from '../db/repositories/index.js';
import { Tool } from '../types/index.js';
import { getAppDataSource, isDatabaseConnected, initializeDatabase } from '../db/connection.js';
import { getSmartRoutingConfig } from '../utils/smartRouting.js';
import OpenAI from 'openai';

@@ -197,6 +197,12 @@ export const saveToolsAsVectorEmbeddings = async (
    return;
  }

  // Ensure database is initialized before using repository
  if (!isDatabaseConnected()) {
    console.info('Database not initialized, initializing...');
    await initializeDatabase();
  }

  const config = await getOpenAIConfig();
  const vectorRepository = getRepositoryFactory(
    'vectorEmbeddings',
@@ -245,7 +251,7 @@ export const saveToolsAsVectorEmbeddings = async (

    console.log(`Saved ${tools.length} tool embeddings for server: ${serverName}`);
  } catch (error) {
    console.error(`Error saving tool embeddings for server ${serverName}:${error}`);
  }
};
@@ -173,6 +173,12 @@ export interface SystemConfig {
|
|||||||
oauth?: OAuthProviderConfig; // OAuth provider configuration for upstream MCP servers
|
oauth?: OAuthProviderConfig; // OAuth provider configuration for upstream MCP servers
|
||||||
oauthServer?: OAuthServerConfig; // OAuth authorization server configuration for MCPHub itself
|
oauthServer?: OAuthServerConfig; // OAuth authorization server configuration for MCPHub itself
|
||||||
enableSessionRebuild?: boolean; // Controls whether server session rebuild is enabled
|
enableSessionRebuild?: boolean; // Controls whether server session rebuild is enabled
|
||||||
|
compression?: {
|
||||||
|
enabled?: boolean; // Enable/disable AI compression of MCP tool outputs
|
||||||
|
model?: string; // AI model to use for compression (default: 'gpt-4o-mini')
|
||||||
|
maxInputTokens?: number; // Maximum input tokens for compression (default: 100000)
|
||||||
|
targetReductionRatio?: number; // Target reduction ratio, 0.0-1.0 (default: 0.5)
|
||||||
|
};
|
||||||
}
|
}
|
||||||
|
|
||||||
export interface UserConfig {
|
export interface UserConfig {
|
||||||
@@ -244,7 +250,7 @@ export interface OAuthServerConfig {
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Bearer authentication key configuration
|
// Bearer authentication key configuration
|
||||||
export type BearerKeyAccessType = 'all' | 'groups' | 'servers';
|
export type BearerKeyAccessType = 'all' | 'groups' | 'servers' | 'custom';
|
||||||
|
|
||||||
export interface BearerKey {
|
export interface BearerKey {
|
||||||
id: string; // Unique identifier for the key
|
id: string; // Unique identifier for the key
|
||||||
@@ -252,8 +258,8 @@ export interface BearerKey {
|
|||||||
token: string; // Bearer token value
|
token: string; // Bearer token value
|
||||||
enabled: boolean; // Whether this key is enabled
|
enabled: boolean; // Whether this key is enabled
|
||||||
accessType: BearerKeyAccessType; // Access scope type
|
accessType: BearerKeyAccessType; // Access scope type
|
||||||
allowedGroups?: string[]; // Allowed group names when accessType === 'groups'
|
allowedGroups?: string[]; // Allowed group names when accessType === 'groups' or 'custom'
|
||||||
allowedServers?: string[]; // Allowed server names when accessType === 'servers'
|
allowedServers?: string[]; // Allowed server names when accessType === 'servers' or 'custom'
|
||||||
}
|
}
|
||||||
|
|
||||||
// Represents the settings for MCP servers
|
// Represents the settings for MCP servers
|
||||||
|
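The updated comments imply that a `'custom'` key may whitelist groups and servers together. One way such a key could be evaluated is sketched below; `isAllowed` and its union semantics (group match OR server match) are assumptions for illustration, not MCPHub's actual authorization code.

```typescript
// Sketch only: the union semantics for 'custom' keys are an assumption,
// not MCPHub's real authorization logic.
type BearerKeyAccessType = 'all' | 'groups' | 'servers' | 'custom';

interface BearerKey {
  enabled: boolean;
  accessType: BearerKeyAccessType;
  allowedGroups?: string[];
  allowedServers?: string[];
}

// Hypothetical helper: does this key grant access to a server that
// belongs to the given groups?
function isAllowed(key: BearerKey, server: string, groups: string[]): boolean {
  if (!key.enabled) return false;
  switch (key.accessType) {
    case 'all':
      return true;
    case 'groups':
      return groups.some((g) => key.allowedGroups?.includes(g) ?? false);
    case 'servers':
      return key.allowedServers?.includes(server) ?? false;
    case 'custom':
      // Assumed union: a custom key may list both groups and servers.
      return (
        groups.some((g) => key.allowedGroups?.includes(g) ?? false) ||
        (key.allowedServers?.includes(server) ?? false)
      );
    default:
      return false;
  }
}

const key: BearerKey = {
  enabled: true,
  accessType: 'custom',
  allowedGroups: ['search'],
  allowedServers: ['fetch'],
};
```

Under this reading, `'custom'` is a superset of `'groups'` and `'servers'`: a request passes if either list matches.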
@@ -117,6 +117,7 @@ export async function migrateToDatabase(): Promise<boolean> {
        oauth: settings.systemConfig.oauth || {},
        oauthServer: settings.systemConfig.oauthServer || {},
        enableSessionRebuild: settings.systemConfig.enableSessionRebuild,
+       compression: settings.systemConfig.compression || {},
      };
      await systemConfigRepo.update(systemConfig);
      console.log(' - System configuration updated');
tests/services/compressionService.test.ts (new file, 428 lines)
@@ -0,0 +1,428 @@
+// Mock the DAO module before imports
+jest.mock('../../src/dao/index.js', () => ({
+  getSystemConfigDao: jest.fn(),
+}));
+
+// Mock smart routing config
+jest.mock('../../src/utils/smartRouting.js', () => ({
+  getSmartRoutingConfig: jest.fn(),
+}));
+
+// Mock OpenAI
+jest.mock('openai', () => {
+  return {
+    __esModule: true,
+    default: jest.fn().mockImplementation(() => ({
+      chat: {
+        completions: {
+          create: jest.fn(),
+        },
+      },
+    })),
+  };
+});
+
+import {
+  getCompressionConfig,
+  isCompressionEnabled,
+  estimateTokenCount,
+  shouldCompress,
+  compressOutput,
+  compressToolResult,
+} from '../../src/services/compressionService.js';
+import { getSystemConfigDao } from '../../src/dao/index.js';
+import { getSmartRoutingConfig } from '../../src/utils/smartRouting.js';
+import OpenAI from 'openai';
+
+describe('CompressionService', () => {
+  const mockSystemConfigDao = {
+    get: jest.fn(),
+    getSection: jest.fn(),
+    update: jest.fn(),
+    updateSection: jest.fn(),
+  };
+
+  beforeEach(() => {
+    jest.clearAllMocks();
+    (getSystemConfigDao as jest.Mock).mockReturnValue(mockSystemConfigDao);
+  });
+
+  describe('getCompressionConfig', () => {
+    it('should return default config when no config is set', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({});
+
+      const config = await getCompressionConfig();
+
+      expect(config).toEqual({
+        enabled: false,
+        model: 'gpt-4o-mini',
+        maxInputTokens: 100000,
+        targetReductionRatio: 0.5,
+      });
+    });
+
+    it('should return configured values when set', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: {
+          enabled: true,
+          model: 'gpt-4o',
+          maxInputTokens: 50000,
+          targetReductionRatio: 0.3,
+        },
+      });
+
+      const config = await getCompressionConfig();
+
+      expect(config).toEqual({
+        enabled: true,
+        model: 'gpt-4o',
+        maxInputTokens: 50000,
+        targetReductionRatio: 0.3,
+      });
+    });
+
+    it('should use defaults for missing values', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: {
+          enabled: true,
+        },
+      });
+
+      const config = await getCompressionConfig();
+
+      expect(config).toEqual({
+        enabled: true,
+        model: 'gpt-4o-mini',
+        maxInputTokens: 100000,
+        targetReductionRatio: 0.5,
+      });
+    });
+
+    it('should return defaults on error', async () => {
+      mockSystemConfigDao.get.mockRejectedValue(new Error('Test error'));
+
+      const config = await getCompressionConfig();
+
+      expect(config).toEqual({
+        enabled: false,
+        model: 'gpt-4o-mini',
+        maxInputTokens: 100000,
+        targetReductionRatio: 0.5,
+      });
+    });
+  });
+
+  describe('isCompressionEnabled', () => {
+    it('should return false when compression is disabled', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: false },
+      });
+
+      const enabled = await isCompressionEnabled();
+
+      expect(enabled).toBe(false);
+    });
+
+    it('should return false when enabled but no API key', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: true },
+      });
+      (getSmartRoutingConfig as jest.Mock).mockResolvedValue({
+        openaiApiKey: '',
+      });
+
+      const enabled = await isCompressionEnabled();
+
+      expect(enabled).toBe(false);
+    });
+
+    it('should return true when enabled and API key is set', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: true },
+      });
+      (getSmartRoutingConfig as jest.Mock).mockResolvedValue({
+        openaiApiKey: 'test-api-key',
+      });
+
+      const enabled = await isCompressionEnabled();
+
+      expect(enabled).toBe(true);
+    });
+  });
+
+  describe('estimateTokenCount', () => {
+    it('should estimate tokens for short text', () => {
+      const text = 'Hello world';
+      const tokens = estimateTokenCount(text);
+
+      // Estimate based on ~4 chars per token
+      expect(tokens).toBe(Math.ceil(text.length / 4));
+    });
+
+    it('should estimate tokens for longer text', () => {
+      const text = 'This is a longer piece of text that should have more tokens';
+      const tokens = estimateTokenCount(text);
+
+      // Estimate based on ~4 chars per token
+      expect(tokens).toBe(Math.ceil(text.length / 4));
+    });
+
+    it('should handle empty string', () => {
+      const tokens = estimateTokenCount('');
+
+      expect(tokens).toBe(0);
+    });
+  });
+
+  describe('shouldCompress', () => {
+    it('should return false for small content', () => {
+      const content = 'Small content';
+      const result = shouldCompress(content, 100000);
+
+      expect(result).toBe(false);
+    });
+
+    it('should return true for large content', () => {
+      // Create content larger than the threshold
+      const content = 'x'.repeat(5000);
+      const result = shouldCompress(content, 100000);
+
+      expect(result).toBe(true);
+    });
+
+    it('should use 10% of maxInputTokens as threshold', () => {
+      // Test threshold behavior with different content sizes
+      const smallContent = 'x'.repeat(300);
+      const largeContent = 'x'.repeat(500);
+
+      expect(shouldCompress(smallContent, 1000)).toBe(false);
+      expect(shouldCompress(largeContent, 1000)).toBe(true);
+    });
+  });
+
+  describe('compressOutput', () => {
+    it('should return original content when compression is disabled', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: false },
+      });
+
+      const content = 'Test content';
+      const result = await compressOutput(content);
+
+      expect(result).toEqual({
+        compressed: content,
+        originalLength: content.length,
+        compressedLength: content.length,
+        wasCompressed: false,
+      });
+    });
+
+    it('should return original content when content is too small', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: true, maxInputTokens: 100000 },
+      });
+      (getSmartRoutingConfig as jest.Mock).mockResolvedValue({
+        openaiApiKey: 'test-api-key',
+      });
+
+      const content = 'Small content';
+      const result = await compressOutput(content);
+
+      expect(result.wasCompressed).toBe(false);
+      expect(result.compressed).toBe(content);
+    });
+
+    it('should return original content when no API key is configured', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: true },
+      });
+      (getSmartRoutingConfig as jest.Mock).mockResolvedValue({
+        openaiApiKey: '',
+      });
+
+      const content = 'x'.repeat(5000);
+      const result = await compressOutput(content);
+
+      expect(result.wasCompressed).toBe(false);
+      expect(result.compressed).toBe(content);
+    });
+
+    it('should compress content when enabled and content is large', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: true, model: 'gpt-4o-mini', maxInputTokens: 100000 },
+      });
+      (getSmartRoutingConfig as jest.Mock).mockResolvedValue({
+        openaiApiKey: 'test-api-key',
+        openaiApiBaseUrl: 'https://api.openai.com/v1',
+      });
+
+      const originalContent = 'x'.repeat(5000);
+      const compressedContent = 'y'.repeat(2000);
+
+      // Mock OpenAI response
+      const mockCreate = jest.fn().mockResolvedValue({
+        choices: [{ message: { content: compressedContent } }],
+      });
+
+      (OpenAI as unknown as jest.Mock).mockImplementation(() => ({
+        chat: {
+          completions: {
+            create: mockCreate,
+          },
+        },
+      }));
+
+      const result = await compressOutput(originalContent, {
+        toolName: 'test-tool',
+        serverName: 'test-server',
+      });
+
+      expect(result.wasCompressed).toBe(true);
+      expect(result.compressed).toBe(compressedContent);
+      expect(result.originalLength).toBe(originalContent.length);
+      expect(result.compressedLength).toBe(compressedContent.length);
+    });
+
+    it('should return original content when compressed is larger', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: true, model: 'gpt-4o-mini', maxInputTokens: 100000 },
+      });
+      (getSmartRoutingConfig as jest.Mock).mockResolvedValue({
+        openaiApiKey: 'test-api-key',
+        openaiApiBaseUrl: 'https://api.openai.com/v1',
+      });
+
+      const originalContent = 'x'.repeat(5000);
+      const largerContent = 'y'.repeat(6000);
+
+      const mockCreate = jest.fn().mockResolvedValue({
+        choices: [{ message: { content: largerContent } }],
+      });
+
+      (OpenAI as unknown as jest.Mock).mockImplementation(() => ({
+        chat: {
+          completions: {
+            create: mockCreate,
+          },
+        },
+      }));
+
+      const result = await compressOutput(originalContent);
+
+      expect(result.wasCompressed).toBe(false);
+      expect(result.compressed).toBe(originalContent);
+    });
+
+    it('should return original content on API error', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: true, model: 'gpt-4o-mini', maxInputTokens: 100000 },
+      });
+      (getSmartRoutingConfig as jest.Mock).mockResolvedValue({
+        openaiApiKey: 'test-api-key',
+        openaiApiBaseUrl: 'https://api.openai.com/v1',
+      });
+
+      const mockCreate = jest.fn().mockRejectedValue(new Error('API error'));
+
+      (OpenAI as unknown as jest.Mock).mockImplementation(() => ({
+        chat: {
+          completions: {
+            create: mockCreate,
+          },
+        },
+      }));
+
+      const content = 'x'.repeat(5000);
+      const result = await compressOutput(content);
+
+      expect(result.wasCompressed).toBe(false);
+      expect(result.compressed).toBe(content);
+    });
+  });
+
+  describe('compressToolResult', () => {
+    it('should return original result when compression is disabled', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: false },
+      });
+
+      const result = {
+        content: [{ type: 'text', text: 'Test output' }],
+      };
+
+      const compressed = await compressToolResult(result);
+
+      expect(compressed).toEqual(result);
+    });
+
+    it('should not compress error results', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: true },
+      });
+      (getSmartRoutingConfig as jest.Mock).mockResolvedValue({
+        openaiApiKey: 'test-api-key',
+      });
+
+      const result = {
+        content: [{ type: 'text', text: 'Error message' }],
+        isError: true,
+      };
+
+      const compressed = await compressToolResult(result);
+
+      expect(compressed).toEqual(result);
+    });
+
+    it('should handle results without content array', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: true },
+      });
+      (getSmartRoutingConfig as jest.Mock).mockResolvedValue({
+        openaiApiKey: 'test-api-key',
+      });
+
+      const result = { someOtherField: 'value' };
+
+      const compressed = await compressToolResult(result);
+
+      expect(compressed).toEqual(result);
+    });
+
+    it('should only compress text content items', async () => {
+      mockSystemConfigDao.get.mockResolvedValue({
+        compression: { enabled: true, maxInputTokens: 100000 },
+      });
+      (getSmartRoutingConfig as jest.Mock).mockResolvedValue({
+        openaiApiKey: 'test-api-key',
+        openaiApiBaseUrl: 'https://api.openai.com/v1',
+      });
+
+      const largeText = 'x'.repeat(5000);
+      const compressedText = 'y'.repeat(2000);
+
+      const mockCreate = jest.fn().mockResolvedValue({
+        choices: [{ message: { content: compressedText } }],
+      });
+
+      (OpenAI as unknown as jest.Mock).mockImplementation(() => ({
+        chat: {
+          completions: {
+            create: mockCreate,
+          },
+        },
+      }));
+
+      const result = {
+        content: [
+          { type: 'text', text: largeText },
+          { type: 'image', data: 'base64data' },
+        ],
+      };
+
+      const compressed = await compressToolResult(result);
+
+      expect(compressed.content[0].text).toBe(compressedText);
+      expect(compressed.content[1]).toEqual({ type: 'image', data: 'base64data' });
+    });
+  });
+});
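The token heuristics these tests encode (roughly 4 characters per token, and compression only above a size threshold) can be reproduced in a few lines. This sketch is derived from the test expectations alone, not from compressionService.ts; in particular, the 1000-token cap on the threshold is an assumption needed to reconcile the large-content case (5000 chars against a 100000-token budget) with the 10%-of-budget cases, and the real service may use a different rule.

```typescript
// Reconstructed from the test expectations only; the real
// compressionService.ts may use different constants.

// The tests assert roughly 4 characters per token.
function estimateTokenCount(text: string): number {
  return Math.ceil(text.length / 4);
}

// One threshold rule consistent with every shouldCompress test case above:
// 10% of maxInputTokens, capped at 1000 tokens. The cap is an assumption,
// not a documented constant of the service.
function shouldCompress(content: string, maxInputTokens: number): boolean {
  const threshold = Math.min(maxInputTokens * 0.1, 1000);
  return estimateTokenCount(content) > threshold;
}
```

For example, 500 characters estimate to 125 tokens, which clears the 100-token threshold of a 1000-token budget, while 300 characters (75 tokens) do not.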