Compare commits


4 Commits

Author SHA1 Message Date
copilot-swe-agent[bot]
7eeb136233 Complete fix with screenshots and full testing
Co-authored-by: samanhappy <2755122+samanhappy@users.noreply.github.com>
2025-12-08 02:21:56 +00:00
copilot-swe-agent[bot]
db1d621a06 Fix export/import for OpenAPI server type and improve handling
Co-authored-by: samanhappy <2755122+samanhappy@users.noreply.github.com>
2025-12-08 02:18:20 +00:00
copilot-swe-agent[bot]
4affcb72d7 Initial investigation of export/import issue
Co-authored-by: samanhappy <2755122+samanhappy@users.noreply.github.com>
2025-12-08 02:12:39 +00:00
copilot-swe-agent[bot]
38e274d441 Initial plan
2025-12-08 02:04:27 +00:00
71 changed files with 813 additions and 5055 deletions

272
.github/copilot-instructions.md vendored Normal file

@@ -0,0 +1,272 @@
# MCPHub Coding Instructions
**ALWAYS follow these instructions first and only fallback to additional search and context gathering if the information here is incomplete or found to be in error.**
## Project Overview
MCPHub is a TypeScript/Node.js MCP (Model Context Protocol) server management hub that provides unified access through HTTP endpoints. It serves as a centralized dashboard for managing multiple MCP servers with real-time monitoring, authentication, and flexible routing.
**Core Components:**
- **Backend**: Express.js + TypeScript + ESM (`src/server.ts`)
- **Frontend**: React/Vite + Tailwind CSS (`frontend/`)
- **MCP Integration**: Connects multiple MCP servers (`src/services/mcpService.ts`)
- **Authentication**: JWT-based with bcrypt password hashing
- **Configuration**: JSON-based MCP server definitions (`mcp_settings.json`)
- **Documentation**: API docs and usage instructions (`docs/`)
## Working Effectively
### Bootstrap and Setup (CRITICAL - Follow Exact Steps)
```bash
# Install pnpm if not available
npm install -g pnpm
# Install dependencies - takes ~30 seconds
pnpm install
# Setup environment (optional)
cp .env.example .env
# Build and test to verify setup
pnpm lint # ~3 seconds - NEVER CANCEL
pnpm backend:build # ~5 seconds - NEVER CANCEL
pnpm test:ci # ~16 seconds - NEVER CANCEL. Set timeout to 60+ seconds
pnpm frontend:build # ~5 seconds - NEVER CANCEL
pnpm build # ~10 seconds total - NEVER CANCEL. Set timeout to 60+ seconds
```
**CRITICAL TIMING**: These commands are fast but NEVER CANCEL them. Always wait for completion.
### Development Environment
```bash
# Start both backend and frontend (recommended for most development)
pnpm dev # Backend on :3001, Frontend on :5173
# OR start separately (required on Windows, optional on Linux/macOS)
# Terminal 1: Backend only
pnpm backend:dev # Runs on port 3000 (or PORT env var)
# Terminal 2: Frontend only
pnpm frontend:dev # Runs on port 5173, proxies API to backend
```
**NEVER CANCEL**: Development servers may take 10-15 seconds to fully initialize all MCP servers.
### Build Commands (Production)
```bash
# Full production build - takes ~10 seconds total
pnpm build # NEVER CANCEL - Set timeout to 60+ seconds
# Individual builds
pnpm backend:build # TypeScript compilation - ~5 seconds
pnpm frontend:build # Vite build - ~5 seconds
# Start production server
pnpm start # Requires dist/ and frontend/dist/ to exist
```
### Testing and Validation
```bash
# Run all tests - takes ~16 seconds with 73 tests
pnpm test:ci # NEVER CANCEL - Set timeout to 60+ seconds
# Development testing
pnpm test # Interactive mode
pnpm test:watch # Watch mode for development
pnpm test:coverage # With coverage report
# Code quality
pnpm lint # ESLint - ~3 seconds
pnpm format # Prettier formatting - ~3 seconds
```
**CRITICAL**: All tests MUST pass before committing. Do not modify tests to make them pass unless specifically required for your changes.
## Manual Validation Requirements
**ALWAYS perform these validation steps after making changes:**
### 1. Basic Application Functionality
```bash
# Start the application
pnpm dev
# Verify backend responds (in another terminal)
curl http://localhost:3000/api/health
# Expected: Should return health status
# Verify frontend serves
curl -I http://localhost:3000/
# Expected: HTTP 200 OK with HTML content
```
### 2. MCP Server Integration Test
```bash
# Check MCP servers are loading (look for log messages)
# Expected log output should include:
# - "Successfully connected client for server: [name]"
# - "Successfully listed [N] tools for server: [name]"
# - Some servers may fail due to missing API keys (normal in dev)
```
### 3. Build Verification
```bash
# Verify production build works
pnpm build
node scripts/verify-dist.js
# Expected: "✅ Verification passed! Frontend and backend dist files are present."
```
**NEVER skip these validation steps**. If any fail, debug and fix before proceeding.
## Project Structure and Key Files
### Critical Backend Files
- `src/index.ts` - Application entry point
- `src/server.ts` - Express server setup and middleware
- `src/services/mcpService.ts` - **Core MCP server management logic**
- `src/config/index.ts` - Configuration management
- `src/routes/` - HTTP route definitions
- `src/controllers/` - HTTP request handlers
- `src/dao/` - Data access layer (supports JSON file & PostgreSQL)
- `src/db/` - TypeORM entities & repositories (for PostgreSQL mode)
- `src/types/index.ts` - TypeScript type definitions
### DAO Layer (Dual Data Source)
MCPHub supports **JSON file** (default) and **PostgreSQL** storage:
- Set `USE_DB=true` + `DB_URL=postgresql://...` to use database
- When modifying data structures, update: `src/types/`, `src/dao/`, `src/db/entities/`, `src/db/repositories/`, `src/utils/migration.ts`
- See `AGENTS.md` for detailed DAO modification checklist
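For reference, a minimal sketch of flipping the storage mode with environment variables; the variable names come from the notes above, while the connection string and credentials are illustrative placeholders:
```bash
# Default: JSON file storage (mcp_settings.json), no extra configuration needed.
# Switch to PostgreSQL (illustrative connection string; adjust credentials to your setup):
export USE_DB=true
export DB_URL=postgresql://mcphub:mcphub@localhost:5432/mcphub
pnpm backend:dev   # restart the backend so the DAO layer picks up the database mode
```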
### Critical Frontend Files
- `frontend/src/` - React application source
- `frontend/src/pages/` - Page components (development entry point)
- `frontend/src/components/` - Reusable UI components
- `frontend/src/utils/fetchInterceptor.js` - Backend API interaction
### Configuration Files
- `mcp_settings.json` - **MCP server definitions and user accounts**
- `package.json` - Dependencies and scripts
- `tsconfig.json` - TypeScript configuration
- `jest.config.cjs` - Test configuration
- `.eslintrc.json` - Linting rules
### Docker and Deployment
- `Dockerfile` - Multi-stage build with Python base + Node.js
- `entrypoint.sh` - Docker startup script
- `bin/cli.js` - NPM package CLI entry point
## Development Process and Conventions
### Code Style Requirements
- **ESM modules**: Always use `.js` extensions in imports, not `.ts`
- **English only**: All code comments must be written in English
- **TypeScript strict**: Follow strict type checking rules
- **Import style**: `import { something } from './file.js'` (note .js extension)
### Key Configuration Notes
- **MCP servers**: Defined in `mcp_settings.json` with command/args
- **Endpoints**: `/mcp/{group|server}` and `/mcp/$smart` for routing
- **i18n**: Frontend uses react-i18next with files in `locales/` folder
- **Authentication**: JWT tokens with bcrypt password hashing
- **Default credentials**: admin/admin123 (configured in mcp_settings.json)
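As a rough illustration of the routing endpoints listed above (the group name is a placeholder, and these endpoints speak the MCP protocol, so a plain request is only a reachability probe rather than a full client handshake):
```bash
# Unified endpoint for a specific group or server (name is a placeholder)
curl -I http://localhost:3000/mcp/my-group
# Smart routing endpoint (quoted so the shell does not expand $smart)
curl -I 'http://localhost:3000/mcp/$smart'
```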
### Development Entry Points
- **Add MCP server**: Modify `mcp_settings.json` and restart
- **New API endpoint**: Add route in `src/routes/`, controller in `src/controllers/`
- **Frontend feature**: Start from `frontend/src/pages/` or `frontend/src/components/`
- **Add tests**: Follow patterns in `tests/` directory
### Common Development Tasks
#### Adding a new MCP server:
1. Add server definition to `mcp_settings.json`
2. Restart backend to load new server
3. Check logs for successful connection
4. Test via dashboard or API endpoints
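A minimal sketch of such a definition in `mcp_settings.json` (the server name and npm package are placeholders; the `mcpServers` shape mirrors the import examples used elsewhere in the repository):
```json
{
  "mcpServers": {
    "time-mcp": {
      "command": "npx",
      "args": ["-y", "time-mcp"]
    }
  }
}
```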
#### API development:
1. Define route in `src/routes/`
2. Implement controller in `src/controllers/`
3. Add types in `src/types/index.ts` if needed
4. Write tests in `tests/controllers/`
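A rough sketch of that route/controller split (file names, handler, and wiring are illustrative, not the project's actual code; note the `.js` extension required by the ESM setup):
```typescript
// src/controllers/pingController.ts (illustrative)
import { Request, Response } from 'express';

export const ping = (_req: Request, res: Response): void => {
  res.json({ success: true, message: 'pong' });
};

// src/routes/pingRoutes.ts (illustrative)
import { Router } from 'express';
import { ping } from '../controllers/pingController.js';

const router = Router();
router.get('/ping', ping);
export default router;
```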
#### Frontend development:
1. Create/modify components in `frontend/src/components/`
2. Add pages in `frontend/src/pages/`
3. Update routing if needed
4. Test in development mode with `pnpm frontend:dev`
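For instance, a new reusable component could start from a minimal sketch like this (component name, props, and translation keys are placeholders):
```tsx
// frontend/src/components/StatusBadge.tsx (illustrative)
import React from 'react';
import { useTranslation } from 'react-i18next';

interface StatusBadgeProps {
  online: boolean;
}

const StatusBadge: React.FC<StatusBadgeProps> = ({ online }) => {
  const { t } = useTranslation();
  return (
    <span className={online ? 'text-green-600' : 'text-gray-500'}>
      {online ? t('status.online') : t('status.offline')}
    </span>
  );
};

export default StatusBadge;
```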
#### Documentation:
1. Update or add docs in `docs/` folder
2. Ensure README.md reflects any major changes
## Validation and CI Requirements
### Before Committing - ALWAYS Run:
```bash
pnpm lint # Must pass - ~3 seconds
pnpm backend:build # Must compile - ~5 seconds
pnpm test:ci # All tests must pass - ~16 seconds
pnpm build # Full build must work - ~10 seconds
```
**CRITICAL**: CI will fail if any of these commands fail. Fix issues locally first.
### CI Pipeline (.github/workflows/ci.yml)
- Runs on Node.js 20.x
- Tests: linting, type checking, unit tests with coverage
- **NEVER CANCEL**: CI builds may take 2-3 minutes total
## Troubleshooting
### Common Issues
- **"uvx command not found"**: Some MCP servers require `uvx` (Python package manager) - this is expected in development
- **Port already in use**: Change PORT environment variable or kill existing processes
- **Frontend not loading**: Ensure frontend was built with `pnpm frontend:build`
- **MCP server connection failed**: Check server command/args in `mcp_settings.json`
### Build Failures
- **TypeScript errors**: Run `pnpm backend:build` to see compilation errors
- **Test failures**: Run `pnpm test:verbose` for detailed test output
- **Lint errors**: Run `pnpm lint` and fix reported issues
### Development Issues
- **Backend not starting**: Check for port conflicts, verify `mcp_settings.json` syntax
- **Frontend proxy errors**: Ensure backend is running before starting frontend
- **Hot reload not working**: Restart development server
## Performance Notes
- **Install time**: pnpm install takes ~30 seconds
- **Build time**: Full build takes ~10 seconds
- **Test time**: Complete test suite takes ~16 seconds
- **Startup time**: Backend initialization takes 10-15 seconds (MCP server connections)
**Remember**: NEVER CANCEL any build or test commands. Always wait for completion even if they seem slow.


@@ -76,7 +76,7 @@ jobs:
#   services:
#     postgres:
-#       image: pgvector/pgvector:pg17
+#       image: postgres:15
#       env:
#         POSTGRES_PASSWORD: postgres
#         POSTGRES_DB: mcphub_test

374
AGENTS.md

@@ -1,214 +1,26 @@
-# MCPHub Development Guide & Agent Instructions
+# Repository Guidelines
-**ALWAYS follow these instructions first and only fallback to additional search and context gathering if the information here is incomplete or found to be in error.**
+These notes align current contributors around the code layout, daily commands, and collaboration habits that keep `@samanhappy/mcphub` moving quickly.
This document serves as the primary reference for all contributors and AI agents working on `@samanhappy/mcphub`. It provides comprehensive guidance on code organization, development workflow, and project conventions.
## Project Overview
MCPHub is a TypeScript/Node.js MCP (Model Context Protocol) server management hub that provides unified access through HTTP endpoints. It serves as a centralized dashboard for managing multiple MCP servers with real-time monitoring, authentication, and flexible routing.
**Core Components:**
- **Backend**: Express.js + TypeScript + ESM (`src/server.ts`)
- **Frontend**: React/Vite + Tailwind CSS (`frontend/`)
- **MCP Integration**: Connects multiple MCP servers (`src/services/mcpService.ts`)
- **Authentication**: JWT-based with bcrypt password hashing
- **Configuration**: JSON-based MCP server definitions (`mcp_settings.json`)
- **Documentation**: API docs and usage instructions(`docs/`)
## Bootstrap and Setup (CRITICAL - Follow Exact Steps)
```bash
# Install pnpm if not available
npm install -g pnpm
# Install dependencies - takes ~30 seconds
pnpm install
# Setup environment (optional)
cp .env.example .env
# Build and test to verify setup
pnpm lint # ~3 seconds - NEVER CANCEL
pnpm backend:build # ~5 seconds - NEVER CANCEL
pnpm test:ci # ~16 seconds - NEVER CANCEL. Set timeout to 60+ seconds
pnpm frontend:build # ~5 seconds - NEVER CANCEL
pnpm build # ~10 seconds total - NEVER CANCEL. Set timeout to 60+ seconds
```
**CRITICAL TIMING**: These commands are fast but NEVER CANCEL them. Always wait for completion.
## Manual Validation Requirements
**ALWAYS perform these validation steps after making changes:**
### 1. Basic Application Functionality
```bash
# Start the application
pnpm dev
# Verify backend responds (in another terminal)
curl http://localhost:3000/api/health
# Expected: Should return health status
# Verify frontend serves
curl -I http://localhost:3000/
# Expected: HTTP 200 OK with HTML content
```
### 2. MCP Server Integration Test
```bash
# Check MCP servers are loading (look for log messages)
# Expected log output should include:
# - "Successfully connected client for server: [name]"
# - "Successfully listed [N] tools for server: [name]"
# - Some servers may fail due to missing API keys (normal in dev)
```
### 3. Build Verification
```bash
# Verify production build works
pnpm build
node scripts/verify-dist.js
# Expected: "✅ Verification passed! Frontend and backend dist files are present."
```
**NEVER skip these validation steps**. If any fail, debug and fix before proceeding.
## Project Structure & Module Organization
-### Critical Backend Files
+- Backend services live in `src`, grouped by responsibility (`controllers/`, `services/`, `dao/`, `routes/`, `utils/`), with `server.ts` orchestrating HTTP bootstrap.
- `frontend/src` contains the Vite + React dashboard; `frontend/public` hosts static assets and translations sit in `locales/`.
-- `src/index.ts` - Application entry point
+- Jest-aware test code is split between colocated specs (`src/**/*.{test,spec}.ts`) and higher-level suites in `tests/`; use `tests/utils/` helpers when exercising the CLI or SSE flows.
-- `src/server.ts` - Express server setup and middleware (orchestrating HTTP bootstrap)
+- Build artifacts and bundles are generated into `dist/`, `frontend/dist/`, and `coverage/`; never edit these manually.
- `src/services/mcpService.ts` - **Core MCP server management logic**
- `src/config/index.ts` - Configuration management
- `src/routes/` - HTTP route definitions
- `src/controllers/` - HTTP request handlers
- `src/dao/` - Data access layer (supports JSON file & PostgreSQL)
- `src/db/` - TypeORM entities & repositories (for PostgreSQL mode)
- `src/types/index.ts` - TypeScript type definitions and shared DTOs
- `src/utils/` - Utility functions and helpers
### Critical Frontend Files
- `frontend/src/` - React application source (Vite + React dashboard)
- `frontend/src/pages/` - Page components (development entry point)
- `frontend/src/components/` - Reusable UI components
- `frontend/src/utils/fetchInterceptor.js` - Backend API interaction
- `frontend/public/` - Static assets
### Configuration Files
- `mcp_settings.json` - **MCP server definitions and user accounts**
- `package.json` - Dependencies and scripts
- `tsconfig.json` - TypeScript configuration
- `jest.config.cjs` - Test configuration
- `.eslintrc.json` - Linting rules
### Test Organization
- Jest-aware test code is split between colocated specs (`src/**/*.{test,spec}.ts`) and higher-level suites in `tests/`
- Use `tests/utils/` helpers when exercising the CLI or SSE flows
- Mirror production directory names when adding new suites
- End filenames with `.test.ts` or `.spec.ts` for automatic discovery
### Build Artifacts
- `dist/` - Backend build output (TypeScript compilation)
- `frontend/dist/` - Frontend build output (Vite bundle)
- `coverage/` - Test coverage reports
- **Never edit these manually**
### Localization
- Translations sit in `locales/` (en.json, fr.json, tr.json, zh.json)
- Frontend uses react-i18next
### Docker and Deployment
- `Dockerfile` - Multi-stage build with Python base + Node.js
- `entrypoint.sh` - Docker startup script
- `bin/cli.js` - NPM package CLI entry point
## Build, Test, and Development Commands
-### Development Environment
+- `pnpm dev` runs backend (`tsx watch src/index.ts`) and frontend (`vite`) together for local iteration.
- `pnpm backend:dev`, `pnpm frontend:dev`, and `pnpm frontend:preview` target each surface independently; prefer them when debugging one stack.
-```bash
+- `pnpm build` executes `pnpm backend:build` (TypeScript to `dist/`) and `pnpm frontend:build`; run before release or publishing.
-# Start both backend and frontend (recommended for most development)
+- `pnpm test`, `pnpm test:watch`, and `pnpm test:coverage` drive Jest; `pnpm lint` and `pnpm format` enforce style via ESLint and Prettier.
pnpm dev # Backend on :3001, Frontend on :5173
# OR start separately (required on Windows, optional on Linux/macOS)
# Terminal 1: Backend only
pnpm backend:dev # Runs on port 3000 (or PORT env var)
# Terminal 2: Frontend only
pnpm frontend:dev # Runs on port 5173, proxies API to backend
# Frontend preview (production build)
pnpm frontend:preview # Preview production build
```
**NEVER CANCEL**: Development servers may take 10-15 seconds to fully initialize all MCP servers.
### Production Build
```bash
# Full production build - takes ~10 seconds total
pnpm build # NEVER CANCEL - Set timeout to 60+ seconds
# Individual builds
pnpm backend:build # TypeScript compilation to dist/ - ~5 seconds
pnpm frontend:build # Vite build to frontend/dist/ - ~5 seconds
# Start production server
pnpm start # Requires dist/ and frontend/dist/ to exist
```
Run `pnpm build` before release or publishing.
### Testing and Validation
```bash
# Run all tests - takes ~16 seconds with 73 tests
pnpm test:ci # NEVER CANCEL - Set timeout to 60+ seconds
# Development testing
pnpm test # Interactive mode
pnpm test:watch # Watch mode for development
pnpm test:coverage # With coverage report
# Code quality
pnpm lint # ESLint - ~3 seconds
pnpm format # Prettier formatting - ~3 seconds
```
**CRITICAL**: All tests MUST pass before committing. Do not modify tests to make them pass unless specifically required for your changes.
### Performance Notes
- **Install time**: pnpm install takes ~30 seconds
- **Build time**: Full build takes ~10 seconds
- **Test time**: Complete test suite takes ~16 seconds
- **Startup time**: Backend initialization takes 10-15 seconds (MCP server connections)
## Coding Style & Naming Conventions
-- **TypeScript everywhere**: Default to 2-space indentation and single quotes, letting Prettier settle formatting
+- TypeScript everywhere; default to 2-space indentation and single quotes, letting Prettier settle formatting. ESLint configuration assumes ES modules.
-- **ESM modules**: Always use `.js` extensions in imports, not `.ts` (e.g., `import { something } from './file.js'`)
+- Name services and data access layers with suffixes (`UserService`, `AuthDao`), React components and files in `PascalCase`, and utility modules in `camelCase`.
-- **English only**: All code comments must be written in English
+- Keep DTOs and shared types in `src/types` to avoid duplication; re-export through index files only when it clarifies imports.
- **TypeScript strict**: Follow strict type checking rules
- **Naming conventions**:
- Services and data access layers: Use suffixes (`UserService`, `AuthDao`)
- React components and files: `PascalCase`
- Utility modules: `camelCase`
- **Types and DTOs**: Keep in `src/types` to avoid duplication; re-export through index files only when it clarifies imports
- **ESLint configuration**: Assumes ES modules
## Testing Guidelines
@@ -216,86 +28,12 @@ pnpm format # Prettier formatting - ~3 seconds
- Mirror production directory names when adding new suites and end filenames with `.test.ts` or `.spec.ts` for automatic discovery.
- Aim to maintain or raise coverage when touching critical flows (auth, OAuth, SSE); add integration tests under `tests/integration/` when touching cross-service logic.
## Key Configuration Notes
- **MCP servers**: Defined in `mcp_settings.json` with command/args
- **Endpoints**: `/mcp/{group|server}` and `/mcp/$smart` for routing
- **i18n**: Frontend uses react-i18next with files in `locales/` folder
- **Authentication**: JWT tokens with bcrypt password hashing
- **Default credentials**: admin/admin123 (configured in mcp_settings.json)
## Development Entry Points
### Adding a new MCP server
1. Add server definition to `mcp_settings.json`
2. Restart backend to load new server
3. Check logs for successful connection
4. Test via dashboard or API endpoints
### API development
1. Define route in `src/routes/`
2. Implement controller in `src/controllers/`
3. Add types in `src/types/index.ts` if needed
4. Write tests in `tests/controllers/`
### Frontend development
1. Create/modify components in `frontend/src/components/`
2. Add pages in `frontend/src/pages/`
3. Update routing if needed
4. Test in development mode with `pnpm frontend:dev`
### Documentation
1. Update or add docs in `docs/` folder
2. Ensure README.md reflects any major changes
## Commit & Pull Request Guidelines
- Follow the existing Conventional Commit pattern (`feat:`, `fix:`, `chore:`, etc.) with imperative, present-tense summaries and optional multi-line context.
- Each PR should describe the behavior change, list testing performed, and link issues; include before/after screenshots or GIFs for frontend tweaks.
- Re-run `pnpm build` and `pnpm test` before requesting review, and ensure generated artifacts stay out of the diff.
### Before Committing - ALWAYS Run
```bash
pnpm lint # Must pass - ~3 seconds
pnpm backend:build # Must compile - ~5 seconds
pnpm test:ci # All tests must pass - ~16 seconds
pnpm build # Full build must work - ~10 seconds
```
**CRITICAL**: CI will fail if any of these commands fail. Fix issues locally first.
### CI Pipeline (.github/workflows/ci.yml)
- Runs on Node.js 20.x
- Tests: linting, type checking, unit tests with coverage
- **NEVER CANCEL**: CI builds may take 2-3 minutes total
## Troubleshooting
### Common Issues
- **"uvx command not found"**: Some MCP servers require `uvx` (Python package manager) - this is expected in development
- **Port already in use**: Change PORT environment variable or kill existing processes
- **Frontend not loading**: Ensure frontend was built with `pnpm frontend:build`
- **MCP server connection failed**: Check server command/args in `mcp_settings.json`
### Build Failures
- **TypeScript errors**: Run `pnpm backend:build` to see compilation errors
- **Test failures**: Run `pnpm test:verbose` for detailed test output
- **Lint errors**: Run `pnpm lint` and fix reported issues
### Development Issues
- **Backend not starting**: Check for port conflicts, verify `mcp_settings.json` syntax
- **Frontend proxy errors**: Ensure backend is running before starting frontend
- **Hot reload not working**: Restart development server
## DAO Layer & Dual Data Source
MCPHub supports **JSON file** (default) and **PostgreSQL** storage. Set `USE_DB=true` + `DB_URL` to switch.
@@ -326,99 +64,15 @@ When adding/changing fields, update **ALL** these files:
### Data Type Mapping
| Model          | DAO               | DB Entity      | JSON Path                 |
| -------------- | ----------------- | -------------- | ------------------------- |
| `IUser`        | `UserDao`         | `User`         | `settings.users[]`        |
| `ServerConfig` | `ServerDao`       | `Server`       | `settings.mcpServers{}`   |
| `IGroup`       | `GroupDao`        | `Group`        | `settings.groups[]`       |
| `SystemConfig` | `SystemConfigDao` | `SystemConfig` | `settings.systemConfig`   |
| `UserConfig`   | `UserConfigDao`   | `UserConfig`   | `settings.userConfigs{}`  |
| `BearerKey` | `BearerKeyDao` | `BearerKey` | `settings.bearerKeys[]` |
| `IOAuthClient` | `OAuthClientDao` | `OAuthClient` | `settings.oauthClients[]` |
| `IOAuthToken` | `OAuthTokenDao` | `OAuthToken` | `settings.oauthTokens[]` |
### Common Pitfalls
- Forgetting migration script → fields won't migrate to DB
- Optional fields need `nullable: true` in entity
- Complex objects need `simple-json` column type
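As a quick illustration of those two pitfalls, a hypothetical TypeORM entity field (not an actual MCPHub entity) would be declared like this:
```typescript
import { Column, Entity, PrimaryGeneratedColumn } from 'typeorm';

@Entity()
export class ExampleEntity {
  @PrimaryGeneratedColumn()
  id!: number;

  // Optional field: must be nullable in the entity, or rows without it will fail to save
  @Column({ nullable: true })
  description?: string;

  // Complex object: store as simple-json so it serializes cleanly in PostgreSQL
  @Column({ type: 'simple-json', nullable: true })
  options?: Record<string, unknown>;
}
```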
## Auto-Evolution Guidelines for AI Agents
**This section provides guidelines for AI agents to automatically maintain and improve this document.**
### When to Update AGENTS.md
AI agents MUST update this document in the following situations:
#### 1. Code-Documentation Mismatch Detected
When executing tasks, if you discover that:
- The actual code structure differs from descriptions in this document
- File paths, imports, or module organization has changed
- New critical files or directories exist that aren't documented
- Documented files or patterns no longer exist
**Action**: Immediately update the relevant section to reflect the current codebase state.
**Example scenarios**:
- A controller is now in `src/api/controllers/` instead of `src/controllers/`
- New middleware files exist that should be documented
- The DAO implementation has been refactored with a different structure
- Build output directories have changed
#### 2. User Preferences and Requirements
During conversation, if the user expresses:
- **Coding preferences**: Indentation style, naming conventions, code organization patterns
- **Workflow requirements**: Required validation steps, commit procedures, testing expectations
- **Tool preferences**: Preferred libraries, frameworks, or development tools
- **Quality standards**: Code review criteria, documentation requirements, error handling patterns
- **Development principles**: Architecture decisions, design patterns, best practices
**Action**: Add or update the relevant section to capture these preferences for future reference.
**Example scenarios**:
- User prefers async/await over promises → Update coding style section
- User requires specific test coverage thresholds → Update testing guidelines
- User has strong opinions about error handling → Add to development process section
- User establishes new deployment procedures → Update deployment section
### How to Update AGENTS.md
1. **Identify the Section**: Determine which section needs updating based on the type of change
2. **Make Precise Changes**: Update only the relevant content, maintaining the document structure
3. **Preserve Format**: Keep the existing markdown formatting and organization
4. **Add Context**: If adding new content, ensure it fits logically within existing sections
5. **Verify Accuracy**: After updating, ensure the new information is accurate and complete
### Update Principles
- **Accuracy First**: Documentation must reflect the actual current state
- **Clarity**: Use clear, concise language; avoid ambiguity
- **Completeness**: Include sufficient detail for agents to work effectively
- **Consistency**: Maintain consistent terminology and formatting throughout
- **Actionability**: Focus on concrete, actionable guidance rather than vague descriptions
### Self-Correction Process
Before completing any task:
1. Review relevant sections of AGENTS.md
2. During execution, note any discrepancies between documentation and reality
3. Update AGENTS.md to correct discrepancies
4. Verify the update doesn't conflict with other sections
5. Proceed with the original task using the updated information
### Meta-Update Rule
If this auto-evolution section itself needs improvement based on experience:
- Update it to better serve future agents
- Add new scenarios or principles as they emerge
- Refine the update process based on what works well
**Remember**: This document is a living guide. Keeping it accurate and current is as important as following it.


@@ -3,7 +3,7 @@ version: "3.8"
services:
# PostgreSQL database for MCPHub configuration
postgres:
-image: pgvector/pgvector:pg17-alpine
+image: postgres:16-alpine
container_name: mcphub-postgres
environment:
POSTGRES_DB: mcphub


@@ -59,7 +59,7 @@ version: '3.8'
services:
postgres:
-image: pgvector/pgvector:pg17
+image: postgres:16
environment:
POSTGRES_DB: mcphub
POSTGRES_USER: mcphub


@@ -119,7 +119,7 @@ services:
- mcphub-network
postgres:
-image: pgvector/pgvector:pg17
+image: postgres:15-alpine
container_name: mcphub-postgres
environment:
- POSTGRES_DB=mcphub
@@ -203,7 +203,7 @@ services:
retries: 3
postgres:
-image: pgvector/pgvector:pg17
+image: postgres:15-alpine
container_name: mcphub-postgres
environment:
- POSTGRES_DB=mcphub
@@ -305,7 +305,7 @@ services:
- mcphub-dev
postgres:
-image: pgvector/pgvector:pg17
+image: postgres:15-alpine
container_name: mcphub-postgres-dev
environment:
- POSTGRES_DB=mcphub
@@ -445,7 +445,7 @@ Add backup service to your `docker-compose.yml`:
```yaml
services:
backup:
-image: pgvector/pgvector:pg17
+image: postgres:15-alpine
container_name: mcphub-backup
environment:
- PGPASSWORD=${POSTGRES_PASSWORD}


@@ -78,7 +78,7 @@ Smart Routing requires additional setup compared to basic MCPHub usage:
- ./mcp_settings.json:/app/mcp_settings.json
postgres:
-image: pgvector/pgvector:pg17
+image: pgvector/pgvector:pg16
environment:
- POSTGRES_DB=mcphub
- POSTGRES_USER=mcphub
@@ -146,7 +146,7 @@ Smart Routing requires additional setup compared to basic MCPHub usage:
spec:
containers:
- name: postgres
-image: pgvector/pgvector:pg17
+image: pgvector/pgvector:pg16
env:
- name: POSTGRES_DB
value: mcphub


@@ -96,7 +96,7 @@ Optional for Smart Routing:
# Optional: PostgreSQL for Smart Routing
postgres:
-image: pgvector/pgvector:pg17
+image: pgvector/pgvector:pg16
environment:
POSTGRES_DB: mcphub
POSTGRES_USER: mcphub


@@ -59,7 +59,7 @@ version: '3.8'
services:
postgres:
-image: pgvector/pgvector:pg17
+image: postgres:16
environment:
POSTGRES_DB: mcphub
POSTGRES_USER: mcphub


@@ -119,7 +119,7 @@ services:
- mcphub-network
postgres:
-image: pgvector/pgvector:pg17
+image: postgres:15-alpine
container_name: mcphub-postgres
environment:
- POSTGRES_DB=mcphub
@@ -203,7 +203,7 @@ services:
retries: 3
postgres:
-image: pgvector/pgvector:pg17
+image: postgres:15-alpine
container_name: mcphub-postgres
environment:
- POSTGRES_DB=mcphub
@@ -305,7 +305,7 @@ services:
- mcphub-dev
postgres:
-image: pgvector/pgvector:pg17
+image: postgres:15-alpine
container_name: mcphub-postgres-dev
environment:
- POSTGRES_DB=mcphub
@@ -445,7 +445,7 @@ secrets:
```yaml
services:
backup:
-image: pgvector/pgvector:pg17
+image: postgres:15-alpine
container_name: mcphub-backup
environment:
- PGPASSWORD=${POSTGRES_PASSWORD}


@@ -96,7 +96,7 @@ description: '各种平台的详细安装说明'
# 可选:用于智能路由的 PostgreSQL
postgres:
-image: pgvector/pgvector:pg17
+image: pgvector/pgvector:pg16
environment:
POSTGRES_DB: mcphub
POSTGRES_USER: mcphub


@@ -18,17 +18,7 @@ const EditServerForm = ({ server, onEdit, onCancel }: EditServerFormProps) => {
try {
setError(null);
const encodedServerName = encodeURIComponent(server.name);
const result = await apiPut(`/servers/${encodedServerName}`, payload);
// Check if name is being changed
const isRenaming = payload.name && payload.name !== server.name;
// Build the request body
const requestBody = {
config: payload.config,
...(isRenaming ? { newName: payload.name } : {}),
};
const result = await apiPut(`/servers/${encodedServerName}`, requestBody);
if (!result.success) {
// Use specific error message from the response if available


@@ -1,284 +0,0 @@
import React, { useState } from 'react';
import { useTranslation } from 'react-i18next';
import { apiPost } from '@/utils/fetchInterceptor';
interface GroupImportFormProps {
onSuccess: () => void;
onCancel: () => void;
}
interface ImportGroupConfig {
name: string;
description?: string;
servers?: string[] | Array<{ name: string; tools?: string[] | 'all' }>;
}
interface ImportJsonFormat {
groups: ImportGroupConfig[];
}
const GroupImportForm: React.FC<GroupImportFormProps> = ({ onSuccess, onCancel }) => {
const { t } = useTranslation();
const [jsonInput, setJsonInput] = useState('');
const [error, setError] = useState<string | null>(null);
const [isImporting, setIsImporting] = useState(false);
const [previewGroups, setPreviewGroups] = useState<ImportGroupConfig[] | null>(null);
const examplePlaceholder = `{
"groups": [
{
"name": "AI Assistants",
"servers": ["openai-server", "anthropic-server"]
},
{
"name": "Development Tools",
"servers": [
{
"name": "github-server",
"tools": ["create_issue", "list_repos"]
},
{
"name": "gitlab-server",
"tools": "all"
}
]
}
]
}
Supports:
- Simple server list: ["server1", "server2"]
- Advanced server config: [{"name": "server1", "tools": ["tool1", "tool2"]}]
- All groups will be imported in a single efficient batch operation.`;
const parseAndValidateJson = (input: string): ImportJsonFormat | null => {
try {
const parsed = JSON.parse(input.trim());
// Validate structure
if (!parsed.groups || !Array.isArray(parsed.groups)) {
setError(t('groupImport.invalidFormat'));
return null;
}
// Validate each group
for (const group of parsed.groups) {
if (!group.name || typeof group.name !== 'string') {
setError(t('groupImport.missingName'));
return null;
}
}
return parsed as ImportJsonFormat;
} catch (e) {
setError(t('groupImport.parseError'));
return null;
}
};
const handlePreview = () => {
setError(null);
const parsed = parseAndValidateJson(jsonInput);
if (!parsed) return;
setPreviewGroups(parsed.groups);
};
const handleImport = async () => {
if (!previewGroups) return;
setIsImporting(true);
setError(null);
try {
// Use batch import API for better performance
const result = await apiPost('/groups/batch', {
groups: previewGroups,
});
if (result.success) {
const { successCount, failureCount, results } = result;
if (failureCount > 0) {
const errors = results
.filter((r: any) => !r.success)
.map((r: any) => `${r.name}: ${r.message || t('groupImport.addFailed')}`);
setError(
t('groupImport.partialSuccess', { count: successCount, total: previewGroups.length }) +
'\n' +
errors.join('\n'),
);
}
if (successCount > 0) {
onSuccess();
}
} else {
setError(result.message || t('groupImport.importFailed'));
}
} catch (err) {
console.error('Import error:', err);
setError(t('groupImport.importFailed'));
} finally {
setIsImporting(false);
}
};
const renderServerList = (
servers?: string[] | Array<{ name: string; tools?: string[] | 'all' }>,
) => {
if (!servers || servers.length === 0) {
return <span className="text-gray-500">{t('groups.noServers')}</span>;
}
return (
<div className="space-y-1">
{servers.map((server, idx) => {
if (typeof server === 'string') {
return (
<div key={idx} className="text-sm">
{server}
</div>
);
} else {
return (
<div key={idx} className="text-sm">
{server.name}
{server.tools && server.tools !== 'all' && (
<span className="text-gray-500 ml-2">
({Array.isArray(server.tools) ? server.tools.join(', ') : server.tools})
</span>
)}
{server.tools === 'all' && <span className="text-gray-500 ml-2">(all tools)</span>}
</div>
);
}
})}
</div>
);
};
return (
<div className="fixed inset-0 bg-black/50 z-50 flex items-center justify-center p-4">
<div className="bg-white shadow rounded-lg p-6 w-full max-w-4xl max-h-[90vh] overflow-y-auto">
<div className="flex justify-between items-center mb-6">
<h2 className="text-xl font-semibold text-gray-900">{t('groupImport.title')}</h2>
<button onClick={onCancel} className="text-gray-500 hover:text-gray-700">
</button>
</div>
{error && (
<div className="mb-4 bg-red-50 border-l-4 border-red-500 p-4 rounded">
<p className="text-red-700 whitespace-pre-wrap">{error}</p>
</div>
)}
{!previewGroups ? (
<div>
<div className="mb-4">
<label className="block text-sm font-medium text-gray-700 mb-2">
{t('groupImport.inputLabel')}
</label>
<textarea
value={jsonInput}
onChange={(e) => setJsonInput(e.target.value)}
className="w-full h-96 px-3 py-2 border border-gray-300 rounded-md focus:outline-none focus:ring-2 focus:ring-blue-500 font-mono text-sm"
placeholder={examplePlaceholder}
/>
<p className="text-xs text-gray-500 mt-2">{t('groupImport.inputHelp')}</p>
</div>
<div className="flex justify-end space-x-4">
<button
onClick={onCancel}
className="px-4 py-2 text-gray-700 bg-gray-200 rounded hover:bg-gray-300 btn-secondary"
>
{t('common.cancel')}
</button>
<button
onClick={handlePreview}
disabled={!jsonInput.trim()}
className="px-4 py-2 bg-blue-600 text-white rounded hover:bg-blue-700 disabled:opacity-50 btn-primary"
>
{t('groupImport.preview')}
</button>
</div>
</div>
) : (
<div>
<div className="mb-4">
<h3 className="text-lg font-medium text-gray-900 mb-3">
{t('groupImport.previewTitle')}
</h3>
<div className="space-y-3">
{previewGroups.map((group, index) => (
<div key={index} className="bg-gray-50 p-4 rounded-lg border border-gray-200">
<div className="flex items-start justify-between">
<div className="flex-1">
<h4 className="font-medium text-gray-900">{group.name}</h4>
{group.description && (
<p className="text-sm text-gray-600 mt-1">{group.description}</p>
)}
<div className="mt-2 text-sm text-gray-600">
<strong>{t('groups.servers')}:</strong>
<div className="mt-1">{renderServerList(group.servers)}</div>
</div>
</div>
</div>
</div>
))}
</div>
</div>
<div className="flex justify-end space-x-4">
<button
onClick={() => setPreviewGroups(null)}
disabled={isImporting}
className="px-4 py-2 text-gray-700 bg-gray-200 rounded hover:bg-gray-300 disabled:opacity-50 btn-secondary"
>
{t('common.back')}
</button>
<button
onClick={handleImport}
disabled={isImporting}
className="px-4 py-2 bg-blue-600 text-white rounded hover:bg-blue-700 disabled:opacity-50 flex items-center btn-primary"
>
{isImporting ? (
<>
<svg
className="animate-spin h-4 w-4 mr-2"
xmlns="http://www.w3.org/2000/svg"
fill="none"
viewBox="0 0 24 24"
>
<circle
className="opacity-25"
cx="12"
cy="12"
r="10"
stroke="currentColor"
strokeWidth="4"
></circle>
<path
className="opacity-75"
fill="currentColor"
d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"
></path>
</svg>
{t('groupImport.importing')}
</>
) : (
t('groupImport.import')
)}
</button>
</div>
</div>
)}
</div>
</div>
);
};
export default GroupImportForm;


@@ -15,8 +15,11 @@ interface McpServerConfig {
url?: string;
headers?: Record<string, string>;
openapi?: {
-version: string;
+url?: string;
-url: string;
+schema?: Record<string, any>;
version?: string;
security?: any;
passthroughHeaders?: string[];
};
}
@@ -33,16 +36,29 @@ const JSONImportForm: React.FC<JSONImportFormProps> = ({ onSuccess, onCancel })
null,
);
-const examplePlaceholder = `{
+const examplePlaceholder = `STDIO example:
{
"mcpServers": {
"stdio-server-example": {
"command": "npx",
"args": ["-y", "mcp-server-example"]
-},
+}
}
}
SSE example:
{
"mcpServers": {
"sse-server-example": { "sse-server-example": {
"type": "sse", "type": "sse",
"url": "http://localhost:3000" "url": "http://localhost:3000"
}, }
}
}
HTTP example:
{
"mcpServers": {
"http-server-example": { "http-server-example": {
"type": "streamable-http", "type": "streamable-http",
"url": "http://localhost:3001", "url": "http://localhost:3001",
@@ -50,18 +66,37 @@ const JSONImportForm: React.FC<JSONImportFormProps> = ({ onSuccess, onCancel })
"Content-Type": "application/json", "Content-Type": "application/json",
"Authorization": "Bearer your-token" "Authorization": "Bearer your-token"
} }
}, }
}
}
OpenAPI example (correct format):
{
"mcpServers": {
"openapi-server-example": { "openapi-server-example": {
"type": "openapi", "type": "openapi",
"openapi": { "openapi": {
"url": "https://petstore.swagger.io/v2/swagger.json" "url": "http://localhost:3002/openapi.json"
},
"headers": {
"X-API-Key": "your-api-key"
} }
}
}
}
-Supports: STDIO, SSE, HTTP (streamable-http), OpenAPI
+OpenAPI example (legacy format, also supported):
-All servers will be imported in a single efficient batch operation.`;
+{
"mcpServers": {
"openapi-server-legacy": {
"type": "openapi",
"url": "http://localhost:3002/openapi.json",
"headers": {
"X-API-Key": "your-api-key"
}
}
}
}`;
const parseAndValidateJson = (input: string): ImportJsonFormat | null => {
try {
@@ -89,18 +124,34 @@ All servers will be imported in a single efficient batch operation.`;
// Normalize config to MCPHub format
const normalizedConfig: any = {};
// Handle different server types
if (config.type === 'sse' || config.type === 'streamable-http') {
// SSE and streamable-http servers use top-level url
normalizedConfig.type = config.type;
normalizedConfig.url = config.url;
if (config.headers) {
normalizedConfig.headers = config.headers;
}
} else if (config.type === 'openapi') {
// OpenAPI servers have special handling
normalizedConfig.type = 'openapi';
// Check if openapi configuration is already in correct format
if (config.openapi) {
normalizedConfig.openapi = config.openapi;
} else if (config.url) {
// Legacy format: convert top-level url to openapi.url
normalizedConfig.openapi = {
url: config.url,
};
}
if (config.headers) {
normalizedConfig.headers = config.headers;
}
} else {
-// Default to stdio
+// Command-based servers (stdio or unspecified type)
-normalizedConfig.type = 'stdio';
+normalizedConfig.type = config.type || 'stdio';
normalizedConfig.command = config.command;
normalizedConfig.args = config.args || [];
if (config.env) {
@@ -121,19 +172,29 @@ All servers will be imported in a single efficient batch operation.`;
setError(null);
try {
-// Use batch import API for better performance
+let successCount = 0;
-const result = await apiPost('/servers/batch', {
+const errors: string[] = [];
servers: previewServers,
for (const server of previewServers) {
try {
const result = await apiPost('/servers', {
name: server.name,
config: server.config,
});
-if (result.success && result.data) {
+if (result.success) {
-const { successCount, failureCount, results } = result.data;
+successCount++;
} else {
-if (failureCount > 0) {
+errors.push(`${server.name}: ${result.message || t('jsonImport.addFailed')}`);
-const errors = results
+}
-.filter((r: any) => !r.success)
+} catch (err) {
-.map((r: any) => `${r.name}: ${r.message || t('jsonImport.addFailed')}`);
+errors.push(
`${server.name}: ${err instanceof Error ? err.message : t('jsonImport.addFailed')}`,
);
}
}
if (errors.length > 0) {
setError(
t('jsonImport.partialSuccess', { count: successCount, total: previewServers.length }) +
'\n' +
@@ -144,9 +205,6 @@ All servers will be imported in a single efficient batch operation.`;
if (successCount > 0) {
onSuccess();
}
} else {
setError(result.message || t('jsonImport.importFailed'));
}
} catch (err) {
console.error('Import error:', err);
setError(t('jsonImport.importFailed'));
@@ -234,6 +292,16 @@ All servers will be imported in a single efficient batch operation.`;
<strong>{t('server.url')}:</strong> {server.config.url}
</div>
)}
{server.config.openapi?.url && (
<div>
<strong>OpenAPI URL:</strong> {server.config.openapi.url}
</div>
)}
{server.config.openapi?.schema && (
<div>
<strong>OpenAPI Schema:</strong> (Inline schema provided)
</div>
)}
{server.config.env && Object.keys(server.config.env).length > 0 && (
<div>
<strong>{t('server.envVars')}:</strong>{' '}


@@ -375,7 +375,6 @@ const ServerForm = ({
? {
url: formData.url,
...(Object.keys(headers).length > 0 ? { headers } : {}),
...(Object.keys(env).length > 0 ? { env } : {}),
...(oauthConfig ? { oauth: oauthConfig } : {}),
}
: {
@@ -429,6 +428,7 @@ const ServerForm = ({
className="shadow appearance-none border rounded w-full py-2 px-3 text-gray-700 leading-tight focus:outline-none focus:shadow-outline form-input" className="shadow appearance-none border rounded w-full py-2 px-3 text-gray-700 leading-tight focus:outline-none focus:shadow-outline form-input"
placeholder="e.g.: time-mcp" placeholder="e.g.: time-mcp"
required required
disabled={isEdit}
/>
</div>
@@ -978,49 +978,6 @@ const ServerForm = ({
))}
</div>
<div className="mb-4">
<div className="flex justify-between items-center mb-2">
<label className="block text-gray-700 text-sm font-bold">
{t('server.envVars')}
</label>
<button
type="button"
onClick={addEnvVar}
className="bg-gray-200 hover:bg-gray-300 text-gray-700 font-medium py-1 px-2 rounded text-sm flex items-center justify-center min-w-[30px] min-h-[30px] btn-primary"
>
+
</button>
</div>
{envVars.map((envVar, index) => (
<div key={index} className="flex items-center mb-2">
<div className="flex items-center space-x-2 flex-grow">
<input
type="text"
value={envVar.key}
onChange={(e) => handleEnvVarChange(index, 'key', e.target.value)}
className="shadow appearance-none border rounded py-2 px-3 text-gray-700 leading-tight focus:outline-none focus:shadow-outline w-1/2 form-input"
placeholder={t('server.key')}
/>
<span className="flex items-center">:</span>
<input
type="text"
value={envVar.value}
onChange={(e) => handleEnvVarChange(index, 'value', e.target.value)}
className="shadow appearance-none border rounded py-2 px-3 text-gray-700 leading-tight focus:outline-none focus:shadow-outline w-1/2 form-input"
placeholder={t('server.value')}
/>
</div>
<button
type="button"
onClick={() => removeEnvVar(index)}
className="bg-gray-200 hover:bg-gray-300 text-gray-700 font-medium py-1 px-2 rounded text-sm flex items-center justify-center min-w-[30px] min-h-[30px] ml-2 btn-danger"
>
-
</button>
</div>
))}
</div>
<div className="mb-4"> <div className="mb-4">
<div <div
className="flex items-center justify-between cursor-pointer bg-gray-50 hover:bg-gray-100 p-3 rounded border border-gray-200" className="flex items-center justify-between cursor-pointer bg-gray-50 hover:bg-gray-100 p-3 rounded border border-gray-200"


@@ -21,14 +21,7 @@ interface DynamicFormProps {
title?: string; // Optional title to display instead of default parameters title
}
-const DynamicForm: React.FC<DynamicFormProps> = ({
+const DynamicForm: React.FC<DynamicFormProps> = ({ schema, onSubmit, onCancel, loading = false, storageKey, title }) => {
schema,
onSubmit,
onCancel,
loading = false,
storageKey,
title,
}) => {
const { t } = useTranslation();
const [formValues, setFormValues] = useState<Record<string, any>>({});
const [errors, setErrors] = useState<Record<string, string>>({});
@@ -47,14 +40,9 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
description: obj.description,
enum: obj.enum,
default: obj.default,
-properties: obj.properties
+properties: obj.properties ? Object.fromEntries(
-? Object.fromEntries(
+Object.entries(obj.properties).map(([key, value]) => [key, convertProperty(value)])
-Object.entries(obj.properties).map(([key, value]) => [
+) : undefined,
key,
convertProperty(value),
]),
)
: undefined,
required: obj.required,
items: obj.items ? convertProperty(obj.items) : undefined,
};
@@ -64,14 +52,9 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
return {
type: schema.type,
-properties: schema.properties
+properties: schema.properties ? Object.fromEntries(
-? Object.fromEntries(
+Object.entries(schema.properties).map(([key, value]) => [key, convertProperty(value)])
-Object.entries(schema.properties).map(([key, value]) => [
+) : undefined,
key,
convertProperty(value),
]),
)
: undefined,
required: schema.required,
};
};
@@ -184,7 +167,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
};
const handleInputChange = (path: string, value: any) => {
-setFormValues((prev) => {
+setFormValues(prev => {
const newValues = { ...prev };
const keys = path.split('.');
let current = newValues;
@@ -212,7 +195,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
// Clear error for this field
if (errors[path]) {
-setErrors((prev) => {
+setErrors(prev => {
const newErrors = { ...prev };
delete newErrors[path];
return newErrors;
@@ -226,16 +209,10 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
if (schema.type === 'object' && schema.properties) {
Object.entries(schema.properties).forEach(([key, propSchema]) => {
const fullPath = path ? `${path}.${key}` : key;
-const value = values?.[key];
+const value = getNestedValue(values, fullPath);
// Check required fields
-if (
+if (schema.required?.includes(key) && (value === undefined || value === null || value === '')) {
schema.required?.includes(key) &&
(value === undefined ||
value === null ||
value === '' ||
(Array.isArray(value) && value.length === 0))
) {
newErrors[fullPath] = `${key} is required`;
return;
}
@@ -246,10 +223,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
newErrors[fullPath] = `${key} must be a string`;
} else if (propSchema.type === 'number' && typeof value !== 'number') {
newErrors[fullPath] = `${key} must be a number`;
-} else if (
+} else if (propSchema.type === 'integer' && (!Number.isInteger(value) || typeof value !== 'number')) {
propSchema.type === 'integer' &&
(!Number.isInteger(value) || typeof value !== 'number')
) {
newErrors[fullPath] = `${key} must be an integer`;
} else if (propSchema.type === 'boolean' && typeof value !== 'boolean') {
newErrors[fullPath] = `${key} must be a boolean`;
@@ -286,12 +260,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
return path.split('.').reduce((current, key) => current?.[key], obj);
};
-const renderObjectField = (
+const renderObjectField = (key: string, schema: JsonSchema, currentValue: any, onChange: (value: any) => void): React.ReactNode => {
key: string,
schema: JsonSchema,
currentValue: any,
onChange: (value: any) => void,
): React.ReactNode => {
const value = currentValue?.[key];
if (schema.type === 'string') {
@@ -330,12 +299,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
step={schema.type === 'integer' ? '1' : 'any'}
value={value ?? ''}
onChange={(e) => {
-const val =
+const val = e.target.value === '' ? '' : schema.type === 'integer' ? parseInt(e.target.value) : parseFloat(e.target.value);
e.target.value === ''
? ''
: schema.type === 'integer'
? parseInt(e.target.value)
: parseFloat(e.target.value);
onChange(val);
}}
className="w-full border rounded-md px-2 py-1 text-sm border-gray-300 focus:outline-none focus:ring-1 focus:ring-blue-500 form-input"
@@ -377,11 +341,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
<div key={fullPath} className="mb-6"> <div key={fullPath} className="mb-6">
<label className="block text-sm font-medium text-gray-700 mb-1"> <label className="block text-sm font-medium text-gray-700 mb-1">
{key} {key}
{(path {(path ? getNestedValue(jsonSchema, path)?.required?.includes(key) : jsonSchema.required?.includes(key)) && <span className="text-status-red ml-1">*</span>}
? getNestedValue(jsonSchema, path)?.required?.includes(key)
: jsonSchema.required?.includes(key)) && (
<span className="text-status-red ml-1">*</span>
)}
</label>
{propSchema.description && (
<p className="text-xs text-gray-500 mb-2">{propSchema.description}</p>
@@ -389,11 +349,9 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
<div className="border border-gray-200 rounded-md p-3 bg-gray-50"> <div className="border border-gray-200 rounded-md p-3 bg-gray-50">
{arrayValue.map((item: any, index: number) => ( {arrayValue.map((item: any, index: number) => (
<div key={index} className="mb-3 p-3 bg-white border border-gray-200 rounded-md"> <div key={index} className="mb-3 p-3 bg-white border rounded-md">
<div className="flex justify-between items-center mb-2"> <div className="flex justify-between items-center mb-2">
<span className="text-sm font-medium text-gray-600"> <span className="text-sm font-medium text-gray-600">{t('tool.item', { index: index + 1 })}</span>
{t('tool.item', { index: index + 1 })}
</span>
<button <button
type="button" type="button"
onClick={() => { onClick={() => {
@@ -430,9 +388,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
<div key={objKey}> <div key={objKey}>
<label className="block text-xs font-medium text-gray-600 mb-1"> <label className="block text-xs font-medium text-gray-600 mb-1">
{objKey} {objKey}
{propSchema.items?.required?.includes(objKey) && ( {propSchema.items?.required?.includes(objKey) && <span className="text-status-red ml-1">*</span>}
<span className="text-status-red ml-1">*</span>
)}
</label> </label>
{renderObjectField(objKey, objSchema as JsonSchema, item, (newValue) => { {renderObjectField(objKey, objSchema as JsonSchema, item, (newValue) => {
const newArray = [...arrayValue]; const newArray = [...arrayValue];
@@ -481,20 +437,16 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
<div key={fullPath} className="mb-6"> <div key={fullPath} className="mb-6">
<label className="block text-sm font-medium text-gray-700 mb-1"> <label className="block text-sm font-medium text-gray-700 mb-1">
{key} {key}
{(path {(path ? getNestedValue(jsonSchema, path)?.required?.includes(key) : jsonSchema.required?.includes(key)) && <span className="text-status-red ml-1">*</span>}
? getNestedValue(jsonSchema, path)?.required?.includes(key)
: jsonSchema.required?.includes(key)) && (
<span className="text-status-red ml-1">*</span>
)}
</label> </label>
{propSchema.description && ( {propSchema.description && (
<p className="text-xs text-gray-500 mb-2">{propSchema.description}</p> <p className="text-xs text-gray-500 mb-2">{propSchema.description}</p>
)} )}
<div className="border border-gray-200 rounded-md p-4 bg-gray-50"> <div className="border border-gray-200 rounded-md p-4 bg-gray-50">
{Object.entries(propSchema.properties).map(([objKey, objSchema]) => {Object.entries(propSchema.properties).map(([objKey, objSchema]) => (
renderField(objKey, objSchema as JsonSchema, fullPath), renderField(objKey, objSchema as JsonSchema, fullPath)
)} ))}
</div> </div>
{error && <p className="text-status-red text-xs mt-1">{error}</p>} {error && <p className="text-status-red text-xs mt-1">{error}</p>}
@@ -506,11 +458,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
<div key={fullPath} className="mb-4"> <div key={fullPath} className="mb-4">
<label className="block text-sm font-medium text-gray-700 mb-1"> <label className="block text-sm font-medium text-gray-700 mb-1">
{key} {key}
{(path {(path ? getNestedValue(jsonSchema, path)?.required?.includes(key) : jsonSchema.required?.includes(key)) && <span className="text-status-red ml-1">*</span>}
? getNestedValue(jsonSchema, path)?.required?.includes(key)
: jsonSchema.required?.includes(key)) && (
<span className="text-status-red ml-1">*</span>
)}
<span className="text-xs text-gray-500 ml-1">(JSON object)</span> <span className="text-xs text-gray-500 ml-1">(JSON object)</span>
</label> </label>
{propSchema.description && ( {propSchema.description && (
@@ -535,16 +483,13 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
</div> </div>
); );
} }
} } if (propSchema.type === 'string') {
if (propSchema.type === 'string') {
if (propSchema.enum) { if (propSchema.enum) {
return ( return (
<div key={fullPath} className="mb-4"> <div key={fullPath} className="mb-4">
<label className="block text-sm font-medium text-gray-700 mb-1"> <label className="block text-sm font-medium text-gray-700 mb-1">
{key} {key}
{(path ? false : jsonSchema.required?.includes(key)) && ( {(path ? false : jsonSchema.required?.includes(key)) && <span className="text-status-red ml-1">*</span>}
<span className="text-status-red ml-1">*</span>
)}
</label> </label>
{propSchema.description && ( {propSchema.description && (
<p className="text-xs text-gray-500 mb-2">{propSchema.description}</p> <p className="text-xs text-gray-500 mb-2">{propSchema.description}</p>
@@ -569,9 +514,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
<div key={fullPath} className="mb-4"> <div key={fullPath} className="mb-4">
<label className="block text-sm font-medium text-gray-700 mb-1"> <label className="block text-sm font-medium text-gray-700 mb-1">
{key} {key}
{(path ? false : jsonSchema.required?.includes(key)) && ( {(path ? false : jsonSchema.required?.includes(key)) && <span className="text-status-red ml-1">*</span>}
<span className="text-status-red ml-1">*</span>
)}
</label> </label>
{propSchema.description && ( {propSchema.description && (
<p className="text-xs text-gray-500 mb-2">{propSchema.description}</p> <p className="text-xs text-gray-500 mb-2">{propSchema.description}</p>
@@ -586,15 +529,12 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
</div> </div>
); );
} }
} } if (propSchema.type === 'number' || propSchema.type === 'integer') {
if (propSchema.type === 'number' || propSchema.type === 'integer') {
return ( return (
<div key={fullPath} className="mb-4"> <div key={fullPath} className="mb-4">
<label className="block text-sm font-medium text-gray-700 mb-1"> <label className="block text-sm font-medium text-gray-700 mb-1">
{key} {key}
{(path ? false : jsonSchema.required?.includes(key)) && ( {(path ? false : jsonSchema.required?.includes(key)) && <span className="text-status-red ml-1">*</span>}
<span className="text-status-red ml-1">*</span>
)}
</label> </label>
{propSchema.description && ( {propSchema.description && (
<p className="text-xs text-gray-500 mb-2">{propSchema.description}</p> <p className="text-xs text-gray-500 mb-2">{propSchema.description}</p>
@@ -604,12 +544,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
step={propSchema.type === 'integer' ? '1' : 'any'} step={propSchema.type === 'integer' ? '1' : 'any'}
value={value !== undefined && value !== null ? value : ''} value={value !== undefined && value !== null ? value : ''}
onChange={(e) => { onChange={(e) => {
const val = const val = e.target.value === '' ? '' : propSchema.type === 'integer' ? parseInt(e.target.value) : parseFloat(e.target.value);
e.target.value === ''
? ''
: propSchema.type === 'integer'
? parseInt(e.target.value)
: parseFloat(e.target.value);
handleInputChange(fullPath, val); handleInputChange(fullPath, val);
}} }}
className={`w-full border rounded-md px-3 py-2 form-input ${error ? 'border-red-500' : 'border-gray-300'} focus:outline-none focus:ring-2 focus:ring-blue-500`} className={`w-full border rounded-md px-3 py-2 form-input ${error ? 'border-red-500' : 'border-gray-300'} focus:outline-none focus:ring-2 focus:ring-blue-500`}
@@ -631,9 +566,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
/> />
<label className="ml-2 block text-sm text-gray-700"> <label className="ml-2 block text-sm text-gray-700">
{key} {key}
{(path ? false : jsonSchema.required?.includes(key)) && ( {(path ? false : jsonSchema.required?.includes(key)) && <span className="text-status-red ml-1">*</span>}
<span className="text-status-red ml-1">*</span>
)}
</label> </label>
</div> </div>
{propSchema.description && ( {propSchema.description && (
@@ -647,9 +580,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
<div key={fullPath} className="mb-4"> <div key={fullPath} className="mb-4">
<label className="block text-sm font-medium text-gray-700 mb-1"> <label className="block text-sm font-medium text-gray-700 mb-1">
{key} {key}
{(path ? false : jsonSchema.required?.includes(key)) && ( {(path ? false : jsonSchema.required?.includes(key)) && <span className="text-status-red ml-1">*</span>}
<span className="text-status-red ml-1">*</span>
)}
<span className="text-xs text-gray-500 ml-1">({propSchema.type})</span> <span className="text-xs text-gray-500 ml-1">({propSchema.type})</span>
</label> </label>
{propSchema.description && ( {propSchema.description && (
@@ -700,8 +631,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
<button <button
type="button" type="button"
onClick={switchToFormMode} onClick={switchToFormMode}
className={`px-3 py-1 text-sm rounded-md transition-colors ${ className={`px-3 py-1 text-sm rounded-md transition-colors ${!isJsonMode
!isJsonMode
? 'bg-blue-100 text-blue-800 rounded hover:bg-blue-200 text-sm btn-primary' ? 'bg-blue-100 text-blue-800 rounded hover:bg-blue-200 text-sm btn-primary'
: 'text-sm text-gray-600 bg-gray-200 rounded hover:bg-gray-300 btn-secondary' : 'text-sm text-gray-600 bg-gray-200 rounded hover:bg-gray-300 btn-secondary'
}`} }`}
@@ -711,8 +641,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
<button <button
type="button" type="button"
onClick={switchToJsonMode} onClick={switchToJsonMode}
className={`px-3 py-1 text-sm rounded-md transition-colors ${ className={`px-3 py-1 text-sm rounded-md transition-colors ${isJsonMode
isJsonMode
? 'px-4 py-1 bg-blue-100 text-blue-800 rounded hover:bg-blue-200 text-sm btn-primary' ? 'px-4 py-1 bg-blue-100 text-blue-800 rounded hover:bg-blue-200 text-sm btn-primary'
: 'text-sm text-gray-600 bg-gray-200 rounded hover:bg-gray-300 btn-secondary' : 'text-sm text-gray-600 bg-gray-200 rounded hover:bg-gray-300 btn-secondary'
}`} }`}
@@ -733,8 +662,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
value={jsonText} value={jsonText}
onChange={(e) => handleJsonTextChange(e.target.value)} onChange={(e) => handleJsonTextChange(e.target.value)}
placeholder={`{\n "key": "value"\n}`} placeholder={`{\n "key": "value"\n}`}
className={`w-full h-64 border rounded-md px-3 py-2 font-mono text-sm resize-y form-input ${ className={`w-full h-64 border rounded-md px-3 py-2 font-mono text-sm resize-y form-input ${jsonError ? 'border-red-500' : 'border-gray-300'
jsonError ? 'border-red-500' : 'border-gray-300'
} focus:outline-none focus:ring-2 focus:ring-blue-500`} } focus:outline-none focus:ring-2 focus:ring-blue-500`}
/> />
{jsonError && <p className="text-status-red text-xs mt-1">{jsonError}</p>} {jsonError && <p className="text-status-red text-xs mt-1">{jsonError}</p>}
@@ -768,7 +696,7 @@ const DynamicForm: React.FC<DynamicFormProps> = ({
/* Form Mode */ /* Form Mode */
<form onSubmit={handleSubmit} className="space-y-4"> <form onSubmit={handleSubmit} className="space-y-4">
{Object.entries(jsonSchema.properties || {}).map(([key, propSchema]) => {Object.entries(jsonSchema.properties || {}).map(([key, propSchema]) =>
renderField(key, propSchema), renderField(key, propSchema)
)} )}
<div className="flex justify-end space-x-2 pt-4"> <div className="flex justify-end space-x-2 pt-4">

View File

@@ -1,161 +0,0 @@
import React, { useState, useRef, useEffect } from 'react';
import { Check, ChevronDown, X } from 'lucide-react';
interface MultiSelectProps {
options: { value: string; label: string }[];
selected: string[];
onChange: (selected: string[]) => void;
placeholder?: string;
disabled?: boolean;
className?: string;
}
export const MultiSelect: React.FC<MultiSelectProps> = ({
options,
selected,
onChange,
placeholder = 'Select items...',
disabled = false,
className = '',
}) => {
const [isOpen, setIsOpen] = useState(false);
const [searchTerm, setSearchTerm] = useState('');
const dropdownRef = useRef<HTMLDivElement>(null);
const inputRef = useRef<HTMLInputElement>(null);
// Close dropdown when clicking outside
useEffect(() => {
const handleClickOutside = (event: MouseEvent) => {
if (dropdownRef.current && !dropdownRef.current.contains(event.target as Node)) {
setIsOpen(false);
setSearchTerm('');
}
};
document.addEventListener('mousedown', handleClickOutside);
return () => document.removeEventListener('mousedown', handleClickOutside);
}, []);
const filteredOptions = options.filter((option) =>
option.label.toLowerCase().includes(searchTerm.toLowerCase()),
);
const handleToggleOption = (value: string) => {
if (disabled) return;
const newSelected = selected.includes(value)
? selected.filter((item) => item !== value)
: [...selected, value];
onChange(newSelected);
};
const handleRemoveItem = (value: string, e: React.MouseEvent) => {
e.stopPropagation();
if (disabled) return;
onChange(selected.filter((item) => item !== value));
};
const handleToggleDropdown = () => {
if (disabled) return;
setIsOpen(!isOpen);
if (!isOpen) {
setTimeout(() => inputRef.current?.focus(), 0);
}
};
const getSelectedLabels = () => {
return selected
.map((value) => options.find((opt) => opt.value === value)?.label || value)
.filter(Boolean);
};
return (
<div ref={dropdownRef} className={`relative ${className}`}>
{/* Selected items display */}
<div
onClick={handleToggleDropdown}
className={`
min-h-[38px] w-full px-3 py-1.5 border rounded-md shadow-sm
flex flex-wrap items-center gap-1.5 cursor-pointer
transition-all duration-200
${disabled ? 'bg-gray-100 cursor-not-allowed' : 'bg-white hover:border-blue-400'}
${isOpen ? 'border-blue-500 ring-1 ring-blue-500' : 'border-gray-300'}
`}
>
{selected.length > 0 ? (
<>
{getSelectedLabels().map((label, index) => (
<span
key={selected[index]}
className="inline-flex items-center px-2 py-0.5 rounded text-xs font-medium bg-blue-100 text-blue-800"
>
{label}
{!disabled && (
<button
type="button"
onClick={(e) => handleRemoveItem(selected[index], e)}
className="ml-1 hover:bg-blue-200 rounded-full p-0.5 transition-colors"
>
<X className="h-3 w-3" />
</button>
)}
</span>
))}
</>
) : (
<span className="text-gray-400 text-sm">{placeholder}</span>
)}
<div className="flex-1"></div>
<ChevronDown
className={`h-4 w-4 text-gray-400 transition-transform duration-200 ${isOpen ? 'transform rotate-180' : ''}`}
/>
</div>
{/* Dropdown menu */}
{isOpen && !disabled && (
<div className="absolute z-50 w-full mt-1 bg-white border border-gray-300 rounded-md shadow-lg max-h-60 overflow-hidden">
{/* Search input */}
<div className="p-2 border-b border-gray-200">
<input
ref={inputRef}
type="text"
value={searchTerm}
onChange={(e) => setSearchTerm(e.target.value)}
placeholder="Search..."
className="w-full px-3 py-1.5 text-sm border border-gray-300 rounded focus:outline-none focus:ring-1 focus:ring-blue-500 focus:border-blue-500"
onClick={(e) => e.stopPropagation()}
/>
</div>
{/* Options list */}
<div className="max-h-48 overflow-y-auto">
{filteredOptions.length > 0 ? (
filteredOptions.map((option) => {
const isSelected = selected.includes(option.value);
return (
<div
key={option.value}
onClick={() => handleToggleOption(option.value)}
className={`
px-3 py-2 cursor-pointer flex items-center justify-between
transition-colors duration-150
${isSelected ? 'bg-blue-50 text-blue-700' : 'hover:bg-gray-100'}
`}
>
<span className="text-sm">{option.label}</span>
{isSelected && <Check className="h-4 w-4 text-blue-600" />}
</div>
);
})
) : (
<div className="px-3 py-2 text-sm text-gray-500 text-center">
{searchTerm ? 'No results found' : 'No options available'}
</div>
)}
</div>
</div>
)}
</div>
);
};

View File

@@ -14,17 +14,14 @@ const initialState: AuthState = {
// Create auth context
const AuthContext = createContext<{
auth: AuthState;
-login: (
-username: string,
-password: string,
-) => Promise<{ success: boolean; isUsingDefaultPassword?: boolean; message?: string }>;
+login: (username: string, password: string) => Promise<{ success: boolean; isUsingDefaultPassword?: boolean }>;
register: (username: string, password: string, isAdmin?: boolean) => Promise<boolean>;
logout: () => void;
}>({
auth: initialState,
login: async () => ({ success: false }),
register: async () => false,
-logout: () => {},
+logout: () => { },
});
// Auth provider component
@@ -93,10 +90,7 @@ export const AuthProvider: React.FC<{ children: ReactNode }> = ({ children }) =>
}, []);
// Login function
-const login = async (
-username: string,
-password: string,
-): Promise<{ success: boolean; isUsingDefaultPassword?: boolean; message?: string }> => {
+const login = async (username: string, password: string): Promise<{ success: boolean; isUsingDefaultPassword?: boolean }> => {
try {
const response = await authService.login({ username, password });
@@ -117,7 +111,7 @@ export const AuthProvider: React.FC<{ children: ReactNode }> = ({ children }) =>
loading: false,
error: response.message || 'Authentication failed',
});
-return { success: false, message: response.message };
+return { success: false };
}
} catch (error) {
setAuth({
@@ -125,7 +119,7 @@ export const AuthProvider: React.FC<{ children: ReactNode }> = ({ children }) =>
loading: false,
error: 'Authentication failed',
});
-return { success: false, message: error instanceof Error ? error.message : undefined };
+return { success: false };
}
};
@@ -133,7 +127,7 @@ export const AuthProvider: React.FC<{ children: ReactNode }> = ({ children }) =>
const register = async (
username: string,
password: string,
-isAdmin = false,
+isAdmin = false
): Promise<boolean> => {
try {
const response = await authService.register({ username, password, isAdmin });

View File

@@ -7,10 +7,9 @@ import React, {
ReactNode,
} from 'react';
import { useTranslation } from 'react-i18next';
-import { ApiResponse, BearerKey } from '@/types';
+import { ApiResponse } from '@/types';
import { useToast } from '@/contexts/ToastContext';
-import { useAuth } from '@/contexts/AuthContext';
-import { apiGet, apiPut, apiPost, apiDelete } from '@/utils/fetchInterceptor';
+import { apiGet, apiPut } from '@/utils/fetchInterceptor';
// Define types for the settings data
interface RoutingConfig {
@@ -67,7 +66,6 @@ interface SystemSettings {
oauthServer?: OAuthServerConfig;
enableSessionRebuild?: boolean;
};
-bearerKeys?: BearerKey[];
}
interface TempRoutingConfig {
@@ -84,7 +82,6 @@ interface SettingsContextValue {
oauthServerConfig: OAuthServerConfig;
nameSeparator: string;
enableSessionRebuild: boolean;
-bearerKeys: BearerKey[];
loading: boolean;
error: string | null;
setError: React.Dispatch<React.SetStateAction<string | null>>;
@@ -112,14 +109,6 @@ interface SettingsContextValue {
updateNameSeparator: (value: string) => Promise<boolean | undefined>;
updateSessionRebuild: (value: boolean) => Promise<boolean | undefined>;
exportMCPSettings: (serverName?: string) => Promise<any>;
-// Bearer key management
-refreshBearerKeys: () => Promise<void>;
-createBearerKey: (payload: Omit<BearerKey, 'id'>) => Promise<BearerKey | null>;
-updateBearerKey: (
-id: string,
-updates: Partial<Omit<BearerKey, 'id'>>,
-) => Promise<BearerKey | null>;
-deleteBearerKey: (id: string) => Promise<boolean>;
}
const getDefaultOAuthServerConfig = (): OAuthServerConfig => ({
@@ -154,7 +143,6 @@ interface SettingsProviderProps {
export const SettingsProvider: React.FC<SettingsProviderProps> = ({ children }) => {
const { t } = useTranslation();
const { showToast } = useToast();
-const { auth } = useAuth();
const [routingConfig, setRoutingConfig] = useState<RoutingConfig>({
enableGlobalRoute: true,
@@ -195,7 +183,6 @@ export const SettingsProvider: React.FC<SettingsProviderProps> = ({ children })
const [nameSeparator, setNameSeparator] = useState<string>('-');
const [enableSessionRebuild, setEnableSessionRebuild] = useState<boolean>(false);
-const [bearerKeys, setBearerKeys] = useState<BearerKey[]>([]);
const [loading, setLoading] = useState(false);
const [error, setError] = useState<string | null>(null);
@@ -292,10 +279,6 @@ export const SettingsProvider: React.FC<SettingsProviderProps> = ({ children })
if (data.success && data.data?.systemConfig?.enableSessionRebuild !== undefined) {
setEnableSessionRebuild(data.data.systemConfig.enableSessionRebuild);
}
-if (data.success && Array.isArray(data.data?.bearerKeys)) {
-setBearerKeys(data.data.bearerKeys);
-}
} catch (error) {
console.error('Failed to fetch settings:', error);
setError(error instanceof Error ? error.message : 'Failed to fetch settings');
@@ -676,87 +659,11 @@ export const SettingsProvider: React.FC<SettingsProviderProps> = ({ children })
}
};
// Bearer key management helpers
const refreshBearerKeys = async () => {
try {
const data: ApiResponse<BearerKey[]> = await apiGet('/auth/keys');
if (data.success && Array.isArray(data.data)) {
setBearerKeys(data.data);
}
} catch (error) {
console.error('Failed to refresh bearer keys:', error);
showToast(t('errors.failedToFetchSettings'));
}
};
const createBearerKey = async (payload: Omit<BearerKey, 'id'>): Promise<BearerKey | null> => {
try {
const data: ApiResponse<BearerKey> = await apiPost('/auth/keys', payload as any);
if (data.success && data.data) {
await refreshBearerKeys();
showToast(t('settings.systemConfigUpdated'));
return data.data;
}
showToast(data.message || t('errors.failedToUpdateRoutingConfig'));
return null;
} catch (error) {
console.error('Failed to create bearer key:', error);
showToast(t('errors.failedToUpdateRoutingConfig'));
return null;
}
};
const updateBearerKey = async (
id: string,
updates: Partial<Omit<BearerKey, 'id'>>,
): Promise<BearerKey | null> => {
try {
const data: ApiResponse<BearerKey> = await apiPut(`/auth/keys/${id}`, updates as any);
if (data.success && data.data) {
await refreshBearerKeys();
showToast(t('settings.systemConfigUpdated'));
return data.data;
}
showToast(data.message || t('errors.failedToUpdateRoutingConfig'));
return null;
} catch (error) {
console.error('Failed to update bearer key:', error);
showToast(t('errors.failedToUpdateRoutingConfig'));
return null;
}
};
const deleteBearerKey = async (id: string): Promise<boolean> => {
try {
const data: ApiResponse = await apiDelete(`/auth/keys/${id}`);
if (data.success) {
await refreshBearerKeys();
showToast(t('settings.systemConfigUpdated'));
return true;
}
showToast(data.message || t('errors.failedToUpdateRoutingConfig'));
return false;
} catch (error) {
console.error('Failed to delete bearer key:', error);
showToast(t('errors.failedToUpdateRoutingConfig'));
return false;
}
};
// Fetch settings when the component mounts or refreshKey changes
useEffect(() => {
fetchSettings();
}, [fetchSettings, refreshKey]);
-// Watch for authentication status changes - refetch settings after login
-useEffect(() => {
-if (auth.isAuthenticated) {
-console.log('[SettingsContext] User authenticated, triggering settings refresh');
-// When user logs in, trigger a refresh to load settings
-triggerRefresh();
-}
-}, [auth.isAuthenticated, triggerRefresh]);
useEffect(() => {
if (routingConfig) {
setTempRoutingConfig({
@@ -775,7 +682,6 @@ export const SettingsProvider: React.FC<SettingsProviderProps> = ({ children })
oauthServerConfig,
nameSeparator,
enableSessionRebuild,
-bearerKeys,
loading,
error,
setError,
@@ -793,10 +699,6 @@ export const SettingsProvider: React.FC<SettingsProviderProps> = ({ children })
updateNameSeparator,
updateSessionRebuild,
exportMCPSettings,
-refreshBearerKeys,
-createBearerKey,
-updateBearerKey,
-deleteBearerKey,
};
return <SettingsContext.Provider value={value}>{children}</SettingsContext.Provider>;

View File

@@ -6,7 +6,6 @@ import { useServerData } from '@/hooks/useServerData';
import AddGroupForm from '@/components/AddGroupForm';
import EditGroupForm from '@/components/EditGroupForm';
import GroupCard from '@/components/GroupCard';
-import GroupImportForm from '@/components/GroupImportForm';
const GroupsPage: React.FC = () => {
const { t } = useTranslation();
@@ -16,13 +15,12 @@
error: groupError,
setError: setGroupError,
deleteGroup,
-triggerRefresh,
+triggerRefresh
} = useGroupData();
const { servers } = useServerData({ refreshOnMount: true });
const [editingGroup, setEditingGroup] = useState<Group | null>(null);
const [showAddForm, setShowAddForm] = useState(false);
-const [showImportForm, setShowImportForm] = useState(false);
const handleEditClick = (group: Group) => {
setEditingGroup(group);
@@ -49,11 +47,6 @@
triggerRefresh(); // Refresh the groups list after adding
};
-const handleImportSuccess = () => {
-setShowImportForm(false);
-triggerRefresh(); // Refresh the groups list after import
-};
return (
<div>
<div className="flex justify-between items-center mb-8">
@@ -63,38 +56,11 @@ const GroupsPage: React.FC = () => {
onClick={handleAddGroup}
className="px-4 py-2 bg-blue-100 text-blue-800 rounded hover:bg-blue-200 flex items-center btn-primary transition-all duration-200"
>
-<svg
-xmlns="http://www.w3.org/2000/svg"
-className="h-4 w-4 mr-2"
-viewBox="0 0 20 20"
-fill="currentColor"
->
-<path
-fillRule="evenodd"
-d="M10 3a1 1 0 00-1 1v5H4a1 1 0 100 2h5v5a1 1 0 102 0v-5h5a1 1 0 100-2h-5V4a1 1 0 00-1-1z"
-clipRule="evenodd"
-/>
+<svg xmlns="http://www.w3.org/2000/svg" className="h-4 w-4 mr-2" viewBox="0 0 20 20" fill="currentColor">
+<path fillRule="evenodd" d="M10 3a1 1 0 00-1 1v5H4a1 1 0 100 2h5v5a1 1 0 102 0v-5h5a1 1 0 100-2h-5V4a1 1 0 00-1-1z" clipRule="evenodd" />
</svg>
{t('groups.add')}
</button>
<button
onClick={() => setShowImportForm(true)}
className="px-4 py-2 bg-blue-100 text-blue-800 rounded hover:bg-blue-200 flex items-center btn-primary transition-all duration-200"
>
<svg
xmlns="http://www.w3.org/2000/svg"
className="h-4 w-4 mr-2"
viewBox="0 0 20 20"
fill="currentColor"
>
<path
fillRule="evenodd"
d="M3 17a1 1 0 011-1h12a1 1 0 110 2H4a1 1 0 01-1-1zm3.293-7.707a1 1 0 011.414 0L9 10.586V3a1 1 0 112 0v7.586l1.293-1.293a1 1 0 111.414 1.414l-3 3a1 1 0 01-1.414 0l-3-3a1 1 0 010-1.414z"
clipRule="evenodd"
/>
</svg>
{t('groupImport.button')}
</button>
</div> </div>
</div> </div>
@@ -107,25 +73,9 @@
{groupsLoading ? (
<div className="bg-white shadow rounded-lg p-6 loading-container">
<div className="flex flex-col items-center justify-center">
-<svg
-className="animate-spin h-10 w-10 text-blue-500 mb-4"
-xmlns="http://www.w3.org/2000/svg"
-fill="none"
-viewBox="0 0 24 24"
->
-<circle
-className="opacity-25"
-cx="12"
-cy="12"
-r="10"
-stroke="currentColor"
-strokeWidth="4"
-></circle>
-<path
-className="opacity-75"
-fill="currentColor"
-d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"
-></path>
+<svg className="animate-spin h-10 w-10 text-blue-500 mb-4" xmlns="http://www.w3.org/2000/svg" fill="none" viewBox="0 0 24 24">
+<circle className="opacity-25" cx="12" cy="12" r="10" stroke="currentColor" strokeWidth="4"></circle>
+<path className="opacity-75" fill="currentColor" d="M4 12a8 8 0 018-8V0C5.373 0 0 5.373 0 12h4zm2 5.291A7.962 7.962 0 014 12H0c0 3.042 1.135 5.824 3 7.938l3-2.647z"></path>
</svg>
<p className="text-gray-600">{t('app.loading')}</p>
</div>
@@ -148,13 +98,8 @@
</div>
)}
-{showAddForm && <AddGroupForm onAdd={handleAddComplete} onCancel={handleAddComplete} />}
-{showImportForm && (
-<GroupImportForm
-onSuccess={handleImportSuccess}
-onCancel={() => setShowImportForm(false)}
-/>
+{showAddForm && (
+<AddGroupForm onAdd={handleAddComplete} onCancel={handleAddComplete} />
)}
{editingGroup && (

View File

@@ -44,24 +44,6 @@ const LoginPage: React.FC = () => {
return sanitizeReturnUrl(params.get('returnUrl'));
}, [location.search]);
-const isServerUnavailableError = useCallback((message?: string) => {
-if (!message) return false;
-const normalized = message.toLowerCase();
-return (
-normalized.includes('failed to fetch') ||
-normalized.includes('networkerror') ||
-normalized.includes('network error') ||
-normalized.includes('connection refused') ||
-normalized.includes('unable to connect') ||
-normalized.includes('fetch error') ||
-normalized.includes('econnrefused') ||
-normalized.includes('http 500') ||
-normalized.includes('internal server error') ||
-normalized.includes('proxy error')
-);
-}, []);
const buildRedirectTarget = useCallback(() => {
if (!returnUrl) {
return '/';
@@ -117,21 +99,11 @@
} else {
redirectAfterLogin();
}
-} else {
-const message = result.message;
-if (isServerUnavailableError(message)) {
-setError(t('auth.serverUnavailable'));
} else {
setError(t('auth.loginFailed'));
}
-}
} catch (err) {
-const message = err instanceof Error ? err.message : undefined;
-if (isServerUnavailableError(message)) {
-setError(t('auth.serverUnavailable'));
-} else {
setError(t('auth.loginError'));
-}
} finally {
setLoading(false);
}
@@ -159,21 +131,13 @@
}}
/>
<div className="pointer-events-none absolute inset-0 -z-10">
-<svg
-className="h-full w-full opacity-[0.08] dark:opacity-[0.12]"
-xmlns="http://www.w3.org/2000/svg"
->
+<svg className="h-full w-full opacity-[0.08] dark:opacity-[0.12]" xmlns="http://www.w3.org/2000/svg">
<defs>
<pattern id="grid" width="32" height="32" patternUnits="userSpaceOnUse">
<path d="M 32 0 L 0 0 0 32" fill="none" stroke="currentColor" strokeWidth="0.5" />
</pattern>
</defs>
-<rect
-width="100%"
-height="100%"
-fill="url(#grid)"
-className="text-gray-400 dark:text-gray-300"
-/>
+<rect width="100%" height="100%" fill="url(#grid)" className="text-gray-400 dark:text-gray-300" />
</svg>
</div>

File diff suppressed because it is too large

View File

@@ -29,7 +29,7 @@ export const login = async (credentials: LoginCredentials): Promise<AuthResponse
console.error('Login error:', error);
return {
success: false,
-message: error instanceof Error ? error.message : 'An error occurred during login',
+message: 'An error occurred during login',
};
}
};

View File

@@ -309,19 +309,6 @@ export interface ApiResponse<T = any> {
data?: T;
}
// Bearer authentication key configuration (frontend view model)
export type BearerKeyAccessType = 'all' | 'groups' | 'servers' | 'custom';
export interface BearerKey {
id: string;
name: string;
token: string;
enabled: boolean;
accessType: BearerKeyAccessType;
allowedGroups?: string[];
allowedServers?: string[];
}
// Auth types
export interface IUser {
username: string;

View File

@@ -61,7 +61,6 @@
"emptyFields": "Username and password cannot be empty",
"loginFailed": "Login failed, please check your username and password",
"loginError": "An error occurred during login",
-"serverUnavailable": "Unable to connect to the server. Please check your network connection or try again later",
"currentPassword": "Current Password",
"newPassword": "New Password",
"confirmPassword": "Confirm Password",
@@ -254,11 +253,7 @@
"type": "Type",
"repeated": "Repeated",
"valueHint": "Value Hint",
-"choices": "Choices",
-"actions": "Actions",
-"saving": "Saving...",
-"active": "Active",
-"inactive": "Inactive"
+"choices": "Choices"
},
"nav": {
"dashboard": "Dashboard",
@@ -281,7 +276,7 @@
"recentServers": "Recent Servers"
},
"servers": {
-"title": "Server Management"
+"title": "Servers Management"
},
"groups": {
"title": "Group Management"
@@ -558,28 +553,6 @@
"bearerAuthKey": "Bearer Authentication Key",
"bearerAuthKeyDescription": "The authentication key that will be required in the Bearer token",
"bearerAuthKeyPlaceholder": "Enter bearer authentication key",
"bearerKeysSectionTitle": "Keys",
"bearerKeysSectionDescription": "Manage multiple keys with different access scopes.",
"noBearerKeys": "No keys configured yet.",
"bearerKeyName": "Name",
"bearerKeyToken": "Token",
"bearerKeyEnabled": "Enabled",
"bearerKeyAccessType": "Access scope",
"bearerKeyAccessAll": "All",
"bearerKeyAccessGroups": "Groups",
"bearerKeyAccessServers": "Servers",
"bearerKeyAccessCustom": "Custom",
"bearerKeyAllowedGroups": "Allowed groups",
"bearerKeyAllowedServers": "Allowed servers",
"addBearerKey": "Add key",
"addBearerKeyButton": "Create",
"bearerKeyRequired": "Name and token are required",
"deleteBearerKeyConfirm": "Are you sure you want to delete this key?",
"generate": "Generate",
"selectGroups": "Select Groups",
"selectServers": "Select Servers",
"selectAtLeastOneGroup": "Please select at least one group",
"selectAtLeastOneServer": "Please select at least one server",
"skipAuth": "Skip Authentication", "skipAuth": "Skip Authentication",
"skipAuthDescription": "Bypass login requirement for frontend and API access (DEFAULT OFF for security)", "skipAuthDescription": "Bypass login requirement for frontend and API access (DEFAULT OFF for security)",
"pythonIndexUrl": "Python Package Repository URL", "pythonIndexUrl": "Python Package Repository URL",
@@ -699,22 +672,6 @@
"importFailed": "Failed to import servers", "importFailed": "Failed to import servers",
"partialSuccess": "Imported {{count}} of {{total}} servers successfully. Some servers failed:" "partialSuccess": "Imported {{count}} of {{total}} servers successfully. Some servers failed:"
}, },
"groupImport": {
"button": "Import",
"title": "Import Groups from JSON",
"inputLabel": "Group Configuration JSON",
"inputHelp": "Paste your group configuration JSON. Each group can contain a list of servers.",
"preview": "Preview",
"previewTitle": "Preview Groups to Import",
"import": "Import",
"importing": "Importing...",
"invalidFormat": "Invalid JSON format. The JSON must contain a 'groups' array.",
"missingName": "Each group must have a 'name' field.",
"parseError": "Failed to parse JSON. Please check the format and try again.",
"addFailed": "Failed to add group",
"importFailed": "Failed to import groups",
"partialSuccess": "Imported {{count}} of {{total}} groups successfully. Some groups failed:"
},
"users": { "users": {
"add": "Add User", "add": "Add User",
"addNew": "Add New User", "addNew": "Add New User",

View File

@@ -61,7 +61,6 @@
"emptyFields": "Le nom d'utilisateur et le mot de passe ne peuvent pas être vides",
"loginFailed": "Échec de la connexion, veuillez vérifier votre nom d'utilisateur et votre mot de passe",
"loginError": "Une erreur est survenue lors de la connexion",
-"serverUnavailable": "Impossible de se connecter au serveur. Veuillez vérifier votre connexion réseau ou réessayer plus tard",
"currentPassword": "Mot de passe actuel",
"newPassword": "Nouveau mot de passe",
"confirmPassword": "Confirmer le mot de passe",
@@ -255,11 +254,7 @@
"type": "Type",
"repeated": "Répété",
"valueHint": "Indice de valeur",
-"choices": "Choix",
-"actions": "Actions",
-"saving": "Enregistrement...",
-"active": "Actif",
-"inactive": "Inactif"
+"choices": "Choix"
},
"nav": {
"dashboard": "Tableau de bord",
@@ -559,28 +554,6 @@
"bearerAuthKey": "Clé d'authentification Bearer",
"bearerAuthKeyDescription": "La clé d'authentification qui sera requise dans le jeton Bearer",
"bearerAuthKeyPlaceholder": "Entrez la clé d'authentification Bearer",
"bearerKeysSectionTitle": "Clés",
"bearerKeysSectionDescription": "Gérez plusieurs clés avec différentes portées daccès.",
"noBearerKeys": "Aucune clé configurée pour le moment.",
"bearerKeyName": "Nom",
"bearerKeyToken": "Jeton",
"bearerKeyEnabled": "Activée",
"bearerKeyAccessType": "Portée daccès",
"bearerKeyAccessAll": "Toutes",
"bearerKeyAccessGroups": "Groupes",
"bearerKeyAccessServers": "Serveurs",
"bearerKeyAccessCustom": "Personnalisée",
"bearerKeyAllowedGroups": "Groupes autorisés",
"bearerKeyAllowedServers": "Serveurs autorisés",
"addBearerKey": "Ajouter une clé",
"addBearerKeyButton": "Créer",
"bearerKeyRequired": "Le nom et le jeton sont obligatoires",
"deleteBearerKeyConfirm": "Voulez-vous vraiment supprimer cette clé ?",
"generate": "Générer",
"selectGroups": "Sélectionner des groupes",
"selectServers": "Sélectionner des serveurs",
"selectAtLeastOneGroup": "Veuillez sélectionner au moins un groupe",
"selectAtLeastOneServer": "Veuillez sélectionner au moins un serveur",
"skipAuth": "Ignorer l'authentification", "skipAuth": "Ignorer l'authentification",
"skipAuthDescription": "Contourner l'exigence de connexion pour l'accès au frontend et à l'API (DÉSACTIVÉ PAR DÉFAUT pour des raisons de sécurité)", "skipAuthDescription": "Contourner l'exigence de connexion pour l'accès au frontend et à l'API (DÉSACTIVÉ PAR DÉFAUT pour des raisons de sécurité)",
"pythonIndexUrl": "URL du dépôt de paquets Python", "pythonIndexUrl": "URL du dépôt de paquets Python",
@@ -700,22 +673,6 @@
"importFailed": "Échec de l'importation des serveurs", "importFailed": "Échec de l'importation des serveurs",
"partialSuccess": "{{count}} serveur(s) sur {{total}} importé(s) avec succès. Certains serveurs ont échoué :" "partialSuccess": "{{count}} serveur(s) sur {{total}} importé(s) avec succès. Certains serveurs ont échoué :"
}, },
"groupImport": {
"button": "Importer",
"title": "Importer des groupes depuis JSON",
"inputLabel": "Configuration JSON des groupes",
"inputHelp": "Collez votre configuration JSON de groupes. Chaque groupe peut contenir une liste de serveurs.",
"preview": "Aperçu",
"previewTitle": "Aperçu des groupes à importer",
"import": "Importer",
"importing": "Importation en cours...",
"invalidFormat": "Format JSON invalide. Le JSON doit contenir un tableau 'groups'.",
"missingName": "Chaque groupe doit avoir un champ 'name'.",
"parseError": "Échec de l'analyse du JSON. Veuillez vérifier le format et réessayer.",
"addFailed": "Échec de l'ajout du groupe",
"importFailed": "Échec de l'importation des groupes",
"partialSuccess": "{{count}} groupe(s) sur {{total}} importé(s) avec succès. Certains groupes ont échoué :"
},
"users": { "users": {
"add": "Ajouter un utilisateur", "add": "Ajouter un utilisateur",
"addNew": "Ajouter un nouvel utilisateur", "addNew": "Ajouter un nouvel utilisateur",

View File

@@ -61,7 +61,6 @@
"emptyFields": "Kullanıcı adı ve şifre boş olamaz",
"loginFailed": "Giriş başarısız, lütfen kullanıcı adınızı ve şifrenizi kontrol edin",
"loginError": "Giriş sırasında bir hata oluştu",
-"serverUnavailable": "Sunucuya bağlanılamıyor. Lütfen ağ bağlantınızı kontrol edin veya daha sonra tekrar deneyin",
"currentPassword": "Mevcut Şifre",
"newPassword": "Yeni Şifre",
"confirmPassword": "Şifreyi Onayla",
@@ -255,11 +254,7 @@
"type": "Tür",
"repeated": "Tekrarlanan",
"valueHint": "Değer İpucu",
-"choices": "Seçenekler",
-"actions": "Eylemler",
-"saving": "Kaydediliyor...",
-"active": "Aktif",
-"inactive": "Pasif"
+"choices": "Seçenekler"
},
"nav": {
"dashboard": "Kontrol Paneli",
@@ -559,28 +554,6 @@
"bearerAuthKey": "Bearer Kimlik Doğrulama Anahtarı",
"bearerAuthKeyDescription": "Bearer token'da gerekli olacak kimlik doğrulama anahtarı",
"bearerAuthKeyPlaceholder": "Bearer kimlik doğrulama anahtarını girin",
"bearerKeysSectionTitle": "Anahtarlar",
"bearerKeysSectionDescription": "Farklı erişim kapsamlarına sahip birden fazla anahtarı yönetin.",
"noBearerKeys": "Henüz yapılandırılmış herhangi bir anahtar yok.",
"bearerKeyName": "Ad",
"bearerKeyToken": "Token",
"bearerKeyEnabled": "Etkin",
"bearerKeyAccessType": "Erişim kapsamı",
"bearerKeyAccessAll": "Tümü",
"bearerKeyAccessGroups": "Gruplar",
"bearerKeyAccessServers": "Sunucular",
"bearerKeyAccessCustom": "Özel",
"bearerKeyAllowedGroups": "İzin verilen gruplar",
"bearerKeyAllowedServers": "İzin verilen sunucular",
"addBearerKey": "Anahtar ekle",
"addBearerKeyButton": "Oluştur",
"bearerKeyRequired": "Ad ve token zorunludur",
"deleteBearerKeyConfirm": "Bu anahtarı silmek istediğinizden emin misiniz?",
"generate": "Oluştur",
"selectGroups": "Grupları Seç",
"selectServers": "Sunucuları Seç",
"selectAtLeastOneGroup": "Lütfen en az bir grup seçin",
"selectAtLeastOneServer": "Lütfen en az bir sunucu seçin",
"skipAuth": "Kimlik Doğrulamayı Atla", "skipAuth": "Kimlik Doğrulamayı Atla",
"skipAuthDescription": "Arayüz ve API erişimi için giriş gereksinimini atla (Güvenlik için VARSAYILAN KAPALI)", "skipAuthDescription": "Arayüz ve API erişimi için giriş gereksinimini atla (Güvenlik için VARSAYILAN KAPALI)",
"pythonIndexUrl": "Python Paket Deposu URL'si", "pythonIndexUrl": "Python Paket Deposu URL'si",
@@ -700,22 +673,6 @@
"importFailed": "Sunucular içe aktarılamadı", "importFailed": "Sunucular içe aktarılamadı",
"partialSuccess": "{{total}} sunucudan {{count}} tanesi başarıyla içe aktarıldı. Bazı sunucular başarısız oldu:" "partialSuccess": "{{total}} sunucudan {{count}} tanesi başarıyla içe aktarıldı. Bazı sunucular başarısız oldu:"
}, },
"groupImport": {
"button": "İçe Aktar",
"title": "JSON'dan Grupları İçe Aktar",
"inputLabel": "Grup Yapılandırma JSON",
"inputHelp": "Grup yapılandırma JSON'unuzu yapıştırın. Her grup bir sunucu listesi içerebilir.",
"preview": "Önizle",
"previewTitle": "İçe Aktarılacak Grupları Önizle",
"import": "İçe Aktar",
"importing": "İçe aktarılıyor...",
"invalidFormat": "Geçersiz JSON formatı. JSON bir 'groups' dizisi içermelidir.",
"missingName": "Her grubun bir 'name' alanı olmalıdır.",
"parseError": "JSON ayrıştırılamadı. Lütfen formatı kontrol edip tekrar deneyin.",
"addFailed": "Grup eklenemedi",
"importFailed": "Gruplar içe aktarılamadı",
"partialSuccess": "{{total}} gruptan {{count}} tanesi başarıyla içe aktarıldı. Bazı gruplar başarısız oldu:"
},
"users": { "users": {
"add": "Kullanıcı Ekle", "add": "Kullanıcı Ekle",
"addNew": "Yeni Kullanıcı Ekle", "addNew": "Yeni Kullanıcı Ekle",

View File

@@ -61,7 +61,6 @@
"emptyFields": "用户名和密码不能为空",
"loginFailed": "登录失败,请检查用户名和密码",
"loginError": "登录过程中出现错误",
-"serverUnavailable": "无法连接到服务器,请检查网络连接或稍后再试",
"currentPassword": "当前密码",
"newPassword": "新密码",
"confirmPassword": "确认密码",
@@ -256,11 +255,7 @@
"type": "类型",
"repeated": "可重复",
"valueHint": "值提示",
-"choices": "可选值",
-"actions": "操作",
-"saving": "保存中...",
-"active": "已激活",
-"inactive": "未激活"
+"choices": "可选值"
},
"nav": {
"dashboard": "仪表盘",
@@ -294,7 +289,7 @@
"routeConfig": "安全配置",
"installConfig": "安装",
"smartRouting": "智能路由",
-"oauthServer": "OAuth"
+"oauthServer": "OAuth 服务器"
},
"groups": {
"title": "分组管理"
@@ -560,28 +555,6 @@
"bearerAuthKey": "Bearer 认证密钥",
"bearerAuthKeyDescription": "Bearer 令牌中需要携带的认证密钥",
"bearerAuthKeyPlaceholder": "请输入 Bearer 认证密钥",
"bearerKeysSectionTitle": "密钥",
"bearerKeysSectionDescription": "管理多条密钥,并为不同密钥配置不同的访问范围。",
"noBearerKeys": "当前还没有配置任何密钥。",
"bearerKeyName": "名称",
"bearerKeyToken": "密钥值",
"bearerKeyEnabled": "启用",
"bearerKeyAccessType": "访问范围",
"bearerKeyAccessAll": "全部",
"bearerKeyAccessGroups": "指定分组",
"bearerKeyAccessServers": "指定服务器",
"bearerKeyAccessCustom": "自定义",
"bearerKeyAllowedGroups": "允许访问的分组",
"bearerKeyAllowedServers": "允许访问的服务器",
"addBearerKey": "新增密钥",
"addBearerKeyButton": "创建",
"bearerKeyRequired": "名称和密钥值为必填项",
"deleteBearerKeyConfirm": "确定要删除这条密钥吗?",
"generate": "生成",
"selectGroups": "选择分组",
"selectServers": "选择服务器",
"selectAtLeastOneGroup": "请至少选择一个分组",
"selectAtLeastOneServer": "请至少选择一个服务",
"skipAuth": "免登录开关", "skipAuth": "免登录开关",
"skipAuthDescription": "跳过前端和 API 访问的登录要求(默认关闭确保安全性)", "skipAuthDescription": "跳过前端和 API 访问的登录要求(默认关闭确保安全性)",
"pythonIndexUrl": "Python 包仓库地址", "pythonIndexUrl": "Python 包仓库地址",
@@ -702,22 +675,6 @@
"importFailed": "导入服务器失败", "importFailed": "导入服务器失败",
"partialSuccess": "成功导入 {{count}} / {{total}} 个服务器。部分服务器失败:" "partialSuccess": "成功导入 {{count}} / {{total}} 个服务器。部分服务器失败:"
}, },
"groupImport": {
"button": "导入",
"title": "从 JSON 导入分组",
"inputLabel": "分组配置 JSON",
"inputHelp": "粘贴您的分组配置 JSON。每个分组可以包含一个服务器列表。",
"preview": "预览",
"previewTitle": "预览要导入的分组",
"import": "导入",
"importing": "导入中...",
"invalidFormat": "无效的 JSON 格式。JSON 必须包含 'groups' 数组。",
"missingName": "每个分组必须有 'name' 字段。",
"parseError": "解析 JSON 失败。请检查格式后重试。",
"addFailed": "添加分组失败",
"importFailed": "导入分组失败",
"partialSuccess": "成功导入 {{count}} / {{total}} 个分组。部分分组失败:"
},
"users": { "users": {
"add": "添加", "add": "添加",
"addNew": "添加新用户", "addNew": "添加新用户",

View File

@@ -63,6 +63,5 @@
"requiresAuthentication": false
}
}
-},
-"bearerKeys": []
+}
}

View File

@@ -73,7 +73,6 @@
"postgres": "^3.4.7",
"reflect-metadata": "^0.2.2",
"typeorm": "^0.3.26",
-"undici": "^7.16.0",
"uuid": "^11.1.0"
},
"devDependencies": {

33
pnpm-lock.yaml generated
View File

@@ -99,9 +99,6 @@ importers:
typeorm: typeorm:
specifier: ^0.3.26 specifier: ^0.3.26
version: 0.3.27(pg@8.16.3)(reflect-metadata@0.2.2)(ts-node@10.9.2(@swc/core@1.15.3)(@types/node@24.6.2)(typescript@5.9.2)) version: 0.3.27(pg@8.16.3)(reflect-metadata@0.2.2)(ts-node@10.9.2(@swc/core@1.15.3)(@types/node@24.6.2)(typescript@5.9.2))
undici:
specifier: ^7.16.0
version: 7.16.0
uuid: uuid:
specifier: ^11.1.0 specifier: ^11.1.0
version: 11.1.0 version: 11.1.0
@@ -201,7 +198,7 @@ importers:
version: 0.552.0(react@19.2.1) version: 0.552.0(react@19.2.1)
next: next:
specifier: ^15.5.0 specifier: ^15.5.0
version: 15.5.9(@babel/core@7.28.4)(react-dom@19.2.1(react@19.2.1))(react@19.2.1) version: 15.5.7(@babel/core@7.28.4)(react-dom@19.2.1(react@19.2.1))(react@19.2.1)
postcss: postcss:
specifier: ^8.5.6 specifier: ^8.5.6
version: 8.5.6 version: 8.5.6
@@ -1125,8 +1122,8 @@ packages:
'@napi-rs/wasm-runtime@0.2.12': '@napi-rs/wasm-runtime@0.2.12':
resolution: {integrity: sha512-ZVWUcfwY4E/yPitQJl481FjFo3K22D6qF0DuFH6Y/nbnE11GY5uguDxZMGXPQ8WQ0128MXQD7TnfHyK4oWoIJQ==} resolution: {integrity: sha512-ZVWUcfwY4E/yPitQJl481FjFo3K22D6qF0DuFH6Y/nbnE11GY5uguDxZMGXPQ8WQ0128MXQD7TnfHyK4oWoIJQ==}
'@next/env@15.5.9': '@next/env@15.5.7':
resolution: {integrity: sha512-4GlTZ+EJM7WaW2HEZcyU317tIQDjkQIyENDLxYJfSWlfqguN+dHkZgyQTV/7ykvobU7yEH5gKvreNrH4B6QgIg==} resolution: {integrity: sha512-4h6Y2NyEkIEN7Z8YxkA27pq6zTkS09bUSYC0xjd0NpwFxjnIKeZEeH591o5WECSmjpUhLn3H2QLJcDye3Uzcvg==}
'@next/swc-darwin-arm64@15.5.7': '@next/swc-darwin-arm64@15.5.7':
resolution: {integrity: sha512-IZwtxCEpI91HVU/rAUOOobWSZv4P2DeTtNaCdHqLcTJU4wdNXgAySvKa/qJCgR5m6KI8UsKDXtO2B31jcaw1Yw==} resolution: {integrity: sha512-IZwtxCEpI91HVU/rAUOOobWSZv4P2DeTtNaCdHqLcTJU4wdNXgAySvKa/qJCgR5m6KI8UsKDXtO2B31jcaw1Yw==}
@@ -2239,8 +2236,8 @@ packages:
caniuse-lite@1.0.30001737: caniuse-lite@1.0.30001737:
resolution: {integrity: sha512-BiloLiXtQNrY5UyF0+1nSJLXUENuhka2pzy2Fx5pGxqavdrxSCW4U6Pn/PoG3Efspi2frRbHpBV2XsrPE6EDlw==} resolution: {integrity: sha512-BiloLiXtQNrY5UyF0+1nSJLXUENuhka2pzy2Fx5pGxqavdrxSCW4U6Pn/PoG3Efspi2frRbHpBV2XsrPE6EDlw==}
caniuse-lite@1.0.30001760: caniuse-lite@1.0.30001759:
resolution: {integrity: sha512-7AAMPcueWELt1p3mi13HR/LHH0TJLT11cnwDJEs3xA4+CK/PLKeO9Kl1oru24htkyUKtkGCvAx4ohB0Ttry8Dw==} resolution: {integrity: sha512-Pzfx9fOKoKvevQf8oCXoyNRQ5QyxJj+3O0Rqx2V5oxT61KGx8+n6hV/IUyJeifUci2clnmmKVpvtiqRzgiWjSw==}
chalk@4.1.2: chalk@4.1.2:
resolution: {integrity: sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==} resolution: {integrity: sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==}
@@ -3517,8 +3514,8 @@ packages:
neo-async@2.6.2: neo-async@2.6.2:
resolution: {integrity: sha512-Yd3UES5mWCSqR+qNT93S3UoYUkqAZ9lLg8a7g9rimsWmYGK8cVToA4/sF3RrshdyV3sAGMXVUmpMYOw+dLpOuw==} resolution: {integrity: sha512-Yd3UES5mWCSqR+qNT93S3UoYUkqAZ9lLg8a7g9rimsWmYGK8cVToA4/sF3RrshdyV3sAGMXVUmpMYOw+dLpOuw==}
next@15.5.9: next@15.5.7:
resolution: {integrity: sha512-agNLK89seZEtC5zUHwtut0+tNrc0Xw4FT/Dg+B/VLEo9pAcS9rtTKpek3V6kVcVwsB2YlqMaHdfZL4eLEVYuCg==} resolution: {integrity: sha512-+t2/0jIJ48kUpGKkdlhgkv+zPTEOoXyr60qXe68eB/pl3CMJaLeIGjzp5D6Oqt25hCBiBTt8wEeeAzfJvUKnPQ==}
engines: {node: ^18.18.0 || ^19.8.0 || >= 20.0.0} engines: {node: ^18.18.0 || ^19.8.0 || >= 20.0.0}
hasBin: true hasBin: true
peerDependencies: peerDependencies:
@@ -4434,10 +4431,6 @@ packages:
undici-types@7.13.0: undici-types@7.13.0:
resolution: {integrity: sha512-Ov2Rr9Sx+fRgagJ5AX0qvItZG/JKKoBRAVITs1zk7IqZGTJUwgUr7qoYBpWwakpWilTZFM98rG/AFRocu10iIQ==} resolution: {integrity: sha512-Ov2Rr9Sx+fRgagJ5AX0qvItZG/JKKoBRAVITs1zk7IqZGTJUwgUr7qoYBpWwakpWilTZFM98rG/AFRocu10iIQ==}
undici@7.16.0:
resolution: {integrity: sha512-QEg3HPMll0o3t2ourKwOeUAZ159Kn9mx5pnzHRQO8+Wixmh88YdZRiIwat0iNzNNXn0yoEtXJqFpyW7eM8BV7g==}
engines: {node: '>=20.18.1'}
universalify@2.0.1: universalify@2.0.1:
resolution: {integrity: sha512-gptHNQghINnc/vTGIk0SOFGFNXw7JVrlRUtConJRlvaw6DuX0wO5Jeko9sWrMBhh+PsYAZ7oXAiOnf/UKogyiw==} resolution: {integrity: sha512-gptHNQghINnc/vTGIk0SOFGFNXw7JVrlRUtConJRlvaw6DuX0wO5Jeko9sWrMBhh+PsYAZ7oXAiOnf/UKogyiw==}
engines: {node: '>= 10.0.0'} engines: {node: '>= 10.0.0'}
@@ -5483,7 +5476,7 @@ snapshots:
'@tybys/wasm-util': 0.10.1 '@tybys/wasm-util': 0.10.1
optional: true optional: true
'@next/env@15.5.9': {} '@next/env@15.5.7': {}
'@next/swc-darwin-arm64@15.5.7': '@next/swc-darwin-arm64@15.5.7':
optional: true optional: true
@@ -6534,7 +6527,7 @@ snapshots:
caniuse-lite@1.0.30001737: {} caniuse-lite@1.0.30001737: {}
caniuse-lite@1.0.30001760: {} caniuse-lite@1.0.30001759: {}
chalk@4.1.2: chalk@4.1.2:
dependencies: dependencies:
@@ -8050,11 +8043,11 @@ snapshots:
neo-async@2.6.2: {} neo-async@2.6.2: {}
next@15.5.9(@babel/core@7.28.4)(react-dom@19.2.1(react@19.2.1))(react@19.2.1): next@15.5.7(@babel/core@7.28.4)(react-dom@19.2.1(react@19.2.1))(react@19.2.1):
dependencies: dependencies:
'@next/env': 15.5.9 '@next/env': 15.5.7
'@swc/helpers': 0.5.15 '@swc/helpers': 0.5.15
caniuse-lite: 1.0.30001760 caniuse-lite: 1.0.30001759
postcss: 8.4.31 postcss: 8.4.31
react: 19.2.1 react: 19.2.1
react-dom: 19.2.1(react@19.2.1) react-dom: 19.2.1(react@19.2.1)
@@ -8953,8 +8946,6 @@ snapshots:
undici-types@7.13.0: {} undici-types@7.13.0: {}
undici@7.16.0: {}
universalify@2.0.1: {} universalify@2.0.1: {}
unpipe@1.0.0: {} unpipe@1.0.0: {}

View File

@@ -1,169 +0,0 @@
import { Request, Response } from 'express';
import { ApiResponse, BearerKey } from '../types/index.js';
import { getBearerKeyDao, getSystemConfigDao } from '../dao/index.js';
const requireAdmin = async (req: Request, res: Response): Promise<boolean> => {
const systemConfigDao = getSystemConfigDao();
const systemConfig = await systemConfigDao.get();
if (systemConfig?.routing?.skipAuth) {
return true;
}
const user = (req as any).user;
if (!user || !user.isAdmin) {
res.status(403).json({
success: false,
message: 'Admin privileges required',
});
return false;
}
return true;
};
export const getBearerKeys = async (req: Request, res: Response): Promise<void> => {
if (!(await requireAdmin(req, res))) return;
try {
const dao = getBearerKeyDao();
const keys = await dao.findAll();
const response: ApiResponse = {
success: true,
data: keys,
};
res.json(response);
} catch (error) {
console.error('Failed to get bearer keys:', error);
res.status(500).json({
success: false,
message: 'Failed to get bearer keys',
});
}
};
export const createBearerKey = async (req: Request, res: Response): Promise<void> => {
if (!(await requireAdmin(req, res))) return;
try {
const { name, token, enabled, accessType, allowedGroups, allowedServers } =
req.body as Partial<BearerKey>;
if (!name || typeof name !== 'string') {
res.status(400).json({ success: false, message: 'Key name is required' });
return;
}
if (!token || typeof token !== 'string') {
res.status(400).json({ success: false, message: 'Token value is required' });
return;
}
if (!accessType || !['all', 'groups', 'servers', 'custom'].includes(accessType)) {
res.status(400).json({ success: false, message: 'Invalid accessType' });
return;
}
const dao = getBearerKeyDao();
const key = await dao.create({
name,
token,
enabled: enabled ?? true,
accessType,
allowedGroups: Array.isArray(allowedGroups) ? allowedGroups : [],
allowedServers: Array.isArray(allowedServers) ? allowedServers : [],
});
const response: ApiResponse = {
success: true,
data: key,
};
res.status(201).json(response);
} catch (error) {
console.error('Failed to create bearer key:', error);
res.status(500).json({
success: false,
message: 'Failed to create bearer key',
});
}
};
export const updateBearerKey = async (req: Request, res: Response): Promise<void> => {
if (!(await requireAdmin(req, res))) return;
try {
const { id } = req.params;
if (!id) {
res.status(400).json({ success: false, message: 'Key id is required' });
return;
}
const { name, token, enabled, accessType, allowedGroups, allowedServers } =
req.body as Partial<BearerKey>;
const updates: Partial<BearerKey> = {};
if (name !== undefined) updates.name = name;
if (token !== undefined) updates.token = token;
if (enabled !== undefined) updates.enabled = enabled;
if (accessType !== undefined) {
if (!['all', 'groups', 'servers', 'custom'].includes(accessType)) {
res.status(400).json({ success: false, message: 'Invalid accessType' });
return;
}
updates.accessType = accessType as BearerKey['accessType'];
}
if (allowedGroups !== undefined) {
updates.allowedGroups = Array.isArray(allowedGroups) ? allowedGroups : [];
}
if (allowedServers !== undefined) {
updates.allowedServers = Array.isArray(allowedServers) ? allowedServers : [];
}
const dao = getBearerKeyDao();
const updated = await dao.update(id, updates);
if (!updated) {
res.status(404).json({ success: false, message: 'Bearer key not found' });
return;
}
const response: ApiResponse = {
success: true,
data: updated,
};
res.json(response);
} catch (error) {
console.error('Failed to update bearer key:', error);
res.status(500).json({
success: false,
message: 'Failed to update bearer key',
});
}
};
export const deleteBearerKey = async (req: Request, res: Response): Promise<void> => {
if (!(await requireAdmin(req, res))) return;
try {
const { id } = req.params;
if (!id) {
res.status(400).json({ success: false, message: 'Key id is required' });
return;
}
const dao = getBearerKeyDao();
const deleted = await dao.delete(id);
if (!deleted) {
res.status(404).json({ success: false, message: 'Bearer key not found' });
return;
}
const response: ApiResponse = {
success: true,
};
res.json(response);
} catch (error) {
console.error('Failed to delete bearer key:', error);
res.status(500).json({
success: false,
message: 'Failed to delete bearer key',
});
}
};
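
For context on how the handlers in this file are exercised, here is a minimal client-side sketch of creating a key. The `/api/bearer-keys` path is an assumption (the route registration is not part of this diff); the request body and response shape follow the validation in `createBearerKey` above, and the admin JWT is whatever token your MCPHub login issued.

```typescript
// Sketch only: the '/api/bearer-keys' path is assumed, since route wiring is not shown in this diff.
async function createKey(baseUrl: string, adminJwt: string) {
  const res = await fetch(`${baseUrl}/api/bearer-keys`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${adminJwt}`, // requireAdmin() answers 403 for non-admin users
    },
    body: JSON.stringify({
      name: 'ci-pipeline',
      token: 'generated-secret-value',
      enabled: true,
      accessType: 'servers',          // one of: all | groups | servers | custom
      allowedServers: ['github-mcp'], // consulted only for scoped access types
    }),
  });
  if (res.status !== 201) throw new Error(`createBearerKey failed with ${res.status}`);
  return (await res.json()).data; // { success: true, data: BearerKey }
}
```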

View File

@@ -1,19 +1,10 @@
import { Request, Response } from 'express'; import { Request, Response } from 'express';
import config from '../config/index.js'; import config from '../config/index.js';
import { loadSettings } from '../config/index.js'; import { loadSettings, loadOriginalSettings } from '../config/index.js';
import { getDataService } from '../services/services.js'; import { getDataService } from '../services/services.js';
import { DataService } from '../services/dataService.js'; import { DataService } from '../services/dataService.js';
import { IUser } from '../types/index.js'; import { IUser } from '../types/index.js';
import { import { getServerDao } from '../dao/DaoFactory.js';
getGroupDao,
getOAuthClientDao,
getOAuthTokenDao,
getServerDao,
getSystemConfigDao,
getUserConfigDao,
getUserDao,
getBearerKeyDao,
} from '../dao/DaoFactory.js';
const dataService: DataService = getDataService(); const dataService: DataService = getDataService();
@@ -137,43 +128,8 @@ export const getMcpSettingsJson = async (req: Request, res: Response): Promise<v
}, },
}); });
} else { } else {
// Return full settings via DAO layer (supports both file and database modes) // Return full settings
const [ const settings = loadOriginalSettings();
servers,
users,
groups,
systemConfig,
userConfigs,
oauthClients,
oauthTokens,
bearerKeys,
] = await Promise.all([
getServerDao().findAll(),
getUserDao().findAll(),
getGroupDao().findAll(),
getSystemConfigDao().get(),
getUserConfigDao().getAll(),
getOAuthClientDao().findAll(),
getOAuthTokenDao().findAll(),
getBearerKeyDao().findAll(),
]);
const mcpServers: Record<string, any> = {};
for (const { name: serverConfigName, ...config } of servers) {
mcpServers[serverConfigName] = removeNullValues(config);
}
const settings = {
mcpServers,
users,
groups,
systemConfig,
userConfigs,
oauthClients,
oauthTokens,
bearerKeys,
};
res.json({ res.json({
success: true, success: true,
data: settings, data: settings,

View File

@@ -1,11 +1,5 @@
import { Request, Response } from 'express'; import { Request, Response } from 'express';
import { import { ApiResponse } from '../types/index.js';
ApiResponse,
AddGroupRequest,
BatchCreateGroupsRequest,
BatchCreateGroupsResponse,
BatchGroupResult,
} from '../types/index.js';
import { import {
getAllGroups, getAllGroups,
getGroupByIdOrName, getGroupByIdOrName,
@@ -112,143 +106,6 @@ export const createNewGroup = async (req: Request, res: Response): Promise<void>
} }
}; };
// Batch create groups - validates and creates multiple groups in one request
export const batchCreateGroups = async (req: Request, res: Response): Promise<void> => {
try {
const { groups } = req.body as BatchCreateGroupsRequest;
// Validate request body
if (!groups || !Array.isArray(groups)) {
res.status(400).json({
success: false,
message: 'Request body must contain a "groups" array',
});
return;
}
if (groups.length === 0) {
res.status(400).json({
success: false,
message: 'Groups array cannot be empty',
});
return;
}
// Helper function to validate a single group configuration
const validateGroupConfig = (group: AddGroupRequest): { valid: boolean; message?: string } => {
if (!group.name || typeof group.name !== 'string') {
return { valid: false, message: 'Group name is required and must be a string' };
}
if (group.description !== undefined && typeof group.description !== 'string') {
return { valid: false, message: 'Group description must be a string' };
}
if (group.servers !== undefined && !Array.isArray(group.servers)) {
return { valid: false, message: 'Group servers must be an array' };
}
// Validate server configurations if provided in new format
if (group.servers) {
for (const server of group.servers) {
if (typeof server === 'object' && server !== null) {
if (!server.name || typeof server.name !== 'string') {
return {
valid: false,
message: 'Server configuration must have a name property',
};
}
if (
server.tools !== undefined &&
server.tools !== 'all' &&
!Array.isArray(server.tools)
) {
return {
valid: false,
message: 'Server tools must be "all" or an array of tool names',
};
}
}
}
}
return { valid: true };
};
// Process each group
const results: BatchGroupResult[] = [];
let successCount = 0;
let failureCount = 0;
// Get current user for owner field
const currentUser = (req as any).user;
const defaultOwner = currentUser?.username || 'admin';
for (const groupData of groups) {
const { name, description, servers } = groupData;
// Validate group configuration
const validation = validateGroupConfig(groupData);
if (!validation.valid) {
results.push({
name: name || 'unknown',
success: false,
message: validation.message,
});
failureCount++;
continue;
}
try {
const serverList = Array.isArray(servers) ? servers : [];
const newGroup = await createGroup(name, description, serverList, defaultOwner);
if (newGroup) {
results.push({
name,
success: true,
message: 'Group created successfully',
});
successCount++;
} else {
results.push({
name,
success: false,
message: 'Failed to create group or group name already exists',
});
failureCount++;
}
} catch (error) {
results.push({
name,
success: false,
message: error instanceof Error ? error.message : 'Failed to create group',
});
failureCount++;
}
}
// Return response
const response: BatchCreateGroupsResponse = {
success: successCount > 0,
successCount,
failureCount,
results,
};
// Use 207 Multi-Status if there were partial failures, 200 if all succeeded, 400 if all failed
const statusCode = failureCount > 0 && successCount > 0 ? 207 : successCount > 0 ? 200 : 400;
res.status(statusCode).json(response);
} catch (error) {
console.error('Batch create groups error:', error);
res.status(500).json({
success: false,
message: 'Internal server error',
});
}
};
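
A sketch of the payload this batch endpoint validates, inferred from `validateGroupConfig` above; the group and server names are placeholders, and the route path is not shown in this diff.

```typescript
import { BatchCreateGroupsRequest } from '../types/index.js';

// Hypothetical request body for batchCreateGroups (names are illustrative).
const batchGroupsBody: BatchCreateGroupsRequest = {
  groups: [
    // Plain string entries simply reference a server by name.
    { name: 'search-tools', description: 'Web search servers', servers: ['brave-search'] },
    // Object entries may scope tools per server: 'all' or an explicit list of tool names.
    { name: 'dev-tools', servers: [{ name: 'github-mcp', tools: ['create_issue'] }] },
  ],
};
// The response carries successCount, failureCount and per-group results;
// status is 200 when everything succeeds, 207 on partial failure, 400 when all fail.
```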
// Update an existing group // Update an existing group
export const updateExistingGroup = async (req: Request, res: Response): Promise<void> => { export const updateExistingGroup = async (req: Request, res: Response): Promise<void> => {
try { try {

View File

@@ -1,13 +1,5 @@
import { Request, Response } from 'express'; import { Request, Response } from 'express';
import { import { ApiResponse, AddServerRequest, McpSettings } from '../types/index.js';
ApiResponse,
AddServerRequest,
McpSettings,
BatchCreateServersRequest,
BatchCreateServersResponse,
BatchServerResult,
ServerConfig,
} from '../types/index.js';
import { import {
getServersInfo, getServersInfo,
addServer, addServer,
@@ -23,7 +15,6 @@ import { syncAllServerToolsEmbeddings } from '../services/vectorSearchService.js
import { createSafeJSON } from '../utils/serialization.js'; import { createSafeJSON } from '../utils/serialization.js';
import { cloneDefaultOAuthServerConfig } from '../constants/oauthServerDefaults.js'; import { cloneDefaultOAuthServerConfig } from '../constants/oauthServerDefaults.js';
import { getServerDao, getGroupDao, getSystemConfigDao } from '../dao/DaoFactory.js'; import { getServerDao, getGroupDao, getSystemConfigDao } from '../dao/DaoFactory.js';
import { getBearerKeyDao } from '../dao/DaoFactory.js';
export const getAllServers = async (_: Request, res: Response): Promise<void> => { export const getAllServers = async (_: Request, res: Response): Promise<void> => {
try { try {
@@ -66,31 +57,12 @@ export const getAllSettings = async (_: Request, res: Response): Promise<void> =
const systemConfigDao = getSystemConfigDao(); const systemConfigDao = getSystemConfigDao();
const systemConfig = await systemConfigDao.get(); const systemConfig = await systemConfigDao.get();
// Ensure smart routing config has DB URL set if environment variable is present
const dbUrlEnv = process.env.DB_URL || '';
if (!systemConfig.smartRouting) {
systemConfig.smartRouting = {
enabled: false,
dbUrl: dbUrlEnv ? '${DB_URL}' : '',
openaiApiBaseUrl: '',
openaiApiKey: '',
openaiApiEmbeddingModel: '',
};
} else if (!systemConfig.smartRouting.dbUrl) {
systemConfig.smartRouting.dbUrl = dbUrlEnv ? '${DB_URL}' : '';
}
// Get bearer auth keys from DAO
const bearerKeyDao = getBearerKeyDao();
const bearerKeys = await bearerKeyDao.findAll();
// Merge all data into settings object // Merge all data into settings object
const settings: McpSettings = { const settings: McpSettings = {
...fileSettings, ...fileSettings,
mcpServers, mcpServers,
groups, groups,
systemConfig, systemConfig,
bearerKeys,
}; };
const response: ApiResponse = { const response: ApiResponse = {
@@ -217,177 +189,6 @@ export const createServer = async (req: Request, res: Response): Promise<void> =
} }
}; };
// Batch create servers - validates and creates multiple servers in one request
export const batchCreateServers = async (req: Request, res: Response): Promise<void> => {
try {
const { servers } = req.body as BatchCreateServersRequest;
// Validate request body
if (!servers || !Array.isArray(servers)) {
res.status(400).json({
success: false,
message: 'Request body must contain a "servers" array',
});
return;
}
if (servers.length === 0) {
res.status(400).json({
success: false,
message: 'Servers array cannot be empty',
});
return;
}
// Helper function to validate a single server configuration
const validateServerConfig = (
name: string,
config: ServerConfig,
): { valid: boolean; message?: string } => {
if (!name || typeof name !== 'string') {
return { valid: false, message: 'Server name is required and must be a string' };
}
if (!config || typeof config !== 'object') {
return { valid: false, message: 'Server configuration is required and must be an object' };
}
if (
!config.url &&
!config.openapi?.url &&
!config.openapi?.schema &&
(!config.command || !config.args)
) {
return {
valid: false,
message:
'Server configuration must include either a URL, OpenAPI specification URL or schema, or command with arguments',
};
}
// Validate server type if specified
if (config.type && !['stdio', 'sse', 'streamable-http', 'openapi'].includes(config.type)) {
return {
valid: false,
message: 'Server type must be one of: stdio, sse, streamable-http, openapi',
};
}
// Validate URL is provided for sse and streamable-http types
if ((config.type === 'sse' || config.type === 'streamable-http') && !config.url) {
return { valid: false, message: `URL is required for ${config.type} server type` };
}
// Validate OpenAPI specification URL or schema is provided for openapi type
if (config.type === 'openapi' && !config.openapi?.url && !config.openapi?.schema) {
return {
valid: false,
message: 'OpenAPI specification URL or schema is required for openapi server type',
};
}
// Validate headers if provided
if (config.headers && typeof config.headers !== 'object') {
return { valid: false, message: 'Headers must be an object' };
}
// Validate that headers are only used with sse, streamable-http, and openapi types
if (config.headers && config.type === 'stdio') {
return { valid: false, message: 'Headers are not supported for stdio server type' };
}
return { valid: true };
};
// Process each server
const results: BatchServerResult[] = [];
let successCount = 0;
let failureCount = 0;
// Get current user for owner field
const currentUser = (req as any).user;
const defaultOwner = currentUser?.username || 'admin';
for (const server of servers) {
const { name, config } = server;
// Validate server configuration
const validation = validateServerConfig(name, config);
if (!validation.valid) {
results.push({
name: name || 'unknown',
success: false,
message: validation.message,
});
failureCount++;
continue;
}
try {
// Set default keep-alive interval for SSE servers if not specified
if ((config.type === 'sse' || (!config.type && config.url)) && !config.keepAliveInterval) {
config.keepAliveInterval = 60000; // Default 60 seconds for SSE servers
}
// Set owner property if not provided
if (!config.owner) {
config.owner = defaultOwner;
}
// Attempt to add server
const result = await addServer(name, config);
if (result.success) {
results.push({
name,
success: true,
});
successCount++;
} else {
results.push({
name,
success: false,
message: result.message || 'Failed to add server',
});
failureCount++;
}
} catch (error) {
results.push({
name,
success: false,
message: error instanceof Error ? error.message : 'Internal server error',
});
failureCount++;
}
}
// Notify tool changes if any server was added successfully
if (successCount > 0) {
notifyToolChanged();
}
// Prepare response
const response: ApiResponse<BatchCreateServersResponse> = {
success: successCount > 0, // Success if at least one server was created
data: {
success: successCount > 0,
successCount,
failureCount,
results,
},
};
// Return 207 Multi-Status if there were partial failures, 200 if all succeeded, 400 if all failed
const statusCode = failureCount === 0 ? 200 : successCount === 0 ? 400 : 207;
res.status(statusCode).json(response);
} catch (error) {
console.error('Batch create servers error:', error);
res.status(500).json({
success: false,
message: 'Internal server error',
});
}
};
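
Similarly, a sketch of the body accepted by this batch endpoint, following the per-type rules in `validateServerConfig`; every name and URL below is a placeholder.

```typescript
import { BatchCreateServersRequest } from '../types/index.js';

// Hypothetical request body for batchCreateServers (values are illustrative).
const batchServersBody: BatchCreateServersRequest = {
  servers: [
    // stdio servers need command + args; headers are rejected for this type.
    { name: 'fetcher', config: { type: 'stdio', command: 'npx', args: ['-y', 'fetcher-mcp'] } },
    // sse and streamable-http servers need a url; SSE gets keepAliveInterval = 60000 by default.
    { name: 'remote', config: { type: 'sse', url: 'https://example.com/sse' } },
    // openapi servers need either openapi.url or an inline openapi.schema.
    { name: 'petstore', config: { type: 'openapi', openapi: { url: 'https://example.com/openapi.json' } } },
  ],
};
```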
export const deleteServer = async (req: Request, res: Response): Promise<void> => { export const deleteServer = async (req: Request, res: Response): Promise<void> => {
try { try {
const { name } = req.params; const { name } = req.params;
@@ -423,7 +224,7 @@ export const deleteServer = async (req: Request, res: Response): Promise<void> =
export const updateServer = async (req: Request, res: Response): Promise<void> => { export const updateServer = async (req: Request, res: Response): Promise<void> => {
try { try {
const { name } = req.params; const { name } = req.params;
const { config, newName } = req.body; const { config } = req.body;
if (!name) { if (!name) {
res.status(400).json({ res.status(400).json({
success: false, success: false,
@@ -510,52 +311,12 @@ export const updateServer = async (req: Request, res: Response): Promise<void> =
config.owner = currentUser?.username || 'admin'; config.owner = currentUser?.username || 'admin';
} }
// Check if server name is being changed const result = await addOrUpdateServer(name, config, true); // Allow override for updates
const isRenaming = newName && newName !== name;
// If renaming, validate the new name and update references
if (isRenaming) {
const serverDao = getServerDao();
// Check if new name already exists
if (await serverDao.exists(newName)) {
res.status(400).json({
success: false,
message: `Server name '${newName}' already exists`,
});
return;
}
// Rename the server
const renamed = await serverDao.rename(name, newName);
if (!renamed) {
res.status(404).json({
success: false,
message: 'Server not found',
});
return;
}
// Update references in groups
const groupDao = getGroupDao();
await groupDao.updateServerName(name, newName);
// Update references in bearer keys
const bearerKeyDao = getBearerKeyDao();
await bearerKeyDao.updateServerName(name, newName);
}
// Use the final server name (new name if renaming, otherwise original name)
const finalName = isRenaming ? newName : name;
const result = await addOrUpdateServer(finalName, config, true); // Allow override for updates
if (result.success) { if (result.success) {
notifyToolChanged(finalName); notifyToolChanged(name);
res.json({ res.json({
success: true, success: true,
message: isRenaming message: 'Server updated successfully',
? `Server renamed and updated successfully`
: 'Server updated successfully',
}); });
} else { } else {
res.status(404).json({ res.status(404).json({
@@ -564,10 +325,9 @@ export const updateServer = async (req: Request, res: Response): Promise<void> =
}); });
} }
} catch (error) { } catch (error) {
console.error('Failed to update server:', error);
res.status(500).json({ res.status(500).json({
success: false, success: false,
message: error instanceof Error ? error.message : 'Internal server error', message: 'Internal server error',
}); });
} }
}; };
@@ -1033,8 +793,7 @@ export const updateSystemConfig = async (req: Request, res: Response): Promise<v
if (typeof smartRouting.enabled === 'boolean') { if (typeof smartRouting.enabled === 'boolean') {
// If enabling Smart Routing, validate required fields // If enabling Smart Routing, validate required fields
if (smartRouting.enabled) { if (smartRouting.enabled) {
const currentDbUrl = const currentDbUrl = smartRouting.dbUrl || systemConfig.smartRouting.dbUrl;
process.env.DB_URL || smartRouting.dbUrl || systemConfig.smartRouting.dbUrl;
const currentOpenaiApiKey = const currentOpenaiApiKey =
smartRouting.openaiApiKey || systemConfig.smartRouting.openaiApiKey; smartRouting.openaiApiKey || systemConfig.smartRouting.openaiApiKey;

View File

@@ -1,159 +0,0 @@
import { randomUUID } from 'node:crypto';
import { BearerKey } from '../types/index.js';
import { JsonFileBaseDao } from './base/JsonFileBaseDao.js';
/**
* DAO interface for bearer authentication keys
*/
export interface BearerKeyDao {
findAll(): Promise<BearerKey[]>;
findEnabled(): Promise<BearerKey[]>;
findById(id: string): Promise<BearerKey | undefined>;
findByToken(token: string): Promise<BearerKey | undefined>;
create(data: Omit<BearerKey, 'id'>): Promise<BearerKey>;
update(id: string, data: Partial<Omit<BearerKey, 'id'>>): Promise<BearerKey | null>;
delete(id: string): Promise<boolean>;
/**
* Update server name in all bearer keys (when server is renamed)
*/
updateServerName(oldName: string, newName: string): Promise<number>;
}
/**
* JSON file-based BearerKey DAO implementation
* Stores keys under the top-level `bearerKeys` field in mcp_settings.json
* and performs one-time migration from legacy routing.enableBearerAuth/bearerAuthKey.
*/
export class BearerKeyDaoImpl extends JsonFileBaseDao implements BearerKeyDao {
private async loadKeysWithMigration(): Promise<BearerKey[]> {
const settings = await this.loadSettings();
// Treat an existing array (including an empty array) as already migrated.
// Otherwise, when there are no configured keys, we'd rewrite mcp_settings.json
// on every request, which also clears the global settings cache.
if (Array.isArray(settings.bearerKeys)) {
return settings.bearerKeys;
}
// Perform one-time migration from legacy routing config if present
const routing = settings.systemConfig?.routing || {};
const enableBearerAuth: boolean = !!routing.enableBearerAuth;
const rawKey: string = (routing.bearerAuthKey || '').trim();
let migrated: BearerKey[] = [];
if (rawKey) {
// Cases 2 and 3 in migration rules
migrated = [
{
id: randomUUID(),
name: 'default',
token: rawKey,
enabled: enableBearerAuth,
accessType: 'all',
allowedGroups: [],
allowedServers: [],
},
];
}
// Cases 1 and 4 both result in an empty key list
settings.bearerKeys = migrated;
await this.saveSettings(settings);
return migrated;
}
private async saveKeys(keys: BearerKey[]): Promise<void> {
const settings = await this.loadSettings();
settings.bearerKeys = keys;
await this.saveSettings(settings);
}
async findAll(): Promise<BearerKey[]> {
return await this.loadKeysWithMigration();
}
async findEnabled(): Promise<BearerKey[]> {
const keys = await this.loadKeysWithMigration();
return keys.filter((key) => key.enabled);
}
async findById(id: string): Promise<BearerKey | undefined> {
const keys = await this.loadKeysWithMigration();
return keys.find((key) => key.id === id);
}
async findByToken(token: string): Promise<BearerKey | undefined> {
const keys = await this.loadKeysWithMigration();
return keys.find((key) => key.token === token);
}
async create(data: Omit<BearerKey, 'id'>): Promise<BearerKey> {
const keys = await this.loadKeysWithMigration();
const newKey: BearerKey = {
id: randomUUID(),
...data,
};
keys.push(newKey);
await this.saveKeys(keys);
return newKey;
}
async update(id: string, data: Partial<Omit<BearerKey, 'id'>>): Promise<BearerKey | null> {
const keys = await this.loadKeysWithMigration();
const index = keys.findIndex((key) => key.id === id);
if (index === -1) {
return null;
}
const updated: BearerKey = {
...keys[index],
...data,
id: keys[index].id,
};
keys[index] = updated;
await this.saveKeys(keys);
return updated;
}
async delete(id: string): Promise<boolean> {
const keys = await this.loadKeysWithMigration();
const next = keys.filter((key) => key.id !== id);
if (next.length === keys.length) {
return false;
}
await this.saveKeys(next);
return true;
}
async updateServerName(oldName: string, newName: string): Promise<number> {
const keys = await this.loadKeysWithMigration();
let updatedCount = 0;
for (const key of keys) {
let updated = false;
if (key.allowedServers && key.allowedServers.length > 0) {
const newServers = key.allowedServers.map((server) => {
if (server === oldName) {
updated = true;
return newName;
}
return server;
});
if (updated) {
key.allowedServers = newServers;
updatedCount++;
}
}
}
if (updatedCount > 0) {
await this.saveKeys(keys);
}
return updatedCount;
}
}
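
To make the one-time migration above concrete, here is a sketch of the relevant slice of mcp_settings.json before and after `loadKeysWithMigration` runs, written as TypeScript object literals for readability. The id is whatever `randomUUID()` returns, the token value is a placeholder, and unrelated settings fields are omitted.

```typescript
// Before: only the legacy routing flags exist; there is no bearerKeys array yet.
const before = {
  systemConfig: { routing: { enableBearerAuth: true, bearerAuthKey: 'legacy-secret' } },
};

// After: the legacy key becomes a single 'default' entry with accessType 'all';
// `enabled` mirrors the old enableBearerAuth flag. The legacy routing fields are
// left in place, and when no legacy key exists bearerKeys is written as [] so
// later reads see an already-migrated (empty) list.
const after = {
  systemConfig: { routing: { enableBearerAuth: true, bearerAuthKey: 'legacy-secret' } },
  bearerKeys: [
    {
      id: '3f0e4b2a-...', // randomUUID(); the value shown here is a placeholder
      name: 'default',
      token: 'legacy-secret',
      enabled: true,
      accessType: 'all',
      allowedGroups: [],
      allowedServers: [],
    },
  ],
};
```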

View File

@@ -1,103 +0,0 @@
import { BearerKeyDao } from './BearerKeyDao.js';
import { BearerKey as BearerKeyModel } from '../types/index.js';
import { BearerKeyRepository } from '../db/repositories/BearerKeyRepository.js';
/**
* Database-backed implementation of BearerKeyDao
*/
export class BearerKeyDaoDbImpl implements BearerKeyDao {
private repository: BearerKeyRepository;
constructor() {
this.repository = new BearerKeyRepository();
}
private toModel(entity: import('../db/entities/BearerKey.js').BearerKey): BearerKeyModel {
return {
id: entity.id,
name: entity.name,
token: entity.token,
enabled: entity.enabled,
accessType: entity.accessType,
allowedGroups: entity.allowedGroups ?? [],
allowedServers: entity.allowedServers ?? [],
};
}
async findAll(): Promise<BearerKeyModel[]> {
const entities = await this.repository.findAll();
return entities.map((e) => this.toModel(e));
}
async findEnabled(): Promise<BearerKeyModel[]> {
const entities = await this.repository.findAll();
return entities.filter((e) => e.enabled).map((e) => this.toModel(e));
}
async findById(id: string): Promise<BearerKeyModel | undefined> {
const entity = await this.repository.findById(id);
return entity ? this.toModel(entity) : undefined;
}
async findByToken(token: string): Promise<BearerKeyModel | undefined> {
const entity = await this.repository.findByToken(token);
return entity ? this.toModel(entity) : undefined;
}
async create(data: Omit<BearerKeyModel, 'id'>): Promise<BearerKeyModel> {
const entity = await this.repository.create({
name: data.name,
token: data.token,
enabled: data.enabled,
accessType: data.accessType,
allowedGroups: data.allowedGroups ?? [],
allowedServers: data.allowedServers ?? [],
} as any);
return this.toModel(entity as any);
}
async update(
id: string,
data: Partial<Omit<BearerKeyModel, 'id'>>,
): Promise<BearerKeyModel | null> {
const entity = await this.repository.update(id, {
name: data.name,
token: data.token,
enabled: data.enabled,
accessType: data.accessType,
allowedGroups: data.allowedGroups,
allowedServers: data.allowedServers,
} as any);
return entity ? this.toModel(entity as any) : null;
}
async delete(id: string): Promise<boolean> {
return await this.repository.delete(id);
}
async updateServerName(oldName: string, newName: string): Promise<number> {
const allKeys = await this.repository.findAll();
let updatedCount = 0;
for (const key of allKeys) {
let updated = false;
if (key.allowedServers && key.allowedServers.length > 0) {
const newServers = key.allowedServers.map((server) => {
if (server === oldName) {
updated = true;
return newName;
}
return server;
});
if (updated) {
await this.repository.update(key.id, { allowedServers: newServers });
updatedCount++;
}
}
}
return updatedCount;
}
}

View File

@@ -5,8 +5,6 @@ import { SystemConfigDao, SystemConfigDaoImpl } from './SystemConfigDao.js';
import { UserConfigDao, UserConfigDaoImpl } from './UserConfigDao.js'; import { UserConfigDao, UserConfigDaoImpl } from './UserConfigDao.js';
import { OAuthClientDao, OAuthClientDaoImpl } from './OAuthClientDao.js'; import { OAuthClientDao, OAuthClientDaoImpl } from './OAuthClientDao.js';
import { OAuthTokenDao, OAuthTokenDaoImpl } from './OAuthTokenDao.js'; import { OAuthTokenDao, OAuthTokenDaoImpl } from './OAuthTokenDao.js';
import { BearerKeyDao, BearerKeyDaoImpl } from './BearerKeyDao.js';
import { ToolCallActivityDao } from './ToolCallActivityDao.js';
/** /**
* DAO Factory interface for creating DAO instances * DAO Factory interface for creating DAO instances
@@ -19,8 +17,6 @@ export interface DaoFactory {
getUserConfigDao(): UserConfigDao; getUserConfigDao(): UserConfigDao;
getOAuthClientDao(): OAuthClientDao; getOAuthClientDao(): OAuthClientDao;
getOAuthTokenDao(): OAuthTokenDao; getOAuthTokenDao(): OAuthTokenDao;
getBearerKeyDao(): BearerKeyDao;
getToolCallActivityDao(): ToolCallActivityDao | null; // Only available in DB mode
} }
/** /**
@@ -36,7 +32,6 @@ export class JsonFileDaoFactory implements DaoFactory {
private userConfigDao: UserConfigDao | null = null; private userConfigDao: UserConfigDao | null = null;
private oauthClientDao: OAuthClientDao | null = null; private oauthClientDao: OAuthClientDao | null = null;
private oauthTokenDao: OAuthTokenDao | null = null; private oauthTokenDao: OAuthTokenDao | null = null;
private bearerKeyDao: BearerKeyDao | null = null;
/** /**
* Get singleton instance * Get singleton instance
@@ -101,18 +96,6 @@ export class JsonFileDaoFactory implements DaoFactory {
return this.oauthTokenDao; return this.oauthTokenDao;
} }
getBearerKeyDao(): BearerKeyDao {
if (!this.bearerKeyDao) {
this.bearerKeyDao = new BearerKeyDaoImpl();
}
return this.bearerKeyDao;
}
getToolCallActivityDao(): ToolCallActivityDao | null {
// Tool call activity is only available in DB mode
return null;
}
/** /**
* Reset all cached DAO instances (useful for testing) * Reset all cached DAO instances (useful for testing)
*/ */
@@ -124,7 +107,6 @@ export class JsonFileDaoFactory implements DaoFactory {
this.userConfigDao = null; this.userConfigDao = null;
this.oauthClientDao = null; this.oauthClientDao = null;
this.oauthTokenDao = null; this.oauthTokenDao = null;
this.bearerKeyDao = null;
} }
} }
@@ -197,18 +179,3 @@ export function getOAuthClientDao(): OAuthClientDao {
export function getOAuthTokenDao(): OAuthTokenDao { export function getOAuthTokenDao(): OAuthTokenDao {
return getDaoFactory().getOAuthTokenDao(); return getDaoFactory().getOAuthTokenDao();
} }
export function getBearerKeyDao(): BearerKeyDao {
return getDaoFactory().getBearerKeyDao();
}
export function getToolCallActivityDao(): ToolCallActivityDao | null {
return getDaoFactory().getToolCallActivityDao();
}
/**
* Check if the application is using database mode
*/
export function isUsingDatabase(): boolean {
return getDaoFactory().getToolCallActivityDao() !== null;
}

View File

@@ -7,8 +7,6 @@ import {
UserConfigDao, UserConfigDao,
OAuthClientDao, OAuthClientDao,
OAuthTokenDao, OAuthTokenDao,
BearerKeyDao,
ToolCallActivityDao,
} from './index.js'; } from './index.js';
import { UserDaoDbImpl } from './UserDaoDbImpl.js'; import { UserDaoDbImpl } from './UserDaoDbImpl.js';
import { ServerDaoDbImpl } from './ServerDaoDbImpl.js'; import { ServerDaoDbImpl } from './ServerDaoDbImpl.js';
@@ -17,8 +15,6 @@ import { SystemConfigDaoDbImpl } from './SystemConfigDaoDbImpl.js';
import { UserConfigDaoDbImpl } from './UserConfigDaoDbImpl.js'; import { UserConfigDaoDbImpl } from './UserConfigDaoDbImpl.js';
import { OAuthClientDaoDbImpl } from './OAuthClientDaoDbImpl.js'; import { OAuthClientDaoDbImpl } from './OAuthClientDaoDbImpl.js';
import { OAuthTokenDaoDbImpl } from './OAuthTokenDaoDbImpl.js'; import { OAuthTokenDaoDbImpl } from './OAuthTokenDaoDbImpl.js';
import { BearerKeyDaoDbImpl } from './BearerKeyDaoDbImpl.js';
import { ToolCallActivityDaoDbImpl } from './ToolCallActivityDao.js';
/** /**
* Database-backed DAO factory implementation * Database-backed DAO factory implementation
@@ -33,8 +29,6 @@ export class DatabaseDaoFactory implements DaoFactory {
private userConfigDao: UserConfigDao | null = null; private userConfigDao: UserConfigDao | null = null;
private oauthClientDao: OAuthClientDao | null = null; private oauthClientDao: OAuthClientDao | null = null;
private oauthTokenDao: OAuthTokenDao | null = null; private oauthTokenDao: OAuthTokenDao | null = null;
private bearerKeyDao: BearerKeyDao | null = null;
private toolCallActivityDao: ToolCallActivityDao | null = null;
/** /**
* Get singleton instance * Get singleton instance
@@ -99,20 +93,6 @@ export class DatabaseDaoFactory implements DaoFactory {
return this.oauthTokenDao!; return this.oauthTokenDao!;
} }
getBearerKeyDao(): BearerKeyDao {
if (!this.bearerKeyDao) {
this.bearerKeyDao = new BearerKeyDaoDbImpl();
}
return this.bearerKeyDao!;
}
getToolCallActivityDao(): ToolCallActivityDao | null {
if (!this.toolCallActivityDao) {
this.toolCallActivityDao = new ToolCallActivityDaoDbImpl();
}
return this.toolCallActivityDao;
}
/** /**
* Reset all cached DAO instances (useful for testing) * Reset all cached DAO instances (useful for testing)
*/ */
@@ -124,7 +104,5 @@ export class DatabaseDaoFactory implements DaoFactory {
this.userConfigDao = null; this.userConfigDao = null;
this.oauthClientDao = null; this.oauthClientDao = null;
this.oauthTokenDao = null; this.oauthTokenDao = null;
this.bearerKeyDao = null;
this.toolCallActivityDao = null;
} }
} }

View File

@@ -36,11 +36,6 @@ export interface GroupDao extends BaseDao<IGroup, string> {
* Find group by name * Find group by name
*/ */
findByName(name: string): Promise<IGroup | null>; findByName(name: string): Promise<IGroup | null>;
/**
* Update server name in all groups (when server is renamed)
*/
updateServerName(oldName: string, newName: string): Promise<number>;
} }
/** /**
@@ -223,39 +218,4 @@ export class GroupDaoImpl extends JsonFileBaseDao implements GroupDao {
const groups = await this.getAll(); const groups = await this.getAll();
return groups.find((group) => group.name === name) || null; return groups.find((group) => group.name === name) || null;
} }
async updateServerName(oldName: string, newName: string): Promise<number> {
const groups = await this.getAll();
let updatedCount = 0;
for (const group of groups) {
let updated = false;
const newServers = group.servers.map((server) => {
if (typeof server === 'string') {
if (server === oldName) {
updated = true;
return newName;
}
return server;
} else {
if (server.name === oldName) {
updated = true;
return { ...server, name: newName };
}
return server;
}
}) as IGroup['servers'];
if (updated) {
group.servers = newServers;
updatedCount++;
}
}
if (updatedCount > 0) {
await this.saveAll(groups);
}
return updatedCount;
}
} }

View File

@@ -151,35 +151,4 @@ export class GroupDaoDbImpl implements GroupDao {
owner: group.owner, owner: group.owner,
}; };
} }
async updateServerName(oldName: string, newName: string): Promise<number> {
const allGroups = await this.repository.findAll();
let updatedCount = 0;
for (const group of allGroups) {
let updated = false;
const newServers = group.servers.map((server) => {
if (typeof server === 'string') {
if (server === oldName) {
updated = true;
return newName;
}
return server;
} else {
if (server.name === oldName) {
updated = true;
return { ...server, name: newName };
}
return server;
}
});
if (updated) {
await this.update(group.id, { servers: newServers as any });
updatedCount++;
}
}
return updatedCount;
}
} }

View File

@@ -41,11 +41,6 @@ export interface ServerDao extends BaseDao<ServerConfigWithName, string> {
name: string, name: string,
prompts: Record<string, { enabled: boolean; description?: string }>, prompts: Record<string, { enabled: boolean; description?: string }>,
): Promise<boolean>; ): Promise<boolean>;
/**
* Rename a server (change its name/key)
*/
rename(oldName: string, newName: string): Promise<boolean>;
} }
/** /**
@@ -100,8 +95,7 @@ export class ServerDaoImpl extends JsonFileBaseDao implements ServerDao {
return { return {
...existing, ...existing,
...updates, ...updates,
// Keep the existing name unless explicitly updating via rename name: existing.name, // Name should not be updated
name: updates.name ?? existing.name,
}; };
} }
@@ -147,7 +141,9 @@ export class ServerDaoImpl extends JsonFileBaseDao implements ServerDao {
return null; return null;
} }
const updatedServer = this.updateEntity(servers[index], updates); // Don't allow name changes
const { name: _, ...allowedUpdates } = updates;
const updatedServer = this.updateEntity(servers[index], allowedUpdates);
servers[index] = updatedServer; servers[index] = updatedServer;
await this.saveAll(servers); await this.saveAll(servers);
@@ -211,22 +207,4 @@ export class ServerDaoImpl extends JsonFileBaseDao implements ServerDao {
const result = await this.update(name, { prompts }); const result = await this.update(name, { prompts });
return result !== null; return result !== null;
} }
async rename(oldName: string, newName: string): Promise<boolean> {
const servers = await this.getAll();
const index = servers.findIndex((server) => server.name === oldName);
if (index === -1) {
return false;
}
// Check if newName already exists
if (servers.find((server) => server.name === newName)) {
throw new Error(`Server ${newName} already exists`);
}
servers[index] = { ...servers[index], name: newName };
await this.saveAll(servers);
return true;
}
} }

View File

@@ -38,7 +38,6 @@ export class ServerDaoDbImpl implements ServerDao {
prompts: entity.prompts, prompts: entity.prompts,
options: entity.options, options: entity.options,
oauth: entity.oauth, oauth: entity.oauth,
openapi: entity.openapi,
}); });
return this.mapToServerConfig(server); return this.mapToServerConfig(server);
} }
@@ -62,7 +61,6 @@ export class ServerDaoDbImpl implements ServerDao {
prompts: entity.prompts, prompts: entity.prompts,
options: entity.options, options: entity.options,
oauth: entity.oauth, oauth: entity.oauth,
openapi: entity.openapi,
}); });
return server ? this.mapToServerConfig(server) : null; return server ? this.mapToServerConfig(server) : null;
} }
@@ -115,15 +113,6 @@ export class ServerDaoDbImpl implements ServerDao {
return result !== null; return result !== null;
} }
async rename(oldName: string, newName: string): Promise<boolean> {
// Check if newName already exists
if (await this.repository.exists(newName)) {
throw new Error(`Server ${newName} already exists`);
}
return await this.repository.rename(oldName, newName);
}
private mapToServerConfig(server: { private mapToServerConfig(server: {
name: string; name: string;
type?: string; type?: string;
@@ -140,7 +129,6 @@ export class ServerDaoDbImpl implements ServerDao {
prompts?: Record<string, { enabled: boolean; description?: string }>; prompts?: Record<string, { enabled: boolean; description?: string }>;
options?: Record<string, any>; options?: Record<string, any>;
oauth?: Record<string, any>; oauth?: Record<string, any>;
openapi?: Record<string, any>;
}): ServerConfigWithName { }): ServerConfigWithName {
return { return {
name: server.name, name: server.name,
@@ -158,7 +146,6 @@ export class ServerDaoDbImpl implements ServerDao {
prompts: server.prompts, prompts: server.prompts,
options: server.options, options: server.options,
oauth: server.oauth, oauth: server.oauth,
openapi: server.openapi,
}; };
} }
} }

View File

@@ -1,186 +0,0 @@
import {
IToolCallActivity,
IToolCallActivitySearchParams,
IToolCallActivityPage,
IToolCallActivityStats,
} from '../types/index.js';
import { ToolCallActivityRepository } from '../db/repositories/ToolCallActivityRepository.js';
/**
* Tool Call Activity DAO interface (DB mode only)
*/
export interface ToolCallActivityDao {
/**
* Create a new tool call activity
*/
create(activity: Omit<IToolCallActivity, 'id' | 'createdAt'>): Promise<IToolCallActivity>;
/**
* Find activity by ID
*/
findById(id: string): Promise<IToolCallActivity | null>;
/**
* Update an existing activity
*/
update(id: string, updates: Partial<IToolCallActivity>): Promise<IToolCallActivity | null>;
/**
* Delete an activity
*/
delete(id: string): Promise<boolean>;
/**
* Find activities with pagination and filtering
*/
findWithPagination(
page: number,
pageSize: number,
params?: IToolCallActivitySearchParams,
): Promise<IToolCallActivityPage>;
/**
* Get recent activities
*/
findRecent(limit: number): Promise<IToolCallActivity[]>;
/**
* Get activity statistics
*/
getStats(): Promise<IToolCallActivityStats>;
/**
* Delete old activities (cleanup)
*/
deleteOlderThan(date: Date): Promise<number>;
/**
* Count total activities
*/
count(): Promise<number>;
}
/**
* Database-backed implementation of ToolCallActivityDao
*/
export class ToolCallActivityDaoDbImpl implements ToolCallActivityDao {
private repository: ToolCallActivityRepository;
constructor() {
this.repository = new ToolCallActivityRepository();
}
async create(activity: Omit<IToolCallActivity, 'id' | 'createdAt'>): Promise<IToolCallActivity> {
const created = await this.repository.create({
serverName: activity.serverName,
toolName: activity.toolName,
keyId: activity.keyId,
keyName: activity.keyName,
status: activity.status,
request: activity.request,
response: activity.response,
errorMessage: activity.errorMessage,
durationMs: activity.durationMs,
clientIp: activity.clientIp,
sessionId: activity.sessionId,
groupName: activity.groupName,
});
return this.mapToInterface(created);
}
async findById(id: string): Promise<IToolCallActivity | null> {
const activity = await this.repository.findById(id);
return activity ? this.mapToInterface(activity) : null;
}
async update(
id: string,
updates: Partial<IToolCallActivity>,
): Promise<IToolCallActivity | null> {
const updated = await this.repository.update(id, {
serverName: updates.serverName,
toolName: updates.toolName,
keyId: updates.keyId,
keyName: updates.keyName,
status: updates.status,
request: updates.request,
response: updates.response,
errorMessage: updates.errorMessage,
durationMs: updates.durationMs,
clientIp: updates.clientIp,
sessionId: updates.sessionId,
groupName: updates.groupName,
});
return updated ? this.mapToInterface(updated) : null;
}
async delete(id: string): Promise<boolean> {
return await this.repository.delete(id);
}
async findWithPagination(
page: number = 1,
pageSize: number = 20,
params: IToolCallActivitySearchParams = {},
): Promise<IToolCallActivityPage> {
const result = await this.repository.findWithPagination(page, pageSize, params);
return {
items: result.items.map((item) => this.mapToInterface(item)),
total: result.total,
page: result.page,
pageSize: result.pageSize,
totalPages: result.totalPages,
};
}
async findRecent(limit: number = 10): Promise<IToolCallActivity[]> {
const activities = await this.repository.findRecent(limit);
return activities.map((activity) => this.mapToInterface(activity));
}
async getStats(): Promise<IToolCallActivityStats> {
return await this.repository.getStats();
}
async deleteOlderThan(date: Date): Promise<number> {
return await this.repository.deleteOlderThan(date);
}
async count(): Promise<number> {
return await this.repository.count();
}
private mapToInterface(activity: {
id: string;
serverName: string;
toolName: string;
keyId?: string;
keyName?: string;
status: 'pending' | 'success' | 'error';
request?: string;
response?: string;
errorMessage?: string;
durationMs?: number;
clientIp?: string;
sessionId?: string;
groupName?: string;
createdAt: Date;
}): IToolCallActivity {
return {
id: activity.id,
serverName: activity.serverName,
toolName: activity.toolName,
keyId: activity.keyId,
keyName: activity.keyName,
status: activity.status,
request: activity.request,
response: activity.response,
errorMessage: activity.errorMessage,
durationMs: activity.durationMs,
clientIp: activity.clientIp,
sessionId: activity.sessionId,
groupName: activity.groupName,
createdAt: activity.createdAt,
};
}
}

View File

@@ -8,8 +8,6 @@ export * from './SystemConfigDao.js';
export * from './UserConfigDao.js'; export * from './UserConfigDao.js';
export * from './OAuthClientDao.js'; export * from './OAuthClientDao.js';
export * from './OAuthTokenDao.js'; export * from './OAuthTokenDao.js';
export * from './BearerKeyDao.js';
export * from './ToolCallActivityDao.js';
// Export database implementations // Export database implementations
export * from './UserDaoDbImpl.js'; export * from './UserDaoDbImpl.js';
@@ -19,7 +17,6 @@ export * from './SystemConfigDaoDbImpl.js';
export * from './UserConfigDaoDbImpl.js'; export * from './UserConfigDaoDbImpl.js';
export * from './OAuthClientDaoDbImpl.js'; export * from './OAuthClientDaoDbImpl.js';
export * from './OAuthTokenDaoDbImpl.js'; export * from './OAuthTokenDaoDbImpl.js';
export * from './BearerKeyDaoDbImpl.js';
// Export the DAO factory and convenience functions // Export the DAO factory and convenience functions
export * from './DaoFactory.js'; export * from './DaoFactory.js';

View File

@@ -25,54 +25,46 @@ const createRequiredExtensions = async (dataSource: DataSource): Promise<void> =
}; };
// Get database URL from smart routing config or fallback to environment variable // Get database URL from smart routing config or fallback to environment variable
const getDatabaseUrl = async (): Promise<string> => { const getDatabaseUrl = (): string => {
return (await getSmartRoutingConfig()).dbUrl; return getSmartRoutingConfig().dbUrl;
}; };
// Default database configuration (without URL - will be set during initialization) // Default database configuration
const getDefaultConfig = async (): Promise<DataSourceOptions> => { const defaultConfig: DataSourceOptions = {
return {
type: 'postgres', type: 'postgres',
url: await getDatabaseUrl(), url: getDatabaseUrl(),
synchronize: true, synchronize: true,
entities: entities, entities: entities,
subscribers: [VectorEmbeddingSubscriber], subscribers: [VectorEmbeddingSubscriber],
};
}; };
// AppDataSource is the TypeORM data source (initialized with empty config, will be updated) // AppDataSource is the TypeORM data source
let appDataSource: DataSource | null = null; let appDataSource = new DataSource(defaultConfig);
// Global promise to track initialization status // Global promise to track initialization status
let initializationPromise: Promise<DataSource> | null = null; let initializationPromise: Promise<DataSource> | null = null;
// Function to create a new DataSource with updated configuration // Function to create a new DataSource with updated configuration
export const updateDataSourceConfig = async (): Promise<DataSource> => { export const updateDataSourceConfig = (): DataSource => {
const newConfig = await getDefaultConfig(); const newConfig: DataSourceOptions = {
...defaultConfig,
url: getDatabaseUrl(),
};
// If the configuration has changed, we need to create a new DataSource // If the configuration has changed, we need to create a new DataSource
if (appDataSource) {
const currentUrl = (appDataSource.options as any).url; const currentUrl = (appDataSource.options as any).url;
const newUrl = (newConfig as any).url; if (currentUrl !== newConfig.url) {
if (currentUrl !== newUrl) {
console.log('Database URL configuration changed, updating DataSource...'); console.log('Database URL configuration changed, updating DataSource...');
appDataSource = new DataSource(newConfig); appDataSource = new DataSource(newConfig);
// Reset initialization promise when configuration changes // Reset initialization promise when configuration changes
initializationPromise = null; initializationPromise = null;
} }
} else {
// First time initialization
appDataSource = new DataSource(newConfig);
}
return appDataSource; return appDataSource;
}; };
// Get the current AppDataSource instance // Get the current AppDataSource instance
export const getAppDataSource = (): DataSource => { export const getAppDataSource = (): DataSource => {
if (!appDataSource) {
throw new Error('Database not initialized. Call initializeDatabase() first.');
}
return appDataSource; return appDataSource;
}; };
@@ -80,7 +72,7 @@ export const getAppDataSource = (): DataSource => {
export const reconnectDatabase = async (): Promise<DataSource> => { export const reconnectDatabase = async (): Promise<DataSource> => {
try { try {
// Close existing connection if it exists // Close existing connection if it exists
if (appDataSource && appDataSource.isInitialized) { if (appDataSource.isInitialized) {
console.log('Closing existing database connection...'); console.log('Closing existing database connection...');
await appDataSource.destroy(); await appDataSource.destroy();
} }
@@ -89,7 +81,7 @@ export const reconnectDatabase = async (): Promise<DataSource> => {
initializationPromise = null; initializationPromise = null;
// Update configuration and reconnect // Update configuration and reconnect
appDataSource = await updateDataSourceConfig(); appDataSource = updateDataSourceConfig();
return await initializeDatabase(); return await initializeDatabase();
} catch (error) { } catch (error) {
console.error('Error during database reconnection:', error); console.error('Error during database reconnection:', error);
@@ -106,7 +98,7 @@ export const initializeDatabase = async (): Promise<DataSource> => {
} }
// If already initialized, return the existing instance // If already initialized, return the existing instance
if (appDataSource && appDataSource.isInitialized) { if (appDataSource.isInitialized) {
console.log('Database already initialized, returning existing instance'); console.log('Database already initialized, returning existing instance');
return Promise.resolve(appDataSource); return Promise.resolve(appDataSource);
} }
@@ -130,7 +122,7 @@ export const initializeDatabase = async (): Promise<DataSource> => {
const performDatabaseInitialization = async (): Promise<DataSource> => { const performDatabaseInitialization = async (): Promise<DataSource> => {
try { try {
// Update configuration before initializing // Update configuration before initializing
appDataSource = await updateDataSourceConfig(); appDataSource = updateDataSourceConfig();
if (!appDataSource.isInitialized) { if (!appDataSource.isInitialized) {
console.log('Initializing database connection...'); console.log('Initializing database connection...');
@@ -258,8 +250,7 @@ const performDatabaseInitialization = async (): Promise<DataSource> => {
console.log('Database connection established successfully.'); console.log('Database connection established successfully.');
// Run one final setup check after schema synchronization is done // Run one final setup check after schema synchronization is done
const config = await getDefaultConfig(); if (defaultConfig.synchronize) {
if (config.synchronize) {
try { try {
console.log('Running final vector configuration check...'); console.log('Running final vector configuration check...');
@@ -334,12 +325,12 @@ const performDatabaseInitialization = async (): Promise<DataSource> => {
// Get database connection status // Get database connection status
export const isDatabaseConnected = (): boolean => { export const isDatabaseConnected = (): boolean => {
return appDataSource ? appDataSource.isInitialized : false; return appDataSource.isInitialized;
}; };
// Close database connection // Close database connection
export const closeDatabase = async (): Promise<void> => { export const closeDatabase = async (): Promise<void> => {
if (appDataSource && appDataSource.isInitialized) { if (appDataSource.isInitialized) {
await appDataSource.destroy(); await appDataSource.destroy();
console.log('Database connection closed.'); console.log('Database connection closed.');
} }

View File

@@ -1,43 +0,0 @@
import {
Entity,
Column,
PrimaryGeneratedColumn,
CreateDateColumn,
UpdateDateColumn,
} from 'typeorm';
/**
* Bearer authentication key entity
* Stores multiple bearer keys with per-key enable/disable and scoped access control
*/
@Entity({ name: 'bearer_keys' })
export class BearerKey {
@PrimaryGeneratedColumn('uuid')
id: string;
@Column({ type: 'varchar', length: 100 })
name: string;
@Column({ type: 'varchar', length: 512 })
token: string;
@Column({ type: 'boolean', default: true })
enabled: boolean;
@Column({ type: 'varchar', length: 20, default: 'all' })
accessType: 'all' | 'groups' | 'servers' | 'custom';
@Column({ type: 'simple-json', nullable: true })
allowedGroups?: string[];
@Column({ type: 'simple-json', nullable: true })
allowedServers?: string[];
@CreateDateColumn({ name: 'created_at', type: 'timestamp' })
createdAt: Date;
@UpdateDateColumn({ name: 'updated_at', type: 'timestamp' })
updatedAt: Date;
}
export default BearerKey;

View File

@@ -59,9 +59,6 @@ export class Server {
@Column({ type: 'simple-json', nullable: true }) @Column({ type: 'simple-json', nullable: true })
oauth?: Record<string, any>; oauth?: Record<string, any>;
@Column({ type: 'simple-json', nullable: true })
openapi?: Record<string, any>;
@CreateDateColumn({ name: 'created_at', type: 'timestamp' }) @CreateDateColumn({ name: 'created_at', type: 'timestamp' })
createdAt: Date; createdAt: Date;

View File

@@ -1,62 +0,0 @@
import {
Entity,
Column,
PrimaryGeneratedColumn,
CreateDateColumn,
Index,
} from 'typeorm';
/**
* Tool call activity entity for logging tool invocations (DB mode only)
*/
@Entity({ name: 'tool_call_activities' })
export class ToolCallActivity {
@PrimaryGeneratedColumn('uuid')
id: string;
@Index()
@Column({ type: 'varchar', length: 255, name: 'server_name' })
serverName: string;
@Index()
@Column({ type: 'varchar', length: 255, name: 'tool_name' })
toolName: string;
@Index()
@Column({ type: 'varchar', length: 255, name: 'key_id', nullable: true })
keyId?: string;
@Column({ type: 'varchar', length: 255, name: 'key_name', nullable: true })
keyName?: string;
@Index()
@Column({ type: 'varchar', length: 50, default: 'pending' })
status: 'pending' | 'success' | 'error';
@Column({ type: 'text', nullable: true })
request?: string;
@Column({ type: 'text', nullable: true })
response?: string;
@Column({ type: 'text', name: 'error_message', nullable: true })
errorMessage?: string;
@Column({ type: 'int', name: 'duration_ms', nullable: true })
durationMs?: number;
@Column({ type: 'varchar', length: 100, name: 'client_ip', nullable: true })
clientIp?: string;
@Column({ type: 'varchar', length: 255, name: 'session_id', nullable: true })
sessionId?: string;
@Column({ type: 'varchar', length: 255, name: 'group_name', nullable: true })
groupName?: string;
@Index()
@CreateDateColumn({ name: 'created_at', type: 'timestamp' })
createdAt: Date;
}
export default ToolCallActivity;

View File

@@ -6,8 +6,6 @@ import SystemConfig from './SystemConfig.js';
import UserConfig from './UserConfig.js'; import UserConfig from './UserConfig.js';
import OAuthClient from './OAuthClient.js'; import OAuthClient from './OAuthClient.js';
import OAuthToken from './OAuthToken.js'; import OAuthToken from './OAuthToken.js';
import BearerKey from './BearerKey.js';
import ToolCallActivity from './ToolCallActivity.js';
// Export all entities // Export all entities
export default [ export default [
@@ -19,20 +17,7 @@ export default [
UserConfig, UserConfig,
OAuthClient, OAuthClient,
OAuthToken, OAuthToken,
BearerKey,
ToolCallActivity,
]; ];
// Export individual entities for direct use // Export individual entities for direct use
export { export { VectorEmbedding, User, Server, Group, SystemConfig, UserConfig, OAuthClient, OAuthToken };
VectorEmbedding,
User,
Server,
Group,
SystemConfig,
UserConfig,
OAuthClient,
OAuthToken,
BearerKey,
ToolCallActivity,
};

View File

@@ -1,75 +0,0 @@
import { Repository } from 'typeorm';
import { BearerKey } from '../entities/BearerKey.js';
import { getAppDataSource } from '../connection.js';
/**
* Repository for BearerKey entity
*/
export class BearerKeyRepository {
private repository: Repository<BearerKey>;
constructor() {
this.repository = getAppDataSource().getRepository(BearerKey);
}
/**
* Find all bearer keys
*/
async findAll(): Promise<BearerKey[]> {
return await this.repository.find({ order: { createdAt: 'ASC' } });
}
/**
* Count bearer keys
*/
async count(): Promise<number> {
return await this.repository.count();
}
/**
* Find bearer key by id
*/
async findById(id: string): Promise<BearerKey | null> {
return await this.repository.findOne({ where: { id } });
}
/**
* Find bearer key by token value
*/
async findByToken(token: string): Promise<BearerKey | null> {
return await this.repository.findOne({ where: { token } });
}
/**
* Create a new bearer key
*/
async create(data: Omit<BearerKey, 'id' | 'createdAt' | 'updatedAt'>): Promise<BearerKey> {
const entity = this.repository.create(data);
return await this.repository.save(entity);
}
/**
* Update an existing bearer key
*/
async update(
id: string,
updates: Partial<Omit<BearerKey, 'id' | 'createdAt' | 'updatedAt'>>,
): Promise<BearerKey | null> {
const existing = await this.findById(id);
if (!existing) {
return null;
}
const merged = this.repository.merge(existing, updates);
return await this.repository.save(merged);
}
/**
* Delete a bearer key
*/
async delete(id: string): Promise<boolean> {
const result = await this.repository.delete({ id });
return (result.affected ?? 0) > 0;
}
}
export default BearerKeyRepository;

View File

@@ -89,19 +89,6 @@ export class ServerRepository {
async setEnabled(name: string, enabled: boolean): Promise<Server | null> { async setEnabled(name: string, enabled: boolean): Promise<Server | null> {
return await this.update(name, { enabled }); return await this.update(name, { enabled });
} }
/**
* Rename a server
*/
async rename(oldName: string, newName: string): Promise<boolean> {
const server = await this.findByName(oldName);
if (!server) {
return false;
}
server.name = newName;
await this.repository.save(server);
return true;
}
} }
export default ServerRepository; export default ServerRepository;

View File

@@ -1,200 +0,0 @@
import { Repository, FindOptionsWhere, ILike, Between } from 'typeorm';
import { ToolCallActivity } from '../entities/ToolCallActivity.js';
import { getAppDataSource } from '../connection.js';
/**
* Search parameters for filtering tool call activities
*/
export interface ToolCallActivitySearchParams {
serverName?: string;
toolName?: string;
keyId?: string;
status?: 'pending' | 'success' | 'error';
groupName?: string;
startDate?: Date;
endDate?: Date;
searchQuery?: string;
}
/**
* Pagination result for tool call activities
*/
export interface ToolCallActivityPage {
items: ToolCallActivity[];
total: number;
page: number;
pageSize: number;
totalPages: number;
}
/**
* Repository for ToolCallActivity entity
*/
export class ToolCallActivityRepository {
private repository: Repository<ToolCallActivity>;
constructor() {
this.repository = getAppDataSource().getRepository(ToolCallActivity);
}
/**
* Create a new tool call activity
*/
async create(
activity: Omit<ToolCallActivity, 'id' | 'createdAt'>,
): Promise<ToolCallActivity> {
const newActivity = this.repository.create(activity);
return await this.repository.save(newActivity);
}
/**
* Find activity by ID
*/
async findById(id: string): Promise<ToolCallActivity | null> {
return await this.repository.findOne({ where: { id } });
}
/**
* Update an existing activity
*/
async update(
id: string,
updates: Partial<ToolCallActivity>,
): Promise<ToolCallActivity | null> {
const activity = await this.findById(id);
if (!activity) {
return null;
}
const updated = this.repository.merge(activity, updates);
return await this.repository.save(updated);
}
/**
* Delete an activity
*/
async delete(id: string): Promise<boolean> {
const result = await this.repository.delete({ id });
return (result.affected ?? 0) > 0;
}
/**
* Find activities with pagination and filtering
*/
async findWithPagination(
page: number = 1,
pageSize: number = 20,
params: ToolCallActivitySearchParams = {},
): Promise<ToolCallActivityPage> {
const where: FindOptionsWhere<ToolCallActivity>[] = [];
const baseWhere: FindOptionsWhere<ToolCallActivity> = {};
// Add filters
if (params.serverName) {
baseWhere.serverName = params.serverName;
}
if (params.toolName) {
baseWhere.toolName = params.toolName;
}
if (params.keyId) {
baseWhere.keyId = params.keyId;
}
if (params.status) {
baseWhere.status = params.status;
}
if (params.groupName) {
baseWhere.groupName = params.groupName;
}
if (params.startDate && params.endDate) {
baseWhere.createdAt = Between(params.startDate, params.endDate);
}
// Handle search query - search across multiple fields
if (params.searchQuery) {
const searchPattern = `%${params.searchQuery}%`;
where.push(
{ ...baseWhere, serverName: ILike(searchPattern) },
{ ...baseWhere, toolName: ILike(searchPattern) },
{ ...baseWhere, keyName: ILike(searchPattern) },
{ ...baseWhere, groupName: ILike(searchPattern) },
);
} else {
where.push(baseWhere);
}
const [items, total] = await this.repository.findAndCount({
where: where.length > 0 ? where : undefined,
order: { createdAt: 'DESC' },
skip: (page - 1) * pageSize,
take: pageSize,
});
return {
items,
total,
page,
pageSize,
totalPages: Math.ceil(total / pageSize),
};
}
/**
* Get recent activities
*/
async findRecent(limit: number = 10): Promise<ToolCallActivity[]> {
return await this.repository.find({
order: { createdAt: 'DESC' },
take: limit,
});
}
/**
* Get activity statistics
*/
async getStats(): Promise<{
total: number;
success: number;
error: number;
pending: number;
avgDurationMs: number;
}> {
const stats = await this.repository
.createQueryBuilder('activity')
.select([
'COUNT(*) as total',
'SUM(CASE WHEN status = \'success\' THEN 1 ELSE 0 END) as success',
'SUM(CASE WHEN status = \'error\' THEN 1 ELSE 0 END) as error',
'SUM(CASE WHEN status = \'pending\' THEN 1 ELSE 0 END) as pending',
'AVG(duration_ms) as avgDurationMs',
])
.getRawOne();
return {
total: parseInt(stats?.total || '0', 10),
success: parseInt(stats?.success || '0', 10),
error: parseInt(stats?.error || '0', 10),
pending: parseInt(stats?.pending || '0', 10),
avgDurationMs: parseFloat(stats?.avgDurationMs || '0'),
};
}
/**
* Delete old activities (cleanup)
*/
async deleteOlderThan(date: Date): Promise<number> {
const result = await this.repository
.createQueryBuilder()
.delete()
.where('created_at < :date', { date })
.execute();
return result.affected ?? 0;
}
/**
* Count total activities
*/
async count(): Promise<number> {
return await this.repository.count();
}
}
export default ToolCallActivityRepository;
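
For orientation, a minimal usage sketch of the repository removed above. It assumes an initialized TypeORM data source; the import path mirrors the repository paths used elsewhere in this diff and the variable names are illustrative.

import { ToolCallActivityRepository } from '../db/repositories/ToolCallActivityRepository.js';

async function printRecentErrors(): Promise<void> {
  const activityRepo = new ToolCallActivityRepository();
  // Page through error-status activities that mention "fetch" in any of the searchable fields
  const firstPage = await activityRepo.findWithPagination(1, 20, {
    status: 'error',
    searchQuery: 'fetch',
  });
  console.log(`${firstPage.total} matching activities across ${firstPage.totalPages} pages`);
}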

View File

@@ -6,8 +6,6 @@ import { SystemConfigRepository } from './SystemConfigRepository.js';
import { UserConfigRepository } from './UserConfigRepository.js'; import { UserConfigRepository } from './UserConfigRepository.js';
import { OAuthClientRepository } from './OAuthClientRepository.js'; import { OAuthClientRepository } from './OAuthClientRepository.js';
import { OAuthTokenRepository } from './OAuthTokenRepository.js'; import { OAuthTokenRepository } from './OAuthTokenRepository.js';
import { BearerKeyRepository } from './BearerKeyRepository.js';
import { ToolCallActivityRepository } from './ToolCallActivityRepository.js';
// Export all repositories // Export all repositories
export { export {
@@ -19,6 +17,4 @@ export {
UserConfigRepository, UserConfigRepository,
OAuthClientRepository, OAuthClientRepository,
OAuthTokenRepository, OAuthTokenRepository,
BearerKeyRepository,
ToolCallActivityRepository,
}; };

View File

@@ -5,15 +5,9 @@ import defaultConfig from '../config/index.js';
import { JWT_SECRET } from '../config/jwt.js';
import { getToken } from '../models/OAuth.js';
import { isOAuthServerEnabled } from '../services/oauthServerService.js';
-import { getBearerKeyDao } from '../dao/index.js';
-import { BearerKey } from '../types/index.js';
-const validateBearerAuth = async (req: Request): Promise<boolean> => {
-  const bearerKeyDao = getBearerKeyDao();
-  const enabledKeys = await bearerKeyDao.findEnabled();
-  // If there are no enabled keys, bearer auth via static keys is disabled
-  if (enabledKeys.length === 0) {
+const validateBearerAuth = (req: Request, routingConfig: any): boolean => {
+  if (!routingConfig.enableBearerAuth) {
    return false;
  }
@@ -22,21 +16,7 @@ const validateBearerAuth = async (req: Request): Promise<boolean> => {
    return false;
  }
-  const token = authHeader.substring(7).trim();
-  if (!token) {
-    return false;
-  }
-  const matchingKey: BearerKey | undefined = enabledKeys.find((key) => key.token === token);
-  if (!matchingKey) {
-    console.warn('Bearer auth failed: token did not match any configured bearer key');
-    return false;
-  }
-  console.log(
-    `Bearer auth succeeded with key id=${matchingKey.id}, name=${matchingKey.name}, accessType=${matchingKey.accessType}`,
-  );
-  return true;
+  return authHeader.substring(7) === routingConfig.bearerAuthKey;
};
const readonlyAllowPaths = ['/tools/call/'];
@@ -67,6 +47,8 @@ export const auth = async (req: Request, res: Response, next: NextFunction): Pro
  const routingConfig = loadSettings().systemConfig?.routing || {
    enableGlobalRoute: true,
    enableGroupNameRoute: true,
+    enableBearerAuth: false,
+    bearerAuthKey: '',
    skipAuth: false,
  };
@@ -75,8 +57,8 @@ export const auth = async (req: Request, res: Response, next: NextFunction): Pro
    return;
  }
-  // Check if bearer auth via configured keys can validate this request
-  if (await validateBearerAuth(req)) {
+  // Check if bearer auth is enabled and validate it
+  if (validateBearerAuth(req, routingConfig)) {
    next();
    return;
  }
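
In practical terms, once enableBearerAuth and bearerAuthKey are set under systemConfig.routing, a caller authenticates by sending that key as a bearer token. A minimal sketch; the URL, route prefix, and the MCPHUB_BEARER_KEY environment variable are placeholders, not taken from the repository.

const response = await fetch('http://localhost:3000/api/servers', {
  headers: {
    // The middleware above compares this value against routingConfig.bearerAuthKey
    Authorization: `Bearer ${process.env.MCPHUB_BEARER_KEY ?? 'my-bearer-key'}`,
  },
});
console.log(response.status);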

View File

@@ -6,7 +6,6 @@ import {
getAllSettings, getAllSettings,
getServerConfig, getServerConfig,
createServer, createServer,
batchCreateServers,
updateServer, updateServer,
deleteServer, deleteServer,
toggleServer, toggleServer,
@@ -21,7 +20,6 @@ import {
getGroups, getGroups,
getGroup, getGroup,
createNewGroup, createNewGroup,
batchCreateGroups,
updateExistingGroup, updateExistingGroup,
deleteExistingGroup, deleteExistingGroup,
addServerToExistingGroup, addServerToExistingGroup,
@@ -106,12 +104,6 @@ import {
updateClientConfiguration, updateClientConfiguration,
deleteClientRegistration, deleteClientRegistration,
} from '../controllers/oauthDynamicRegistrationController.js'; } from '../controllers/oauthDynamicRegistrationController.js';
import {
getBearerKeys,
createBearerKey,
updateBearerKey,
deleteBearerKey,
} from '../controllers/bearerKeyController.js';
import { auth } from '../middlewares/auth.js'; import { auth } from '../middlewares/auth.js';
const router = express.Router(); const router = express.Router();
@@ -142,7 +134,6 @@ export const initRoutes = (app: express.Application): void => {
router.get('/servers/:name', getServerConfig); router.get('/servers/:name', getServerConfig);
router.get('/settings', getAllSettings); router.get('/settings', getAllSettings);
router.post('/servers', createServer); router.post('/servers', createServer);
router.post('/servers/batch', batchCreateServers);
router.put('/servers/:name', updateServer); router.put('/servers/:name', updateServer);
router.delete('/servers/:name', deleteServer); router.delete('/servers/:name', deleteServer);
router.post('/servers/:name/toggle', toggleServer); router.post('/servers/:name/toggle', toggleServer);
@@ -157,7 +148,6 @@ export const initRoutes = (app: express.Application): void => {
router.get('/groups', getGroups); router.get('/groups', getGroups);
router.get('/groups/:id', getGroup); router.get('/groups/:id', getGroup);
router.post('/groups', createNewGroup); router.post('/groups', createNewGroup);
router.post('/groups/batch', batchCreateGroups);
router.put('/groups/:id', updateExistingGroup); router.put('/groups/:id', updateExistingGroup);
router.delete('/groups/:id', deleteExistingGroup); router.delete('/groups/:id', deleteExistingGroup);
router.post('/groups/:id/servers', addServerToExistingGroup); router.post('/groups/:id/servers', addServerToExistingGroup);
@@ -193,12 +183,6 @@ export const initRoutes = (app: express.Application): void => {
router.delete('/oauth/clients/:clientId', deleteClient); router.delete('/oauth/clients/:clientId', deleteClient);
router.post('/oauth/clients/:clientId/regenerate-secret', regenerateSecret); router.post('/oauth/clients/:clientId/regenerate-secret', regenerateSecret);
// Bearer authentication key management (admin only)
router.get('/auth/keys', getBearerKeys);
router.post('/auth/keys', createBearerKey);
router.put('/auth/keys/:id', updateBearerKey);
router.delete('/auth/keys/:id', deleteBearerKey);
// Tool management routes // Tool management routes
router.post('/tools/call/:server', callTool); router.post('/tools/call/:server', callTool);

View File

@@ -29,9 +29,9 @@ export const getGroupByIdOrName = async (key: string): Promise<IGroup | undefine
  const systemConfigDao = getSystemConfigDao();
  const systemConfig = await systemConfigDao.get();
-  const routingConfig = {
-    enableGlobalRoute: systemConfig?.routing?.enableGlobalRoute ?? true,
-    enableGroupNameRoute: systemConfig?.routing?.enableGroupNameRoute ?? true,
+  const routingConfig = systemConfig?.routing || {
+    enableGlobalRoute: true,
+    enableGroupNameRoute: true,
  };
  const groups = await getAllGroups();

View File

@@ -48,9 +48,7 @@ export const setupClientKeepAlive = async (
        await (serverInfo.client as any).ping();
        console.log(`Keep-alive ping successful for server: ${serverInfo.name}`);
      } else {
-        await serverInfo.client
-          .listTools({}, { ...(serverInfo.options || {}), timeout: 5000 })
-          .catch(() => void 0);
+        await serverInfo.client.listTools({ timeout: 5000 }).catch(() => void 0);
      }
    }
} catch (error) { } catch (error) {

View File

@@ -325,7 +325,7 @@ export class MCPHubOAuthProvider implements OAuthClientProvider {
      return;
    }
-    console.log(`Saving OAuth tokens: ${JSON.stringify(tokens)} for server: ${this.serverName}`);
+    console.log(`Saving OAuth tokens for server: ${this.serverName}`);
    const updatedConfig = await persistTokens(this.serverName, {
      accessToken: tokens.access_token,

View File

@@ -14,7 +14,6 @@ import {
StreamableHTTPClientTransport, StreamableHTTPClientTransport,
StreamableHTTPClientTransportOptions, StreamableHTTPClientTransportOptions,
} from '@modelcontextprotocol/sdk/client/streamableHttp.js'; } from '@modelcontextprotocol/sdk/client/streamableHttp.js';
import { createFetchWithProxy, getProxyConfigFromEnv } from './proxy.js';
import { ServerInfo, ServerConfig, Tool } from '../types/index.js'; import { ServerInfo, ServerConfig, Tool } from '../types/index.js';
import { expandEnvVars, replaceEnvVars, getNameSeparator } from '../config/index.js'; import { expandEnvVars, replaceEnvVars, getNameSeparator } from '../config/index.js';
import config from '../config/index.js'; import config from '../config/index.js';
@@ -135,10 +134,6 @@ export const cleanupAllServers = (): void => {
// Helper function to create transport based on server configuration // Helper function to create transport based on server configuration
export const createTransportFromConfig = async (name: string, conf: ServerConfig): Promise<any> => { export const createTransportFromConfig = async (name: string, conf: ServerConfig): Promise<any> => {
let transport; let transport;
const env: Record<string, string> = {
...(process.env as Record<string, string>),
...replaceEnvVars(conf.env || {}),
};
if (conf.type === 'streamable-http') { if (conf.type === 'streamable-http') {
const options: StreamableHTTPClientTransportOptions = {}; const options: StreamableHTTPClientTransportOptions = {};
@@ -157,8 +152,6 @@ export const createTransportFromConfig = async (name: string, conf: ServerConfig
console.log(`OAuth provider configured for server: ${name}`); console.log(`OAuth provider configured for server: ${name}`);
} }
options.fetch = createFetchWithProxy(getProxyConfigFromEnv(env));
transport = new StreamableHTTPClientTransport(new URL(conf.url || ''), options); transport = new StreamableHTTPClientTransport(new URL(conf.url || ''), options);
} else if (conf.url) { } else if (conf.url) {
// SSE transport // SSE transport
@@ -181,11 +174,13 @@ export const createTransportFromConfig = async (name: string, conf: ServerConfig
console.log(`OAuth provider configured for server: ${name}`); console.log(`OAuth provider configured for server: ${name}`);
} }
options.fetch = createFetchWithProxy(getProxyConfigFromEnv(env));
transport = new SSEClientTransport(new URL(conf.url), options); transport = new SSEClientTransport(new URL(conf.url), options);
} else if (conf.command && conf.args) { } else if (conf.command && conf.args) {
// Stdio transport // Stdio transport
const env: Record<string, string> = {
...(process.env as Record<string, string>),
...replaceEnvVars(conf.env || {}),
};
env['PATH'] = expandEnvVars(process.env.PATH as string) || ''; env['PATH'] = expandEnvVars(process.env.PATH as string) || '';
const systemConfigDao = getSystemConfigDao(); const systemConfigDao = getSystemConfigDao();
@@ -241,8 +236,6 @@ const callToolWithReconnect = async (
for (let attempt = 0; attempt <= maxRetries; attempt++) { for (let attempt = 0; attempt <= maxRetries; attempt++) {
try { try {
const result = await serverInfo.client.callTool(toolParams, undefined, options || {}); const result = await serverInfo.client.callTool(toolParams, undefined, options || {});
// Check auth error
checkAuthError(result);
return result; return result;
} catch (error: any) { } catch (error: any) {
// Check if error message starts with "Error POSTing to endpoint (HTTP 40" // Check if error message starts with "Error POSTing to endpoint (HTTP 40"
@@ -621,37 +614,9 @@ export const registerAllTools = async (isInit: boolean, serverName?: string): Pr
export const getServersInfo = async (): Promise<Omit<ServerInfo, 'client' | 'transport'>[]> => { export const getServersInfo = async (): Promise<Omit<ServerInfo, 'client' | 'transport'>[]> => {
const allServers: ServerConfigWithName[] = await getServerDao().findAll(); const allServers: ServerConfigWithName[] = await getServerDao().findAll();
const dataService = getDataService(); const dataService = getDataService();
// Ensure that servers recently added via DAO but not yet initialized in serverInfos
// are still visible in the servers list. This avoids a race condition where
// a POST /api/servers immediately followed by GET /api/servers would not
// return the newly created server until background initialization completes.
const combinedServerInfos: ServerInfo[] = [...serverInfos];
const existingNames = new Set(combinedServerInfos.map((s) => s.name));
for (const server of allServers) {
if (!existingNames.has(server.name)) {
const isEnabled = server.enabled === undefined ? true : server.enabled;
combinedServerInfos.push({
name: server.name,
owner: server.owner,
// Newly created servers that are enabled should appear as "connecting"
// until the MCP client initialization completes. Disabled servers remain
// in the "disconnected" state.
status: isEnabled ? 'connecting' : 'disconnected',
error: null,
tools: [],
prompts: [],
createTime: Date.now(),
enabled: isEnabled,
});
}
}
  const filterServerInfos: ServerInfo[] = dataService.filterData
-    ? dataService.filterData(combinedServerInfos)
-    : combinedServerInfos;
+    ? dataService.filterData(serverInfos)
+    : serverInfos;
  const infos = filterServerInfos.map(
    ({ name, status, tools, prompts, createTime, error, oauth }) => {
      const serverConfig = allServers.find((server) => server.name === name);
@@ -832,25 +797,6 @@ export const addOrUpdateServer = async (
} }
}; };
// Check for authentication error in tool call result
function checkAuthError(result: any) {
if (Array.isArray(result.content) && result.content.length > 0) {
const text = result.content[0]?.text;
if (typeof text === 'string') {
let errorContent;
try {
errorContent = JSON.parse(text);
} catch (e) {
// Ignore JSON parse errors and continue
return;
}
if (errorContent.code === 401) {
throw new Error('Error POSTing to endpoint (HTTP 401 Unauthorized)');
}
}
}
}
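
For context, a sketch of the kind of tool result the removed checkAuthError treated as an authentication failure; the payload values are illustrative, only the { code: 401 } shape is taken from the code above.

const unauthorizedResult = {
  content: [
    { type: 'text', text: JSON.stringify({ code: 401, message: 'Unauthorized' }) },
  ],
};
// checkAuthError(unauthorizedResult) threw 'Error POSTing to endpoint (HTTP 401 Unauthorized)',
// which the retry loop in callToolWithReconnect recognizes as a reconnectable auth error.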
// Close server client and transport // Close server client and transport
function closeServer(name: string) { function closeServer(name: string) {
const serverInfo = serverInfos.find((serverInfo) => serverInfo.name === name); const serverInfo = serverInfos.find((serverInfo) => serverInfo.name === name);

View File

@@ -1,167 +0,0 @@
/**
* HTTP/HTTPS proxy configuration utilities for MCP client transports.
*
* This module provides utilities to configure HTTP and HTTPS proxies when
* connecting to MCP servers. Proxies are configured by providing a custom
* fetch implementation that uses Node.js http/https agents with proxy support.
*
*/
import { FetchLike } from '@modelcontextprotocol/sdk/shared/transport.js';
/**
* Configuration options for HTTP/HTTPS proxy settings.
*/
export interface ProxyConfig {
/**
* HTTP proxy URL (e.g., 'http://proxy.example.com:8080')
* Can include authentication: 'http://user:pass@proxy.example.com:8080'
*/
httpProxy?: string;
/**
* HTTPS proxy URL (e.g., 'https://proxy.example.com:8443')
* Can include authentication: 'https://user:pass@proxy.example.com:8443'
*/
httpsProxy?: string;
/**
* Comma-separated list of hosts that should bypass the proxy
* (e.g., 'localhost,127.0.0.1,.example.com')
*/
noProxy?: string;
}
/**
* Creates a fetch function that uses the specified proxy configuration.
*
* This function returns a fetch implementation that routes requests through
* the configured HTTP/HTTPS proxies using undici's ProxyAgent.
*
* Note: This function requires the 'undici' package to be installed.
* Install it with: npm install undici
*
* @param config - Proxy configuration options
* @returns A fetch-compatible function configured to use the specified proxies
*
*/
export function createFetchWithProxy(config: ProxyConfig): FetchLike {
// If no proxy is configured, return the default fetch
if (!config.httpProxy && !config.httpsProxy) {
return fetch;
}
// Parse no_proxy list
const noProxyList = parseNoProxy(config.noProxy);
return async (url: string | URL, init?: RequestInit): Promise<Response> => {
const targetUrl = typeof url === 'string' ? new URL(url) : url;
// Check if host should bypass proxy
if (shouldBypassProxy(targetUrl.hostname, noProxyList)) {
return fetch(url, init);
}
// Determine which proxy to use based on protocol
const proxyUrl = targetUrl.protocol === 'https:' ? config.httpsProxy : config.httpProxy;
if (!proxyUrl) {
// No proxy configured for this protocol
return fetch(url, init);
}
// Use undici for proxy support if available
try {
// Dynamic import - undici is an optional peer dependency
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const undici = await import('undici' as any);
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const ProxyAgent = (undici as any).ProxyAgent;
const dispatcher = new ProxyAgent(proxyUrl);
return fetch(url, {
...init,
// @ts-expect-error - dispatcher is undici-specific
dispatcher,
});
} catch (error) {
// undici not available - throw error requiring installation
throw new Error(
'Proxy support requires the "undici" package. ' +
'Install it with: npm install undici\n' +
`Original error: ${error instanceof Error ? error.message : String(error)}`,
);
}
};
}
/**
* Parses a NO_PROXY environment variable value into a list of patterns.
*/
function parseNoProxy(noProxy?: string): string[] {
if (!noProxy) {
return [];
}
return noProxy
.split(',')
.map((item) => item.trim())
.filter((item) => item.length > 0);
}
/**
* Checks if a hostname should bypass the proxy based on NO_PROXY patterns.
*/
function shouldBypassProxy(hostname: string, noProxyList: string[]): boolean {
if (noProxyList.length === 0) {
return false;
}
const hostnameLower = hostname.toLowerCase();
for (const pattern of noProxyList) {
const patternLower = pattern.toLowerCase();
// Exact match
if (hostnameLower === patternLower) {
return true;
}
// Domain suffix match (e.g., .example.com matches sub.example.com)
if (patternLower.startsWith('.') && hostnameLower.endsWith(patternLower)) {
return true;
}
// Domain suffix match without leading dot
if (!patternLower.startsWith('.') && hostnameLower.endsWith('.' + patternLower)) {
return true;
}
// Special case: "*" matches everything
if (patternLower === '*') {
return true;
}
}
return false;
}
/**
* Creates a ProxyConfig from environment variables.
*
* This function reads standard proxy environment variables:
* - HTTP_PROXY, http_proxy
* - HTTPS_PROXY, https_proxy
* - NO_PROXY, no_proxy
*
* Lowercase versions take precedence over uppercase versions.
*
* @returns A ProxyConfig object populated from environment variables
*/
export function getProxyConfigFromEnv(env: Record<string, string>): ProxyConfig {
return {
httpProxy: env.http_proxy || env.HTTP_PROXY,
httpsProxy: env.https_proxy || env.HTTPS_PROXY,
noProxy: env.no_proxy || env.NO_PROXY,
};
}
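
For reference, a minimal sketch of how these helpers were wired into the HTTP transports before this change. It mirrors the options.fetch = createFetchWithProxy(getProxyConfigFromEnv(env)) lines removed from mcpService.ts above; the proxy URL is a placeholder.

import { createFetchWithProxy, getProxyConfigFromEnv } from './proxy.js';

const env: Record<string, string> = {
  ...(process.env as Record<string, string>),
  HTTPS_PROXY: 'http://proxy.example.com:8080', // placeholder value
};
// Requests made through this fetch are routed via undici's ProxyAgent unless the host matches NO_PROXY
const fetchWithProxy = createFetchWithProxy(getProxyConfigFromEnv(env));
// fetchWithProxy was then passed as the `fetch` option of StreamableHTTPClientTransport / SSEClientTransport.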

View File

@@ -47,30 +47,6 @@ jest.mock('../dao/index.js', () => ({
getSystemConfigDao: jest.fn(() => ({ getSystemConfigDao: jest.fn(() => ({
get: jest.fn().mockImplementation(() => Promise.resolve(currentSystemConfig)), get: jest.fn().mockImplementation(() => Promise.resolve(currentSystemConfig)),
})), })),
getBearerKeyDao: jest.fn(() => ({
// Keep these unit tests aligned with legacy routing semantics:
// enableBearerAuth + bearerAuthKey -> one enabled key (token=bearerAuthKey)
// otherwise -> no enabled keys (bearer auth effectively disabled)
findEnabled: jest.fn().mockImplementation(async () => {
const routing = (currentSystemConfig as any)?.routing || {};
const enabled = !!routing.enableBearerAuth;
const token = String(routing.bearerAuthKey || '').trim();
if (!enabled || !token) {
return [];
}
return [
{
id: 'test-key-id',
name: 'default',
token,
enabled: true,
accessType: 'all',
allowedGroups: [],
allowedServers: [],
},
];
}),
})),
})); }));
// Mock oauthBearer // Mock oauthBearer

View File

@@ -6,10 +6,10 @@ import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/
import { isInitializeRequest } from '@modelcontextprotocol/sdk/types.js';
import { deleteMcpServer, getMcpServer } from './mcpService.js';
import config from '../config/index.js';
-import { getBearerKeyDao, getGroupDao, getServerDao, getSystemConfigDao } from '../dao/index.js';
+import { getSystemConfigDao } from '../dao/index.js';
import { UserContextService } from './userContextService.js';
import { RequestContextService } from './requestContextService.js';
-import { IUser, BearerKey } from '../types/index.js';
+import { IUser } from '../types/index.js';
import { resolveOAuthUserFromToken } from '../utils/oauthBearer.js';
export const transports: { export const transports: {
@@ -30,187 +30,40 @@ type BearerAuthResult =
reason: 'missing' | 'invalid'; reason: 'missing' | 'invalid';
}; };
/**
* Check if a string is a valid UUID v4 format
*/
const isValidUUID = (str: string): boolean => {
const uuidRegex = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;
return uuidRegex.test(str);
};
const isBearerKeyAllowedForRequest = async (req: Request, key: BearerKey): Promise<boolean> => {
const paramValue = (req.params as any)?.group as string | undefined;
// accessType 'all' allows all requests
if (key.accessType === 'all') {
return true;
}
// No parameter value means global route
if (!paramValue) {
// Only accessType 'all' allows global routes
return false;
}
try {
const groupDao = getGroupDao();
const serverDao = getServerDao();
// Step 1: Try to match as a group (by name or id), since group has higher priority
let matchedGroup = await groupDao.findByName(paramValue);
if (!matchedGroup && isValidUUID(paramValue)) {
// Only try findById if the parameter is a valid UUID
matchedGroup = await groupDao.findById(paramValue);
}
if (matchedGroup) {
// Matched as a group
if (key.accessType === 'groups') {
// For group-scoped keys, check if the matched group is in allowedGroups
const allowedGroups = key.allowedGroups || [];
return allowedGroups.includes(matchedGroup.name) || allowedGroups.includes(matchedGroup.id);
}
if (key.accessType === 'servers') {
// For server-scoped keys, check if any server in the group is allowed
const allowedServers = key.allowedServers || [];
if (allowedServers.length === 0) {
return false;
}
if (!Array.isArray(matchedGroup.servers)) {
return false;
}
const groupServerNames = matchedGroup.servers.map((server) =>
typeof server === 'string' ? server : server.name,
);
return groupServerNames.some((name) => allowedServers.includes(name));
}
if (key.accessType === 'custom') {
// For custom-scoped keys, check if the group is allowed OR if any server in the group is allowed
const allowedGroups = key.allowedGroups || [];
const allowedServers = key.allowedServers || [];
// Check if the group itself is allowed
const groupAllowed =
allowedGroups.includes(matchedGroup.name) || allowedGroups.includes(matchedGroup.id);
if (groupAllowed) {
return true;
}
// Check if any server in the group is allowed
if (allowedServers.length > 0 && Array.isArray(matchedGroup.servers)) {
const groupServerNames = matchedGroup.servers.map((server) =>
typeof server === 'string' ? server : server.name,
);
return groupServerNames.some((name) => allowedServers.includes(name));
}
return false;
}
// Unknown accessType with matched group
return false;
}
// Step 2: Not a group, try to match as a server name
const matchedServer = await serverDao.findById(paramValue);
if (matchedServer) {
// Matched as a server
if (key.accessType === 'groups') {
// For group-scoped keys, server access is not allowed
return false;
}
if (key.accessType === 'servers' || key.accessType === 'custom') {
// For server-scoped or custom-scoped keys, check if the server is in allowedServers
const allowedServers = key.allowedServers || [];
return allowedServers.includes(matchedServer.name);
}
// Unknown accessType with matched server
return false;
}
// Step 3: Not a valid group or server, deny access
console.warn(
`Bearer key access denied: parameter '${paramValue}' does not match any group or server`,
);
return false;
} catch (error) {
console.error('Error checking bearer key request access:', error);
return false;
}
};
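
To make the scoping rules of the removed helper concrete, a hypothetical key and the requests it would admit. The id, name, token, and group name are illustrative, not taken from the repository; the type is the BearerKey interface removed from types/index.ts later in this diff.

import { BearerKey } from '../types/index.js';

const ciKey: BearerKey = {
  id: '6f1d2c3a-0000-4000-8000-000000000000', // placeholder UUID
  name: 'ci',
  token: 'example-token',
  enabled: true,
  accessType: 'groups',
  allowedGroups: ['search-tools'],
};
// With this key, a request whose :group parameter resolves to the 'search-tools' group passes,
// while the global route (no group parameter) and any other group or server name are rejected.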
const validateBearerAuth = async (req: Request): Promise<BearerAuthResult> => { const validateBearerAuth = async (req: Request): Promise<BearerAuthResult> => {
const bearerKeyDao = getBearerKeyDao(); const systemConfigDao = getSystemConfigDao();
const enabledKeys = await bearerKeyDao.findEnabled(); const systemConfig = await systemConfigDao.get();
const routingConfig = systemConfig?.routing || {
enableGlobalRoute: true,
enableGroupNameRoute: true,
enableBearerAuth: false,
bearerAuthKey: '',
};
if (routingConfig.enableBearerAuth) {
const authHeader = req.headers.authorization; const authHeader = req.headers.authorization;
const hasBearerHeader = !!authHeader && authHeader.startsWith('Bearer '); if (!authHeader || !authHeader.startsWith('Bearer ')) {
return { valid: false, reason: 'missing' };
// If no enabled keys are configured, bearer auth is effectively disabled.
// We still allow OAuth bearer tokens to attach user context in this case.
if (enabledKeys.length === 0) {
if (!hasBearerHeader) {
return { valid: true };
} }
const token = authHeader!.substring(7).trim(); const token = authHeader.substring(7); // Remove "Bearer " prefix
if (!token) { if (token.trim().length === 0) {
return { valid: false, reason: 'missing' };
}
if (token === routingConfig.bearerAuthKey) {
return { valid: true }; return { valid: true };
} }
const oauthUser = await resolveOAuthUserFromToken(token); const oauthUser = await resolveOAuthUserFromToken(token);
if (oauthUser) { if (oauthUser) {
console.log('Authenticated request using OAuth bearer token without configured keys');
return { valid: true, user: oauthUser }; return { valid: true, user: oauthUser };
} }
// When there are no keys, a non-OAuth bearer token should not block access
return { valid: true };
}
// When keys exist, bearer header is required
if (!hasBearerHeader) {
return { valid: false, reason: 'missing' };
}
const token = authHeader!.substring(7).trim();
if (!token) {
return { valid: false, reason: 'missing' };
}
// First, try to match a configured bearer key
const matchingKey = enabledKeys.find((key) => key.token === token);
if (matchingKey) {
const allowed = await isBearerKeyAllowedForRequest(req, matchingKey);
if (!allowed) {
console.warn(
`Bearer key rejected due to scope restrictions: id=${matchingKey.id}, name=${matchingKey.name}, accessType=${matchingKey.accessType}`,
);
return { valid: false, reason: 'invalid' }; return { valid: false, reason: 'invalid' };
} }
console.log(
`Bearer key authenticated: id=${matchingKey.id}, name=${matchingKey.name}, accessType=${matchingKey.accessType}`,
);
return { valid: true }; return { valid: true };
}
// Fallback: treat token as potential OAuth access token
const oauthUser = await resolveOAuthUserFromToken(token);
if (oauthUser) {
console.log('Authenticated request using OAuth bearer token (no matching static key)');
return { valid: true, user: oauthUser };
}
console.warn('Bearer authentication failed: token did not match any key or OAuth user');
return { valid: false, reason: 'invalid' };
}; };
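
Because the split-diff rendering interleaves both versions above, here is what the replacement validateBearerAuth appears to reduce to, pieced together from the added and unchanged lines of this hunk; indentation is approximate and should not be taken as the exact committed source.

const validateBearerAuth = async (req: Request): Promise<BearerAuthResult> => {
  const systemConfigDao = getSystemConfigDao();
  const systemConfig = await systemConfigDao.get();
  const routingConfig = systemConfig?.routing || {
    enableGlobalRoute: true,
    enableGroupNameRoute: true,
    enableBearerAuth: false,
    bearerAuthKey: '',
  };
  if (routingConfig.enableBearerAuth) {
    const authHeader = req.headers.authorization;
    if (!authHeader || !authHeader.startsWith('Bearer ')) {
      return { valid: false, reason: 'missing' };
    }
    const token = authHeader.substring(7); // Remove "Bearer " prefix
    if (token.trim().length === 0) {
      return { valid: false, reason: 'missing' };
    }
    if (token === routingConfig.bearerAuthKey) {
      return { valid: true };
    }
    const oauthUser = await resolveOAuthUserFromToken(token);
    if (oauthUser) {
      return { valid: true, user: oauthUser };
    }
    return { valid: false, reason: 'invalid' };
  }
  return { valid: true };
};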
const attachUserContextFromBearer = (result: BearerAuthResult, res: Response): void => { const attachUserContextFromBearer = (result: BearerAuthResult, res: Response): void => {
@@ -545,9 +398,9 @@ export const handleMcpPostRequest = async (req: Request, res: Response): Promise
    // Get filtered settings based on user context (after setting user context)
    const systemConfigDao = getSystemConfigDao();
    const systemConfig = await systemConfigDao.get();
-    const routingConfig = {
-      enableGlobalRoute: systemConfig?.routing?.enableGlobalRoute ?? true,
-      enableGroupNameRoute: systemConfig?.routing?.enableGroupNameRoute ?? true,
+    const routingConfig = systemConfig?.routing || {
+      enableGlobalRoute: true,
+      enableGroupNameRoute: true,
    };
    if (!group && !routingConfig.enableGlobalRoute) {
      res.status(403).send('Global routes are disabled. Please specify a group ID.');

View File

@@ -1,13 +1,13 @@
import { getRepositoryFactory } from '../db/index.js';
import { VectorEmbeddingRepository } from '../db/repositories/index.js';
import { Tool } from '../types/index.js';
-import { getAppDataSource, isDatabaseConnected, initializeDatabase } from '../db/connection.js';
+import { getAppDataSource, initializeDatabase } from '../db/connection.js';
import { getSmartRoutingConfig } from '../utils/smartRouting.js';
import OpenAI from 'openai';
// Get OpenAI configuration from smartRouting settings or fallback to environment variables
-const getOpenAIConfig = async () => {
-  const smartRoutingConfig = await getSmartRoutingConfig();
+const getOpenAIConfig = () => {
+  const smartRoutingConfig = getSmartRoutingConfig();
  return {
    apiKey: smartRoutingConfig.openaiApiKey,
    baseURL: smartRoutingConfig.openaiApiBaseUrl,
@@ -34,8 +34,8 @@ const getDimensionsForModel = (model: string): number => {
}; };
// Initialize the OpenAI client with smartRouting configuration // Initialize the OpenAI client with smartRouting configuration
const getOpenAIClient = async () => { const getOpenAIClient = () => {
const config = await getOpenAIConfig(); const config = getOpenAIConfig();
return new OpenAI({ return new OpenAI({
apiKey: config.apiKey, // Get API key from smartRouting settings or environment variables apiKey: config.apiKey, // Get API key from smartRouting settings or environment variables
baseURL: config.baseURL, // Get base URL from smartRouting settings or fallback to default baseURL: config.baseURL, // Get base URL from smartRouting settings or fallback to default
@@ -53,8 +53,9 @@ const getOpenAIClient = async () => {
* @returns Promise with vector embedding as number array * @returns Promise with vector embedding as number array
*/ */
async function generateEmbedding(text: string): Promise<number[]> { async function generateEmbedding(text: string): Promise<number[]> {
const config = await getOpenAIConfig(); try {
const openai = await getOpenAIClient(); const config = getOpenAIConfig();
const openai = getOpenAIClient();
// Check if API key is configured // Check if API key is configured
if (!openai.apiKey) { if (!openai.apiKey) {
@@ -73,6 +74,11 @@ async function generateEmbedding(text: string): Promise<number[]> {
// Return the embedding // Return the embedding
return response.data[0].embedding; return response.data[0].embedding;
} catch (error) {
console.error('Error generating embedding:', error);
console.warn('Falling back to simple embedding method');
return generateFallbackEmbedding(text);
}
} }
/** /**
@@ -192,18 +198,12 @@ export const saveToolsAsVectorEmbeddings = async (
return; return;
} }
const smartRoutingConfig = await getSmartRoutingConfig(); const smartRoutingConfig = getSmartRoutingConfig();
if (!smartRoutingConfig.enabled) { if (!smartRoutingConfig.enabled) {
return; return;
} }
// Ensure database is initialized before using repository const config = getOpenAIConfig();
if (!isDatabaseConnected()) {
console.info('Database not initialized, initializing...');
await initializeDatabase();
}
const config = await getOpenAIConfig();
const vectorRepository = getRepositoryFactory( const vectorRepository = getRepositoryFactory(
'vectorEmbeddings', 'vectorEmbeddings',
)() as VectorEmbeddingRepository; )() as VectorEmbeddingRepository;
@@ -227,6 +227,7 @@ export const saveToolsAsVectorEmbeddings = async (
.filter(Boolean) .filter(Boolean)
.join(' '); .join(' ');
try {
// Generate embedding // Generate embedding
const embedding = await generateEmbedding(searchableText); const embedding = await generateEmbedding(searchableText);
@@ -247,11 +248,15 @@ export const saveToolsAsVectorEmbeddings = async (
}, },
config.embeddingModel, // Store the model used for this embedding config.embeddingModel, // Store the model used for this embedding
); );
} catch (toolError) {
console.error(`Error processing tool ${tool.name} for server ${serverName}:`, toolError);
// Continue with the next tool rather than failing the whole batch
}
} }
console.log(`Saved ${tools.length} tool embeddings for server: ${serverName}`); console.log(`Saved ${tools.length} tool embeddings for server: ${serverName}`);
} catch (error) { } catch (error) {
-    console.error(`Error saving tool embeddings for server ${serverName}:${error}`);
+    console.error(`Error saving tool embeddings for server ${serverName}:`, error);
} }
}; };
@@ -376,7 +381,7 @@ export const getAllVectorizedTools = async (
}> }>
> => { > => {
try { try {
const config = await getOpenAIConfig(); const config = getOpenAIConfig();
const vectorRepository = getRepositoryFactory( const vectorRepository = getRepositoryFactory(
'vectorEmbeddings', 'vectorEmbeddings',
)() as VectorEmbeddingRepository; )() as VectorEmbeddingRepository;

View File

@@ -243,19 +243,6 @@ export interface OAuthServerConfig {
}; };
} }
// Bearer authentication key configuration
export type BearerKeyAccessType = 'all' | 'groups' | 'servers' | 'custom';
export interface BearerKey {
id: string; // Unique identifier for the key
name: string; // Human readable key name
token: string; // Bearer token value
enabled: boolean; // Whether this key is enabled
accessType: BearerKeyAccessType; // Access scope type
allowedGroups?: string[]; // Allowed group names when accessType === 'groups' or 'custom'
allowedServers?: string[]; // Allowed server names when accessType === 'servers' or 'custom'
}
// Represents the settings for MCP servers // Represents the settings for MCP servers
export interface McpSettings { export interface McpSettings {
users?: IUser[]; // Array of user credentials and permissions users?: IUser[]; // Array of user credentials and permissions
@@ -267,7 +254,6 @@ export interface McpSettings {
userConfigs?: Record<string, UserConfig>; // User-specific configurations userConfigs?: Record<string, UserConfig>; // User-specific configurations
oauthClients?: IOAuthClient[]; // OAuth clients for MCPHub's authorization server oauthClients?: IOAuthClient[]; // OAuth clients for MCPHub's authorization server
oauthTokens?: IOAuthToken[]; // Persisted OAuth tokens (access + refresh) for authorization server oauthTokens?: IOAuthToken[]; // Persisted OAuth tokens (access + refresh) for authorization server
bearerKeys?: BearerKey[]; // Bearer authentication keys (multi-key configuration)
} }
// Configuration details for an individual server // Configuration details for an individual server
@@ -434,98 +420,3 @@ export interface AddServerRequest {
name: string; // Name of the server to add name: string; // Name of the server to add
config: ServerConfig; // Configuration details for the server config: ServerConfig; // Configuration details for the server
} }
// Request payload for batch creating servers
export interface BatchCreateServersRequest {
servers: AddServerRequest[]; // Array of servers to create
}
// Result for a single server in batch operation
export interface BatchServerResult {
name: string; // Server name
success: boolean; // Whether the operation succeeded
message?: string; // Error message if failed
}
// Response for batch create servers operation
export interface BatchCreateServersResponse {
success: boolean; // Overall operation success (true if at least one server succeeded)
successCount: number; // Number of servers successfully created
failureCount: number; // Number of servers that failed
results: BatchServerResult[]; // Detailed results for each server
}
// Request payload for adding a new group
export interface AddGroupRequest {
name: string; // Name of the group to add
description?: string; // Optional description of the group
servers?: string[] | IGroupServerConfig[]; // Array of server names or server configurations
}
// Request payload for batch creating groups
export interface BatchCreateGroupsRequest {
groups: AddGroupRequest[]; // Array of groups to create
}
// Result for a single group in batch operation
export interface BatchGroupResult {
name: string; // Group name
success: boolean; // Whether the operation succeeded
message?: string; // Error message if failed
}
// Response for batch create groups operation
export interface BatchCreateGroupsResponse {
success: boolean; // Overall operation success (true if at least one group succeeded)
successCount: number; // Number of groups successfully created
failureCount: number; // Number of groups that failed
results: BatchGroupResult[]; // Detailed results for each group
}
// Tool call activity interface for logging tool invocations (DB mode only)
export interface IToolCallActivity {
id?: string;
serverName: string;
toolName: string;
keyId?: string;
keyName?: string;
status: 'pending' | 'success' | 'error';
request?: string;
response?: string;
errorMessage?: string;
durationMs?: number;
clientIp?: string;
sessionId?: string;
groupName?: string;
createdAt?: Date;
}
// Tool call activity search parameters
export interface IToolCallActivitySearchParams {
serverName?: string;
toolName?: string;
keyId?: string;
status?: 'pending' | 'success' | 'error';
groupName?: string;
startDate?: Date;
endDate?: Date;
searchQuery?: string;
}
// Tool call activity pagination result
export interface IToolCallActivityPage {
items: IToolCallActivity[];
total: number;
page: number;
pageSize: number;
totalPages: number;
}
// Tool call activity statistics
export interface IToolCallActivityStats {
total: number;
success: number;
error: number;
pending: number;
avgDurationMs: number;
}

View File

@@ -1,122 +0,0 @@
import { jest } from '@jest/globals';
// Mocks must be defined before importing the module under test.
const initializeDatabaseMock = jest.fn(async () => undefined);
jest.mock('../db/connection.js', () => ({
initializeDatabase: initializeDatabaseMock,
}));
const setDaoFactoryMock = jest.fn();
jest.mock('../dao/DaoFactory.js', () => ({
setDaoFactory: setDaoFactoryMock,
}));
jest.mock('../dao/DatabaseDaoFactory.js', () => ({
DatabaseDaoFactory: {
getInstance: jest.fn(() => ({
/* noop */
})),
},
}));
const loadOriginalSettingsMock = jest.fn(() => ({ users: [] }));
jest.mock('../config/index.js', () => ({
loadOriginalSettings: loadOriginalSettingsMock,
}));
const userRepoCountMock = jest.fn<() => Promise<number>>();
jest.mock('../db/repositories/UserRepository.js', () => ({
UserRepository: jest.fn().mockImplementation(() => ({
count: userRepoCountMock,
})),
}));
const bearerKeyCountMock = jest.fn<() => Promise<number>>();
const bearerKeyCreateMock =
jest.fn<
(data: {
name: string;
token: string;
enabled: boolean;
accessType: string;
allowedGroups: string[];
allowedServers: string[];
}) => Promise<unknown>
>();
jest.mock('../db/repositories/BearerKeyRepository.js', () => ({
BearerKeyRepository: jest.fn().mockImplementation(() => ({
count: bearerKeyCountMock,
create: bearerKeyCreateMock,
})),
}));
const systemConfigGetMock = jest.fn<() => Promise<any>>();
jest.mock('../db/repositories/SystemConfigRepository.js', () => ({
SystemConfigRepository: jest.fn().mockImplementation(() => ({
get: systemConfigGetMock,
})),
}));
describe('initializeDatabaseMode legacy bearer auth migration', () => {
beforeEach(() => {
jest.clearAllMocks();
});
it('skips legacy migration when bearerKeys table already has data', async () => {
userRepoCountMock.mockResolvedValue(1);
bearerKeyCountMock.mockResolvedValue(2);
systemConfigGetMock.mockResolvedValue({
routing: { enableBearerAuth: true, bearerAuthKey: 'db-key' },
});
const { initializeDatabaseMode } = await import('./migration.js');
const ok = await initializeDatabaseMode();
expect(ok).toBe(true);
expect(initializeDatabaseMock).toHaveBeenCalled();
expect(loadOriginalSettingsMock).not.toHaveBeenCalled();
expect(systemConfigGetMock).not.toHaveBeenCalled();
expect(bearerKeyCreateMock).not.toHaveBeenCalled();
});
it('migrates legacy routing bearerAuthKey into bearerKeys when users exist and keys table is empty', async () => {
userRepoCountMock.mockResolvedValue(3);
bearerKeyCountMock.mockResolvedValue(0);
systemConfigGetMock.mockResolvedValue({
routing: { enableBearerAuth: true, bearerAuthKey: 'db-key' },
});
const { initializeDatabaseMode } = await import('./migration.js');
const ok = await initializeDatabaseMode();
expect(ok).toBe(true);
expect(loadOriginalSettingsMock).not.toHaveBeenCalled();
expect(systemConfigGetMock).toHaveBeenCalledTimes(1);
expect(bearerKeyCreateMock).toHaveBeenCalledTimes(1);
expect(bearerKeyCreateMock).toHaveBeenCalledWith({
name: 'default',
token: 'db-key',
enabled: true,
accessType: 'all',
allowedGroups: [],
allowedServers: [],
});
});
it('does not migrate when routing has no bearerAuthKey', async () => {
userRepoCountMock.mockResolvedValue(1);
bearerKeyCountMock.mockResolvedValue(0);
systemConfigGetMock.mockResolvedValue({
routing: { enableBearerAuth: true, bearerAuthKey: ' ' },
});
const { initializeDatabaseMode } = await import('./migration.js');
const ok = await initializeDatabaseMode();
expect(ok).toBe(true);
expect(loadOriginalSettingsMock).not.toHaveBeenCalled();
expect(systemConfigGetMock).toHaveBeenCalledTimes(1);
expect(bearerKeyCreateMock).not.toHaveBeenCalled();
});
});

View File

@@ -9,7 +9,6 @@ import { SystemConfigRepository } from '../db/repositories/SystemConfigRepositor
import { UserConfigRepository } from '../db/repositories/UserConfigRepository.js'; import { UserConfigRepository } from '../db/repositories/UserConfigRepository.js';
import { OAuthClientRepository } from '../db/repositories/OAuthClientRepository.js'; import { OAuthClientRepository } from '../db/repositories/OAuthClientRepository.js';
import { OAuthTokenRepository } from '../db/repositories/OAuthTokenRepository.js'; import { OAuthTokenRepository } from '../db/repositories/OAuthTokenRepository.js';
import { BearerKeyRepository } from '../db/repositories/BearerKeyRepository.js';
/** /**
* Migrate from file-based configuration to database * Migrate from file-based configuration to database
@@ -34,7 +33,6 @@ export async function migrateToDatabase(): Promise<boolean> {
const userConfigRepo = new UserConfigRepository(); const userConfigRepo = new UserConfigRepository();
const oauthClientRepo = new OAuthClientRepository(); const oauthClientRepo = new OAuthClientRepository();
const oauthTokenRepo = new OAuthTokenRepository(); const oauthTokenRepo = new OAuthTokenRepository();
const bearerKeyRepo = new BearerKeyRepository();
// Migrate users // Migrate users
if (settings.users && settings.users.length > 0) { if (settings.users && settings.users.length > 0) {
@@ -77,7 +75,6 @@ export async function migrateToDatabase(): Promise<boolean> {
prompts: config.prompts, prompts: config.prompts,
options: config.options, options: config.options,
oauth: config.oauth, oauth: config.oauth,
openapi: config.openapi,
}); });
console.log(` - Created server: ${name}`); console.log(` - Created server: ${name}`);
} else { } else {
@@ -122,52 +119,6 @@ export async function migrateToDatabase(): Promise<boolean> {
console.log(' - System configuration updated'); console.log(' - System configuration updated');
} }
// Migrate bearer auth keys
console.log('Migrating bearer authentication keys...');
// Prefer explicit bearerKeys if present in settings
if (Array.isArray(settings.bearerKeys) && settings.bearerKeys.length > 0) {
for (const key of settings.bearerKeys) {
await bearerKeyRepo.create({
name: key.name,
token: key.token,
enabled: key.enabled,
accessType: key.accessType,
allowedGroups: key.allowedGroups ?? [],
allowedServers: key.allowedServers ?? [],
} as any);
console.log(` - Migrated bearer key: ${key.name} (${key.id ?? 'no-id'})`);
}
} else if (settings.systemConfig?.routing) {
// Fallback to legacy routing.enableBearerAuth / bearerAuthKey
const routing = settings.systemConfig.routing as any;
const enableBearerAuth: boolean = !!routing.enableBearerAuth;
const rawKey: string = (routing.bearerAuthKey || '').trim();
// Migration rules:
// 1) enable=false, key empty -> no keys
// 2) enable=false, key present -> one disabled key (name=default)
// 3) enable=true, key present -> one enabled key (name=default)
// 4) enable=true, key empty -> no keys
if (rawKey) {
await bearerKeyRepo.create({
name: 'default',
token: rawKey,
enabled: enableBearerAuth,
accessType: 'all',
allowedGroups: [],
allowedServers: [],
} as any);
console.log(
` - Migrated legacy bearer auth config to key: default (enabled=${enableBearerAuth})`,
);
} else {
console.log(' - No legacy bearer auth key found, skipping bearer key migration');
}
} else {
console.log(' - No bearer auth configuration found, skipping bearer key migration');
}
// Migrate user configs // Migrate user configs
if (settings.userConfigs) { if (settings.userConfigs) {
const usernames = Object.keys(settings.userConfigs); const usernames = Object.keys(settings.userConfigs);
@@ -255,9 +206,6 @@ export async function initializeDatabaseMode(): Promise<boolean> {
// Check if migration is needed // Check if migration is needed
const userRepo = new UserRepository(); const userRepo = new UserRepository();
const bearerKeyRepo = new BearerKeyRepository();
const systemConfigRepo = new SystemConfigRepository();
const userCount = await userRepo.count(); const userCount = await userRepo.count();
if (userCount === 0) { if (userCount === 0) {
@@ -268,36 +216,6 @@ export async function initializeDatabaseMode(): Promise<boolean> {
} }
} else { } else {
console.log(`Database already contains ${userCount} users, skipping migration`); console.log(`Database already contains ${userCount} users, skipping migration`);
// One-time migration for legacy bearer auth config stored inside DB routing settings.
// If bearerKeys table already has data, do nothing.
const bearerKeyCount = await bearerKeyRepo.count();
if (bearerKeyCount > 0) {
console.log(
`Bearer keys table already contains ${bearerKeyCount} keys, skipping legacy bearer auth migration`,
);
} else {
const systemConfig = await systemConfigRepo.get();
const routing = (systemConfig as any)?.routing || {};
const enableBearerAuth: boolean = !!routing.enableBearerAuth;
const rawKey: string = (routing.bearerAuthKey || '').trim();
if (rawKey) {
await bearerKeyRepo.create({
name: 'default',
token: rawKey,
enabled: enableBearerAuth,
accessType: 'all',
allowedGroups: [],
allowedServers: [],
} as any);
console.log(
` - Migrated legacy DB routing bearer auth config to key: default (enabled=${enableBearerAuth})`,
);
} else {
console.log('No legacy DB routing bearer auth key found, skipping bearer key migration');
}
}
} }
console.log('✅ Database mode initialized successfully'); console.log('✅ Database mode initialized successfully');

View File

@@ -1,5 +1,4 @@
-import { expandEnvVars } from '../config/index.js';
-import { getSystemConfigDao } from '../dao/DaoFactory.js';
+import { loadSettings, expandEnvVars } from '../config/index.js';
/**
 * Smart routing configuration interface
@@ -23,11 +22,10 @@ export interface SmartRoutingConfig {
 *
 * @returns {SmartRoutingConfig} Complete smart routing configuration
 */
-export async function getSmartRoutingConfig(): Promise<SmartRoutingConfig> {
-  // Get system config from DAO
-  const systemConfigDao = getSystemConfigDao();
-  const systemConfig = await systemConfigDao.get();
-  const smartRoutingSettings: Partial<SmartRoutingConfig> = systemConfig.smartRouting || {};
+export function getSmartRoutingConfig(): SmartRoutingConfig {
+  const settings = loadSettings();
+  const smartRoutingSettings: Partial<SmartRoutingConfig> =
+    settings.systemConfig?.smartRouting || {};
  return {
    // Enabled status - check multiple environment variables
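
Callers now read this configuration synchronously rather than awaiting it; a minimal usage sketch, with field names taken from the vectorSearchService changes above and the log message purely illustrative.

import { getSmartRoutingConfig } from '../utils/smartRouting.js';

const smartRouting = getSmartRoutingConfig();
if (smartRouting.enabled && smartRouting.openaiApiKey) {
  // Safe to construct the OpenAI client used for embedding generation
  console.log(`Smart routing enabled, base URL: ${smartRouting.openaiApiBaseUrl}`);
}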

View File

@@ -1,7 +1,11 @@
 import { getMcpSettingsJson } from '../../src/controllers/configController.js';
+import * as config from '../../src/config/index.js';
 import * as DaoFactory from '../../src/dao/DaoFactory.js';
 import { Request, Response } from 'express';

+// Mock the config module
+jest.mock('../../src/config/index.js');
+
+// Mock the DaoFactory module
 jest.mock('../../src/dao/DaoFactory.js');

 describe('ConfigController - getMcpSettingsJson', () => {
@@ -9,18 +13,9 @@ describe('ConfigController - getMcpSettingsJson', () => {
   let mockResponse: Partial<Response>;
   let mockJson: jest.Mock;
   let mockStatus: jest.Mock;
-  let mockServerDao: { findById: jest.Mock; findAll: jest.Mock };
-  let mockUserDao: { findAll: jest.Mock };
-  let mockGroupDao: { findAll: jest.Mock };
-  let mockSystemConfigDao: { get: jest.Mock };
-  let mockUserConfigDao: { getAll: jest.Mock };
-  let mockOAuthClientDao: { findAll: jest.Mock };
-  let mockOAuthTokenDao: { findAll: jest.Mock };
-  let mockBearerKeyDao: { findAll: jest.Mock };
+  let mockServerDao: { findById: jest.Mock };

   beforeEach(() => {
-    jest.clearAllMocks();
     mockJson = jest.fn();
     mockStatus = jest.fn().mockReturnThis();
     mockRequest = {
@@ -30,28 +25,40 @@ describe('ConfigController - getMcpSettingsJson', () => {
       json: mockJson,
       status: mockStatus,
     };

     mockServerDao = {
       findById: jest.fn(),
-      findAll: jest.fn(),
     };
-    mockUserDao = { findAll: jest.fn() };
-    mockGroupDao = { findAll: jest.fn() };
-    mockSystemConfigDao = { get: jest.fn() };
-    mockUserConfigDao = { getAll: jest.fn() };
-    mockOAuthClientDao = { findAll: jest.fn() };
-    mockOAuthTokenDao = { findAll: jest.fn() };
-    mockBearerKeyDao = { findAll: jest.fn() };

-    // Wire DaoFactory convenience functions to our mocks
-    (DaoFactory.getServerDao as unknown as jest.Mock).mockReturnValue(mockServerDao);
-    (DaoFactory.getUserDao as unknown as jest.Mock).mockReturnValue(mockUserDao);
-    (DaoFactory.getGroupDao as unknown as jest.Mock).mockReturnValue(mockGroupDao);
-    (DaoFactory.getSystemConfigDao as unknown as jest.Mock).mockReturnValue(mockSystemConfigDao);
-    (DaoFactory.getUserConfigDao as unknown as jest.Mock).mockReturnValue(mockUserConfigDao);
-    (DaoFactory.getOAuthClientDao as unknown as jest.Mock).mockReturnValue(mockOAuthClientDao);
-    (DaoFactory.getOAuthTokenDao as unknown as jest.Mock).mockReturnValue(mockOAuthTokenDao);
-    (DaoFactory.getBearerKeyDao as unknown as jest.Mock).mockReturnValue(mockBearerKeyDao);
+    // Setup ServerDao mock
+    (DaoFactory.getServerDao as jest.Mock).mockReturnValue(mockServerDao);
+
+    // Reset mocks
+    jest.clearAllMocks();
   });

+  describe('Full Settings Export', () => {
+    it('should handle settings without users array', async () => {
+      const mockSettings = {
+        mcpServers: {
+          'test-server': {
+            command: 'test',
+            args: ['--test'],
+          },
+        },
+      };
+
+      (config.loadOriginalSettings as jest.Mock).mockReturnValue(mockSettings);
+
+      await getMcpSettingsJson(mockRequest as Request, mockResponse as Response);
+
+      expect(mockJson).toHaveBeenCalledWith({
+        success: true,
+        data: {
+          mcpServers: mockSettings.mcpServers,
+          users: undefined,
+        },
+      });
+    });
+  });

   describe('Individual Server Export', () => {
@@ -139,14 +146,10 @@ describe('ConfigController - getMcpSettingsJson', () => {
   describe('Error Handling', () => {
     it('should handle errors gracefully and return 500', async () => {
-      mockServerDao.findAll.mockRejectedValue(new Error('boom'));
-      mockUserDao.findAll.mockResolvedValue([]);
-      mockGroupDao.findAll.mockResolvedValue([]);
-      mockSystemConfigDao.get.mockResolvedValue({});
-      mockUserConfigDao.getAll.mockResolvedValue({});
-      mockOAuthClientDao.findAll.mockResolvedValue([]);
-      mockOAuthTokenDao.findAll.mockResolvedValue([]);
-      mockBearerKeyDao.findAll.mockResolvedValue([]);
+      const errorMessage = 'Failed to load settings';
+      (config.loadOriginalSettings as jest.Mock).mockImplementation(() => {
+        throw new Error(errorMessage);
+      });

       await getMcpSettingsJson(mockRequest as Request, mockResponse as Response);

View File

@@ -1,97 +0,0 @@
import fs from 'fs';
import os from 'os';
import path from 'path';

import { BearerKeyDaoImpl } from '../../src/dao/BearerKeyDao.js';

const writeSettings = (settingsPath: string, settings: unknown): void => {
  fs.writeFileSync(settingsPath, JSON.stringify(settings, null, 2), 'utf8');
};

describe('BearerKeyDaoImpl migration + settings caching behavior', () => {
  let tmpDir: string;
  let settingsPath: string;
  let originalSettingsEnv: string | undefined;

  beforeEach(() => {
    tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), 'mcphub-bearer-keys-'));
    settingsPath = path.join(tmpDir, 'mcp_settings.json');
    originalSettingsEnv = process.env.MCPHUB_SETTING_PATH;
    process.env.MCPHUB_SETTING_PATH = settingsPath;
  });

  afterEach(() => {
    if (originalSettingsEnv === undefined) {
      delete process.env.MCPHUB_SETTING_PATH;
    } else {
      process.env.MCPHUB_SETTING_PATH = originalSettingsEnv;
    }

    try {
      fs.rmSync(tmpDir, { recursive: true, force: true });
    } catch {
      // ignore cleanup errors
    }
  });

  it('does not rewrite settings when bearerKeys exists as an empty array', async () => {
    writeSettings(settingsPath, {
      mcpServers: {},
      users: [],
      systemConfig: {
        routing: {
          enableBearerAuth: false,
          bearerAuthKey: '',
        },
      },
      bearerKeys: [],
    });

    const writeSpy = jest.spyOn(fs, 'writeFileSync');

    const dao = new BearerKeyDaoImpl();
    const enabled1 = await dao.findEnabled();
    const enabled2 = await dao.findEnabled();

    expect(enabled1).toEqual([]);
    expect(enabled2).toEqual([]);

    // The DAO should NOT persist anything because bearerKeys already exists.
    expect(writeSpy).not.toHaveBeenCalled();

    writeSpy.mockRestore();
  });

  it('migrates legacy bearerAuthKey only once', async () => {
    writeSettings(settingsPath, {
      mcpServers: {},
      users: [],
      systemConfig: {
        routing: {
          enableBearerAuth: true,
          bearerAuthKey: 'legacy-token',
        },
      },
      // bearerKeys is intentionally missing to trigger migration
    });

    const writeSpy = jest.spyOn(fs, 'writeFileSync');

    const dao = new BearerKeyDaoImpl();

    const enabled1 = await dao.findEnabled();
    expect(enabled1).toHaveLength(1);
    expect(enabled1[0].token).toBe('legacy-token');
    expect(enabled1[0].enabled).toBe(true);

    const enabled2 = await dao.findEnabled();
    expect(enabled2).toHaveLength(1);
    expect(enabled2[0].token).toBe('legacy-token');

    // One write for the migration, no further writes on subsequent reads.
    expect(writeSpy).toHaveBeenCalledTimes(1);

    writeSpy.mockRestore();
  });
});

View File

@@ -31,28 +31,14 @@ jest.mock('../../src/utils/oauthBearer.js', () => ({
   resolveOAuthUserFromToken: jest.fn(),
 }));

-// Mock DAO accessors used by sseService (avoid file-based DAOs and migrations)
-jest.mock('../../src/dao/index.js', () => ({
-  getBearerKeyDao: jest.fn(),
-  getGroupDao: jest.fn(),
-  getSystemConfigDao: jest.fn(),
-}));
-
-// Mock config module default export used by sseService
-jest.mock('../../src/config/index.js', () => ({
-  __esModule: true,
-  default: { basePath: '' },
-  loadSettings: jest.fn(),
-}));
-
 import { Request, Response } from 'express';
 import { handleSseConnection, transports } from '../../src/services/sseService.js';
 import * as mcpService from '../../src/services/mcpService.js';
 import * as configModule from '../../src/config/index.js';
-import * as daoIndex from '../../src/dao/index.js';

 // Mock remaining dependencies
 jest.mock('../../src/services/mcpService.js');
+jest.mock('../../src/config/index.js');

 // Mock UserContextService with getInstance pattern
 const mockUserContextService = {
@@ -155,24 +141,6 @@ describe('Keepalive Functionality', () => {
     };
     (mcpService.getMcpServer as jest.Mock).mockReturnValue(mockMcpServer);

-    // Mock bearer key + system config DAOs used by sseService
-    const mockBearerKeyDao = {
-      findEnabled: jest.fn().mockResolvedValue([]),
-    };
-    (daoIndex.getBearerKeyDao as unknown as jest.Mock).mockReturnValue(mockBearerKeyDao);
-
-    const mockSystemConfigDao = {
-      get: jest.fn().mockResolvedValue({
-        routing: {
-          enableGlobalRoute: true,
-          enableGroupNameRoute: true,
-          enableBearerAuth: false,
-          bearerAuthKey: '',
-        },
-      }),
-    };
-    (daoIndex.getSystemConfigDao as unknown as jest.Mock).mockReturnValue(mockSystemConfigDao);
-
     // Mock loadSettings
     (configModule.loadSettings as jest.Mock).mockReturnValue({
       systemConfig: {