Generate comprehensive GitHub Copilot instructions for MCPHub development (#314)

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: samanhappy <2755122+samanhappy@users.noreply.github.com>
Author: Copilot
Date: 2025-08-27 21:58:06 +08:00
Committed by: GitHub
Parent: f577351f04
Commit: 5dd3e7978e
13 changed files with 395 additions and 174 deletions

View File

@@ -1,50 +1,237 @@
# MCPHub Coding Instructions
**ALWAYS follow these instructions first and only fallback to additional search and context gathering if the information here is incomplete or found to be in error.**
## Project Overview
MCPHub is a TypeScript/Node.js MCP server management hub that provides unified access through HTTP endpoints.
MCPHub is a TypeScript/Node.js MCP (Model Context Protocol) server management hub that provides unified access through HTTP endpoints. It serves as a centralized dashboard for managing multiple MCP servers with real-time monitoring, authentication, and flexible routing.
**Core Components:**
- **Backend**: Express.js + TypeScript + ESM (`src/server.ts`)
- **Frontend**: React/Vite + Tailwind CSS (`frontend/`)
- **MCP Integration**: Connects multiple MCP servers (`src/services/mcpService.ts`)
- **Authentication**: JWT-based with bcrypt password hashing
- **Configuration**: JSON-based MCP server definitions (`mcp_settings.json`)
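A minimal sketch of what `mcp_settings.json` roughly contains, written here as a TypeScript object literal (field names are inferred from how the settings are used elsewhere in the codebase; the server entry and package name are purely illustrative):

```typescript
// Illustrative only — consult src/types/index.ts (McpSettings) for the authoritative schema.
const exampleSettings = {
  mcpServers: {
    'example-server': {
      command: 'npx', // executable used to launch the MCP server process
      args: ['-y', 'some-mcp-server'], // hypothetical package name
      enabled: true,
    },
  },
  users: [
    // Default development credentials are admin/admin123; passwords are stored bcrypt-hashed.
    { username: 'admin', isAdmin: true },
  ],
};
```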
## Development Environment
## Working Effectively
### Bootstrap and Setup (CRITICAL - Follow Exact Steps)
```bash
# Install pnpm if not available
npm install -g pnpm
# Install dependencies - takes ~30 seconds
pnpm install
pnpm dev # Start both backend and frontend
pnpm backend:dev # Backend only
pnpm frontend:dev # Frontend only
# Setup environment (optional)
cp .env.example .env
# Build and test to verify setup
pnpm lint # ~3 seconds - NEVER CANCEL
pnpm backend:build # ~5 seconds - NEVER CANCEL
pnpm test:ci # ~16 seconds - NEVER CANCEL. Set timeout to 60+ seconds
pnpm frontend:build # ~5 seconds - NEVER CANCEL
pnpm build # ~10 seconds total - NEVER CANCEL. Set timeout to 60+ seconds
```
## Project Conventions
**CRITICAL TIMING**: These commands are fast but NEVER CANCEL them. Always wait for completion.
### File Structure
### Development Environment
- `src/services/` - Core business logic
- `src/controllers/` - HTTP request handlers
- `src/types/index.ts` - TypeScript type definitions
```bash
# Start both backend and frontend (recommended for most development)
pnpm dev # Backend on :3001, Frontend on :5173
# OR start separately (required on Windows, optional on Linux/macOS)
# Terminal 1: Backend only
pnpm backend:dev # Runs on port 3000 (or PORT env var)
# Terminal 2: Frontend only
pnpm frontend:dev # Runs on port 5173, proxies API to backend
```
**NEVER CANCEL**: Development servers may take 10-15 seconds to fully initialize all MCP servers.
### Build Commands (Production)
```bash
# Full production build - takes ~10 seconds total
pnpm build # NEVER CANCEL - Set timeout to 60+ seconds
# Individual builds
pnpm backend:build # TypeScript compilation - ~5 seconds
pnpm frontend:build # Vite build - ~5 seconds
# Start production server
pnpm start # Requires dist/ and frontend/dist/ to exist
```
### Testing and Validation
```bash
# Run all tests - takes ~16 seconds with 73 tests
pnpm test:ci # NEVER CANCEL - Set timeout to 60+ seconds
# Development testing
pnpm test # Interactive mode
pnpm test:watch # Watch mode for development
pnpm test:coverage # With coverage report
# Code quality
pnpm lint # ESLint - ~3 seconds
pnpm format # Prettier formatting - ~3 seconds
```
**CRITICAL**: All tests MUST pass before committing. Do not modify tests to make them pass unless specifically required for your changes.
## Manual Validation Requirements
**ALWAYS perform these validation steps after making changes:**
### 1. Basic Application Functionality
```bash
# Start the application
pnpm dev
# Verify backend responds (in another terminal)
curl http://localhost:3000/api/health
# Expected: returns a health status response
# Verify frontend serves
curl -I http://localhost:3000/
# Expected: HTTP 200 OK with HTML content
```
### 2. MCP Server Integration Test
```bash
# Check MCP servers are loading (look for log messages)
# Expected log output should include:
# - "Successfully connected client for server: [name]"
# - "Successfully listed [N] tools for server: [name]"
# - Some servers may fail due to missing API keys (normal in dev)
```
### 3. Build Verification
```bash
# Verify production build works
pnpm build
node scripts/verify-dist.js
# Expected: "✅ Verification passed! Frontend and backend dist files are present."
```
**NEVER skip these validation steps**. If any fail, debug and fix before proceeding.
## Project Structure and Key Files
### Critical Backend Files
- `src/index.ts` - Application entry point
- `src/server.ts` - Express server setup and middleware
- `src/services/mcpService.ts` - **Core MCP server management logic**
- `src/config/index.ts` - Configuration management
- `src/routes/` - HTTP route definitions
- `src/controllers/` - HTTP request handlers
- `src/dao/` - Data access layer for users, groups, servers
- `src/types/index.ts` - TypeScript type definitions
### Key Notes
### Critical Frontend Files
- `frontend/src/` - React application source
- `frontend/src/pages/` - Page components (development entry point)
- `frontend/src/components/` - Reusable UI components
- Use ESM modules: Import with `.js` extensions, not `.ts`
- Configuration file: `mcp_settings.json`
- Endpoint formats: `/mcp/{group|server}` and `/mcp/$smart`
- All code comments must be written in English
- Frontend uses i18n with resource files in `locales/` folder
- Server-side code should use appropriate abstraction layers for extensibility and replaceability
### Configuration Files
- `mcp_settings.json` - **MCP server definitions and user accounts**
- `package.json` - Dependencies and scripts
- `tsconfig.json` - TypeScript configuration
- `jest.config.cjs` - Test configuration
- `.eslintrc.json` - Linting rules
## Development Process
### Docker and Deployment
- `Dockerfile` - Multi-stage build with Python base + Node.js
- `entrypoint.sh` - Docker startup script
- `bin/cli.js` - NPM package CLI entry point
- For complex features, implement step by step and wait for confirmation before proceeding to the next step
- After implementing features, no separate summary documentation is needed; update README.md and README.zh.md as appropriate
## Development Process and Conventions
### Code Style Requirements
- **ESM modules**: Always use `.js` extensions in imports, not `.ts`
- **English only**: All code comments must be written in English
- **TypeScript strict**: Follow strict type checking rules
- **Import style**: `import { something } from './file.js'` (note .js extension)
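A typical import and usage following these conventions (the path and function name are illustrative; the point is the `.js` extension and the explicit return type):

```typescript
// Note the .js extension even though the source file is .ts — required for ESM output.
import { loadSettings } from '../config/configManager.js';

export async function listConfiguredServerNames(): Promise<string[]> {
  const settings = await loadSettings();
  return Object.keys(settings.mcpServers ?? {});
}
```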
### Key Configuration Notes
- **MCP servers**: Defined in `mcp_settings.json` with command/args
- **Endpoints**: `/mcp/{group|server}` and `/mcp/$smart` for routing
- **i18n**: Frontend uses react-i18next with files in `locales/` folder
- **Authentication**: JWT tokens with bcrypt password hashing
- **Default credentials**: admin/admin123 (configured in mcp_settings.json)
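As a rough smoke test of the endpoint format and bearer auth, a request like the following can be sent to a running instance (server name, port, and token are placeholders; the JSON-RPC body and Accept header follow the MCP Streamable HTTP spec, and a real client should use an MCP SDK and perform the initialize handshake first):

```typescript
// Hypothetical check against a locally running MCPHub.
async function smokeTest(): Promise<void> {
  const token = process.env.MCPHUB_TOKEN ?? ''; // JWT, only needed when bearer auth is enabled
  const res = await fetch('http://localhost:3000/mcp/example-server', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Accept: 'application/json, text/event-stream',
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({ jsonrpc: '2.0', id: 1, method: 'tools/list', params: {} }),
  });
  console.log(res.status, await res.text());
}

smokeTest().catch(console.error);
```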
### Development Entry Points
- **Add MCP server**: Modify `mcp_settings.json` and restart
- **New API endpoint**: Add route in `src/routes/`, controller in `src/controllers/`
- **Frontend feature**: Start from `frontend/src/pages/` or `frontend/src/components/`
- **Add tests**: Follow patterns in `tests/` directory
- **MCP Servers**: Modify `src/services/mcpService.ts`
- **API Endpoints**: Add routes in `src/routes/`, controllers in `src/controllers/`
- **Frontend Features**: Start from `frontend/src/pages/`
- **Testing**: Follow existing patterns in `tests/`
### Common Development Tasks
#### Adding a new MCP server:
1. Add server definition to `mcp_settings.json`
2. Restart backend to load new server
3. Check logs for successful connection
4. Test via dashboard or API endpoints
#### API development:
1. Define route in `src/routes/`
2. Implement controller in `src/controllers/`
3. Add types in `src/types/index.ts` if needed
4. Write tests in `tests/controllers/`
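A minimal controller sketch following the existing pattern (file name, route path, and the `ApiResponse` payload are assumptions, not a prescribed API):

```typescript
// src/controllers/statusController.ts (hypothetical)
import { Request, Response } from 'express';
import { ApiResponse } from '../types/index.js';

export const getStatus = (_req: Request, res: Response): void => {
  try {
    const response: ApiResponse = { success: true, data: { status: 'ok' } };
    res.json(response);
  } catch (error) {
    res.status(500).json({ success: false, message: 'Failed to get status' });
  }
};

// Then register it in src/routes/, e.g. router.get('/status', getStatus),
// and cover it with a test under tests/controllers/.
```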
#### Frontend development:
1. Create/modify components in `frontend/src/components/`
2. Add pages in `frontend/src/pages/`
3. Update routing if needed
4. Test in development mode with `pnpm frontend:dev`
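A bare-bones component sketch following those conventions (component name, props, translation key, and file location are illustrative):

```tsx
// frontend/src/components/ServerCount.tsx (hypothetical)
import { useTranslation } from 'react-i18next';

interface ServerCountProps {
  count: number;
}

const ServerCount = ({ count }: ServerCountProps) => {
  const { t } = useTranslation();
  // The translation key is a placeholder; add it to the resource files under locales/.
  return <div className="text-sm text-gray-500">{t('servers.count', { count })}</div>;
};

export default ServerCount;
```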
## Validation and CI Requirements
### Before Committing - ALWAYS Run:
```bash
pnpm lint # Must pass - ~3 seconds
pnpm backend:build # Must compile - ~5 seconds
pnpm test:ci # All tests must pass - ~16 seconds
pnpm build # Full build must work - ~10 seconds
```
**CRITICAL**: CI will fail if any of these commands fail. Fix issues locally first.
### CI Pipeline (.github/workflows/ci.yml)
- Runs on Node.js 20.x
- Tests: linting, type checking, unit tests with coverage
- **NEVER CANCEL**: CI builds may take 2-3 minutes total
## Troubleshooting
### Common Issues
- **"uvx command not found"**: Some MCP servers require `uvx` (Python package manager) - this is expected in development
- **Port already in use**: Change PORT environment variable or kill existing processes
- **Frontend not loading**: Ensure frontend was built with `pnpm frontend:build`
- **MCP server connection failed**: Check server command/args in `mcp_settings.json`
### Build Failures
- **TypeScript errors**: Run `pnpm backend:build` to see compilation errors
- **Test failures**: Run `pnpm test:verbose` for detailed test output
- **Lint errors**: Run `pnpm lint` and fix reported issues
### Development Issues
- **Backend not starting**: Check for port conflicts, verify `mcp_settings.json` syntax
- **Frontend proxy errors**: Ensure backend is running before starting frontend
- **Hot reload not working**: Restart development server
## Performance Notes
- **Install time**: pnpm install takes ~30 seconds
- **Build time**: Full build takes ~10 seconds
- **Test time**: Complete test suite takes ~16 seconds
- **Startup time**: Backend initialization takes 10-15 seconds (MCP server connections)
**Remember**: NEVER CANCEL any build or test commands. Always wait for completion even if they seem slow.

View File

@@ -1,16 +1,16 @@
import { McpSettings, IUser, ServerConfig } from '../types/index.js';
import {
UserDao,
ServerDao,
GroupDao,
SystemConfigDao,
import {
UserDao,
ServerDao,
GroupDao,
SystemConfigDao,
UserConfigDao,
ServerConfigWithName,
UserDaoImpl,
ServerDaoImpl,
GroupDaoImpl,
SystemConfigDaoImpl,
UserConfigDaoImpl
UserConfigDaoImpl,
} from '../dao/index.js';
/**
@@ -22,7 +22,7 @@ export class DaoConfigService {
private serverDao: ServerDao,
private groupDao: GroupDao,
private systemConfigDao: SystemConfigDao,
private userConfigDao: UserConfigDao
private userConfigDao: UserConfigDao,
) {}
/**
@@ -34,7 +34,7 @@ export class DaoConfigService {
this.serverDao.findAll(),
this.groupDao.findAll(),
this.systemConfigDao.get(),
this.userConfigDao.getAll()
this.userConfigDao.getAll(),
]);
// Convert servers back to the original format
@@ -49,7 +49,7 @@ export class DaoConfigService {
mcpServers,
groups,
systemConfig,
userConfigs
userConfigs,
};
// Apply user-specific filtering if needed
@@ -96,7 +96,7 @@ export class DaoConfigService {
if (settings.mcpServers) {
const currentServers = await this.serverDao.findAll();
const currentServerNames = new Set(currentServers.map((s: ServerConfigWithName) => s.name));
for (const [name, config] of Object.entries(settings.mcpServers)) {
const serverWithName: ServerConfigWithName = { name, ...config };
if (currentServerNames.has(name)) {
@@ -118,7 +118,7 @@ export class DaoConfigService {
if (settings.groups) {
const currentGroups = await this.groupDao.findAll();
const currentGroupIds = new Set(currentGroups.map((g: any) => g.id));
for (const group of settings.groups) {
if (group.id && currentGroupIds.has(group.id)) {
promises.push(this.groupDao.update(group.id, group));
@@ -128,7 +128,7 @@ export class DaoConfigService {
}
// Remove groups that are no longer in the settings
const newGroupIds = new Set(settings.groups.map(g => g.id).filter(Boolean));
const newGroupIds = new Set(settings.groups.map((g) => g.id).filter(Boolean));
for (const existingGroup of currentGroups) {
if (!newGroupIds.has(existingGroup.id)) {
promises.push(this.groupDao.delete(existingGroup.id));
@@ -173,7 +173,7 @@ export class DaoConfigService {
}
const filteredGroups = (settings.groups || []).filter(
group => group.owner === user.username || group.owner === undefined
(group) => group.owner === user.username || group.owner === undefined,
);
return {
@@ -182,7 +182,7 @@ export class DaoConfigService {
groups: filteredGroups,
users: [], // Non-admin users can't see user list
systemConfig: {}, // Non-admin users can't see system config
userConfigs: { [user.username]: settings.userConfigs?.[user.username] || {} }
userConfigs: { [user.username]: settings.userConfigs?.[user.username] || {} },
};
}
@@ -190,9 +190,9 @@ export class DaoConfigService {
* Merge settings for non-admin users
*/
private mergeSettingsForUser(
currentSettings: McpSettings,
newSettings: McpSettings,
user: IUser
currentSettings: McpSettings,
newSettings: McpSettings,
user: IUser,
): McpSettings {
if (user.isAdmin) {
return newSettings;
@@ -214,14 +214,14 @@ export class DaoConfigService {
// Merge groups (only user's own groups)
if (newSettings.groups) {
const userGroups = newSettings.groups.filter(
group => !group.owner || group.owner === user.username
).map(group => ({ ...group, owner: user.username }));
const userGroups = newSettings.groups
.filter((group) => !group.owner || group.owner === user.username)
.map((group) => ({ ...group, owner: user.username }));
const otherGroups = (currentSettings.groups || []).filter(
group => group.owner !== user.username
(group) => group.owner !== user.username,
);
mergedSettings.groups = [...otherGroups, ...userGroups];
}
@@ -260,6 +260,6 @@ export function createDaoConfigService(): DaoConfigService {
new ServerDaoImpl(),
new GroupDaoImpl(),
new SystemConfigDaoImpl(),
new UserConfigDaoImpl()
new UserConfigDaoImpl(),
);
}

View File

@@ -4,7 +4,11 @@ import { getPackageVersion } from '../utils/version.js';
import { getDataService } from '../services/services.js';
import { DataService } from '../services/dataService.js';
import { DaoConfigService, createDaoConfigService } from './DaoConfigService.js';
import { loadOriginalSettings as legacyLoadSettings, saveSettings as legacySaveSettings, clearSettingsCache as legacyClearCache } from './index.js';
import {
loadOriginalSettings as legacyLoadSettings,
saveSettings as legacySaveSettings,
clearSettingsCache as legacyClearCache,
} from './index.js';
dotenv.config();
@@ -71,12 +75,12 @@ export const getSettingsCacheInfo = (): { hasCache: boolean; usingDao: boolean }
const daoInfo = daoConfigService.getCacheInfo();
return {
...daoInfo,
usingDao: true
usingDao: true,
};
} else {
return {
hasCache: false, // Legacy method doesn't expose cache info here
usingDao: false
usingDao: false,
};
}
};
@@ -108,14 +112,14 @@ export const getDaoConfigService = (): DaoConfigService => {
export const migrateToDao = async (): Promise<boolean> => {
try {
console.log('Starting migration from legacy format to DAO layer...');
// Load data using legacy method
const legacySettings = legacyLoadSettings();
// Save using DAO layer
switchToDao();
const success = await saveSettings(legacySettings);
if (success) {
console.log('Migration completed successfully');
return true;

View File

@@ -2,12 +2,7 @@
* Migration utilities for moving from legacy file-based config to DAO layer
*/
import {
loadSettings,
migrateToDao,
switchToDao,
switchToLegacy
} from './configManager.js';
import { loadSettings, migrateToDao, switchToDao, switchToLegacy } from './configManager.js';
import { UserDaoImpl, ServerDaoImpl, GroupDaoImpl } from '../dao/index.js';
/**
@@ -16,42 +11,41 @@ import { UserDaoImpl, ServerDaoImpl, GroupDaoImpl } from '../dao/index.js';
export async function validateMigration(): Promise<boolean> {
try {
console.log('Validating migration...');
// Load settings using DAO layer
switchToDao();
const daoSettings = await loadSettings();
// Load settings using legacy method
switchToLegacy();
const legacySettings = await loadSettings();
// Compare key metrics
const daoUserCount = daoSettings.users?.length || 0;
const legacyUserCount = legacySettings.users?.length || 0;
const daoServerCount = Object.keys(daoSettings.mcpServers || {}).length;
const legacyServerCount = Object.keys(legacySettings.mcpServers || {}).length;
const daoGroupCount = daoSettings.groups?.length || 0;
const legacyGroupCount = legacySettings.groups?.length || 0;
console.log('Data comparison:');
console.log(`Users: DAO=${daoUserCount}, Legacy=${legacyUserCount}`);
console.log(`Servers: DAO=${daoServerCount}, Legacy=${legacyServerCount}`);
console.log(`Groups: DAO=${daoGroupCount}, Legacy=${legacyGroupCount}`);
const isValid = (
const isValid =
daoUserCount === legacyUserCount &&
daoServerCount === legacyServerCount &&
daoGroupCount === legacyGroupCount
);
daoGroupCount === legacyGroupCount;
if (isValid) {
console.log('✅ Migration validation passed');
} else {
console.log('❌ Migration validation failed');
}
return isValid;
} catch (error) {
console.error('Migration validation error:', error);
@@ -65,34 +59,34 @@ export async function validateMigration(): Promise<boolean> {
export async function performMigration(): Promise<boolean> {
try {
console.log('🚀 Starting migration to DAO layer...');
// Step 1: Backup current data
console.log('📁 Creating backup of current data...');
switchToLegacy();
const _backupData = await loadSettings();
// Step 2: Perform migration
console.log('🔄 Migrating data to DAO layer...');
const migrationSuccess = await migrateToDao();
if (!migrationSuccess) {
console.error('❌ Migration failed');
return false;
}
// Step 3: Validate migration
console.log('🔍 Validating migration...');
const validationSuccess = await validateMigration();
if (!validationSuccess) {
console.error('❌ Migration validation failed');
// Could implement rollback here if needed
return false;
}
console.log('✅ Migration completed successfully!');
console.log('💡 You can now use the DAO layer by setting USE_DAO_LAYER=true');
return true;
} catch (error) {
console.error('Migration error:', error);
@@ -106,23 +100,23 @@ export async function performMigration(): Promise<boolean> {
export async function testDaoOperations(): Promise<boolean> {
try {
console.log('🧪 Testing DAO operations...');
switchToDao();
const userDao = new UserDaoImpl();
const serverDao = new ServerDaoImpl();
const groupDao = new GroupDaoImpl();
// Test user operations
console.log('Testing user operations...');
const testUser = await userDao.createWithHashedPassword('test-dao-user', 'password123', false);
console.log(`✅ Created test user: ${testUser.username}`);
const foundUser = await userDao.findByUsername('test-dao-user');
console.log(`✅ Found user: ${foundUser?.username}`);
const isValidPassword = await userDao.validateCredentials('test-dao-user', 'password123');
console.log(`✅ Password validation: ${isValidPassword}`);
// Test server operations
console.log('Testing server operations...');
const testServer = await serverDao.create({
@@ -130,33 +124,33 @@ export async function testDaoOperations(): Promise<boolean> {
command: 'node',
args: ['test.js'],
enabled: true,
owner: 'test-dao-user'
owner: 'test-dao-user',
});
console.log(`✅ Created test server: ${testServer.name}`);
const userServers = await serverDao.findByOwner('test-dao-user');
console.log(`✅ Found ${userServers.length} servers for user`);
// Test group operations
console.log('Testing group operations...');
const testGroup = await groupDao.create({
name: 'test-dao-group',
description: 'Test group for DAO operations',
servers: ['test-dao-server'],
owner: 'test-dao-user'
owner: 'test-dao-user',
});
console.log(`✅ Created test group: ${testGroup.name} (ID: ${testGroup.id})`);
const userGroups = await groupDao.findByOwner('test-dao-user');
console.log(`✅ Found ${userGroups.length} groups for user`);
// Cleanup test data
console.log('Cleaning up test data...');
await groupDao.delete(testGroup.id);
await serverDao.delete('test-dao-server');
await userDao.delete('test-dao-user');
console.log('✅ Test data cleaned up');
console.log('🎉 All DAO operations test passed!');
return true;
} catch (error) {
@@ -171,7 +165,7 @@ export async function testDaoOperations(): Promise<boolean> {
export async function performanceComparison(): Promise<void> {
try {
console.log('⚡ Performance comparison...');
// Test legacy approach
console.log('Testing legacy approach...');
switchToLegacy();
@@ -179,7 +173,7 @@ export async function performanceComparison(): Promise<void> {
await loadSettings();
const legacyTime = Date.now() - legacyStart;
console.log(`Legacy load time: ${legacyTime}ms`);
// Test DAO approach
console.log('Testing DAO approach...');
switchToDao();
@@ -187,13 +181,13 @@ export async function performanceComparison(): Promise<void> {
await loadSettings();
const daoTime = Date.now() - daoStart;
console.log(`DAO load time: ${daoTime}ms`);
// Comparison
const difference = daoTime - legacyTime;
const percentage = ((difference / legacyTime) * 100).toFixed(2);
console.log(`Performance difference: ${difference}ms (${percentage}%)`);
if (difference > 0) {
console.log(`DAO approach is ${percentage}% slower`);
} else {
@@ -210,14 +204,14 @@ export async function performanceComparison(): Promise<void> {
export async function generateMigrationReport(): Promise<any> {
try {
console.log('📊 Generating migration report...');
// Collect statistics from both approaches
switchToLegacy();
const legacySettings = await loadSettings();
switchToDao();
const daoSettings = await loadSettings();
const report = {
timestamp: new Date().toISOString(),
legacy: {
@@ -225,20 +219,20 @@ export async function generateMigrationReport(): Promise<any> {
servers: Object.keys(legacySettings.mcpServers || {}).length,
groups: legacySettings.groups?.length || 0,
systemConfigSections: Object.keys(legacySettings.systemConfig || {}).length,
userConfigs: Object.keys(legacySettings.userConfigs || {}).length
userConfigs: Object.keys(legacySettings.userConfigs || {}).length,
},
dao: {
users: daoSettings.users?.length || 0,
servers: Object.keys(daoSettings.mcpServers || {}).length,
groups: daoSettings.groups?.length || 0,
systemConfigSections: Object.keys(daoSettings.systemConfig || {}).length,
userConfigs: Object.keys(daoSettings.userConfigs || {}).length
}
userConfigs: Object.keys(daoSettings.userConfigs || {}).length,
},
};
console.log('📈 Migration Report:');
console.log(JSON.stringify(report, null, 2));
return report;
} catch (error) {
console.error('Report generation error:', error);

View File

@@ -31,7 +31,7 @@ export const streamLogs = (req: Request, res: Response): void => {
res.writeHead(200, {
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
'Connection': 'keep-alive'
Connection: 'keep-alive',
});
// Send initial data
@@ -52,4 +52,4 @@ export const streamLogs = (req: Request, res: Response): void => {
console.error('Error streaming logs:', error);
res.status(500).json({ success: false, error: 'Error streaming logs' });
}
};
};

View File

@@ -7,7 +7,7 @@ import {
getMarketTags,
searchMarketServers,
filterMarketServersByCategory,
filterMarketServersByTag
filterMarketServersByTag,
} from '../services/marketService.js';
// Get all market servers
@@ -100,7 +100,7 @@ export const searchMarketServersByQuery = (req: Request, res: Response): void =>
try {
const { query } = req.query;
const searchQuery = typeof query === 'string' ? query : '';
const servers = searchMarketServers(searchQuery);
const response: ApiResponse = {
success: true,
@@ -119,7 +119,7 @@ export const searchMarketServersByQuery = (req: Request, res: Response): void =>
export const getMarketServersByCategory = (req: Request, res: Response): void => {
try {
const { category } = req.params;
const servers = filterMarketServersByCategory(category);
const response: ApiResponse = {
success: true,
@@ -138,7 +138,7 @@ export const getMarketServersByCategory = (req: Request, res: Response): void =>
export const getMarketServersByTag = (req: Request, res: Response): void => {
try {
const { tag } = req.params;
const servers = filterMarketServersByTag(tag);
const response: ApiResponse = {
success: true,
@@ -151,4 +151,4 @@ export const getMarketServersByTag = (req: Request, res: Response): void => {
message: 'Failed to filter market servers by tag',
});
}
};
};

View File

@@ -17,7 +17,7 @@ export const getPrompt = async (req: Request, res: Response): Promise<void> => {
}
const promptArgs = {
params: req.body as { [key: string]: any }
params: req.body as { [key: string]: any },
};
const result = await handleGetPromptRequest(promptArgs, serverName);
if (result.isError) {

View File

@@ -20,7 +20,7 @@ export interface DaoFactory {
*/
export class JsonFileDaoFactory implements DaoFactory {
private static instance: JsonFileDaoFactory;
private userDao: UserDao | null = null;
private serverDao: ServerDao | null = null;
private groupDao: GroupDao | null = null;

View File

@@ -43,11 +43,11 @@ export class SystemConfigDaoImpl extends JsonFileBaseDao implements SystemConfig
async update(config: Partial<SystemConfig>): Promise<SystemConfig> {
const settings = await this.loadSettings();
const currentConfig = settings.systemConfig || {};
// Deep merge configuration
const updatedConfig = this.deepMerge(currentConfig, config);
settings.systemConfig = updatedConfig;
await this.saveSettings(settings);
return updatedConfig;
}
@@ -55,10 +55,10 @@ export class SystemConfigDaoImpl extends JsonFileBaseDao implements SystemConfig
async reset(): Promise<SystemConfig> {
const settings = await this.loadSettings();
const defaultConfig: SystemConfig = {};
settings.systemConfig = defaultConfig;
await this.saveSettings(settings);
return defaultConfig;
}
@@ -67,7 +67,10 @@ export class SystemConfigDaoImpl extends JsonFileBaseDao implements SystemConfig
return config[section];
}
async updateSection<K extends keyof SystemConfig>(section: K, value: SystemConfig[K]): Promise<boolean> {
async updateSection<K extends keyof SystemConfig>(
section: K,
value: SystemConfig[K],
): Promise<boolean> {
try {
await this.update({ [section]: value } as Partial<SystemConfig>);
return true;
@@ -81,7 +84,7 @@ export class SystemConfigDaoImpl extends JsonFileBaseDao implements SystemConfig
*/
private deepMerge(target: any, source: any): any {
const result = { ...target };
for (const key in source) {
if (source[key] && typeof source[key] === 'object' && !Array.isArray(source[key])) {
result[key] = this.deepMerge(target[key] || {}, source[key]);
@@ -89,7 +92,7 @@ export class SystemConfigDaoImpl extends JsonFileBaseDao implements SystemConfig
result[key] = source[key];
}
}
return result;
}
}

View File

@@ -38,12 +38,19 @@ export interface UserConfigDao {
/**
* Get specific configuration section for user
*/
getSection<K extends keyof UserConfig>(username: string, section: K): Promise<UserConfig[K] | undefined>;
getSection<K extends keyof UserConfig>(
username: string,
section: K,
): Promise<UserConfig[K] | undefined>;
/**
* Update specific configuration section for user
*/
updateSection<K extends keyof UserConfig>(username: string, section: K, value: UserConfig[K]): Promise<boolean>;
updateSection<K extends keyof UserConfig>(
username: string,
section: K,
value: UserConfig[K],
): Promise<boolean>;
}
/**
@@ -62,28 +69,28 @@ export class UserConfigDaoImpl extends JsonFileBaseDao implements UserConfigDao
async update(username: string, config: Partial<UserConfig>): Promise<UserConfig> {
const settings = await this.loadSettings();
if (!settings.userConfigs) {
settings.userConfigs = {};
}
const currentConfig = settings.userConfigs[username] || {};
// Deep merge configuration
const updatedConfig = this.deepMerge(currentConfig, config);
settings.userConfigs[username] = updatedConfig;
await this.saveSettings(settings);
return updatedConfig;
}
async delete(username: string): Promise<boolean> {
const settings = await this.loadSettings();
if (!settings.userConfigs || !settings.userConfigs[username]) {
return false;
}
delete settings.userConfigs[username];
await this.saveSettings(settings);
return true;
@@ -99,12 +106,19 @@ export class UserConfigDaoImpl extends JsonFileBaseDao implements UserConfigDao
return this.update(username, defaultConfig);
}
async getSection<K extends keyof UserConfig>(username: string, section: K): Promise<UserConfig[K] | undefined> {
async getSection<K extends keyof UserConfig>(
username: string,
section: K,
): Promise<UserConfig[K] | undefined> {
const config = await this.get(username);
return config?.[section];
}
async updateSection<K extends keyof UserConfig>(username: string, section: K, value: UserConfig[K]): Promise<boolean> {
async updateSection<K extends keyof UserConfig>(
username: string,
section: K,
value: UserConfig[K],
): Promise<boolean> {
try {
await this.update(username, { [section]: value } as Partial<UserConfig>);
return true;
@@ -118,7 +132,7 @@ export class UserConfigDaoImpl extends JsonFileBaseDao implements UserConfigDao
*/
private deepMerge(target: any, source: any): any {
const result = { ...target };
for (const key in source) {
if (source[key] && typeof source[key] === 'object' && !Array.isArray(source[key])) {
result[key] = this.deepMerge(target[key] || {}, source[key]);
@@ -126,7 +140,7 @@ export class UserConfigDaoImpl extends JsonFileBaseDao implements UserConfigDao
result[key] = source[key];
}
}
return result;
}
}

View File

@@ -54,38 +54,38 @@ export abstract class BaseDaoImpl<T, K = string> implements BaseDao<T, K> {
async findById(id: K): Promise<T | null> {
const entities = await this.getAll();
return entities.find(entity => this.getEntityId(entity) === id) || null;
return entities.find((entity) => this.getEntityId(entity) === id) || null;
}
async create(data: Omit<T, 'id'>): Promise<T> {
const entities = await this.getAll();
const newEntity = this.createEntity(data);
entities.push(newEntity);
await this.saveAll(entities);
return newEntity;
}
async update(id: K, updates: Partial<T>): Promise<T | null> {
const entities = await this.getAll();
const index = entities.findIndex(entity => this.getEntityId(entity) === id);
const index = entities.findIndex((entity) => this.getEntityId(entity) === id);
if (index === -1) {
return null;
}
const updatedEntity = this.updateEntity(entities[index], updates);
entities[index] = updatedEntity;
await this.saveAll(entities);
return updatedEntity;
}
async delete(id: K): Promise<boolean> {
const entities = await this.getAll();
const index = entities.findIndex(entity => this.getEntityId(entity) === id);
const index = entities.findIndex((entity) => this.getEntityId(entity) === id);
if (index === -1) {
return false;
}

View File

@@ -1,18 +1,18 @@
/**
* Data access layer example and test utilities
*
*
* This file demonstrates how to use the DAO layer for managing different types of data
* in the MCPHub application.
*/
import {
getUserDao,
getServerDao,
getGroupDao,
getSystemConfigDao,
import {
getUserDao,
getServerDao,
getGroupDao,
getSystemConfigDao,
getUserConfigDao,
JsonFileDaoFactory,
setDaoFactory
setDaoFactory,
} from './DaoFactory.js';
/**
@@ -39,7 +39,10 @@ export async function exampleUserOperations() {
// Find all admin users
const admins = await userDao.findAdmins();
console.log('Admin users:', admins.map(u => u.username));
console.log(
'Admin users:',
admins.map((u) => u.username),
);
// Delete user
await userDao.delete('testuser');
@@ -58,21 +61,27 @@ export async function exampleServerOperations() {
command: 'node',
args: ['server.js'],
enabled: true,
owner: 'admin'
owner: 'admin',
});
console.log('Created server:', newServer.name);
// Find servers by owner
const userServers = await serverDao.findByOwner('admin');
console.log('Servers owned by admin:', userServers.map(s => s.name));
console.log(
'Servers owned by admin:',
userServers.map((s) => s.name),
);
// Find enabled servers
const enabledServers = await serverDao.findEnabled();
console.log('Enabled servers:', enabledServers.map(s => s.name));
console.log(
'Enabled servers:',
enabledServers.map((s) => s.name),
);
// Update server tools
await serverDao.updateTools('test-server', {
'tool1': { enabled: true, description: 'Test tool' }
tool1: { enabled: true, description: 'Test tool' },
});
console.log('Updated server tools');
@@ -92,13 +101,16 @@ export async function exampleGroupOperations() {
name: 'test-group',
description: 'Test group for development',
servers: ['server1', 'server2'],
owner: 'admin'
owner: 'admin',
});
console.log('Created group:', newGroup.name, 'with ID:', newGroup.id);
// Find groups by owner
const userGroups = await groupDao.findByOwner('admin');
console.log('Groups owned by admin:', userGroups.map(g => g.name));
console.log(
'Groups owned by admin:',
userGroups.map((g) => g.name),
);
// Add server to group
await groupDao.addServerToGroup(newGroup.id, 'server3');
@@ -106,7 +118,10 @@ export async function exampleGroupOperations() {
// Find groups containing specific server
const groupsWithServer = await groupDao.findByServer('server1');
console.log('Groups containing server1:', groupsWithServer.map(g => g.name));
console.log(
'Groups containing server1:',
groupsWithServer.map((g) => g.name),
);
// Remove server from group
await groupDao.removeServerFromGroup(newGroup.id, 'server2');
@@ -131,7 +146,7 @@ export async function exampleSystemConfigOperations() {
await systemConfigDao.updateSection('routing', {
enableGlobalRoute: true,
enableGroupNameRoute: true,
enableBearerAuth: false
enableBearerAuth: false,
});
console.log('Updated routing configuration');
@@ -139,7 +154,7 @@ export async function exampleSystemConfigOperations() {
await systemConfigDao.updateSection('install', {
pythonIndexUrl: 'https://pypi.org/simple/',
npmRegistry: 'https://registry.npmjs.org/',
baseUrl: 'https://mcphub.local'
baseUrl: 'https://mcphub.local',
});
console.log('Updated install configuration');
@@ -158,8 +173,8 @@ export async function exampleUserConfigOperations() {
await userConfigDao.update('admin', {
routing: {
enableGlobalRoute: false,
enableGroupNameRoute: true
}
enableGroupNameRoute: true,
},
});
console.log('Updated admin user config');
@@ -186,22 +201,22 @@ export async function exampleUserConfigOperations() {
export async function testAllDaoOperations() {
try {
console.log('=== Testing DAO Layer ===');
console.log('\n--- User Operations ---');
await exampleUserOperations();
console.log('\n--- Server Operations ---');
await exampleServerOperations();
console.log('\n--- Group Operations ---');
await exampleGroupOperations();
console.log('\n--- System Config Operations ---');
await exampleSystemConfigOperations();
console.log('\n--- User Config Operations ---');
await exampleUserConfigOperations();
console.log('\n=== DAO Layer Test Complete ===');
} catch (error) {
console.error('Error during DAO testing:', error);

View File

@@ -43,7 +43,7 @@ export const handleSseConnection = async (req: Request, res: Response): Promise<
const userContextService = UserContextService.getInstance();
const currentUser = userContextService.getCurrentUser();
const username = currentUser?.username;
// Check bearer auth using filtered settings
if (!validateBearerAuth(req)) {
console.warn('Bearer authentication failed or not provided');
@@ -74,7 +74,7 @@ export const handleSseConnection = async (req: Request, res: Response): Promise<
}
// Construct the appropriate messages path based on user context
const messagesPath = username
const messagesPath = username
? `${config.basePath}/${username}/messages`
: `${config.basePath}/messages`;
@@ -100,7 +100,7 @@ export const handleSseMessage = async (req: Request, res: Response): Promise<voi
const userContextService = UserContextService.getInstance();
const currentUser = userContextService.getCurrentUser();
const username = currentUser?.username;
// Check bearer auth using filtered settings
if (!validateBearerAuth(req)) {
res.status(401).send('Bearer authentication required or invalid token');
@@ -127,7 +127,9 @@ export const handleSseMessage = async (req: Request, res: Response): Promise<voi
const { transport, group } = transportData;
req.params.group = group;
req.query.group = group;
console.log(`Received message for sessionId: ${sessionId} in group: ${group}${username ? ` for user: ${username}` : ''}`);
console.log(
`Received message for sessionId: ${sessionId} in group: ${group}${username ? ` for user: ${username}` : ''}`,
);
await (transport as SSEServerTransport).handlePostMessage(req, res);
};
@@ -137,14 +139,14 @@ export const handleMcpPostRequest = async (req: Request, res: Response): Promise
const userContextService = UserContextService.getInstance();
const currentUser = userContextService.getCurrentUser();
const username = currentUser?.username;
const sessionId = req.headers['mcp-session-id'] as string | undefined;
const group = req.params.group;
const body = req.body;
console.log(
`Handling MCP post request for sessionId: ${sessionId} and group: ${group}${username ? ` for user: ${username}` : ''} with body: ${JSON.stringify(body)}`,
);
// Check bearer auth using filtered settings
if (!validateBearerAuth(req)) {
res.status(401).send('Bearer authentication required or invalid token');
@@ -183,7 +185,9 @@ export const handleMcpPostRequest = async (req: Request, res: Response): Promise
}
};
console.log(`MCP connection established: ${transport.sessionId}${username ? ` for user: ${username}` : ''}`);
console.log(
`MCP connection established: ${transport.sessionId}${username ? ` for user: ${username}` : ''}`,
);
await getMcpServer(transport.sessionId, group).connect(transport);
} else {
res.status(400).json({
@@ -206,9 +210,9 @@ export const handleMcpOtherRequest = async (req: Request, res: Response) => {
const userContextService = UserContextService.getInstance();
const currentUser = userContextService.getCurrentUser();
const username = currentUser?.username;
console.log(`Handling MCP other request${username ? ` for user: ${username}` : ''}`);
// Check bearer auth using filtered settings
if (!validateBearerAuth(req)) {
res.status(401).send('Bearer authentication required or invalid token');