Add Supabase key validation and simplify frontend state management

- Add backend validation to detect and warn about anon vs service keys
- Prevent startup with incorrect Supabase key configuration
- Consolidate frontend state management following KISS principles
- Remove duplicate state tracking and sessionStorage polling
- Add clear error display when backend fails to start
- Improve .env.example documentation with detailed key selection guide
- Add comprehensive test coverage for validation logic
- Remove unused test results checking to eliminate 404 errors

The implementation now warns users about key misconfiguration while
maintaining backward compatibility. Frontend state is simplified with
MainLayout as the single source of truth for backend status.
Rasmus Widing
2025-08-16 00:10:23 +03:00
parent ad1b8bf70f
commit 3800280f2e
19 changed files with 848 additions and 317 deletions

View File

@@ -5,9 +5,22 @@
# https://supabase.com/dashboard/project/<your project ID>/settings/api
SUPABASE_URL=
-# Get your SUPABASE_SERVICE_KEY from the API Keys section of your Supabase project settings
-# https://supabase.com/dashboard/project/<your project ID>/settings/api-keys
-# On this page it is called the service_role secret.
+# ⚠️ CRITICAL: You MUST use the SERVICE ROLE key, NOT the Anon key! ⚠️
+#
+# COMMON MISTAKE: Using the anon (public) key will cause ALL saves to fail with "permission denied"!
+#
+# How to get the CORRECT key:
+# 1. Go to: https://supabase.com/dashboard/project/<your project ID>/settings/api
+# 2. In the Settings menu, click on "API keys"
+# 3. Find "Project API keys" section
+# 4. You will see TWO keys - choose carefully:
+# ❌ anon (public): WRONG - This is shorter, starts with "eyJhbGc..." and contains "anon" in the JWT
+# ✅ service_role (secret): CORRECT - This is longer and contains "service_role" in the JWT
+#
+# The service_role key is typically much longer than the anon key.
+# If you see errors like "Failed to save" or "Permission denied", you're using the wrong key!
+#
+# On the Supabase dashboard, it's labeled as "service_role" under "Project API keys"
SUPABASE_SERVICE_KEY=
# Optional: Set log level for debugging
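The anon/service_role distinction described above is encoded in the key itself: Supabase API keys are JWTs whose payload carries a `role` claim. Below is a minimal sketch of the kind of check the commit message describes, written in TypeScript purely for illustration; the actual startup validation lives in the Python server and is not shown in this view, and the claim names assumed here are the standard Supabase ones.

```typescript
// Decode a Supabase key's JWT payload (no signature check) and report its role.
// Illustrative sketch only: the real validation is performed server-side.
function detectSupabaseKeyRole(key: string): "anon" | "service_role" | "unknown" {
  const parts = key.split(".");
  if (parts.length !== 3) return "unknown"; // not a JWT at all
  try {
    // JWT payloads are base64url-encoded JSON.
    const base64 = parts[1].replace(/-/g, "+").replace(/_/g, "/");
    const payload = JSON.parse(Buffer.from(base64, "base64").toString("utf8")) as { role?: string };
    if (payload.role === "anon" || payload.role === "service_role") return payload.role;
    return "unknown";
  } catch {
    return "unknown";
  }
}

// Warn (or refuse to start) when the anon key was supplied by mistake.
if (detectSupabaseKeyRole(process.env.SUPABASE_SERVICE_KEY ?? "") === "anon") {
  console.error("SUPABASE_SERVICE_KEY looks like the anon key - use the service_role key instead.");
}
```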

View File

@@ -21,7 +21,7 @@
Archon is the **command center** for AI coding assistants. For you, it's a sleek interface to manage knowledge, context, and tasks for your projects. For the AI coding assistant(s), it's a **Model Context Protocol (MCP) server** to collaborate on and leverage the same knowledge, context, and tasks. Connect Claude Code, Kiro, Cursor, Windsurf, etc. to give your AI agents access to:
- **Your documentation** (crawled websites, uploaded PDFs/docs)
- **Smart search capabilities** with advanced RAG strategies
- **Task management** integrated with your knowledge base
- **Real-time updates** as you add new content and collaborate with your coding assistant on tasks
- **Much more** coming soon to build Archon into an integrated environment for all context engineering
@@ -40,6 +40,7 @@ This new vision for Archon replaces the old one (the agenteer). Archon used to b
## Quick Start
### Prerequisites
- [Docker Desktop](https://www.docker.com/products/docker-desktop/)
- [Supabase](https://supabase.com/) account (free tier or local Supabase both work)
- [OpenAI API key](https://platform.openai.com/api-keys) (Gemini and Ollama are supported too!)
@@ -47,12 +48,14 @@ This new vision for Archon replaces the old one (the agenteer). Archon used to b
### Setup Instructions
1. **Clone Repository**:
```bash
git clone https://github.com/coleam00/archon.git
cd archon
```
2. **Environment Configuration**:
```bash
cp .env.example .env
# Edit .env and add your Supabase credentials:
@@ -65,10 +68,11 @@ This new vision for Archon replaces the old one (the agenteer). Archon used to b
3. **Database Setup**: In your [Supabase project](https://supabase.com/dashboard) SQL Editor, copy, paste, and execute the contents of `migration/complete_setup.sql`
4. **Start Services**:
```bash
docker-compose up --build -d
```
This starts the core microservices:
- **Server**: Core API and business logic (Port: 8181)
- **MCP Server**: Protocol interface for AI clients (Port: 8051)
@@ -90,17 +94,18 @@ If you need to completely reset your database and start fresh:
<summary>⚠️ <strong>Reset Database - This will delete ALL data for Archon!</strong></summary>
1. **Run Reset Script**: In your Supabase SQL Editor, run the contents of `migration/RESET_DB.sql`
⚠️ WARNING: This will delete all Archon specific tables and data! Nothing else will be touched in your DB though.
2. **Rebuild Database**: After reset, run `migration/complete_setup.sql` to create all the tables again.
3. **Restart Services**:
```bash
docker-compose up -d
```
4. **Reconfigure**:
- Select your LLM/embedding provider and set the API key again
- Re-upload any documents or re-crawl websites
@@ -121,23 +126,25 @@ Once everything is running:
### Core Services
| Service | Container Name | Default URL | Purpose |
| ------------------ | -------------- | --------------------- | --------------------------------- |
| **Web Interface** | archon-ui | http://localhost:3737 | Main dashboard and controls |
| **API Service** | archon-server | http://localhost:8181 | Web crawling, document processing |
| **MCP Server** | archon-mcp | http://localhost:8051 | Model Context Protocol interface |
| **Agents Service** | archon-agents | http://localhost:8052 | AI/ML operations, reranking |
## What's Included
### 🧠 Knowledge Management
- **Smart Web Crawling**: Automatically detects and crawls entire documentation sites, sitemaps, and individual pages
- **Document Processing**: Upload and process PDFs, Word docs, markdown files, and text documents with intelligent chunking
- **Code Example Extraction**: Automatically identifies and indexes code examples from documentation for enhanced search
- **Vector Search**: Advanced semantic search with contextual embeddings for precise knowledge retrieval
- **Source Management**: Organize knowledge by source, type, and tags for easy filtering
### 🤖 AI Integration
- **Model Context Protocol (MCP)**: Connect any MCP-compatible client (Claude Code, Cursor, even non-AI coding assistants like Claude Desktop)
- **10 MCP Tools**: Comprehensive yet simple set of tools for RAG queries, task management, and project operations
- **Multi-LLM Support**: Works with OpenAI, Ollama, and Google Gemini models
@@ -145,12 +152,14 @@ Once everything is running:
- **Real-time Streaming**: Live responses from AI agents with progress tracking
### 📋 Project & Task Management
- **Hierarchical Projects**: Organize work with projects, features, and tasks in a structured workflow
- **AI-Assisted Creation**: Generate project requirements and tasks using integrated AI agents
- **Document Management**: Version-controlled documents with collaborative editing capabilities
- **Progress Tracking**: Real-time updates and status management across all project activities
### 🔄 Real-time Collaboration
- **WebSocket Updates**: Live progress tracking for crawling, processing, and AI operations
- **Multi-user Support**: Collaborative knowledge building and project management
- **Background Processing**: Asynchronous operations that don't block the user interface
@@ -184,17 +193,17 @@ Archon uses true microservices architecture with clear separation of concerns:
### Service Responsibilities
| Service | Location | Purpose | Key Features |
| -------------- | -------------------- | ---------------------------- | ------------------------------------------------------------------ |
| **Frontend** | `archon-ui-main/` | Web interface and dashboard | React, TypeScript, TailwindCSS, Socket.IO client |
| **Server** | `python/src/server/` | Core business logic and APIs | FastAPI, service layer, Socket.IO broadcasts, all ML/AI operations |
| **MCP Server** | `python/src/mcp/` | MCP protocol interface | Lightweight HTTP wrapper, 10 MCP tools, session management |
| **Agents** | `python/src/agents/` | PydanticAI agent hosting | Document and RAG agents, streaming responses |
### Communication Patterns
- **HTTP-based**: All inter-service communication uses HTTP APIs
- **Socket.IO**: Real-time updates from Server to Frontend
- **MCP Protocol**: AI clients connect to MCP Server via SSE or stdio
- **No Direct Imports**: Services are truly independent with no shared code dependencies
@@ -208,8 +217,9 @@ Archon uses true microservices architecture with clear separation of concerns:
## 🔧 Configuring Custom Ports & Hostname
By default, Archon services run on the following ports:
- **Archon-UI**: 3737
- **Archon-Server**: 8181
- **Archon-MCP**: 8051
- **Archon-Agents**: 8052
- **Archon-Docs**: 3838 (optional)
@@ -228,6 +238,7 @@ ARCHON_DOCS_PORT=3838
```
Example: Running on different ports:
```bash
ARCHON_SERVER_PORT=8282
ARCHON_MCP_PORT=8151
@@ -248,11 +259,13 @@ HOST=myserver.com # Use public domain
```
This is useful when:
- Running Archon on a different machine and accessing it remotely
- Using a custom domain name for your installation
- Deploying in a network environment where `localhost` isn't accessible
After changing hostname or ports:
1. Restart Docker containers: `docker-compose down && docker-compose up -d`
2. Access the UI at: `http://${HOST}:${ARCHON_UI_PORT}`
3. Update your AI client configuration with the new hostname and MCP port
@@ -265,7 +278,7 @@ For development with hot reload:
# Backend services (with auto-reload)
docker-compose up archon-server archon-mcp archon-agents --build
# Frontend (with hot reload)
cd archon-ui-main && npm run dev
# Documentation (with hot reload)

View File

@@ -21,5 +21,9 @@ COPY . .
# Expose Vite's default port
EXPOSE 5173
+# Add a small startup script to wait a moment before starting Vite
+# This helps ensure the backend is fully ready even after healthcheck passes
+RUN echo '#!/bin/sh\nsleep 3\nexec npm run dev -- --host 0.0.0.0' > /app/start.sh && chmod +x /app/start.sh
# Start Vite dev server with host binding for Docker
-CMD ["npm", "run", "dev", "--", "--host", "0.0.0.0"]
+CMD ["/app/start.sh"]

View File

@@ -0,0 +1,74 @@
import React from 'react';
import { AlertCircle, Terminal, RefreshCw } from 'lucide-react';
export const BackendStartupError: React.FC = () => {
const handleRetry = () => {
// Reload the page to retry
window.location.reload();
};
return (
<div className="fixed inset-0 z-[10000] bg-black/90 backdrop-blur-sm flex items-center justify-center p-8">
<div className="max-w-2xl w-full">
<div className="bg-red-950/50 border-2 border-red-500/50 rounded-lg p-8 shadow-2xl backdrop-blur-md">
<div className="flex items-start gap-4">
<AlertCircle className="w-8 h-8 text-red-500 flex-shrink-0 mt-1" />
<div className="space-y-4 flex-1">
<h2 className="text-2xl font-bold text-red-100">
Backend Service Startup Failure
</h2>
<p className="text-red-200">
The Archon backend service failed to start. This is typically due to a configuration issue.
</p>
<div className="bg-black/50 rounded-lg p-4 border border-red-900/50">
<div className="flex items-center gap-2 mb-3 text-red-300">
<Terminal className="w-5 h-5" />
<span className="font-semibold">Check Docker Logs</span>
</div>
<p className="text-red-100 font-mono text-sm mb-3">
Check the <span className="text-red-400 font-bold">Archon-Server</span> logs in Docker Desktop for detailed error information.
</p>
<div className="space-y-2 text-xs text-red-300">
<p>1. Open Docker Desktop</p>
<p>2. Go to Containers tab</p>
<p>3. Click on <span className="text-red-400 font-semibold">Archon-Server</span></p>
<p>4. View the logs for the specific error message</p>
</div>
</div>
<div className="bg-yellow-950/30 border border-yellow-700/30 rounded-lg p-3">
<p className="text-yellow-200 text-sm">
<strong>Common issue:</strong> Using an ANON key instead of SERVICE key in your .env file
</p>
</div>
<div className="pt-4 border-t border-red-900/30">
<p className="text-red-300 text-sm">
After fixing the issue in your .env file, recreate the Docker containers:
</p>
<code className="block mt-2 p-2 bg-black/70 rounded text-red-100 font-mono text-sm">
docker compose down && docker compose up -d
</code>
<p className="text-red-300 text-xs mt-2">
Note: Use 'down' and 'up', not 'restart' - containers need to be recreated to load new environment variables
</p>
</div>
<div className="flex justify-center pt-4">
<button
onClick={handleRetry}
className="flex items-center gap-2 px-4 py-2 bg-red-600/20 hover:bg-red-600/30 border border-red-500/50 rounded-lg text-red-100 transition-colors"
>
<RefreshCw className="w-4 h-4" />
Retry Connection
</button>
</div>
</div>
</div>
</div>
</div>
</div>
);
};

View File

@@ -6,6 +6,7 @@ import { X } from 'lucide-react';
import { useToast } from '../../contexts/ToastContext';
import { credentialsService } from '../../services/credentialsService';
import { isLmConfigured } from '../../utils/onboarding';
+import { BackendStartupError } from '../BackendStartupError';
/**
* Props for the MainLayout component
*/
@@ -29,13 +30,14 @@ export const MainLayout: React.FC<MainLayoutProps> = ({
const navigate = useNavigate();
const location = useLocation();
const [backendReady, setBackendReady] = useState(false);
+const [backendStartupFailed, setBackendStartupFailed] = useState(false);
// Check backend readiness
useEffect(() => {
const checkBackendHealth = async (retryCount = 0) => {
-const maxRetries = 10; // Increased retries for initialization
+const maxRetries = 3; // 3 retries total
-const retryDelay = 1000;
+const retryDelay = 1500; // 1.5 seconds between retries
try {
// Create AbortController for proper timeout handling
@@ -58,6 +60,7 @@ export const MainLayout: React.FC<MainLayoutProps> = ({
if (healthData.ready === true) {
console.log('✅ Backend is fully initialized');
setBackendReady(true);
+setBackendStartupFailed(false);
} else {
// Backend is starting up but not ready yet
console.log(`🔄 Backend initializing... (attempt ${retryCount + 1}/${maxRetries}):`, healthData.message || 'Loading credentials...');
@@ -66,9 +69,10 @@ export const MainLayout: React.FC<MainLayoutProps> = ({
if (retryCount < maxRetries) {
setTimeout(() => {
checkBackendHealth(retryCount + 1);
-}, retryDelay); // Constant 1s retry during initialization
+}, retryDelay); // Constant 1.5s retry during initialization
} else {
-console.warn('Backend initialization taking too long - skipping credential check');
+console.warn('Backend initialization taking too long - proceeding anyway');
+// Don't mark as failed yet, just not fully ready
setBackendReady(false);
}
}
@@ -80,7 +84,10 @@ export const MainLayout: React.FC<MainLayoutProps> = ({
const errorMessage = error instanceof Error
? (error.name === 'AbortError' ? 'Request timeout (5s)' : error.message)
: 'Unknown error';
-console.log(`Backend not ready yet (attempt ${retryCount + 1}/${maxRetries}):`, errorMessage);
+// Only log after first attempt to reduce noise during normal startup
+if (retryCount > 0) {
+console.log(`Backend not ready yet (attempt ${retryCount + 1}/${maxRetries}):`, errorMessage);
+}
// Retry if we haven't exceeded max retries
if (retryCount < maxRetries) {
@@ -88,8 +95,9 @@ export const MainLayout: React.FC<MainLayoutProps> = ({
checkBackendHealth(retryCount + 1);
}, retryDelay * Math.pow(1.5, retryCount)); // Exponential backoff for connection errors
} else {
-console.warn('Backend not ready after maximum retries - skipping credential check');
+console.error('Backend startup failed after maximum retries - showing error message');
setBackendReady(false);
+setBackendStartupFailed(true);
}
}
};
@@ -99,11 +107,16 @@ export const MainLayout: React.FC<MainLayoutProps> = ({
setTimeout(() => {
checkBackendHealth();
}, 1000); // Wait 1 second for initial app startup
-}, [showToast, navigate]); // Removed backendReady from dependencies to prevent double execution
+}, []); // Empty deps - only run once on mount
// Check for onboarding redirect after backend is ready
useEffect(() => {
const checkOnboarding = async () => {
+// Skip if backend failed to start
+if (backendStartupFailed) {
+return;
+}
// Skip if not ready, already on onboarding, or already dismissed
if (!backendReady || location.pathname === '/onboarding') {
return;
@@ -152,9 +165,12 @@ export const MainLayout: React.FC<MainLayoutProps> = ({
};
checkOnboarding();
-}, [backendReady, location.pathname, navigate, showToast]);
+}, [backendReady, backendStartupFailed, location.pathname, navigate, showToast]);
return <div className="relative min-h-screen bg-white dark:bg-black overflow-hidden">
+{/* Show backend startup error if backend failed to start */}
+{backendStartupFailed && <BackendStartupError />}
{/* Fixed full-page background grid that doesn't scroll */}
<div className="fixed inset-0 neon-grid pointer-events-none z-0"></div>
{/* Floating Navigation */}
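Pulled together from the hunks above, the consolidated readiness flow is: poll the health endpoint a few times, wait a constant interval while the server reports it is still initializing, back off exponentially on connection errors, and flag a hard failure only when the server never answered. The sketch below condenses that flow; the function name, endpoint path, and payload shape ({ ready: boolean }) are assumptions for illustration, and the real logic lives in MainLayout's checkBackendHealth.

```typescript
// Condensed sketch of the consolidated backend readiness check (illustrative only).
type BackendStatus = "ready" | "initializing" | "failed";

async function waitForBackend(
  baseUrl: string,
  maxRetries = 3,
  retryDelayMs = 1500,
): Promise<BackendStatus> {
  let everResponded = false;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), 5000); // 5s timeout per request
    try {
      const res = await fetch(`${baseUrl}/api/health`, { signal: controller.signal });
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      everResponded = true;
      const health = (await res.json()) as { ready?: boolean };
      if (health.ready === true) return "ready";
      // Server answered but is still loading credentials: constant delay, then retry.
      await new Promise((r) => setTimeout(r, retryDelayMs));
    } catch {
      // Connection error or timeout: exponential backoff before the next attempt.
      await new Promise((r) => setTimeout(r, retryDelayMs * Math.pow(1.5, attempt)));
    } finally {
      clearTimeout(timer);
    }
  }
  // Unreachable server => hard failure (caller renders <BackendStartupError />);
  // reachable but never ready => let the app proceed without blocking.
  return everResponded ? "initializing" : "failed";
}
```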

View File

@@ -1,10 +1,10 @@
import { useState } from "react";
import { Key, ExternalLink, Save, Loader } from "lucide-react";
import { Input } from "../ui/Input";
import { Button } from "../ui/Button";
import { Select } from "../ui/Select";
import { useToast } from "../../contexts/ToastContext";
import { credentialsService } from "../../services/credentialsService";
interface ProviderStepProps {
onSaved: () => void;
@@ -12,14 +12,14 @@ interface ProviderStepProps {
}
export const ProviderStep = ({ onSaved, onSkip }: ProviderStepProps) => {
const [provider, setProvider] = useState("openai");
const [apiKey, setApiKey] = useState("");
const [saving, setSaving] = useState(false);
const { showToast } = useToast();
const handleSave = async () => {
if (!apiKey.trim()) {
showToast("Please enter an API key", "error");
return;
}
@@ -27,60 +27,50 @@ export const ProviderStep = ({ onSaved, onSkip }: ProviderStepProps) => {
try {
// Save the API key
await credentialsService.createCredential({
key: "OPENAI_API_KEY",
value: apiKey,
is_encrypted: true,
category: "api_keys",
});
// Update the provider setting if needed
await credentialsService.updateCredential({
key: "LLM_PROVIDER",
value: "openai",
is_encrypted: false,
category: "rag_strategy",
});
showToast("API key saved successfully!", "success");
// Mark onboarding as dismissed when API key is saved
localStorage.setItem("onboardingDismissed", "true");
onSaved();
} catch (error) {
-// Detailed error handling for critical configuration per alpha principles
-const errorMessage = error instanceof Error ? error.message : 'Unknown error';
-const errorDetails = {
-context: 'API key configuration',
-operation: 'save_openai_key',
-provider: 'openai',
-error: errorMessage,
-timestamp: new Date().toISOString()
-};
-// Log with full context and stack trace
-console.error('API_KEY_SAVE_FAILED:', errorDetails, error);
+// Log error for debugging per alpha principles
+const errorMessage =
+error instanceof Error ? error.message : "Unknown error";
+console.error("Failed to save API key:", error);
// Show specific error details to help user resolve the issue
if (
errorMessage.includes("duplicate") ||
errorMessage.includes("already exists")
) {
showToast(
"API key already exists. Please update it in Settings if you want to change it.",
"warning",
);
} else if (
errorMessage.includes("network") ||
errorMessage.includes("fetch")
) {
showToast(
`Network error while saving API key: ${errorMessage}. Please check your connection.`,
"error",
);
-} else if (errorMessage.includes('unauthorized') || errorMessage.includes('forbidden')) {
-showToast(
-`Permission error: ${errorMessage}. Please check backend configuration.`,
-'error'
-);
} else {
// Show the actual error for unknown issues
showToast(`Failed to save API key: ${errorMessage}`, "error");
}
} finally {
setSaving(false);
@@ -88,9 +78,9 @@ export const ProviderStep = ({ onSaved, onSkip }: ProviderStepProps) => {
};
const handleSkip = () => {
showToast("You can configure your provider in Settings", "info");
// Mark onboarding as dismissed when skipping
localStorage.setItem("onboardingDismissed", "true");
onSkip();
};
@@ -103,21 +93,24 @@ export const ProviderStep = ({ onSaved, onSkip }: ProviderStepProps) => {
value={provider}
onChange={(e) => setProvider(e.target.value)}
options={[
{ value: "openai", label: "OpenAI" },
{ value: "google", label: "Google Gemini" },
{ value: "ollama", label: "Ollama (Local)" },
]}
accentColor="green"
/>
<p className="mt-2 text-sm text-gray-600 dark:text-zinc-400">
{provider === "openai" &&
"OpenAI provides powerful models like GPT-4. You'll need an API key from OpenAI."}
{provider === "google" &&
"Google Gemini offers advanced AI capabilities. Configure in Settings after setup."}
{provider === "ollama" &&
"Ollama runs models locally on your machine. Configure in Settings after setup."}
</p>
</div>
{/* OpenAI API Key Input */}
{provider === "openai" && (
<>
<div>
<Input
@@ -152,10 +145,16 @@ export const ProviderStep = ({ onSaved, onSkip }: ProviderStepProps) => {
size="lg" size="lg"
onClick={handleSave} onClick={handleSave}
disabled={saving || !apiKey.trim()} disabled={saving || !apiKey.trim()}
icon={saving ? <Loader className="w-4 h-4 animate-spin" /> : <Save className="w-4 h-4" />} icon={
saving ? (
<Loader className="w-4 h-4 animate-spin" />
) : (
<Save className="w-4 h-4" />
)
}
className="flex-1" className="flex-1"
> >
{saving ? 'Saving...' : 'Save & Continue'} {saving ? "Saving..." : "Save & Continue"}
</Button> </Button>
<Button <Button
variant="outline" variant="outline"
@@ -171,15 +170,17 @@ export const ProviderStep = ({ onSaved, onSkip }: ProviderStepProps) => {
)}
{/* Non-OpenAI Provider Message */}
{provider !== "openai" && (
<div className="space-y-4">
<div className="p-4 bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-800 rounded-lg">
<p className="text-sm text-blue-800 dark:text-blue-200">
{provider === "google" &&
"Google Gemini configuration will be available in Settings after setup."}
{provider === "ollama" &&
"Ollama configuration will be available in Settings after setup. Make sure Ollama is running locally."}
</p>
</div>
<div className="flex gap-3 pt-4">
<Button
variant="primary"
@@ -188,27 +189,30 @@ export const ProviderStep = ({ onSaved, onSkip }: ProviderStepProps) => {
// Save the provider selection for non-OpenAI providers
try {
await credentialsService.updateCredential({
key: "LLM_PROVIDER",
value: provider,
is_encrypted: false,
category: "rag_strategy",
});
showToast(
`${provider === "google" ? "Google Gemini" : "Ollama"} selected as provider`,
"success",
);
// Mark onboarding as dismissed
localStorage.setItem("onboardingDismissed", "true");
onSaved();
} catch (error) {
console.error("Failed to save provider selection:", error);
showToast("Failed to save provider selection", "error");
}
}}
className="flex-1"
>
Continue with {provider === "google" ? "Gemini" : "Ollama"}
</Button>
</div>
</div>
)}
</div>
);
};

View File

@@ -70,23 +70,15 @@ export const TestStatus = () => {
};
}, []);
-// Check for test results availability
+// Test results availability - not implemented yet
useEffect(() => {
-const checkResults = async () => {
-const hasTestResults = await testService.hasTestResults();
-setHasResults(hasTestResults);
-};
-checkResults();
+setHasResults(false);
}, []);
// Check for results when UI tests complete
useEffect(() => {
if (!uiTest.isRunning && uiTest.exitCode === 0) {
-// Small delay to ensure files are written
-setTimeout(async () => {
-const hasTestResults = await testService.hasTestResults();
-setHasResults(hasTestResults);
-}, 2000);
+setHasResults(false);
}
}, [uiTest.isRunning, uiTest.exitCode]);

View File

@@ -1,19 +1,35 @@
import { useState, useEffect } from "react";
import {
Loader,
Settings,
ChevronDown,
ChevronUp,
Palette,
Key,
Brain,
Code,
Activity,
FileCode,
Bug,
} from "lucide-react";
import { motion, AnimatePresence } from "framer-motion";
import { useToast } from "../contexts/ToastContext";
import { useSettings } from "../contexts/SettingsContext";
import { useStaggeredEntrance } from "../hooks/useStaggeredEntrance";
import { FeaturesSection } from "../components/settings/FeaturesSection";
import { APIKeysSection } from "../components/settings/APIKeysSection";
import { RAGSettings } from "../components/settings/RAGSettings";
import { CodeExtractionSettings } from "../components/settings/CodeExtractionSettings";
import { TestStatus } from "../components/settings/TestStatus";
import { IDEGlobalRules } from "../components/settings/IDEGlobalRules";
import { ButtonPlayground } from "../components/settings/ButtonPlayground";
import { CollapsibleSettingsCard } from "../components/ui/CollapsibleSettingsCard";
import { BugReportButton } from "../components/bug-report/BugReportButton";
import {
credentialsService,
RagSettings,
CodeExtractionSettings as CodeExtractionSettingsType,
} from "../services/credentialsService";
export const SettingsPage = () => {
const [ragSettings, setRagSettings] = useState<RagSettings>({
@@ -22,56 +38,56 @@ export const SettingsPage = () => {
USE_HYBRID_SEARCH: true,
USE_AGENTIC_RAG: true,
USE_RERANKING: true,
MODEL_CHOICE: "gpt-4.1-nano",
});
const [codeExtractionSettings, setCodeExtractionSettings] =
useState<CodeExtractionSettingsType>({
MIN_CODE_BLOCK_LENGTH: 250,
MAX_CODE_BLOCK_LENGTH: 5000,
ENABLE_COMPLETE_BLOCK_DETECTION: true,
ENABLE_LANGUAGE_SPECIFIC_PATTERNS: true,
ENABLE_PROSE_FILTERING: true,
MAX_PROSE_RATIO: 0.15,
MIN_CODE_INDICATORS: 3,
ENABLE_DIAGRAM_FILTERING: true,
ENABLE_CONTEXTUAL_LENGTH: true,
CODE_EXTRACTION_MAX_WORKERS: 3,
CONTEXT_WINDOW_SIZE: 1000,
ENABLE_CODE_SUMMARIES: true,
});
const [loading, setLoading] = useState(true);
const [error, setError] = useState<string | null>(null);
const [showButtonPlayground, setShowButtonPlayground] = useState(false);
const { showToast } = useToast();
const { projectsEnabled } = useSettings();
// Use staggered entrance animation
const { isVisible, containerVariants, itemVariants, titleVariants } =
useStaggeredEntrance([1, 2, 3, 4], 0.15);
// Load settings on mount
useEffect(() => {
loadSettings();
}, []);
-const loadSettings = async () => {
+const loadSettings = async (isRetry = false) => {
try {
setLoading(true);
setError(null);
// Load RAG settings
const ragSettingsData = await credentialsService.getRagSettings();
setRagSettings(ragSettingsData);
// Load Code Extraction settings
const codeExtractionSettingsData =
await credentialsService.getCodeExtractionSettings();
setCodeExtractionSettings(codeExtractionSettingsData);
} catch (err) {
setError("Failed to load settings");
console.error(err);
showToast("Failed to load settings", "error");
} finally {
setLoading(false);
}
@@ -88,12 +104,15 @@ export const SettingsPage = () => {
return (
<motion.div
initial="hidden"
animate={isVisible ? "visible" : "hidden"}
variants={containerVariants}
className="w-full"
>
{/* Header */}
<motion.div
className="flex justify-between items-center mb-8"
variants={itemVariants}
>
<motion.h1
className="text-3xl font-bold text-gray-800 dark:text-white flex items-center gap-3"
variants={titleVariants}
@@ -103,6 +122,7 @@ export const SettingsPage = () => {
</motion.h1>
</motion.div>
{/* Main content with two-column layout */}
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6">
{/* Left Column */}
@@ -165,7 +185,10 @@ export const SettingsPage = () => {
storageKey="rag-settings" storageKey="rag-settings"
defaultExpanded={true} defaultExpanded={true}
> >
<RAGSettings ragSettings={ragSettings} setRagSettings={setRagSettings} /> <RAGSettings
ragSettings={ragSettings}
setRagSettings={setRagSettings}
/>
</CollapsibleSettingsCard> </CollapsibleSettingsCard>
</motion.div> </motion.div>
<motion.div variants={itemVariants}> <motion.div variants={itemVariants}>
@@ -176,9 +199,9 @@ export const SettingsPage = () => {
storageKey="code-extraction" storageKey="code-extraction"
defaultExpanded={true} defaultExpanded={true}
> >
<CodeExtractionSettings <CodeExtractionSettings
codeExtractionSettings={codeExtractionSettings} codeExtractionSettings={codeExtractionSettings}
setCodeExtractionSettings={setCodeExtractionSettings} setCodeExtractionSettings={setCodeExtractionSettings}
/> />
</CollapsibleSettingsCard> </CollapsibleSettingsCard>
</motion.div> </motion.div>
@@ -194,7 +217,8 @@ export const SettingsPage = () => {
>
<div className="space-y-4">
<p className="text-sm text-gray-600 dark:text-gray-400">
Found a bug or issue? Report it to help improve Archon V2
Alpha.
</p>
<div className="flex justify-start">
<BugReportButton variant="secondary" size="md">
@@ -234,7 +258,7 @@ export const SettingsPage = () => {
{showButtonPlayground && (
<motion.div
initial={{ opacity: 0, height: 0 }}
animate={{ opacity: 1, height: "auto" }}
exit={{ opacity: 0, height: 0 }}
transition={{ duration: 0.3 }}
className="overflow-hidden"
@@ -257,4 +281,4 @@ export const SettingsPage = () => {
)}
</motion.div>
);
};

View File

@@ -53,56 +53,81 @@ export interface CodeExtractionSettings {
ENABLE_CODE_SUMMARIES: boolean;
}
import { getApiUrl } from "../config/api";
class CredentialsService {
private baseUrl = getApiUrl();
+private handleCredentialError(error: any, context: string): Error {
+const errorMessage = error instanceof Error ? error.message : String(error);
+// Check for network errors
+if (
+errorMessage.toLowerCase().includes("network") ||
+errorMessage.includes("fetch") ||
+errorMessage.includes("Failed to fetch")
+) {
+return new Error(
+`Network error while ${context.toLowerCase()}: ${errorMessage}. ` +
+`Please check your connection and server status.`,
+);
+}
+// Return original error with context
+return new Error(`${context} failed: ${errorMessage}`);
+}
async getAllCredentials(): Promise<Credential[]> {
const response = await fetch(`${this.baseUrl}/api/credentials`);
if (!response.ok) {
throw new Error("Failed to fetch credentials");
}
return response.json();
}
async getCredentialsByCategory(category: string): Promise<Credential[]> {
const response = await fetch(
`${this.baseUrl}/api/credentials/categories/${category}`,
);
if (!response.ok) {
throw new Error(`Failed to fetch credentials for category: ${category}`);
}
const result = await response.json();
// The API returns {credentials: {...}} where credentials is a dict
// Convert to array format expected by frontend
if (result.credentials && typeof result.credentials === "object") {
return Object.entries(result.credentials).map(
([key, value]: [string, any]) => {
if (value && typeof value === "object" && value.is_encrypted) {
return {
key,
value: undefined,
encrypted_value: value.encrypted_value,
is_encrypted: true,
category,
description: value.description,
};
} else {
return {
key,
value: value,
encrypted_value: undefined,
is_encrypted: false,
category,
description: "",
};
}
},
);
}
return [];
}
async getCredential(
key: string,
): Promise<{ key: string; value?: string; is_encrypted?: boolean }> {
const response = await fetch(`${this.baseUrl}/api/credentials/${key}`);
if (!response.ok) {
if (response.status === 404) {
@@ -115,24 +140,24 @@ class CredentialsService {
}
async getRagSettings(): Promise<RagSettings> {
const ragCredentials = await this.getCredentialsByCategory("rag_strategy");
const apiKeysCredentials = await this.getCredentialsByCategory("api_keys");
const settings: RagSettings = {
USE_CONTEXTUAL_EMBEDDINGS: false,
CONTEXTUAL_EMBEDDINGS_MAX_WORKERS: 3,
USE_HYBRID_SEARCH: true,
USE_AGENTIC_RAG: true,
USE_RERANKING: true,
MODEL_CHOICE: "gpt-4.1-nano",
LLM_PROVIDER: "openai",
LLM_BASE_URL: "",
EMBEDDING_MODEL: "",
// Crawling Performance Settings defaults
CRAWL_BATCH_SIZE: 50,
CRAWL_MAX_CONCURRENT: 10,
CRAWL_WAIT_STRATEGY: "domcontentloaded",
CRAWL_PAGE_TIMEOUT: 60000, // Increased from 30s to 60s for documentation sites
CRAWL_DELAY_BEFORE_HTML: 0.5,
// Storage Performance Settings defaults
DOCUMENT_STORAGE_BATCH_SIZE: 50,
@@ -143,30 +168,50 @@ class CredentialsService {
MEMORY_THRESHOLD_PERCENT: 80,
DISPATCHER_CHECK_INTERVAL: 30,
CODE_EXTRACTION_BATCH_SIZE: 50,
CODE_SUMMARY_MAX_WORKERS: 3,
};
// Map credentials to settings
[...ragCredentials, ...apiKeysCredentials].forEach((cred) => {
if (cred.key in settings) {
// String fields
if (
[
"MODEL_CHOICE",
"LLM_PROVIDER",
"LLM_BASE_URL",
"EMBEDDING_MODEL",
"CRAWL_WAIT_STRATEGY",
].includes(cred.key)
) {
(settings as any)[cred.key] = cred.value || "";
}
// Number fields
else if (
[
"CONTEXTUAL_EMBEDDINGS_MAX_WORKERS",
"CRAWL_BATCH_SIZE",
"CRAWL_MAX_CONCURRENT",
"CRAWL_PAGE_TIMEOUT",
"DOCUMENT_STORAGE_BATCH_SIZE",
"EMBEDDING_BATCH_SIZE",
"DELETE_BATCH_SIZE",
"MEMORY_THRESHOLD_PERCENT",
"DISPATCHER_CHECK_INTERVAL",
"CODE_EXTRACTION_BATCH_SIZE",
"CODE_SUMMARY_MAX_WORKERS",
].includes(cred.key)
) {
(settings as any)[cred.key] =
parseInt(cred.value || "0", 10) || (settings as any)[cred.key];
}
// Float fields
else if (cred.key === "CRAWL_DELAY_BEFORE_HTML") {
settings[cred.key] = parseFloat(cred.value || "0.5") || 0.5;
}
// Boolean fields
else {
(settings as any)[cred.key] = cred.value === "true";
}
}
});
@@ -175,71 +220,96 @@ class CredentialsService {
}
async updateCredential(credential: Credential): Promise<Credential> {
-const response = await fetch(`${this.baseUrl}/api/credentials/${credential.key}`, {
-method: 'PUT',
-headers: {
-'Content-Type': 'application/json',
-},
-body: JSON.stringify(credential),
-});
-if (!response.ok) {
-throw new Error('Failed to update credential');
-}
-return response.json();
+try {
+const response = await fetch(
+`${this.baseUrl}/api/credentials/${credential.key}`,
+{
+method: "PUT",
+headers: {
+"Content-Type": "application/json",
+},
+body: JSON.stringify(credential),
+},
+);
+if (!response.ok) {
+const errorText = await response.text();
+throw new Error(`HTTP ${response.status}: ${errorText}`);
+}
+return response.json();
+} catch (error) {
+throw this.handleCredentialError(
+error,
+`Updating credential '${credential.key}'`,
+);
+}
}
async createCredential(credential: Credential): Promise<Credential> {
-const response = await fetch(`${this.baseUrl}/api/credentials`, {
-method: 'POST',
-headers: {
-'Content-Type': 'application/json',
-},
-body: JSON.stringify(credential),
-});
-if (!response.ok) {
-throw new Error('Failed to create credential');
-}
-return response.json();
+try {
+const response = await fetch(`${this.baseUrl}/api/credentials`, {
+method: "POST",
+headers: {
+"Content-Type": "application/json",
+},
+body: JSON.stringify(credential),
+});
+if (!response.ok) {
+const errorText = await response.text();
+throw new Error(`HTTP ${response.status}: ${errorText}`);
+}
+return response.json();
+} catch (error) {
+throw this.handleCredentialError(
+error,
+`Creating credential '${credential.key}'`,
+);
+}
}
async deleteCredential(key: string): Promise<void> {
-const response = await fetch(`${this.baseUrl}/api/credentials/${key}`, {
-method: 'DELETE',
-});
-if (!response.ok) {
-throw new Error('Failed to delete credential');
-}
+try {
+const response = await fetch(`${this.baseUrl}/api/credentials/${key}`, {
+method: "DELETE",
+});
+if (!response.ok) {
+const errorText = await response.text();
+throw new Error(`HTTP ${response.status}: ${errorText}`);
+}
+} catch (error) {
+throw this.handleCredentialError(error, `Deleting credential '${key}'`);
+}
}
async updateRagSettings(settings: RagSettings): Promise<void> {
const promises = [];
// Update all RAG strategy settings
for (const [key, value] of Object.entries(settings)) {
// Skip undefined values
if (value === undefined) continue;
promises.push(
this.updateCredential({
key,
value: value.toString(),
is_encrypted: false,
category: "rag_strategy",
}),
);
}
await Promise.all(promises);
}
async getCodeExtractionSettings(): Promise<CodeExtractionSettings> {
const codeExtractionCredentials =
await this.getCredentialsByCategory("code_extraction");
const settings: CodeExtractionSettings = {
MIN_CODE_BLOCK_LENGTH: 250,
MAX_CODE_BLOCK_LENGTH: 5000,
@@ -252,21 +322,24 @@ class CredentialsService {
ENABLE_CONTEXTUAL_LENGTH: true,
CODE_EXTRACTION_MAX_WORKERS: 3,
CONTEXT_WINDOW_SIZE: 1000,
ENABLE_CODE_SUMMARIES: true,
};
// Map credentials to settings
codeExtractionCredentials.forEach((cred) => {
if (cred.key in settings) {
const key = cred.key as keyof CodeExtractionSettings;
if (typeof settings[key] === "number") {
if (key === "MAX_PROSE_RATIO") {
settings[key] = parseFloat(cred.value || "0.15");
} else {
settings[key] = parseInt(
cred.value || settings[key].toString(),
10,
);
}
} else if (typeof settings[key] === "boolean") {
settings[key] = cred.value === "true";
}
}
});
@@ -274,9 +347,11 @@ class CredentialsService {
    return settings;
  }

  async updateCodeExtractionSettings(
    settings: CodeExtractionSettings,
  ): Promise<void> {
    const promises = [];

    // Update all code extraction settings
    for (const [key, value] of Object.entries(settings)) {
      promises.push(
@@ -284,13 +359,13 @@ class CredentialsService {
          key,
          value: value.toString(),
          is_encrypted: false,
          category: "code_extraction",
        }),
      );
    }

    await Promise.all(promises);
  }
}

export const credentialsService = new CredentialsService();
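Note on the error path: the try/catch blocks above all funnel failures through handleCredentialError, which this hunk does not show. Purely as a sketch, a helper consistent with the error-handling tests added below (they expect messages like "Network error while creating credential '...'" and "Updating credential '...' failed") could look like the following; the actual method on CredentialsService may differ:

// Hypothetical sketch only - not part of this diff.
// Fetch-level failures become "Network error while <context>" messages;
// anything else is re-wrapped with the operation context up front.
function handleCredentialError(error: unknown, context: string): Error {
  const message = error instanceof Error ? error.message : String(error);
  if (/network|fetch/i.test(message)) {
    return new Error(`Network error while ${context.toLowerCase()}: ${message}`);
  }
  return new Error(`${context} failed: ${message}`);
}

For the settings helpers, a typical round trip (using only methods and field names that appear in this diff) is:

// Inside an async function: read the persisted code-extraction settings,
// adjust one value, and write everything back as "code_extraction" credentials.
const current = await credentialsService.getCodeExtractionSettings();
await credentialsService.updateCodeExtractionSettings({
  ...current,
  CODE_EXTRACTION_MAX_WORKERS: 5,
});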

View File

@@ -242,18 +242,6 @@ class TestService {
    }
  }

  // (deleted by this commit)
  /**
   * Check if test results are available
   */
  async hasTestResults(): Promise<boolean> {
    try {
      // Check for latest test results via API
      const response = await fetch(`${API_BASE_URL}/api/tests/latest-results`);
      return response.ok;
    } catch {
      return false;
    }
  }

  /**
   * Get coverage data for Test Results Modal from new API endpoints with fallback

View File

@@ -1,6 +1,7 @@
import { render, screen, fireEvent } from '@testing-library/react'
import { describe, test, expect, vi, beforeEach, afterEach } from 'vitest'
import React from 'react'
import { credentialsService } from '../src/services/credentialsService'

describe('Error Handling Tests', () => {
  test('api error simulation', () => {
@@ -196,4 +197,40 @@ describe('Error Handling Tests', () => {
    fireEvent.click(screen.getByText('500 Error'))
    expect(screen.getByRole('alert')).toHaveTextContent('Something went wrong on our end')
  })
})

describe('CredentialsService Error Handling', () => {
  const originalFetch = global.fetch

  beforeEach(() => {
    global.fetch = vi.fn() as any
  })

  afterEach(() => {
    global.fetch = originalFetch
  })

  test('should handle network errors with context', async () => {
    const mockError = new Error('Network request failed')
    ;(global.fetch as any).mockRejectedValueOnce(mockError)

    await expect(credentialsService.createCredential({
      key: 'TEST_KEY',
      value: 'test',
      is_encrypted: false,
      category: 'test'
    })).rejects.toThrow(/Network error while creating credential 'test_key'/)
  })

  test('should preserve context in error messages', async () => {
    const mockError = new Error('database error')
    ;(global.fetch as any).mockRejectedValueOnce(mockError)

    await expect(credentialsService.updateCredential({
      key: 'OPENAI_API_KEY',
      value: 'sk-test',
      is_encrypted: true,
      category: 'api_keys'
    })).rejects.toThrow(/Updating credential 'OPENAI_API_KEY' failed/)
  })
})

View File

@@ -74,7 +74,7 @@ describe('Onboarding Detection Tests', () => {
      { key: 'LLM_PROVIDER', value: 'openai', category: 'rag_strategy' }
    ]
    const apiKeyCreds: NormalizedCredential[] = [
      { key: 'OPENAI_API_KEY', is_encrypted: true, encrypted_value: 'encrypted_sk-test123', category: 'api_keys' }
    ]

    expect(isLmConfigured(ragCreds, apiKeyCreds)).toBe(true)

View File

@@ -2,6 +2,9 @@ import { expect, afterEach, vi } from 'vitest'
import { cleanup } from '@testing-library/react'
import '@testing-library/jest-dom/vitest'

// Set required environment variables for tests
process.env.ARCHON_SERVER_PORT = '8181'

// Clean up after each test
afterEach(() => {
  cleanup()
@@ -15,7 +18,7 @@ global.fetch = vi.fn(() =>
    text: () => Promise.resolve(''),
    status: 200,
  } as Response)
) as any

// Mock WebSocket
class MockWebSocket {

View File

@@ -3,7 +3,7 @@
-- =====================================================
-- This script combines all migrations into a single file
-- for easy one-time database initialization
--
-- Run this script in your Supabase SQL Editor to set up
-- the complete Archon database schema and initial data
-- =====================================================
@@ -47,9 +47,9 @@ BEGIN
END;
$$ language 'plpgsql';

CREATE TRIGGER update_archon_settings_updated_at
    BEFORE UPDATE ON archon_settings
    FOR EACH ROW
    EXECUTE FUNCTION update_updated_at_column();

-- Create RLS (Row Level Security) policies for settings
@@ -197,10 +197,10 @@ CREATE TABLE IF NOT EXISTS archon_crawled_pages (
    source_id TEXT NOT NULL,
    embedding VECTOR(1536),  -- OpenAI embeddings are 1536 dimensions
    created_at TIMESTAMP WITH TIME ZONE DEFAULT timezone('utc'::text, now()) NOT NULL,

    -- Add a unique constraint to prevent duplicate chunks for the same URL
    UNIQUE(url, chunk_number),

    -- Add foreign key constraint to sources table
    FOREIGN KEY (source_id) REFERENCES archon_sources(source_id)
);
@@ -221,10 +221,10 @@ CREATE TABLE IF NOT EXISTS archon_code_examples (
    source_id TEXT NOT NULL,
    embedding VECTOR(1536),  -- OpenAI embeddings are 1536 dimensions
    created_at TIMESTAMP WITH TIME ZONE DEFAULT timezone('utc'::text, now()) NOT NULL,

    -- Add a unique constraint to prevent duplicate chunks for the same URL
    UNIQUE(url, chunk_number),

    -- Add foreign key constraint to sources table
    FOREIGN KEY (source_id) REFERENCES archon_sources(source_id)
);
@@ -416,7 +416,7 @@ CREATE TABLE IF NOT EXISTS archon_document_versions (
    created_at TIMESTAMPTZ DEFAULT NOW(),

    -- Ensure we have either project_id OR task_id, not both
    CONSTRAINT chk_project_or_task CHECK (
        (project_id IS NOT NULL AND task_id IS NULL) OR
        (project_id IS NULL AND task_id IS NOT NULL)
    ),

    -- Unique constraint to prevent duplicate version numbers per field
@@ -439,51 +439,51 @@ CREATE INDEX IF NOT EXISTS idx_archon_document_versions_version_number ON archon
CREATE INDEX IF NOT EXISTS idx_archon_document_versions_created_at ON archon_document_versions(created_at);

-- Apply triggers to tables
CREATE OR REPLACE TRIGGER update_archon_projects_updated_at
    BEFORE UPDATE ON archon_projects
    FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();

CREATE OR REPLACE TRIGGER update_archon_tasks_updated_at
    BEFORE UPDATE ON archon_tasks
    FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();

-- Soft delete function for tasks
CREATE OR REPLACE FUNCTION archive_task(
    task_id_param UUID,
    archived_by_param TEXT DEFAULT 'system'
)
RETURNS BOOLEAN AS $$
DECLARE
    task_exists BOOLEAN;
BEGIN
    -- Check if task exists and is not already archived
    SELECT EXISTS(
        SELECT 1 FROM archon_tasks
        WHERE id = task_id_param AND archived = FALSE
    ) INTO task_exists;

    IF NOT task_exists THEN
        RETURN FALSE;
    END IF;

    -- Archive the task
    UPDATE archon_tasks
    SET
        archived = TRUE,
        archived_at = NOW(),
        archived_by = archived_by_param,
        updated_at = NOW()
    WHERE id = task_id_param;

    -- Also archive all subtasks
    UPDATE archon_tasks
    SET
        archived = TRUE,
        archived_at = NOW(),
        archived_by = archived_by_param,
        updated_at = NOW()
    WHERE parent_task_id = task_id_param AND archived = FALSE;

    RETURN TRUE;
END;
$$ LANGUAGE plpgsql;
@@ -520,8 +520,8 @@ CREATE TABLE IF NOT EXISTS archon_prompts (
CREATE INDEX IF NOT EXISTS idx_archon_prompts_name ON archon_prompts(prompt_name);

-- Add trigger to automatically update updated_at timestamp
CREATE OR REPLACE TRIGGER update_archon_prompts_updated_at
    BEFORE UPDATE ON archon_prompts
    FOR EACH ROW EXECUTE FUNCTION update_updated_at_column();

-- =====================================================
@@ -787,9 +787,9 @@ Remember: Create production-ready data models.', 'System prompt for creating dat
-- SETUP COMPLETE
-- =====================================================
-- Your Archon database is now fully configured!
--
-- Next steps:
-- 1. Add your OpenAI API key via the Settings UI
-- 2. Enable Projects feature if needed
-- 3. Start crawling websites or uploading documents
-- =====================================================

View File

@@ -6,6 +6,8 @@ import os
from dataclasses import dataclass
from urllib.parse import urlparse

import jwt


class ConfigurationError(Exception):
    """Raised when there's an error in configuration."""
@@ -46,6 +48,38 @@ def validate_openai_api_key(api_key: str) -> bool:
    return True


def validate_supabase_key(supabase_key: str) -> tuple[bool, str]:
    """Validate Supabase key type and return validation result.

    Returns:
        tuple[bool, str]: (is_valid, message)
        - (False, "ANON_KEY_DETECTED") if anon key detected
        - (True, "VALID_SERVICE_KEY") if service key detected
        - (False, "UNKNOWN_KEY_TYPE:{role}") for unknown roles
        - (True, "UNABLE_TO_VALIDATE") if JWT cannot be decoded
    """
    if not supabase_key:
        return False, "EMPTY_KEY"

    try:
        # Decode JWT without verification to check the 'role' claim
        # We don't verify the signature since we only need to check the role
        decoded = jwt.decode(supabase_key, options={"verify_signature": False})
        role = decoded.get("role")

        if role == "anon":
            return False, "ANON_KEY_DETECTED"
        elif role == "service_role":
            return True, "VALID_SERVICE_KEY"
        else:
            return False, f"UNKNOWN_KEY_TYPE:{role}"
    except Exception:
        # If we can't decode the JWT, we'll allow it to proceed
        # This handles new key formats or non-JWT keys
        return True, "UNABLE_TO_VALIDATE"


def validate_supabase_url(url: str) -> bool:
    """Validate Supabase URL format."""
    if not url:
@@ -80,6 +114,29 @@ def load_environment_config() -> EnvironmentConfig:
    validate_openai_api_key(openai_api_key)
    validate_supabase_url(supabase_url)

    # Validate Supabase key type
    is_valid_key, key_message = validate_supabase_key(supabase_service_key)
    if not is_valid_key:
        if key_message == "ANON_KEY_DETECTED":
            raise ConfigurationError(
                "CRITICAL: You are using a Supabase ANON key instead of a SERVICE key.\n\n"
                "The ANON key is a public key with read-only permissions that cannot write to the database.\n"
                "This will cause all database operations to fail with 'permission denied' errors.\n\n"
                "To fix this:\n"
                "1. Go to your Supabase project dashboard\n"
                "2. Navigate to Settings > API keys\n"
                "3. Find the 'service_role' key (NOT the 'anon' key)\n"
                "4. Update your SUPABASE_SERVICE_KEY environment variable\n\n"
                "Key characteristics:\n"
                "- ANON key: Starts with 'eyJ...' and has role='anon' (public, read-only)\n"
                "- SERVICE key: Starts with 'eyJ...' and has role='service_role' (private, full access)\n\n"
                "Current key role detected: anon"
            )
        elif key_message.startswith("UNKNOWN_KEY_TYPE:"):
            role = key_message.split(":", 1)[1]
            print(f"WARNING: Unknown Supabase key role '{role}'. Proceeding but may cause issues.")
        # For UNABLE_TO_VALIDATE, we continue silently

    # Optional environment variables with defaults
    host = os.getenv("HOST", "0.0.0.0")
    port_str = os.getenv("PORT")
@@ -97,8 +154,8 @@ def load_environment_config() -> EnvironmentConfig:
    # Validate and convert port
    try:
        port = int(port_str)
    except ValueError as e:
        raise ConfigurationError(f"PORT must be a valid integer, got: {port_str}") from e

    return EnvironmentConfig(
        openai_api_key=openai_api_key,
@@ -110,6 +167,11 @@ def load_environment_config() -> EnvironmentConfig:
    )


def get_config() -> EnvironmentConfig:
    """Get environment configuration with validation."""
    return load_environment_config()


def get_rag_strategy_config() -> RAGStrategyConfig:
    """Load RAG strategy configuration from environment variables."""

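Because validate_supabase_key() only reads the JWT payload (no signature verification), the same role check can be reproduced anywhere a key is at hand, for example if a UI-side warning were ever wanted. A hypothetical TypeScript equivalent, not part of this commit and shown only for illustration (assumes a Node-style Buffer is available):

// Hypothetical mirror of validate_supabase_key(): report the role claim
// embedded in a Supabase key, or undefined if the key isn't a decodable JWT
// (e.g. newer, non-JWT key formats).
function supabaseKeyRole(key: string): string | undefined {
  const parts = key.split(".");
  if (parts.length !== 3) return undefined; // not a JWT
  try {
    const payload = JSON.parse(Buffer.from(parts[1], "base64url").toString("utf8"));
    return typeof payload.role === "string" ? payload.role : undefined;
  } catch {
    return undefined;
  }
}

// Example: surface the same misconfiguration the backend rejects, but earlier.
if (supabaseKeyRole(process.env.SUPABASE_SERVICE_KEY ?? "") === "anon") {
  console.warn("SUPABASE_SERVICE_KEY looks like an anon key - use the service_role key instead.");
}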
View File

@@ -80,6 +80,11 @@ async def lifespan(app: FastAPI):
    logger.info("🚀 Starting Archon backend...")

    try:
        # Validate configuration FIRST - check for anon vs service key
        from .config.config import get_config

        get_config()  # This will raise ConfigurationError if anon key detected

        # Initialize credentials from database FIRST - this is the foundation for everything else
        await initialize_credentials()

View File

@@ -227,14 +227,11 @@ class CredentialService:
                self._cache[key] = value

            # Upsert to database with proper conflict handling
            # Since we validate service key at startup, permission errors here indicate actual database issues
            supabase.table("archon_settings").upsert(
                data,
                on_conflict="key",  # Specify the unique column for conflict resolution
            ).execute()

            # Invalidate RAG settings cache if this is a rag_strategy setting
            if category == "rag_strategy":
@@ -256,7 +253,8 @@ class CredentialService:
        try:
            supabase = self._get_supabase_client()
            # Since we validate service key at startup, we can directly execute
            supabase.table("archon_settings").delete().eq("key", key).execute()

            # Remove from cache
            if key in self._cache:

View File

@@ -56,3 +56,5 @@ def test_existing_credential_returns_normally(client, mock_supabase_client):
    assert data["is_encrypted"] is False
    # Should not have is_default flag for real credentials
    assert "is_default" not in data

View File

@@ -0,0 +1,221 @@
"""
Unit tests for Supabase key validation functionality.
Tests the JWT-based validation of anon vs service keys.
"""
import pytest
import jwt
from unittest.mock import patch, MagicMock
from src.server.config.config import (
validate_supabase_key,
ConfigurationError,
load_environment_config,
)
def test_validate_anon_key():
    """Test validation detects anon key correctly."""
    # Create mock anon key JWT
    anon_payload = {"role": "anon", "iss": "supabase"}
    anon_token = jwt.encode(anon_payload, "secret", algorithm="HS256")

    is_valid, msg = validate_supabase_key(anon_token)
    assert is_valid == False
    assert msg == "ANON_KEY_DETECTED"


def test_validate_service_key():
    """Test validation detects service key correctly."""
    # Create mock service key JWT
    service_payload = {"role": "service_role", "iss": "supabase"}
    service_token = jwt.encode(service_payload, "secret", algorithm="HS256")

    is_valid, msg = validate_supabase_key(service_token)
    assert is_valid == True
    assert msg == "VALID_SERVICE_KEY"


def test_validate_unknown_key():
    """Test validation handles unknown key roles."""
    # Create mock key with unknown role
    unknown_payload = {"role": "custom", "iss": "supabase"}
    unknown_token = jwt.encode(unknown_payload, "secret", algorithm="HS256")

    is_valid, msg = validate_supabase_key(unknown_token)
    assert is_valid == False
    assert "UNKNOWN_KEY_TYPE" in msg
    assert "custom" in msg


def test_validate_invalid_jwt():
    """Test validation handles invalid JWT format gracefully."""
    is_valid, msg = validate_supabase_key("not-a-jwt")
    # Should allow invalid JWT to proceed (might be new format)
    assert is_valid == True
    assert msg == "UNABLE_TO_VALIDATE"


def test_validate_empty_key():
    """Test validation handles empty key."""
    is_valid, msg = validate_supabase_key("")
    assert is_valid == False
    assert msg == "EMPTY_KEY"
def test_config_raises_on_anon_key():
    """Test that configuration loading raises error when anon key detected."""
    # Create a mock anon key JWT
    anon_payload = {"role": "anon", "iss": "supabase"}
    mock_anon_key = jwt.encode(anon_payload, "secret", algorithm="HS256")

    with patch.dict(
        "os.environ",
        {
            "SUPABASE_URL": "https://test.supabase.co",
            "SUPABASE_SERVICE_KEY": mock_anon_key,
            "OPENAI_API_KEY": ""  # Clear any existing key
        },
        clear=True  # Clear all env vars to ensure isolation
    ):
        with pytest.raises(ConfigurationError) as exc_info:
            load_environment_config()

        error_message = str(exc_info.value)
        assert "CRITICAL: You are using a Supabase ANON key" in error_message
        assert "service_role" in error_message
        assert "permission denied" in error_message


def test_config_accepts_service_key():
    """Test that configuration loading accepts service key."""
    # Create a mock service key JWT
    service_payload = {"role": "service_role", "iss": "supabase"}
    mock_service_key = jwt.encode(service_payload, "secret", algorithm="HS256")

    with patch.dict(
        "os.environ",
        {
            "SUPABASE_URL": "https://test.supabase.co",
            "SUPABASE_SERVICE_KEY": mock_service_key,
            "PORT": "8051",  # Required for config
            "OPENAI_API_KEY": ""  # Clear any existing key
        },
        clear=True  # Clear all env vars to ensure isolation
    ):
        # Should not raise an exception
        config = load_environment_config()
        assert config.supabase_service_key == mock_service_key


def test_config_handles_invalid_jwt():
    """Test that configuration loading handles invalid JWT gracefully."""
    with patch.dict(
        "os.environ",
        {
            "SUPABASE_URL": "https://test.supabase.co",
            "SUPABASE_SERVICE_KEY": "invalid-jwt-key",
            "PORT": "8051",  # Required for config
            "OPENAI_API_KEY": ""  # Clear any existing key
        },
        clear=True  # Clear all env vars to ensure isolation
    ):
        with patch("builtins.print") as mock_print:
            # Should not raise an exception for invalid JWT
            config = load_environment_config()
            assert config.supabase_service_key == "invalid-jwt-key"


def test_config_warns_on_unknown_role():
    """Test that configuration loading warns for unknown roles.

    NOTE: This currently prints a warning but doesn't fail.
    TODO: Per alpha principles, unknown key types should fail fast, not just warn.
    """
    # Create a mock key with unknown role
    unknown_payload = {"role": "custom_role", "iss": "supabase"}
    mock_unknown_key = jwt.encode(unknown_payload, "secret", algorithm="HS256")

    with patch.dict(
        "os.environ",
        {
            "SUPABASE_URL": "https://test.supabase.co",
            "SUPABASE_SERVICE_KEY": mock_unknown_key,
            "PORT": "8051",  # Required for config
            "OPENAI_API_KEY": ""  # Clear any existing key
        },
        clear=True  # Clear all env vars to ensure isolation
    ):
        with patch("builtins.print") as mock_print:
            # Should not raise an exception but should print warning
            config = load_environment_config()
            assert config.supabase_service_key == mock_unknown_key

            # Check that warning was printed
            mock_print.assert_called_once()
            call_args = mock_print.call_args[0][0]
            assert "WARNING: Unknown Supabase key role 'custom_role'" in call_args
def test_config_raises_on_anon_key_with_port():
    """Test that anon key detection works properly with all required env vars."""
    # Create a mock anon key JWT
    anon_payload = {"role": "anon", "iss": "supabase"}
    mock_anon_key = jwt.encode(anon_payload, "secret", algorithm="HS256")

    with patch.dict(
        "os.environ",
        {
            "SUPABASE_URL": "https://test.supabase.co",
            "SUPABASE_SERVICE_KEY": mock_anon_key,
            "PORT": "8051",
            "OPENAI_API_KEY": "sk-test123"  # Valid OpenAI key
        },
        clear=True
    ):
        # Should still raise ConfigurationError for anon key even with valid OpenAI key
        with pytest.raises(ConfigurationError) as exc_info:
            load_environment_config()

        error_message = str(exc_info.value)
        assert "CRITICAL: You are using a Supabase ANON key" in error_message


def test_jwt_decoding_with_real_structure():
    """Test JWT decoding with realistic Supabase JWT structure."""
    # More realistic Supabase JWT payload structure
    realistic_anon_payload = {
        "aud": "authenticated",
        "exp": 1999999999,
        "iat": 1234567890,
        "iss": "supabase",
        "ref": "abcdefghij",
        "role": "anon",
    }
    realistic_service_payload = {
        "aud": "authenticated",
        "exp": 1999999999,
        "iat": 1234567890,
        "iss": "supabase",
        "ref": "abcdefghij",
        "role": "service_role",
    }

    anon_token = jwt.encode(realistic_anon_payload, "secret", algorithm="HS256")
    service_token = jwt.encode(realistic_service_payload, "secret", algorithm="HS256")

    # Test anon key detection
    is_valid_anon, msg_anon = validate_supabase_key(anon_token)
    assert is_valid_anon == False
    assert msg_anon == "ANON_KEY_DETECTED"

    # Test service key detection
    is_valid_service, msg_service = validate_supabase_key(service_token)
    assert is_valid_service == True
    assert msg_service == "VALID_SERVICE_KEY"