AIclinicalresearch/backend/prisma/manual-migrations/001_add_postgres_cache_and_checkpoint.sql
HaHafeng 96290d2f76 feat(aia): Implement Protocol Agent MVP with reusable Agent framework
Sprint 1-3 Completed (Backend + Frontend):

Backend (Sprint 1-2):
- Implement 5-layer Agent framework (Query->Planner->Executor->Tools->Reflection)
- Create agent_schema with 6 tables (agent_definitions, stages, prompts, sessions, traces, reflexion_rules)
- Create protocol_schema with 2 tables (protocol_contexts, protocol_generations)
- Implement Protocol Agent core services (Orchestrator, ContextService, PromptBuilder)
- Integrate LLM service adapter (DeepSeek/Qwen/GPT-5/Claude)
- 6 API endpoints with full authentication
- 10/10 API tests passed

Frontend (Sprint 3):
- Add Protocol Agent entry in AgentHub (indigo theme card)
- Implement ProtocolAgentPage with 3-column layout
- Collapsible sidebar (Gemini style, 48px <-> 280px)
- StatePanel with 5 stage cards (scientific_question, pico, study_design, sample_size, endpoints)
- ChatArea with sync button and action cards integration
- Faithful restoration of the prototype design (608 lines of CSS)
- Detailed endpoints structure: baseline, exposure, outcomes, confounders

Features:
- 5-stage dialogue flow for research protocol design
- Conversation-driven interaction with sync-to-protocol button
- Real-time context state management
- One-click protocol generation button (UI ready, backend pending)

Database:
- agent_schema: 6 tables for reusable Agent framework
- protocol_schema: 2 tables for Protocol Agent
- Seed data: 1 agent + 5 stages + 9 prompts + 4 reflexion rules

Code Stats:
- Backend: 13 files, 4338 lines
- Frontend: 14 files, 2071 lines
- Total: 27 files, 6409 lines

Status: MVP core functionality complete; frontend-backend integration testing pending

Next: Sprint 4 - One-click protocol generation + Word export
2026-01-24 17:29:24 +08:00

-- ==================== Postgres-Only Refactor: Manual Migration ====================
-- File: 001_add_postgres_cache_and_checkpoint.sql
-- Purpose: Add a cache table and checkpoint/resume columns
-- Date: 2025-12-13
-- Note: Tables and columns are added manually to avoid the shadow-database issue with Prisma migrate
-- ==================== 1. Create cache table (AppCache) ====================
CREATE TABLE IF NOT EXISTS platform_schema.app_cache (
    id         SERIAL PRIMARY KEY,
    key        VARCHAR(500) UNIQUE NOT NULL,
    value      JSONB NOT NULL,
    expires_at TIMESTAMP NOT NULL,
    created_at TIMESTAMP DEFAULT NOW()
);
-- Indexes to speed up expiry sweeps and key lookups
CREATE INDEX IF NOT EXISTS idx_app_cache_expires
ON platform_schema.app_cache(expires_at);
CREATE INDEX IF NOT EXISTS idx_app_cache_key_expires
ON platform_schema.app_cache(key, expires_at);
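
-- The statements below are a usage sketch only, not part of this migration.
-- They illustrate how a cache service might write, read, and sweep this table;
-- the key name and TTL are hypothetical examples.
--
--   -- Upsert a cache entry with a 1-hour TTL:
--   INSERT INTO platform_schema.app_cache (key, value, expires_at)
--   VALUES ('example:key', '{"cached": true}'::jsonb, NOW() + INTERVAL '1 hour')
--   ON CONFLICT (key)
--   DO UPDATE SET value = EXCLUDED.value, expires_at = EXCLUDED.expires_at;
--
--   -- Read an entry, treating expired rows as misses (served by idx_app_cache_key_expires):
--   SELECT value
--   FROM platform_schema.app_cache
--   WHERE key = 'example:key' AND expires_at > NOW();
--
--   -- Periodic cleanup of expired rows (served by idx_app_cache_expires):
--   DELETE FROM platform_schema.app_cache WHERE expires_at < NOW();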
-- ==================== 2. Add new columns to AslScreeningTask ====================
-- Columns supporting task batching
ALTER TABLE asl_schema.screening_tasks
ADD COLUMN IF NOT EXISTS total_batches INTEGER DEFAULT 1,
ADD COLUMN IF NOT EXISTS processed_batches INTEGER DEFAULT 0,
ADD COLUMN IF NOT EXISTS current_batch_index INTEGER DEFAULT 0;
-- Columns supporting checkpoint/resume
ALTER TABLE asl_schema.screening_tasks
ADD COLUMN IF NOT EXISTS current_index INTEGER DEFAULT 0,
ADD COLUMN IF NOT EXISTS last_checkpoint TIMESTAMP,
ADD COLUMN IF NOT EXISTS checkpoint_data JSONB;
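
-- The statements below are a usage sketch only, not part of this migration.
-- They illustrate one way a worker might checkpoint and resume; the `id` column,
-- task id, and checkpoint payload shape are assumptions, not defined here.
--
--   -- Record progress after finishing a batch:
--   UPDATE asl_schema.screening_tasks
--   SET processed_batches   = processed_batches + 1,
--       current_batch_index = current_batch_index + 1,
--       current_index       = 480,                          -- hypothetical record position
--       last_checkpoint     = NOW(),
--       checkpoint_data     = '{"last_record_id": 480}'::jsonb
--   WHERE id = 42;                                          -- hypothetical task id
--
--   -- On restart, resume from the saved checkpoint:
--   SELECT current_batch_index, current_index, checkpoint_data
--   FROM asl_schema.screening_tasks
--   WHERE id = 42;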
-- ==================== 3. Verify the results ====================
-- Inspect the app_cache table structure
SELECT
    column_name,
    data_type,
    is_nullable,
    column_default
FROM information_schema.columns
WHERE table_schema = 'platform_schema'
  AND table_name = 'app_cache'
ORDER BY ordinal_position;
-- Inspect the new screening_tasks columns
SELECT
    column_name,
    data_type,
    is_nullable,
    column_default
FROM information_schema.columns
WHERE table_schema = 'asl_schema'
  AND table_name = 'screening_tasks'
  AND column_name IN (
      'total_batches', 'processed_batches', 'current_batch_index',
      'current_index', 'last_checkpoint', 'checkpoint_data'
  )
ORDER BY ordinal_position;
-- ==================== Done ====================
-- ✅ Cache table created
-- ✅ Task batching columns added
-- ✅ Checkpoint/resume columns added