Files
AIclinicalresearch/backend/prisma/manual-migrations/001_add_postgres_cache_and_checkpoint.sql
HaHafeng dfc0fe0b9a feat(pkb): Integrate pgvector and create Dify replacement plan
Summary:
- Migrate PostgreSQL to pgvector/pgvector:pg15 Docker image
- Successfully install and verify pgvector 0.8.1 extension
- Create comprehensive Dify-to-pgvector migration plan
- Update PKB module documentation with pgvector status
- Update system documentation with pgvector integration

Key changes:
- docker-compose.yml: Switch to pgvector/pgvector:pg15 image
- Add EkbDocument and EkbChunk data model design
- Design R-C-R-G hybrid retrieval architecture
- Add clinical data JSONB fields (pico, studyDesign, regimen, safety, criteria, endpoints)
- Create detailed 10-day implementation roadmap

Documentation updates:
- PKB module status: pgvector RAG infrastructure ready
- System status: pgvector 0.8.1 integrated
- New: Dify replacement development plan (01-Dify替换为pgvector开发计划.md)
- New: Enterprise medical knowledge base solution V2

Tested: PostgreSQL with pgvector verified, frontend and backend functionality confirmed
2026-01-20 00:00:58 +08:00

120 lines
2.3 KiB
SQL
-- ==================== Postgres-Only Refactor: Manual Migration ====================
-- File: 001_add_postgres_cache_and_checkpoint.sql
-- Purpose: add a cache table and checkpoint/resume columns
-- Date: 2025-12-13
-- Note: tables and columns are added manually to avoid Prisma migrate's
--       shadow-database issue
-- ==================== 1. Create cache table (AppCache) ====================
CREATE TABLE IF NOT EXISTS platform_schema.app_cache (
    id         SERIAL PRIMARY KEY,
    key        VARCHAR(500) UNIQUE NOT NULL,
    value      JSONB NOT NULL,
    expires_at TIMESTAMP NOT NULL,
    created_at TIMESTAMP DEFAULT NOW()
);

-- Indexes to optimize expiry sweeps and key lookups
CREATE INDEX IF NOT EXISTS idx_app_cache_expires
    ON platform_schema.app_cache (expires_at);
CREATE INDEX IF NOT EXISTS idx_app_cache_key_expires
    ON platform_schema.app_cache (key, expires_at);
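
-- Illustrative usage (not part of the migration, left commented out): a typical
-- cache write is an upsert keyed on the UNIQUE `key` column, and reads filter
-- out expired rows. Table and column names match the definitions above; the
-- cache key, payload, and TTL are arbitrary examples.
-- INSERT INTO platform_schema.app_cache (key, value, expires_at)
-- VALUES ('example:key', '{"text": "cached payload"}'::jsonb,
--         NOW() + INTERVAL '10 minutes')
-- ON CONFLICT (key) DO UPDATE
--     SET value = EXCLUDED.value, expires_at = EXCLUDED.expires_at;
--
-- SELECT value FROM platform_schema.app_cache
-- WHERE key = 'example:key' AND expires_at > NOW();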
-- ==================== 2. Add columns to AslScreeningTask ====================
-- Task batching support
ALTER TABLE asl_schema.screening_tasks
    ADD COLUMN IF NOT EXISTS total_batches INTEGER DEFAULT 1,
    ADD COLUMN IF NOT EXISTS processed_batches INTEGER DEFAULT 0,
    ADD COLUMN IF NOT EXISTS current_batch_index INTEGER DEFAULT 0;

-- Checkpoint/resume support
ALTER TABLE asl_schema.screening_tasks
    ADD COLUMN IF NOT EXISTS current_index INTEGER DEFAULT 0,
    ADD COLUMN IF NOT EXISTS last_checkpoint TIMESTAMP,
    ADD COLUMN IF NOT EXISTS checkpoint_data JSONB;
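
-- Illustrative usage (not part of the migration, left commented out): a worker
-- can persist a checkpoint after each batch and resume from it after a restart.
-- This assumes screening_tasks has an `id` key column (not defined in this
-- migration); the literal values are arbitrary examples.
-- UPDATE asl_schema.screening_tasks
--    SET processed_batches   = processed_batches + 1,
--        current_batch_index = current_batch_index + 1,
--        current_index       = 1200,
--        last_checkpoint     = NOW(),
--        checkpoint_data     = '{"last_record_id": 1200}'::jsonb
--  WHERE id = 42;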
-- ==================== 3. Verify the results ====================
-- Inspect the app_cache table structure
SELECT
    column_name,
    data_type,
    is_nullable,
    column_default
FROM information_schema.columns
WHERE table_schema = 'platform_schema'
  AND table_name = 'app_cache'
ORDER BY ordinal_position;

-- Inspect the new screening_tasks columns
SELECT
    column_name,
    data_type,
    is_nullable,
    column_default
FROM information_schema.columns
WHERE table_schema = 'asl_schema'
  AND table_name = 'screening_tasks'
  AND column_name IN (
      'total_batches', 'processed_batches', 'current_batch_index',
      'current_index', 'last_checkpoint', 'checkpoint_data'
  )
ORDER BY ordinal_position;
-- ==================== Done ====================
-- ✅ Cache table created
-- ✅ Task batching columns added
-- ✅ Checkpoint/resume columns added
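
-- Rollback sketch (not part of the migration, left commented out): since the
-- script only creates one table and adds columns, the reverse is a DROP TABLE
-- (which also removes its indexes) plus DROP COLUMNs. Run only if the changes
-- above need to be undone.
-- DROP TABLE IF EXISTS platform_schema.app_cache;
-- ALTER TABLE asl_schema.screening_tasks
--     DROP COLUMN IF EXISTS total_batches,
--     DROP COLUMN IF EXISTS processed_batches,
--     DROP COLUMN IF EXISTS current_batch_index,
--     DROP COLUMN IF EXISTS current_index,
--     DROP COLUMN IF EXISTS last_checkpoint,
--     DROP COLUMN IF EXISTS checkpoint_data;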