# CloseAI Integration Guide

*Document version: v1.0*
*Created: 2025-11-09*
*Purpose: access OpenAI GPT-5 and Claude-4.5 through the CloseAI proxy platform*
*Use cases: AI-assisted dual-model literature screening, high-quality text generation*
## 📚 About CloseAI

### What is CloseAI?

CloseAI is an API proxy platform that provides users in mainland China with stable access to the OpenAI and Claude APIs.

**Key advantages:**

- ✅ Direct access from mainland China, no VPN required
- ✅ One API Key for both OpenAI and Claude
- ✅ Compatible with the standard OpenAI SDK interface
- ✅ Supports the latest models (GPT-5, Claude-4.5)

**Website:** https://platform.openai-proxy.org
## 🔧 Configuration

### Environment variables

```bash
# CloseAI unified API Key (placeholder shown here; never commit a real key)
CLOSEAI_API_KEY=sk-your-closeai-api-key
# OpenAI endpoint
CLOSEAI_OPENAI_BASE_URL=https://api.openai-proxy.org/v1
# Claude endpoint
CLOSEAI_CLAUDE_BASE_URL=https://api.openai-proxy.org/anthropic
```
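A minimal loader for these variables might look like the sketch below. The field names (`closeaiApiKey`, `closeaiOpenaiBaseUrl`, `closeaiClaudeBaseUrl`) are assumptions chosen to match the `config` imports used later in this guide; adapt them to your actual `backend/src/config/env.ts`.

```typescript
// Sketch of a config loader for the CloseAI environment variables above.
// Field names are assumptions matching the imports used later in this guide.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

export function loadConfig() {
  return {
    // The API key is mandatory; fail fast if it is absent
    closeaiApiKey: requireEnv('CLOSEAI_API_KEY'),
    // The endpoints have defaults but can be overridden per environment
    closeaiOpenaiBaseUrl:
      process.env.CLOSEAI_OPENAI_BASE_URL ?? 'https://api.openai-proxy.org/v1',
    closeaiClaudeBaseUrl:
      process.env.CLOSEAI_CLAUDE_BASE_URL ?? 'https://api.openai-proxy.org/anthropic',
  };
}
```

Failing fast on a missing key surfaces misconfiguration at startup rather than as a 401 deep inside a request handler.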
### Supported models

| Model | Model ID | Description | Best for |
|---|---|---|---|
| GPT-5-Pro | gpt-5-pro | Latest GPT-5 ⭐ | Precise literature screening, complex reasoning |
| GPT-4-Turbo | gpt-4-turbo-preview | High-performance GPT-4 | Quality-critical tasks |
| GPT-3.5-Turbo | gpt-3.5-turbo | Fast and economical | Simple tasks, cost-sensitive work |
| Claude-4.5-Sonnet | claude-sonnet-4-5-20250929 | Latest Claude ⭐ | Third-party arbitration, structured output |
| Claude-3.5-Sonnet | claude-3-5-sonnet-20241022 | Stable Claude 3.5 | High-quality text generation |
## 📦 Code Integration

### 1. Install dependencies

```bash
npm install openai
```

### 2. Create the CloseAI service class

**File:** `backend/src/common/llm/closeai.service.ts`
```typescript
import OpenAI from 'openai';
import { config } from '../../config/env';

export class CloseAIService {
  private openaiClient: OpenAI;
  private claudeClient: OpenAI;

  constructor() {
    // OpenAI client (routed through CloseAI)
    this.openaiClient = new OpenAI({
      apiKey: config.closeaiApiKey,
      baseURL: config.closeaiOpenaiBaseUrl,
    });
    // Claude client (routed through CloseAI)
    this.claudeClient = new OpenAI({
      apiKey: config.closeaiApiKey,
      baseURL: config.closeaiClaudeBaseUrl,
    });
  }

  /**
   * Call GPT-5-Pro
   */
  async chatWithGPT5(prompt: string, systemPrompt?: string) {
    const messages: any[] = [];
    if (systemPrompt) {
      messages.push({ role: 'system', content: systemPrompt });
    }
    messages.push({ role: 'user', content: prompt });

    const response = await this.openaiClient.chat.completions.create({
      model: 'gpt-5-pro',
      messages,
      temperature: 0.3,
      max_tokens: 2000,
    });

    return {
      content: response.choices[0].message.content,
      usage: response.usage,
      model: 'gpt-5-pro',
    };
  }

  /**
   * Call Claude-4.5-Sonnet
   */
  async chatWithClaude(prompt: string, systemPrompt?: string) {
    const messages: any[] = [];
    if (systemPrompt) {
      messages.push({ role: 'system', content: systemPrompt });
    }
    messages.push({ role: 'user', content: prompt });

    const response = await this.claudeClient.chat.completions.create({
      model: 'claude-sonnet-4-5-20250929',
      messages,
      temperature: 0.3,
      max_tokens: 2000,
    });

    return {
      content: response.choices[0].message.content,
      usage: response.usage,
      model: 'claude-sonnet-4-5-20250929',
    };
  }

  /**
   * Streaming response (GPT-5)
   */
  async *streamGPT5(prompt: string, systemPrompt?: string) {
    const messages: any[] = [];
    if (systemPrompt) {
      messages.push({ role: 'system', content: systemPrompt });
    }
    messages.push({ role: 'user', content: prompt });

    const stream = await this.openaiClient.chat.completions.create({
      model: 'gpt-5-pro',
      messages,
      temperature: 0.3,
      max_tokens: 2000,
      stream: true,
    });

    for await (const chunk of stream) {
      const content = chunk.choices[0]?.delta?.content || '';
      if (content) {
        yield content;
      }
    }
  }
}
```
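On the caller side, the async generator can be consumed with `for await`. The sketch below substitutes a stub generator for `streamGPT5` so it runs without network access; a real caller would forward each token to the client (e.g. over SSE) instead of just concatenating.

```typescript
// Stub generator standing in for CloseAIService.streamGPT5 (no network needed).
async function* fakeStream(): AsyncGenerator<string> {
  yield 'Hello';
  yield ', ';
  yield 'world';
}

// Consume a token stream, accumulating the full response.
async function collectStream(stream: AsyncIterable<string>): Promise<string> {
  let full = '';
  for await (const token of stream) {
    full += token; // a real caller might also push each delta to an SSE client here
  }
  return full;
}
```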
### 3. Unified LLM service (four models)

**File:** `backend/src/common/llm/llm.service.ts`
```typescript
import OpenAI from 'openai';
import { config } from '../../config/env';

export type LLMProvider = 'deepseek' | 'gpt5' | 'claude' | 'qwen';

export class UnifiedLLMService {
  private deepseek: OpenAI;
  private gpt5: OpenAI;
  private claude: OpenAI;
  private qwen: OpenAI;

  constructor() {
    // DeepSeek (direct connection)
    this.deepseek = new OpenAI({
      apiKey: config.deepseekApiKey,
      baseURL: config.deepseekBaseUrl,
    });
    // GPT-5 (via CloseAI)
    this.gpt5 = new OpenAI({
      apiKey: config.closeaiApiKey,
      baseURL: config.closeaiOpenaiBaseUrl,
    });
    // Claude (via CloseAI)
    this.claude = new OpenAI({
      apiKey: config.closeaiApiKey,
      baseURL: config.closeaiClaudeBaseUrl,
    });
    // Qwen (DashScope OpenAI-compatible mode)
    this.qwen = new OpenAI({
      apiKey: config.dashscopeApiKey,
      baseURL: 'https://dashscope.aliyuncs.com/compatible-mode/v1',
    });
  }

  /**
   * Unified chat interface
   */
  async chat(
    provider: LLMProvider,
    prompt: string,
    options?: {
      systemPrompt?: string;
      temperature?: number;
      maxTokens?: number;
    }
  ) {
    const { systemPrompt, temperature = 0.3, maxTokens = 2000 } = options || {};

    const messages: any[] = [];
    if (systemPrompt) {
      messages.push({ role: 'system', content: systemPrompt });
    }
    messages.push({ role: 'user', content: prompt });

    // Select the client and model for the requested provider
    const modelMap = {
      deepseek: { client: this.deepseek, model: 'deepseek-chat' },
      gpt5: { client: this.gpt5, model: 'gpt-5-pro' },
      claude: { client: this.claude, model: 'claude-sonnet-4-5-20250929' },
      qwen: { client: this.qwen, model: 'qwen-max' },
    };
    const { client, model } = modelMap[provider];

    const response = await client.chat.completions.create({
      model,
      messages,
      temperature,
      max_tokens: maxTokens,
    });

    return {
      content: response.choices[0].message.content || '',
      usage: response.usage,
      model,
      provider,
    };
  }
}
```
## 🎯 Use Cases: AI Literature Screening

### Scenario 1: Dual-model comparison screening (recommended) ⭐

**Strategy:** DeepSeek (fast initial screen) + GPT-5 (quality double-check)
```typescript
export class LiteratureScreeningService {
  private llm: UnifiedLLMService;

  constructor() {
    this.llm = new UnifiedLLMService();
  }

  /**
   * Dual-model literature screening
   */
  async screenLiterature(title: string, abstract: string, picoConfig: any) {
    const prompt = `
Based on the PICO criteria below, decide whether this article should be included:

**PICO criteria**
- Population: ${picoConfig.population}
- Intervention: ${picoConfig.intervention}
- Comparison: ${picoConfig.comparison}
- Outcome: ${picoConfig.outcome}

**Article**
Title: ${title}
Abstract: ${abstract}

Respond in JSON format:
{
  "decision": "include/exclude/uncertain",
  "reason": "rationale for the decision",
  "confidence": 0.0-1.0
}
`;

    // Call both models in parallel
    const [deepseekResult, gpt5Result] = await Promise.all([
      this.llm.chat('deepseek', prompt),
      this.llm.chat('gpt5', prompt),
    ]);

    // Parse the results
    const deepseekDecision = JSON.parse(deepseekResult.content);
    const gpt5Decision = JSON.parse(gpt5Result.content);

    // If the two models agree, accept the decision directly
    if (deepseekDecision.decision === gpt5Decision.decision) {
      return {
        finalDecision: deepseekDecision.decision,
        consensus: 'high',
        models: [deepseekDecision, gpt5Decision],
      };
    }

    // If they disagree, return both opinions and flag for manual review
    return {
      finalDecision: 'uncertain',
      consensus: 'low',
      models: [deepseekDecision, gpt5Decision],
      needManualReview: true,
    };
  }
}
```
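The agree/disagree rule at the heart of scenario 1 can be isolated as a pure function, which makes it easy to unit-test without calling any model. The `Decision` shape below mirrors the JSON format requested in the prompt.

```typescript
// Result shape matching the JSON the screening prompt asks each model to return.
interface Decision {
  decision: 'include' | 'exclude' | 'uncertain';
  reason: string;
  confidence: number;
}

// Pure consensus rule for dual-model screening results.
function combineDualDecisions(a: Decision, b: Decision) {
  // Agreement between the two models: accept with high consensus
  if (a.decision === b.decision) {
    return { finalDecision: a.decision, consensus: 'high' as const, needManualReview: false };
  }
  // Disagreement: surface both opinions and defer to a human reviewer
  return { finalDecision: 'uncertain' as const, consensus: 'low' as const, needManualReview: true };
}
```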
### Scenario 2: Three-model arbitration

**Strategy:** when the two models disagree, bring in Claude as a third-party arbitrator
```typescript
async screenWithArbitration(title: string, abstract: string, picoConfig: any) {
  // Round 1: dual-model screening
  const initialScreen = await this.screenLiterature(title, abstract, picoConfig);

  // If the models agree, return immediately
  if (initialScreen.consensus === 'high') {
    return initialScreen;
  }

  // If they disagree, bring in Claude as arbitrator
  console.log('Dual-model results disagree; invoking Claude for arbitration...');
  // Reuse the same PICO screening prompt as screenLiterature
  // (buildScreeningPrompt is a helper you would extract from the template above)
  const prompt = this.buildScreeningPrompt(title, abstract, picoConfig);
  const claudeResult = await this.llm.chat('claude', prompt);
  const claudeDecision = JSON.parse(claudeResult.content);

  // Three-model vote
  const decisions = [
    initialScreen.models[0].decision,
    initialScreen.models[1].decision,
    claudeDecision.decision,
  ];
  const voteCount = {
    include: decisions.filter(d => d === 'include').length,
    exclude: decisions.filter(d => d === 'exclude').length,
    uncertain: decisions.filter(d => d === 'uncertain').length,
  };

  // Majority wins
  const finalDecision = (Object.keys(voteCount) as Array<keyof typeof voteCount>).reduce(
    (a, b) => (voteCount[a] > voteCount[b] ? a : b)
  );

  return {
    finalDecision,
    consensus: voteCount[finalDecision] >= 2 ? 'medium' : 'low',
    models: [...initialScreen.models, claudeDecision],
    arbitration: true,
  };
}
```
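The majority-vote step can likewise be factored out and tested in isolation. This sketch tallies votes with a `Map` and falls back to `'uncertain'` when no decision reaches a majority.

```typescript
type Vote = 'include' | 'exclude' | 'uncertain';

// Majority vote over three screening decisions.
// A decision with at least 2 votes yields 'medium' consensus; a three-way split is 'low'.
function majorityVote(decisions: Vote[]): { finalDecision: Vote; consensus: 'medium' | 'low' } {
  const counts = new Map<Vote, number>();
  for (const d of decisions) {
    counts.set(d, (counts.get(d) ?? 0) + 1);
  }
  let winner: Vote = 'uncertain';
  let best = 0;
  for (const [vote, n] of counts) {
    if (n > best) {
      best = n;
      winner = vote;
    }
  }
  return { finalDecision: winner, consensus: best >= 2 ? 'medium' : 'low' };
}
```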
### Scenario 3: Cost-optimized screening

**Strategy:** only escalate uncertain results to GPT-5 for review
```typescript
async screenWithCostOptimization(title: string, abstract: string, picoConfig: any) {
  // Same PICO screening prompt as screenLiterature (shared helper assumed)
  const prompt = this.buildScreeningPrompt(title, abstract, picoConfig);

  // Round 1: fast, cheap screening with DeepSeek
  const quickScreen = await this.llm.chat('deepseek', prompt);
  const quickDecision = JSON.parse(quickScreen.content);

  // If the result is clear-cut (include or exclude with confidence > 0.8), accept it
  if (quickDecision.confidence > 0.8 && quickDecision.decision !== 'uncertain') {
    return {
      finalDecision: quickDecision.decision,
      consensus: 'high',
      models: [quickDecision],
      costOptimized: true,
    };
  }

  // Otherwise, double-check with GPT-5
  const detailedScreen = await this.llm.chat('gpt5', prompt);
  const detailedDecision = JSON.parse(detailedScreen.content);

  return {
    finalDecision: detailedDecision.decision,
    consensus: 'medium',
    models: [quickDecision, detailedDecision],
    costOptimized: true,
  };
}
```
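The routing rule in scenario 3, accept confident first-pass results and escalate the rest, boils down to a small pure predicate that can be tested on its own:

```typescript
// Decide whether a cheap first-pass result should be escalated to a
// more expensive model: uncertain decisions always escalate, and so do
// results at or below the confidence threshold.
function shouldEscalate(decision: string, confidence: number, threshold = 0.8): boolean {
  return decision === 'uncertain' || confidence <= threshold;
}
```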
## 📊 Performance and Cost Comparison

### Model comparison

| Metric | DeepSeek-V3 | GPT-5-Pro | Claude-4.5 | Qwen-Max |
|---|---|---|---|---|
| Accuracy | 85% | 95% ⭐ | 93% | 82% |
| Speed | Fast ⭐ | Medium | Medium | Fast |
| Cost | ¥0.001/1K ⭐ | ¥0.10/1K | ¥0.021/1K | ¥0.004/1K |
| Chinese comprehension | Excellent ⭐ | Excellent | Good | Excellent |
| Structured output | Good | Excellent | Excellent ⭐ | Good |
### Estimated cost of screening 1,000 articles

**Strategy A: DeepSeek only**
- Cost: up to ~¥30
- Accuracy: 85%
- Best for: tight budgets where some error is acceptable

**Strategy B: DeepSeek + GPT-5 dual-model**
- Cost: up to ~¥200
- Accuracy: 92%
- Best for: quality-critical work with sufficient budget ⭐ recommended

**Strategy C: Three-model arbitration (Claude invoked on conflicts)**
- Cost: up to ~¥220
- Accuracy: 95%
- Best for: the most demanding quality requirements

**Strategy D: Cost-optimized (80% DeepSeek, 20% GPT-5)**
- Cost: up to ~¥80
- Accuracy: 90%
- Best for: balancing quality and cost ⭐ best value
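Using the ¥/1K prices from the comparison table, the blended token rate of strategy D can be estimated as a weighted average: 80% of traffic at DeepSeek's price and the remaining 20% at GPT-5's. A back-of-the-envelope sketch:

```typescript
// Blended cost per 1K tokens: sum of (traffic share × price per 1K) per model.
// Prices are the ¥/1K figures from the comparison table above.
function blendedCostPer1K(shares: Array<{ share: number; pricePer1K: number }>): number {
  return shares.reduce((sum, s) => sum + s.share * s.pricePer1K, 0);
}

// Strategy D: 80% DeepSeek (¥0.001/1K) + 20% GPT-5 (¥0.10/1K)
const strategyD = blendedCostPer1K([
  { share: 0.8, pricePer1K: 0.001 },
  { share: 0.2, pricePer1K: 0.1 },
]);
// strategyD ≈ ¥0.0208 per 1K tokens
```

Multiplying this rate by the expected token volume per article gives the total estimate; the per-article token count depends on your abstracts and prompt length.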
## ⚠️ Caveats

### 1. API Key security

```typescript
// ❌ Wrong: hard-coded API Key
const client = new OpenAI({
  apiKey: 'sk-xxxxxxxxxxxxxxxx',
});

// ✅ Correct: read from an environment variable
const client = new OpenAI({
  apiKey: process.env.CLOSEAI_API_KEY,
});
```
### 2. Error handling

```typescript
async chat(provider: LLMProvider, prompt: string) {
  try {
    const response = await this.llm.chat(provider, prompt);
    return response;
  } catch (error: any) {
    // Errors CloseAI may return
    if (error.status === 429) {
      // Rate limited
      console.error('API rate limit exceeded; please retry later');
    } else if (error.status === 401) {
      // Authentication failed
      console.error('Invalid API Key; please check your configuration');
    } else if (error.status === 500) {
      // Server-side error
      console.error('CloseAI service error; please retry later');
    }
    throw error;
  }
}
```
### 3. Request retry

```typescript
async chatWithRetry(provider: LLMProvider, prompt: string, maxRetries = 3) {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await this.llm.chat(provider, prompt);
    } catch (error) {
      if (i === maxRetries - 1) throw error;
      // Exponential backoff: 1s, 2s, 4s, ...
      const delay = Math.pow(2, i) * 1000;
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
}
```
## 📄 Related Documents

**Changelog:**
- 2025-11-09: Document created; CloseAI integration guide added
- Supports the latest GPT-5-Pro and Claude-4.5-Sonnet models