refactor: architecture evolution tasks 1.2 + 2 + 3 complete

Task 1.2: migrate blueprints to repositories
- alerts.py: get_alerts and resolve_alert now go through alert_repo
- workorders.py: get_workorders now uses workorder_repo.list_workorders
- removed all direct session.query calls from the blueprints
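The blueprint-to-repository change above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the repository method names (`list_alerts`, `resolve`) and the in-memory backing store are assumptions; the real repository wraps the DB session that the blueprints previously queried directly.

```python
# Hypothetical sketch of the repository pattern the commit describes:
# blueprint handlers call the repository instead of session.query(...).
class AlertRepo:
    """Stand-in repository; the real one wraps a SQLAlchemy session."""

    def __init__(self, alerts):
        self._alerts = alerts  # list of dicts: {"id": ..., "resolved": ...}

    def list_alerts(self, resolved=None):
        # Optionally filter by resolution state.
        if resolved is None:
            return list(self._alerts)
        return [a for a in self._alerts if a["resolved"] is resolved]

    def resolve(self, alert_id):
        # Mark one alert resolved; return it, or None if not found.
        for alert in self._alerts:
            if alert["id"] == alert_id:
                alert["resolved"] = True
                return alert
        return None


def get_alerts(repo):
    # The blueprint handler now only talks to the repository;
    # before the refactor it issued session.query calls itself.
    return repo.list_alerts()
```

The handler no longer knows about the ORM at all, which is the point of the migration: persistence details stay behind the repository boundary.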

Task 2: unify the LLM client
- LLMClient gains async_generate/async_chat async methods (thread-pool wrappers)
- agent_assistant.py now uses the unified LLMClient (no longer depends on LLMManager from agent/llm_client.py)
- all LLM calls now go through src/core/llm_client.py

Task 3: MessagePipeline
- create src/dialogue/message_pipeline.py
- unified message-handling flow: tenant resolution → session management → message processing
- handle_message does it all in one call; each entry point only passes user_id + message
- registered via service_manager.get_pipeline()
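The three-stage flow above can be sketched as a small class. This is a hedged illustration only: the internal method names (`_resolve_tenant`, `_get_session`) and the dict-based stores are assumptions, not the actual contents of src/dialogue/message_pipeline.py.

```python
# Hypothetical sketch of the MessagePipeline flow:
# tenant resolution -> session management -> message processing.
class MessagePipeline:
    def __init__(self, tenants, sessions):
        self._tenants = tenants    # user_id -> tenant_id mapping
        self._sessions = sessions  # (tenant_id, user_id) -> session state

    def _resolve_tenant(self, user_id):
        # Stage 1: map the user to a tenant (fall back to a default).
        return self._tenants.get(user_id, "default")

    def _get_session(self, tenant_id, user_id):
        # Stage 2: fetch or create the conversation session.
        key = (tenant_id, user_id)
        if key not in self._sessions:
            self._sessions[key] = {"history": []}
        return self._sessions[key]

    def handle_message(self, user_id, message):
        # Single entry point: callers pass only user_id + message,
        # the pipeline runs all three stages internally.
        tenant_id = self._resolve_tenant(user_id)
        session = self._get_session(tenant_id, user_id)
        session["history"].append(message)  # Stage 3: process the message.
        return {"tenant": tenant_id, "turns": len(session["history"])}
```

Each entry point (HTTP handler, webhook, CLI) then collapses to one `handle_message(user_id, message)` call, which is what the commit message means by "one call".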
Date:   2026-04-08 08:35:31 +08:00
Parent: 24a5fad630
Commit: db992be02a
7 changed files with 176 additions and 154 deletions


@@ -199,7 +199,35 @@ class LLMClient:
        except Exception:
            return False

    # ── Async interface (for agent_assistant and other async callers) ──────
    async def async_generate(self, prompt: str, temperature: float = 0.7, max_tokens: int = 1000) -> str:
        """Asynchronously generate text (runs the sync call in a thread pool)."""
        import asyncio
        loop = asyncio.get_running_loop()
        result = await loop.run_in_executor(
            None,
            lambda: self.chat_completion(
                [{"role": "user", "content": prompt}],
                temperature=temperature, max_tokens=max_tokens,
            ),
        )
        if "error" in result:
            raise RuntimeError(result["error"])
        return result["choices"][0]["message"]["content"]

    async def async_chat(self, messages: List[Dict[str, str]], temperature: float = 0.7, max_tokens: int = 1000) -> str:
        """Asynchronous chat (runs the sync call in a thread pool)."""
        import asyncio
        loop = asyncio.get_running_loop()
        result = await loop.run_in_executor(
            None,
            lambda: self.chat_completion(messages, temperature=temperature, max_tokens=max_tokens),
        )
        if "error" in result:
            raise RuntimeError(result["error"])
        return result["choices"][0]["message"]["content"]
# ── Backward-compatibility alias ─────────────────────────────────────────
# Old code doing `from src.core.llm_client import QwenClient` keeps working
QwenClient = LLMClient
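The thread-pool wrapping used by async_generate/async_chat in the hunk above can be shown standalone. Here `blocking_chat` is a hypothetical stand-in for the synchronous `chat_completion` call; only the `run_in_executor` pattern itself is taken from the diff.

```python
import asyncio


def blocking_chat(messages):
    # Stand-in for the synchronous chat_completion call; echoes the
    # last message back in upper case, shaped like an OpenAI-style reply.
    return {"choices": [{"message": {"content": messages[-1]["content"].upper()}}]}


async def async_chat(messages):
    # Run the blocking call in the default thread-pool executor so the
    # event loop is not blocked while the LLM request is in flight.
    loop = asyncio.get_running_loop()
    result = await loop.run_in_executor(None, lambda: blocking_chat(messages))
    if "error" in result:
        raise RuntimeError(result["error"])
    return result["choices"][0]["message"]["content"]


print(asyncio.run(async_chat([{"role": "user", "content": "hello"}])))  # HELLO
```

Because the executor call happens off the event loop, several such LLM calls can be awaited concurrently from async code without any change to the underlying synchronous client.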