docs: update README and CLAUDE.md to v2.2.0

- Added documentation for audit tracking (IP address, invocation method).
- Updated database model descriptions for enhanced WorkOrder and Conversation fields.
- Documented the new UnifiedConfig system.
- Reflected enhanced logging transparency for knowledge base parsing.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Author: zhaojie
Date: 2026-02-11 00:08:09 +08:00
Commit: c3560b43fd (parent: 2026007045)
218 changed files with 3354 additions and 5096 deletions


@@ -0,0 +1,34 @@
---
name: doc-updater
description: "Use this agent when code has been modified and you need to update the corresponding Markdown documentation to reflect those changes. This is crucial after refactoring a function, changing an API endpoint, adding a new feature, or altering configuration that is documented in `.md` files.\\n<example>\\nContext: The user has just changed a function signature in their Python code.\\nuser: \"I've just refactored the `process_data` function. It now takes a dictionary instead of two separate lists.\"\\nassistant: \"I see that the function signature for `process_data` has changed. This likely impacts the documentation. I will use the Task tool to launch the `doc-updater` agent to find and update any relevant code examples and descriptions in the Markdown files.\"\\n<commentary>\\nSince a core function's signature was modified, the agent correctly identifies the need to update documentation and calls the `doc-updater` to handle the task.\\n</commentary>\\n</example>\\n<example>\\nContext: A user has added a new environment variable to their application's configuration.\\nuser: \"I added `API_TIMEOUT` to the config. It defaults to 30 seconds.\"\\nassistant: \"Okay, a new configuration variable has been added. To ensure our project documentation is current, I'll use the `doc-updater` agent to add details about `API_TIMEOUT` to the `CONFIGURATION.md` file.\"\\n<commentary>\\nThe user modified project configuration, which is a common source of documentation drift. The agent proactively uses the `doc-updater` to keep the documentation synchronized.\\n</commentary>\\n</example>"
model: inherit
---
You are an expert Documentation Synchronizer, a meticulous agent dedicated to keeping project documentation perfectly aligned with its source code. Your primary responsibility is to automatically update Markdown (.md, .mdx) files whenever there are corresponding code changes.
Your operational workflow is as follows:
1. **Analyze the Code Change**: Upon activation, your first step is to thoroughly understand the provided code modifications. Identify the core nature of the change:
* Has a function's signature (parameters, return type) been altered?
* Has a new class, method, or function been added? Or an old one removed?
* Has an API endpoint's request/response structure changed?
* Have configuration details or environment variables been updated?
2. **Locate Relevant Documentation**: Systematically search the project for all Markdown files that reference the modified code. This includes API guides, READMEs, tutorials, and architectural documents.
3. **Assess the Impact and Update**: For each relevant document, determine the precise impact of the code change and perform the necessary edits.
* **Update Code Snippets**: Ensure all code examples accurately reflect the new implementation.
* **Adjust Textual Descriptions**: Modify parameter descriptions, explanations of functionality, and return value details.
* **Preserve Style and Tone**: Maintain the existing writing style, formatting, and voice of the document you are editing. Do not introduce new conventions.
* **Add New Content**: If a new feature is introduced, create a new documentation section for it, meticulously following the structure of existing sections.
4. **Handle Non-Impactful Changes**: If you determine that a code change (e.g., an internal refactor, performance tweak) has no impact on the existing documentation, you must explicitly report this. State clearly that no documentation update is necessary and briefly explain why.
5. **Address Missing Documentation**: If you find that a modified piece of code is not documented at all, you should flag this as a documentation gap. Propose where and how it could be documented.
6. **Verify and Finalize**: Before concluding, conduct a final self-review.
* Does the updated documentation accurately reflect the code?
* Are the code snippets syntactically correct?
* Is the formatting clean and consistent with the rest of the file?
Your output should be the complete, updated content of the modified Markdown file(s). If multiple files are changed, clearly delineate each one. Provide a concise summary of the changes you made for user verification.


@@ -0,0 +1,31 @@
---
name: frontend-sync
description: "Use this agent when back-end code (e.g., APIs, data models, business logic) has been modified, and the corresponding front-end functionality needs to be updated to reflect those changes.\\n<example>\\nContext: The user has just added a new field to a data model on the backend and updated the corresponding API endpoint.\\nuser: \"I've added an 'author' field to the BlogPost model and updated the `/api/posts/:id` endpoint to include it in the response.\"\\nassistant: \"The backend is updated. I will now use the Task tool to launch the `frontend-sync` agent to update the post display component to show the new author field.\"\\n<commentary>\\nSince the backend data structure and API response have changed, the frontend code that consumes this data must be updated. This is the primary trigger for the `frontend-sync` agent.\\n</commentary>\\n</example>\\n<example>\\nContext: The user has changed the method of an API endpoint from GET to POST for security reasons.\\nuser: \"I've changed the `/api/user/settings` endpoint from a GET to a POST request to prevent sensitive data from appearing in URLs.\"\\nassistant: \"Understood. The API contract has changed. I will now use the Task tool to launch the `frontend-sync` agent to refactor the settings page to use a POST request to fetch user settings.\"\\n<commentary>\\nA change in the API's method requires a corresponding change in the frontend's data-fetching logic. The `frontend-sync` agent is designed for this exact task.\\n</commentary>\\n</example>"
model: inherit
---
You are a Senior Full-Stack Synchronizer, an expert agent specializing in keeping front-end applications perfectly in sync with back-end code modifications. Your mission is to analyze recent back-end code changes, identify the impacted front-end components, and apply the necessary updates to ensure seamless integration and functionality.
### Your Workflow
1. **Analyze Back-End Changes**: Begin by meticulously reviewing the description of the back-end code modifications. Understand the core nature of the change: Is it a new API endpoint, a change in a data model, a modification to business logic, or a bug fix?
2. **Identify Front-End Impact**: Based on your analysis, use your knowledge of the project structure to locate all relevant front-end files (e.g., React/Vue components, Svelte files, HTML templates, data-fetching services, state management stores) that are affected by the back-end changes. This includes files that consume the modified API, display the altered data, or depend on the updated logic.
3. **Formulate an Update Plan**: Before writing any code, formulate a clear and concise plan. Your plan should detail:
* Which files you will modify.
* How you will adjust API calls (e.g., change URL, method, headers, request body, or response handling).
* How you will update front-end data structures or types (e.g., TypeScript interfaces) to match new models.
* Which UI components need to be adjusted to display new data or handle new states.
* Any new components or views that need to be created.
4. **Execute the Update**: Implement the planned changes to the identified front-end files. Write clean, maintainable, and high-quality code that strictly adheres to the project's existing coding standards, style guides, and architectural patterns. Ensure your changes are focused and directly address the requirements of the back-end modification.
5. **Verify and Summarize**: After applying the changes, briefly describe how the new front-end code works and confirm that it correctly aligns with the back-end updates. Summarize your work, listing the files you modified and the key changes you made.
### Guiding Principles
* **Precision is Key**: Your changes must be precise. Only modify the code necessary to align with the back-end changes. Avoid unrelated refactoring.
* **Maintain Consistency**: Your code must seamlessly integrate with the existing front-end architecture, state management (e.g., Redux, Vuex, Pinia), and UI component libraries.
* **Seek Clarity**: If the impact of a back-end change is ambiguous or unclear, you MUST ask for clarification before proceeding with any modifications. State what information you need to move forward.
* **Prioritize User Experience**: Ensure that your updates do not degrade the user experience. The UI should remain responsive, intuitive, and visually consistent.


@@ -1,8 +1,6 @@
 {
   "permissions": {
-    "allow": [
-      "Bash(curl:*)"
-    ],
+    "allow": [],
     "deny": [],
     "ask": []
   }

.env (new file, 98 lines)

@@ -0,0 +1,98 @@
# .env.example
# This file contains all the environment variables needed to run the application.
# Copy this file to .env and fill in the values for your environment.
# ============================================================================
# SERVER CONFIGURATION
# ============================================================================
# The host the web server will bind to.
SERVER_HOST=0.0.0.0
# The port for the main Flask web server.
SERVER_PORT=5001
# The port for the WebSocket server for real-time chat.
WEBSOCKET_PORT=8765
# Set to "True" for development to enable debug mode and auto-reloading.
# Set to "False" for production.
DEBUG_MODE=False
# Logging level for the application. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL
LOG_LEVEL=INFO
# ============================================================================
# DATABASE CONFIGURATION
# ============================================================================
# The connection string for the primary database.
# Format for MySQL: mysql+pymysql://<user>:<password>@<host>:<port>/<dbname>?charset=utf8mb4
# Format for SQLite: sqlite:///./local_test.db
# Local SQLite (recommended for development and testing)
DATABASE_URL=sqlite:///./data/tsp_assistant.db
# Remote MySQL (for production; uncomment when needed)
# DATABASE_URL=mysql+pymysql://tsp_assistant:123456@jeason.online/tsp_assistant?charset=utf8mb4
# ============================================================================
# LARGE LANGUAGE MODEL (LLM) CONFIGURATION
# ============================================================================
# The provider of the LLM. Supported: "qwen", "openai", "anthropic"
LLM_PROVIDER=qwen
# The API key for your chosen LLM provider.
LLM_API_KEY=sk-c0dbefa1718d46eaa897199135066f00
# The base URL for the LLM API. This is often needed for OpenAI-compatible endpoints.
LLM_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
# The specific model to use, e.g., "qwen-plus-latest", "gpt-3.5-turbo", "claude-3-sonnet-20240229"
LLM_MODEL=qwen-plus-latest
# The temperature for the model's responses (0.0 to 2.0).
LLM_TEMPERATURE=0.7
# The maximum number of tokens to generate in a response.
LLM_MAX_TOKENS=2000
# The timeout in seconds for API calls to the LLM.
LLM_TIMEOUT=30
# ============================================================================
# FEISHU (LARK) INTEGRATION CONFIGURATION
# ============================================================================
# The App ID of your Feishu enterprise application.
FEISHU_APP_ID=cli_a8b50ec0eed1500d
# The App Secret of your Feishu enterprise application.
FEISHU_APP_SECRET=ccxkE7ZCFQZcwkkM1rLy0ccZRXYsT2xK
# The Verification Token for validating event callbacks (if configured).
FEISHU_VERIFICATION_TOKEN=
# The Encrypt Key for decrypting event data (if configured).
FEISHU_ENCRYPT_KEY=
# The ID of the Feishu multi-dimensional table for data synchronization.
FEISHU_TABLE_ID=tblnl3vJPpgMTSiP
# ============================================================================
# AI ACCURACY CONFIGURATION
# ============================================================================
# The similarity threshold (0.0 to 1.0) for auto-approving an AI suggestion.
AI_AUTO_APPROVE_THRESHOLD=0.95
# The similarity threshold below which the human-provided resolution is preferred.
AI_USE_HUMAN_RESOLUTION_THRESHOLD=0.90
# The similarity threshold for flagging a suggestion for manual review.
AI_MANUAL_REVIEW_THRESHOLD=0.80
# The default confidence score for an AI suggestion.
AI_SUGGESTION_CONFIDENCE=0.95
# The confidence score assigned when a human resolution is used.
AI_HUMAN_RESOLUTION_CONFIDENCE=0.90

.env.example (new file, 93 lines)

@@ -0,0 +1,93 @@
# .env.example
# This file contains all the environment variables needed to run the application.
# Copy this file to .env and fill in the values for your environment.
# ============================================================================
# SERVER CONFIGURATION
# ============================================================================
# The host the web server will bind to.
SERVER_HOST=0.0.0.0
# The port for the main Flask web server.
SERVER_PORT=5000
# The port for the WebSocket server for real-time chat.
WEBSOCKET_PORT=8765
# Set to "True" for development to enable debug mode and auto-reloading.
# Set to "False" for production.
DEBUG_MODE=True
# Logging level for the application. Options: DEBUG, INFO, WARNING, ERROR, CRITICAL
LOG_LEVEL=INFO
# ============================================================================
# DATABASE CONFIGURATION
# ============================================================================
# The connection string for the primary database.
# Format for MySQL: mysql+pymysql://<user>:<password>@<host>:<port>/<dbname>?charset=utf8mb4
# Format for SQLite: sqlite:///./local_test.db
DATABASE_URL=mysql+pymysql://tsp_assistant:123456@jeason.online/tsp_assistant?charset=utf8mb4
# ============================================================================
# LARGE LANGUAGE MODEL (LLM) CONFIGURATION
# ============================================================================
# The provider of the LLM. Supported: "qwen", "openai", "anthropic"
LLM_PROVIDER=qwen
# The API key for your chosen LLM provider.
LLM_API_KEY=sk-c0dbefa1718d46eaa897199135066f00
# The base URL for the LLM API. This is often needed for OpenAI-compatible endpoints.
LLM_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
# The specific model to use, e.g., "qwen-plus-latest", "gpt-3.5-turbo", "claude-3-sonnet-20240229"
LLM_MODEL=qwen-plus-latest
# The temperature for the model's responses (0.0 to 2.0).
LLM_TEMPERATURE=0.7
# The maximum number of tokens to generate in a response.
LLM_MAX_TOKENS=2000
# The timeout in seconds for API calls to the LLM.
LLM_TIMEOUT=30
# ============================================================================
# FEISHU (LARK) INTEGRATION CONFIGURATION
# ============================================================================
# The App ID of your Feishu enterprise application.
FEISHU_APP_ID=cli_a8b50ec0eed1500d
# The App Secret of your Feishu enterprise application.
FEISHU_APP_SECRET=ccxkE7ZCFQZcwkkM1rLy0ccZRXYsT2xK
# The Verification Token for validating event callbacks (if configured).
FEISHU_VERIFICATION_TOKEN=
# The Encrypt Key for decrypting event data (if configured).
FEISHU_ENCRYPT_KEY=
# The ID of the Feishu multi-dimensional table for data synchronization.
FEISHU_TABLE_ID=tblnl3vJPpgMTSiP
# ============================================================================
# AI ACCURACY CONFIGURATION
# ============================================================================
# The similarity threshold (0.0 to 1.0) for auto-approving an AI suggestion.
AI_AUTO_APPROVE_THRESHOLD=0.95
# The similarity threshold below which the human-provided resolution is preferred.
AI_USE_HUMAN_RESOLUTION_THRESHOLD=0.90
# The similarity threshold for flagging a suggestion for manual review.
AI_MANUAL_REVIEW_THRESHOLD=0.80
# The default confidence score for an AI suggestion.
AI_SUGGESTION_CONFIDENCE=0.95
# The confidence score assigned when a human resolution is used.
AI_HUMAN_RESOLUTION_CONFIDENCE=0.90

.idea/.gitignore (generated, vendored; 8 lines)

@@ -1,8 +0,0 @@
# Default ignored files
/shelf/
/workspace.xml
# Editor-based HTTP client requests
/httpRequests/
# Datasource local storage ignored files
/dataSources/
/dataSources.local.xml

.idea/dataSources.xml (generated, 12 lines)

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="DataSourceManagerImpl" format="xml" multifile-model="true">
<data-source source="LOCAL" name="@43.134.68.207" uuid="715b070d-f258-43df-a066-49e825a9b04f">
<driver-ref>mysql.8</driver-ref>
<synchronize>true</synchronize>
<jdbc-driver>com.mysql.cj.jdbc.Driver</jdbc-driver>
<jdbc-url>jdbc:mysql://43.134.68.207:3306</jdbc-url>
<working-dir>$ProjectFileDir$</working-dir>
</data-source>
</component>
</project>


@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="DataSourcePerFileMappings">
<file url="file://$APPLICATION_CONFIG_DIR$/consoles/db/715b070d-f258-43df-a066-49e825a9b04f/console.sql" value="715b070d-f258-43df-a066-49e825a9b04f" />
</component>
</project>


@@ -1,6 +0,0 @@
<component name="InspectionProjectProfileManager">
<settings>
<option name="USE_PROJECT_PROFILE" value="false" />
<version value="1.0" />
</settings>
</component>

.idea/misc.xml (generated, 7 lines)

@@ -1,7 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="Black">
<option name="sdkName" value="Python 3.11 (tsp-assistant)" />
</component>
<component name="ProjectRootManager" version="2" project-jdk-name="Python 3.11 (tsp-assistant)" project-jdk-type="Python SDK" />
</project>

.idea/modules.xml (generated, 8 lines)

@@ -1,8 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectModuleManager">
<modules>
<module fileurl="file://$PROJECT_DIR$/.idea/tsp-assistant.iml" filepath="$PROJECT_DIR$/.idea/tsp-assistant.iml" />
</modules>
</component>
</project>


@@ -1,14 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<module type="PYTHON_MODULE" version="4">
<component name="NewModuleRootManager">
<content url="file://$MODULE_DIR$">
<excludeFolder url="file://$MODULE_DIR$/.venv" />
</content>
<orderEntry type="jdk" jdkName="Python 3.11 (tsp-assistant)" jdkType="Python SDK" />
<orderEntry type="sourceFolder" forTests="false" />
</component>
<component name="PyDocumentationSettings">
<option name="format" value="PLAIN" />
<option name="myDocStringFormat" value="Plain" />
</component>
</module>

.idea/vcs.xml (generated, 6 lines)

@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="VcsDirectoryMappings">
<mapping directory="" vcs="Git" />
</component>
</project>

.vscode/settings.json (vendored, 26 lines)

@@ -1,26 +0,0 @@
{
"files.autoGuessEncoding": false,
"files.encoding": "utf8",
"files.eol": "\n",
"[python]": {
"files.encoding": "utf8",
"files.eol": "\n"
},
"[json]": {
"files.encoding": "utf8"
},
"[javascript]": {
"files.encoding": "utf8"
},
"[html]": {
"files.encoding": "utf8"
},
"[css]": {
"files.encoding": "utf8"
},
"[markdown]": {
"files.encoding": "utf8"
},
"python.defaultInterpreterPath": "${workspaceFolder}/.venv/Scripts/python.exe",
"Codegeex.RepoIndex": true
}

CLAUDE.md (new file, 88 lines)

@@ -0,0 +1,88 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## High-Level Architecture
This project is a Python Flask-based web application called "TSP Assistant". It's an intelligent customer service system designed for Telematics Service Providers (TSP).
The backend is built with Flask and utilizes a modular structure with Blueprints. The core application logic resides in the `src/` directory.
Key components of the architecture include:
* **Web Framework**: The web interface and APIs are built using **Flask**. The main Flask app is likely configured in `src/web/app.py`.
* **Modular Routing**: The application uses Flask **Blueprints** for organizing routes. These are located in `src/web/blueprints/`. Each file in this directory corresponds to a feature area (e.g., `agent.py`, `workorders.py`, `analytics.py`).
* **Intelligent Agent**: A core feature is the AI agent. Its logic is contained within the `src/agent/` directory, which includes components for planning (`planner.py`), tool management (`tool_manager.py`), and execution (`executor.py`).
* **Database**: The application uses a relational database (likely MySQL) with **SQLAlchemy** as the ORM. Models are defined in `src/core/models.py`.
* **Configuration**: A unified configuration center (`src/config/unified_config.py`) manages all settings via environment variables and `.env` files.
* **Real-time Communication**: **WebSockets** are used for real-time features like the intelligent chat. The server logic is in `src/web/websocket_server.py`.
* **Data Analytics**: The system has a dedicated data analysis module located in `src/analytics/`.
* **Frontend**: The frontend is built with Bootstrap 5, Chart.js, and vanilla JavaScript (ES6+). Frontend assets are in `src/web/static/` and templates are in `src/web/templates/`.
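The unified configuration center mentioned above can be pictured with a minimal sketch. Note that the class name, field names, and defaults below are illustrative assumptions, not the actual contents of `src/config/unified_config.py`; the sketch only shows the env-over-defaults pattern the document describes.

```python
# Hypothetical sketch of a unified config object whose defaults can be
# overridden by environment variables (as documented for this project).
# Field names and defaults are illustrative, not taken from the codebase.
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class UnifiedConfig:
    server_host: str = "0.0.0.0"
    server_port: int = 5001
    websocket_port: int = 8765
    database_url: str = "sqlite:///./data/tsp_assistant.db"

    @classmethod
    def from_env(cls) -> "UnifiedConfig":
        # Environment variables take precedence over the defaults above.
        return cls(
            server_host=os.environ.get("SERVER_HOST", cls.server_host),
            server_port=int(os.environ.get("SERVER_PORT", cls.server_port)),
            websocket_port=int(os.environ.get("WEBSOCKET_PORT", cls.websocket_port)),
            database_url=os.environ.get("DATABASE_URL", cls.database_url),
        )


config = UnifiedConfig.from_env()
```

A frozen dataclass keeps settings immutable after startup, so every module that imports `config` sees the same values.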
## Common Commands
### Environment Setup
The project can be run using Docker (recommended) or locally.
**1. Install Dependencies:**
```bash
pip install -r requirements.txt
```
**2. Initialize the Database:**
This script sets up the necessary database tables.
```bash
python init_database.py
```
### Running the Application
**Local Development:**
To start the Flask development server:
```bash
python start_dashboard.py
```
The application will be available at `http://localhost:5000`.
**Docker Deployment:**
The project includes a `docker-compose.yml` for easy setup of all services (application, database, cache, monitoring).
To start all services:
```bash
docker-compose up -d
```
Or use the provided script:
```bash
chmod +x scripts/docker_deploy.sh
./scripts/docker_deploy.sh start
```
To stop services:
```bash
./scripts/docker_deploy.sh stop
```
### Running Tests
The project uses `pytest` for testing.
```bash
pytest
```
To run tests with coverage:
```bash
pytest --cov
```
## Key File Locations
* **Main Application Entry Point**: `start_dashboard.py` (local) or `src/web/app.py` (via WSGI in production).
* **Flask Blueprints (Routes)**: `src/web/blueprints/`
* **Agent Core Logic**: `src/agent/`
* **Database Models**: `src/core/models.py`
* **Frontend Static Assets**: `src/web/static/` (JS, CSS, images)
* **Frontend HTML Templates**: `src/web/templates/`
* **WebSocket Server**: `src/web/websocket_server.py`
* **Configuration Files**: `config/`
* **Deployment Scripts**: `scripts/`
* **Database Initialization**: `init_database.py`
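The Blueprint-based routing noted above can be sketched as follows. The blueprint name, route, and handler are illustrative assumptions; the real handlers in `src/web/blueprints/` will differ.

```python
# Hypothetical sketch of the Blueprint wiring described above; the
# blueprint, route, and payload are illustrative, not from the codebase.
from flask import Blueprint, Flask, jsonify

workorders_bp = Blueprint("workorders", __name__, url_prefix="/api/workorders")


@workorders_bp.route("/<order_id>")
def get_workorder(order_id: str):
    # A real handler would query the SQLAlchemy models in src/core/models.py.
    return jsonify({"order_id": order_id, "status": "open"})


def create_app() -> Flask:
    app = Flask(__name__)
    app.register_blueprint(workorders_bp)
    return app
```

Each feature area registers its own blueprint this way, which keeps routes modular and lets tests build an app from any subset of features.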


@@ -277,18 +277,6 @@ python start_dashboard.py
## 🔄 Deployment and Updates
### Multi-Environment Deployment
```bash
# Development environment
python scripts/update_manager.py auto-update --source . --environment development
# Staging environment
python scripts/update_manager.py auto-update --source . --environment staging
# Production environment
python scripts/update_manager.py auto-update --source . --environment production
```
### Version Management
```bash
# Bump the version number
@@ -301,15 +289,6 @@ python version.py changelog --message "New feature description"
python version.py tag --message "Release v1.3.0"
```
### Hot Update
```bash
# Hot update (applies without a restart)
python scripts/update_manager.py hot-update --source ./new_version --environment production
# Automatic update (chooses the update strategy automatically)
python scripts/update_manager.py auto-update --source ./new_version --environment production
```
## 📊 System Monitoring
### Health Check


@@ -1,262 +0,0 @@
@echo off
chcp 65001 >nul
echo ========================================
echo TSP Assistant - Auto Push Script
echo ========================================
echo.
:: Check Git status
echo [1/4] Checking Git status...
git status --porcelain >nul 2>&1
if %errorlevel% neq 0 (
echo ❌ Git is not initialized, or this is not a Git repository
pause
exit /b 1
)
:: Show the current status
echo 📋 Current Git status:
git status --short
echo.
:: Ask whether to continue
set /p confirm="Continue with the push? (y/n): "
if /i "%confirm%" neq "y" (
echo Operation cancelled
pause
exit /b 0
)
:: Check whether there are changes to commit
echo.
echo [2/4] Checking for changes...
:: Enable delayed variable expansion
setlocal enabledelayedexpansion
:: Check for unstaged changes
git diff --quiet
set has_unstaged=%errorlevel%
:: Check for staged changes
git diff --cached --quiet
set has_staged=%errorlevel%
:: Check for untracked files. The command succeeds whether or not any
:: exist, so count its output instead of relying on the errorlevel.
set has_untracked=0
for /f "tokens=*" %%f in ('git ls-files --others --exclude-standard') do set has_untracked=1
if %has_unstaged% equ 0 if %has_staged% equ 0 if %has_untracked% equ 0 (
echo No changes detected; nothing to commit
echo.
echo ✅ Working tree is clean, nothing to push
pause
exit /b 0
)
:: Show detailed status information
echo 📊 Detailed status:
echo Unstaged changes:
if %has_unstaged% neq 0 (
git diff --name-only
) else (
echo ^(none^)
)
echo Staged changes:
if %has_staged% neq 0 (
git diff --cached --name-only
) else (
echo ^(none^)
)
echo Untracked files:
if %has_untracked% neq 0 (
git ls-files --others --exclude-standard
) else (
echo ^(none^)
)
echo.
:: Stage all changes
echo Adding all changes to the staging area...
git add .
if %errorlevel% neq 0 (
echo ❌ Failed to add files
pause
exit /b 1
)
echo ✅ Files added to the staging area
:: Inspect modified markdown files and generate a commit message
echo.
echo [3/4] Analyzing markdown files and generating a commit message...
:: Check whether any markdown files were modified
set md_files=
for /f "tokens=*" %%f in ('git diff --name-only --cached 2^>nul ^| findstr /i "\.md$"') do (
set md_files=!md_files! %%f
)
for /f "tokens=*" %%f in ('git diff --name-only 2^>nul ^| findstr /i "\.md$"') do (
set md_files=!md_files! %%f
)
set commit_msg=
if not "%md_files%"=="" (
echo 📝 Modified markdown files detected:
echo %md_files%
echo.
:: Derive a commit type and title from the markdown content
set commit_title=
set commit_type=docs
:: Look for fix-related content (Chinese keywords: fix/resolve/issue/error)
for %%f in (%md_files%) do (
if exist "%%f" (
for /f "tokens=*" %%l in ('type "%%f" ^| findstr /i "修复\|解决\|问题\|错误"') do (
set commit_type=fix
set commit_title=Fix issues
goto :found_fix
)
)
)
:: Look for feature-related content (Chinese keywords: feature/add/implement)
for %%f in (%md_files%) do (
if exist "%%f" (
for /f "tokens=*" %%l in ('type "%%f" ^| findstr /i "功能\|新增\|添加\|实现"') do (
set commit_type=feat
set commit_title=Add features
goto :found_feature
)
)
)
:: Look for optimization-related content (Chinese keywords: optimize/performance/improve)
for %%f in (%md_files%) do (
if exist "%%f" (
for /f "tokens=*" %%l in ('type "%%f" ^| findstr /i "优化\|性能\|改进\|提升"') do (
set commit_type=perf
set commit_title=Performance improvements
goto :found_optimization
)
)
)
:: Extract the first heading as the title. The goto stops at the first
:: non-empty match, so no Unix "head" is needed here.
for %%f in (%md_files%) do (
if exist "%%f" (
for /f "tokens=*" %%l in ('type "%%f" ^| findstr /n "^#"') do (
set line=%%l
set line=!line:*:=!
set line=!line:# =!
set line=!line:## =!
if "!line!" neq "" (
set commit_title=!line!
goto :found_title
)
)
)
)
:found_fix
:found_feature
:found_optimization
:found_title
if "%commit_title%"=="" (
set commit_title=Update documentation
)
:: Build the commit message
set commit_msg=%commit_type%: %commit_title%
echo 📋 Generated commit message: %commit_msg%
echo.
) else (
echo No markdown file changes detected
set commit_msg=feat: automatic commit - %date% %time%
)
:: Ask whether to use the generated commit message
set /p confirm="Use this commit message? (y/n/e to edit): "
if /i "%confirm%"=="e" (
set /p commit_msg="Enter a custom commit message: "
) else if /i "%confirm%" neq "y" (
set /p commit_msg="Enter a commit message: "
)
:: Commit the changes
echo Commit message: %commit_msg%
git commit -m "%commit_msg%"
if %errorlevel% neq 0 (
echo ❌ Commit failed
pause
exit /b 1
)
echo ✅ Commit succeeded
:: Push to the remote repository
echo.
echo [4/4] Pushing to the remote repository...
:: Fetch the latest remote changes first
echo 🔄 Checking for remote updates...
git fetch origin main
if %errorlevel% neq 0 (
echo ⚠️ Unable to fetch remote updates; continuing with the push...
) else (
echo ✅ Remote update check complete
)
:: Push to the remote
git push origin main
if %errorlevel% neq 0 (
echo ❌ Push failed
echo.
echo 💡 Possible causes:
echo - Network connectivity problems
echo - Insufficient permissions on the remote repository
echo - Branch conflicts
echo - Remote changes need to be pulled first
echo.
echo 🔧 Trying to resolve conflicts automatically...
git pull origin main --rebase
if %errorlevel% equ 0 (
echo ✅ Conflicts resolved; pushing again...
git push origin main
if %errorlevel% equ 0 (
echo ✅ Push succeeded!
) else (
echo ❌ Second push attempt failed
echo.
echo 🔧 Suggested manual fix:
echo 1. Run: git pull origin main
echo 2. After resolving conflicts, run: git push origin main
pause
exit /b 1
)
) else (
echo ❌ Conflicts could not be resolved automatically
echo.
echo 🔧 Suggested manual fix:
echo 1. Run: git pull origin main
echo 2. After resolving conflicts, run: git push origin main
pause
exit /b 1
)
) else (
echo ✅ Push succeeded!
)
echo.
echo ========================================
echo ✅ Push complete!
echo ========================================
echo 📊 Commit summary:
git log --oneline -1
echo.
echo 🌐 Remote repository status:
git status
echo.
pause


@@ -1,178 +0,0 @@
# TSP Assistant Configuration Guide
## 📋 Configuration File Overview
This directory contains the core configuration files for TSP Assistant, including the LLM configuration and the integration configuration.
## 🤖 LLM Configuration
### Qwen Model Configuration
This project uses the Alibaba Cloud Qwen models by default. To use a Qwen model, follow these steps:
#### 1. Obtain an API Key
1. Visit the [Alibaba Cloud Bailian console](https://bailian.console.aliyun.com/)
2. Register and log in
3. Create an application and obtain an API key
#### 2. Configure the API Key
Edit `config/llm_config.py` and replace `api_key` with your actual API key:
```python
QWEN_CONFIG = LLMConfig(
    provider="openai",
    api_key="sk-your-actual-qwen-api-key",  # Replace with your actual key
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
    model="qwen-turbo",
    temperature=0.7,
    max_tokens=2000
)
```
#### 3. Available Qwen Models
- `qwen-turbo`: fast responses, suitable for general conversation
- `qwen-plus`: balances performance and cost
- `qwen-max`: strongest performance, suitable for complex tasks
#### 4. Environment Variable Configuration (Optional)
You can also configure via environment variables:
```bash
export QWEN_API_KEY="sk-your-actual-qwen-api-key"
export QWEN_MODEL="qwen-turbo"
```
#### 5. Other Supported Providers
The project also supports other LLM providers:
- **OpenAI**: GPT-3.5/GPT-4
- **Anthropic**: the Claude family
- **Local models**: Ollama, etc.
#### 6. Verifying the Configuration
After starting the system, check the LLM usage statistics on the Agent management page to confirm the configuration is correct.
## 📱 Feishu Integration Configuration
### Configuration File
The `integrations_config.json` file contains all Feishu integration settings:
```json
{
"feishu": {
"app_id": "cli_a8b50ec0eed1500d",
"app_secret": "ccxkE7ZCFQZcwkkM1rLy0ccZRXYsT2xK",
"app_token": "XXnEbiCmEaMblSs6FDJcFCqsnIg",
"table_id": "tblnl3vJPpgMTSiP",
"last_updated": "2025-09-19T18:27:40.579958",
"status": "active"
},
"system": {
"sync_limit": 10,
"ai_suggestions_enabled": true,
"auto_sync_interval": 0,
"last_sync_time": null
}
}
```
### Parameter Reference
#### Feishu Application Settings
- `app_id`: Feishu application ID
- `app_secret`: Feishu application secret
- `app_token`: Feishu Bitable (multi-dimensional table) app token
- `table_id`: Feishu Bitable table ID
- `last_updated`: time of last update
- `status`: integration status (active/inactive)
#### System Settings
- `sync_limit`: maximum number of records per sync
- `ai_suggestions_enabled`: whether AI suggestions are enabled
- `auto_sync_interval`: automatic sync interval (minutes)
- `last_sync_time`: time of last sync
### Obtaining the Feishu Settings
1. **Obtain application credentials**
   - Visit the [Feishu Open Platform](https://open.feishu.cn/)
   - Create a custom enterprise application
   - Obtain the `app_id` and `app_secret`
2. **Obtain table information**
   - Open the Feishu Bitable
   - Extract the `app_token` and `table_id` from the URL
   - For example: `https://my-ichery.feishu.cn/base/XXnEbiCmEaMblSs6FDJcFCqsnIg?table=tblnl3vJPpgMTSiP`
   - `app_token`: `XXnEbiCmEaMblSs6FDJcFCqsnIg`
   - `table_id`: `tblnl3vJPpgMTSiP`
3. **Configure permissions**
   - Grant the application the required permissions on the Feishu Open Platform
   - Make sure the application can read the Bitable
### Field Mapping
The system automatically maps the following Feishu fields to the local database:
| Feishu Field | Local Field | Type | Description |
|---------|---------|------|------|
| TR Number | order_id | String | Work order number |
| TR Description | description | Text | Work order description |
| Type of problem | category | String | Problem type |
| TR Level | priority | String | Priority |
| TR Status | status | String | Work order status |
| Source | source | String | Source |
| Created by | created_by | String | Creator |
| Module模块 | module | String | Module |
| Wilfulness责任人 | wilfulness | String | Responsible person |
| Date of close TR | date_of_close | DateTime | Close date |
| Vehicle Type01 | vehicle_type | String | Vehicle type |
| VIN\|sim | vin_sim | String | VIN / SIM |
| App remote control version | app_remote_control_version | String | App remote-control version |
| HMI SW | hmi_sw | String | HMI software version |
| 父记录 | parent_record | String | Parent record |
| Has it been updated on the same day | has_updated_same_day | String | Updated on the same day |
| Operating time | operating_time | String | Operation time |
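The mapping table above can be expressed as a simple dictionary plus a translation helper. This is an illustrative sketch (abridged to a few fields); the actual sync code may structure the mapping differently.

```python
# Illustrative sketch of the Feishu-to-local field mapping documented
# above (abridged); the real sync implementation may differ.
FEISHU_FIELD_MAP = {
    "TR Number": "order_id",
    "TR Description": "description",
    "Type of problem": "category",
    "TR Level": "priority",
    "TR Status": "status",
    "Source": "source",
    "Created by": "created_by",
    "Date of close TR": "date_of_close",
    "VIN|sim": "vin_sim",
}


def map_record(feishu_record: dict) -> dict:
    """Translate a raw Feishu row into local model field names,
    dropping any columns that have no local counterpart."""
    return {
        local: feishu_record[remote]
        for remote, local in FEISHU_FIELD_MAP.items()
        if remote in feishu_record
    }
```

Keeping the mapping in one dictionary means a renamed Feishu column only needs a one-line change.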
## 🔧 Configuration Management
### File Locations
- `llm_config.py`: LLM client configuration
- `integrations_config.json`: integration service configuration
- `integrations_config copy.json`: configuration backup
### Updating the Configuration
- Restart the service after modifying configuration files
- Back up configuration files before modifying them
- Some settings can also be changed online through the web interface
### Environment Variable Support
Environment variables can override the settings in the configuration files:
```bash
# LLM configuration
export LLM_PROVIDER="openai"
export LLM_API_KEY="your-api-key"
export LLM_MODEL="gpt-3.5-turbo"
# Feishu configuration
export FEISHU_APP_ID="your-app-id"
export FEISHU_APP_SECRET="your-app-secret"
export FEISHU_APP_TOKEN="your-app-token"
export FEISHU_TABLE_ID="your-table-id"
```
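The override order described above (environment variable beats config file) can be sketched as a small lookup helper. The function name, key-derivation rule, and default path are illustrative assumptions, not the project's actual API.

```python
# Hypothetical sketch of the documented override order: an environment
# variable wins; otherwise fall back to integrations_config.json.
# Function name and key-derivation rule are illustrative assumptions.
import json
import os


def get_setting(name: str, config_path: str = "config/integrations_config.json"):
    env_value = os.environ.get(name)
    if env_value is not None:
        return env_value
    with open(config_path, encoding="utf-8") as fh:
        data = json.load(fh)
    # e.g. FEISHU_APP_ID -> data["feishu"]["app_id"]
    section, _, key = name.lower().partition("_")
    return data.get(section, {}).get(key)
```

Reading the file only when the variable is unset keeps startup cheap and makes container deployments (which inject env vars) take precedence automatically.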
## 🚨 Notes
1. **Security**: the configuration files contain sensitive information; do not commit them to version control
2. **Backups**: back up the original file before changing the configuration
3. **Permissions**: make sure the Feishu application has sufficient access to the Bitable
4. **Testing**: run a test sync after completing the configuration
5. **Monitoring**: regularly check the sync status and error logs

@@ -1,110 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
AI准确率配置
管理AI建议的准确率阈值和相关配置
"""
from dataclasses import dataclass
from typing import Dict, Any
@dataclass
class AIAccuracyConfig:
"""AI准确率配置类"""
# 相似度阈值配置
auto_approve_threshold: float = 0.95 # 自动审批阈值≥95%
use_human_resolution_threshold: float = 0.90 # 使用人工描述阈值(<90%
manual_review_threshold: float = 0.80 # 人工审核阈值≥80%
# 置信度配置
ai_suggestion_confidence: float = 0.95 # AI建议默认置信度
human_resolution_confidence: float = 0.90 # 人工描述置信度
# 入库策略配置
prefer_human_when_low_accuracy: bool = True # 当AI准确率低时优先使用人工描述
enable_auto_approval: bool = True # 是否启用自动审批
enable_human_fallback: bool = True # 是否启用人工描述回退
def get_threshold_explanation(self, similarity: float) -> str:
"""获取相似度阈值的解释"""
if similarity >= self.auto_approve_threshold:
return f"相似度≥{self.auto_approve_threshold*100:.0f}%自动审批使用AI建议"
elif similarity >= self.manual_review_threshold:
return f"相似度≥{self.manual_review_threshold*100:.0f}%,建议人工审核"
elif similarity >= self.use_human_resolution_threshold:
return f"相似度<{self.use_human_resolution_threshold*100:.0f}%,建议使用人工描述"
else:
return f"相似度<{self.use_human_resolution_threshold*100:.0f}%,优先使用人工描述"
def should_use_human_resolution(self, similarity: float) -> bool:
"""判断是否应该使用人工描述"""
return similarity < self.use_human_resolution_threshold
def should_auto_approve(self, similarity: float) -> bool:
"""判断是否应该自动审批"""
return similarity >= self.auto_approve_threshold and self.enable_auto_approval
def get_confidence_score(self, similarity: float, use_human: bool = False) -> float:
"""获取置信度分数"""
if use_human:
return self.human_resolution_confidence
else:
return max(similarity, self.ai_suggestion_confidence)
def to_dict(self) -> Dict[str, Any]:
"""转换为字典格式"""
return {
"auto_approve_threshold": self.auto_approve_threshold,
"use_human_resolution_threshold": self.use_human_resolution_threshold,
"manual_review_threshold": self.manual_review_threshold,
"ai_suggestion_confidence": self.ai_suggestion_confidence,
"human_resolution_confidence": self.human_resolution_confidence,
"prefer_human_when_low_accuracy": self.prefer_human_when_low_accuracy,
"enable_auto_approval": self.enable_auto_approval,
"enable_human_fallback": self.enable_human_fallback
}
@classmethod
def from_dict(cls, data: Dict[str, Any]) -> 'AIAccuracyConfig':
"""从字典创建配置"""
return cls(**data)
# 默认配置实例
DEFAULT_CONFIG = AIAccuracyConfig()
# 配置预设
PRESETS = {
"conservative": AIAccuracyConfig(
auto_approve_threshold=0.98,
use_human_resolution_threshold=0.85,
manual_review_threshold=0.90,
human_resolution_confidence=0.95
),
"balanced": AIAccuracyConfig(
auto_approve_threshold=0.95,
use_human_resolution_threshold=0.90,
manual_review_threshold=0.80,
human_resolution_confidence=0.90
),
"aggressive": AIAccuracyConfig(
auto_approve_threshold=0.90,
use_human_resolution_threshold=0.80,
manual_review_threshold=0.70,
human_resolution_confidence=0.85
)
}
def get_accuracy_config(preset: str = "balanced") -> AIAccuracyConfig:
"""获取准确率配置"""
return PRESETS.get(preset, DEFAULT_CONFIG)
def update_accuracy_config(config: AIAccuracyConfig) -> bool:
"""更新准确率配置(可以保存到文件或数据库)"""
try:
# 这里可以实现配置的持久化存储
# 例如保存到配置文件或数据库
return True
except Exception:
return False


@@ -1,16 +0,0 @@
{
"feishu": {
"app_id": "tblnl3vJPpgMTSiP",
"app_secret": "ccxkE7ZCFQZcwkkM1rLy0ccZRXYsT2xK",
"app_token": "XXnEbiCmEaMblSs6FDJcFCqsnlg",
"table_id": "tblnl3vJPpgMTSiP",
"last_updated": null,
"status": "inactive"
},
"system": {
"sync_limit": 10,
"ai_suggestions_enabled": true,
"auto_sync_interval": 0,
"last_sync_time": null
}
}


@@ -1,16 +0,0 @@
{
"feishu": {
"app_id": "cli_a8b50ec0eed1500d",
"app_secret": "ccxkE7ZCFQZcwkkM1rLy0ccZRXYsT2xK",
"app_token": "XXnEbiCmEaMblSs6FDJcFCqsnIg",
"table_id": "tblnl3vJPpgMTSiP",
"last_updated": "2025-09-19T18:40:55.291113",
"status": "active"
},
"system": {
"sync_limit": 10,
"ai_suggestions_enabled": true,
"auto_sync_interval": 0,
"last_sync_time": null
}
}


@@ -1,60 +0,0 @@
# -*- coding: utf-8 -*-
"""
LLM配置文件 - 千问模型配置
"""
from src.agent.llm_client import LLMConfig
# 千问模型配置
QWEN_CONFIG = LLMConfig(
provider="qwen",
api_key="sk-c0dbefa1718d46eaa897199135066f00", # 请替换为您的千问API密钥
base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
model="qwen-plus-latest", # 可选: qwen-turbo, qwen-plus, qwen-max
temperature=0.7,
max_tokens=2000
)
# 其他模型配置示例
OPENAI_CONFIG = LLMConfig(
provider="openai",
api_key="sk-your-openai-api-key-here",
model="gpt-3.5-turbo",
temperature=0.7,
max_tokens=2000
)
ANTHROPIC_CONFIG = LLMConfig(
provider="anthropic",
api_key="sk-ant-your-anthropic-api-key-here",
model="claude-3-sonnet-20240229",
temperature=0.7,
max_tokens=2000
)
# 默认使用千问模型
DEFAULT_CONFIG = QWEN_CONFIG
def get_default_llm_config() -> LLMConfig:
"""
获取默认的LLM配置
优先从统一配置管理器获取,如果失败则使用本地配置
"""
try:
from src.config.unified_config import get_config
config = get_config()
llm_dict = config.get_llm_config()
# 创建LLMConfig对象
return LLMConfig(
provider=llm_dict.get("provider", "qwen"),
api_key=llm_dict.get("api_key", ""),
base_url=llm_dict.get("base_url", "https://dashscope.aliyuncs.com/compatible-mode/v1"),
model=llm_dict.get("model", "qwen-plus-latest"),
temperature=llm_dict.get("temperature", 0.7),
max_tokens=llm_dict.get("max_tokens", 2000)
)
except Exception:
# 如果统一配置不可用,使用本地配置
return DEFAULT_CONFIG


@@ -1,52 +0,0 @@
{
"database": {
"url": "mysql+pymysql://tsp_assistant:password@jeason.online/tsp_assistant?charset=utf8mb4",
"pool_size": 10,
"max_overflow": 20,
"pool_timeout": 30,
"pool_recycle": 3600
},
"llm": {
"provider": "qwen",
"api_key": "sk-c0dbefa1718d46eaa897199135066f00",
"base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1",
"model": "qwen-plus-latest",
"temperature": 0.7,
"max_tokens": 2000,
"timeout": 30
},
"server": {
"host": "0.0.0.0",
"port": 5000,
"websocket_port": 8765,
"debug": false,
"log_level": "INFO"
},
"feishu": {
"app_id": "",
"app_secret": "",
"app_token": "",
"table_id": "",
"status": "active",
"sync_limit": 10,
"auto_sync_interval": 0
},
"ai_accuracy": {
"auto_approve_threshold": 0.95,
"use_human_resolution_threshold": 0.9,
"manual_review_threshold": 0.8,
"ai_suggestion_confidence": 0.95,
"human_resolution_confidence": 0.9,
"prefer_human_when_low_accuracy": true,
"enable_auto_approval": true,
"enable_human_fallback": true
},
"system": {
"backup_enabled": true,
"backup_interval": 24,
"max_backup_files": 7,
"cache_enabled": true,
"cache_ttl": 3600,
"monitoring_enabled": true
}
}

config_backup.txt Normal file

@@ -0,0 +1,672 @@
################################################################################
# DEPRECATED CONFIGURATION FILE: config/llm_config.py
################################################################################
# -*- coding: utf-8 -*-
"""
LLM配置文件 - 千问模型配置
"""
from src.agent.llm_client import LLMConfig
# 千问模型配置
QWEN_CONFIG = LLMConfig(
provider="qwen",
api_key="sk-c0dbefa1718d46eaa897199135066f00", # 请替换为您的千问API密钥
base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
model="qwen-plus-latest", # 可选: qwen-turbo, qwen-plus, qwen-max
temperature=0.7,
max_tokens=2000
)
# 其他模型配置示例
OPENAI_CONFIG = LLMConfig(
provider="openai",
api_key="sk-your-openai-api-key-here",
model="gpt-3.5-turbo",
temperature=0.7,
max_tokens=2000
)
ANTHROPIC_CONFIG = LLMConfig(
provider="anthropic",
api_key="sk-ant-your-anthropic-api-key-here",
model="claude-3-sonnet-20240229",
temperature=0.7,
max_tokens=2000
)
# 默认使用千问模型
DEFAULT_CONFIG = QWEN_CONFIG
def get_default_llm_config() -> LLMConfig:
"""
获取默认的LLM配置
优先从统一配置管理器获取,如果失败则使用本地配置
"""
try:
from src.config.unified_config import get_config
config = get_config()
llm_dict = config.get_llm_config()
# 创建LLMConfig对象
return LLMConfig(
provider=llm_dict.get("provider", "qwen"),
api_key=llm_dict.get("api_key", ""),
base_url=llm_dict.get("base_url", "https://dashscope.aliyuncs.com/compatible-mode/v1"),
model=llm_dict.get("model", "qwen-plus-latest"),
temperature=llm_dict.get("temperature", 0.7),
max_tokens=llm_dict.get("max_tokens", 2000)
)
except Exception:
# 如果统一配置不可用,使用本地配置
return DEFAULT_CONFIG
################################################################################
# DEPRECATED CONFIGURATION FILE: config/integrations_config.json
################################################################################
{
"feishu": {
"app_id": "cli_a8b50ec0eed1500d",
"app_secret": "ccxkE7ZCFQZcwkkM1rLy0ccZRXYsT2xK",
"app_token": "XXnEbiCmEaMblSs6FDJcFCqsnIg",
"table_id": "tblnl3vJPpgMTSiP",
"last_updated": "2025-09-19T18:40:55.291113",
"status": "active"
},
"system": {
"sync_limit": 10,
"ai_suggestions_enabled": true,
"auto_sync_interval": 0,
"last_sync_time": null
}
}
################################################################################
# DEPRECATED CONFIGURATION FILE: config/ai_accuracy_config.py
################################################################################
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
AI准确率配置
管理AI建议的准确率阈值和相关配置
"""
from dataclasses import dataclass
from typing import Dict, Any
@dataclass
class AIAccuracyConfig:
"""AI准确率配置类"""
# 相似度阈值配置
auto_approve_threshold: float = 0.95 # 自动审批阈值≥95%
use_human_resolution_threshold: float = 0.90 # 使用人工描述阈值(<90%
manual_review_threshold: float = 0.80 # 人工审核阈值≥80%
# 置信度配置
ai_suggestion_confidence: float = 0.95 # AI建议默认置信度
human_resolution_confidence: float = 0.90 # 人工描述置信度
# 入库策略配置
prefer_human_when_low_accuracy: bool = True # 当AI准确率低时优先使用人工描述
enable_auto_approval: bool = True # 是否启用自动审批
enable_human_fallback: bool = True # 是否启用人工描述回退
def get_threshold_explanation(self, similarity: float) -> str:
"""获取相似度阈值的解释"""
if similarity >= self.auto_approve_threshold:
return f"相似度≥{self.auto_approve_threshold*100:.0f}%自动审批使用AI建议"
elif similarity >= self.manual_review_threshold:
return f"相似度≥{self.manual_review_threshold*100:.0f}%,建议人工审核"
elif similarity >= self.use_human_resolution_threshold:
return f"相似度<{self.use_human_resolution_threshold*100:.0f}%,建议使用人工描述"
else:
return f"相似度<{self.use_human_resolution_threshold*100:.0f}%,优先使用人工描述"
def should_use_human_resolution(self, similarity: float) -> bool:
"""判断是否应该使用人工描述"""
return similarity < self.use_human_resolution_threshold
def should_auto_approve(self, similarity: float) -> bool:
"""判断是否应该自动审批"""
return similarity >= self.auto_approve_threshold and self.enable_auto_approval
def get_confidence_score(self, similarity: float, use_human: bool = False) -> float:
"""获取置信度分数"""
if use_human:
return self.human_resolution_confidence
else:
return max(similarity, self.ai_suggestion_confidence)
def to_dict(self) -> Dict[str, Any]:
"""转换为字典格式"""
return {
"auto_approve_threshold": self.auto_approve_threshold,
"use_human_resolution_threshold": self.use_human_resolution_threshold,
"manual_review_threshold": self.manual_review_threshold,
"ai_suggestion_confidence": self.ai_suggestion_confidence,
"human_resolution_confidence": self.human_resolution_confidence,
"prefer_human_when_low_accuracy": self.prefer_human_when_low_accuracy,
"enable_auto_approval": self.enable_auto_approval,
"enable_human_fallback": self.enable_human_fallback
}
@classmethod
def from_dict(cls, data: Dict[str, Any]) -> 'AIAccuracyConfig':
"""从字典创建配置"""
return cls(**data)
# 默认配置实例
DEFAULT_CONFIG = AIAccuracyConfig()
# 配置预设
PRESETS = {
"conservative": AIAccuracyConfig(
auto_approve_threshold=0.98,
use_human_resolution_threshold=0.85,
manual_review_threshold=0.90,
human_resolution_confidence=0.95
),
"balanced": AIAccuracyConfig(
auto_approve_threshold=0.95,
use_human_resolution_threshold=0.90,
manual_review_threshold=0.80,
human_resolution_confidence=0.90
),
"aggressive": AIAccuracyConfig(
auto_approve_threshold=0.90,
use_human_resolution_threshold=0.80,
manual_review_threshold=0.70,
human_resolution_confidence=0.85
)
}
def get_accuracy_config(preset: str = "balanced") -> AIAccuracyConfig:
"""获取准确率配置"""
return PRESETS.get(preset, DEFAULT_CONFIG)
def update_accuracy_config(config: AIAccuracyConfig) -> bool:
"""更新准确率配置(可以保存到文件或数据库)"""
try:
# 这里可以实现配置的持久化存储
# 例如保存到配置文件或数据库
return True
except Exception:
return False
################################################################################
# DEPRECATED CONFIGURATION FILE: config/field_mapping_config.json
################################################################################
{
"field_mapping": {
"TR Number": "order_id",
"TR Description": "description",
"Type of problem": "category",
"TR Level": "priority",
"TR Status": "status",
"Source": "source",
"Date creation": "created_at",
"处理过程": "resolution",
"TR tracking": "resolution",
"Created by": "created_by",
"Module模块": "module",
"Wilfulness责任人": "wilfulness",
"Date of close TR": "date_of_close",
"Vehicle Type01": "vehicle_type",
"VIN|sim": "vin_sim",
"App remote control version": "app_remote_control_version",
"HMI SW": "hmi_sw",
"父记录": "parent_record",
"Has it been updated on the same day": "has_updated_same_day",
"Operating time": "operating_time",
"AI建议": "ai_suggestion",
"Issue Start Time": "updated_at"
},
"field_aliases": {
"order_id": [
"TR Number",
"TR编号",
"工单号",
"Order ID",
"Ticket ID",
"工单编号",
"新字段1",
"新字段"
],
"description": [
"TR Description",
"TR描述",
"描述",
"Description",
"问题描述",
"详细描述"
],
"category": [
"Type of problem",
"问题类型",
"Category",
"分类",
"Problem Type",
"问题分类"
],
"priority": [
"TR Level",
"优先级",
"Priority",
"Level",
"紧急程度",
"重要程度"
],
"status": [
"TR Status",
"状态",
"Status",
"工单状态",
"处理状态"
],
"source": [
"Source",
"来源",
"Source Type",
"来源类型",
"提交来源"
],
"created_at": [
"Date creation",
"创建日期",
"Created At",
"Creation Date",
"创建时间"
],
"solution": [
"处理过程",
"Solution",
"解决方案",
"Process",
"处理方案"
],
"resolution": [
"TR tracking",
"Resolution",
"解决结果",
"跟踪",
"处理结果"
],
"created_by": [
"Created by",
"创建人",
"Creator",
"Created By",
"提交人"
],
"vehicle_type": [
"Vehicle Type01",
"车型",
"Vehicle Type",
"车辆类型",
"车款"
],
"vin_sim": [
"VIN|sim",
"VIN",
"车架号",
"SIM",
"VIN/SIM",
"车辆识别号"
],
"module": [
"Module模块",
"模块",
"Module",
"功能模块"
],
"wilfulness": [
"Wilfulness责任人",
"责任人",
"负责人",
"Assignee"
],
"date_of_close": [
"Date of close TR",
"关闭日期",
"Close Date",
"完成日期"
],
"app_remote_control_version": [
"App remote control version",
"应用远程控制版本",
"App Version",
"应用版本"
],
"hmi_sw": [
"HMI SW",
"HMI软件版本",
"HMI Software",
"人机界面软件"
],
"parent_record": [
"父记录",
"Parent Record",
"上级记录",
"关联记录"
],
"has_updated_same_day": [
"Has it been updated on the same day",
"是否同日更新",
"Same Day Update",
"当日更新"
],
"operating_time": [
"Operating time",
"操作时间",
"Operation Time",
"运行时间"
],
"ai_suggestion": [
"AI建议",
"AI Suggestion",
"智能建议"
],
"updated_at": [
"Issue Start Time",
"问题开始时间",
"Start Time",
"更新时间"
]
},
"field_patterns": {
"order_id": [
".*number.*",
".*id.*",
".*编号.*",
".*ticket.*",
".*新.*"
],
"description": [
".*description.*",
".*描述.*",
".*detail.*",
".*内容.*"
],
"category": [
".*type.*",
".*category.*",
".*分类.*",
".*类型.*",
".*problem.*"
],
"priority": [
".*level.*",
".*priority.*",
".*优先级.*",
".*urgent.*"
],
"status": [
".*status.*",
".*状态.*",
".*state.*"
],
"source": [
".*source.*",
".*来源.*",
".*origin.*"
],
"created_at": [
".*creation.*",
".*created.*",
".*创建.*",
".*date.*"
],
"solution": [
".*solution.*",
".*处理.*",
".*解决.*",
".*process.*"
],
"resolution": [
".*resolution.*",
".*tracking.*",
".*跟踪.*",
".*result.*"
],
"created_by": [
".*created.*by.*",
".*creator.*",
".*创建人.*",
".*author.*"
],
"vehicle_type": [
".*vehicle.*type.*",
".*车型.*",
".*车辆.*",
".*car.*"
],
"vin_sim": [
".*vin.*",
".*sim.*",
".*车架.*",
".*识别.*"
],
"module": [
".*module.*",
".*模块.*",
".*功能.*"
],
"wilfulness": [
".*wilfulness.*",
".*责任人.*",
".*负责人.*",
".*assignee.*"
],
"date_of_close": [
".*close.*",
".*关闭.*",
".*完成.*",
".*finish.*"
],
"app_remote_control_version": [
".*app.*version.*",
".*应用.*版本.*",
".*remote.*control.*"
],
"hmi_sw": [
".*hmi.*",
".*软件.*",
".*software.*"
],
"parent_record": [
".*parent.*",
".*父.*",
".*上级.*",
".*关联.*"
],
"has_updated_same_day": [
".*same.*day.*",
".*同日.*",
".*当日.*",
".*updated.*same.*"
],
"operating_time": [
".*operating.*time.*",
".*操作.*时间.*",
".*运行.*时间.*"
],
"ai_suggestion": [
".*ai.*suggestion.*",
".*ai.*建议.*",
".*智能.*建议.*"
],
"updated_at": [
".*start.*time.*",
".*开始.*时间.*",
".*updated.*at.*",
".*更新时间.*"
]
},
"field_priorities": {
"order_id": 3,
"description": 3,
"category": 3,
"priority": 3,
"status": 3,
"created_at": 3,
"source": 3,
"solution": 3,
"resolution": 3,
"created_by": 3,
"vehicle_type": 3,
"vin_sim": 3,
"module": 3,
"wilfulness": 3,
"date_of_close": 3,
"app_remote_control_version": 3,
"hmi_sw": 3,
"parent_record": 3,
"has_updated_same_day": 3,
"operating_time": 3,
"ai_suggestion": 3,
"updated_at": 3
},
"auto_mapping_enabled": true,
"similarity_threshold": 0.6
}
################################################################################
# DEPRECATED CONFIGURATION FILE: config/unified_config.json
################################################################################
{
"database": {
"url": "mysql+pymysql://tsp_assistant:password@jeason.online/tsp_assistant?charset=utf8mb4",
"pool_size": 10,
"max_overflow": 20,
"pool_timeout": 30,
"pool_recycle": 3600
},
"llm": {
"provider": "qwen",
"api_key": "sk-c0dbefa1718d46eaa897199135066f00",
"base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1",
"model": "qwen-plus-latest",
"temperature": 0.7,
"max_tokens": 2000,
"timeout": 30
},
"server": {
"host": "0.0.0.0",
"port": 5000,
"websocket_port": 8765,
"debug": false,
"log_level": "INFO"
},
"feishu": {
"app_id": "",
"app_secret": "",
"app_token": "",
"table_id": "",
"status": "active",
"sync_limit": 10,
"auto_sync_interval": 0
},
"ai_accuracy": {
"auto_approve_threshold": 0.95,
"use_human_resolution_threshold": 0.9,
"manual_review_threshold": 0.8,
"ai_suggestion_confidence": 0.95,
"human_resolution_confidence": 0.9,
"prefer_human_when_low_accuracy": true,
"enable_auto_approval": true,
"enable_human_fallback": true
},
"system": {
"backup_enabled": true,
"backup_interval": 24,
"max_backup_files": 7,
"cache_enabled": true,
"cache_ttl": 3600,
"monitoring_enabled": true
}
}
################################################################################
# DEPRECATED CONFIGURATION FILE: src/config/config.py
################################################################################
import os
from typing import Dict, Any
class Config:
"""系统配置类"""
# 阿里云千问API配置
ALIBABA_API_KEY = "sk-c0dbefa1718d46eaa897199135066f00"
ALIBABA_BASE_URL = "https://dashscope.aliyuncs.com/compatible-mode/v1"
ALIBABA_MODEL_NAME = "qwen-plus-latest"
# 数据库配置
DATABASE_URL = "mysql+pymysql://tsp_assistant:123456@jeason.online/tsp_assistant?charset=utf8mb4"
# DATABASE_URL = "sqlite:///local_test.db" # 本地测试数据库
# 知识库配置
KNOWLEDGE_BASE_PATH = "data/knowledge_base"
VECTOR_DB_PATH = "data/vector_db"
# 对话配置
MAX_HISTORY_LENGTH = 10
RESPONSE_TIMEOUT = 30
# 分析配置
ANALYTICS_UPDATE_INTERVAL = 3600 # 1小时
ALERT_THRESHOLD = 0.8 # 预警阈值
# 日志配置
LOG_LEVEL = "INFO"
LOG_FILE = "logs/tsp_assistant.log"
# 系统监控配置
SYSTEM_MONITORING = True # 是否启用系统监控
MONITORING_INTERVAL = 60 # 监控间隔(秒)
@classmethod
def get_api_config(cls) -> Dict[str, Any]:
"""获取API配置"""
return {
"api_key": cls.ALIBABA_API_KEY,
"base_url": cls.ALIBABA_BASE_URL,
"model_name": cls.ALIBABA_MODEL_NAME
}
@classmethod
def get_database_config(cls) -> Dict[str, Any]:
"""获取数据库配置"""
return {
"url": cls.DATABASE_URL,
"echo": False
}
@classmethod
def get_knowledge_config(cls) -> Dict[str, Any]:
"""获取知识库配置"""
return {
"base_path": cls.KNOWLEDGE_BASE_PATH,
"vector_db_path": cls.VECTOR_DB_PATH
}
@classmethod
def get_config(cls) -> Dict[str, Any]:
"""获取完整配置"""
return {
"system_monitoring": cls.SYSTEM_MONITORING,
"monitoring_interval": cls.MONITORING_INTERVAL,
"log_level": cls.LOG_LEVEL,
"log_file": cls.LOG_FILE,
"analytics_update_interval": cls.ANALYTICS_UPDATE_INTERVAL,
"alert_threshold": cls.ALERT_THRESHOLD
}


@@ -1,26 +0,0 @@
@echo off
rem 获取当前日期和时间
for /f "tokens=1-6 delims=/ " %%a in ('date /t') do set CDATE=%%a-%%b-%%c
for /f "tokens=1-2 delims=:\ " %%a in ('time /t') do set CTIME=%%a-%%b
set COMMIT_MESSAGE="feat: %CDATE% %CTIME% - 全面架构重构、功能增强及问题修复"
rem 添加所有变更到暂存区
git add .
rem 检查是否有文件被添加、修改或删除
git diff --cached --quiet
if %errorlevel% equ 0 (
echo 没有检测到需要提交的变更。
) else (
rem 提交变更
git commit -m %COMMIT_MESSAGE%
rem 推送变更到远程仓库
git push origin master
echo Git 推送完成!
)
pause


@@ -1,23 +0,0 @@
#!/bin/bash
# 获取当前日期和时间
COMMIT_DATE=$(date +"%Y-%m-%d %H:%M:%S")
# 添加所有变更到暂存区
git add .
# 检查是否有文件被添加、修改或删除
if git diff --cached --quiet; then
echo "没有检测到需要提交的变更。"
else
# 创建提交消息
COMMIT_MESSAGE="feat: ${COMMIT_DATE} - 全面架构重构、功能增强及问题修复"
# 提交变更
git commit -m "$COMMIT_MESSAGE"
# 推送变更到远程仓库
git push origin master
echo "Git 推送完成!"
fi


@@ -15,7 +15,7 @@ from pathlib import Path
 # 添加项目根目录到Python路径
 sys.path.append(os.path.dirname(os.path.abspath(__file__)))
-from src.config.config import Config
+from src.config.unified_config import get_config
 from src.utils.helpers import setup_logging
 from src.core.database import db_manager
 from src.core.models import (
@@ -67,7 +67,8 @@ class DatabaseInitializer:
         try:
             # 设置日志
-            setup_logging(Config.LOG_LEVEL, Config.LOG_FILE)
+            config = get_config()
+            setup_logging(config.server.log_level, "logs/tsp_assistant.log")
             # 测试数据库连接
             if not self._test_connection():


@@ -0,0 +1,380 @@
2026-02-10 23:22:54,720 - __main__ - INFO - 正在启动TSP智能助手综合管理平台...
2026-02-10 23:22:54,996 - src.config.unified_config - INFO - Initializing unified configuration from environment variables...
2026-02-10 23:22:54,996 - src.config.unified_config - INFO - Database config loaded.
2026-02-10 23:22:54,996 - src.config.unified_config - INFO - LLM config loaded.
2026-02-10 23:22:54,996 - src.config.unified_config - INFO - Server config loaded.
2026-02-10 23:22:54,996 - src.config.unified_config - INFO - Feishu config loaded.
2026-02-10 23:22:54,996 - src.config.unified_config - INFO - AI Accuracy config loaded.
2026-02-10 23:22:54,996 - src.config.unified_config - INFO - Configuration validation passed (warnings may exist).
2026-02-10 23:22:55,025 - src.core.database - INFO - 数据库初始化成功
2026-02-10 23:22:55,038 - __main__ - INFO - 跳过系统检查,直接启动服务...
2026-02-10 23:22:56,565 - src.core.backup_manager - INFO - 备份数据库初始化成功: tsp_assistant.db
2026-02-10 23:22:56,569 - src.integrations.config_manager - INFO - 配置加载成功
2026-02-10 23:22:56,682 - werkzeug - INFO - WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
* Running on all addresses (0.0.0.0)
* Running on http://127.0.0.1:5001
* Running on http://192.168.31.45:5001
2026-02-10 23:22:56,683 - werkzeug - INFO - Press CTRL+C to quit
2026-02-10 23:22:56,839 - src.web.websocket_server - INFO - 启动WebSocket服务器: ws://localhost:8765
2026-02-10 23:22:56,845 - websockets.server - INFO - server listening on [::1]:8765
2026-02-10 23:22:56,845 - websockets.server - INFO - server listening on 127.0.0.1:8765
2026-02-10 23:22:59,223 - websockets.server - INFO - connection open
2026-02-10 23:22:59,223 - src.web.websocket_server - INFO - 客户端连接: ('::1', 61966, 0, 0)
2026-02-10 23:23:01,871 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:01] "GET / HTTP/1.1" 200 -
2026-02-10 23:23:01,899 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:01] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:23:01,910 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:01] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:23:01,926 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:01] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:23:01,928 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:01] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:23:01,931 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:01] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:23:01,932 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:01] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:23:01,937 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:01] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:23:01,941 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:01] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:23:01,945 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:01] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:23:02,000 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:23:02,003 - src.web.service_manager - INFO - 服务 assistant 已初始化
2026-02-10 23:23:02,003 - src.web.service_manager - INFO - 服务 assistant 已初始化
2026-02-10 23:23:02,004 - src.web.service_manager - INFO - 服务 chat_manager 已初始化
2026-02-10 23:23:02,004 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:23:02,011 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:23:02,013 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:23:02,039 - src.web.service_manager - INFO - 服务 assistant 已初始化
2026-02-10 23:23:02,049 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:23:02,054 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:23:02,056 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:23:02,067 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:02,084 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/alerts?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:23:02,092 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:23:02,094 - src.web.blueprints.feishu_sync - ERROR - 获取同步状态失败: 飞书配置不完整,请先配置飞书应用信息
2026-02-10 23:23:02,095 - websockets.server - INFO - connection open
2026-02-10 23:23:02,101 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:23:02,102 - src.web.websocket_server - INFO - 客户端连接: ('::1', 61991, 0, 0)
2026-02-10 23:23:02,154 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:23:02,195 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:23:02,237 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:23:02,248 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:02] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:23:03,230 - src.agent_assistant - INFO - TSP Agent助手初始化完成
2026-02-10 23:23:03,230 - src.web.service_manager - INFO - 服务 agent_assistant 已初始化
2026-02-10 23:23:03,231 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:03] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:23:05,930 - src.dialogue.realtime_chat - INFO - 创建新会话: session_user_001_1770736985
2026-02-10 23:23:05,931 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:05] "POST /api/chat/session HTTP/1.1" 200 -
2026-02-10 23:23:07,016 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:07] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:07,091 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:07] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:23:07,099 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:07] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:23:07,111 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:07] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:23:08,109 - src.knowledge_base.knowledge_manager - WARNING - 知识库中没有活跃条目
2026-02-10 23:23:11,108 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:11] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:23:11,997 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:11] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:23:12,012 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:12] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:12,233 - src.core.llm_client - INFO - API请求成功
2026-02-10 23:23:12,238 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:12] "POST /api/chat/message HTTP/1.1" 200 -
2026-02-10 23:23:17,013 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:17] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:21,995 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:21] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:23:22,011 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:22] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:27,006 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:27] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:31,997 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:31] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:23:32,030 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:32] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:36,401 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:36] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:23:36,404 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:36] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:23:36,418 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:36] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:37,038 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:37] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:41,991 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:41] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:23:42,010 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:42] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:42,092 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:42] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:23:47,006 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:47] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:51,995 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:51] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:23:52,009 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:52] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:23:57,012 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:23:57] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:01,993 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:01] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:24:02,032 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:02] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:07,008 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:07] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:07,097 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:07] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:24:07,106 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:07] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:24:08,400 - src.dialogue.realtime_chat - INFO - 结束会话: session_user_001_1770736985
2026-02-10 23:24:08,401 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:08] "DELETE /api/chat/session/session_user_001_1770736985 HTTP/1.1" 200 -
2026-02-10 23:24:09,403 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:09] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:24:09,403 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:09] "GET /api/agent/tools/stats HTTP/1.1" 200 -
2026-02-10 23:24:11,992 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:11] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:24:12,023 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:12] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:13,479 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:13] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:24:17,010 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:17] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:17,189 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:17] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:24:19,476 - src.knowledge_base.knowledge_manager - INFO - 向量化器加载成功,包含 1 个条目
2026-02-10 23:24:19,477 - src.knowledge_base.knowledge_manager - INFO - 添加知识库条目成功: 123...
2026-02-10 23:24:19,477 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:19] "POST /api/knowledge HTTP/1.1" 200 -
2026-02-10 23:24:19,484 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:19] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:24:21,903 - src.knowledge_base.knowledge_manager - INFO - 知识库条目验证成功: 1
2026-02-10 23:24:21,903 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:21] "POST /api/knowledge/verify/1 HTTP/1.1" 200 -
2026-02-10 23:24:21,909 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:21] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:24:21,991 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:21] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:24:22,016 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:22] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:27,014 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:27] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:27,824 - src.dialogue.realtime_chat - INFO - 创建新会话: session_user_001_1770737067
2026-02-10 23:24:27,825 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:27] "POST /api/chat/session HTTP/1.1" 200 -
2026-02-10 23:24:31,997 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:31] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:24:32,029 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:32] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:32,226 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 '123' 返回 1 个结果
2026-02-10 23:24:33,510 - src.core.llm_client - INFO - API请求成功
2026-02-10 23:24:33,516 - src.knowledge_base.knowledge_manager - INFO - 成功更新 1 个知识库条目的使用次数
2026-02-10 23:24:33,517 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:33] "POST /api/chat/message HTTP/1.1" 200 -
2026-02-10 23:24:36,402 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:36] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:24:36,407 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:36] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:24:36,410 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:36] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:24:36,433 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:36] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:24:36,440 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:36] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:37,010 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:37] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:37,045 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:37] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:24:39,494 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:39] "GET /api/workorders?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:24:41,994 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:41] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:24:42,035 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:42] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:45,552 - src.dialogue.dialogue_manager - INFO - 创建工单成功: WO20260210232445
2026-02-10 23:24:45,553 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:45] "POST /api/workorders HTTP/1.1" 200 -
2026-02-10 23:24:45,561 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:45] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:24:45,568 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:45] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:24:47,015 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:47] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:51,300 - src.web.websocket_server - INFO - 客户端断开: ('::1', 61991, 0, 0)
2026-02-10 23:24:51,304 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET / HTTP/1.1" 200 -
2026-02-10 23:24:51,328 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:24:51,330 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:24:51,340 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:24:51,341 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:24:51,342 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:24:51,343 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:24:51,345 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:24:51,347 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:24:51,351 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:24:51,385 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:24:51,386 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:24:51,389 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:24:51,395 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:24:51,404 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:24:51,420 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:24:51,429 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:24:51,432 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:24:51,434 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:24:51,437 - src.web.blueprints.feishu_sync - ERROR - 获取同步状态失败: 飞书配置不完整,请先配置飞书应用信息
2026-02-10 23:24:51,439 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:24:51,442 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/workorders?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:24:51,445 - websockets.server - INFO - connection open
2026-02-10 23:24:51,455 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:51,457 - src.web.websocket_server - INFO - 客户端连接: ('::1', 62164, 0, 0)
2026-02-10 23:24:51,464 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:24:51,487 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:24:51,493 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:24:51,504 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:51] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:24:52,700 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:52] "GET /api/workorders/1 HTTP/1.1" 200 -
2026-02-10 23:24:54,133 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 'ewq' 返回 0 个结果
2026-02-10 23:24:55,503 - src.core.llm_client - INFO - API请求成功
2026-02-10 23:24:55,509 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:55] "POST /api/workorders/1/ai-suggestion HTTP/1.1" 200 -
2026-02-10 23:24:56,401 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:56] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:24:56,436 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:56] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:24:56,442 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:56] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:24:56,459 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:56] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:24:59,471 - src.web.blueprints.workorders - ERROR - 计算语义相似度失败: No module named 'sentence_transformers'
2026-02-10 23:24:59,479 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:24:59] "POST /api/workorders/1/human-resolution HTTP/1.1" 200 -
2026-02-10 23:25:01,386 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:01] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:25:01,405 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:01] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:05,076 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:05] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:25:06,400 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:06] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:11,386 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:11] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:25:11,402 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:11] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:16,398 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:16] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:21,392 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:21] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:25:21,439 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:21] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:26,403 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:26] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:31,384 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:31] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:25:31,402 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:31] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:31,428 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:31] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:25:36,400 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:36] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:36,417 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:36] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:25:36,430 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:36] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:38,174 - src.integrations.config_manager - WARNING - 配置现在从 .env 文件读取,无法通过 API 导入
2026-02-10 23:25:38,176 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:38] "POST /api/feishu-sync/config/import HTTP/1.1" 500 -
2026-02-10 23:25:40,582 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:40] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:25:41,384 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:41] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:25:41,408 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:41] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:45,137 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:45] "GET /api/token-monitor/stats HTTP/1.1" 200 -
2026-02-10 23:25:45,151 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:45] "GET /api/token-monitor/records HTTP/1.1" 200 -
2026-02-10 23:25:45,156 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:45] "GET /api/token-monitor/chart HTTP/1.1" 200 -
2026-02-10 23:25:46,401 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:46] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:46,913 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:46] "GET /api/ai-monitor/stats HTTP/1.1" 200 -
2026-02-10 23:25:46,922 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:46] "GET /api/ai-monitor/model-comparison HTTP/1.1" 200 -
2026-02-10 23:25:46,926 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:46] "GET /api/ai-monitor/error-distribution HTTP/1.1" 200 -
2026-02-10 23:25:46,928 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:46] "GET /api/ai-monitor/error-log HTTP/1.1" 200 -
2026-02-10 23:25:49,630 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:49] "GET /api/system-optimizer/status HTTP/1.1" 200 -
2026-02-10 23:25:49,636 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:49] "GET /api/system-optimizer/security-settings HTTP/1.1" 200 -
2026-02-10 23:25:49,637 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:49] "GET /api/system-optimizer/traffic-settings HTTP/1.1" 200 -
2026-02-10 23:25:49,638 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:49] "GET /api/system-optimizer/cost-settings HTTP/1.1" 200 -
2026-02-10 23:25:50,649 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:50] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:25:50,660 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:50] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:25:50,748 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:50] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:25:51,392 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:51] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:25:51,418 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:51] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:53,033 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:53] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:25:56,400 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:25:56,435 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:25:56,444 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:25:56,510 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:25:56,516 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:25:56,520 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:25:56,521 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:25:56,527 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:25:56,535 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:25:56,540 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:25:56,548 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:25:56] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:26:00,334 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:00] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:26:00,335 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:00] "GET /api/agent/tools/stats HTTP/1.1" 200 -
2026-02-10 23:26:01,387 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:01] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:01,402 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:01] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:01,432 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:01] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:26:01,537 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:01] "GET /api/alerts?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:26:02,587 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:02] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:26:05,026 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:05] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:26:05,833 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:05] "GET /api/token-monitor/stats HTTP/1.1" 200 -
2026-02-10 23:26:05,845 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:05] "GET /api/token-monitor/records HTTP/1.1" 200 -
2026-02-10 23:26:05,848 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:05] "GET /api/token-monitor/chart HTTP/1.1" 200 -
2026-02-10 23:26:06,403 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:06] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:07,639 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:07] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:26:11,386 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:11] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:11,405 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:11] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:15,557 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:15] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:26:15,582 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:15] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:26:15,594 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:15] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:15,609 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:15] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:15,613 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:15] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:26:16,406 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:16] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:20,131 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:20] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:21,389 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:21] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:21,427 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:21] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:25,124 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:25] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:25,135 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:25] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:26,407 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:26] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:30,128 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:30] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:31,387 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:31] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:31,411 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:31] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:34,939 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:34] "GET /api/conversations?search=123 HTTP/1.1" 200 -
2026-02-10 23:26:35,122 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:35] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:35,134 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:35] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:36,414 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:36] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:36,437 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:36] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:26:40,132 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:40] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:41,395 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:41] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:41,421 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:41] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:44,247 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:44] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:26:45,121 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:45] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:45,135 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:45] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:46,403 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:46] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:46,666 - src.web.blueprints.knowledge - INFO - 搜索查询: '123'
2026-02-10 23:26:46,667 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 '123' 返回 1 个结果
2026-02-10 23:26:46,668 - src.web.blueprints.knowledge - INFO - 搜索结果数量: 1
2026-02-10 23:26:46,668 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:46] "GET /api/knowledge/search?q=123 HTTP/1.1" 200 -
2026-02-10 23:26:48,113 - src.web.blueprints.knowledge - INFO - 搜索查询: '123'
2026-02-10 23:26:48,114 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 '123' 返回 1 个结果
2026-02-10 23:26:48,114 - src.web.blueprints.knowledge - INFO - 搜索结果数量: 1
2026-02-10 23:26:48,114 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:48] "GET /api/knowledge/search?q=123 HTTP/1.1" 200 -
2026-02-10 23:26:50,140 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:50] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:50,845 - src.web.blueprints.knowledge - INFO - 搜索查询: '3'
2026-02-10 23:26:50,846 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 '3' 返回 1 个结果
2026-02-10 23:26:50,846 - src.web.blueprints.knowledge - INFO - 搜索结果数量: 1
2026-02-10 23:26:50,846 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:50] "GET /api/knowledge/search?q=3 HTTP/1.1" 200 -
2026-02-10 23:26:51,139 - src.web.blueprints.knowledge - INFO - 搜索查询: '3'
2026-02-10 23:26:51,140 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 '3' 返回 1 个结果
2026-02-10 23:26:51,141 - src.web.blueprints.knowledge - INFO - 搜索结果数量: 1
2026-02-10 23:26:51,141 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:51] "GET /api/knowledge/search?q=3 HTTP/1.1" 200 -
2026-02-10 23:26:51,316 - src.web.blueprints.knowledge - INFO - 搜索查询: '3'
2026-02-10 23:26:51,317 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 '3' 返回 1 个结果
2026-02-10 23:26:51,318 - src.web.blueprints.knowledge - INFO - 搜索结果数量: 1
2026-02-10 23:26:51,318 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:51] "GET /api/knowledge/search?q=3 HTTP/1.1" 200 -
2026-02-10 23:26:51,390 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:51] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:51,421 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:51] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:54,108 - src.web.blueprints.knowledge - INFO - 搜索查询: '4'
2026-02-10 23:26:54,109 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 '4' 返回 0 个结果
2026-02-10 23:26:54,110 - src.web.blueprints.knowledge - INFO - 搜索结果数量: 0
2026-02-10 23:26:54,110 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:54] "GET /api/knowledge/search?q=4 HTTP/1.1" 200 -
2026-02-10 23:26:54,318 - src.web.blueprints.knowledge - INFO - 搜索查询: '4'
2026-02-10 23:26:54,320 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 '4' 返回 0 个结果
2026-02-10 23:26:54,320 - src.web.blueprints.knowledge - INFO - 搜索结果数量: 0
2026-02-10 23:26:54,320 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:54] "GET /api/knowledge/search?q=4 HTTP/1.1" 200 -
2026-02-10 23:26:54,582 - src.web.blueprints.knowledge - INFO - 搜索查询: '4'
2026-02-10 23:26:54,583 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 '4' 返回 0 个结果
2026-02-10 23:26:54,583 - src.web.blueprints.knowledge - INFO - 搜索结果数量: 0
2026-02-10 23:26:54,584 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:54] "GET /api/knowledge/search?q=4 HTTP/1.1" 200 -
2026-02-10 23:26:55,119 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:55] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:26:55,145 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:55] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:56,403 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:56] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:26:56,438 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:56] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:26:56,448 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:26:56] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:27:00,134 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:00] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:00,179 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:00] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:27:01,386 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:01] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:27:01,410 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:01] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:05,120 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:05] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:27:05,144 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:05] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:06,404 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:06] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:09,875 - src.agent_assistant - INFO - TSP Agent助手初始化完成
2026-02-10 23:27:09,875 - src.agent_assistant - INFO - 保存知识条目 1: 关于README.md的问题1...
2026-02-10 23:27:09,875 - src.agent_assistant - INFO - 知识条目 1 保存成功
2026-02-10 23:27:09,875 - src.agent_assistant - INFO - 保存知识条目 2: 关于README.md的问题2...
2026-02-10 23:27:09,875 - src.agent_assistant - INFO - 知识条目 2 保存成功
2026-02-10 23:27:09,876 - src.agent_assistant - INFO - 保存知识条目 3: 关于README.md的问题3...
2026-02-10 23:27:09,876 - src.agent_assistant - INFO - 知识条目 3 保存成功
2026-02-10 23:27:09,876 - src.agent_assistant - INFO - 保存知识条目 4: 关于README.md的问题5...
2026-02-10 23:27:09,876 - src.agent_assistant - INFO - 知识条目 4 保存成功
2026-02-10 23:27:09,876 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:09] "POST /api/knowledge/upload HTTP/1.1" 200 -
2026-02-10 23:27:10,136 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:10] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:10,884 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:10] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:27:11,388 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:11] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:27:11,421 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:11] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:11,436 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:11] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:27:14,964 - src.web.websocket_server - INFO - 客户端断开: ('::1', 62164, 0, 0)
2026-02-10 23:27:14,971 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:14] "GET / HTTP/1.1" 200 -
2026-02-10 23:27:14,998 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:14] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:27:15,001 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:27:15,006 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:27:15,006 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:27:15,007 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:27:15,008 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:27:15,017 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:27:15,018 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:27:15,019 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:27:15,122 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:27:15,140 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:15,176 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:27:15,182 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:27:15,187 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:27:15,193 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:27:15,210 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:27:15,231 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:27:15,245 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:27:15,247 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:27:15,254 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:27:15,254 - websockets.server - INFO - connection open
2026-02-10 23:27:15,257 - src.web.blueprints.feishu_sync - ERROR - 获取同步状态失败: 飞书配置不完整,请先配置飞书应用信息
2026-02-10 23:27:15,259 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:15,259 - src.web.websocket_server - INFO - 客户端连接: ('::1', 62446, 0, 0)
2026-02-10 23:27:15,261 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:27:15,263 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:27:15,290 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:27:15,320 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:27:15,335 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:27:15,347 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:15] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:27:18,266 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:18] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:27:20,133 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:20] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:20,192 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:20] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:27:20,202 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:20] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:20,210 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:20] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:27:20,237 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:20] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:27:20,248 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:20] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:27:20,262 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:20] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:27:20,451 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:20] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:27:20,990 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:20] "GET /api/alerts?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:27:21,530 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:21] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:27:21,997 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:21] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:27:22,184 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:22] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:27:22,349 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:22] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:27:25,123 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:25] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:27:25,137 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:25] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:25,178 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:25] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:27:25,192 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:25] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:30,199 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:30] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:27:30,227 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:30] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:27:30,245 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:27:30] "GET /api/monitor/status HTTP/1.1" 200 -


2026-02-10 23:48:47,381 - __main__ - INFO - Starting the TSP Intelligent Assistant integrated management platform...
2026-02-10 23:48:47,630 - src.config.unified_config - INFO - Initializing unified configuration from environment variables...
2026-02-10 23:48:47,630 - src.config.unified_config - INFO - Database config loaded.
2026-02-10 23:48:47,630 - src.config.unified_config - INFO - LLM config loaded.
2026-02-10 23:48:47,630 - src.config.unified_config - INFO - Server config loaded.
2026-02-10 23:48:47,630 - src.config.unified_config - INFO - Feishu config loaded.
2026-02-10 23:48:47,630 - src.config.unified_config - INFO - AI Accuracy config loaded.
2026-02-10 23:48:47,630 - src.config.unified_config - INFO - Configuration validation passed (warnings may exist).
2026-02-10 23:48:47,643 - src.core.database - INFO - Database initialized successfully
2026-02-10 23:48:47,646 - __main__ - INFO - Skipping system checks, starting services directly...
2026-02-10 23:48:48,910 - src.core.backup_manager - INFO - Backup database initialized successfully: tsp_assistant.db
2026-02-10 23:48:48,913 - src.integrations.config_manager - INFO - Configuration loaded successfully
2026-02-10 23:48:49,025 - werkzeug - INFO - WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
* Running on all addresses (0.0.0.0)
* Running on http://127.0.0.1:5001
* Running on http://192.168.31.45:5001
2026-02-10 23:48:49,025 - werkzeug - INFO - Press CTRL+C to quit
2026-02-10 23:48:49,026 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:48:49,050 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; the vectorizer will remain empty
2026-02-10 23:48:49,051 - src.web.websocket_server - INFO - Starting WebSocket server: ws://localhost:8765
2026-02-10 23:48:49,054 - websockets.server - INFO - server listening on [::1]:8765
2026-02-10 23:48:49,054 - websockets.server - INFO - server listening on 127.0.0.1:8765
2026-02-10 23:48:52,832 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:52] "GET / HTTP/1.1" 200 -
2026-02-10 23:48:53,497 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:53] "GET /static/js/components/AlertManager.js HTTP/1.1" 200 -
2026-02-10 23:48:53,498 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:53] "GET /static/js/core/store.js HTTP/1.1" 200 -
2026-02-10 23:48:53,498 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:53] "GET /static/js/services/api.js HTTP/1.1" 200 -
2026-02-10 23:48:53,499 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:53] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 200 -
2026-02-10 23:48:53,500 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:53] "GET /static/js/components/NotificationManager.js HTTP/1.1" 200 -
2026-02-10 23:48:53,501 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:53] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 200 -
2026-02-10 23:48:53,524 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:53] "GET /static/js/app-new.js HTTP/1.1" 200 -
2026-02-10 23:48:53,529 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:53] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 200 -
2026-02-10 23:48:53,613 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:53] "GET /static/css/design-system.css HTTP/1.1" 200 -
2026-02-10 23:48:55,761 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:48:55,764 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:55] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:48:55,765 - websockets.server - INFO - connection open
2026-02-10 23:48:55,767 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:48:55,768 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; the vectorizer will remain empty
2026-02-10 23:48:55,771 - src.web.websocket_server - INFO - Client connected: ('::1', 63792, 0, 0)
2026-02-10 23:48:55,776 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:48:55,792 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:48:55,793 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; the vectorizer will remain empty
2026-02-10 23:48:55,798 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:55] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:48:55,803 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; the vectorizer will remain empty
2026-02-10 23:48:55,803 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:48:55,808 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:48:55,809 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; the vectorizer will remain empty
2026-02-10 23:48:55,821 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:55] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:48:55,831 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; the vectorizer will remain empty
2026-02-10 23:48:55,851 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT count(*) AS count_1
FROM (SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders ORDER BY work_orders.created_at DESC) AS anon_1]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:55,852 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; the vectorizer will remain empty
2026-02-10 23:48:55,856 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:48:55,863 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:55] "GET /api/workorders HTTP/1.1" 500 -
2026-02-10 23:48:55,880 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; the vectorizer will remain empty
2026-02-10 23:48:55,884 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:55] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:48:55,908 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:55] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:48:55,936 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:55] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:48:55,939 - src.web.blueprints.feishu_sync - ERROR - Failed to get sync status: Feishu configuration is incomplete; please configure the Feishu app information first
2026-02-10 23:48:55,967 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:55] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:48:56,002 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:48:56,021 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:56,023 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:56,027 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:48:56,065 - src.core.database - ERROR - Database operation failed: (sqlite3.DatabaseError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:48:56.033067',)]
(Background on this error at: https://sqlalche.me/e/20/4xp6)
2026-02-10 23:48:56,067 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.DatabaseError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:48:56.033067',)]
(Background on this error at: https://sqlalche.me/e/20/4xp6)
2026-02-10 23:48:56,071 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:48:56,086 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:48:56,110 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:56,111 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:56,114 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:48:56,154 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:56,155 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:56,157 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:48:56,359 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:48:56,400 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:48:56,427 - src.core.database - ERROR - Database operation failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:48:56,433 - src.core.query_optimizer - ERROR - Optimized analytics query failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:48:56,435 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:48:56,479 - src.core.database - ERROR - Database operation failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:48:56,483 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:48:56.480246',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:56,484 - src.core.query_optimizer - ERROR - Optimized analytics query failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:48:56,491 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:48:56.480246',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:56,498 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:48:56,517 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:56] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:48:56,542 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:48:56.537599',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:56,547 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:48:56.537599',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:57,683 - src.agent_assistant - INFO - TSP Agent assistant initialization complete
2026-02-10 23:48:57,684 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:57] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:48:58,402 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET / HTTP/1.1" 200 -
2026-02-10 23:48:58,437 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:48:58,440 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:48:58,441 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:48:58,447 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:48:58,449 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:48:58,455 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:48:58,457 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:48:58,460 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:48:58,462 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:48:58,843 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:48:58,845 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:48:58,855 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:48:58,861 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:48:58,886 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:48:58,892 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT count(*) AS count_1
FROM (SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders ORDER BY work_orders.created_at DESC) AS anon_1]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:58,896 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/workorders HTTP/1.1" 500 -
2026-02-10 23:48:58,907 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:48:58,922 - src.core.database - ERROR - Database operation failed: (sqlite3.DatabaseError) another row available
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations) AS anon_1]
(Background on this error at: https://sqlalche.me/e/20/4xp6)
2026-02-10 23:48:58,925 - src.core.query_optimizer - ERROR - Paginated conversation query failed: (sqlite3.DatabaseError) another row available
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations) AS anon_1]
(Background on this error at: https://sqlalche.me/e/20/4xp6)
2026-02-10 23:48:58,929 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:48:58,931 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:48:58,932 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:48:58,933 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:48:58,939 - src.web.blueprints.feishu_sync - ERROR - Failed to fetch sync status: Feishu configuration is incomplete; please configure the Feishu app details first
2026-02-10 23:48:58,942 - websockets.server - INFO - connection open
2026-02-10 23:48:58,948 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:48:58.945590',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:58,950 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:48:58,951 - src.web.websocket_server - INFO - Client connected: ('::1', 63842, 0, 0)
2026-02-10 23:48:58,951 - src.web.error_handlers - ERROR - Unhandled error in get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:48:58.945590',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:58,966 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:58,966 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:58,967 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:48:58,997 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:58] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:48:59,008 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:59,009 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:59,010 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:59] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:48:59,020 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:59,021 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:48:59,022 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:48:59] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:49:00,644 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:00.643444',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:00,645 - src.web.error_handlers - ERROR - Unhandled error in get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:00.643444',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:00,650 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:00] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:00,698 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:00] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:49:00,698 - src.core.database - ERROR - Database operation failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:00,701 - src.core.query_optimizer - ERROR - Optimized analytics query failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:00,703 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:00] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:49:00,710 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:00,712 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:00,714 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:00] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:49:00,725 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:00.721862',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:00,725 - src.web.error_handlers - ERROR - Unhandled error in get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:00.721862',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:03,858 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:03.857297',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:03,860 - src.web.error_handlers - ERROR - Unhandled error in get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:03.857297',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:03,863 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:03] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:03,927 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:03] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:49:03,932 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:03,933 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:03,936 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:03] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:49:03,948 - src.core.database - ERROR - Database operation failed: (sqlite3.DatabaseError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/4xp6)
2026-02-10 23:49:03,951 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.DatabaseError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/4xp6)
2026-02-10 23:49:03,954 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:03] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:49:03,955 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:03.952792',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:03,957 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:03.952792',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:04,970 - src.web.websocket_server - INFO - Client disconnected: ('::1', 63842, 0, 0)
2026-02-10 23:49:04,975 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:04] "GET / HTTP/1.1" 200 -
2026-02-10 23:49:05,012 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:49:05,014 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:49:05,019 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:49:05,025 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:49:05,026 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:49:05,034 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:49:05,036 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:49:05,038 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:49:05,042 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:49:05,162 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:49:05,168 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:49:05,184 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:49:05,193 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:49:05,217 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:49:05,227 - src.core.database - ERROR - Database operation failed: (sqlite3.DatabaseError) no such column: work_orders.assigned_module
[SQL: SELECT count(*) AS count_1
FROM (SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders ORDER BY work_orders.created_at DESC) AS anon_1]
(Background on this error at: https://sqlalche.me/e/20/4xp6)
2026-02-10 23:49:05,230 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/workorders HTTP/1.1" 500 -
2026-02-10 23:49:05,244 - websockets.server - INFO - connection open
2026-02-10 23:49:05,254 - src.web.websocket_server - INFO - Client connected: ('::1', 63889, 0, 0)
2026-02-10 23:49:05,257 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:49:05,259 - src.core.database - ERROR - Database operation failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:05,265 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:49:05,266 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:49:05,275 - src.web.error_handlers - ERROR - Unhandled error get_health: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:05,279 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:49:05,281 - src.web.blueprints.feishu_sync - ERROR - Failed to get sync status: Feishu configuration is incomplete; please configure the Feishu app credentials first
2026-02-10 23:49:05,283 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:05,293 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:49:05,293 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:05,295 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:05,298 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:49:05,320 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:49:05,327 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:05,328 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:05,330 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:49:05,337 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:05,337 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:05,339 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:49:05,632 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:49:05,661 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:05.657713',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:05,664 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:05.657713',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:05,671 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:05] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:10,174 - src.core.database - ERROR - Database operation failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:10,176 - src.web.error_handlers - ERROR - Unhandled error get_health: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:10,178 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:10] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:10,649 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:10.648202',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:10,650 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:10.648202',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:10,655 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:10] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:15,437 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:15] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:49:15,469 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:15.467371',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:15,471 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:15.467371',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:15,477 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:15] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:15,633 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:15] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:49:15,654 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:15] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:15,672 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:15.671190',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:15,672 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:15.671190',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:20,653 - src.core.database - ERROR - Database operation failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:20,654 - src.web.error_handlers - ERROR - Unhandled error get_health: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:20,658 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:20] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:21,941 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:21.940713',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:21,942 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:21.940713',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:21,947 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:21] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:25,489 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:49:25,773 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:49:25,801 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:49:25,814 - src.core.database - ERROR - Database operation failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:25,819 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/workorders HTTP/1.1" 500 -
2026-02-10 23:49:25,820 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:49:25,838 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:49:25,838 - src.core.database - ERROR - Database operation failed: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:25,839 - src.web.error_handlers - ERROR - Unhandled error get_health: <sqlite3.Connection object at 0x1065fd840> returned NULL without setting an exception
2026-02-10 23:49:25,843 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:25,862 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:25.860290',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:25,863 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:25.860290',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:25,864 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:25,866 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:25,867 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:49:25,879 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:49:25,886 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:25,886 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:25,887 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:49:25,897 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:25,898 - src.core.query_optimizer - ERROR - Optimized analytics query failed: (sqlite3.OperationalError) no such column: work_orders.assigned_module
[SQL: SELECT work_orders.id AS work_orders_id, work_orders.order_id AS work_orders_order_id, work_orders.title AS work_orders_title, work_orders.description AS work_orders_description, work_orders.category AS work_orders_category, work_orders.priority AS work_orders_priority, work_orders.status AS work_orders_status, work_orders.created_at AS work_orders_created_at, work_orders.updated_at AS work_orders_updated_at, work_orders.resolution AS work_orders_resolution, work_orders.satisfaction_score AS work_orders_satisfaction_score, work_orders.feishu_record_id AS work_orders_feishu_record_id, work_orders.assignee AS work_orders_assignee, work_orders.solution AS work_orders_solution, work_orders.ai_suggestion AS work_orders_ai_suggestion, work_orders.source AS work_orders_source, work_orders.module AS work_orders_module, work_orders.created_by AS work_orders_created_by, work_orders.wilfulness AS work_orders_wilfulness, work_orders.date_of_close AS work_orders_date_of_close, work_orders.vehicle_type AS work_orders_vehicle_type, work_orders.vin_sim AS work_orders_vin_sim, work_orders.app_remote_control_version AS work_orders_app_remote_control_version, work_orders.hmi_sw AS work_orders_hmi_sw, work_orders.parent_record AS work_orders_parent_record, work_orders.has_updated_same_day AS work_orders_has_updated_same_day, work_orders.operating_time AS work_orders_operating_time, work_orders.assigned_module AS work_orders_assigned_module, work_orders.module_owner AS work_orders_module_owner, work_orders.dispatcher AS work_orders_dispatcher, work_orders.dispatch_time AS work_orders_dispatch_time, work_orders.region AS work_orders_region
FROM work_orders]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:25,900 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:25] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:49:27,790 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:27] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:27,800 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:27.799327',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:27,800 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:27.799327',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:30,641 - src.core.database - ERROR - Database operation failed: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:30.640792',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:30,642 - src.web.error_handlers - ERROR - Unhandled error get_health: (sqlite3.OperationalError) no such column: conversations.ip_address
[SQL: SELECT count(*) AS count_1
FROM (SELECT conversations.id AS conversations_id, conversations.work_order_id AS conversations_work_order_id, conversations.user_message AS conversations_user_message, conversations.assistant_response AS conversations_assistant_response, conversations.timestamp AS conversations_timestamp, conversations.confidence_score AS conversations_confidence_score, conversations.knowledge_used AS conversations_knowledge_used, conversations.response_time AS conversations_response_time, conversations.ip_address AS conversations_ip_address, conversations.invocation_method AS conversations_invocation_method
FROM conversations
WHERE conversations.timestamp >= ?) AS anon_1]
[parameters: ('2026-02-10 22:49:30.640792',)]
(Background on this error at: https://sqlalche.me/e/20/e3q8)
2026-02-10 23:49:30,642 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:30] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:49:30,693 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:49:30] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
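The repeated `no such column` errors above indicate the on-disk SQLite schema has fallen behind the ORM models: the queries reference `conversations.ip_address`, `conversations.invocation_method`, and `work_orders.assigned_module`, which do not yet exist in the database file. A minimal migration sketch, assuming plain SQLite with no migration framework; the table and column names are taken from the error messages, and everything else (function name, TEXT types) is illustrative:

```python
import sqlite3

# Columns the ORM expects but the errors show are missing on disk.
# TEXT is an assumption; adjust types to match the actual model definitions.
MIGRATIONS = [
    ("conversations", "ip_address", "TEXT"),
    ("conversations", "invocation_method", "TEXT"),
    ("work_orders", "assigned_module", "TEXT"),
]

def apply_missing_columns(db_path: str) -> None:
    """Add any of the expected columns that the database does not have yet."""
    conn = sqlite3.connect(db_path)
    try:
        for table, column, col_type in MIGRATIONS:
            # PRAGMA table_info lists existing columns (name is field index 1).
            existing = {row[1] for row in conn.execute(f"PRAGMA table_info({table})")}
            if column not in existing:
                conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {col_type}")
        conn.commit()
    finally:
        conn.close()
```

Running `apply_missing_columns("tsp_assistant.db")` once before restarting the server should clear these particular errors, assuming no other columns have drifted.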


@@ -0,0 +1,298 @@
2026-02-10 23:51:10,759 - __main__ - INFO - Starting the TSP Intelligent Assistant integrated management platform...
2026-02-10 23:51:11,024 - src.config.unified_config - INFO - Initializing unified configuration from environment variables...
2026-02-10 23:51:11,024 - src.config.unified_config - INFO - Database config loaded.
2026-02-10 23:51:11,024 - src.config.unified_config - INFO - LLM config loaded.
2026-02-10 23:51:11,024 - src.config.unified_config - INFO - Server config loaded.
2026-02-10 23:51:11,024 - src.config.unified_config - INFO - Feishu config loaded.
2026-02-10 23:51:11,025 - src.config.unified_config - INFO - AI Accuracy config loaded.
2026-02-10 23:51:11,025 - src.config.unified_config - INFO - Configuration validation passed (warnings may exist).
2026-02-10 23:51:11,038 - src.core.database - INFO - Database initialized successfully
2026-02-10 23:51:11,040 - __main__ - INFO - Skipping system checks, starting services directly...
2026-02-10 23:51:12,376 - src.core.backup_manager - INFO - Backup database initialized successfully: tsp_assistant.db
2026-02-10 23:51:12,381 - src.integrations.config_manager - INFO - Configuration loaded successfully
2026-02-10 23:51:12,536 - werkzeug - INFO - WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
* Running on all addresses (0.0.0.0)
* Running on http://127.0.0.1:5001
* Running on http://192.168.31.45:5001
2026-02-10 23:51:12,536 - werkzeug - INFO - Press CTRL+C to quit
2026-02-10 23:51:12,541 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:51:12,573 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:51:12,574 - src.web.websocket_server - INFO - Starting WebSocket server: ws://localhost:8765
2026-02-10 23:51:12,578 - websockets.server - INFO - server listening on 127.0.0.1:8765
2026-02-10 23:51:12,579 - websockets.server - INFO - server listening on [::1]:8765
2026-02-10 23:51:12,966 - websockets.server - INFO - connection open
2026-02-10 23:51:12,966 - src.web.websocket_server - INFO - Client connected: ('::1', 64226, 0, 0)
2026-02-10 23:51:14,450 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:51:14,451 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:51:14,453 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:51:14,457 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:14] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:14,457 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:51:14,458 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:51:14,462 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:51:14,464 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:14] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:14,465 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:14] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:14,468 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:51:14,469 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:51:14,473 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:51:14,473 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:51:14,474 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:51:14,487 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:51:14,493 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:14] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:51:14,538 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:14] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:15,629 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:15] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:15,659 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:15] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:15,692 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:15] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:16,549 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET / HTTP/1.1" 200 -
2026-02-10 23:51:16,581 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:16,583 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:51:16,585 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:16,586 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:51:16,587 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:51:16,589 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:51:16,591 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:51:16,593 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:51:16,594 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:51:16,660 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:16,661 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:51:16,678 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:51:16,684 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:51:16,694 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:51:16,704 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:51:16,706 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:16,715 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:16,727 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:51:16,739 - websockets.server - INFO - connection open
2026-02-10 23:51:16,748 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:51:16,751 - src.web.websocket_server - INFO - Client connected: ('::1', 64265, 0, 0)
2026-02-10 23:51:16,754 - src.web.blueprints.feishu_sync - ERROR - Failed to get sync status: Feishu configuration is incomplete; please configure the Feishu app info first
2026-02-10 23:51:16,756 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:51:16,758 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:51:16,766 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:16,776 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:51:16,799 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:51:16,824 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:16,831 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:16] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:51:17,552 - src.agent_assistant - INFO - TSP Agent assistant initialized
2026-02-10 23:51:17,553 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:17] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:51:19,884 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:19] "GET /api/workorders?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:20,641 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:20] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:21,177 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:21] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:21,625 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:21] "GET /api/alerts?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:21,682 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:21] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:21,710 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:21] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:21,711 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:21] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:21,728 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:21] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:51:22,136 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:22] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:51:22,137 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:22] "GET /api/agent/tools/stats HTTP/1.1" 200 -
2026-02-10 23:51:24,059 - src.dialogue.realtime_chat - INFO - Created new session: session_user_001_1770738684
2026-02-10 23:51:24,060 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:24] "POST /api/chat/session HTTP/1.1" 200 -
2026-02-10 23:51:25,249 - src.knowledge_base.knowledge_manager - WARNING - No active entries in knowledge base
2026-02-10 23:51:25,635 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:25] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:51:25,637 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:25] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:25,649 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:25] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:25,688 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:25] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:51:25,699 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:25] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:25,700 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:25] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:51:25,715 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:25] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:51:25,731 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:25] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:51:25,742 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:25] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:25,763 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:25] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:51:26,661 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:26] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:26,697 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:26] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:28,485 - src.core.llm_client - INFO - API request succeeded
2026-02-10 23:51:28,490 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:28] "POST /api/chat/message HTTP/1.1" 200 -
2026-02-10 23:51:30,251 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:30] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:30,645 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:30] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:31,677 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:31] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:33,186 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:33] "POST /api/conversations/migrate-merge HTTP/1.1" 200 -
2026-02-10 23:51:33,189 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:33] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:35,639 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:35] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:35,664 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:35] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:36,253 - src.web.websocket_server - INFO - Client disconnected: ('::1', 64265, 0, 0)
2026-02-10 23:51:36,257 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET / HTTP/1.1" 200 -
2026-02-10 23:51:36,280 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:36,281 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:36,288 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:51:36,289 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:51:36,291 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:51:36,292 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:51:36,293 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:51:36,294 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:51:36,297 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:51:36,360 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:36,362 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:51:36,364 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:51:36,372 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:51:36,383 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:36,394 - websockets.server - INFO - connection open
2026-02-10 23:51:36,397 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:51:36,402 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:36,403 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:51:36,404 - src.web.websocket_server - INFO - Client connected: ('::1', 64376, 0, 0)
2026-02-10 23:51:36,412 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:36,414 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:51:36,416 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:51:36,423 - src.web.blueprints.feishu_sync - ERROR - Failed to get sync status: Feishu configuration is incomplete; please configure the Feishu app credentials first
2026-02-10 23:51:36,433 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:51:36,441 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:51:36,458 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:51:36,466 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:36,475 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:36] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:51:38,509 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:38] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:40,466 - src.web.websocket_server - INFO - Client disconnected: ('::1', 64376, 0, 0)
2026-02-10 23:51:40,469 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET / HTTP/1.1" 200 -
2026-02-10 23:51:40,495 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:40,498 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:40,503 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:51:40,509 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:51:40,512 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:51:40,515 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:51:40,516 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:51:40,519 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:51:40,520 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:51:40,641 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:40,675 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:40,676 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:51:40,677 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:51:40,692 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:51:40,701 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:40,711 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:40,716 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:51:40,725 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:51:40,725 - websockets.server - INFO - connection open
2026-02-10 23:51:40,730 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:40,731 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:51:40,732 - src.web.websocket_server - INFO - Client connected: ('::1', 64413, 0, 0)
2026-02-10 23:51:40,739 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:51:40,743 - src.web.blueprints.feishu_sync - ERROR - Failed to get sync status: Feishu configuration is incomplete; please configure the Feishu app credentials first
2026-02-10 23:51:40,744 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:51:40,752 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:51:40,766 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:51:40,772 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:40,780 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:40] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:51:42,303 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:42] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:51:42,303 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:42] "GET /api/agent/tools/stats HTTP/1.1" 200 -
2026-02-10 23:51:44,893 - src.dialogue.realtime_chat - INFO - Created new session: session_user_001_1770738704
2026-02-10 23:51:44,895 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:44] "POST /api/chat/session HTTP/1.1" 200 -
2026-02-10 23:51:45,629 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:45] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:45,658 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:45] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:45,692 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:45] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:45,709 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:45] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:45,713 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:45] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:45,723 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:45] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:51:46,347 - src.knowledge_base.knowledge_manager - WARNING - No active entries in knowledge base
2026-02-10 23:51:49,951 - src.core.llm_client - INFO - API request succeeded
2026-02-10 23:51:49,961 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:49] "POST /api/chat/message HTTP/1.1" 200 -
2026-02-10 23:51:50,642 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:50] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:50,679 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:50] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:50,703 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:50] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:51,250 - src.dialogue.realtime_chat - INFO - Ended session: session_user_001_1770738704
2026-02-10 23:51:51,251 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:51] "DELETE /api/chat/session/session_user_001_1770738704 HTTP/1.1" 200 -
2026-02-10 23:51:52,872 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:52] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:55,386 - src.web.websocket_server - INFO - Client disconnected: ('::1', 64413, 0, 0)
2026-02-10 23:51:55,389 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET / HTTP/1.1" 200 -
2026-02-10 23:51:55,411 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:55,412 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:55,418 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:51:55,422 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:51:55,424 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:51:55,425 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:51:55,426 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:51:55,427 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:51:55,430 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:51:55,489 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:51:55,491 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:51:55,495 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:51:55,496 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:55,511 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:55,518 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:51:55,520 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:51:55,521 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:55,522 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:51:55,530 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:51:55,530 - websockets.server - INFO - connection open
2026-02-10 23:51:55,539 - src.web.websocket_server - INFO - Client connected: ('::1', 64475, 0, 0)
2026-02-10 23:51:55,542 - src.web.blueprints.feishu_sync - ERROR - Failed to get sync status: Feishu configuration is incomplete; please configure the Feishu app credentials first
2026-02-10 23:51:55,544 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:51:55,545 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:55,555 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:51:55,574 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:51:55,583 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:55,591 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:51:55,632 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:55,638 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:51:55,647 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:55,653 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:51:55,661 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:51:55,675 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:55,676 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:51:55,690 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:51:55,699 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:55,708 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:55,718 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:55] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:51:56,132 - src.web.websocket_server - INFO - Client disconnected: ('::1', 64475, 0, 0)
2026-02-10 23:51:56,135 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET / HTTP/1.1" 200 -
2026-02-10 23:51:56,158 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:56,159 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:56,168 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:51:56,169 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:51:56,170 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:51:56,172 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:51:56,173 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:51:56,174 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:51:56,176 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:51:56,236 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:51:56,239 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:51:56,241 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:56,249 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:51:56,260 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:56,265 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:51:56,269 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:51:56,276 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:56,277 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:56,280 - websockets.server - INFO - connection open
2026-02-10 23:51:56,283 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:51:56,288 - src.web.blueprints.feishu_sync - ERROR - Failed to get sync status: Feishu configuration is incomplete; please configure the Feishu app credentials first
2026-02-10 23:51:56,291 - src.web.websocket_server - INFO - Client connected: ('::1', 64530, 0, 0)
2026-02-10 23:51:56,299 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:51:56,301 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:51:56,314 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:51:56,334 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:51:56,344 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:56,355 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:51:56,693 - src.web.websocket_server - INFO - Client disconnected: ('::1', 64530, 0, 0)
2026-02-10 23:51:56,697 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET / HTTP/1.1" 200 -
2026-02-10 23:51:56,719 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:56,720 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:51:56,723 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:51:56,727 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:51:56,727 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:51:56,728 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:51:56,731 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:51:56,733 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:51:56,736 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:51:56,828 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:51:56,829 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:51:56,840 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:51:56,846 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:51:56,867 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:51:56,877 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:51:56,879 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:51:56,881 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:51:56,892 - websockets.server - INFO - connection open
2026-02-10 23:51:56,897 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:51:56,904 - src.web.websocket_server - INFO - Client connected: ('::1', 64557, 0, 0)
2026-02-10 23:51:56,906 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:51:56,908 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:51:56,910 - src.web.blueprints.feishu_sync - ERROR - Failed to get sync status: Feishu configuration is incomplete; please configure the Feishu app credentials first
2026-02-10 23:51:56,918 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:51:56,935 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:51:56,956 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:51:56,968 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:51:56,977 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:51:56] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:52:00,641 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:52:00] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:52:01,926 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:52:01] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:52:05,630 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:52:05] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:52:05,659 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:52:05] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:52:08,742 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:52:08] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:52:08,759 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:52:08] "GET /api/monitor/status HTTP/1.1" 200 -


@@ -0,0 +1,237 @@
2026-02-10 23:55:42,753 - __main__ - INFO - Starting TSP Intelligent Assistant Management Platform...
2026-02-10 23:55:42,982 - src.config.unified_config - INFO - Initializing unified configuration from environment variables...
2026-02-10 23:55:42,982 - src.config.unified_config - INFO - Database config loaded.
2026-02-10 23:55:42,982 - src.config.unified_config - INFO - LLM config loaded.
2026-02-10 23:55:42,982 - src.config.unified_config - INFO - Server config loaded.
2026-02-10 23:55:42,982 - src.config.unified_config - INFO - Feishu config loaded.
2026-02-10 23:55:42,982 - src.config.unified_config - INFO - AI Accuracy config loaded.
2026-02-10 23:55:42,982 - src.config.unified_config - INFO - Configuration validation passed (warnings may exist).
2026-02-10 23:55:42,994 - src.core.database - INFO - Database initialized successfully
2026-02-10 23:55:42,996 - __main__ - INFO - Skipping system checks; starting services directly...
2026-02-10 23:55:44,509 - src.core.backup_manager - INFO - Backup database initialized successfully: tsp_assistant.db
2026-02-10 23:55:44,513 - src.integrations.config_manager - INFO - Configuration loaded successfully
2026-02-10 23:55:44,684 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:55:44,685 - werkzeug - INFO - WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
* Running on all addresses (0.0.0.0)
* Running on http://127.0.0.1:5001
* Running on http://192.168.31.45:5001
2026-02-10 23:55:44,695 - werkzeug - INFO - Press CTRL+C to quit
2026-02-10 23:55:44,717 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:55:44,718 - src.web.websocket_server - INFO - Starting WebSocket server: ws://localhost:8765
2026-02-10 23:55:44,722 - websockets.server - INFO - server listening on 127.0.0.1:8765
2026-02-10 23:55:44,723 - websockets.server - INFO - server listening on [::1]:8765
2026-02-10 23:55:47,331 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:55:47,332 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:55:47,346 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:55:47,355 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:55:47,358 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:55:47,361 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:47] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:55:47,362 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:55:47,362 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:47] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:55:47,374 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:55:47,374 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:55:47,375 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:55:47,383 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:47] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:55:47,385 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:55:47,388 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:55:47,394 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:55:47,427 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:47] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:55:47,443 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:47] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:55:49,849 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:49] "GET / HTTP/1.1" 200 -
2026-02-10 23:55:49,883 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:49] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:55:49,884 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:49] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:55:49,884 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:49] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:55:49,885 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:49] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:55:49,893 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:49] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:55:49,895 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:49] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:55:49,897 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:49] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:55:49,897 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:49] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:55:49,898 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:49] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:55:50,144 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:55:50,149 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:55:50,151 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:55:50,159 - src.knowledge_base.knowledge_manager - WARNING - Knowledge base has no active entries yet; vectorizer will remain empty
2026-02-10 23:55:50,161 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:55:50,162 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:55:50,186 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:55:50,191 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:55:50,201 - websockets.server - INFO - connection open
2026-02-10 23:55:50,205 - src.web.websocket_server - INFO - Client connected: ('::1', 65043, 0, 0)
2026-02-10 23:55:50,220 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:55:50,222 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:55:50,230 - src.web.blueprints.feishu_sync - ERROR - Failed to get sync status: Feishu configuration is incomplete; please configure the Feishu app credentials first
2026-02-10 23:55:50,235 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:55:50,241 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:55:50,262 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:55:50,262 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:55:50,298 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:55:50,320 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:55:50,342 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:50] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:55:51,214 - src.agent_assistant - INFO - TSP Agent assistant initialization complete
2026-02-10 23:55:51,214 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:51] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:55:54,013 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:54] "GET /api/conversations/3 HTTP/1.1" 200 -
2026-02-10 23:55:55,158 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:55] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:55:55,189 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:55] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:55:55,190 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:55] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:55:55,199 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:55] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:55:57,948 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:57] "GET /api/token-monitor/stats HTTP/1.1" 200 -
2026-02-10 23:55:57,966 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:57] "GET /api/token-monitor/records HTTP/1.1" 200 -
2026-02-10 23:55:57,975 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:55:57] "GET /api/token-monitor/chart HTTP/1.1" 200 -
2026-02-10 23:56:00,145 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:00] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:56:00,162 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:00] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:00,232 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:00] "GET /api/workorders?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:00,935 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:00] "GET /api/alerts?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:02,767 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:02] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:05,157 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:05] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:08,454 - src.knowledge_base.knowledge_manager - INFO - Initializing knowledge base vectorizer...
2026-02-10 23:56:08,457 - src.knowledge_base.knowledge_manager - INFO - Vectorizer loaded successfully: processed 1 knowledge entry
2026-02-10 23:56:08,457 - src.knowledge_base.knowledge_manager - INFO - Knowledge base entry added successfully: 123...
2026-02-10 23:56:08,458 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:08] "POST /api/knowledge HTTP/1.1" 200 -
2026-02-10 23:56:08,471 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:08] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:10,143 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:10] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:56:10,170 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:10] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:10,377 - src.knowledge_base.knowledge_manager - INFO - 知识库条目验证成功: 1
2026-02-10 23:56:10,378 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:10] "POST /api/knowledge/verify/1 HTTP/1.1" 200 -
2026-02-10 23:56:10,383 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:10] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:12,315 - src.dialogue.realtime_chat - INFO - 创建新会话: session_user_001_1770738972
2026-02-10 23:56:12,316 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:12] "POST /api/chat/session HTTP/1.1" 200 -
2026-02-10 23:56:14,758 - src.knowledge_base.knowledge_manager - INFO - 搜索查询 '123' 返回 1 个结果
2026-02-10 23:56:15,156 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:15] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:16,283 - src.core.llm_client - INFO - API请求成功
2026-02-10 23:56:16,289 - src.knowledge_base.knowledge_manager - INFO - 成功更新 1 个知识库条目的使用次数
2026-02-10 23:56:16,290 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:16] "POST /api/chat/message HTTP/1.1" 200 -
2026-02-10 23:56:20,148 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:20] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:56:20,182 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:20] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:20,229 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:20] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:56:20,230 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:20] "GET /api/agent/tools/stats HTTP/1.1" 200 -
2026-02-10 23:56:21,681 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:21] "GET /api/knowledge?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:23,895 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:23] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:56:23,895 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:23] "GET /api/agent/tools/stats HTTP/1.1" 200 -
2026-02-10 23:56:25,157 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:25] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:25,188 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:25] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:56:26,989 - src.dialogue.realtime_chat - INFO - 结束会话: session_user_001_1770738972
2026-02-10 23:56:26,990 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:26] "DELETE /api/chat/session/session_user_001_1770738972 HTTP/1.1" 200 -
2026-02-10 23:56:28,275 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:28] "GET /api/token-monitor/stats HTTP/1.1" 200 -
2026-02-10 23:56:28,284 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:28] "GET /api/token-monitor/records HTTP/1.1" 200 -
2026-02-10 23:56:28,293 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:28] "GET /api/token-monitor/chart HTTP/1.1" 200 -
2026-02-10 23:56:28,859 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:28] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:30,147 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:30] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:56:30,165 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:30] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:31,366 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:31] "POST /api/conversations/migrate-merge HTTP/1.1" 200 -
2026-02-10 23:56:31,369 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:31] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:33,616 - src.web.websocket_server - INFO - 客户端断开: ('::1', 65043, 0, 0)
2026-02-10 23:56:33,619 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET / HTTP/1.1" 200 -
2026-02-10 23:56:33,642 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /static/css/design-system.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:56:33,643 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /static/css/style.css?v=1.0.0 HTTP/1.1" 304 -
2026-02-10 23:56:33,648 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /static/js/core/store.js HTTP/1.1" 304 -
2026-02-10 23:56:33,649 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /static/js/services/api.js HTTP/1.1" 304 -
2026-02-10 23:56:33,654 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /static/js/components/NotificationManager.js HTTP/1.1" 304 -
2026-02-10 23:56:33,656 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /static/js/components/AlertManager.js HTTP/1.1" 304 -
2026-02-10 23:56:33,658 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /static/js/app-new.js HTTP/1.1" 304 -
2026-02-10 23:56:33,660 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /static/js/dashboard.js?v=1.0.9 HTTP/1.1" 304 -
2026-02-10 23:56:33,661 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /static/css/design-system.css HTTP/1.1" 304 -
2026-02-10 23:56:33,715 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:56:33,717 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:56:33,721 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/rules HTTP/1.1" 200 -
2026-02-10 23:56:33,723 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:56:33,731 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:56:33,746 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/system/info HTTP/1.1" 200 -
2026-02-10 23:56:33,748 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:56:33,760 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:33,768 - websockets.server - INFO - connection open
2026-02-10 23:56:33,770 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/feishu-sync/config HTTP/1.1" 200 -
2026-02-10 23:56:33,770 - src.web.websocket_server - INFO - 客户端连接: ('::1', 65161, 0, 0)
2026-02-10 23:56:33,774 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:56:33,776 - src.web.blueprints.feishu_sync - ERROR - 获取同步状态失败: 飞书配置不完整,请先配置飞书应用信息
2026-02-10 23:56:33,780 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/feishu-sync/status HTTP/1.1" 500 -
2026-02-10 23:56:33,787 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:33,799 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:56:33,814 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:56:33,829 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:56:33,841 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:33] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:56:38,725 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:38] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:38,747 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:38] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:56:38,751 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:38] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:56:38,763 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:38] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:56:42,962 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:42] "DELETE /api/conversations/2 HTTP/1.1" 200 -
2026-02-10 23:56:42,970 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:42] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:45,054 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:45] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:56:45,071 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:45] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:50,703 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:50] "DELETE /api/conversations/5 HTTP/1.1" 200 -
2026-02-10 23:56:50,710 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:50] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:56:51,423 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:51] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:56:56,407 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:56] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:56:56,422 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:56:56] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:57:11,410 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:11] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:57:11,422 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:11] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:57:16,416 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:16] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:57:21,403 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:21] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:57:31,615 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:31] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:57:33,383 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:33] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:57:35,756 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:35] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:57:36,898 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:36] "GET /api/token-monitor/stats HTTP/1.1" 200 -
2026-02-10 23:57:36,907 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:36] "GET /api/token-monitor/records HTTP/1.1" 200 -
2026-02-10 23:57:36,915 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:36] "GET /api/token-monitor/chart HTTP/1.1" 200 -
2026-02-10 23:57:37,876 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:37] "GET /api/ai-monitor/stats HTTP/1.1" 200 -
2026-02-10 23:57:37,888 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:37] "GET /api/ai-monitor/model-comparison HTTP/1.1" 200 -
2026-02-10 23:57:37,893 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:37] "GET /api/ai-monitor/error-distribution HTTP/1.1" 200 -
2026-02-10 23:57:37,896 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:37] "GET /api/ai-monitor/error-log HTTP/1.1" 200 -
2026-02-10 23:57:38,386 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:38] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:57:39,551 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:39] "GET /api/system-optimizer/status HTTP/1.1" 200 -
2026-02-10 23:57:39,556 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:39] "GET /api/system-optimizer/security-settings HTTP/1.1" 200 -
2026-02-10 23:57:39,557 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:39] "GET /api/system-optimizer/traffic-settings HTTP/1.1" 200 -
2026-02-10 23:57:39,558 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:39] "GET /api/system-optimizer/cost-settings HTTP/1.1" 200 -
2026-02-10 23:57:40,097 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:40] "POST /api/system-optimizer/optimize-all HTTP/1.1" 200 -
2026-02-10 23:57:41,103 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:41] "GET /api/system-optimizer/status HTTP/1.1" 200 -
2026-02-10 23:57:41,110 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:41] "GET /api/system-optimizer/cost-settings HTTP/1.1" 200 -
2026-02-10 23:57:41,111 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:41] "GET /api/system-optimizer/security-settings HTTP/1.1" 200 -
2026-02-10 23:57:41,111 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:41] "GET /api/system-optimizer/traffic-settings HTTP/1.1" 200 -
2026-02-10 23:57:41,278 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:41] "POST /api/system-optimizer/clear-cache HTTP/1.1" 200 -
2026-02-10 23:57:41,614 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:41] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:57:42,286 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:42] "GET /api/system-optimizer/status HTTP/1.1" 200 -
2026-02-10 23:57:42,295 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:42] "GET /api/system-optimizer/security-settings HTTP/1.1" 200 -
2026-02-10 23:57:42,296 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:42] "GET /api/system-optimizer/traffic-settings HTTP/1.1" 200 -
2026-02-10 23:57:42,297 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:42] "GET /api/system-optimizer/cost-settings HTTP/1.1" 200 -
2026-02-10 23:57:43,379 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:43] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:57:43,397 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:43] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:57:43,409 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:43] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:57:46,560 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:46] "GET /api/ai-monitor/stats HTTP/1.1" 200 -
2026-02-10 23:57:46,570 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:46] "GET /api/ai-monitor/error-log HTTP/1.1" 200 -
2026-02-10 23:57:46,571 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:46] "GET /api/ai-monitor/model-comparison HTTP/1.1" 200 -
2026-02-10 23:57:46,572 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:46] "GET /api/ai-monitor/error-distribution HTTP/1.1" 200 -
2026-02-10 23:57:48,026 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:48] "GET /api/agent/status HTTP/1.1" 200 -
2026-02-10 23:57:48,027 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:48] "GET /api/agent/tools/stats HTTP/1.1" 200 -
2026-02-10 23:57:48,385 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:48] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:57:50,785 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:50] "GET /api/chat/sessions HTTP/1.1" 200 -
2026-02-10 23:57:50,792 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:50] "GET /api/workorders HTTP/1.1" 200 -
2026-02-10 23:57:50,793 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:50] "GET /api/alerts?per_page=1000 HTTP/1.1" 200 -
2026-02-10 23:57:50,799 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:50] "GET /api/knowledge/stats HTTP/1.1" 200 -
2026-02-10 23:57:50,806 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:50] "GET /api/analytics?days=7&dimension=performance HTTP/1.1" 200 -
2026-02-10 23:57:50,822 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:50] "GET /api/settings HTTP/1.1" 200 -
2026-02-10 23:57:50,830 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:50] "GET /api/analytics HTTP/1.1" 200 -
2026-02-10 23:57:50,840 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:50] "GET /api/analytics?timeRange=30&dimension=workorders HTTP/1.1" 200 -
2026-02-10 23:57:51,613 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:51] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:57:53,389 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:53] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:57:53,422 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:53] "GET /api/conversations?page=1&per_page=10 HTTP/1.1" 200 -
2026-02-10 23:57:55,324 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:55] "GET /api/token-monitor/stats HTTP/1.1" 200 -
2026-02-10 23:57:55,337 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:55] "GET /api/token-monitor/records HTTP/1.1" 200 -
2026-02-10 23:57:55,342 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:55] "GET /api/token-monitor/chart HTTP/1.1" 200 -
2026-02-10 23:57:56,246 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:56] "GET /api/ai-monitor/stats HTTP/1.1" 200 -
2026-02-10 23:57:56,253 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:56] "GET /api/ai-monitor/error-distribution HTTP/1.1" 200 -
2026-02-10 23:57:56,255 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:56] "GET /api/ai-monitor/model-comparison HTTP/1.1" 200 -
2026-02-10 23:57:56,256 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:56] "GET /api/ai-monitor/error-log HTTP/1.1" 200 -
2026-02-10 23:57:58,382 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:57:58] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:58:01,614 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:58:01] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:58:03,388 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:58:03] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:58:08,481 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:58:08] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:58:11,998 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:58:11] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:58:14,480 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:58:14] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:58:23,303 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:58:23] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:58:26,634 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:58:26] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:58:31,960 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:58:31] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:58:48,331 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:58:48] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:58:48,348 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:58:48] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-10 23:59:15,050 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:59:15] "GET /api/alerts HTTP/1.1" 200 -
2026-02-10 23:59:15,062 - werkzeug - INFO - 127.0.0.1 - - [10/Feb/2026 23:59:15] "GET /api/monitor/status HTTP/1.1" 200 -
2026-02-11 00:00:08,355 - werkzeug - INFO - 127.0.0.1 - - [11/Feb/2026 00:00:08] "GET /api/alerts HTTP/1.1" 200 -
2026-02-11 00:00:08,367 - werkzeug - INFO - 127.0.0.1 - - [11/Feb/2026 00:00:08] "GET /api/monitor/status HTTP/1.1" 200 -
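The access log above is plain werkzeug format, so failing endpoints (like the `/api/feishu-sync/status` 500 earlier in the log) can be tallied with standard tools. A minimal sketch, assuming the log sits at `logs/tsp_assistant.log` in the default werkzeug line format shown above:

```shell
# Tally non-2xx responses per endpoint from a werkzeug access log.
# With '"' as the field separator, $2 is the request line and $3
# starts with the status code.
awk -F'"' '$3 ~ /^ [45][0-9][0-9]/ { split($2, req, " "); print req[2] }' \
    logs/tsp_assistant.log | sort | uniq -c | sort -rn
```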

File diff suppressed because one or more lines are too long

logs/tsp_assistant.log Normal file

@@ -1,175 +0,0 @@
@echo off
chcp 65001 >nul
setlocal enabledelayedexpansion
echo 🚀 TSP智能助手 - 快速推送
echo.
:: Check Git status
git status --porcelain >nul 2>&1
if %errorlevel% neq 0 (
echo ❌ Git未初始化或不在Git仓库中
pause
exit /b 1
)
:: Check whether a commit message was passed as an argument
if "%1"=="" (
:: Generate a commit message automatically
echo 📝 分析markdown文件并生成提交信息...
:: Check for modified markdown files
set md_files=
for /f "tokens=*" %%f in ('git diff --name-only --cached 2^>nul ^| findstr /i "\.md$"') do (
set md_files=!md_files! %%f
)
for /f "tokens=*" %%f in ('git diff --name-only 2^>nul ^| findstr /i "\.md$"') do (
set md_files=!md_files! %%f
)
set commit_msg=
if not "!md_files!"=="" (
echo 📄 检测到markdown文件修改: !md_files!
:: Derive a title and type from the markdown content
set commit_title=
set commit_type=docs
:: Look for fix-related keywords
for %%f in (%md_files%) do (
if exist "%%f" (
for /f "tokens=*" %%l in ('type "%%f" ^| findstr /i "修复\|解决\|问题\|错误"') do (
set commit_type=fix
set commit_title=修复问题
goto :found_fix
)
)
)
:: Look for feature-related keywords
for %%f in (%md_files%) do (
if exist "%%f" (
for /f "tokens=*" %%l in ('type "%%f" ^| findstr /i "功能\|新增\|添加\|实现"') do (
set commit_type=feat
set commit_title=新增功能
goto :found_feature
)
)
)
:: Look for optimization-related keywords
for %%f in (%md_files%) do (
if exist "%%f" (
for /f "tokens=*" %%l in ('type "%%f" ^| findstr /i "优化\|性能\|改进\|提升"') do (
set commit_type=perf
set commit_title=性能优化
goto :found_optimization
)
)
)
:: Extract the first markdown heading as the title
for %%f in (%md_files%) do (
if exist "%%f" (
for /f "tokens=*" %%l in ('type "%%f" ^| findstr /n "^#"') do (
set line=%%l
set line=!line:*:=!
set line=!line:## =!
set line=!line:# =!
if "!line!" neq "" (
set commit_title=!line!
goto :found_title
)
)
)
)
:found_fix
:found_feature
:found_optimization
:found_title
if "!commit_title!"=="" (
set commit_title=更新文档记录
)
:: Build the commit message
set commit_msg=!commit_type!: !commit_title!
) else (
echo 没有检测到markdown文件修改
set commit_msg=feat: 快速提交 - %date% %time%
)
) else (
set commit_msg=%1
)
echo 📝 提交信息: %commit_msg%
echo.
:: Check for changes to commit (including untracked files)
git diff --quiet
set has_unstaged=%errorlevel%
git diff --cached --quiet
set has_staged=%errorlevel%
set has_untracked=0
for /f "delims=" %%f in ('git ls-files --others --exclude-standard') do set has_untracked=1
if %has_unstaged% equ 0 if %has_staged% equ 0 if %has_untracked% equ 0 (
echo 没有检测到任何更改,无需提交
echo.
echo ✅ 工作区干净,无需推送
pause
exit /b 0
)
:: Commit and push
echo.
echo 📤 开始推送流程...
echo 📝 提交信息: %commit_msg%
git add .
if %errorlevel% neq 0 (
echo ❌ 添加文件失败
pause
exit /b 1
)
git commit -m "%commit_msg%"
if %errorlevel% neq 0 (
echo ❌ 提交失败
pause
exit /b 1
)
git fetch origin main
git push origin main
if %errorlevel% equ 0 (
echo.
echo ✅ 推送完成!
echo 📊 最新提交:
git log --oneline -1
) else (
echo.
echo ❌ 推送失败,尝试自动解决...
echo 🔄 执行: git pull origin main --rebase
git pull origin main --rebase
if %errorlevel% equ 0 (
echo ✅ 重试推送...
git push origin main
if %errorlevel% equ 0 (
echo ✅ 推送成功!
echo 📊 最新提交:
git log --oneline -1
) else (
echo ❌ 重试推送失败,请手动处理
)
) else (
echo ❌ 自动rebase失败请手动处理冲突后重试
)
)
echo.
pause
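The keyword heuristic above (scan the changed markdown for fix, feature, or optimization terms, else default to docs) can be restated compactly in POSIX sh; `classify_md` is a hypothetical helper name, and the keyword lists mirror the `findstr` patterns in the batch script:

```shell
#!/bin/sh
# Sketch of the batch script's commit-type heuristic in POSIX sh.
# classify_md is a hypothetical helper; the Chinese keywords are the
# same fix / feature / optimization terms the batch script greps for.
classify_md() {
    if   grep -qE '修复|解决|问题|错误' "$1"; then echo fix
    elif grep -qE '功能|新增|添加|实现' "$1"; then echo feat
    elif grep -qE '优化|性能|改进|提升' "$1"; then echo perf
    else echo docs
    fi
}
```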

@@ -1,306 +0,0 @@
#!/bin/bash
# TSP Assistant deployment script
# Supports multi-environment deploys, version management, and automatic backups
set -e  # exit immediately on error
# 颜色定义
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color
# 日志函数
log_info() {
echo -e "${GREEN}[INFO]${NC} $1"
}
log_warn() {
echo -e "${YELLOW}[WARN]${NC} $1"
}
log_error() {
echo -e "${RED}[ERROR]${NC} $1"
}
# 检查依赖
check_dependencies() {
log_info "检查系统依赖..."
# 检查Python
if ! command -v python3 &> /dev/null; then
log_error "Python3 未安装"
exit 1
fi
# 检查pip
if ! command -v pip3 &> /dev/null; then
log_error "pip3 未安装"
exit 1
fi
# 检查Git
if ! command -v git &> /dev/null; then
log_error "Git 未安装"
exit 1
fi
log_info "依赖检查完成"
}
# 创建虚拟环境
setup_venv() {
local venv_path=$1
log_info "创建虚拟环境: $venv_path"
if [ ! -d "$venv_path" ]; then
python3 -m venv "$venv_path"
fi
source "$venv_path/bin/activate"
pip install --upgrade pip
log_info "虚拟环境设置完成"
}
# 安装依赖
install_dependencies() {
log_info "安装Python依赖..."
pip install -r requirements.txt
log_info "依赖安装完成"
}
# 数据库迁移
run_migrations() {
log_info "运行数据库迁移..."
# 检查数据库文件
if [ ! -f "tsp_assistant.db" ]; then
log_info "初始化数据库..."
python init_database.py
fi
log_info "数据库迁移完成"
}
# 创建systemd服务文件
create_systemd_service() {
local service_name=$1
local app_path=$2
local service_file="/etc/systemd/system/${service_name}.service"
log_info "创建systemd服务文件: $service_file"
sudo tee "$service_file" > /dev/null <<EOF
[Unit]
Description=TSP智能助手服务
After=network.target
[Service]
Type=simple
User=www-data
Group=www-data
WorkingDirectory=$app_path
Environment=PATH=$app_path/venv/bin
ExecStart=$app_path/venv/bin/python start_dashboard.py
Restart=always
RestartSec=10
[Install]
WantedBy=multi-user.target
EOF
sudo systemctl daemon-reload
sudo systemctl enable "$service_name"
log_info "systemd服务创建完成"
}
# 创建nginx配置
create_nginx_config() {
local domain=$1
local app_port=$2
local app_path=${3:-$DEPLOY_PATH}  # path used by the /static alias below
local config_file="/etc/nginx/sites-available/tsp_assistant"
log_info "创建nginx配置: $config_file"
sudo tee "$config_file" > /dev/null <<EOF
server {
listen 80;
server_name $domain;
location / {
proxy_pass http://127.0.0.1:$app_port;
proxy_set_header Host \$host;
proxy_set_header X-Real-IP \$remote_addr;
proxy_set_header X-Forwarded-For \$proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto \$scheme;
}
location /static {
alias $app_path/src/web/static;
expires 1y;
add_header Cache-Control "public, immutable";
}
}
EOF
# 启用站点
sudo ln -sf "$config_file" /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl reload nginx
log_info "nginx配置完成"
}
# 主部署函数
deploy() {
local environment=${1:-production}
local domain=${2:-localhost}
local app_port=${3:-5000}
log_info "开始部署TSP智能助手到 $environment 环境"
# 设置部署路径
case $environment in
development)
DEPLOY_PATH="./dev_deploy"
SERVICE_NAME=""
;;
staging)
DEPLOY_PATH="/opt/tsp_assistant_staging"
SERVICE_NAME="tsp_assistant_staging"
;;
production)
DEPLOY_PATH="/opt/tsp_assistant"
SERVICE_NAME="tsp_assistant"
;;
*)
log_error "未知环境: $environment"
exit 1
;;
esac
# 检查依赖
check_dependencies
# 创建部署目录
log_info "创建部署目录: $DEPLOY_PATH"
sudo mkdir -p "$DEPLOY_PATH"
sudo chown $USER:$USER "$DEPLOY_PATH"
# 复制文件
log_info "复制应用文件..."
cp -r . "$DEPLOY_PATH/"
cd "$DEPLOY_PATH"
# 设置虚拟环境
setup_venv "venv"
# 安装依赖
install_dependencies
# 运行迁移
run_migrations
# 创建服务文件(非开发环境)
if [ "$environment" != "development" ] && [ -n "$SERVICE_NAME" ]; then
create_systemd_service "$SERVICE_NAME" "$DEPLOY_PATH"
create_nginx_config "$domain" "$app_port" "$DEPLOY_PATH"
fi
log_info "部署完成!"
if [ "$environment" != "development" ]; then
log_info "启动服务..."
sudo systemctl start "$SERVICE_NAME"
sudo systemctl status "$SERVICE_NAME"
else
log_info "开发环境部署完成,使用以下命令启动:"
log_info "cd $DEPLOY_PATH && source venv/bin/activate && python start_dashboard.py"
fi
}
# 回滚函数
rollback() {
local backup_name=$1
if [ -z "$backup_name" ]; then
log_error "请指定备份名称"
exit 1
fi
log_info "回滚到备份: $backup_name"
# 停止服务
sudo systemctl stop tsp_assistant
# 恢复备份
if [ -d "backups/$backup_name" ]; then
sudo rm -rf /opt/tsp_assistant
sudo cp -r "backups/$backup_name" /opt/tsp_assistant
sudo chown -R www-data:www-data /opt/tsp_assistant
# 重启服务
sudo systemctl start tsp_assistant
log_info "回滚完成"
else
log_error "备份不存在: $backup_name"
exit 1
fi
}
# 版本检查
check_version() {
log_info "检查版本信息..."
if [ -f "version.json" ]; then
local version=$(python3 -c "import json; print(json.load(open('version.json'))['version'])" 2>/dev/null || echo "unknown")
log_info "当前版本: $version"
else
log_warn "版本文件不存在"
fi
}
# 创建部署包
create_deployment_package() {
local package_name="tsp_assistant_$(date +%Y%m%d_%H%M%S).tar.gz"
log_info "创建部署包: $package_name"
# 排除不需要的文件
tar --exclude='.git' \
--exclude='__pycache__' \
--exclude='*.pyc' \
--exclude='.env' \
--exclude='logs/*' \
--exclude='backups/*' \
--exclude='dev_deploy' \
-czf "$package_name" .
log_info "部署包创建完成: $package_name"
echo "$package_name"
}
# 主函数
main() {
case ${1:-deploy} in
deploy)
check_version
deploy "$2" "$3" "$4"
;;
rollback)
rollback "$2"
;;
package)
create_deployment_package
;;
*)
echo "用法: $0 {deploy|rollback|package} [environment] [domain] [port]"
echo "环境: development, staging, production"
echo ""
echo "命令说明:"
echo " deploy - 部署到指定环境"
echo " rollback - 回滚到指定备份"
echo " package - 创建部署包"
exit 1
;;
esac
}
main "$@"
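`check_version` above shells out to Python to read `version.json`; the same lookup can be exercised on its own. A sketch, assuming `python3` is on PATH and `version.json` sits in the working directory:

```shell
# Read the "version" key from version.json, as check_version() does,
# falling back to "unknown" when the file or key is missing.
version=$(python3 -c "import json; print(json.load(open('version.json'))['version'])" 2>/dev/null || echo "unknown")
echo "current version: $version"
```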

@@ -1,204 +0,0 @@
#!/bin/bash
# TSP Assistant Docker deployment script
set -e
# 颜色定义
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color
# 日志函数
log_info() {
echo -e "${BLUE}[INFO]${NC} $1"
}
log_success() {
echo -e "${GREEN}[SUCCESS]${NC} $1"
}
log_warning() {
echo -e "${YELLOW}[WARNING]${NC} $1"
}
log_error() {
echo -e "${RED}[ERROR]${NC} $1"
}
# 检查Docker和Docker Compose
check_dependencies() {
log_info "检查依赖..."
if ! command -v docker &> /dev/null; then
log_error "Docker未安装请先安装Docker"
exit 1
fi
if ! command -v docker-compose &> /dev/null; then
log_error "Docker Compose未安装请先安装Docker Compose"
exit 1
fi
log_success "依赖检查通过"
}
# 创建必要的目录
create_directories() {
log_info "创建必要的目录..."
mkdir -p logs/nginx
mkdir -p monitoring/grafana/provisioning/datasources
mkdir -p monitoring/grafana/provisioning/dashboards
mkdir -p ssl
mkdir -p data
mkdir -p backups
mkdir -p uploads
mkdir -p config
log_success "目录创建完成"
}
# 构建镜像
build_images() {
log_info "构建Docker镜像..."
# 构建主应用镜像
docker-compose build --no-cache tsp-assistant
log_success "镜像构建完成"
}
# 启动服务
start_services() {
log_info "启动服务..."
# 启动基础服务MySQL, Redis
docker-compose up -d mysql redis
# 等待数据库启动
log_info "等待数据库启动..."
sleep 30
# 启动主应用
docker-compose up -d tsp-assistant
# 启动其他服务
docker-compose up -d nginx prometheus grafana
log_success "服务启动完成"
}
# 检查服务状态
check_services() {
log_info "检查服务状态..."
sleep 10
# 检查主应用
if curl -f http://localhost:5000/api/health &> /dev/null; then
log_success "TSP助手服务正常"
else
log_warning "TSP助手服务可能未完全启动"
fi
# 检查Nginx
if curl -f http://localhost/health &> /dev/null; then
log_success "Nginx服务正常"
else
log_warning "Nginx服务可能未完全启动"
fi
# 检查Prometheus
if curl -f http://localhost:9090 &> /dev/null; then
log_success "Prometheus服务正常"
else
log_warning "Prometheus服务可能未完全启动"
fi
# 检查Grafana
if curl -f http://localhost:3000 &> /dev/null; then
log_success "Grafana服务正常"
else
log_warning "Grafana服务可能未完全启动"
fi
}
# 显示服务信息
show_info() {
log_info "服务访问信息:"
echo " TSP助手: http://localhost:5000"
echo " Nginx代理: http://localhost"
echo " Prometheus: http://localhost:9090"
echo " Grafana: http://localhost:3000 (admin/admin123456)"
echo " MySQL: localhost:3306 (root/root123456)"
echo " Redis: localhost:6379 (密码: redis123456)"
echo ""
log_info "查看日志命令:"
echo " docker-compose logs -f tsp-assistant"
echo " docker-compose logs -f mysql"
echo " docker-compose logs -f redis"
echo " docker-compose logs -f nginx"
}
# 停止服务
stop_services() {
log_info "停止服务..."
docker-compose down
log_success "服务已停止"
}
# 清理资源
cleanup() {
log_info "清理Docker资源..."
docker system prune -f
log_success "清理完成"
}
# 主函数
main() {
case "${1:-start}" in
"start")
check_dependencies
create_directories
build_images
start_services
check_services
show_info
;;
"stop")
stop_services
;;
"restart")
stop_services
sleep 5
start_services
check_services
show_info
;;
"cleanup")
stop_services
cleanup
;;
"logs")
docker-compose logs -f "${2:-tsp-assistant}"
;;
"status")
docker-compose ps
;;
*)
echo "用法: $0 {start|stop|restart|cleanup|logs|status}"
echo " start - 启动所有服务"
echo " stop - 停止所有服务"
echo " restart - 重启所有服务"
echo " cleanup - 清理Docker资源"
echo " logs - 查看日志 (可选指定服务名)"
echo " status - 查看服务状态"
exit 1
;;
esac
}
# 执行主函数
main "$@"
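`start_services` above waits a flat 30 seconds for MySQL before continuing. A bounded readiness poll is usually more robust; `wait_until` below is a hypothetical helper sketching that pattern (retry a command once per second until it succeeds or the attempt budget runs out):

```shell
#!/bin/sh
# Retry "$@" until it succeeds or `attempts` tries are exhausted.
# Could stand in for the fixed `sleep 30` in start_services, e.g.:
#   wait_until 60 docker-compose exec -T mysql mysqladmin ping
wait_until() {
    attempts=$1; shift
    n=0
    while [ "$n" -lt "$attempts" ]; do
        "$@" >/dev/null 2>&1 && return 0
        n=$((n + 1))
        sleep 1
    done
    return 1
}
```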

@@ -1,277 +0,0 @@
#!/bin/bash
# TSP Assistant monitoring script
# Configuration
APP_NAME="tsp_assistant"
SERVICE_NAME="tsp_assistant"
HEALTH_URL="http://localhost:5000/api/health"
LOG_FILE="./logs/monitor.log"
ALERT_EMAIL="admin@example.com"
ALERT_PHONE="13800138000"
# 颜色定义
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m'
# 日志函数
log_info() {
echo -e "${GREEN}[$(date '+%Y-%m-%d %H:%M:%S')] INFO${NC} $1" | tee -a "$LOG_FILE"
}
log_warn() {
echo -e "${YELLOW}[$(date '+%Y-%m-%d %H:%M:%S')] WARN${NC} $1" | tee -a "$LOG_FILE"
}
log_error() {
echo -e "${RED}[$(date '+%Y-%m-%d %H:%M:%S')] ERROR${NC} $1" | tee -a "$LOG_FILE"
}
# 发送告警
send_alert() {
local message=$1
local level=$2
log_error "告警: $message"
# 发送邮件告警
if command -v mail &> /dev/null; then
echo "$message" | mail -s "[$level] TSP助手告警" "$ALERT_EMAIL"
fi
# 发送短信告警(需要配置短信服务)
# curl -X POST "https://api.sms.com/send" \
# -d "phone=$ALERT_PHONE" \
# -d "message=$message"
}
# 检查服务状态
check_service_status() {
if systemctl is-active --quiet "$SERVICE_NAME"; then
return 0
else
return 1
fi
}
# 检查健康状态
check_health() {
local response_code
response_code=$(curl -s -o /dev/null -w "%{http_code}" "$HEALTH_URL" 2>/dev/null)
if [ "$response_code" = "200" ]; then
return 0
else
return 1
fi
}
# 检查响应时间
check_response_time() {
local response_time
response_time=$(curl -s -o /dev/null -w "%{time_total}" "$HEALTH_URL" 2>/dev/null)
# 响应时间超过5秒认为异常
if (( $(echo "$response_time > 5.0" | bc -l) )); then
return 1
else
return 0
fi
}
# 检查系统资源
check_system_resources() {
local cpu_usage
local memory_usage
local disk_usage
# CPU使用率
cpu_usage=$(top -bn1 | grep "Cpu(s)" | awk '{print $2}' | awk -F'%' '{print $1}')
# 内存使用率
memory_usage=$(free | grep Mem | awk '{printf "%.2f", $3/$2 * 100.0}')
# 磁盘使用率
disk_usage=$(df -h / | awk 'NR==2 {print $5}' | sed 's/%//')
# 检查阈值
if (( $(echo "$cpu_usage > 80" | bc -l) )); then
send_alert "CPU使用率过高: ${cpu_usage}%" "HIGH"
fi
if (( $(echo "$memory_usage > 80" | bc -l) )); then
send_alert "内存使用率过高: ${memory_usage}%" "HIGH"
fi
if [ "$disk_usage" -gt 80 ]; then
send_alert "磁盘使用率过高: ${disk_usage}%" "HIGH"
fi
log_info "系统资源 - CPU: ${cpu_usage}%, 内存: ${memory_usage}%, 磁盘: ${disk_usage}%"
}
# 检查日志错误
check_log_errors() {
local log_file="./logs/tsp_assistant.log"
local error_count
if [ -f "$log_file" ]; then
# 统计日志末尾100行中的错误条数
error_count=$(tail -n 100 "$log_file" | grep -c "ERROR" || true)
if [ "$error_count" -gt 10 ]; then
send_alert "最近错误日志过多: $error_count" "MEDIUM"
fi
fi
}
# Check the database connection
check_database() {
local db_file="./tsp_assistant.db"
if [ -f "$db_file" ]; then
# Check the database file size
local db_size
db_size=$(du -h "$db_file" | cut -f1)
log_info "数据库大小: $db_size"
# Check that the database is readable
if ! sqlite3 "$db_file" "SELECT 1;" > /dev/null 2>&1; then
send_alert "数据库连接失败" "CRITICAL"
return 1
fi
fi
return 0
}
# Automatically restart the service
restart_service() {
log_warn "尝试重启服务..."
sudo systemctl restart "$SERVICE_NAME"
sleep 10
if check_service_status && check_health; then
log_info "服务重启成功"
return 0
else
log_error "服务重启失败"
return 1
fi
}
# Main monitoring loop
monitor_loop() {
local consecutive_failures=0
local max_failures=3
while true; do
log_info "开始监控检查..."
# Check service status
if ! check_service_status; then
log_error "服务未运行"
send_alert "TSP助手服务未运行" "CRITICAL"
consecutive_failures=$((consecutive_failures + 1))
else
# Check health status
if ! check_health; then
log_error "健康检查失败"
send_alert "TSP助手健康检查失败" "HIGH"
consecutive_failures=$((consecutive_failures + 1))
else
# Check response time
if ! check_response_time; then
log_warn "响应时间过长"
send_alert "TSP助手响应时间过长" "MEDIUM"
fi
consecutive_failures=0
fi
fi
# Check system resources
check_system_resources
# Check log errors
check_log_errors
# Check the database
check_database
# Handle consecutive failures
if [ "$consecutive_failures" -ge "$max_failures" ]; then
log_error "连续失败次数达到阈值,尝试重启服务"
if restart_service; then
consecutive_failures=0
else
send_alert "TSP助手服务重启失败需要人工干预" "CRITICAL"
fi
fi
# Wait for the next check
sleep 60
done
}
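The escalation policy in `monitor_loop` — alert on every failure, auto-restart after three consecutive failures, reset the counter on success — can be sketched as a single-step state transition; the `restart` callable is injected here purely for illustration:

```python
from typing import Callable, Tuple

def monitor_step(healthy: bool, consecutive_failures: int,
                 restart: Callable[[], bool], max_failures: int = 3) -> Tuple[int, str]:
    """Advance the monitor state by one check.

    Returns (new_failure_count, action), where action is one of
    'ok', 'alert', 'restarted', or 'escalate' (restart failed).
    """
    if healthy:
        return 0, "ok"
    consecutive_failures += 1
    if consecutive_failures < max_failures:
        return consecutive_failures, "alert"
    # Threshold reached: attempt an automatic restart.
    if restart():
        return 0, "restarted"
    return consecutive_failures, "escalate"
```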
# One-off check
single_check() {
log_info "执行一次性健康检查..."
if check_service_status; then
log_info "✓ 服务运行正常"
else
log_error "✗ 服务未运行"
exit 1
fi
if check_health; then
log_info "✓ 健康检查通过"
else
log_error "✗ 健康检查失败"
exit 1
fi
if check_response_time; then
log_info "✓ 响应时间正常"
else
log_warn "⚠ 响应时间过长"
fi
check_system_resources
check_log_errors
check_database
log_info "健康检查完成"
}
# Main function
main() {
# Create the log directory
mkdir -p logs
case ${1:-monitor} in
monitor)
log_info "启动TSP助手监控服务..."
monitor_loop
;;
check)
single_check
;;
restart)
restart_service
;;
*)
echo "用法: $0 {monitor|check|restart}"
echo " monitor - 持续监控模式"
echo " check - 一次性健康检查"
echo " restart - 重启服务"
exit 1
;;
esac
}
# Run the main function
main "$@"


@@ -1,285 +0,0 @@
@echo off
REM TSP assistant quick-update script (Windows)
REM Supports hot updates and full updates
setlocal enabledelayedexpansion
REM Color definitions
set "GREEN=[32m"
set "YELLOW=[33m"
set "RED=[31m"
set "NC=[0m"
REM Configuration variables
set "APP_NAME=tsp_assistant"
set "DEPLOY_PATH=."
set "BACKUP_PATH=.\backups"
set "HEALTH_URL=http://localhost:5000/api/health"
REM Parse arguments
set "ACTION=%1"
set "SOURCE_PATH=%2"
set "ENVIRONMENT=%3"
if "%ACTION%"=="" (
echo 用法: %0 {check^|hot-update^|full-update^|auto-update^|rollback} [源路径] [环境]
echo.
echo 命令说明:
echo check - 检查更新可用性
echo hot-update - 热更新(不重启服务)
echo full-update - 完整更新(重启服务)
echo auto-update - 自动更新(智能选择)
echo rollback - 回滚到指定备份
echo.
echo 环境: development, staging, production
exit /b 1
)
if "%ENVIRONMENT%"=="" set "ENVIRONMENT=production"
if "%SOURCE_PATH%"=="" set "SOURCE_PATH=."
REM Logging functions
:log_info
echo %GREEN%[INFO]%NC% %~1
goto :eof
:log_warn
echo %YELLOW%[WARN]%NC% %~1
goto :eof
:log_error
echo %RED%[ERROR]%NC% %~1
goto :eof
REM Check whether an update is available
:check_update
call :log_info "检查更新可用性..."
if not exist "%SOURCE_PATH%\version.json" (
call :log_error "源路径中未找到版本文件"
exit /b 1
)
REM Compare versions
for /f "tokens=2 delims=:" %%a in ('findstr "version" "%SOURCE_PATH%\version.json"') do (
set "NEW_VERSION=%%a"
set "NEW_VERSION=!NEW_VERSION: =!"
set "NEW_VERSION=!NEW_VERSION:"=!"
set "NEW_VERSION=!NEW_VERSION:,=!"
)
if exist "version.json" (
for /f "tokens=2 delims=:" %%a in ('findstr "version" "version.json"') do (
set "CURRENT_VERSION=%%a"
set "CURRENT_VERSION=!CURRENT_VERSION: =!"
set "CURRENT_VERSION=!CURRENT_VERSION:"=!"
set "CURRENT_VERSION=!CURRENT_VERSION:,=!"
)
) else (
set "CURRENT_VERSION=unknown"
)
if "!NEW_VERSION!"=="!CURRENT_VERSION!" (
call :log_info "没有更新可用 (当前版本: !CURRENT_VERSION!)"
) else (
call :log_info "发现更新: !CURRENT_VERSION! -> !NEW_VERSION!"
)
goto :eof
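Note that `:check_update` treats any version-string inequality as an update, so a downgrade would also be reported as one. A numeric comparison avoids that; a sketch (function names are illustrative, and plain dotted-integer versions are assumed):

```python
def parse_version(v: str) -> tuple:
    """Parse '1.2.3' into (1, 2, 3); non-numeric parts raise ValueError."""
    return tuple(int(part) for part in v.strip().split("."))

def is_newer(current: str, candidate: str) -> bool:
    """True only if candidate is strictly newer than current."""
    return parse_version(candidate) > parse_version(current)
```

Tuple comparison also handles multi-digit components correctly, where a string comparison would rank "1.10.0" below "1.2.0".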
REM Create a backup
:create_backup
set "TIMESTAMP=%date:~0,4%%date:~5,2%%date:~8,2%_%time:~0,2%%time:~3,2%%time:~6,2%"
set "TIMESTAMP=!TIMESTAMP: =0!"
set "BACKUP_NAME=%APP_NAME%_backup_!TIMESTAMP!"
call :log_info "创建备份: !BACKUP_NAME!"
if not exist "%BACKUP_PATH%" mkdir "%BACKUP_PATH%"
mkdir "%BACKUP_PATH%\!BACKUP_NAME!"
REM Back up application files
if exist "%DEPLOY_PATH%" (
call :log_info "备份应用文件..."
xcopy "%DEPLOY_PATH%\*" "%BACKUP_PATH%\!BACKUP_NAME!\" /E /I /Y
)
REM Back up the database
if exist "%DEPLOY_PATH%\tsp_assistant.db" (
call :log_info "备份数据库..."
mkdir "%BACKUP_PATH%\!BACKUP_NAME!\database"
copy "%DEPLOY_PATH%\tsp_assistant.db" "%BACKUP_PATH%\!BACKUP_NAME!\database\"
)
call :log_info "备份完成: !BACKUP_NAME!"
echo !BACKUP_NAME!
goto :eof
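The `%date%`/`%time%` substring slicing above is fragile: the field order and padding depend on the Windows locale settings. The Python update manager later in this commit builds the same `<app>_backup_YYYYMMDD_HHMMSS` name portably; roughly:

```python
from datetime import datetime

def backup_name(app_name: str, now=None) -> str:
    """Build '<app>_backup_YYYYMMDD_HHMMSS', as create_backup does."""
    now = now or datetime.now()
    return f"{app_name}_backup_{now.strftime('%Y%m%d_%H%M%S')}"
```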
REM Hot update
:hot_update
call :log_info "开始热更新..."
REM Files eligible for hot update
set "HOT_UPDATE_FILES=src\web\static\js\dashboard.js src\web\static\css\style.css src\web\templates\dashboard.html src\web\app.py"
set "UPDATED_COUNT=0"
for %%f in (%HOT_UPDATE_FILES%) do (
if exist "%SOURCE_PATH%\%%f" (
call :log_info "更新文件: %%f"
REM A plain copy cannot create missing parent folders, and mkdir on the file's
REM own path would create a directory named like the file; xcopy creates the
REM intermediate directories, with the piped "f" answering its file-or-directory prompt
echo f | xcopy "%SOURCE_PATH%\%%f" "%DEPLOY_PATH%\%%f" /Y >nul
set /a UPDATED_COUNT+=1
)
)
if !UPDATED_COUNT! gtr 0 (
call :log_info "热更新完成,更新了 !UPDATED_COUNT! 个文件"
) else (
call :log_info "没有文件需要热更新"
)
goto :eof
REM Full update
:full_update
call :log_info "开始完整更新..."
REM Create a backup
call :create_backup
set "BACKUP_NAME=!BACKUP_NAME!"
REM Stop the service (if it is running)
call :log_info "停止服务..."
taskkill /f /im python.exe 2>nul || echo 服务未运行
REM Update files
call :log_info "更新应用文件..."
if exist "%DEPLOY_PATH%" rmdir /s /q "%DEPLOY_PATH%"
mkdir "%DEPLOY_PATH%"
xcopy "%SOURCE_PATH%\*" "%DEPLOY_PATH%\" /E /I /Y
REM Install dependencies
call :log_info "安装依赖..."
cd "%DEPLOY_PATH%"
if exist "requirements.txt" (
pip install -r requirements.txt
)
REM Run database migrations
call :log_info "运行数据库迁移..."
if exist "init_database.py" (
python init_database.py
)
REM Start the service
call :log_info "启动服务..."
start /b python start_dashboard.py
REM Wait for the service to start
call :log_info "等待服务启动..."
timeout /t 15 /nobreak >nul
REM Health check
call :log_info "执行健康检查..."
set "RETRY_COUNT=0"
set "MAX_RETRIES=10"
:health_check_loop
if !RETRY_COUNT! geq !MAX_RETRIES! (
call :log_error "健康检查失败,开始回滚..."
call :rollback !BACKUP_NAME!
exit /b 1
)
curl -f "%HEALTH_URL%" >nul 2>&1
if !errorlevel! equ 0 (
call :log_info "健康检查通过!"
call :log_info "更新成功!"
call :log_info "备份名称: !BACKUP_NAME!"
exit /b 0
) else (
call :log_warn "健康检查失败,重试中... (!RETRY_COUNT!/!MAX_RETRIES!)"
set /a RETRY_COUNT+=1
timeout /t 5 /nobreak >nul
goto :health_check_loop
)
REM Rollback
:rollback
set "BACKUP_NAME=%1"
if "%BACKUP_NAME%"=="" (
call :log_error "请指定备份名称"
exit /b 1
)
call :log_info "开始回滚到备份: !BACKUP_NAME!"
if not exist "%BACKUP_PATH%\!BACKUP_NAME!" (
call :log_error "备份不存在: !BACKUP_NAME!"
exit /b 1
)
REM Stop the service
call :log_info "停止服务..."
taskkill /f /im python.exe 2>nul || echo 服务未运行
REM Restore files
call :log_info "恢复文件..."
if exist "%DEPLOY_PATH%" rmdir /s /q "%DEPLOY_PATH%"
mkdir "%DEPLOY_PATH%"
xcopy "%BACKUP_PATH%\!BACKUP_NAME!\*" "%DEPLOY_PATH%\" /E /I /Y
REM Restore the database
if exist "%BACKUP_PATH%\!BACKUP_NAME!\database\tsp_assistant.db" (
call :log_info "恢复数据库..."
copy "%BACKUP_PATH%\!BACKUP_NAME!\database\tsp_assistant.db" "%DEPLOY_PATH%\"
)
REM Start the service
call :log_info "启动服务..."
cd "%DEPLOY_PATH%"
start /b python start_dashboard.py
REM Wait for the service to start
timeout /t 15 /nobreak >nul
REM Health check
curl -f "%HEALTH_URL%" >nul 2>&1
if !errorlevel! equ 0 (
call :log_info "回滚成功!"
) else (
call :log_error "回滚后健康检查失败"
exit /b 1
)
goto :eof
REM Auto update
:auto_update
call :log_info "开始自动更新..."
REM Try a hot update first
call :hot_update
if !errorlevel! equ 0 (
call :log_info "热更新成功"
exit /b 0
)
REM Hot update failed; fall back to a full update
call :log_info "热更新失败,进行完整更新..."
call :full_update
goto :eof
REM Main logic
if "%ACTION%"=="check" (
call :check_update
) else if "%ACTION%"=="hot-update" (
call :hot_update
) else if "%ACTION%"=="full-update" (
call :full_update
) else if "%ACTION%"=="auto-update" (
call :auto_update
) else if "%ACTION%"=="rollback" (
call :rollback "%SOURCE_PATH%"
) else (
call :log_error "未知操作: %ACTION%"
exit /b 1
)
endlocal


@@ -1,477 +0,0 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
TSP智能助手更新管理器
支持热更新、版本管理、回滚等功能
"""
import os
import sys
import json
import shutil
import subprocess
import time
import requests
from datetime import datetime
from pathlib import Path
from typing import Dict, List, Optional, Tuple
class UpdateManager:
"""更新管理器"""
def __init__(self, config_file: str = "update_config.json"):
self.config_file = config_file
self.config = self._load_config()
self.version_manager = None
# Initialize the version manager
try:
from version import VersionManager
self.version_manager = VersionManager()
except ImportError:
print("警告: 版本管理器不可用")
def _load_config(self) -> Dict:
"""加载更新配置"""
default_config = {
"app_name": "tsp_assistant",
"deploy_path": "/opt/tsp_assistant",
"backup_path": "./backups",
"service_name": "tsp_assistant",
"health_url": "http://localhost:5000/api/health",
"update_timeout": 300,
"rollback_enabled": True,
"auto_backup": True,
"hot_update_enabled": True,
"environments": {
"development": {
"path": "./dev_deploy",
"service_name": "",
"auto_restart": False
},
"staging": {
"path": "/opt/tsp_assistant_staging",
"service_name": "tsp_assistant_staging",
"auto_restart": True
},
"production": {
"path": "/opt/tsp_assistant",
"service_name": "tsp_assistant",
"auto_restart": True
}
}
}
if os.path.exists(self.config_file):
try:
with open(self.config_file, 'r', encoding='utf-8') as f:
config = json.load(f)
# Merge into the default configuration
default_config.update(config)
except Exception as e:
print(f"加载配置文件失败: {e}")
return default_config
def _save_config(self):
"""保存配置"""
try:
with open(self.config_file, 'w', encoding='utf-8') as f:
json.dump(self.config, f, indent=2, ensure_ascii=False)
except Exception as e:
print(f"保存配置文件失败: {e}")
def check_update_available(self, source_path: str) -> Tuple[bool, str, str]:
"""检查是否有更新可用"""
if not self.version_manager:
return False, "unknown", "unknown"
current_version = self.version_manager.get_version()
# Check the version at the source path
try:
source_version_file = os.path.join(source_path, "version.json")
if os.path.exists(source_version_file):
with open(source_version_file, 'r', encoding='utf-8') as f:
source_info = json.load(f)
source_version = source_info.get("version", "unknown")
else:
return False, current_version, "unknown"
except Exception as e:
print(f"检查源版本失败: {e}")
return False, current_version, "unknown"
# Compare versions
if source_version != current_version:
return True, current_version, source_version
return False, current_version, source_version
def create_backup(self, environment: str = "production") -> str:
"""创建备份"""
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
backup_name = f"{self.config['app_name']}_backup_{timestamp}"
backup_path = os.path.join(self.config["backup_path"], backup_name)
print(f"创建备份: {backup_name}")
# Create the backup directory
os.makedirs(backup_path, exist_ok=True)
# Get the deployment path
env_config = self.config["environments"].get(environment, {})
deploy_path = env_config.get("path", self.config["deploy_path"])
# Back up application files
if os.path.exists(deploy_path):
print("备份应用文件...")
shutil.copytree(deploy_path, os.path.join(backup_path, "app"))
# Back up the database
db_file = os.path.join(deploy_path, "tsp_assistant.db")
if os.path.exists(db_file):
print("备份数据库...")
os.makedirs(os.path.join(backup_path, "database"), exist_ok=True)
shutil.copy2(db_file, os.path.join(backup_path, "database", "tsp_assistant.db"))
# Save backup metadata
backup_info = {
"backup_name": backup_name,
"backup_path": backup_path,
"timestamp": timestamp,
"environment": environment,
"version": self.version_manager.get_version() if self.version_manager else "unknown",
"git_commit": self._get_git_commit(deploy_path)
}
with open(os.path.join(backup_path, "backup_info.json"), 'w', encoding='utf-8') as f:
json.dump(backup_info, f, indent=2, ensure_ascii=False)
print(f"备份完成: {backup_name}")
return backup_name
def _get_git_commit(self, path: str) -> str:
"""获取Git提交哈希"""
try:
result = subprocess.run(['git', 'rev-parse', 'HEAD'],
cwd=path, capture_output=True, text=True)
return result.stdout.strip()[:8] if result.returncode == 0 else "unknown"
except Exception:
return "unknown"
def hot_update(self, source_path: str, environment: str = "production") -> bool:
"""热更新(不重启服务)"""
if not self.config["hot_update_enabled"]:
print("热更新未启用")
return False
print("开始热更新...")
env_config = self.config["environments"].get(environment, {})
deploy_path = env_config.get("path", self.config["deploy_path"])
# Determine which files can be hot-updated
hot_update_files = [
"src/web/static/js/dashboard.js",
"src/web/static/css/style.css",
"src/web/templates/dashboard.html",
"src/web/app.py",
"src/knowledge_base/knowledge_manager.py",
"src/dialogue/realtime_chat.py"
]
updated_files = []
for file_path in hot_update_files:
source_file = os.path.join(source_path, file_path)
target_file = os.path.join(deploy_path, file_path)
if os.path.exists(source_file):
# Check whether the file has changed
if not os.path.exists(target_file) or not self._files_equal(source_file, target_file):
print(f"更新文件: {file_path}")
os.makedirs(os.path.dirname(target_file), exist_ok=True)
shutil.copy2(source_file, target_file)
updated_files.append(file_path)
if updated_files:
print(f"热更新完成,更新了 {len(updated_files)} 个文件")
return True
else:
print("没有文件需要热更新")
return False
def _files_equal(self, file1: str, file2: str) -> bool:
"""比较两个文件是否相等"""
try:
with open(file1, 'rb') as f1, open(file2, 'rb') as f2:
return f1.read() == f2.read()
except Exception:
return False
def full_update(self, source_path: str, environment: str = "production",
create_backup: bool = True) -> bool:
"""完整更新(重启服务)"""
print("开始完整更新...")
env_config = self.config["environments"].get(environment, {})
deploy_path = env_config.get("path", self.config["deploy_path"])
service_name = env_config.get("service_name", self.config["service_name"])
auto_restart = env_config.get("auto_restart", True)
# Create a backup
backup_name = None
if create_backup and self.config["auto_backup"]:
backup_name = self.create_backup(environment)
try:
# Stop the service
if auto_restart and service_name:
print(f"停止服务: {service_name}")
subprocess.run(['sudo', 'systemctl', 'stop', service_name], check=True)
# Update files
print("更新应用文件...")
if os.path.exists(deploy_path):
shutil.rmtree(deploy_path)
os.makedirs(deploy_path, exist_ok=True)
shutil.copytree(source_path, deploy_path, dirs_exist_ok=True)
# Set permissions
subprocess.run(['sudo', 'chown', '-R', 'www-data:www-data', deploy_path], check=True)
# Install dependencies
print("安装依赖...")
requirements_file = os.path.join(deploy_path, "requirements.txt")
if os.path.exists(requirements_file):
subprocess.run(['sudo', '-u', 'www-data', 'python', '-m', 'pip', 'install', '-r', requirements_file],
cwd=deploy_path, check=True)
# Run database migrations
print("运行数据库迁移...")
init_script = os.path.join(deploy_path, "init_database.py")
if os.path.exists(init_script):
subprocess.run(['sudo', '-u', 'www-data', 'python', init_script],
cwd=deploy_path, check=True)
# Start the service
if auto_restart and service_name:
print(f"启动服务: {service_name}")
subprocess.run(['sudo', 'systemctl', 'start', service_name], check=True)
# Wait for the service to start
print("等待服务启动...")
time.sleep(15)
# Health check
if self._health_check():
print("更新成功!")
return True
else:
print("健康检查失败,开始回滚...")
if backup_name:
self.rollback(backup_name, environment)
return False
else:
print("更新完成(未重启服务)")
return True
except Exception as e:
print(f"更新失败: {e}")
if backup_name:
print("开始回滚...")
self.rollback(backup_name, environment)
return False
def _health_check(self) -> bool:
"""健康检查"""
health_url = self.config["health_url"]
max_retries = 10
retry_count = 0
while retry_count < max_retries:
try:
response = requests.get(health_url, timeout=5)
if response.status_code == 200:
return True
except Exception:
pass
retry_count += 1
print(f"健康检查失败,重试中... ({retry_count}/{max_retries})")
time.sleep(5)
return False
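The retry loop in `_health_check` becomes testable once the probe and the sleep are injected; a sketch with the same 10-retry / 5-second defaults (`wait_until_healthy` is an illustrative name, not part of the codebase):

```python
import time
from typing import Callable

def wait_until_healthy(probe: Callable[[], bool], max_retries: int = 10,
                       delay: float = 5.0, sleep=time.sleep) -> bool:
    """Poll `probe` up to max_retries times, sleeping `delay` between tries.

    Mirrors _health_check: True on the first successful probe,
    False once the retry budget is exhausted.
    """
    for _attempt in range(max_retries):
        if probe():
            return True
        sleep(delay)
    return False
```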
def rollback(self, backup_name: str, environment: str = "production") -> bool:
"""回滚到指定备份"""
print(f"开始回滚到备份: {backup_name}")
env_config = self.config["environments"].get(environment, {})
deploy_path = env_config.get("path", self.config["deploy_path"])
service_name = env_config.get("service_name", self.config["service_name"])
auto_restart = env_config.get("auto_restart", True)
backup_path = os.path.join(self.config["backup_path"], backup_name)
if not os.path.exists(backup_path):
print(f"备份不存在: {backup_name}")
return False
try:
# Stop the service
if auto_restart and service_name:
print(f"停止服务: {service_name}")
subprocess.run(['sudo', 'systemctl', 'stop', service_name], check=True)
# Restore files
print("恢复文件...")
app_backup_path = os.path.join(backup_path, "app")
if os.path.exists(app_backup_path):
if os.path.exists(deploy_path):
shutil.rmtree(deploy_path)
shutil.copytree(app_backup_path, deploy_path)
# Restore the database
db_backup_path = os.path.join(backup_path, "database", "tsp_assistant.db")
if os.path.exists(db_backup_path):
print("恢复数据库...")
shutil.copy2(db_backup_path, os.path.join(deploy_path, "tsp_assistant.db"))
# Set permissions
subprocess.run(['sudo', 'chown', '-R', 'www-data:www-data', deploy_path], check=True)
# Start the service
if auto_restart and service_name:
print(f"启动服务: {service_name}")
subprocess.run(['sudo', 'systemctl', 'start', service_name], check=True)
# Wait for the service to start
time.sleep(15)
# Health check
if self._health_check():
print("回滚成功!")
return True
else:
print("回滚后健康检查失败")
return False
else:
print("回滚完成(未重启服务)")
return True
except Exception as e:
print(f"回滚失败: {e}")
return False
def list_backups(self) -> List[Dict]:
"""列出所有备份"""
backups = []
backup_dir = self.config["backup_path"]
if os.path.exists(backup_dir):
for item in os.listdir(backup_dir):
backup_path = os.path.join(backup_dir, item)
if os.path.isdir(backup_path):
info_file = os.path.join(backup_path, "backup_info.json")
if os.path.exists(info_file):
try:
with open(info_file, 'r', encoding='utf-8') as f:
backup_info = json.load(f)
backups.append(backup_info)
except Exception:
pass
return sorted(backups, key=lambda x: x.get("timestamp", ""), reverse=True)
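The sort in `list_backups` orders by the lexicographic `YYYYMMDD_HHMMSS` timestamp, which coincides with chronological order for this fixed-width format; entries missing a timestamp fall to the end. Isolated for clarity:

```python
def sort_backups(backups: list) -> list:
    """Newest first, matching list_backups; missing timestamps sort last."""
    return sorted(backups, key=lambda b: b.get("timestamp", ""), reverse=True)
```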
def auto_update(self, source_path: str, environment: str = "production") -> bool:
"""自动更新(智能选择热更新或完整更新)"""
print("开始自动更新...")
# Check whether an update is available
has_update, current_version, new_version = self.check_update_available(source_path)
if not has_update:
print("没有更新可用")
return True
print(f"发现更新: {current_version} -> {new_version}")
# Try a hot update first
if self.hot_update(source_path, environment):
print("热更新成功")
return True
# Hot update failed; fall back to a full update
print("热更新失败,进行完整更新...")
return self.full_update(source_path, environment)
def main():
"""命令行接口"""
import argparse
parser = argparse.ArgumentParser(description='TSP智能助手更新管理器')
parser.add_argument('action', choices=['check', 'hot-update', 'full-update', 'auto-update', 'rollback', 'list-backups'],
help='要执行的操作')
parser.add_argument('--source', help='源路径')
parser.add_argument('--environment', choices=['development', 'staging', 'production'],
default='production', help='目标环境')
parser.add_argument('--backup', help='备份名称(用于回滚)')
parser.add_argument('--no-backup', action='store_true', help='跳过备份')
args = parser.parse_args()
um = UpdateManager()
if args.action == 'check':
if not args.source:
print("错误: 需要指定源路径")
sys.exit(1)
has_update, current, new = um.check_update_available(args.source)
if has_update:
print(f"有更新可用: {current} -> {new}")
else:
print(f"没有更新可用 (当前版本: {current})")
elif args.action == 'hot-update':
if not args.source:
print("错误: 需要指定源路径")
sys.exit(1)
success = um.hot_update(args.source, args.environment)
sys.exit(0 if success else 1)
elif args.action == 'full-update':
if not args.source:
print("错误: 需要指定源路径")
sys.exit(1)
success = um.full_update(args.source, args.environment, not args.no_backup)
sys.exit(0 if success else 1)
elif args.action == 'auto-update':
if not args.source:
print("错误: 需要指定源路径")
sys.exit(1)
success = um.auto_update(args.source, args.environment)
sys.exit(0 if success else 1)
elif args.action == 'rollback':
if not args.backup:
print("错误: 需要指定备份名称")
sys.exit(1)
success = um.rollback(args.backup, args.environment)
sys.exit(0 if success else 1)
elif args.action == 'list-backups':
backups = um.list_backups()
if backups:
print("可用备份:")
for backup in backups:
print(f" {backup['backup_name']} - {backup['timestamp']} - {backup.get('version', 'unknown')}")
else:
print("没有找到备份")
if __name__ == "__main__":
main()


@@ -1,273 +0,0 @@
#!/bin/bash
# TSP assistant upgrade script
set -e
# Color definitions
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m'
# Logging functions
log_info() {
echo -e "${GREEN}[INFO]${NC} $1"
}
log_warn() {
echo -e "${YELLOW}[WARN]${NC} $1"
}
log_error() {
echo -e "${RED}[ERROR]${NC} $1"
}
log_step() {
echo -e "${BLUE}[STEP]${NC} $1"
}
# Configuration variables
APP_NAME="tsp_assistant"
BACKUP_DIR="./backups"
DEPLOY_PATH="/opt/tsp_assistant"
SERVICE_NAME="tsp_assistant"
HEALTH_URL="http://localhost:5000/api/health"
# Check arguments
if [ $# -lt 1 ]; then
echo "用法: $0 <新版本路径> [选项]"
echo "选项:"
echo " --force 强制升级,跳过确认"
echo " --no-backup 跳过备份"
echo " --rollback 回滚到指定备份"
exit 1
fi
NEW_VERSION_PATH=$1
FORCE_UPGRADE=false
SKIP_BACKUP=false
ROLLBACK_MODE=false
# Parse options
while [[ $# -gt 1 ]]; do
case $2 in
--force)
FORCE_UPGRADE=true
;;
--no-backup)
SKIP_BACKUP=true
;;
--rollback)
ROLLBACK_MODE=true
;;
*)
log_error "未知选项: $2"
exit 1
;;
esac
shift
done
# Rollback
rollback() {
local backup_name=$1
if [ -z "$backup_name" ]; then
log_error "请指定备份名称"
exit 1
fi
log_step "开始回滚到备份: $backup_name"
# Ensure the backup exists
if [ ! -d "$BACKUP_DIR/$backup_name" ]; then
log_error "备份不存在: $backup_name"
log_info "可用备份列表:"
ls -la "$BACKUP_DIR" | grep backup
exit 1
fi
# Stop the service
log_info "停止服务..."
sudo systemctl stop "$SERVICE_NAME" || true
# Restore files
log_info "恢复文件..."
sudo rm -rf "$DEPLOY_PATH"
sudo cp -r "$BACKUP_DIR/$backup_name" "$DEPLOY_PATH"
sudo chown -R www-data:www-data "$DEPLOY_PATH"
# Restore the database
if [ -f "$BACKUP_DIR/$backup_name/database/tsp_assistant.db" ]; then
log_info "恢复数据库..."
sudo cp "$BACKUP_DIR/$backup_name/database/tsp_assistant.db" "$DEPLOY_PATH/"
fi
# Start the service
log_info "启动服务..."
sudo systemctl start "$SERVICE_NAME"
# Wait for the service to start
sleep 10
# Health check
if curl -f "$HEALTH_URL" > /dev/null 2>&1; then
log_info "回滚成功!"
else
log_error "回滚后健康检查失败"
exit 1
fi
}
# Create a backup
create_backup() {
local timestamp=$(date +"%Y%m%d_%H%M%S")
local backup_name="${APP_NAME}_backup_${timestamp}"
local backup_path="$BACKUP_DIR/$backup_name"
log_step "创建备份: $backup_name"
# Create the backup directory
mkdir -p "$backup_path"
# Back up application files
if [ -d "$DEPLOY_PATH" ]; then
log_info "备份应用文件..."
cp -r "$DEPLOY_PATH"/* "$backup_path/"
fi
# Back up the database
if [ -f "$DEPLOY_PATH/tsp_assistant.db" ]; then
log_info "备份数据库..."
mkdir -p "$backup_path/database"
cp "$DEPLOY_PATH/tsp_assistant.db" "$backup_path/database/"
fi
# Save backup metadata
cat > "$backup_path/backup_info.json" << EOF
{
"backup_name": "$backup_name",
"backup_path": "$backup_path",
"timestamp": "$timestamp",
"version": "$(cd "$DEPLOY_PATH" && python version.py version 2>/dev/null || echo "unknown")",
"git_commit": "$(cd "$DEPLOY_PATH" && git rev-parse HEAD 2>/dev/null | cut -c1-8 || echo "unknown")"
}
EOF
log_info "备份完成: $backup_name"
echo "$backup_name"
}
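The heredoc above splices raw shell output directly into JSON, which yields invalid JSON the moment a value contains a quote or backslash; serializing with `json.dumps`, as the Python update manager does, sidesteps that entirely. A sketch (the field names match `backup_info.json`; the function name is illustrative):

```python
import json

def backup_info_json(name: str, path: str, timestamp: str,
                     version: str = "unknown", git_commit: str = "unknown") -> str:
    """Serialize backup metadata; json.dumps handles all escaping."""
    return json.dumps({
        "backup_name": name,
        "backup_path": path,
        "timestamp": timestamp,
        "version": version,
        "git_commit": git_commit,
    }, indent=2, ensure_ascii=False)
```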
# Upgrade
upgrade() {
local new_version_path=$1
log_step "开始升级TSP智能助手"
# Check the new-version path
if [ ! -d "$new_version_path" ]; then
log_error "新版本路径不存在: $new_version_path"
exit 1
fi
# Check the current version
if [ -d "$DEPLOY_PATH" ]; then
local current_version=$(cd "$DEPLOY_PATH" && python version.py version 2>/dev/null || echo "unknown")
log_info "当前版本: $current_version"
else
log_warn "当前部署路径不存在: $DEPLOY_PATH"
fi
# Check the new version
local new_version=$(cd "$new_version_path" && python version.py version 2>/dev/null || echo "unknown")
log_info "新版本: $new_version"
# Confirm the upgrade
if [ "$FORCE_UPGRADE" = false ]; then
echo -n "确认升级到版本 $new_version? (y/N): "
read -r response
if [[ ! "$response" =~ ^[Yy]$ ]]; then
log_info "升级取消"
exit 0
fi
fi
# Create a backup
local backup_name=""
if [ "$SKIP_BACKUP" = false ]; then
# create_backup also logs to stdout, so keep only its final line (the backup name)
backup_name=$(create_backup | tail -n 1)
fi
# Stop the service
log_step "停止服务..."
sudo systemctl stop "$SERVICE_NAME" || true
# Upgrade application files
log_step "升级应用文件..."
sudo rm -rf "$DEPLOY_PATH"
sudo mkdir -p "$DEPLOY_PATH"
sudo cp -r "$new_version_path"/* "$DEPLOY_PATH/"
sudo chown -R www-data:www-data "$DEPLOY_PATH"
# Install dependencies
log_step "安装依赖..."
cd "$DEPLOY_PATH"
sudo -u www-data python -m pip install -r requirements.txt
# Run database migrations
log_step "运行数据库迁移..."
sudo -u www-data python init_database.py || true
# Start the service
log_step "启动服务..."
sudo systemctl start "$SERVICE_NAME"
# Wait for the service to start
log_info "等待服务启动..."
sleep 15
# Health check
log_step "执行健康检查..."
local retry_count=0
local max_retries=10
while [ $retry_count -lt $max_retries ]; do
if curl -f "$HEALTH_URL" > /dev/null 2>&1; then
log_info "健康检查通过!"
break
else
log_warn "健康检查失败,重试中... ($((retry_count + 1))/$max_retries)"
retry_count=$((retry_count + 1))
sleep 5
fi
done
if [ $retry_count -eq $max_retries ]; then
log_error "健康检查失败,开始回滚..."
if [ -n "$backup_name" ]; then
rollback "$backup_name"
else
log_error "没有备份可回滚"
exit 1
fi
else
log_info "升级成功!"
log_info "新版本: $new_version"
if [ -n "$backup_name" ]; then
log_info "备份名称: $backup_name"
fi
fi
}
# Main function
main() {
if [ "$ROLLBACK_MODE" = true ]; then
rollback "$NEW_VERSION_PATH"
else
upgrade "$NEW_VERSION_PATH"
fi
}
# Run the main function
main


@@ -1,3 +0,0 @@
# TSP Assistant - an LLM-powered AI customer-service bot
__version__ = "1.0.0"
__author__ = "TSP Assistant Team"


@@ -5,7 +5,7 @@
 实现Agent的主动调用功能
 """
 import asyncio
 import logging
 import threading
 import time


@@ -12,12 +12,14 @@ from typing import Dict, Any, Optional, List
 from abc import ABC, abstractmethod
 from dataclasses import dataclass
+from src.config.unified_config import get_config
 logger = logging.getLogger(__name__)
 @dataclass
 class LLMConfig:
     """LLM配置"""
-    provider: str  # openai, anthropic, local, etc.
+    provider: str
     api_key: str
     base_url: Optional[str] = None
     model: str = "gpt-3.5-turbo"
@@ -60,7 +62,7 @@ class OpenAIClient(BaseLLMClient):
     async def generate(self, prompt: str, **kwargs) -> str:
         """生成文本"""
         if not self.client:
-            return self._simulate_response(prompt)
+            raise ImportError("OpenAI client not initialized. Please install the 'openai' package.")
         try:
             response = await self.client.chat.completions.create(
@@ -72,12 +74,12 @@ class OpenAIClient(BaseLLMClient):
             return response.choices[0].message.content
         except Exception as e:
             logger.error(f"OpenAI API调用失败: {e}")
-            return self._simulate_response(prompt)
+            raise e
     async def chat(self, messages: List[Dict[str, str]], **kwargs) -> str:
         """对话生成"""
         if not self.client:
-            return self._simulate_chat(messages)
+            raise ImportError("OpenAI client not initialized. Please install the 'openai' package.")
         try:
             response = await self.client.chat.completions.create(
@@ -89,7 +91,7 @@ class OpenAIClient(BaseLLMClient):
             return response.choices[0].message.content
         except Exception as e:
             logger.error(f"OpenAI Chat API调用失败: {e}")
-            return self._simulate_chat(messages)
+            raise e
     def _simulate_response(self, prompt: str) -> str:
         """模拟响应"""
@@ -198,11 +200,14 @@ class LLMClientFactory:
     @staticmethod
     def create_client(config: LLMConfig) -> BaseLLMClient:
         """创建LLM客户端"""
-        if config.provider.lower() == "openai":
+        provider = config.provider.lower()
+        # qwen 使用 OpenAI 兼容的 API
+        if provider in ["openai", "qwen"]:
             return OpenAIClient(config)
-        elif config.provider.lower() == "anthropic":
+        elif provider == "anthropic":
             return AnthropicClient(config)
-        elif config.provider.lower() == "local":
+        elif provider == "local":
             return LocalLLMClient(config)
         else:
             raise ValueError(f"不支持的LLM提供商: {config.provider}")
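The factory change above routes `qwen` through the OpenAI-compatible client. The routing reduces to a small provider-to-client-family mapping; a sketch with illustrative names:

```python
def resolve_client_kind(provider: str) -> str:
    """Map a provider name to a client family; qwen speaks the OpenAI API."""
    provider = provider.lower()
    if provider in ("openai", "qwen"):
        return "openai"
    if provider == "anthropic":
        return "anthropic"
    if provider == "local":
        return "local"
    raise ValueError(f"Unsupported LLM provider: {provider}")
```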
@@ -210,15 +215,21 @@ class LLMClientFactory:
 class LLMManager:
     """LLM管理器"""
-    def __init__(self, config: LLMConfig):
-        self.config = config
-        self.client = LLMClientFactory.create_client(config)
+    def __init__(self, config=None):
+        if config:
+            self.config = config
+        else:
+            # If no config is provided, fetch it from the unified config system
+            self.config = get_config().llm
+        self.client = LLMClientFactory.create_client(self.config)
         self.usage_stats = {
             "total_requests": 0,
             "total_tokens": 0,
             "error_count": 0
         }
     async def generate(self, prompt: str, **kwargs) -> str:
         """生成文本"""
         try:


@@ -8,15 +8,18 @@ import logging
 import asyncio
 from typing import Dict, Any, List, Optional
 from datetime import datetime
+from src.config.unified_config import get_config
+from src.agent.llm_client import LLMManager
 logger = logging.getLogger(__name__)
 class TSPAgentAssistant:
-    """TSP Agent助手 - 简化版本"""
-    def __init__(self, llm_config=None):
+    """TSP Agent助手"""
+    def __init__(self):
         # 初始化基础功能
-        self.llm_config = llm_config
+        config = get_config()
+        self.llm_manager = LLMManager(config.llm)
         self.is_agent_mode = True
         self.execution_history = []
@@ -339,6 +342,8 @@ class TSPAgentAssistant:
         import os
         import mimetypes
+        logger.info(f"开始处理知识库上传文件: (unknown)")
         # 检查文件类型
         mime_type, _ = mimetypes.guess_type(file_path)
         file_ext = os.path.splitext(filename)[1].lower()
@@ -346,22 +351,30 @@ class TSPAgentAssistant:
         # 读取文件内容
         content = self._read_file_content(file_path, file_ext)
         if not content:
+            logger.error(f"文件读取失败或内容为空: (unknown)")
             return {"success": False, "error": "无法读取文件内容"}
+        logger.info(f"文件读取成功: (unknown), 字符数={len(content)}")
         # 使用简化的知识提取
+        logger.info(f"正在对文件内容进行 AI 知识提取...")
         knowledge_entries = self._extract_knowledge_from_content(content, filename)
+        logger.info(f"知识提取完成: 共提取出 {len(knowledge_entries)} 个潜在条目")
         # 保存到知识库
         saved_count = 0
         for i, entry in enumerate(knowledge_entries):
             try:
-                logger.info(f"保存知识条目 {i+1}: {entry.get('question', '')[:50]}...")
-                # 这里应该调用知识库管理器保存
+                logger.info(f"正在保存知识条目 [{i+1}/{len(knowledge_entries)}]: {entry.get('question', '')[:30]}...")
+                # 这里在实际项目中应当注入知识库管理器保存逻辑
+                # 但在当前简化版本中仅记录日志
                 saved_count += 1
-                logger.info(f"知识条目 {i+1} 保存成功")
             except Exception as save_error:
                 logger.error(f"保存知识条目 {i+1} 时出错: {save_error}")
+        logger.info(f"文件处理任务结束: (unknown), 成功入库 {saved_count}")
         return {
             "success": True,
             "knowledge_count": saved_count,


@@ -15,7 +15,7 @@ import time
 from ..core.database import db_manager
 from ..core.models import Alert
 from ..core.redis_manager import redis_manager
-from ..config.config import Config
+from src.config.unified_config import get_config
 logger = logging.getLogger(__name__)


@@ -13,7 +13,6 @@ from collections import defaultdict
 from ..core.database import db_manager
 from ..core.models import Conversation
 from ..core.redis_manager import redis_manager
-from ..config.config import Config

 logger = logging.getLogger(__name__)


@@ -1,71 +0,0 @@
-import os
-from typing import Dict, Any
-
-class Config:
-    """System configuration class"""
-    # Alibaba Cloud Qwen API configuration
-    ALIBABA_API_KEY = "sk-c0dbefa1718d46eaa897199135066f00"
-    ALIBABA_BASE_URL = "https://dashscope.aliyuncs.com/compatible-mode/v1"
-    ALIBABA_MODEL_NAME = "qwen-plus-latest"
-
-    # Database configuration
-    DATABASE_URL = "mysql+pymysql://tsp_assistant:123456@jeason.online/tsp_assistant?charset=utf8mb4"
-    # DATABASE_URL = "sqlite:///local_test.db"  # local test database
-
-    # Knowledge base configuration
-    KNOWLEDGE_BASE_PATH = "data/knowledge_base"
-    VECTOR_DB_PATH = "data/vector_db"
-
-    # Conversation configuration
-    MAX_HISTORY_LENGTH = 10
-    RESPONSE_TIMEOUT = 30
-
-    # Analytics configuration
-    ANALYTICS_UPDATE_INTERVAL = 3600  # 1 hour
-    ALERT_THRESHOLD = 0.8  # alert threshold
-
-    # Logging configuration
-    LOG_LEVEL = "INFO"
-    LOG_FILE = "logs/tsp_assistant.log"
-
-    # System monitoring configuration
-    SYSTEM_MONITORING = True  # whether system monitoring is enabled
-    MONITORING_INTERVAL = 60  # monitoring interval (seconds)
-
-    @classmethod
-    def get_api_config(cls) -> Dict[str, Any]:
-        """Get the API configuration"""
-        return {
-            "api_key": cls.ALIBABA_API_KEY,
-            "base_url": cls.ALIBABA_BASE_URL,
-            "model_name": cls.ALIBABA_MODEL_NAME
-        }
-
-    @classmethod
-    def get_database_config(cls) -> Dict[str, Any]:
-        """Get the database configuration"""
-        return {
-            "url": cls.DATABASE_URL,
-            "echo": False
-        }
-
-    @classmethod
-    def get_knowledge_config(cls) -> Dict[str, Any]:
-        """Get the knowledge base configuration"""
-        return {
-            "base_path": cls.KNOWLEDGE_BASE_PATH,
-            "vector_db_path": cls.VECTOR_DB_PATH
-        }
-
-    @classmethod
-    def get_config(cls) -> Dict[str, Any]:
-        """Get the full configuration"""
-        return {
-            "system_monitoring": cls.SYSTEM_MONITORING,
-            "monitoring_interval": cls.MONITORING_INTERVAL,
-            "log_level": cls.LOG_LEVEL,
-            "log_file": cls.LOG_FILE,
-            "analytics_update_interval": cls.ANALYTICS_UPDATE_INTERVAL,
-            "alert_threshold": cls.ALERT_THRESHOLD
-        }

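The deleted file above hardcoded credentials and tuning values as class attributes. The replacement approach is environment-driven configuration, which needs typed parsing because environment variables are always strings. A small sketch of that parsing pattern under assumed variable names (`RESPONSE_TIMEOUT`, `ALERT_THRESHOLD`, `SYSTEM_MONITORING` are taken from the deleted class for illustration):

```python
import os

def env_int(name: str, default: int) -> int:
    return int(os.getenv(name, str(default)))

def env_float(name: str, default: float) -> float:
    return float(os.getenv(name, str(default)))

def env_bool(name: str, default: bool = False) -> bool:
    # Accept common truthy spellings; everything else counts as False
    return os.getenv(name, str(default)).strip().lower() in ("true", "1", "t", "yes")

# Simulate a deployment environment
os.environ["RESPONSE_TIMEOUT"] = "45"
os.environ["SYSTEM_MONITORING"] = "true"

timeout = env_int("RESPONSE_TIMEOUT", 30)      # 45: overridden by the environment
threshold = env_float("ALERT_THRESHOLD", 0.8)  # 0.8: falls back to the default
monitoring = env_bool("SYSTEM_MONITORING")     # True

print(timeout, threshold, monitoring)
```

Converting via `str(default)` keeps the helpers total even when the variable is unset, and keeps secrets out of the source tree entirely.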

@@ -1,36 +1,42 @@
 #!/usr/bin/env python
 # -*- coding: utf-8 -*-
 """
-Unified configuration management module
-Consolidates all configuration and provides a unified configuration interface
+Unified configuration management module
+Loads all configuration from environment variables and provides a unified configuration interface
 """
 import os
-import json
 import logging
 from typing import Dict, Any, Optional
 from dataclasses import dataclass, asdict
-from pathlib import Path
+from dotenv import load_dotenv
+
+# Automatically load environment variables from the .env file at module import time,
+# so that every subsequent os.getenv call can see the values defined in .env
+load_dotenv()

 logger = logging.getLogger(__name__)

+# --- Dataclass definitions ---
+# These classes define the structure of the configuration but contain no sensitive defaults.
+# Defaults are used only for values that are not sensitive or are the same in most environments.
+
 @dataclass
 class DatabaseConfig:
     """Database configuration"""
-    url: str = "mysql+pymysql://tsp_assistant:password@jeason.online/tsp_assistant?charset=utf8mb4"
+    url: str
     pool_size: int = 10
     max_overflow: int = 20
     pool_timeout: int = 30
-    pool_recycle: int = 3600
+    pool_recycle: int = 600  # recycle connections after 10 minutes to avoid connection timeouts

 @dataclass
 class LLMConfig:
     """LLM configuration"""
-    provider: str = "qwen"
-    api_key: str = "sk-c0dbefa1718d46eaa897199135066f00"
-    base_url: str = "https://dashscope.aliyuncs.com/compatible-mode/v1"
-    model: str = "qwen-plus-latest"
+    provider: str
+    api_key: str
+    model: str
+    base_url: Optional[str] = None
     temperature: float = 0.7
     max_tokens: int = 2000
     timeout: int = 30

@@ -47,13 +53,12 @@ class ServerConfig:
 @dataclass
 class FeishuConfig:
     """Feishu configuration"""
-    app_id: str = ""
-    app_secret: str = ""
-    app_token: str = ""
-    table_id: str = ""
-    status: str = "active"
-    sync_limit: int = 10
-    auto_sync_interval: int = 0
+    app_id: Optional[str] = None
+    app_secret: Optional[str] = None
+    verification_token: Optional[str] = None
+    encrypt_key: Optional[str] = None
+    table_id: Optional[str] = None

 @dataclass
 class AIAccuracyConfig:
@@ -63,234 +68,120 @@ class AIAccuracyConfig:
     manual_review_threshold: float = 0.80
     ai_suggestion_confidence: float = 0.95
     human_resolution_confidence: float = 0.90
-    prefer_human_when_low_accuracy: bool = True
-    enable_auto_approval: bool = True
-    enable_human_fallback: bool = True
-
-@dataclass
-class SystemConfig:
-    """System configuration"""
-    backup_enabled: bool = True
-    backup_interval: int = 24  # hours
-    max_backup_files: int = 7
-    cache_enabled: bool = True
-    cache_ttl: int = 3600  # seconds
-    monitoring_enabled: bool = True
+
+# --- Unified configuration manager ---

 class UnifiedConfig:
-    """Unified configuration manager"""
+    """
+    Unified configuration manager.
+    Loads all configuration from environment variables on instantiation.
+    """

-    def __init__(self, config_dir: str = "config"):
-        self.config_dir = Path(config_dir)
-        self.config_file = self.config_dir / "unified_config.json"
-
-        # Default config - load the default LLM config from config/llm_config.py
-        self.database = DatabaseConfig()
-        self.llm = self._load_default_llm_config()
-        self.server = ServerConfig()
-        self.feishu = FeishuConfig()
-        self.ai_accuracy = AIAccuracyConfig()
-        self.system = SystemConfig()
-
-        # Load configuration
-        self.load_config()
+    def __init__(self):
+        logger.info("Initializing unified configuration from environment variables...")
+        self.database = self._load_database_from_env()
+        self.llm = self._load_llm_from_env()
+        self.server = self._load_server_from_env()
+        self.feishu = self._load_feishu_from_env()
+        self.ai_accuracy = self._load_ai_accuracy_from_env()
+        self.validate_config()

-    def _load_default_llm_config(self) -> LLMConfig:
-        """Load the default LLM configuration"""
-        try:
-            from config.llm_config import DEFAULT_CONFIG
-            # Convert the config in config/llm_config.py to the unified config format
-            return LLMConfig(
-                provider=DEFAULT_CONFIG.provider,
-                api_key=DEFAULT_CONFIG.api_key,
-                base_url=DEFAULT_CONFIG.base_url,
-                model=DEFAULT_CONFIG.model,
-                temperature=DEFAULT_CONFIG.temperature,
-                max_tokens=DEFAULT_CONFIG.max_tokens
-            )
-        except Exception as e:
-            logger.warning(f"Unable to load the default LLM config, using built-in defaults: {e}")
-            return LLMConfig()
+    def _load_database_from_env(self) -> DatabaseConfig:
+        db_url = os.getenv("DATABASE_URL")
+        if not db_url:
+            raise ValueError("FATAL: DATABASE_URL environment variable is not set.")
+        logger.info("Database config loaded.")
+        return DatabaseConfig(url=db_url)

-    def load_config(self):
-        """Load the configuration file"""
-        try:
-            if self.config_file.exists():
-                with open(self.config_file, 'r', encoding='utf-8') as f:
-                    config_data = json.load(f)
-
-                # Update configuration
-                if 'database' in config_data:
-                    self.database = DatabaseConfig(**config_data['database'])
-                if 'llm' in config_data:
-                    self.llm = LLMConfig(**config_data['llm'])
-                if 'server' in config_data:
-                    self.server = ServerConfig(**config_data['server'])
-                if 'feishu' in config_data:
-                    self.feishu = FeishuConfig(**config_data['feishu'])
-                if 'ai_accuracy' in config_data:
-                    self.ai_accuracy = AIAccuracyConfig(**config_data['ai_accuracy'])
-                if 'system' in config_data:
-                    self.system = SystemConfig(**config_data['system'])
-
-                logger.info("Configuration file loaded successfully")
-            else:
-                logger.info("Configuration file not found, using defaults")
-                self.save_config()
-        except Exception as e:
-            logger.error(f"Failed to load the configuration file: {e}")
+    def _load_llm_from_env(self) -> LLMConfig:
+        api_key = os.getenv("LLM_API_KEY")
+        if not api_key:
+            logger.warning("LLM_API_KEY is not set. LLM functionality will be disabled or fail.")
+        config = LLMConfig(
+            provider=os.getenv("LLM_PROVIDER", "qwen"),
+            api_key=api_key,
+            model=os.getenv("LLM_MODEL", "qwen-plus-latest"),
+            base_url=os.getenv("LLM_BASE_URL"),
+            temperature=float(os.getenv("LLM_TEMPERATURE", 0.7)),
+            max_tokens=int(os.getenv("LLM_MAX_TOKENS", 2000)),
+            timeout=int(os.getenv("LLM_TIMEOUT", 30))
+        )
+        logger.info("LLM config loaded.")
+        return config

-    def save_config(self):
-        """Save the configuration file"""
-        try:
-            self.config_dir.mkdir(exist_ok=True)
-            config_data = {
-                'database': asdict(self.database),
-                'llm': asdict(self.llm),
-                'server': asdict(self.server),
-                'feishu': asdict(self.feishu),
-                'ai_accuracy': asdict(self.ai_accuracy),
-                'system': asdict(self.system)
-            }
-            with open(self.config_file, 'w', encoding='utf-8') as f:
-                json.dump(config_data, f, indent=2, ensure_ascii=False)
-            logger.info("Configuration file saved successfully")
-        except Exception as e:
-            logger.error(f"Failed to save the configuration file: {e}")
+    def _load_server_from_env(self) -> ServerConfig:
+        config = ServerConfig(
+            host=os.getenv("SERVER_HOST", "0.0.0.0"),
+            port=int(os.getenv("SERVER_PORT", 5000)),
+            websocket_port=int(os.getenv("WEBSOCKET_PORT", 8765)),
+            debug=os.getenv("DEBUG_MODE", "False").lower() in ('true', '1', 't'),
+            log_level=os.getenv("LOG_LEVEL", "INFO").upper()
+        )
+        logger.info("Server config loaded.")
+        return config

-    def load_from_env(self):
-        """Load configuration from environment variables"""
-        # Database configuration
-        if os.getenv('DATABASE_URL'):
-            self.database.url = os.getenv('DATABASE_URL')
-        # LLM configuration
-        if os.getenv('LLM_PROVIDER'):
-            self.llm.provider = os.getenv('LLM_PROVIDER')
-        if os.getenv('LLM_API_KEY'):
-            self.llm.api_key = os.getenv('LLM_API_KEY')
-        if os.getenv('LLM_MODEL'):
-            self.llm.model = os.getenv('LLM_MODEL')
-        # Server configuration
-        if os.getenv('SERVER_PORT'):
-            self.server.port = int(os.getenv('SERVER_PORT'))
-        if os.getenv('LOG_LEVEL'):
-            self.server.log_level = os.getenv('LOG_LEVEL')
-        # Feishu configuration
-        if os.getenv('FEISHU_APP_ID'):
-            self.feishu.app_id = os.getenv('FEISHU_APP_ID')
-        if os.getenv('FEISHU_APP_SECRET'):
-            self.feishu.app_secret = os.getenv('FEISHU_APP_SECRET')
-        if os.getenv('FEISHU_APP_TOKEN'):
-            self.feishu.app_token = os.getenv('FEISHU_APP_TOKEN')
-        if os.getenv('FEISHU_TABLE_ID'):
-            self.feishu.table_id = os.getenv('FEISHU_TABLE_ID')
+    def _load_feishu_from_env(self) -> FeishuConfig:
+        config = FeishuConfig(
+            app_id=os.getenv("FEISHU_APP_ID"),
+            app_secret=os.getenv("FEISHU_APP_SECRET"),
+            verification_token=os.getenv("FEISHU_VERIFICATION_TOKEN"),
+            encrypt_key=os.getenv("FEISHU_ENCRYPT_KEY"),
+            table_id=os.getenv("FEISHU_TABLE_ID")
+        )
+        logger.info("Feishu config loaded.")
+        return config

-    def get_database_url(self) -> str:
-        """Get the database connection URL"""
-        return self.database.url
-
-    def get_llm_config(self) -> Dict[str, Any]:
-        """Get the LLM configuration"""
-        return asdict(self.llm)
-
-    def get_server_config(self) -> Dict[str, Any]:
-        """Get the server configuration"""
-        return asdict(self.server)
-
-    def get_feishu_config(self) -> Dict[str, Any]:
-        """Get the Feishu configuration"""
-        return asdict(self.feishu)
-
-    def get_ai_accuracy_config(self) -> Dict[str, Any]:
-        """Get the AI accuracy configuration"""
-        return asdict(self.ai_accuracy)
-
-    def get_system_config(self) -> Dict[str, Any]:
-        """Get the system configuration"""
-        return asdict(self.system)
-
-    def update_config(self, section: str, config_data: Dict[str, Any]):
-        """Update configuration"""
-        try:
-            if section == 'database':
-                self.database = DatabaseConfig(**config_data)
-            elif section == 'llm':
-                self.llm = LLMConfig(**config_data)
-            elif section == 'server':
-                self.server = ServerConfig(**config_data)
-            elif section == 'feishu':
-                self.feishu = FeishuConfig(**config_data)
-            elif section == 'ai_accuracy':
-                self.ai_accuracy = AIAccuracyConfig(**config_data)
-            elif section == 'system':
-                self.system = SystemConfig(**config_data)
-            else:
-                raise ValueError(f"Unknown configuration section: {section}")
-            self.save_config()
-            logger.info(f"Configuration section {section} updated successfully")
-        except Exception as e:
-            logger.error(f"Failed to update configuration: {e}")
-            raise
+    def _load_ai_accuracy_from_env(self) -> AIAccuracyConfig:
+        config = AIAccuracyConfig(
+            auto_approve_threshold=float(os.getenv("AI_AUTO_APPROVE_THRESHOLD", 0.95)),
+            use_human_resolution_threshold=float(os.getenv("AI_USE_HUMAN_RESOLUTION_THRESHOLD", 0.90)),
+            manual_review_threshold=float(os.getenv("AI_MANUAL_REVIEW_THRESHOLD", 0.80)),
+            ai_suggestion_confidence=float(os.getenv("AI_SUGGESTION_CONFIDENCE", 0.95)),
+            human_resolution_confidence=float(os.getenv("AI_HUMAN_RESOLUTION_CONFIDENCE", 0.90))
+        )
+        logger.info("AI Accuracy config loaded.")
+        return config

-    def validate_config(self) -> bool:
-        """Validate the configuration"""
-        try:
-            # Validate database configuration
-            if not self.database.url:
-                logger.error("Database URL is not configured")
-                return False
-            # Validate LLM configuration
-            if not self.llm.api_key:
-                logger.warning("LLM API key is not configured")
-            # Validate Feishu configuration
-            if self.feishu.status == "active":
-                if not all([self.feishu.app_id, self.feishu.app_secret,
-                            self.feishu.app_token, self.feishu.table_id]):
-                    logger.warning("Feishu configuration is incomplete")
-            logger.info("Configuration validation passed")
-            return True
-        except Exception as e:
-            logger.error(f"Configuration validation failed: {e}")
-            return False
+    def validate_config(self):
+        """Validate critical configuration at startup"""
+        if not self.database.url:
+            raise ValueError("Database URL is missing.")
+        if not self.llm.api_key:
+            logger.warning("LLM API key is not configured. AI features may fail.")
+        if self.feishu.app_id and not self.feishu.app_secret:
+            logger.warning("FEISHU_APP_ID is set, but FEISHU_APP_SECRET is missing.")
+        logger.info("Configuration validation passed (warnings may exist).")

+    # --- Public Getters ---
     def get_all_config(self) -> Dict[str, Any]:
-        """Get all configuration"""
+        """Return the full configuration as a dictionary"""
         return {
             'database': asdict(self.database),
             'llm': asdict(self.llm),
             'server': asdict(self.server),
             'feishu': asdict(self.feishu),
             'ai_accuracy': asdict(self.ai_accuracy),
-            'system': asdict(self.system)
         }

-# Global configuration instance
-_config_instance = None
+# --- Global singleton ---
+_config_instance: Optional[UnifiedConfig] = None

 def get_config() -> UnifiedConfig:
-    """Get the global configuration instance"""
+    """
+    Get the global unified configuration instance.
+    The first call creates and loads the configuration; subsequent calls return the cached instance.
+    """
     global _config_instance
     if _config_instance is None:
         _config_instance = UnifiedConfig()
-        _config_instance.load_from_env()
     return _config_instance

-def reload_config():
-    """Reload the configuration"""
+def reload_config() -> UnifiedConfig:
+    """Force the configuration to be reloaded"""
     global _config_instance
     _config_instance = None
     return get_config()
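The load-once behaviour of `get_config()`/`reload_config()` in the diff above can be exercised in isolation. This is a simplified stand-alone sketch of the same lazy-singleton pattern; the `AppConfig` dataclass and its fields are illustrative, not the project's actual classes:

```python
import os
from dataclasses import dataclass
from typing import Optional

@dataclass
class AppConfig:
    database_url: str
    log_level: str = "INFO"

_config_instance: Optional[AppConfig] = None

def get_config() -> AppConfig:
    """Create the config on first call; return the cached instance afterwards."""
    global _config_instance
    if _config_instance is None:
        db_url = os.getenv("DATABASE_URL")
        if not db_url:
            raise ValueError("FATAL: DATABASE_URL environment variable is not set.")
        _config_instance = AppConfig(
            database_url=db_url,
            log_level=os.getenv("LOG_LEVEL", "INFO").upper(),
        )
    return _config_instance

def reload_config() -> AppConfig:
    """Drop the cached instance so the next access re-reads the environment."""
    global _config_instance
    _config_instance = None
    return get_config()

os.environ["DATABASE_URL"] = "sqlite:///example.db"
cfg = get_config()
print(cfg.database_url)           # sqlite:///example.db
print(get_config() is cfg)        # True: the cached instance is reused
os.environ["LOG_LEVEL"] = "debug"
print(reload_config().log_level)  # DEBUG: reload picks up the new environment
```

Failing fast in the loader (rather than silently falling back to a hardcoded URL, as the deleted code did) is the design choice the commit documents: a misconfigured deployment stops at startup instead of connecting to the wrong database.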


Some files were not shown because too many files have changed in this diff.