chore(cursor): update IDE rules for multi-provider LLM architecture

Add llm/, db/, crypto/ dirs to structure; replace OpenAI-only references with LLM Gateway.

Ultraworked with [Sisyphus](https://github.com/code-yeongyu/oh-my-opencode)
jeffusion authored 2026-03-05 10:14:52 +08:00, committed by 路遥知码力
parent 851c73e326
commit 9d986f4b5a
6 changed files with 38 additions and 20 deletions

View File

@@ -16,13 +16,17 @@ The application entry point is [src/index.ts](mdc:src/index.ts), which sets up t
 - **services/**: Service layer for external API interactions
 - **config/**: Configuration management
 - **utils/**: Utility functions
+- **llm/**: Multi-provider LLM gateway and provider adapters
+- **db/**: SQLite database layer for LLM configuration
+- **crypto/**: Encryption utilities for API key storage
+- **agent/**: Multi-agent review engine (planner, specialists, judge)
 ## Configuration Files
 - [package.json](mdc:package.json): Project dependencies and scripts
 - [tsconfig.json](mdc:tsconfig.json): TypeScript compiler configuration
 - [Dockerfile](mdc:Dockerfile): Container configuration
-- [kubernetes.yaml](mdc:kubernetes.yaml): Kubernetes deployment configuration
+- [kubernetes.yaml](mdc:k8s/gitea-assistant.yaml): Kubernetes deployment configuration
 ## Build and Deployment

View File

@@ -37,6 +37,15 @@ The application follows a clean, layered architecture:
 - Centralizes application configuration from environment variables
 - Manages Feishu webhook configurations
+4. **LLM Gateway** ([src/llm/](mdc:src/llm))
+- Multi-provider LLM abstraction layer
+- Provider adapters for OpenAI Compatible, OpenAI Responses API, Anthropic, Google Gemini
+- Role-based model routing (legacy, planner, specialist, judge, embedding)
+5. **Database Layer** ([src/db/](mdc:src/db))
+- SQLite-based LLM provider and role configuration
+- API key encryption with AES-256-GCM
-4. **Utilities**
+6. **Utilities**
 - [src/utils/logger.ts](mdc:src/utils/logger.ts): Custom logging utilities
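
To make the gateway shape above concrete, here is a minimal TypeScript sketch of a provider-adapter interface with role-based model routing. Every name in it (`ProviderAdapter`, `LLMGateway`, the `Role` union) is an illustrative assumption, not the actual `src/llm` API.

```typescript
// Illustrative sketch only; not the real src/llm implementation.
type Role = "legacy" | "planner" | "specialist" | "judge" | "embedding";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Each provider (OpenAI Compatible, OpenAI Responses, Anthropic, Gemini)
// implements one adapter, so callers never touch vendor SDKs directly.
interface ProviderAdapter {
  readonly name: string;
  chat(model: string, messages: ChatMessage[]): Promise<string>;
}

// Role-to-model bindings, e.g. loaded from the SQLite configuration DB.
interface RoleBinding {
  provider: string;
  model: string;
}

class LLMGateway {
  constructor(
    private adapters: Map<string, ProviderAdapter>,
    private bindings: Map<Role, RoleBinding>,
  ) {}

  // Look up the adapter and model configured for a role, then delegate.
  async chat(role: Role, messages: ChatMessage[]): Promise<string> {
    const binding = this.bindings.get(role);
    if (!binding) throw new Error(`no model configured for role "${role}"`);
    const adapter = this.adapters.get(binding.provider);
    if (!adapter) throw new Error(`unknown provider "${binding.provider}"`);
    return adapter.chat(binding.model, messages);
  }
}
```

With this shape, callers select a role rather than a vendor, so remapping "judge" to a different provider is a configuration change, not a code change.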

View File

@@ -10,7 +10,7 @@ alwaysApply: true
 - **Runtime**: Bun (JavaScript/TypeScript runtime)
 - **Language**: TypeScript
 - **Framework**: Hono (lightweight web framework)
-- **API Integration**: OpenAI API, Gitea API
+- **API Integration**: LLM Gateway (OpenAI Compatible, OpenAI Responses API, Anthropic, Google Gemini), Gitea API
 - **Containerization**: Docker, Kubernetes
 ## Key Dependencies
@@ -22,7 +22,9 @@ From [package.json](mdc:package.json):
 - **hono**: Lightweight, ultrafast web framework
 - **@hono/zod-validator**: Schema validation for Hono
 - **zod**: TypeScript-first schema validation
-- **openai**: OpenAI API client
+- **openai**: OpenAI API client (used for OpenAI Compatible and Responses providers)
+- **@anthropic-ai/sdk**: Anthropic Messages API client
+- **@google/genai**: Google Gemini API client
 - **axios**: HTTP client for API requests
 - **dotenv**: Environment variable management
 - **lodash-es**: Utility library
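
These three SDKs expose noticeably different call shapes, which is what the gateway's adapters have to bridge. A hedged sketch of the raw calls (env var names and model ids are placeholders; exact option sets vary by SDK version):

```typescript
import OpenAI from "openai";
import Anthropic from "@anthropic-ai/sdk";
import { GoogleGenAI } from "@google/genai";

// Illustrative only: shows the differing call shapes an adapter must bridge.
async function askAll(prompt: string) {
  // openai: Chat Completions; a custom baseURL covers OpenAI-compatible servers.
  const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
  const oa = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: prompt }],
  });
  console.log(oa.choices[0]?.message.content);

  // @anthropic-ai/sdk: Messages API; max_tokens is mandatory.
  const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });
  const an = await anthropic.messages.create({
    model: "claude-sonnet-4-20250514",
    max_tokens: 1024,
    messages: [{ role: "user", content: prompt }],
  });
  const block = an.content[0];
  console.log(block?.type === "text" ? block.text : "");

  // @google/genai: unified Gemini SDK.
  const gemini = new GoogleGenAI({ apiKey: process.env.GEMINI_API_KEY });
  const ge = await gemini.models.generateContent({
    model: "gemini-2.0-flash",
    contents: prompt,
  });
  console.log(ge.text);
}
```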
@@ -36,10 +38,8 @@ From [package.json](mdc:package.json):
 ## Environment Configuration
-The application uses environment variables for configuration, which are processed in [src/config/index.ts](mdc:src/config/index.ts). Key configurations include:
+The application uses a hybrid configuration approach:
-- Gitea API settings
-- OpenAI API settings
-- Custom prompts for AI review
-- Server configuration
-- Webhook security
+- **Environment variables** ([src/config/index.ts](mdc:src/config/index.ts)): Gitea settings, server config, webhook security, review engine params
+- **Web UI + SQLite DB** ([src/db/](mdc:src/db)): LLM provider settings (API keys, models, endpoints) — managed via Admin Dashboard
+- **bun:sqlite**: Embedded database for LLM configuration persistence
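
A minimal sketch of how the two halves could be read at startup, assuming a hypothetical `llm_providers` table and `data/app.db` path (the real schema and location may differ):

```typescript
import { Database } from "bun:sqlite";

// Static settings come from environment variables.
const giteaApiUrl = process.env.GITEA_API_URL;
const port = Number(process.env.PORT ?? 3000);

// Dynamic LLM settings come from the embedded SQLite database that the
// Admin Dashboard writes to. Table and column names here are hypothetical.
const db = new Database("data/app.db");
const provider = db
  .query("SELECT name, base_url, model FROM llm_providers WHERE active = 1")
  .get() as { name: string; base_url: string; model: string } | null;

console.log({ giteaApiUrl, port, activeProvider: provider?.name });
```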

View File

@@ -7,7 +7,7 @@ alwaysApply: true
 ## Overview
-The AI Code Review system is the core feature of this application. It automatically analyzes code changes in Pull Requests and commits, providing insightful feedback using OpenAI's language models.
+The AI Code Review system is the core feature of this application. It automatically analyzes code changes in Pull Requests and commits, providing insightful feedback using pluggable LLM providers via the LLM Gateway.
 ## Key Components
@@ -36,7 +36,7 @@ The AI Code Review system is the core feature of this application. It automatica
 - Generate AI prompts with context
 3. **AI Review**:
-- Send processed data to OpenAI API
+- Route request through LLM Gateway to configured provider
 - Generate summary feedback
 - Generate line-level comments
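
As a sketch of how this step might consume the gateway (the `ChatFn` signature is assumed to match the illustrative gateway sketch earlier; prompts are placeholders):

```typescript
// Hypothetical step 3: the engine requests completions by role, never by
// vendor. ChatFn mirrors the illustrative LLMGateway sketch shown earlier.
type Msg = { role: "system" | "user"; content: string };
type ChatFn = (role: "specialist" | "judge", messages: Msg[]) => Promise<string>;

async function reviewChanges(chat: ChatFn, diff: string) {
  // Summary feedback from whichever model the "judge" role maps to.
  const summary = await chat("judge", [
    { role: "system", content: "You are a rigorous code reviewer." },
    { role: "user", content: `Summarize the risks in this diff:\n${diff}` },
  ]);
  // Line-level comments from the "specialist" role's configured model.
  const comments = await chat("specialist", [
    { role: "user", content: `Give line-level review comments for:\n${diff}` },
  ]);
  return { summary, comments };
}
```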

View File

@@ -13,10 +13,10 @@ The application is configured through environment variables, defined in [src/con
 - `GITEA_API_URL`: Gitea API endpoint URL
 - `GITEA_ACCESS_TOKEN`: Access token for Gitea API
-- **OpenAI Configuration**:
-- `OPENAI_BASE_URL`: OpenAI API base URL
-- `OPENAI_API_KEY`: API key for OpenAI
-- `OPENAI_MODEL`: Model to use (e.g., "gpt-4o")
+- **LLM Provider Configuration**:
+- Configured exclusively through the Admin Dashboard Web UI
+- Supports OpenAI Compatible, OpenAI Responses API, Anthropic, Google Gemini
+- API keys stored encrypted (AES-256-GCM) in SQLite database
 - **Server Configuration**:
 - `PORT`: Server port (default: 3000)
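
The described storage scheme could be implemented with `node:crypto` (available in Bun) roughly as follows; the master-key source and record format are assumptions, not the actual `src/crypto` code:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Assumption: a 32-byte master key is provided out of band, e.g. via env.
const masterKey = Buffer.from(process.env.MASTER_KEY_HEX ?? "00".repeat(32), "hex");

// Encrypts an API key; the record stores iv:tag:ciphertext, hex-encoded.
export function encryptApiKey(plaintext: string): string {
  const iv = randomBytes(12); // 96-bit nonce, the conventional GCM size
  const cipher = createCipheriv("aes-256-gcm", masterKey, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag(); // 16-byte authentication tag
  return [iv, tag, ciphertext].map((b) => b.toString("hex")).join(":");
}

export function decryptApiKey(record: string): string {
  const [ivHex, tagHex, dataHex] = record.split(":");
  const decipher = createDecipheriv("aes-256-gcm", masterKey, Buffer.from(ivHex, "hex"));
  decipher.setAuthTag(Buffer.from(tagHex, "hex")); // verified on final()
  return Buffer.concat([
    decipher.update(Buffer.from(dataHex, "hex")),
    decipher.final(),
  ]).toString("utf8");
}
```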
@@ -53,7 +53,7 @@ docker run -p 3000:3000 --env-file .env gitea-assistant:latest
 ### Kubernetes Deployment
-The [kubernetes.yaml](mdc:kubernetes.yaml) and [kubernetes.yaml.template](mdc:kubernetes.yaml.template) files provide Kubernetes deployment configuration.
+The [kubernetes.yaml](mdc:k8s/gitea-assistant.yaml) file provides Kubernetes deployment configuration. Persistent storage is required for the `/app/data` directory.
 Deployment can be managed using:
 ```bash

View File

@@ -31,11 +31,16 @@ When contributing to this project, adhere to these structural guidelines:
 - External API interactions belong in the [services/](mdc:src/services) directory
 - Each service should have a clear, single responsibility
-3. **Configuration**:
-- Environment-based configurations go in [config/index.ts](mdc:src/config/index.ts)
-- Use environment variables for configurable values
+3. **LLM Layer**:
+- LLM provider adapters in [llm/](mdc:src/llm)
+- Database layer in [db/](mdc:src/db)
+- Encryption utilities in [crypto/](mdc:src/crypto)
-4. **Utils**:
+4. **Configuration**:
+- Environment-based configurations go in [config/index.ts](mdc:src/config/index.ts)
+- LLM provider settings are managed through Web UI + SQLite DB
+5. **Utils**:
 - Reusable utility functions belong in [utils/](mdc:src/utils)
 - Logging should use the custom logger from [utils/logger.ts](mdc:src/utils/logger.ts)