feat: openai codex support (#292)

* feat(web): add provider oauth management ui

* feat: add OAuth callback support on port 1455

* feat: enhance reasoning effort options and support for OpenAI Codex OAuth

* feat: update twilight-ai dependency to v0.3.4

* refactor: promote openai-codex to first-class client_type, remove auth_type

Replace the previous openai-responses + metadata auth_type=openai-codex-oauth
combo with a dedicated openai-codex client_type. OAuth requirement is now
determined solely by client_type, eliminating the auth_type concept from the
LLM provider domain entirely.

- Add openai-codex to DB CHECK constraint (migration 0047) with data migration
- Add ClientTypeOpenAICodex constant and dedicated SDK/probe branches
- Remove AuthType from SDKModelConfig, ModelCredentials, TriggerConfig, etc.
- Simplify supportsOAuth to check client_type == openai-codex
- Add conf/providers/codex.yaml preset with Codex catalog models
- Frontend: replace auth_type selector with client_type-driven OAuth UI
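The simplified OAuth check described above can be sketched as follows (constant and function names are assumed from the commit message; the actual implementation in the repo may differ):

```go
package main

import "fmt"

// Client types in the LLM provider domain (constant names assumed).
const (
	ClientTypeOpenAIResponses = "openai-responses"
	ClientTypeOpenAICodex     = "openai-codex"
)

// supportsOAuth reports whether a provider requires OAuth. After the
// refactor this is driven solely by client_type; the former
// auth_type=openai-codex-oauth metadata is gone entirely.
func supportsOAuth(clientType string) bool {
	return clientType == ClientTypeOpenAICodex
}

func main() {
	fmt.Println(supportsOAuth(ClientTypeOpenAICodex))     // true
	fmt.Println(supportsOAuth(ClientTypeOpenAIResponses)) // false
}
```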

---------

Co-authored-by: Acbox <acbox0328@gmail.com>
Yiming Qi committed via GitHub, 2026-03-27 19:30:45 +08:00
Co-authored-by: Acbox
parent 44c92f198b
commit 64378d29ed
44 changed files with 1663 additions and 160 deletions
@@ -1,6 +1,9 @@
 package compaction
 
-import "time"
+import (
+	"net/http"
+	"time"
+)
 
 // Log represents a compaction log entry.
 type Log struct {
@@ -24,10 +27,12 @@ type ListLogsResponse struct {
 // TriggerConfig holds the parameters needed to trigger a compaction.
 type TriggerConfig struct {
-	BotID      string
-	SessionID  string
-	ModelID    string
-	ClientType string
-	APIKey     string //nolint:gosec // runtime credential, not a hardcoded secret
-	BaseURL    string
+	BotID          string
+	SessionID      string
+	ModelID        string
+	ClientType     string
+	APIKey         string //nolint:gosec // runtime credential, not a hardcoded secret
+	CodexAccountID string
+	BaseURL        string
+	HTTPClient     *http.Client
 }