feat: openai codex support (#292)

* feat(web): add provider oauth management ui

* feat: add OAuth callback support on port 1455

* feat: enhance reasoning effort options and support for OpenAI Codex OAuth

* feat: update twilight-ai dependency to v0.3.4

* refactor: promote openai-codex to first-class client_type, remove auth_type

Replace the previous openai-responses + metadata auth_type=openai-codex-oauth
combo with a dedicated openai-codex client_type. OAuth requirement is now
determined solely by client_type, eliminating the auth_type concept from the
LLM provider domain entirely.

- Add openai-codex to DB CHECK constraint (migration 0047) with data migration
- Add ClientTypeOpenAICodex constant and dedicated SDK/probe branches
- Remove AuthType from SDKModelConfig, ModelCredentials, TriggerConfig, etc.
- Simplify supportsOAuth to check client_type == openai-codex
- Add conf/providers/codex.yaml preset with Codex catalog models
- Frontend: replace auth_type selector with client_type-driven OAuth UI

---------

Co-authored-by: Acbox <acbox0328@gmail.com>
Author: Yiming Qi
Date: 2026-03-27 19:30:45 +08:00
Committed by: GitHub
Parent: 44c92f198b
Commit: 64378d29ed
44 changed files with 1663 additions and 160 deletions
@@ -47,6 +47,7 @@ services:
- /etc/localtime:/etc/localtime:ro
ports:
- "8080:8080"
- "1455:8080"
depends_on:
migrate:
condition: service_completed_successfully