feat: openai codex support (#292)

* feat(web): add provider oauth management ui

* feat: add OAuth callback support on port 1455

* feat: enhance reasoning effort options and support for OpenAI Codex OAuth

* feat: update twilight-ai dependency to v0.3.4

* refactor: promote openai-codex to first-class client_type, remove auth_type

Replace the previous openai-responses + metadata auth_type=openai-codex-oauth
combo with a dedicated openai-codex client_type. OAuth requirement is now
determined solely by client_type, eliminating the auth_type concept from the
LLM provider domain entirely.
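The simplified OAuth check this describes can be sketched as follows. The constant names, package, and function signature are assumptions modeled on the commit's ClientTypeOpenAICodex; the repository's actual Go code may differ:

```go
package main

import "fmt"

// Hypothetical client_type constants; ClientTypeOpenAICodex follows the
// naming in the commit message, the rest are assumed for illustration.
const (
	ClientTypeOpenAIResponses = "openai-responses"
	ClientTypeOpenAICodex     = "openai-codex"
)

// supportsOAuth is now a pure function of client_type: the openai-codex
// client always requires OAuth, so no separate auth_type field is needed.
func supportsOAuth(clientType string) bool {
	return clientType == ClientTypeOpenAICodex
}

func main() {
	fmt.Println(supportsOAuth(ClientTypeOpenAICodex))     // true
	fmt.Println(supportsOAuth(ClientTypeOpenAIResponses)) // false
}
```

Because the check is a single equality on a persisted column, both backend and frontend can derive the OAuth UI state from client_type alone, with no second field to keep in sync.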

- Add openai-codex to DB CHECK constraint (migration 0047) with data migration
- Add ClientTypeOpenAICodex constant and dedicated SDK/probe branches
- Remove AuthType from SDKModelConfig, ModelCredentials, TriggerConfig, etc.
- Simplify supportsOAuth to check client_type == openai-codex
- Add conf/providers/codex.yaml preset with Codex catalog models
- Frontend: replace auth_type selector with client_type-driven OAuth UI

---------

Co-authored-by: Acbox <acbox0328@gmail.com>
Yiming Qi, 2026-03-27 19:30:45 +08:00 (committed by GitHub)
parent 44c92f198b
commit 64378d29ed
44 changed files with 1663 additions and 160 deletions
conf/providers/codex.yaml
@@ -0,0 +1,41 @@
name: OpenAI Codex
client_type: openai-codex
icon: openai
base_url: https://chatgpt.com/backend-api
models:
  - model_id: gpt-5.2
    name: GPT-5.2
    type: chat
    config:
      compatibilities: [tool-call, reasoning]
  - model_id: gpt-5.2-codex
    name: GPT-5.2 Codex
    type: chat
    config:
      compatibilities: [tool-call, reasoning]
  - model_id: gpt-5.1-codex
    name: GPT-5.1 Codex
    type: chat
    config:
      compatibilities: [tool-call, reasoning]
  - model_id: gpt-5.1-codex-max
    name: GPT-5.1 Codex Max
    type: chat
    config:
      compatibilities: [tool-call, reasoning]
  - model_id: gpt-5.1-codex-mini
    name: GPT-5.1 Codex Mini
    type: chat
    config:
      compatibilities: [tool-call, reasoning]
  - model_id: gpt-5.1
    name: GPT-5.1
    type: chat
    config:
      compatibilities: [tool-call, reasoning]