refactor(agent): remove agent gateway in favor of twilight sdk (#264)

* refactor(agent): replace TypeScript agent gateway with in-process Go agent using twilight-ai SDK

- Remove apps/agent (Bun/Elysia gateway), packages/agent (@memoh/agent),
  internal/bun runtime manager, and all embedded agent/bun assets
- Add internal/agent package powered by the twilight-ai SDK for LLM calls,
  tool execution, streaming, sentinel logic, tag extraction, and prompts
- Integrate ToolGatewayService in-process for both built-in and user MCP
  tools, eliminating HTTP round-trips to the old gateway
- Update resolver to convert between sdk.Message and ModelMessage at the
  boundary (resolver_messages.go), keeping agent package free of
  persistence concerns
- Prepend user message before storeRound since SDK only returns output
  messages (assistant + tool)
- Clean up all Docker configs, TOML configs, nginx proxy, Dockerfile.agent,
  and Go config structs related to the removed agent gateway
- Update cmd/agent and cmd/memoh entry points with setter-based
  ToolGateway injection to avoid FX dependency cycles
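As a rough sketch of the boundary conversion and user-message prepending described above (the type and field names here are assumptions; the real `sdk.Message` and `ModelMessage` types live in twilight-ai and resolver_messages.go):

```go
package main

import "fmt"

// ModelMessage is a stand-in for the persistence-layer message type.
type ModelMessage struct {
	Role    string
	Content string
}

// sdkMessage is a stand-in for twilight-ai's sdk.Message.
type sdkMessage struct {
	Role string
	Text string
}

// fromSDK converts SDK output messages back to the persistence shape,
// keeping the agent package free of persistence concerns.
func fromSDK(in []sdkMessage) []ModelMessage {
	out := make([]ModelMessage, 0, len(in))
	for _, m := range in {
		out = append(out, ModelMessage{Role: m.Role, Content: m.Text})
	}
	return out
}

func main() {
	// The SDK only returns output messages (assistant + tool), so the
	// triggering user message is prepended before the round is stored.
	user := ModelMessage{Role: "user", Content: "hello"}
	output := fromSDK([]sdkMessage{{Role: "assistant", Text: "hi there"}})
	round := append([]ModelMessage{user}, output...)
	fmt.Println(len(round), round[0].Role, round[1].Role)
}
```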

* fix(web): move form declaration before computed properties that reference it

The `form` reactive object was declared after computed properties like
`selectedMemoryProvider` and `isSelectedMemoryProviderPersisted` that
reference it, causing a TDZ ReferenceError during setup.

* fix: prevent UTF-8 character corruption in streaming text output

StreamTagExtractor.Push() used byte-level string slicing to hold back
buffer tails for tag detection, which could split multi-byte UTF-8
characters. After json.Marshal replaced invalid bytes with U+FFFD,
the corruption became permanent — causing garbled CJK characters (�)
in agent responses.

Add safeUTF8SplitIndex() to back up split points to valid character
boundaries. Also fix byte-level truncation in command/formatter.go
and command/fs.go to use rune-aware slicing.
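The boundary-backup idea can be sketched as follows; the function name comes from the commit, but the body here is an illustrative implementation, not the actual one:

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// safeUTF8SplitIndex backs idx up to the nearest rune boundary in s, so a
// byte-level split never lands inside a multi-byte UTF-8 character.
func safeUTF8SplitIndex(s string, idx int) int {
	if idx >= len(s) {
		return len(s)
	}
	// Continuation bytes (10xxxxxx) are not rune starts; walk back past them.
	for idx > 0 && !utf8.RuneStart(s[idx]) {
		idx--
	}
	return idx
}

func main() {
	s := "héllo" // 'é' occupies two bytes (0xC3 0xA9)
	fmt.Println(safeUTF8SplitIndex(s, 2)) // 2 falls inside 'é'; backs up to 1
	fmt.Println(safeUTF8SplitIndex(s, 3)) // 3 is already a rune boundary
}
```

Splitting at the returned index keeps both halves valid UTF-8, so a later `json.Marshal` never has to substitute U+FFFD.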

* fix: add agent error logging and fix Gemini tool schema validation

- Log agent stream errors in both SSE and WebSocket paths with bot/model context
- Fix send tool `attachments` parameter: empty `items` schema rejected by
  Google Gemini API (INVALID_ARGUMENT), now specifies `{"type": "string"}`
- Upgrade twilight-ai to d898f0b (includes raw body in API error messages)

* chore(ci): remove agent gateway from Docker build and release pipelines

Agent gateway has been replaced by in-process Go agent; remove the
obsolete Docker image matrix entry, Bun/UPX CI steps, and agent-binary
build logic from the release script.

* fix: preserve attachment filename, metadata, and container path through persistence

- Add `name` column to `bot_history_message_assets` (migration 0034) to
  persist original filenames across page refreshes.
- Add `metadata` JSONB column (migration 0035) to store source_path,
  source_url, and other context alongside each asset.
- Update SQL queries, sqlc-generated code, and all Go types (MessageAsset,
  AssetRef, OutboundAssetRef, FileAttachment) to carry name and metadata
  through the full lifecycle.
- Extract filenames from path/URL in AttachmentsResolver before clearing
  raw paths; enrich streaming event metadata with name, source_path, and
  source_url in both the WebSocket and channel inbound ingestion paths.
- Implement `LinkAssets` on message service and `LinkOutboundAssets` on
  flow resolver so WebSocket-streamed bot attachments are persisted to the
  correct assistant message after streaming completes.
- Frontend: update MessageAsset type with metadata field, pass metadata
  through to attachment items, and reorder attachment-block.vue template
  so container files (identified by metadata.source_path) open in the
  sidebar file manager instead of triggering a download.

* refactor(agent): decouple built-in tools from MCP, load via ToolProvider interface

Migrate all 13 built-in tool providers from internal/mcp/providers/ to
internal/agent/tools/ using the twilight-ai sdk.Tool structure. The agent
now loads tools through a ToolProvider interface instead of the MCP
ToolGatewayService, which is simplified to only manage external federation
sources. This enables selective tool loading and removes the coupling
between business tools and the MCP protocol layer.
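The decoupling can be pictured with a minimal interface sketch; `Tool` here stands in for twilight-ai's `sdk.Tool`, and the interface shape is illustrative rather than the exact definition in internal/agent/tools:

```go
package main

import "fmt"

// Tool is a stand-in for twilight-ai's sdk.Tool.
type Tool struct {
	Name string
}

// ToolProvider lets the agent load tools directly, without routing
// built-ins through the MCP ToolGatewayService; the gateway now only
// manages external federation sources.
type ToolProvider interface {
	Tools() []Tool
}

// sendProvider is a hypothetical built-in provider.
type sendProvider struct{}

func (sendProvider) Tools() []Tool { return []Tool{{Name: "send"}} }

func main() {
	providers := []ToolProvider{sendProvider{}}
	for _, p := range providers { // selective loading: pick providers per agent
		for _, t := range p.Tools() {
			fmt.Println(t.Name)
		}
	}
}
```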

* refactor(flow): split monolithic resolver.go into focused modules

Break the 1959-line resolver.go into 12 files organized by concern:
- resolver.go: core orchestration (Resolver struct, resolve, Chat, prepareRunConfig)
- resolver_stream.go: streaming (StreamChat, StreamChatWS, tryStoreStream)
- resolver_trigger.go: schedule/heartbeat triggers
- resolver_attachments.go: attachment routing, inlining, encoding
- resolver_history.go: message loading, deduplication, token trimming
- resolver_store.go: persistence (storeRound, storeMessages, asset linking)
- resolver_memory.go: memory provider integration
- resolver_model_selection.go: model selection and candidate matching
- resolver_identity.go: display name and channel identity resolution
- resolver_settings.go: bot settings, loop detection, inbox
- user_header.go: YAML front-matter formatting
- resolver_util.go: shared utilities (sanitize, normalize, dedup, UUID)

* fix(agent): enable Anthropic extended thinking by passing ReasoningConfig to provider

Anthropic's thinking requires WithThinking() at provider creation time,
unlike OpenAI which uses per-request ReasoningEffort. The config was
never wired through, so Claude models could not trigger thinking.
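The construction-time vs per-request distinction can be sketched with a functional-options pattern; the option and field names below are illustrative, not the actual twilight-ai API:

```go
package main

import "fmt"

// provider models an Anthropic-style client: thinking is fixed when the
// provider is constructed, via an option like WithThinking().
type provider struct {
	thinkingEnabled bool
}

type option func(*provider)

func withThinking() option {
	return func(p *provider) { p.thinkingEnabled = true }
}

func newProvider(opts ...option) *provider {
	p := &provider{}
	for _, o := range opts {
		o(p)
	}
	return p
}

// request models an OpenAI-style call, where reasoning effort can vary
// per request instead of being baked into the provider.
type request struct {
	reasoningEffort string
}

func main() {
	p := newProvider(withThinking()) // must be decided up front for Claude
	r := request{reasoningEffort: "high"}
	fmt.Println(p.thinkingEnabled, r.reasoningEffort)
}
```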

* refactor(agent): extract prompts into embedded markdown templates

Move inline prompt strings from prompt.go into separate .md files under
internal/agent/prompts/, using {{key}} placeholders and a simple render
engine. Remove obsolete SystemPromptParams fields (Language,
MaxContextLoadTime, Channels, CurrentChannel) and their call-site usage.
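A minimal sketch of the placeholder substitution the commit describes; the real render engine in internal/agent/prompts may differ:

```go
package main

import (
	"fmt"
	"strings"
)

// render substitutes {{key}} placeholders in an embedded markdown template.
func render(tmpl string, vars map[string]string) string {
	for k, v := range vars {
		tmpl = strings.ReplaceAll(tmpl, "{{"+k+"}}", v)
	}
	return tmpl
}

func main() {
	fmt.Println(render("Hello, {{name}}!", map[string]string{"name": "Memoh"}))
}
```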

* fix: lint

This commit is contained in:
Acbox Liu, 2026-03-19 13:31:54 +08:00 (committed by GitHub)
parent ef333ae516
commit 1680316c7f
169 changed files with 7988 additions and 14436 deletions
+3 -165
@@ -9,17 +9,8 @@ COMMIT_HASH="${COMMIT_HASH:-unknown}"
BUILD_TIME="${BUILD_TIME:-$(date -u +"%Y-%m-%dT%H:%M:%SZ")}"
OUTPUT_DIR="${OUTPUT_DIR:-$ROOT_DIR/dist}"
PREPARE_ASSETS_ONLY="false"
UPX_COMPRESS_AGENT_BIN="${UPX_COMPRESS_AGENT_BIN:-false}"
UPX_ARGS="${UPX_ARGS:--3}"
UPX_ALLOW_DARWIN="${UPX_ALLOW_DARWIN:-false}"
AUTO_INSTALL_UPX="${AUTO_INSTALL_UPX:-}"
if [[ -z "$AUTO_INSTALL_UPX" ]]; then
AUTO_INSTALL_UPX=$([[ "${GITHUB_ACTIONS:-}" == "true" ]] && echo "true" || echo "false")
fi
WEB_DIR="$ROOT_DIR/internal/embedded/web"
AGENT_DIR="$ROOT_DIR/internal/embedded/agent"
BUN_DIR="$ROOT_DIR/internal/embedded/bun"
log() {
echo "[release] $*"
@@ -36,9 +27,6 @@ Options:
--commit-hash <sha> Commit hash injected into memoh binary
--output-dir <dir> Output directory for release artifacts
--prepare-assets Only prepare embedded assets, do not build archive
Compatibility options:
--bun-version <v> Deprecated; ignored (kept for backward compatibility)
EOF
}
@@ -53,10 +41,6 @@ parse_args() {
TARGET_ARCH="$2"
shift 2
;;
--bun-version)
# Bun runtime archives are no longer embedded; keep arg for compatibility.
shift 2
;;
--version)
VERSION="$2"
shift 2
@@ -91,23 +75,10 @@ write_keep_gitignore() {
printf "*\n!.gitignore\n" > "$dir/.gitignore"
}
resolve_agent_compile_target() {
case "${TARGET_OS}-${TARGET_ARCH}" in
linux-amd64) echo "bun-linux-x64|agent-bin" ;;
linux-arm64) echo "bun-linux-arm64|agent-bin" ;;
darwin-amd64) echo "bun-darwin-x64|agent-bin" ;;
darwin-arm64) echo "bun-darwin-arm64|agent-bin" ;;
windows-amd64) echo "bun-windows-x64|agent-bin.exe" ;;
*) echo "|" ;;
esac
}
prepare_embed_dirs() {
rm -rf "$WEB_DIR" "$AGENT_DIR" "$BUN_DIR"
mkdir -p "$WEB_DIR" "$AGENT_DIR" "$BUN_DIR"
rm -rf "$WEB_DIR"
mkdir -p "$WEB_DIR"
write_keep_gitignore "$WEB_DIR"
write_keep_gitignore "$AGENT_DIR"
write_keep_gitignore "$BUN_DIR"
}
prepare_assets() {
@@ -118,140 +89,7 @@ prepare_assets() {
cp -R "$ROOT_DIR/apps/web/dist/." "$WEB_DIR/"
gzip_embedded_web_assets "$WEB_DIR"
local target_key="${TARGET_OS}-${TARGET_ARCH}"
local resolved bun_compile_target agent_bin_name
resolved="$(resolve_agent_compile_target)"
bun_compile_target="${resolved%%|*}"
agent_bin_name="${resolved##*|}"
if [[ -z "$bun_compile_target" || -z "$agent_bin_name" ]]; then
echo "agent-bin not available for ${target_key}" > "$AGENT_DIR/UNAVAILABLE"
log "skipped agent-bin compile for unsupported target ${target_key}"
return 0
fi
log "building agent executable (${bun_compile_target})"
patch_jsdom_style_loader_for_compile
trap 'restore_jsdom_style_loader_patch' RETURN
(
cd "$ROOT_DIR/apps/agent"
bun build src/index.ts --compile --target "$bun_compile_target" --outfile "$AGENT_DIR/$agent_bin_name"
)
restore_jsdom_style_loader_patch
trap - RETURN
chmod +x "$AGENT_DIR/$agent_bin_name" || true
compress_agent_bin_if_enabled "$AGENT_DIR/$agent_bin_name" "$TARGET_OS"
log "embedded assets prepared (${target_key})"
}
JSDOM_STYLE_RULES_FILE=""
JSDOM_STYLE_RULES_BACKUP=""
JSDOM_XHR_IMPL_FILE=""
JSDOM_XHR_IMPL_BACKUP=""
patch_jsdom_style_loader_for_compile() {
local css_path css_json
JSDOM_STYLE_RULES_FILE="$(node -e "try{process.stdout.write(require.resolve('jsdom/lib/jsdom/living/helpers/style-rules.js',{paths:['$ROOT_DIR/apps/agent']}))}catch{process.exit(1)}" 2>/dev/null || true)"
css_path="$(node -e "try{process.stdout.write(require.resolve('jsdom/lib/jsdom/browser/default-stylesheet.css',{paths:['$ROOT_DIR/apps/agent']}))}catch{process.exit(1)}" 2>/dev/null || true)"
JSDOM_XHR_IMPL_FILE="$(node -e "try{process.stdout.write(require.resolve('jsdom/lib/jsdom/living/xhr/XMLHttpRequest-impl.js',{paths:['$ROOT_DIR/apps/agent']}))}catch{process.exit(1)}" 2>/dev/null || true)"
if [[ -z "$JSDOM_STYLE_RULES_FILE" || -z "$css_path" || -z "$JSDOM_XHR_IMPL_FILE" ]]; then
log "skip jsdom patch (jsdom sources not resolved)"
return 0
fi
JSDOM_STYLE_RULES_BACKUP="${JSDOM_STYLE_RULES_FILE}.memoh.bak"
JSDOM_XHR_IMPL_BACKUP="${JSDOM_XHR_IMPL_FILE}.memoh.bak"
cp "$JSDOM_STYLE_RULES_FILE" "$JSDOM_STYLE_RULES_BACKUP"
cp "$JSDOM_XHR_IMPL_FILE" "$JSDOM_XHR_IMPL_BACKUP"
css_json="$(node -e "const fs=require('fs');process.stdout.write(JSON.stringify(fs.readFileSync(process.argv[1],'utf8')))" "$css_path")"
node - "$JSDOM_STYLE_RULES_FILE" "$css_json" <<'NODE'
const fs = require("fs");
const file = process.argv[2];
const css = process.argv[3];
let src = fs.readFileSync(file, "utf8");
const pattern = /const defaultStyleSheet = fs\.readFileSync\([\s\S]*?\);\n/;
if (!pattern.test(src)) {
console.error("[release] jsdom patch target not found");
process.exit(1);
}
src = src.replace(pattern, `const defaultStyleSheet = ${css};\n`);
fs.writeFileSync(file, src, "utf8");
NODE
node - "$JSDOM_XHR_IMPL_FILE" <<'NODE'
const fs = require("fs");
const file = process.argv[2];
let src = fs.readFileSync(file, "utf8");
const pattern = /const syncWorkerFile = require\.resolve \? require\.resolve\("\.\/xhr-sync-worker\.js"\) : null;/;
if (!pattern.test(src)) {
console.error("[release] jsdom xhr patch target not found");
process.exit(1);
}
src = src.replace(pattern, 'const syncWorkerFile = `${__dirname}/xhr-sync-worker.js`;');
fs.writeFileSync(file, src, "utf8");
NODE
log "patched jsdom style loader for compile-time embedding"
}
restore_jsdom_style_loader_patch() {
if [[ -n "$JSDOM_STYLE_RULES_BACKUP" && -f "$JSDOM_STYLE_RULES_BACKUP" && -n "$JSDOM_STYLE_RULES_FILE" ]]; then
mv "$JSDOM_STYLE_RULES_BACKUP" "$JSDOM_STYLE_RULES_FILE"
fi
if [[ -n "$JSDOM_XHR_IMPL_BACKUP" && -f "$JSDOM_XHR_IMPL_BACKUP" && -n "$JSDOM_XHR_IMPL_FILE" ]]; then
mv "$JSDOM_XHR_IMPL_BACKUP" "$JSDOM_XHR_IMPL_FILE"
fi
log "restored jsdom compile-time patches"
}
compress_agent_bin_if_enabled() {
local bin_path="$1"
local target_os="$2"
if [[ "$UPX_COMPRESS_AGENT_BIN" != "true" ]]; then
return 0
fi
ensure_upx_available
if [[ "$target_os" == "darwin" && "$UPX_ALLOW_DARWIN" != "true" ]]; then
log "skip upx on darwin (set UPX_ALLOW_DARWIN=true to force)"
return 0
fi
local before_bytes after_bytes
before_bytes="$(wc -c < "$bin_path" | tr -d ' ')"
read -r -a upx_flags <<< "$UPX_ARGS"
upx "${upx_flags[@]}" "$bin_path"
after_bytes="$(wc -c < "$bin_path" | tr -d ' ')"
log "upx compressed agent-bin: ${before_bytes} -> ${after_bytes} bytes"
}
ensure_upx_available() {
if command -v upx >/dev/null 2>&1; then
return 0
fi
if [[ "$AUTO_INSTALL_UPX" != "true" ]]; then
echo "[release] UPX_COMPRESS_AGENT_BIN=true but upx not found in PATH" >&2
echo "[release] install upx or set AUTO_INSTALL_UPX=true" >&2
exit 1
fi
log "upx not found; attempting auto-install"
if [[ "$OSTYPE" == linux* ]] && command -v apt-get >/dev/null 2>&1; then
if command -v sudo >/dev/null 2>&1; then
sudo apt-get update -y && sudo apt-get install -y upx-ucl
else
apt-get update -y && apt-get install -y upx-ucl
fi
elif [[ "$OSTYPE" == darwin* ]] && command -v brew >/dev/null 2>&1; then
brew install upx
fi
if ! command -v upx >/dev/null 2>&1; then
echo "[release] failed to auto-install upx" >&2
exit 1
fi
log "embedded assets prepared"
}
gzip_embedded_web_assets() {