From 1680316c7f5682a79a4b99504636eaa632c53830 Mon Sep 17 00:00:00 2001
From: Acbox Liu
Date: Thu, 19 Mar 2026 13:31:54 +0800
Subject: [PATCH] refactor(agent): remove agent gateway in favor of twilight
 sdk (#264)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

* refactor(agent): replace TypeScript agent gateway with in-process Go agent using twilight-ai SDK

  - Remove apps/agent (Bun/Elysia gateway), packages/agent (@memoh/agent), the internal/bun runtime manager, and all embedded agent/bun assets
  - Add an internal/agent package powered by the twilight-ai SDK for LLM calls, tool execution, streaming, sential logic, tag extraction, and prompts
  - Integrate ToolGatewayService in-process for both built-in and user MCP tools, eliminating HTTP round-trips to the old gateway
  - Update the resolver to convert between sdk.Message and ModelMessage at the boundary (resolver_messages.go), keeping the agent package free of persistence concerns
  - Prepend the user message before storeRound, since the SDK only returns output messages (assistant + tool)
  - Clean up all Docker configs, TOML configs, the nginx proxy, Dockerfile.agent, and the Go config structs related to the removed agent gateway
  - Update the cmd/agent and cmd/memoh entry points with setter-based ToolGateway injection to avoid FX dependency cycles

* fix(web): move form declaration before computed properties that reference it

  The `form` reactive object was declared after computed properties such as `selectedMemoryProvider` and `isSelectedMemoryProviderPersisted` that reference it, causing a TDZ ReferenceError during setup.

* fix: prevent UTF-8 character corruption in streaming text output

  StreamTagExtractor.Push() used byte-level string slicing to hold back buffer tails for tag detection, which could split multi-byte UTF-8 characters. Once json.Marshal replaced the invalid bytes with U+FFFD, the corruption became permanent, causing garbled CJK characters (�) in agent responses. Add safeUTF8SplitIndex() to back up split points to valid character boundaries (a standalone sketch of the approach appears at the end of this message). Also fix byte-level truncation in command/formatter.go and command/fs.go to use rune-aware slicing.

* fix: add agent error logging and fix Gemini tool schema validation

  - Log agent stream errors in both the SSE and WebSocket paths with bot/model context
  - Fix the send tool's `attachments` parameter: an empty `items` schema is rejected by the Google Gemini API (INVALID_ARGUMENT); it now specifies `{"type": "string"}`
  - Upgrade twilight-ai to d898f0b (includes the raw body in API error messages)

* chore(ci): remove agent gateway from Docker build and release pipelines

  The agent gateway has been replaced by the in-process Go agent; remove the obsolete Docker image matrix entry, the Bun/UPX CI steps, and the agent-binary build logic from the release script.

* fix: preserve attachment filename, metadata, and container path through persistence

  - Add a `name` column to `bot_history_message_assets` (migration 0034) to persist original filenames across page refreshes.
  - Add a `metadata` JSONB column (migration 0035) to store source_path, source_url, and other context alongside each asset.
  - Update the SQL queries, sqlc-generated code, and all Go types (MessageAsset, AssetRef, OutboundAssetRef, FileAttachment) to carry name and metadata through the full lifecycle.
  - Extract filenames from the path/URL in AttachmentsResolver before clearing raw paths; enrich streaming event metadata with name, source_path, and source_url in both the WebSocket and channel inbound ingestion paths.
  - Implement `LinkAssets` on the message service and `LinkOutboundAssets` on the flow resolver so WebSocket-streamed bot attachments are persisted to the correct assistant message after streaming completes.
  - Frontend: update the MessageAsset type with a metadata field, pass metadata through to attachment items, and reorder the attachment-block.vue template so container files (identified by metadata.source_path) open in the sidebar file manager instead of triggering a download.

* refactor(agent): decouple built-in tools from MCP, load via ToolProvider interface

  Migrate all 13 built-in tool providers from internal/mcp/providers/ to internal/agent/tools/ using the twilight-ai sdk.Tool structure. The agent now loads tools through a ToolProvider interface instead of the MCP ToolGatewayService, which is simplified to only manage external federation sources. This enables selective tool loading and removes the coupling between business tools and the MCP protocol layer.

* refactor(flow): split monolithic resolver.go into focused modules

  Break the 1959-line resolver.go into 12 files organized by concern:

  - resolver.go: core orchestration (Resolver struct, resolve, Chat, prepareRunConfig)
  - resolver_stream.go: streaming (StreamChat, StreamChatWS, tryStoreStream)
  - resolver_trigger.go: schedule/heartbeat triggers
  - resolver_attachments.go: attachment routing, inlining, encoding
  - resolver_history.go: message loading, deduplication, token trimming
  - resolver_store.go: persistence (storeRound, storeMessages, asset linking)
  - resolver_memory.go: memory provider integration
  - resolver_model_selection.go: model selection and candidate matching
  - resolver_identity.go: display name and channel identity resolution
  - resolver_settings.go: bot settings, loop detection, inbox
  - user_header.go: YAML front-matter formatting
  - resolver_util.go: shared utilities (sanitize, normalize, dedup, UUID)

* fix(agent): enable Anthropic extended thinking by passing ReasoningConfig to the provider

  Anthropic's thinking requires WithThinking() at provider creation time, unlike OpenAI, which uses per-request ReasoningEffort. The config was never wired through, so Claude models could not trigger thinking.

* refactor(agent): extract prompts into embedded markdown templates

  Move the inline prompt strings from prompt.go into separate .md files under internal/agent/prompts/, using {{key}} placeholders and a simple render engine. Remove the obsolete SystemPromptParams fields (Language, MaxContextLoadTime, Channels, CurrentChannel) and their call-site usage.

* fix: lint
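For reference, a minimal standalone sketch of the UTF-8 boundary backoff described in the streaming fix above. This is a hypothetical helper written for illustration; the in-tree safeUTF8SplitIndex may differ in naming and detail:

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// safeUTF8SplitIndex returns the largest index <= n that falls on a UTF-8
// rune boundary, so holding back a buffer tail never splits a multi-byte rune.
func safeUTF8SplitIndex(s string, n int) int {
	if n >= len(s) {
		return len(s)
	}
	// Back up while s[n] is a continuation byte (0b10xxxxxx),
	// i.e. the split point sits inside a multi-byte rune.
	for n > 0 && !utf8.RuneStart(s[n]) {
		n--
	}
	return n
}

func main() {
	s := "宽字符流" // each CJK rune is 3 bytes in UTF-8
	n := safeUTF8SplitIndex(s, 4)
	fmt.Println(n, s[:n]) // prints: 3 宽 (no torn rune for json.Marshal to replace with U+FFFD)
}
```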
---
 .agents/skills/twilight-ai/SKILL.md | 23 +
 .agents/skills/twilight-ai/reference.md | 36 +
 .github/workflows/docker.yml | 6 +-
 .github/workflows/release.yml | 8 -
 apps/agent/.gitignore | 44 -
 apps/agent/README.md | 1 -
 apps/agent/bun.lock | 419 ----
 apps/agent/mise.toml | 15 -
 apps/agent/package.json | 27 -
 apps/agent/src/index.ts | 94 -
 apps/agent/src/middlewares/bearer.ts | 3 -
 apps/agent/src/middlewares/cors.ts | 9 -
 apps/agent/src/middlewares/error.ts | 118 -
 apps/agent/src/models.ts | 97 -
 apps/agent/src/modules/chat.ts | 256 --
 apps/agent/src/utils/sse.ts | 50 -
 apps/agent/tsconfig.json | 103 -
 apps/web/src/composables/api/useChat.types.ts | 2 +
 .../pages/bots/components/bot-settings.vue | 29 +-
 .../chat/components/attachment-block.vue | 39 +-
 apps/web/src/store/chat-list.ts | 2 +
 cmd/agent/main.go | 99 +-
 cmd/memoh/serve.go | 106 +-
 conf/app.apple.toml | 5 -
 conf/app.docker.toml | 6 -
 conf/app.example.toml | 5 -
 conf/app.windows.toml | 5 -
 db/migrations/0001_init.up.sql | 2 +
 db/migrations/0034_asset_name.down.sql | 5 +
 db/migrations/0034_asset_name.up.sql | 5 +
 db/migrations/0035_asset_metadata.down.sql | 5 +
 db/migrations/0035_asset_metadata.up.sql | 5 +
 db/queries/media.sql | 14 +-
 devenv/app.dev.toml | 5 -
 devenv/docker-compose.yml | 19 -
 docker-compose.yml | 15 -
 docker/Dockerfile.agent | 41 -
 docker/Dockerfile.web | 2 -
 docker/docker-compose.cn.yml | 2 -
 docker/docker-compose.yml | 6 -
 docker/nginx.conf | 5 -
 go.mod | 7 +-
 go.sum | 10 +-
 internal/agent/agent.go | 524 ++++
 internal/agent/config.go | 13 +
 internal/agent/fs.go | 73 +
 internal/agent/model.go | 130 +
 internal/agent/prompt.go | 175 ++
 internal/agent/prompts/heartbeat.md | 10 +
 internal/agent/prompts/schedule.md | 9 +
 internal/agent/prompts/subagent.md | 5 +
 internal/agent/prompts/system.md | 209 ++
 internal/agent/sential.go | 487 ++++
 internal/agent/stream.go | 48 +
 internal/agent/tags.go | 275 +++
 .../provider.go => agent/tools/browser.go} | 248 +-
 internal/agent/tools/contacts.go | 94 +
 internal/agent/tools/container.go | 284 +++
 internal/agent/tools/email.go | 273 +++
 internal/agent/tools/federation.go | 85 +
 .../container => agent/tools}/fsops.go | 51 +-
 internal/agent/tools/inbox.go | 114 +
 internal/agent/tools/memory.go | 126 +
 internal/agent/tools/message.go | 399 ++++
 internal/agent/tools/prune.go | 32 +
 internal/agent/tools/schedule.go | 246 ++
 internal/agent/tools/skill.go | 59 +
 internal/agent/tools/subagent.go | 300 +++
 internal/agent/tools/tts.go | 153 ++
 internal/agent/tools/types.go | 125 +
 internal/agent/tools/web.go | 910 +++++++
 internal/agent/tools/webfetch.go | 182 ++
 internal/agent/types.go | 158 ++
 internal/bun/runtime/manager.go | 259 --
 internal/channel/inbound/channel.go | 44 +-
 internal/command/formatter.go | 9 +-
 internal/command/fs.go | 7 +-
 internal/config/config.go | 22 -
 internal/conversation/flow/resolver.go | 2104 +----------------
 .../conversation/flow/resolver_attachments.go | 255 ++
 .../conversation/flow/resolver_history.go | 160 ++
 .../conversation/flow/resolver_identity.go | 88 +
 internal/conversation/flow/resolver_memory.go | 97 +
 .../conversation/flow/resolver_messages.go | 84 +
 .../flow/resolver_model_selection.go | 119 +
 .../conversation/flow/resolver_settings.go | 69 +
 .../conversation/flow/resolver_skills_test.go | 32 -
 internal/conversation/flow/resolver_store.go | 238 ++
 internal/conversation/flow/resolver_stream.go | 170 ++
 .../flow/resolver_stream_order_test.go | 142 --
 internal/conversation/flow/resolver_test.go | 259 --
 .../conversation/flow/resolver_trigger.go | 126 +
 internal/conversation/flow/resolver_util.go | 200 ++
 internal/conversation/flow/user_header.go | 123 +
 internal/conversation/types.go | 2 +
 internal/db/sqlc/media.sql.go | 30 +-
 internal/db/sqlc/models.go | 2 +
 internal/embedded/agent/.gitignore | 2 -
 internal/embedded/assets.go | 6 +-
 internal/embedded/bun/.gitignore | 2 -
 internal/handlers/local_channel.go | 99 +-
 internal/handlers/mcp_tools_test.go | 2 +-
 internal/mcp/providers/contacts/provider.go | 108 -
 .../mcp/providers/container/fsops_test.go | 100 -
 internal/mcp/providers/container/provider.go | 324 ---
 .../mcp/providers/container/provider_test.go | 516 ----
 internal/mcp/providers/container/prune.go | 38 -
 internal/mcp/providers/email/provider.go | 303 ---
 internal/mcp/providers/inbox/provider.go | 149 --
 internal/mcp/providers/memory/provider.go | 75 -
 .../mcp/providers/memory/provider_test.go | 148 --
 internal/mcp/providers/message/provider.go | 4 +-
 .../mcp/providers/message/provider_test.go | 583 -----
 internal/mcp/providers/schedule/provider.go | 259 --
 .../mcp/providers/schedule/provider_test.go | 374 ---
 internal/mcp/providers/skill/provider.go | 72 -
 internal/mcp/providers/subagent/provider.go | 358 ---
 internal/mcp/providers/tts/provider.go | 199 --
 internal/mcp/providers/web/provider.go | 1166 ---------
 internal/mcp/providers/webfetch/provider.go | 219 --
 internal/mcp/tool_gateway_service.go | 47 +-
 internal/mcp/tool_gateway_service_test.go | 8 +-
 internal/mcp/tool_registry.go | 18 +-
 internal/mcp/tool_types.go | 11 +-
 internal/message/service.go | 63 +-
 internal/message/types.go | 29 +-
 mise.toml | 4 +-
 packages/agent/.gitignore | 34 -
 packages/agent/README.md | 2 -
 packages/agent/package.json | 29 -
 packages/agent/src/agent.test.ts | 88 -
 packages/agent/src/agent.ts | 798 -------
 packages/agent/src/index.ts | 6 -
 packages/agent/src/model.ts | 24 -
 packages/agent/src/prompts/heartbeat.ts | 36 -
 packages/agent/src/prompts/index.ts | 5 -
 packages/agent/src/prompts/schedule.ts | 24 -
 packages/agent/src/prompts/subagent.ts | 21 -
 packages/agent/src/prompts/system.ts | 322 ---
 packages/agent/src/prompts/utils.ts | 7 -
 packages/agent/src/sential.test.ts | 265 ---
 packages/agent/src/sential.ts | 506 ----
 packages/agent/src/tool-loop.test.ts | 93 -
 packages/agent/src/tool-loop.ts | 122 -
 packages/agent/src/tools/index.ts | 1 -
 packages/agent/src/tools/mcp.ts | 101 -
 packages/agent/src/types/action.ts | 105 -
 packages/agent/src/types/agent.ts | 66 -
 packages/agent/src/types/attachment.ts | 38 -
 packages/agent/src/types/auth.ts | 4 -
 packages/agent/src/types/heartbeat.ts | 3 -
 packages/agent/src/types/index.ts | 8 -
 packages/agent/src/types/mcp.ts | 29 -
 packages/agent/src/types/model.ts | 33 -
 packages/agent/src/types/schedule.ts | 8 -
 packages/agent/src/utils/attachments.ts | 181 --
 packages/agent/src/utils/fs.ts | 207 --
 packages/agent/src/utils/headers.ts | 27 -
 packages/agent/src/utils/image-parts.ts | 144 --
 packages/agent/src/utils/index.ts | 4 -
 packages/agent/src/utils/reactions.ts | 28 -
 .../src/utils/read-media-injector.test.ts | 118 -
 .../agent/src/utils/read-media-injector.ts | 122 -
 packages/agent/src/utils/speech.ts | 16 -
 packages/agent/src/utils/tag-extractor.ts | 183 --
 packages/agent/tsconfig.json | 22 -
 pnpm-lock.yaml | 945 +------
 scripts/release.sh | 168 +-
 skills-lock.json | 2 +-
 169 files changed, 7988 insertions(+), 14436 deletions(-)
 delete mode 100644 apps/agent/.gitignore
 delete mode 100644 apps/agent/README.md
 delete mode 100644 apps/agent/bun.lock
 delete mode 100644 apps/agent/mise.toml
 delete mode 100644 apps/agent/package.json
 delete mode 100644 apps/agent/src/index.ts
 delete mode 100644 apps/agent/src/middlewares/bearer.ts
 delete mode 100644 apps/agent/src/middlewares/cors.ts
 delete mode 100644 apps/agent/src/middlewares/error.ts
 delete mode 100644 apps/agent/src/models.ts
 delete mode 100644 apps/agent/src/modules/chat.ts
 delete mode 100644 apps/agent/src/utils/sse.ts
 delete mode 100644 apps/agent/tsconfig.json
 create mode 100644 db/migrations/0034_asset_name.down.sql
 create mode 100644 db/migrations/0034_asset_name.up.sql
 create mode 100644 db/migrations/0035_asset_metadata.down.sql
 create mode 100644 db/migrations/0035_asset_metadata.up.sql
 delete mode 100644 docker/Dockerfile.agent
 create mode 100644 internal/agent/agent.go
 create mode 100644 internal/agent/config.go
 create mode 100644 internal/agent/fs.go
 create mode 100644 internal/agent/model.go
 create mode 100644 internal/agent/prompt.go
 create mode 100644 internal/agent/prompts/heartbeat.md
 create mode 100644 internal/agent/prompts/schedule.md
 create mode 100644 internal/agent/prompts/subagent.md
 create mode 100644 internal/agent/prompts/system.md
 create mode 100644 internal/agent/sential.go
 create mode 100644 internal/agent/stream.go
 create mode 100644 internal/agent/tags.go
 rename internal/{mcp/providers/browser/provider.go => agent/tools/browser.go} (60%)
 create mode 100644 internal/agent/tools/contacts.go
 create mode 100644 internal/agent/tools/container.go
 create mode 100644 internal/agent/tools/email.go
 create mode 100644 internal/agent/tools/federation.go
 rename internal/{mcp/providers/container => agent/tools}/fsops.go (73%)
 create mode 100644 internal/agent/tools/inbox.go
 create mode 100644 internal/agent/tools/memory.go
 create mode 100644 internal/agent/tools/message.go
 create mode 100644 internal/agent/tools/prune.go
 create mode 100644 internal/agent/tools/schedule.go
 create mode 100644 internal/agent/tools/skill.go
 create mode 100644 internal/agent/tools/subagent.go
 create mode 100644 internal/agent/tools/tts.go
 create mode 100644 internal/agent/tools/types.go
 create mode 100644 internal/agent/tools/web.go
 create mode 100644 internal/agent/tools/webfetch.go
 create mode 100644 internal/agent/types.go
 delete mode 100644 internal/bun/runtime/manager.go
 create mode 100644 internal/conversation/flow/resolver_attachments.go
 create mode 100644 internal/conversation/flow/resolver_history.go
 create mode 100644 internal/conversation/flow/resolver_identity.go
 create mode 100644 internal/conversation/flow/resolver_memory.go
 create mode 100644 internal/conversation/flow/resolver_messages.go
 create mode 100644 internal/conversation/flow/resolver_model_selection.go
 create mode 100644 internal/conversation/flow/resolver_settings.go
 delete mode 100644 internal/conversation/flow/resolver_skills_test.go
 create mode 100644 internal/conversation/flow/resolver_store.go
 create mode 100644 internal/conversation/flow/resolver_stream.go
 delete mode 100644 internal/conversation/flow/resolver_stream_order_test.go
 create mode 100644 internal/conversation/flow/resolver_trigger.go
 create mode 100644 internal/conversation/flow/resolver_util.go
 create mode 100644 internal/conversation/flow/user_header.go
 delete mode 100644 internal/embedded/agent/.gitignore
 delete mode 100644 internal/embedded/bun/.gitignore
 delete mode 100644 internal/mcp/providers/contacts/provider.go
 delete mode 100644 internal/mcp/providers/container/fsops_test.go
 delete mode 100644 internal/mcp/providers/container/provider.go
 delete mode 100644 internal/mcp/providers/container/provider_test.go
 delete mode 100644 internal/mcp/providers/container/prune.go
 delete mode 100644 internal/mcp/providers/email/provider.go
 delete mode 100644 internal/mcp/providers/inbox/provider.go
 delete mode 100644 internal/mcp/providers/memory/provider.go
 delete mode 100644 internal/mcp/providers/memory/provider_test.go
 delete mode 100644 internal/mcp/providers/message/provider_test.go
 delete mode 100644 internal/mcp/providers/schedule/provider.go
 delete mode 100644 internal/mcp/providers/schedule/provider_test.go
 delete mode 100644 internal/mcp/providers/skill/provider.go
 delete mode 100644 internal/mcp/providers/subagent/provider.go
 delete mode 100644 internal/mcp/providers/tts/provider.go
 delete mode 100644 internal/mcp/providers/web/provider.go
 delete mode 100644 internal/mcp/providers/webfetch/provider.go
 delete mode 100644 packages/agent/.gitignore
 delete mode 100644 packages/agent/README.md
 delete mode 100644 packages/agent/package.json
 delete mode 100644 packages/agent/src/agent.test.ts
 delete mode 100644 packages/agent/src/agent.ts
 delete mode 100644 packages/agent/src/index.ts
 delete mode 100644 packages/agent/src/model.ts
 delete mode 100644 packages/agent/src/prompts/heartbeat.ts
 delete mode 100644 packages/agent/src/prompts/index.ts
 delete mode 100644 packages/agent/src/prompts/schedule.ts
 delete mode 100644 packages/agent/src/prompts/subagent.ts
 delete mode 100644 packages/agent/src/prompts/system.ts
 delete mode 100644 packages/agent/src/prompts/utils.ts
 delete mode 100644 packages/agent/src/sential.test.ts
 delete mode 100644 packages/agent/src/sential.ts
 delete mode 100644 packages/agent/src/tool-loop.test.ts
 delete mode 100644 packages/agent/src/tool-loop.ts
 delete mode 100644 packages/agent/src/tools/index.ts
 delete mode 100644 packages/agent/src/tools/mcp.ts
 delete mode 100644 packages/agent/src/types/action.ts
 delete mode 100644 packages/agent/src/types/agent.ts
 delete mode 100644 packages/agent/src/types/attachment.ts
 delete mode 100644 packages/agent/src/types/auth.ts
 delete mode 100644 packages/agent/src/types/heartbeat.ts
 delete mode 100644 packages/agent/src/types/index.ts
 delete mode 100644 packages/agent/src/types/mcp.ts
 delete mode 100644 packages/agent/src/types/model.ts
 delete mode 100644 packages/agent/src/types/schedule.ts
 delete mode 100644 packages/agent/src/utils/attachments.ts
 delete mode 100644 packages/agent/src/utils/fs.ts
 delete mode 100644 packages/agent/src/utils/headers.ts
 delete mode 100644 packages/agent/src/utils/image-parts.ts
 delete mode 100644 packages/agent/src/utils/index.ts
 delete mode 100644 packages/agent/src/utils/reactions.ts
 delete mode 100644 packages/agent/src/utils/read-media-injector.test.ts
 delete mode 100644 packages/agent/src/utils/read-media-injector.ts
 delete mode 100644 packages/agent/src/utils/speech.ts
 delete mode 100644 packages/agent/src/utils/tag-extractor.ts
 delete mode 100644 packages/agent/tsconfig.json

diff --git a/.agents/skills/twilight-ai/SKILL.md b/.agents/skills/twilight-ai/SKILL.md
index d9708274..fbbffaf9 100644
--- a/.agents/skills/twilight-ai/SKILL.md
+++ b/.agents/skills/twilight-ai/SKILL.md
@@ -22,6 +22,7 @@ Twilight AI is a lightweight Go AI SDK with a provider-agnostic core API.
 - Text generation: `sdk.GenerateText`, `sdk.GenerateTextResult`, `sdk.StreamText`
 - Embeddings: `sdk.Embed`, `sdk.EmbedMany`
 - Tool calling: `sdk.Tool`, `sdk.NewTool[T]`, `WithMaxSteps`, approval flow
+- MCP tool integration: `sdk.CreateMCPClient`, `sdk.MCPClient`, `sdk.MCPClientConfig`
 - Streaming: typed `StreamPart` events over Go channels
 - Current providers:
   - `provider/openai/completions`
@@ -38,6 +39,7 @@ Prefer the high-level SDK API first, then drop to provider details only when nee
 - `sdk.Model` binds a chat model to a `sdk.Provider`
 - `sdk.EmbeddingModel` binds an embedding model to an `sdk.EmbeddingProvider`
 - The client orchestrates tool loops, callbacks, approvals, and streaming lifecycle
+- MCP clients can load remote MCP tools and turn them into ordinary `sdk.Tool` values
 - Providers handle backend-specific HTTP, request mapping, response parsing, and SSE translation
 
 ## Core API Guidance
@@ -114,6 +116,26 @@ When streaming with tools, ensure the implementation can emit:
 - progress updates
 - denial/error events when applicable
 
+### MCP Tool Calling
+
+Use MCP when the task needs remote tools exposed by an MCP server rather than locally implemented `Execute` handlers.
+
+Default guidance:
+
+- use `sdk.CreateMCPClient(ctx, &sdk.MCPClientConfig{...})`
+- use `sdk.MCPTransportHTTP` for streamable HTTP MCP servers
+- use `sdk.MCPTransportSSE` only when the server exposes legacy SSE transport
+- for stdio, build the transport with the official MCP Go SDK and pass `Transport: ...`
+- call `mcpClient.Tools(ctx)` and pass the result into `sdk.WithTools(...)`
+- call `defer mcpClient.Close()` after successful creation
+
+Important behavior:
+
+- MCP tools become ordinary `sdk.Tool` values from the caller's perspective
+- Twilight AI converts MCP `InputSchema` into `*jsonschema.Schema`
+- MCP tool execution is delegated to `tools/call` on the remote server
+- remote MCP text output becomes the tool result visible to the model
+
 ### Streaming
 
 Twilight AI streaming is channel-first and type-safe. Prefer type switches over loosely typed event parsing.
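As an illustration of the guidance above, a minimal sketch of the MCP client lifecycle. The `sdk` import path and the server URL are assumptions; the calls themselves (CreateMCPClient, Tools, Close) follow the signatures documented in reference.md:

```go
package main

import (
	"context"
	"log"

	sdk "github.com/twilight-ai/twilight-ai" // assumed import path; use the real module path
)

func main() {
	ctx := context.Background()

	// Streamable HTTP server, per the default guidance.
	mcpClient, err := sdk.CreateMCPClient(ctx, &sdk.MCPClientConfig{
		Type: sdk.MCPTransportHTTP,
		URL:  "https://mcp.example.com/mcp",
	})
	if err != nil {
		log.Fatal(err)
	}
	defer mcpClient.Close() // close only after successful creation

	// Remote MCP tools become ordinary sdk.Tool values.
	tools, err := mcpClient.Tools(ctx)
	if err != nil {
		log.Fatal(err)
	}

	// Pass them to a generate/stream call via sdk.WithTools(...).
	_ = tools
}
```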
@@ -198,6 +220,7 @@ Before finishing work in this repo, verify:
 - public examples use top-level `sdk` APIs unless lower-level behavior is the point
 - streaming logic uses typed `StreamPart` handling
 - tool-calling changes cover both inspection mode and multi-step mode when relevant
+- MCP examples show both transport setup and normal `WithTools(...)` usage when relevant
 - provider work includes health checks or model discovery behavior if the backend supports them
 
 ## Additional Resources
diff --git a/.agents/skills/twilight-ai/reference.md b/.agents/skills/twilight-ai/reference.md
index c747a6ea..a19f59da 100644
--- a/.agents/skills/twilight-ai/reference.md
+++ b/.agents/skills/twilight-ai/reference.md
@@ -309,6 +309,42 @@ type ToolResult struct {
 }
 ```
 
+### MCP
+
+```go
+type MCPTransportType string
+
+const (
+	MCPTransportHTTP MCPTransportType = "http"
+	MCPTransportSSE  MCPTransportType = "sse"
+)
+
+type MCPClientConfig struct {
+	Type       MCPTransportType
+	URL        string
+	Headers    map[string]string
+	Transport  mcp.Transport
+	HTTPClient *http.Client
+	Name       string
+	Version    string
+}
+
+type MCPClient struct { /* unexported fields */ }
+
+func CreateMCPClient(ctx context.Context, config *MCPClientConfig) (*MCPClient, error)
+func (c *MCPClient) Tools(ctx context.Context) ([]Tool, error)
+func (c *MCPClient) Close() error
+```
+
+Usage notes:
+
+- `MCPTransportHTTP` is the default built-in transport and uses the official MCP Go SDK's streamable HTTP client transport.
+- `MCPTransportSSE` uses the official MCP Go SDK's SSE client transport.
+- For stdio or other custom transports, create the transport with `github.com/modelcontextprotocol/go-sdk/mcp` and pass it through `Transport`.
+- `Tools(ctx)` converts remote MCP tools into ordinary `sdk.Tool` values suitable for `WithTools(...)`.
+- MCP tool schemas are converted from MCP `InputSchema` into `*jsonschema.Schema`.
+- MCP execution wrappers call `tools/call` and return concatenated text content to the model.
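A companion sketch of the custom-transport path from the usage notes above: building a stdio transport with the official MCP Go SDK and handing it to twilight-ai through `MCPClientConfig.Transport`. `CommandTransport`, both import paths, and the server command are assumptions here; verify them against the respective SDK docs:

```go
package main

import (
	"context"
	"log"
	"os/exec"

	"github.com/modelcontextprotocol/go-sdk/mcp" // official MCP Go SDK

	sdk "github.com/twilight-ai/twilight-ai" // assumed import path
)

func main() {
	ctx := context.Background()

	// CommandTransport is assumed; check the go-sdk docs for the exact type.
	transport := &mcp.CommandTransport{Command: exec.Command("my-mcp-server")}

	// Passing Transport bypasses the built-in HTTP/SSE transports.
	mcpClient, err := sdk.CreateMCPClient(ctx, &sdk.MCPClientConfig{
		Transport: transport,
		Name:      "memoh", // client name/version fields from MCPClientConfig
		Version:   "0.0.1",
	})
	if err != nil {
		log.Fatal(err)
	}
	defer mcpClient.Close()

	tools, err := mcpClient.Tools(ctx)
	if err != nil {
		log.Fatal(err)
	}
	_ = tools // feed into sdk.WithTools(...) as usual
}
```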
+
 ### Streaming
 
 ```go
diff --git a/.github/workflows/docker.yml b/.github/workflows/docker.yml
index fff2cb83..532f09af 100644
--- a/.github/workflows/docker.yml
+++ b/.github/workflows/docker.yml
@@ -36,13 +36,11 @@ jobs:
     strategy:
       fail-fast: false
       matrix:
-        image: [server, agent, web, browser, sparse]
+        image: [server, web, browser, sparse]
         platform: [linux/amd64, linux/arm64]
         include:
           - image: server
             dockerfile: docker/Dockerfile.server
-          - image: agent
-            dockerfile: docker/Dockerfile.agent
           - image: web
             dockerfile: docker/Dockerfile.web
           - image: browser
@@ -132,7 +130,7 @@ jobs:
     needs: build
     strategy:
       matrix:
-        image: [server, agent, web, browser, sparse]
+        image: [server, web, browser, sparse]
     steps:
       - name: Download digests
         uses: actions/download-artifact@v4
diff --git a/.github/workflows/release.yml b/.github/workflows/release.yml
index da08b8ce..84d855db 100644
--- a/.github/workflows/release.yml
+++ b/.github/workflows/release.yml
@@ -58,15 +58,10 @@ jobs:
         with:
           node-version: lts/*
 
-      - uses: oven-sh/setup-bun@v2
-
       - uses: actions/setup-go@v5
         with:
           go-version-file: go.mod
 
-      - name: Install UPX
-        run: sudo apt-get update && sudo apt-get install -y upx-ucl
-
       - name: Install JS dependencies
         run: pnpm install --frozen-lockfile
 
@@ -77,9 +72,6 @@ jobs:
           VERSION: ${{ github.ref_name }}
           COMMIT_HASH: ${{ github.sha }}
           OUTPUT_DIR: dist
-          UPX_COMPRESS_AGENT_BIN: "true"
-          AUTO_INSTALL_UPX: "true"
-          UPX_ARGS: "-3"
         run: scripts/release.sh
 
       - name: Upload release assets
diff --git a/apps/agent/.gitignore b/apps/agent/.gitignore
deleted file mode 100644
index f3e107b2..00000000
--- a/apps/agent/.gitignore
+++ /dev/null
@@ -1,44 +0,0 @@
-# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.
-
-# dependencies
-/node_modules
-/.pnp
-.pnp.js
-
-# testing
-/coverage
-
-# next.js
-/.next/
-/out/
-
-# production
-/build
-
-# misc
-.DS_Store
-*.pem
-
-# debug
-npm-debug.log*
-yarn-debug.log*
-yarn-error.log*
-
-# local env files
-.env.local
-.env.development.local
-.env.test.local
-.env.production.local
-
-# vercel
-.vercel
-
-**/*.trace
-**/*.zip
-**/*.tar.gz
-**/*.tgz
-**/*.log
-package-lock.json
-**/*.bun
-
-dist/
\ No newline at end of file
diff --git a/apps/agent/README.md b/apps/agent/README.md
deleted file mode 100644
index ce152351..00000000
--- a/apps/agent/README.md
+++ /dev/null
@@ -1 +0,0 @@
-# @memoh/agent-gateway
\ No newline at end of file
diff --git a/apps/agent/bun.lock b/apps/agent/bun.lock
deleted file mode 100644
index 9fd15e7e..00000000
--- a/apps/agent/bun.lock
+++ /dev/null
@@ -1,419 +0,0 @@
-{ - "lockfileVersion": 1, - "configVersion": 1, - "workspaces": { - "": { - "name": "agent", - "dependencies": { - "@ai-sdk/amazon-bedrock": "^4.0.56", - "@ai-sdk/anthropic": "^3.0.9", - "@ai-sdk/azure": "^3.0.28", - "@ai-sdk/google": "^3.0.6", - "@ai-sdk/mcp": "^1.0.6", - "@ai-sdk/mistral": "^3.0.19", - "@ai-sdk/openai": "^3.0.7", - "@ai-sdk/xai": "^3.0.54", - "@elysiajs/bearer": "^1.4.2", - "@elysiajs/cors": "^1.4.1", - "@modelcontextprotocol/sdk": "^1.25.2", - "@mozilla/readability": "^0.6.0", - "@types/jsdom": "^27.0.0", - "@types/turndown": "^5.0.6", - "ai": "^6.0.25", - "elysia": "latest", - "jsdom": "^27.4.0", - "toml": "^3.0.0", - "turndown": "^7.2.2", - "zod": "^4.3.5", - }, - "devDependencies": { - "bun-types": "latest", - }, - }, - }, - "packages": { - "@acemir/cssom": ["@acemir/cssom@0.9.31", "", {}, "sha512-ZnR3GSaH+/vJ0YlHau21FjfLYjMpYVIzTD8M8vIEQvIGxeOXyXdzCI140rrCY862p/C/BbzWsjc1dgnM9mkoTA=="], - - "@ai-sdk/amazon-bedrock":
["@ai-sdk/amazon-bedrock@4.0.56", "", { "dependencies": { "@ai-sdk/anthropic": "3.0.42", "@ai-sdk/provider": "3.0.8", "@ai-sdk/provider-utils": "4.0.14", "@smithy/eventstream-codec": "^4.0.1", "@smithy/util-utf8": "^4.0.0", "aws4fetch": "^1.0.20" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-LOJud09s2zUFYsMGOHv55m3ERxDrZ6+1PpcsihWUPloA4DcXJZIVRABck9OCU5NvUWR75jxsymg/+p79ox6IOw=="], - - "@ai-sdk/anthropic": ["@ai-sdk/anthropic@3.0.42", "", { "dependencies": { "@ai-sdk/provider": "3.0.8", "@ai-sdk/provider-utils": "4.0.14" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-snoLXB9DmvAmmngbPN/Io8IGzZ9zWpC208EgIIztYf1e1JhwuMkgKCYkL30vGhSen4PrBafu2+sO4G/17wu45A=="], - - "@ai-sdk/azure": ["@ai-sdk/azure@3.0.28", "", { "dependencies": { "@ai-sdk/openai": "3.0.27", "@ai-sdk/provider": "3.0.8", "@ai-sdk/provider-utils": "4.0.14" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-FDzx/MF7M9boJ7pB5zMUwbGU2HadYM0ZsI7b/sPiHKecwirk7Endl9RtwGrfjauPDrRnP0W+djMc2NhKwp0B8w=="], - - "@ai-sdk/gateway": ["@ai-sdk/gateway@3.0.42", "", { "dependencies": { "@ai-sdk/provider": "3.0.8", "@ai-sdk/provider-utils": "4.0.14", "@vercel/oidc": "3.1.0" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-Il9lZWPUQMX59H5yJvA08gxfL2Py8oHwvAYRnK0Mt91S+JgPcyk/yEmXNDZG9ghJrwSawtK5Yocy8OnzsTOGsw=="], - - "@ai-sdk/google": ["@ai-sdk/google@3.0.26", "", { "dependencies": { "@ai-sdk/provider": "3.0.8", "@ai-sdk/provider-utils": "4.0.14" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-u4yOLP1nDbrGbfpWiwCFLTPVDq3ak0qNgSnf3HB+j2NpJoJCX9gApzyYnYm2CRB8IDiyaeT6Xcjv9IIOv1mTYQ=="], - - "@ai-sdk/mcp": ["@ai-sdk/mcp@1.0.20", "", { "dependencies": { "@ai-sdk/provider": "3.0.8", "@ai-sdk/provider-utils": "4.0.14", "pkce-challenge": "^5.0.0" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-wrPYSPY2oigua7QlW6ou3ZzggV7Xpf8sJJKWGbiXSjHz9ycZaewP3lB4QnFlp7hyJFj2+9mKqxis9ibk9ODeoQ=="], - - "@ai-sdk/mistral": ["@ai-sdk/mistral@3.0.19", "", { "dependencies": { "@ai-sdk/provider": "3.0.8", "@ai-sdk/provider-utils": "4.0.14" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-yd0OJ3fm2YKdwxh1pd9m720sENVVcylAD+Bki8C80QqVpUxGNL1/C4N4JJGb56eCCWr6VU/3gHFe9PKui9n/Hg=="], - - "@ai-sdk/openai": ["@ai-sdk/openai@3.0.27", "", { "dependencies": { "@ai-sdk/provider": "3.0.8", "@ai-sdk/provider-utils": "4.0.14" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-pLMxWOypwroXiK9dxNpn60/HGhWWWDEOJ3lo9vZLoxvpJNtKnLKojwVIvlW3yEjlD7ll1+jUO2uzsABNTaP5Yg=="], - - "@ai-sdk/openai-compatible": ["@ai-sdk/openai-compatible@2.0.29", "", { "dependencies": { "@ai-sdk/provider": "3.0.8", "@ai-sdk/provider-utils": "4.0.14" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-yoZ+jxBzVA7cQIrcOn2ZXAVeqsNdhqFGWW3VwTJwNnmeOLCACoz6+pu58LY/zjxEJGVrl5X+JazdY5LhDMey8A=="], - - "@ai-sdk/provider": ["@ai-sdk/provider@3.0.8", "", { "dependencies": { "json-schema": "^0.4.0" } }, "sha512-oGMAgGoQdBXbZqNG0Ze56CHjDZ1IDYOwGYxYjO5KLSlz5HiNQ9udIXsPZ61VWaHGZ5XW/jyjmr6t2xz2jGVwbQ=="], - - "@ai-sdk/provider-utils": ["@ai-sdk/provider-utils@4.0.14", "", { "dependencies": { "@ai-sdk/provider": "3.0.8", "@standard-schema/spec": "^1.1.0", "eventsource-parser": "^3.0.6" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-7bzKd9lgiDeXM7O4U4nQ8iTxguAOkg8LZGD9AfDVZYjO5cKYRwBPwVjboFcVrxncRHu0tYxZtXZtiLKpG4pEng=="], - - "@ai-sdk/xai": ["@ai-sdk/xai@3.0.54", "", { "dependencies": { "@ai-sdk/openai-compatible": "2.0.29", "@ai-sdk/provider": "3.0.8", 
"@ai-sdk/provider-utils": "4.0.14" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-zWKj5v3cmJAc5aZ4iY1jB2rmqMhXVLdMHAk76srSByTmPnssGaW9XNvIgyCAD2WD1g14P3YjRg1FTK+K2Z2Bjw=="], - - "@asamuzakjp/css-color": ["@asamuzakjp/css-color@4.1.2", "", { "dependencies": { "@csstools/css-calc": "^3.0.0", "@csstools/css-color-parser": "^4.0.1", "@csstools/css-parser-algorithms": "^4.0.0", "@csstools/css-tokenizer": "^4.0.0", "lru-cache": "^11.2.5" } }, "sha512-NfBUvBaYgKIuq6E/RBLY1m0IohzNHAYyaJGuTK79Z23uNwmz2jl1mPsC5ZxCCxylinKhT1Amn5oNTlx1wN8cQg=="], - - "@asamuzakjp/dom-selector": ["@asamuzakjp/dom-selector@6.7.8", "", { "dependencies": { "@asamuzakjp/nwsapi": "^2.3.9", "bidi-js": "^1.0.3", "css-tree": "^3.1.0", "is-potential-custom-element-name": "^1.0.1", "lru-cache": "^11.2.5" } }, "sha512-stisC1nULNc9oH5lakAj8MH88ZxeGxzyWNDfbdCxvJSJIvDsHNZqYvscGTgy/ysgXWLJPt6K/4t0/GjvtKcFJQ=="], - - "@asamuzakjp/nwsapi": ["@asamuzakjp/nwsapi@2.3.9", "", {}, "sha512-n8GuYSrI9bF7FFZ/SjhwevlHc8xaVlb/7HmHelnc/PZXBD2ZR49NnN9sMMuDdEGPeeRQ5d0hqlSlEpgCX3Wl0Q=="], - - "@aws-crypto/crc32": ["@aws-crypto/crc32@5.2.0", "", { "dependencies": { "@aws-crypto/util": "^5.2.0", "@aws-sdk/types": "^3.222.0", "tslib": "^2.6.2" } }, "sha512-nLbCWqQNgUiwwtFsen1AdzAtvuLRsQS8rYgMuxCrdKf9kOssamGLuPwyTY9wyYblNr9+1XM8v6zoDTPPSIeANg=="], - - "@aws-crypto/util": ["@aws-crypto/util@5.2.0", "", { "dependencies": { "@aws-sdk/types": "^3.222.0", "@smithy/util-utf8": "^2.0.0", "tslib": "^2.6.2" } }, "sha512-4RkU9EsI6ZpBve5fseQlGNUWKMa1RLPQ1dnjnQoe07ldfIzcsGb5hC5W0Dm7u423KWzawlrpbjXBrXCEv9zazQ=="], - - "@aws-sdk/types": ["@aws-sdk/types@3.973.1", "", { "dependencies": { "@smithy/types": "^4.12.0", "tslib": "^2.6.2" } }, "sha512-DwHBiMNOB468JiX6+i34c+THsKHErYUdNQ3HexeXZvVn4zouLjgaS4FejiGSi2HyBuzuyHg7SuOPmjSvoU9NRg=="], - - "@borewit/text-codec": ["@borewit/text-codec@0.2.1", "https://registry.npmmirror.com/@borewit/text-codec/-/text-codec-0.2.1.tgz", {}, "sha512-k7vvKPbf7J2fZ5klGRD9AeKfUvojuZIQ3BT5u7Jfv+puwXkUBUT5PVyMDfJZpy30CBDXGMgw7fguK/lpOMBvgw=="], - - "@csstools/color-helpers": ["@csstools/color-helpers@6.0.1", "", {}, "sha512-NmXRccUJMk2AWA5A7e5a//3bCIMyOu2hAtdRYrhPPHjDxINuCwX1w6rnIZ4xjLcp0ayv6h8Pc3X0eJUGiAAXHQ=="], - - "@csstools/css-calc": ["@csstools/css-calc@3.0.1", "", { "peerDependencies": { "@csstools/css-parser-algorithms": "^4.0.0", "@csstools/css-tokenizer": "^4.0.0" } }, "sha512-bsDKIP6f4ta2DO9t+rAbSSwv4EMESXy5ZIvzQl1afmD6Z1XHkVu9ijcG9QR/qSgQS1dVa+RaQ/MfQ7FIB/Dn1Q=="], - - "@csstools/css-color-parser": ["@csstools/css-color-parser@4.0.1", "", { "dependencies": { "@csstools/color-helpers": "^6.0.1", "@csstools/css-calc": "^3.0.0" }, "peerDependencies": { "@csstools/css-parser-algorithms": "^4.0.0", "@csstools/css-tokenizer": "^4.0.0" } }, "sha512-vYwO15eRBEkeF6xjAno/KQ61HacNhfQuuU/eGwH67DplL0zD5ZixUa563phQvUelA07yDczIXdtmYojCphKJcw=="], - - "@csstools/css-parser-algorithms": ["@csstools/css-parser-algorithms@4.0.0", "", { "peerDependencies": { "@csstools/css-tokenizer": "^4.0.0" } }, "sha512-+B87qS7fIG3L5h3qwJ/IFbjoVoOe/bpOdh9hAjXbvx0o8ImEmUsGXN0inFOnk2ChCFgqkkGFQ+TpM5rbhkKe4w=="], - - "@csstools/css-syntax-patches-for-csstree": ["@csstools/css-syntax-patches-for-csstree@1.0.27", "", {}, "sha512-sxP33Jwg1bviSUXAV43cVYdmjt2TLnLXNqCWl9xmxHawWVjGz/kEbdkr7F9pxJNBN2Mh+dq0crgItbW6tQvyow=="], - - "@csstools/css-tokenizer": ["@csstools/css-tokenizer@4.0.0", "", {}, "sha512-QxULHAm7cNu72w97JUNCBFODFaXpbDg+dP8b/oWFAZ2MTRppA3U00Y2L1HqaS4J6yBqxwa/Y3nMBaxVKbB/NsA=="], - - "@elysiajs/bearer": ["@elysiajs/bearer@1.4.3", "", { 
"peerDependencies": { "elysia": ">= 1.4.3" } }, "sha512-UWJ94jGGOzSlD3CCspC11/vFGKwy6RI9QvaZVPzlSu1Wxp/pKmOhKA+R2ppfbluMHXfxcc2xgK3x4+uuCML7GA=="], - - "@elysiajs/cors": ["@elysiajs/cors@1.4.1", "", { "peerDependencies": { "elysia": ">= 1.4.0" } }, "sha512-lQfad+F3r4mNwsxRKbXyJB8Jg43oAOXjRwn7sKUL6bcOW3KjUqUimTS+woNpO97efpzjtDE0tEjGk9DTw8lqTQ=="], - - "@exodus/bytes": ["@exodus/bytes@1.14.0", "", { "peerDependencies": { "@noble/hashes": "^1.8.0 || ^2.0.0" }, "optionalPeers": ["@noble/hashes"] }, "sha512-YiY1OmY6Qhkvmly8vZiD8wZRpW/npGZNg+0Sk8mstxirRHCg6lolHt5tSODCfuNPE/fBsAqRwDJE417x7jDDHA=="], - - "@hono/node-server": ["@hono/node-server@1.19.9", "", { "peerDependencies": { "hono": "^4" } }, "sha512-vHL6w3ecZsky+8P5MD+eFfaGTyCeOHUIFYMGpQGbrBTSmNNoxv0if69rEZ5giu36weC5saFuznL411gRX7bJDw=="], - - "@mixmark-io/domino": ["@mixmark-io/domino@2.2.0", "", {}, "sha512-Y28PR25bHXUg88kCV7nivXrP2Nj2RueZ3/l/jdx6J9f8J4nsEGcgX0Qe6lt7Pa+J79+kPiJU3LguR6O/6zrLOw=="], - - "@modelcontextprotocol/sdk": ["@modelcontextprotocol/sdk@1.26.0", "", { "dependencies": { "@hono/node-server": "^1.19.9", "ajv": "^8.17.1", "ajv-formats": "^3.0.1", "content-type": "^1.0.5", "cors": "^2.8.5", "cross-spawn": "^7.0.5", "eventsource": "^3.0.2", "eventsource-parser": "^3.0.0", "express": "^5.2.1", "express-rate-limit": "^8.2.1", "hono": "^4.11.4", "jose": "^6.1.3", "json-schema-typed": "^8.0.2", "pkce-challenge": "^5.0.0", "raw-body": "^3.0.0", "zod": "^3.25 || ^4.0", "zod-to-json-schema": "^3.25.1" }, "peerDependencies": { "@cfworker/json-schema": "^4.1.1" }, "optionalPeers": ["@cfworker/json-schema"] }, "sha512-Y5RmPncpiDtTXDbLKswIJzTqu2hyBKxTNsgKqKclDbhIgg1wgtf1fRuvxgTnRfcnxtvvgbIEcqUOzZrJ6iSReg=="], - - "@mozilla/readability": ["@mozilla/readability@0.6.0", "", {}, "sha512-juG5VWh4qAivzTAeMzvY9xs9HY5rAcr2E4I7tiSSCokRFi7XIZCAu92ZkSTsIj1OPceCifL3cpfteP3pDT9/QQ=="], - - "@opentelemetry/api": ["@opentelemetry/api@1.9.0", "", {}, "sha512-3giAOQvZiH5F9bMlMiv8+GSPMeqg0dbaeo58/0SlA9sxSqZhnUtxzX9/2FzyhS9sWQf5S0GJE0AKBrFqjpeYcg=="], - - "@sinclair/typebox": ["@sinclair/typebox@0.34.48", "https://registry.npmmirror.com/@sinclair/typebox/-/typebox-0.34.48.tgz", {}, "sha512-kKJTNuK3AQOrgjjotVxMrCn1sUJwM76wMszfq1kdU4uYVJjvEWuFQ6HgvLt4Xz3fSmZlTOxJ/Ie13KnIcWQXFA=="], - - "@smithy/eventstream-codec": ["@smithy/eventstream-codec@4.2.8", "", { "dependencies": { "@aws-crypto/crc32": "5.2.0", "@smithy/types": "^4.12.0", "@smithy/util-hex-encoding": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-jS/O5Q14UsufqoGhov7dHLOPCzkYJl9QDzusI2Psh4wyYx/izhzvX9P4D69aTxcdfVhEPhjK+wYyn/PzLjKbbw=="], - - "@smithy/is-array-buffer": ["@smithy/is-array-buffer@4.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-DZZZBvC7sjcYh4MazJSGiWMI2L7E0oCiRHREDzIxi/M2LY79/21iXt6aPLHge82wi5LsuRF5A06Ds3+0mlh6CQ=="], - - "@smithy/types": ["@smithy/types@4.12.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-9YcuJVTOBDjg9LWo23Qp0lTQ3D7fQsQtwle0jVfpbUHy9qBwCEgKuVH4FqFB3VYu0nwdHKiEMA+oXz7oV8X1kw=="], - - "@smithy/util-buffer-from": ["@smithy/util-buffer-from@4.2.0", "", { "dependencies": { "@smithy/is-array-buffer": "^4.2.0", "tslib": "^2.6.2" } }, "sha512-kAY9hTKulTNevM2nlRtxAG2FQ3B2OR6QIrPY3zE5LqJy1oxzmgBGsHLWTcNhWXKchgA0WHW+mZkQrng/pgcCew=="], - - "@smithy/util-hex-encoding": ["@smithy/util-hex-encoding@4.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-CCQBwJIvXMLKxVbO88IukazJD9a4kQ9ZN7/UMGBjBcJYvatpWk+9g870El4cB8/EJxfe+k+y0GmR9CAzkF+Nbw=="], - - "@smithy/util-utf8": ["@smithy/util-utf8@4.2.0", "", { "dependencies": { "@smithy/util-buffer-from": 
"^4.2.0", "tslib": "^2.6.2" } }, "sha512-zBPfuzoI8xyBtR2P6WQj63Rz8i3AmfAaJLuNG8dWsfvPe8lO4aCPYLn879mEgHndZH1zQ2oXmG8O1GGzzaoZiw=="], - - "@standard-schema/spec": ["@standard-schema/spec@1.1.0", "", {}, "sha512-l2aFy5jALhniG5HgqrD6jXLi/rUWrKvqN/qJx6yoJsgKhblVd+iqqU4RCXavm/jPityDo5TCvKMnpjKnOriy0w=="], - - "@tokenizer/inflate": ["@tokenizer/inflate@0.4.1", "https://registry.npmmirror.com/@tokenizer/inflate/-/inflate-0.4.1.tgz", { "dependencies": { "debug": "^4.4.3", "token-types": "^6.1.1" } }, "sha512-2mAv+8pkG6GIZiF1kNg1jAjh27IDxEPKwdGul3snfztFerfPGI1LjDezZp3i7BElXompqEtPmoPx6c2wgtWsOA=="], - - "@tokenizer/token": ["@tokenizer/token@0.3.0", "https://registry.npmmirror.com/@tokenizer/token/-/token-0.3.0.tgz", {}, "sha512-OvjF+z51L3ov0OyAU0duzsYuvO01PH7x4t6DJx+guahgTnBHkhJdG7soQeTSFLWN3efnHyibZ4Z8l2EuWwJN3A=="], - - "@types/jsdom": ["@types/jsdom@27.0.0", "", { "dependencies": { "@types/node": "*", "@types/tough-cookie": "*", "parse5": "^7.0.0" } }, "sha512-NZyFl/PViwKzdEkQg96gtnB8wm+1ljhdDay9ahn4hgb+SfVtPCbm3TlmDUFXTA+MGN3CijicnMhG18SI5H3rFw=="], - - "@types/node": ["@types/node@25.0.10", "https://registry.npmmirror.com/@types/node/-/node-25.0.10.tgz", { "dependencies": { "undici-types": "~7.16.0" } }, "sha512-zWW5KPngR/yvakJgGOmZ5vTBemDoSqF3AcV/LrO5u5wTWyEAVVh+IT39G4gtyAkh3CtTZs8aX/yRM82OfzHJRg=="], - - "@types/tough-cookie": ["@types/tough-cookie@4.0.5", "", {}, "sha512-/Ad8+nIOV7Rl++6f1BdKxFSMgmoqEoYbHRpPcx3JEfv8VRsQe9Z4mCXeJBzxs7mbHY/XOZZuXlRNfhpVPbs6ZA=="], - - "@types/turndown": ["@types/turndown@5.0.6", "", {}, "sha512-ru00MoyeeouE5BX4gRL+6m/BsDfbRayOskWqUvh7CLGW+UXxHQItqALa38kKnOiZPqJrtzJUgAC2+F0rL1S4Pg=="], - - "@vercel/oidc": ["@vercel/oidc@3.1.0", "", {}, "sha512-Fw28YZpRnA3cAHHDlkt7xQHiJ0fcL+NRcIqsocZQUSmbzeIKRpwttJjik5ZGanXP+vlA4SbTg+AbA3bP363l+w=="], - - "accepts": ["accepts@2.0.0", "", { "dependencies": { "mime-types": "^3.0.0", "negotiator": "^1.0.0" } }, "sha512-5cvg6CtKwfgdmVqY1WIiXKc3Q1bkRqGLi+2W/6ao+6Y7gu/RCwRuAhGEzh5B4KlszSuTLgZYuqFqo5bImjNKng=="], - - "agent-base": ["agent-base@7.1.4", "", {}, "sha512-MnA+YT8fwfJPgBx3m60MNqakm30XOkyIoH1y6huTQvC0PwZG7ki8NacLBcrPbNoo8vEZy7Jpuk7+jMO+CUovTQ=="], - - "ai": ["ai@6.0.82", "", { "dependencies": { "@ai-sdk/gateway": "3.0.42", "@ai-sdk/provider": "3.0.8", "@ai-sdk/provider-utils": "4.0.14", "@opentelemetry/api": "1.9.0" }, "peerDependencies": { "zod": "^3.25.76 || ^4.1.8" } }, "sha512-WLml1ab2IXtREgkxrq2Pl6lFO6NKgC17MqTzmK5mO1UO6tMAJiVjkednw9p0j4+/LaUIZQoRiIT8wA37LswZ9Q=="], - - "ajv": ["ajv@8.17.1", "", { "dependencies": { "fast-deep-equal": "^3.1.3", "fast-uri": "^3.0.1", "json-schema-traverse": "^1.0.0", "require-from-string": "^2.0.2" } }, "sha512-B/gBuNg5SiMTrPkC+A2+cW0RszwxYmn6VYxB/inlBStS5nx6xHIt/ehKRhIMhqusl7a8LjQoZnjCs5vhwxOQ1g=="], - - "ajv-formats": ["ajv-formats@3.0.1", "", { "dependencies": { "ajv": "^8.0.0" } }, "sha512-8iUql50EUR+uUcdRQ3HDqa6EVyo3docL8g5WJ3FNcWmu62IbkGUue/pEyLBW8VGKKucTPgqeks4fIU1DA4yowQ=="], - - "aws4fetch": ["aws4fetch@1.0.20", "", {}, "sha512-/djoAN709iY65ETD6LKCtyyEI04XIBP5xVvfmNxsEP0uJB5tyaGBztSryRr4HqMStr9R06PisQE7m9zDTXKu6g=="], - - "bidi-js": ["bidi-js@1.0.3", "", { "dependencies": { "require-from-string": "^2.0.2" } }, "sha512-RKshQI1R3YQ+n9YJz2QQ147P66ELpa1FQEg20Dk8oW9t2KgLbpDLLp9aGZ7y8WHSshDknG0bknqGw5/tyCs5tw=="], - - "body-parser": ["body-parser@2.2.2", "", { "dependencies": { "bytes": "^3.1.2", "content-type": "^1.0.5", "debug": "^4.4.3", "http-errors": "^2.0.0", "iconv-lite": "^0.7.0", "on-finished": "^2.4.1", "qs": "^6.14.1", "raw-body": "^3.0.1", "type-is": "^2.0.1" } }, 
"sha512-oP5VkATKlNwcgvxi0vM0p/D3n2C3EReYVX+DNYs5TjZFn/oQt2j+4sVJtSMr18pdRr8wjTcBl6LoV+FUwzPmNA=="], - - "bun-types": ["bun-types@1.3.7", "https://registry.npmmirror.com/bun-types/-/bun-types-1.3.7.tgz", { "dependencies": { "@types/node": "*" } }, "sha512-qyschsA03Qz+gou+apt6HNl6HnI+sJJLL4wLDke4iugsE6584CMupOtTY1n+2YC9nGVrEKUlTs99jjRLKgWnjQ=="], - - "bytes": ["bytes@3.1.2", "", {}, "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg=="], - - "call-bind-apply-helpers": ["call-bind-apply-helpers@1.0.2", "", { "dependencies": { "es-errors": "^1.3.0", "function-bind": "^1.1.2" } }, "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ=="], - - "call-bound": ["call-bound@1.0.4", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.2", "get-intrinsic": "^1.3.0" } }, "sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg=="], - - "content-disposition": ["content-disposition@1.0.1", "", {}, "sha512-oIXISMynqSqm241k6kcQ5UwttDILMK4BiurCfGEREw6+X9jkkpEe5T9FZaApyLGGOnFuyMWZpdolTXMtvEJ08Q=="], - - "content-type": ["content-type@1.0.5", "", {}, "sha512-nTjqfcBFEipKdXCv4YDQWCfmcLZKm81ldF0pAopTvyrFGVbcR6P/VAAd5G7N+0tTr8QqiU0tFadD6FK4NtJwOA=="], - - "cookie": ["cookie@1.1.1", "https://registry.npmmirror.com/cookie/-/cookie-1.1.1.tgz", {}, "sha512-ei8Aos7ja0weRpFzJnEA9UHJ/7XQmqglbRwnf2ATjcB9Wq874VKH9kfjjirM6UhU2/E5fFYadylyhFldcqSidQ=="], - - "cookie-signature": ["cookie-signature@1.2.2", "", {}, "sha512-D76uU73ulSXrD1UXF4KE2TMxVVwhsnCgfAyTg9k8P6KGZjlXKrOLe4dJQKI3Bxi5wjesZoFXJWElNWBjPZMbhg=="], - - "cors": ["cors@2.8.6", "", { "dependencies": { "object-assign": "^4", "vary": "^1" } }, "sha512-tJtZBBHA6vjIAaF6EnIaq6laBBP9aq/Y3ouVJjEfoHbRBcHBAHYcMh/w8LDrk2PvIMMq8gmopa5D4V8RmbrxGw=="], - - "cross-spawn": ["cross-spawn@7.0.6", "", { "dependencies": { "path-key": "^3.1.0", "shebang-command": "^2.0.0", "which": "^2.0.1" } }, "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA=="], - - "css-tree": ["css-tree@3.1.0", "", { "dependencies": { "mdn-data": "2.12.2", "source-map-js": "^1.0.1" } }, "sha512-0eW44TGN5SQXU1mWSkKwFstI/22X2bG1nYzZTYMAWjylYURhse752YgbE4Cx46AC+bAvI+/dYTPRk1LqSUnu6w=="], - - "cssstyle": ["cssstyle@5.3.7", "", { "dependencies": { "@asamuzakjp/css-color": "^4.1.1", "@csstools/css-syntax-patches-for-csstree": "^1.0.21", "css-tree": "^3.1.0", "lru-cache": "^11.2.4" } }, "sha512-7D2EPVltRrsTkhpQmksIu+LxeWAIEk6wRDMJ1qljlv+CKHJM+cJLlfhWIzNA44eAsHXSNe3+vO6DW1yCYx8SuQ=="], - - "data-urls": ["data-urls@6.0.1", "", { "dependencies": { "whatwg-mimetype": "^5.0.0", "whatwg-url": "^15.1.0" } }, "sha512-euIQENZg6x8mj3fO6o9+fOW8MimUI4PpD/fZBhJfeioZVy9TUpM4UY7KjQNVZFlqwJ0UdzRDzkycB997HEq1BQ=="], - - "debug": ["debug@4.4.3", "https://registry.npmmirror.com/debug/-/debug-4.4.3.tgz", { "dependencies": { "ms": "^2.1.3" } }, "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA=="], - - "decimal.js": ["decimal.js@10.6.0", "", {}, "sha512-YpgQiITW3JXGntzdUmyUR1V812Hn8T1YVXhCu+wO3OpS4eU9l4YdD3qjyiKdV6mvV29zapkMeD390UVEf2lkUg=="], - - "depd": ["depd@2.0.0", "", {}, "sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw=="], - - "dunder-proto": ["dunder-proto@1.0.1", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.1", "es-errors": "^1.3.0", "gopd": "^1.2.0" } }, "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A=="], - - 
"ee-first": ["ee-first@1.1.1", "", {}, "sha512-WMwm9LhRUo+WUaRN+vRuETqG89IgZphVSNkdFgeb6sS/E4OrDIN7t48CAewSHXc6C8lefD8KKfr5vY61brQlow=="], - - "elysia": ["elysia@1.4.22", "https://registry.npmmirror.com/elysia/-/elysia-1.4.22.tgz", { "dependencies": { "cookie": "^1.1.1", "exact-mirror": "^0.2.6", "fast-decode-uri-component": "^1.0.1", "memoirist": "^0.4.0" }, "peerDependencies": { "@sinclair/typebox": ">= 0.34.0 < 1", "@types/bun": ">= 1.2.0", "file-type": ">= 20.0.0", "openapi-types": ">= 12.0.0", "typescript": ">= 5.0.0" }, "optionalPeers": ["@types/bun", "typescript"] }, "sha512-Q90VCb1RVFxnFaRV0FDoSylESQQLWgLHFmWciQJdX9h3b2cSasji9KWEUvaJuy/L9ciAGg4RAhUVfsXHg5K2RQ=="], - - "encodeurl": ["encodeurl@2.0.0", "", {}, "sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg=="], - - "entities": ["entities@6.0.1", "", {}, "sha512-aN97NXWF6AWBTahfVOIrB/NShkzi5H7F9r1s9mD3cDj4Ko5f2qhhVoYMibXF7GlLveb/D2ioWay8lxI97Ven3g=="], - - "es-define-property": ["es-define-property@1.0.1", "", {}, "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g=="], - - "es-errors": ["es-errors@1.3.0", "", {}, "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw=="], - - "es-object-atoms": ["es-object-atoms@1.1.1", "", { "dependencies": { "es-errors": "^1.3.0" } }, "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA=="], - - "escape-html": ["escape-html@1.0.3", "", {}, "sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow=="], - - "etag": ["etag@1.8.1", "", {}, "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg=="], - - "eventsource": ["eventsource@3.0.7", "", { "dependencies": { "eventsource-parser": "^3.0.1" } }, "sha512-CRT1WTyuQoD771GW56XEZFQ/ZoSfWid1alKGDYMmkt2yl8UXrVR4pspqWNEcqKvVIzg6PAltWjxcSSPrboA4iA=="], - - "eventsource-parser": ["eventsource-parser@3.0.6", "", {}, "sha512-Vo1ab+QXPzZ4tCa8SwIHJFaSzy4R6SHf7BY79rFBDf0idraZWAkYrDjDj8uWaSm3S2TK+hJ7/t1CEmZ7jXw+pg=="], - - "exact-mirror": ["exact-mirror@0.2.6", "https://registry.npmmirror.com/exact-mirror/-/exact-mirror-0.2.6.tgz", { "peerDependencies": { "@sinclair/typebox": "^0.34.15" }, "optionalPeers": ["@sinclair/typebox"] }, "sha512-7s059UIx9/tnOKSySzUk5cPGkoILhTE4p6ncf6uIPaQ+9aRBQzQjc9+q85l51+oZ+P6aBxh084pD0CzBQPcFUA=="], - - "express": ["express@5.2.1", "", { "dependencies": { "accepts": "^2.0.0", "body-parser": "^2.2.1", "content-disposition": "^1.0.0", "content-type": "^1.0.5", "cookie": "^0.7.1", "cookie-signature": "^1.2.1", "debug": "^4.4.0", "depd": "^2.0.0", "encodeurl": "^2.0.0", "escape-html": "^1.0.3", "etag": "^1.8.1", "finalhandler": "^2.1.0", "fresh": "^2.0.0", "http-errors": "^2.0.0", "merge-descriptors": "^2.0.0", "mime-types": "^3.0.0", "on-finished": "^2.4.1", "once": "^1.4.0", "parseurl": "^1.3.3", "proxy-addr": "^2.0.7", "qs": "^6.14.0", "range-parser": "^1.2.1", "router": "^2.2.0", "send": "^1.1.0", "serve-static": "^2.2.0", "statuses": "^2.0.1", "type-is": "^2.0.1", "vary": "^1.1.2" } }, "sha512-hIS4idWWai69NezIdRt2xFVofaF4j+6INOpJlVOLDO8zXGpUVEVzIYk12UUi2JzjEzWL3IOAxcTubgz9Po0yXw=="], - - "express-rate-limit": ["express-rate-limit@8.2.1", "", { "dependencies": { "ip-address": "10.0.1" }, "peerDependencies": { "express": ">= 4.11" } }, "sha512-PCZEIEIxqwhzw4KF0n7QF4QqruVTcF73O5kFKUnGOyjbCCgizBBiFaYpd/fnBLUMPw/BWw9OsiN7GgrNYr7j6g=="], - - "fast-decode-uri-component": 
["fast-decode-uri-component@1.0.1", "https://registry.npmmirror.com/fast-decode-uri-component/-/fast-decode-uri-component-1.0.1.tgz", {}, "sha512-WKgKWg5eUxvRZGwW8FvfbaH7AXSh2cL+3j5fMGzUMCxWBJ3dV3a7Wz8y2f/uQ0e3B6WmodD3oS54jTQ9HVTIIg=="], - - "fast-deep-equal": ["fast-deep-equal@3.1.3", "", {}, "sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q=="], - - "fast-uri": ["fast-uri@3.1.0", "", {}, "sha512-iPeeDKJSWf4IEOasVVrknXpaBV0IApz/gp7S2bb7Z4Lljbl2MGJRqInZiUrQwV16cpzw/D3S5j5Julj/gT52AA=="], - - "file-type": ["file-type@21.3.0", "https://registry.npmmirror.com/file-type/-/file-type-21.3.0.tgz", { "dependencies": { "@tokenizer/inflate": "^0.4.1", "strtok3": "^10.3.4", "token-types": "^6.1.1", "uint8array-extras": "^1.4.0" } }, "sha512-8kPJMIGz1Yt/aPEwOsrR97ZyZaD1Iqm8PClb1nYFclUCkBi0Ma5IsYNQzvSFS9ib51lWyIw5mIT9rWzI/xjpzA=="], - - "finalhandler": ["finalhandler@2.1.1", "", { "dependencies": { "debug": "^4.4.0", "encodeurl": "^2.0.0", "escape-html": "^1.0.3", "on-finished": "^2.4.1", "parseurl": "^1.3.3", "statuses": "^2.0.1" } }, "sha512-S8KoZgRZN+a5rNwqTxlZZePjT/4cnm0ROV70LedRHZ0p8u9fRID0hJUZQpkKLzro8LfmC8sx23bY6tVNxv8pQA=="], - - "forwarded": ["forwarded@0.2.0", "", {}, "sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow=="], - - "fresh": ["fresh@2.0.0", "", {}, "sha512-Rx/WycZ60HOaqLKAi6cHRKKI7zxWbJ31MhntmtwMoaTeF7XFH9hhBp8vITaMidfljRQ6eYWCKkaTK+ykVJHP2A=="], - - "function-bind": ["function-bind@1.1.2", "", {}, "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA=="], - - "get-intrinsic": ["get-intrinsic@1.3.0", "", { "dependencies": { "call-bind-apply-helpers": "^1.0.2", "es-define-property": "^1.0.1", "es-errors": "^1.3.0", "es-object-atoms": "^1.1.1", "function-bind": "^1.1.2", "get-proto": "^1.0.1", "gopd": "^1.2.0", "has-symbols": "^1.1.0", "hasown": "^2.0.2", "math-intrinsics": "^1.1.0" } }, "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ=="], - - "get-proto": ["get-proto@1.0.1", "", { "dependencies": { "dunder-proto": "^1.0.1", "es-object-atoms": "^1.0.0" } }, "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g=="], - - "gopd": ["gopd@1.2.0", "", {}, "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg=="], - - "has-symbols": ["has-symbols@1.1.0", "", {}, "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ=="], - - "hasown": ["hasown@2.0.2", "", { "dependencies": { "function-bind": "^1.1.2" } }, "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ=="], - - "hono": ["hono@4.11.9", "", {}, "sha512-Eaw2YTGM6WOxA6CXbckaEvslr2Ne4NFsKrvc0v97JD5awbmeBLO5w9Ho9L9kmKonrwF9RJlW6BxT1PVv/agBHQ=="], - - "html-encoding-sniffer": ["html-encoding-sniffer@6.0.0", "", { "dependencies": { "@exodus/bytes": "^1.6.0" } }, "sha512-CV9TW3Y3f8/wT0BRFc1/KAVQ3TUHiXmaAb6VW9vtiMFf7SLoMd1PdAc4W3KFOFETBJUb90KatHqlsZMWV+R9Gg=="], - - "http-errors": ["http-errors@2.0.1", "", { "dependencies": { "depd": "~2.0.0", "inherits": "~2.0.4", "setprototypeof": "~1.2.0", "statuses": "~2.0.2", "toidentifier": "~1.0.1" } }, "sha512-4FbRdAX+bSdmo4AUFuS0WNiPz8NgFt+r8ThgNWmlrjQjt1Q7ZR9+zTlce2859x4KSXrwIsaeTqDoKQmtP8pLmQ=="], - - "http-proxy-agent": ["http-proxy-agent@7.0.2", "", { "dependencies": { "agent-base": "^7.1.0", "debug": "^4.3.4" } }, 
"sha512-T1gkAiYYDWYx3V5Bmyu7HcfcvL7mUrTWiM6yOfa3PIphViJ/gFPbvidQ+veqSOHci/PxBcDabeUNCzpOODJZig=="], - - "https-proxy-agent": ["https-proxy-agent@7.0.6", "", { "dependencies": { "agent-base": "^7.1.2", "debug": "4" } }, "sha512-vK9P5/iUfdl95AI+JVyUuIcVtd4ofvtrOr3HNtM2yxC9bnMbEdp3x01OhQNnjb8IJYi38VlTE3mBXwcfvywuSw=="], - - "iconv-lite": ["iconv-lite@0.7.2", "", { "dependencies": { "safer-buffer": ">= 2.1.2 < 3.0.0" } }, "sha512-im9DjEDQ55s9fL4EYzOAv0yMqmMBSZp6G0VvFyTMPKWxiSBHUj9NW/qqLmXUwXrrM7AvqSlTCfvqRb0cM8yYqw=="], - - "ieee754": ["ieee754@1.2.1", "https://registry.npmmirror.com/ieee754/-/ieee754-1.2.1.tgz", {}, "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA=="], - - "inherits": ["inherits@2.0.4", "", {}, "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ=="], - - "ip-address": ["ip-address@10.0.1", "", {}, "sha512-NWv9YLW4PoW2B7xtzaS3NCot75m6nK7Icdv0o3lfMceJVRfSoQwqD4wEH5rLwoKJwUiZ/rfpiVBhnaF0FK4HoA=="], - - "ipaddr.js": ["ipaddr.js@1.9.1", "", {}, "sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g=="], - - "is-potential-custom-element-name": ["is-potential-custom-element-name@1.0.1", "", {}, "sha512-bCYeRA2rVibKZd+s2625gGnGF/t7DSqDs4dP7CrLA1m7jKWz6pps0LpYLJN8Q64HtmPKJ1hrN3nzPNKFEKOUiQ=="], - - "is-promise": ["is-promise@4.0.0", "", {}, "sha512-hvpoI6korhJMnej285dSg6nu1+e6uxs7zG3BYAm5byqDsgJNWwxzM6z6iZiAgQR4TJ30JmBTOwqZUw3WlyH3AQ=="], - - "isexe": ["isexe@2.0.0", "", {}, "sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw=="], - - "jose": ["jose@6.1.3", "", {}, "sha512-0TpaTfihd4QMNwrz/ob2Bp7X04yuxJkjRGi4aKmOqwhov54i6u79oCv7T+C7lo70MKH6BesI3vscD1yb/yzKXQ=="], - - "jsdom": ["jsdom@27.4.0", "", { "dependencies": { "@acemir/cssom": "^0.9.28", "@asamuzakjp/dom-selector": "^6.7.6", "@exodus/bytes": "^1.6.0", "cssstyle": "^5.3.4", "data-urls": "^6.0.0", "decimal.js": "^10.6.0", "html-encoding-sniffer": "^6.0.0", "http-proxy-agent": "^7.0.2", "https-proxy-agent": "^7.0.6", "is-potential-custom-element-name": "^1.0.1", "parse5": "^8.0.0", "saxes": "^6.0.0", "symbol-tree": "^3.2.4", "tough-cookie": "^6.0.0", "w3c-xmlserializer": "^5.0.0", "webidl-conversions": "^8.0.0", "whatwg-mimetype": "^4.0.0", "whatwg-url": "^15.1.0", "ws": "^8.18.3", "xml-name-validator": "^5.0.0" }, "peerDependencies": { "canvas": "^3.0.0" }, "optionalPeers": ["canvas"] }, "sha512-mjzqwWRD9Y1J1KUi7W97Gja1bwOOM5Ug0EZ6UDK3xS7j7mndrkwozHtSblfomlzyB4NepioNt+B2sOSzczVgtQ=="], - - "json-schema": ["json-schema@0.4.0", "", {}, "sha512-es94M3nTIfsEPisRafak+HDLfHXnKBhV3vU5eqPcS3flIWqcxJWgXHXiey3YrpaNsanY5ei1VoYEbOzijuq9BA=="], - - "json-schema-traverse": ["json-schema-traverse@1.0.0", "", {}, "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug=="], - - "json-schema-typed": ["json-schema-typed@8.0.2", "", {}, "sha512-fQhoXdcvc3V28x7C7BMs4P5+kNlgUURe2jmUT1T//oBRMDrqy1QPelJimwZGo7Hg9VPV3EQV5Bnq4hbFy2vetA=="], - - "lru-cache": ["lru-cache@11.2.6", "", {}, "sha512-ESL2CrkS/2wTPfuend7Zhkzo2u0daGJ/A2VucJOgQ/C48S/zB8MMeMHSGKYpXhIjbPxfuezITkaBH1wqv00DDQ=="], - - "math-intrinsics": ["math-intrinsics@1.1.0", "", {}, "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g=="], - - "mdn-data": ["mdn-data@2.12.2", "", {}, "sha512-IEn+pegP1aManZuckezWCO+XZQDplx1366JoVhTpMpBB1sPey/SbveZQUosKiKiGYjg1wH4pMlNgXbCiYgihQA=="], - - "media-typer": ["media-typer@1.1.0", "", {}, 
"sha512-aisnrDP4GNe06UcKFnV5bfMNPBUw4jsLGaWwWfnH3v02GnBuXX2MCVn5RbrWo0j3pczUilYblq7fQ7Nw2t5XKw=="], - - "memoirist": ["memoirist@0.4.0", "https://registry.npmmirror.com/memoirist/-/memoirist-0.4.0.tgz", {}, "sha512-zxTgA0mSYELa66DimuNQDvyLq36AwDlTuVRbnQtB+VuTcKWm5Qc4z3WkSpgsFWHNhexqkIooqpv4hdcqrX5Nmg=="], - - "merge-descriptors": ["merge-descriptors@2.0.0", "", {}, "sha512-Snk314V5ayFLhp3fkUREub6WtjBfPdCPY1Ln8/8munuLuiYhsABgBVWsozAG+MWMbVEvcdcpbi9R7ww22l9Q3g=="], - - "mime-db": ["mime-db@1.54.0", "", {}, "sha512-aU5EJuIN2WDemCcAp2vFBfp/m4EAhWJnUNSSw0ixs7/kXbd6Pg64EmwJkNdFhB8aWt1sH2CTXrLxo/iAGV3oPQ=="], - - "mime-types": ["mime-types@3.0.2", "", { "dependencies": { "mime-db": "^1.54.0" } }, "sha512-Lbgzdk0h4juoQ9fCKXW4by0UJqj+nOOrI9MJ1sSj4nI8aI2eo1qmvQEie4VD1glsS250n15LsWsYtCugiStS5A=="], - - "ms": ["ms@2.1.3", "https://registry.npmmirror.com/ms/-/ms-2.1.3.tgz", {}, "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA=="], - - "negotiator": ["negotiator@1.0.0", "", {}, "sha512-8Ofs/AUQh8MaEcrlq5xOX0CQ9ypTF5dl78mjlMNfOK08fzpgTHQRQPBxcPlEtIw0yRpws+Zo/3r+5WRby7u3Gg=="], - - "object-assign": ["object-assign@4.1.1", "", {}, "sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg=="], - - "object-inspect": ["object-inspect@1.13.4", "", {}, "sha512-W67iLl4J2EXEGTbfeHCffrjDfitvLANg0UlX3wFUUSTx92KXRFegMHUVgSqE+wvhAbi4WqjGg9czysTV2Epbew=="], - - "on-finished": ["on-finished@2.4.1", "", { "dependencies": { "ee-first": "1.1.1" } }, "sha512-oVlzkg3ENAhCk2zdv7IJwd/QUD4z2RxRwpkcGY8psCVcCYZNq4wYnVWALHM+brtuJjePWiYF/ClmuDr8Ch5+kg=="], - - "once": ["once@1.4.0", "", { "dependencies": { "wrappy": "1" } }, "sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w=="], - - "openapi-types": ["openapi-types@12.1.3", "https://registry.npmmirror.com/openapi-types/-/openapi-types-12.1.3.tgz", {}, "sha512-N4YtSYJqghVu4iek2ZUvcN/0aqH1kRDuNqzcycDxhOUpg7GdvLa2F3DgS6yBNhInhv2r/6I0Flkn7CqL8+nIcw=="], - - "parse5": ["parse5@7.3.0", "", { "dependencies": { "entities": "^6.0.0" } }, "sha512-IInvU7fabl34qmi9gY8XOVxhYyMyuH2xUNpb2q8/Y+7552KlejkRvqvD19nMoUW/uQGGbqNpA6Tufu5FL5BZgw=="], - - "parseurl": ["parseurl@1.3.3", "", {}, "sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ=="], - - "path-key": ["path-key@3.1.1", "", {}, "sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q=="], - - "path-to-regexp": ["path-to-regexp@8.3.0", "", {}, "sha512-7jdwVIRtsP8MYpdXSwOS0YdD0Du+qOoF/AEPIt88PcCFrZCzx41oxku1jD88hZBwbNUIEfpqvuhjFaMAqMTWnA=="], - - "pkce-challenge": ["pkce-challenge@5.0.1", "", {}, "sha512-wQ0b/W4Fr01qtpHlqSqspcj3EhBvimsdh0KlHhH8HRZnMsEa0ea2fTULOXOS9ccQr3om+GcGRk4e+isrZWV8qQ=="], - - "proxy-addr": ["proxy-addr@2.0.7", "", { "dependencies": { "forwarded": "0.2.0", "ipaddr.js": "1.9.1" } }, "sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg=="], - - "punycode": ["punycode@2.3.1", "", {}, "sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg=="], - - "qs": ["qs@6.14.2", "", { "dependencies": { "side-channel": "^1.1.0" } }, "sha512-V/yCWTTF7VJ9hIh18Ugr2zhJMP01MY7c5kh4J870L7imm6/DIzBsNLTXzMwUA3yZ5b/KBqLx8Kp3uRvd7xSe3Q=="], - - "range-parser": ["range-parser@1.2.1", "", {}, "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg=="], - - "raw-body": ["raw-body@3.0.2", "", { "dependencies": { "bytes": 
"~3.1.2", "http-errors": "~2.0.1", "iconv-lite": "~0.7.0", "unpipe": "~1.0.0" } }, "sha512-K5zQjDllxWkf7Z5xJdV0/B0WTNqx6vxG70zJE4N0kBs4LovmEYWJzQGxC9bS9RAKu3bgM40lrd5zoLJ12MQ5BA=="], - - "require-from-string": ["require-from-string@2.0.2", "", {}, "sha512-Xf0nWe6RseziFMu+Ap9biiUbmplq6S9/p+7w7YXP/JBHhrUDDUhwa+vANyubuqfZWTveU//DYVGsDG7RKL/vEw=="], - - "router": ["router@2.2.0", "", { "dependencies": { "debug": "^4.4.0", "depd": "^2.0.0", "is-promise": "^4.0.0", "parseurl": "^1.3.3", "path-to-regexp": "^8.0.0" } }, "sha512-nLTrUKm2UyiL7rlhapu/Zl45FwNgkZGaCpZbIHajDYgwlJCOzLSk+cIPAnsEqV955GjILJnKbdQC1nVPz+gAYQ=="], - - "safer-buffer": ["safer-buffer@2.1.2", "", {}, "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg=="], - - "saxes": ["saxes@6.0.0", "", { "dependencies": { "xmlchars": "^2.2.0" } }, "sha512-xAg7SOnEhrm5zI3puOOKyy1OMcMlIJZYNJY7xLBwSze0UjhPLnWfj2GF2EpT0jmzaJKIWKHLsaSSajf35bcYnA=="], - - "send": ["send@1.2.1", "", { "dependencies": { "debug": "^4.4.3", "encodeurl": "^2.0.0", "escape-html": "^1.0.3", "etag": "^1.8.1", "fresh": "^2.0.0", "http-errors": "^2.0.1", "mime-types": "^3.0.2", "ms": "^2.1.3", "on-finished": "^2.4.1", "range-parser": "^1.2.1", "statuses": "^2.0.2" } }, "sha512-1gnZf7DFcoIcajTjTwjwuDjzuz4PPcY2StKPlsGAQ1+YH20IRVrBaXSWmdjowTJ6u8Rc01PoYOGHXfP1mYcZNQ=="], - - "serve-static": ["serve-static@2.2.1", "", { "dependencies": { "encodeurl": "^2.0.0", "escape-html": "^1.0.3", "parseurl": "^1.3.3", "send": "^1.2.0" } }, "sha512-xRXBn0pPqQTVQiC8wyQrKs2MOlX24zQ0POGaj0kultvoOCstBQM5yvOhAVSUwOMjQtTvsPWoNCHfPGwaaQJhTw=="], - - "setprototypeof": ["setprototypeof@1.2.0", "", {}, "sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw=="], - - "shebang-command": ["shebang-command@2.0.0", "", { "dependencies": { "shebang-regex": "^3.0.0" } }, "sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA=="], - - "shebang-regex": ["shebang-regex@3.0.0", "", {}, "sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A=="], - - "side-channel": ["side-channel@1.1.0", "", { "dependencies": { "es-errors": "^1.3.0", "object-inspect": "^1.13.3", "side-channel-list": "^1.0.0", "side-channel-map": "^1.0.1", "side-channel-weakmap": "^1.0.2" } }, "sha512-ZX99e6tRweoUXqR+VBrslhda51Nh5MTQwou5tnUDgbtyM0dBgmhEDtWGP/xbKn6hqfPRHujUNwz5fy/wbbhnpw=="], - - "side-channel-list": ["side-channel-list@1.0.0", "", { "dependencies": { "es-errors": "^1.3.0", "object-inspect": "^1.13.3" } }, "sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA=="], - - "side-channel-map": ["side-channel-map@1.0.1", "", { "dependencies": { "call-bound": "^1.0.2", "es-errors": "^1.3.0", "get-intrinsic": "^1.2.5", "object-inspect": "^1.13.3" } }, "sha512-VCjCNfgMsby3tTdo02nbjtM/ewra6jPHmpThenkTYh8pG9ucZ/1P8So4u4FGBek/BjpOVsDCMoLA/iuBKIFXRA=="], - - "side-channel-weakmap": ["side-channel-weakmap@1.0.2", "", { "dependencies": { "call-bound": "^1.0.2", "es-errors": "^1.3.0", "get-intrinsic": "^1.2.5", "object-inspect": "^1.13.3", "side-channel-map": "^1.0.1" } }, "sha512-WPS/HvHQTYnHisLo9McqBHOJk2FkHO/tlpvldyrnem4aeQp4hai3gythswg6p01oSoTl58rcpiFAjF2br2Ak2A=="], - - "source-map-js": ["source-map-js@1.2.1", "", {}, "sha512-UXWMKhLOwVKb728IUtQPXxfYU+usdybtUrK/8uGE8CQMvrhOpwvzDBwj0QhSL7MQc7vIsISBG8VQ8+IDQxpfQA=="], - - "statuses": ["statuses@2.0.2", "", {}, 
"sha512-DvEy55V3DB7uknRo+4iOGT5fP1slR8wQohVdknigZPMpMstaKJQWhwiYBACJE3Ul2pTnATihhBYnRhZQHGBiRw=="], - - "strtok3": ["strtok3@10.3.4", "https://registry.npmmirror.com/strtok3/-/strtok3-10.3.4.tgz", { "dependencies": { "@tokenizer/token": "^0.3.0" } }, "sha512-KIy5nylvC5le1OdaaoCJ07L+8iQzJHGH6pWDuzS+d07Cu7n1MZ2x26P8ZKIWfbK02+XIL8Mp4RkWeqdUCrDMfg=="], - - "symbol-tree": ["symbol-tree@3.2.4", "", {}, "sha512-9QNk5KwDF+Bvz+PyObkmSYjI5ksVUYtjW7AU22r2NKcfLJcXp96hkDWU3+XndOsUb+AQ9QhfzfCT2O+CNWT5Tw=="], - - "tldts": ["tldts@7.0.23", "", { "dependencies": { "tldts-core": "^7.0.23" }, "bin": { "tldts": "bin/cli.js" } }, "sha512-ASdhgQIBSay0R/eXggAkQ53G4nTJqTXqC2kbaBbdDwM7SkjyZyO0OaaN1/FH7U/yCeqOHDwFO5j8+Os/IS1dXw=="], - - "tldts-core": ["tldts-core@7.0.23", "", {}, "sha512-0g9vrtDQLrNIiCj22HSe9d4mLVG3g5ph5DZ8zCKBr4OtrspmNB6ss7hVyzArAeE88ceZocIEGkyW1Ime7fxPtQ=="], - - "toidentifier": ["toidentifier@1.0.1", "", {}, "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA=="], - - "token-types": ["token-types@6.1.2", "https://registry.npmmirror.com/token-types/-/token-types-6.1.2.tgz", { "dependencies": { "@borewit/text-codec": "^0.2.1", "@tokenizer/token": "^0.3.0", "ieee754": "^1.2.1" } }, "sha512-dRXchy+C0IgK8WPC6xvCHFRIWYUbqqdEIKPaKo/AcTUNzwLTK6AH7RjdLWsEZcAN/TBdtfUw3PYEgPr5VPr6ww=="], - - "toml": ["toml@3.0.0", "", {}, "sha512-y/mWCZinnvxjTKYhJ+pYxwD0mRLVvOtdS2Awbgxln6iEnt4rk0yBxeSBHkGJcPucRiG0e55mwWp+g/05rsrd6w=="], - - "tough-cookie": ["tough-cookie@6.0.0", "", { "dependencies": { "tldts": "^7.0.5" } }, "sha512-kXuRi1mtaKMrsLUxz3sQYvVl37B0Ns6MzfrtV5DvJceE9bPyspOqk9xxv7XbZWcfLWbFmm997vl83qUWVJA64w=="], - - "tr46": ["tr46@6.0.0", "", { "dependencies": { "punycode": "^2.3.1" } }, "sha512-bLVMLPtstlZ4iMQHpFHTR7GAGj2jxi8Dg0s2h2MafAE4uSWF98FC/3MomU51iQAMf8/qDUbKWf5GxuvvVcXEhw=="], - - "tslib": ["tslib@2.8.1", "", {}, "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w=="], - - "turndown": ["turndown@7.2.2", "", { "dependencies": { "@mixmark-io/domino": "^2.2.0" } }, "sha512-1F7db8BiExOKxjSMU2b7if62D/XOyQyZbPKq/nUwopfgnHlqXHqQ0lvfUTeUIr1lZJzOPFn43dODyMSIfvWRKQ=="], - - "type-is": ["type-is@2.0.1", "", { "dependencies": { "content-type": "^1.0.5", "media-typer": "^1.1.0", "mime-types": "^3.0.0" } }, "sha512-OZs6gsjF4vMp32qrCbiVSkrFmXtG/AZhY3t0iAMrMBiAZyV9oALtXO8hsrHbMXF9x6L3grlFuwW2oAz7cav+Gw=="], - - "uint8array-extras": ["uint8array-extras@1.5.0", "https://registry.npmmirror.com/uint8array-extras/-/uint8array-extras-1.5.0.tgz", {}, "sha512-rvKSBiC5zqCCiDZ9kAOszZcDvdAHwwIKJG33Ykj43OKcWsnmcBRL09YTU4nOeHZ8Y2a7l1MgTd08SBe9A8Qj6A=="], - - "undici-types": ["undici-types@7.16.0", "https://registry.npmmirror.com/undici-types/-/undici-types-7.16.0.tgz", {}, "sha512-Zz+aZWSj8LE6zoxD+xrjh4VfkIG8Ya6LvYkZqtUQGJPZjYl53ypCaUwWqo7eI0x66KBGeRo+mlBEkMSeSZ38Nw=="], - - "unpipe": ["unpipe@1.0.0", "", {}, "sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ=="], - - "vary": ["vary@1.1.2", "", {}, "sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg=="], - - "w3c-xmlserializer": ["w3c-xmlserializer@5.0.0", "", { "dependencies": { "xml-name-validator": "^5.0.0" } }, "sha512-o8qghlI8NZHU1lLPrpi2+Uq7abh4GGPpYANlalzWxyWteJOCsr/P+oPBA49TOLu5FTZO4d3F9MnWJfiMo4BkmA=="], - - "webidl-conversions": ["webidl-conversions@8.0.1", "", {}, "sha512-BMhLD/Sw+GbJC21C/UgyaZX41nPt8bUTg+jWyDeg7e7YN4xOM05YPSIXceACnXVtqyEw/LMClUQMtMZ+PGGpqQ=="], - - "whatwg-mimetype": 
["whatwg-mimetype@4.0.0", "", {}, "sha512-QaKxh0eNIi2mE9p2vEdzfagOKHCcj1pJ56EEHGQOVxp8r9/iszLUUV7v89x9O1p/T+NlTM5W7jW6+cz4Fq1YVg=="], - - "whatwg-url": ["whatwg-url@15.1.0", "", { "dependencies": { "tr46": "^6.0.0", "webidl-conversions": "^8.0.0" } }, "sha512-2ytDk0kiEj/yu90JOAp44PVPUkO9+jVhyf+SybKlRHSDlvOOZhdPIrr7xTH64l4WixO2cP+wQIcgujkGBPPz6g=="], - - "which": ["which@2.0.2", "", { "dependencies": { "isexe": "^2.0.0" }, "bin": { "node-which": "./bin/node-which" } }, "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA=="], - - "wrappy": ["wrappy@1.0.2", "", {}, "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ=="], - - "ws": ["ws@8.19.0", "", { "peerDependencies": { "bufferutil": "^4.0.1", "utf-8-validate": ">=5.0.2" }, "optionalPeers": ["bufferutil", "utf-8-validate"] }, "sha512-blAT2mjOEIi0ZzruJfIhb3nps74PRWTCz1IjglWEEpQl5XS/UNama6u2/rjFkDDouqr4L67ry+1aGIALViWjDg=="], - - "xml-name-validator": ["xml-name-validator@5.0.0", "", {}, "sha512-EvGK8EJ3DhaHfbRlETOWAS5pO9MZITeauHKJyb8wyajUfQUenkIg2MvLDTZ4T/TgIcm3HU0TFBgWWboAZ30UHg=="], - - "xmlchars": ["xmlchars@2.2.0", "", {}, "sha512-JZnDKK8B0RCDw84FNdDAIpZK+JuJw+s7Lz8nksI7SIuU3UXJJslUthsi+uWBUYOwPFwW7W7PRLRfUKpxjtjFCw=="], - - "zod": ["zod@4.3.6", "", {}, "sha512-rftlrkhHZOcjDwkGlnUtZZkvaPHCsDATp4pGpuOOMDaTdDDXF91wuVDJoWoPsKX/3YPQ5fHuF3STjcYyKr+Qhg=="], - - "zod-to-json-schema": ["zod-to-json-schema@3.25.1", "", { "peerDependencies": { "zod": "^3.25 || ^4" } }, "sha512-pM/SU9d3YAggzi6MtR4h7ruuQlqKtad8e9S0fmxcMi+ueAK5Korys/aWcV9LIIHTVbj01NdzxcnXSN+O74ZIVA=="], - - "@aws-crypto/util/@smithy/util-utf8": ["@smithy/util-utf8@2.3.0", "", { "dependencies": { "@smithy/util-buffer-from": "^2.2.0", "tslib": "^2.6.2" } }, "sha512-R8Rdn8Hy72KKcebgLiv8jQcQkXoLMOGGv5uI1/k0l+snqkOzQ1R0ChUBCxWMlBsFMekWjq0wRudIweFs7sKT5A=="], - - "data-urls/whatwg-mimetype": ["whatwg-mimetype@5.0.0", "", {}, "sha512-sXcNcHOC51uPGF0P/D4NVtrkjSU2fNsm9iog4ZvZJsL3rjoDAzXZhkm2MWt1y+PUdggKAYVoMAIYcs78wJ51Cw=="], - - "express/cookie": ["cookie@0.7.2", "", {}, "sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w=="], - - "jsdom/parse5": ["parse5@8.0.0", "", { "dependencies": { "entities": "^6.0.0" } }, "sha512-9m4m5GSgXjL4AjumKzq1Fgfp3Z8rsvjRNbnkVwfu2ImRqE5D0LnY2QfDen18FSY9C573YU5XxSapdHZTZ2WolA=="], - - "@aws-crypto/util/@smithy/util-utf8/@smithy/util-buffer-from": ["@smithy/util-buffer-from@2.2.0", "", { "dependencies": { "@smithy/is-array-buffer": "^2.2.0", "tslib": "^2.6.2" } }, "sha512-IJdWBbTcMQ6DA0gdNhh/BwrLkDR+ADW5Kr1aZmd4k3DIF6ezMV4R2NIAmT08wQJ3yUK82thHWmC/TnK/wpMMIA=="], - - "@aws-crypto/util/@smithy/util-utf8/@smithy/util-buffer-from/@smithy/is-array-buffer": ["@smithy/is-array-buffer@2.2.0", "", { "dependencies": { "tslib": "^2.6.2" } }, "sha512-GGP3O9QFD24uGeAXYUjwSTXARoqpZykHadOmA8G5vfJPK0/DC67qa//0qvqrJzL1xc8WQWX7/yc7fwudjPHPhA=="], - } -} diff --git a/apps/agent/mise.toml b/apps/agent/mise.toml deleted file mode 100644 index 8a8bc0a0..00000000 --- a/apps/agent/mise.toml +++ /dev/null @@ -1,15 +0,0 @@ -[tasks.dev] -alias = "dev" -description = "Start agent gateway development server" -run = "pnpm dev" -depends = ["//:pnpm-install"] - -[tasks.build] -description = "Build agent gateway" -run = "pnpm build" -depends = ["//:pnpm-install"] - -[tasks.start] -description = "Start agent gateway" -run = "pnpm start" -depends = ["//:pnpm-install"] \ No newline at end of file diff --git a/apps/agent/package.json b/apps/agent/package.json deleted file 
diff --git a/apps/agent/package.json b/apps/agent/package.json
deleted file mode 100644
index 06649574..00000000
--- a/apps/agent/package.json
+++ /dev/null
@@ -1,27 +0,0 @@
-{
-  "name": "@memoh/agent-gateway",
-  "version": "0.5.0",
-  "scripts": {
-    "dev": "bun run --watch src/index.ts",
-    "build": "bun build src/index.ts --outfile dist/index.js --target bun --minify",
-    "start": "pnpm run build && bun run dist/index.js"
-  },
-  "dependencies": {
-    "@elysiajs/bearer": "^1.4.2",
-    "@elysiajs/cors": "^1.4.1",
-    "@memoh/agent": "workspace:*",
-    "@memoh/config": "workspace:*",
-    "@modelcontextprotocol/sdk": "^1.25.2",
-    "@mozilla/readability": "^0.6.0",
-    "@types/turndown": "^5.0.6",
-    "ai": "^6.0.25",
-    "elysia": "latest",
-    "toml": "^3.0.0",
-    "turndown": "^7.2.2",
-    "zod": "^4.3.5"
-  },
-  "devDependencies": {
-    "bun-types": "latest"
-  },
-  "module": "src/index.js"
-}
diff --git a/apps/agent/src/index.ts b/apps/agent/src/index.ts
deleted file mode 100644
index 3eefc96a..00000000
--- a/apps/agent/src/index.ts
+++ /dev/null
@@ -1,94 +0,0 @@
-import { Elysia } from 'elysia'
-import { chatModule } from './modules/chat'
-import { corsMiddleware } from './middlewares/cors'
-import { errorMiddleware } from './middlewares/error'
-import { loadConfig, getBaseUrl as getBaseUrlByConfig } from '@memoh/config'
-import { AgentAuthContext, AuthFetcher } from '@memoh/agent'
-
-const configuredPath = process.env.MEMOH_CONFIG_PATH?.trim() || process.env.CONFIG_PATH?.trim()
-const configPath = configuredPath && configuredPath.length > 0 ? configuredPath : '../../config.toml'
-const config = loadConfig(configPath)
-
-export const getBaseUrl = () => {
-  return getBaseUrlByConfig(config)
-}
-
-function parseJwtExp(token: string): number | null {
-  try {
-    const base64Url = token.split('.')[1]
-    if (!base64Url) return null
-    const base64 = base64Url.replace(/-/g, '+').replace(/_/g, '/')
-    const jsonPayload = Buffer.from(base64, 'base64').toString('utf8')
-    const payload = JSON.parse(jsonPayload)
-    return payload.exp ? payload.exp * 1000 : null
-  } catch (e) {
-    console.error('Failed to parse JWT expiration from token', e)
-    return null
-  }
-}
-
-export const createAuthFetcher = (auth: AgentAuthContext): AuthFetcher => {
-  let refreshPromise: Promise<string> | null = null
-  return async (url: string, options?: RequestInit) => {
-    if (auth.bearer) {
-      const exp = parseJwtExp(auth.bearer)
-      if (exp !== null && exp - Date.now() < 120000) { // Refresh if expiring in < 2 mins
-        if (!refreshPromise) {
-          refreshPromise = (async () => {
-            const refreshUrl = new URL('/auth/refresh', `${getBaseUrl().replace(/\/$/, '')}/`).toString()
-            const res = await fetch(refreshUrl, {
-              method: 'POST',
-              headers: { 'Authorization': `Bearer ${auth.bearer}` }
-            })
-            if (res.ok) {
-              const data = await res.json()
-              return data.access_token
-            }
-            throw new Error('Failed to refresh token')
-          })().finally(() => {
-            refreshPromise = null
-          })
-        }
-        try {
-          auth.bearer = await refreshPromise
-        } catch (e) {
-          console.error('Token refresh failed', e)
-          throw e
-        }
-      }
-    }
-
-    const requestOptions = options ?? {}
-    const headers = new Headers(requestOptions.headers || {})
-    if (auth.bearer && !headers.has('Authorization')) {
-      headers.set('Authorization', `Bearer ${auth.bearer}`)
-    }
-
-    const baseURL = getBaseUrl()
-    const requestURL = /^https?:\/\//i.test(url)
-      ? url
-      : new URL(url, `${baseURL.replace(/\/$/, '')}/`).toString()
-
-    return await fetch(requestURL, {
-      ...requestOptions,
-      headers,
-    })
-  }
-}
-
-const app = new Elysia()
-  .use(corsMiddleware)
-  .use(errorMiddleware)
-  .get('/health', () => ({
-    status: 'ok',
-  }))
-  .use(chatModule)
-  .listen({
-    port: config.agent_gateway.port ?? 8081,
-    hostname: config.agent_gateway.host ?? '127.0.0.1',
-    idleTimeout: 255, // max allowed by Bun, to accommodate long-running tool calls
-  })
-
-console.log(
-  `⚙️ Agent Gateway is running at ${app.server?.hostname}:${app.server?.port}`,
-)
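Note: the gateway above refreshed its bearer token client-side by decoding the JWT exp claim without verifying the signature (parseJwtExp). For reference, the equivalent unverified-decode check needs only the Go standard library; this is a minimal sketch, and the function names are illustrative rather than taken from the new internal/agent package:

// Illustrative only: decode a JWT's exp claim without verification,
// mirroring the "should we refresh soon?" check the deleted gateway made.
package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
	"strings"
	"time"
)

// jwtExpiry returns the token's exp claim as a time.Time. The signature is
// deliberately not checked; this only decides whether to refresh.
func jwtExpiry(token string) (time.Time, error) {
	parts := strings.Split(token, ".")
	if len(parts) != 3 {
		return time.Time{}, fmt.Errorf("malformed JWT")
	}
	payload, err := base64.RawURLEncoding.DecodeString(parts[1])
	if err != nil {
		return time.Time{}, err
	}
	var claims struct {
		Exp int64 `json:"exp"`
	}
	if err := json.Unmarshal(payload, &claims); err != nil {
		return time.Time{}, err
	}
	return time.Unix(claims.Exp, 0), nil
}

func main() {
	// Example: refresh when fewer than two minutes remain, as the gateway did.
	exp, err := jwtExpiry("eyJhbGciOiJIUzI1NiJ9.eyJleHAiOjE3MDAwMDAwMDB9.sig")
	if err == nil && time.Until(exp) < 2*time.Minute {
		fmt.Println("token expiring soon; refresh")
	}
}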
diff --git a/apps/agent/src/middlewares/bearer.ts b/apps/agent/src/middlewares/bearer.ts
deleted file mode 100644
index a0829c84..00000000
--- a/apps/agent/src/middlewares/bearer.ts
+++ /dev/null
@@ -1,3 +0,0 @@
-import { bearer } from '@elysiajs/bearer'
-
-export const bearerMiddleware = bearer()
\ No newline at end of file
diff --git a/apps/agent/src/middlewares/cors.ts b/apps/agent/src/middlewares/cors.ts
deleted file mode 100644
index fdb7aa5a..00000000
--- a/apps/agent/src/middlewares/cors.ts
+++ /dev/null
@@ -1,9 +0,0 @@
-import cors from '@elysiajs/cors'
-
-export const corsMiddleware = cors({
-  origin: '*',
-  methods: ['GET', 'POST', 'PUT', 'DELETE', 'OPTIONS'],
-  allowedHeaders: ['Content-Type', 'Authorization'],
-  exposeHeaders: ['Content-Type', 'Authorization'],
-  credentials: true,
-})
\ No newline at end of file
diff --git a/apps/agent/src/middlewares/error.ts b/apps/agent/src/middlewares/error.ts
deleted file mode 100644
index 98a5167c..00000000
--- a/apps/agent/src/middlewares/error.ts
+++ /dev/null
@@ -1,118 +0,0 @@
-import { Elysia } from 'elysia'
-
-export interface ErrorResponse {
-  success: false
-  error: string
-  code?: string
-  details?: unknown
-}
-
-export const errorMiddleware = new Elysia({ name: 'error' })
-  .onError(({ code, error, set, request }) => {
-    const url = new URL(request.url)
-    const status = set.status ?? 500
-    const message = error instanceof Error ? error.message : String(error)
-    console.error('[Error]', {
-      method: request.method,
-      path: url.pathname,
-      code,
-      status,
-      message,
-    })
-
-    switch (code) {
-      case 'VALIDATION':
-        set.status = 400
-        return {
-          success: false,
-          error: 'Validation failed',
-          code: 'VALIDATION_ERROR',
-          details: error.message,
-        } satisfies ErrorResponse
-
-      case 'NOT_FOUND':
-        set.status = 404
-        return {
-          success: false,
-          error: 'Resource not found',
-          code: 'NOT_FOUND',
-        } satisfies ErrorResponse
-
-      case 'PARSE':
-        set.status = 400
-        return {
-          success: false,
-          error: 'Invalid request format',
-          code: 'PARSE_ERROR',
-          details: error.message,
-        } satisfies ErrorResponse
-
-      case 'INTERNAL_SERVER_ERROR':
-        set.status = 500
-        return {
-          success: false,
-          error: 'Internal server error',
-          code: 'INTERNAL_SERVER_ERROR',
-        } satisfies ErrorResponse
-
-      case 'UNKNOWN':
-      default:
-        if (error instanceof Error) {
-          const message = error.message
-
-          if (
-            message.includes('No bearer token') ||
-            message.includes('Invalid or expired token')
-          ) {
-            set.status = 401
-            return {
-              success: false,
-              error: message,
-              code: 'UNAUTHORIZED',
-            } satisfies ErrorResponse
-          }
-
-          if (message.includes('Forbidden') || message.includes('Admin access required')) {
-            set.status = 403
-            return {
-              success: false,
-              error: message,
-              code: 'FORBIDDEN',
-            } satisfies ErrorResponse
-          }
-
-          if (message.includes('already exists')) {
-            set.status = 409
-            return {
-              success: false,
-              error: message,
-              code: 'CONFLICT',
-            } satisfies ErrorResponse
-          }
-
-          if (message.includes('not found')) {
-            set.status = 404
-            return {
-              success: false,
-              error: message,
-              code: 'NOT_FOUND',
-            } satisfies ErrorResponse
-          }
-
-          set.status = 500
-          return {
-            success: false,
-            error: message,
-            code: 'ERROR',
-          } satisfies ErrorResponse
-        }
-
-        set.status = 500
-        return {
-          success: false,
-          error: 'An unexpected error occurred',
-          code: 'UNKNOWN_ERROR',
-        } satisfies ErrorResponse
-    }
-  })
-
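The deleted error.ts picked HTTP statuses for unknown errors by matching message substrings. If the in-process agent ever needs a similar last-resort mapping, the Go shape would be roughly as follows; this is a hedged sketch, not the actual internal/agent code, and the names are hypothetical:

// Illustrative only: the substring-to-status heuristic from error.ts in Go.
package main

import (
	"errors"
	"fmt"
	"net/http"
	"strings"
)

// statusForError mirrors the UNKNOWN branch above: inspect the message,
// fall back to 500 when nothing matches.
func statusForError(err error) int {
	msg := err.Error()
	switch {
	case strings.Contains(msg, "No bearer token"),
		strings.Contains(msg, "Invalid or expired token"):
		return http.StatusUnauthorized
	case strings.Contains(msg, "Forbidden"),
		strings.Contains(msg, "Admin access required"):
		return http.StatusForbidden
	case strings.Contains(msg, "already exists"):
		return http.StatusConflict
	case strings.Contains(msg, "not found"):
		return http.StatusNotFound
	default:
		return http.StatusInternalServerError
	}
}

func main() {
	fmt.Println(statusForError(errors.New("bot not found"))) // 404
}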
diff --git a/apps/agent/src/models.ts b/apps/agent/src/models.ts
deleted file mode 100644
index e6903ca4..00000000
--- a/apps/agent/src/models.ts
+++ /dev/null
@@ -1,97 +0,0 @@
-import z from 'zod'
-
-export const AgentSkillModel = z.object({
-  name: z.string().min(1, 'Skill name is required'),
-  description: z.string().min(1, 'Skill description is required'),
-  content: z.string().min(1, 'Skill content is required'),
-  metadata: z.record(z.string(), z.any()).optional(),
-})
-
-export const ClientTypeModel = z.enum([
-  'openai-responses', 'openai-completions', 'anthropic-messages', 'google-generative-ai',
-])
-
-export const ReasoningConfigModel = z.object({
-  enabled: z.boolean(),
-  effort: z.enum(['low', 'medium', 'high']),
-}).optional()
-
-export const ModelConfigModel = z.object({
-  modelId: z.string().min(1, 'Model ID is required'),
-  clientType: ClientTypeModel,
-  input: z.array(z.enum(['text', 'image', 'audio', 'video', 'file'])),
-  apiKey: z.string().min(1, 'API key is required'),
-  baseUrl: z.string(),
-  reasoning: ReasoningConfigModel,
-})
-
-export const IdentityContextModel = z.object({
-  botId: z.string().min(1, 'Bot ID is required'),
-  channelIdentityId: z.string().min(1, 'Channel identity ID is required'),
-  displayName: z.string().min(1, 'Display name is required'),
-  currentPlatform: z.string().optional(),
-  replyTarget: z.string().optional(),
-  conversationType: z.string().optional(),
-  sessionToken: z.string().optional(),
-})
-
-export const ScheduleModel = z.object({
-  id: z.string().min(1, 'Schedule ID is required'),
-  name: z.string().min(1, 'Schedule name is required'),
-  description: z.string().min(1, 'Schedule description is required'),
-  pattern: z.string().min(1, 'Schedule pattern is required'),
-  maxCalls: z.number().nullable().optional(),
-  command: z.string().min(1, 'Schedule command is required'),
-})
-
-export const HeartbeatModel = z.object({
-  interval: z.number().int().positive().default(30),
-})
-
-export const LoopDetectionModel = z.object({
-  enabled: z.boolean().default(false),
-}).optional()
-
-export const AttachmentModel = z.object({
-  contentHash: z.string().optional(),
-  type: z.string().min(1, 'Attachment type is required'),
-  mime: z.string().optional(),
-  size: z.number().int().nonnegative().optional(),
-  name: z.string().optional(),
-  transport: z.enum(['inline_data_url', 'public_url', 'tool_file_ref']),
-  payload: z.string().min(1, 'Attachment payload is required'),
-  metadata: z.record(z.string(), z.any()).optional(),
-})
-
-export const HTTPMCPConnectionModel = z.object({
-  name: z.string().min(1, 'Name is required'),
-  type: z.literal('http'),
-  url: z.string().min(1, 'URL is required'),
-  headers: z.record(z.string(), z.string()).optional(),
-})
-
-export const SSEMCPConnectionModel = z.object({
-  name: z.string().min(1, 'Name is required'),
-  type: z.literal('sse'),
-  url: z.string().min(1, 'URL is required'),
-  headers: z.record(z.string(), z.string()).optional(),
-})
-
-export const StdioMCPConnectionModel = z.object({
-  name: z.string().min(1, 'Name is required'),
-  type: z.literal('stdio'),
-  command: z.string().min(1, 'Command is required'),
-  args: z.array(z.string()),
-  env: z.record(z.string(), z.string()).optional(),
-  cwd: z.string().optional(),
-})
-
-export const MCPConnectionModel = z.union([HTTPMCPConnectionModel, SSEMCPConnectionModel, StdioMCPConnectionModel])
-
-export const InboxItemModel = z.object({
-  id: z.string(),
-  source: z.string(),
-  header: z.record(z.string(), z.unknown()).default({}),
-  content: z.string().default(''),
-  createdAt: z.string(),
-})
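These Zod schemas were the HTTP contract between the Go server and the gateway; with the agent in-process, the same shape travels as plain Go values. As a hedged sketch only (the real structs in internal/agent may be named and laid out differently), the ModelConfig portion maps to Go roughly like this:

// Illustrative Go mirror of the ModelConfigModel contract above.
package main

import "fmt"

type ReasoningConfig struct {
	Enabled bool   `json:"enabled"`
	Effort  string `json:"effort"` // "low" | "medium" | "high"
}

type ModelConfig struct {
	ModelID    string           `json:"modelId"`
	ClientType string           `json:"clientType"` // e.g. "anthropic-messages"
	Input      []string         `json:"input"`      // "text", "image", "audio", "video", "file"
	APIKey     string           `json:"apiKey"`
	BaseURL    string           `json:"baseUrl"`
	Reasoning  *ReasoningConfig `json:"reasoning,omitempty"`
}

func main() {
	cfg := ModelConfig{ModelID: "claude-sonnet", ClientType: "anthropic-messages", APIKey: "k"}
	fmt.Println(cfg.ModelID)
}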
diff --git a/apps/agent/src/modules/chat.ts b/apps/agent/src/modules/chat.ts
deleted file mode 100644
index 5e8835b7..00000000
--- a/apps/agent/src/modules/chat.ts
+++ /dev/null
@@ -1,256 +0,0 @@
-import { Elysia } from 'elysia'
-import z from 'zod'
-import { createAgent, ModelConfig, type AgentStreamAction } from '@memoh/agent'
-import { createAuthFetcher, getBaseUrl } from '../index'
-import { bearerMiddleware } from '../middlewares/bearer'
-import { AgentSkillModel, AttachmentModel, HeartbeatModel, IdentityContextModel, InboxItemModel, LoopDetectionModel, MCPConnectionModel, ModelConfigModel, ScheduleModel } from '../models'
-import { sseChunked } from '../utils/sse'
-
-const AgentModel = z.object({
-  model: ModelConfigModel,
-  activeContextTime: z.number(),
-  channels: z.array(z.string()),
-  currentChannel: z.string(),
-  messages: z.array(z.any()),
-  usableSkills: z.array(AgentSkillModel).optional().default([]),
-  skills: z.array(z.string()),
-  identity: IdentityContextModel,
-  attachments: z.array(AttachmentModel).optional().default([]),
-  mcpConnections: z.array(MCPConnectionModel).optional().default([]),
-  inbox: z.array(InboxItemModel).optional().default([]),
-  loopDetection: LoopDetectionModel,
-})
-
-const StreamBodyModel = AgentModel.extend({
-  query: z.string().optional().default(''),
-})
-
-function buildAgentAndStream(body: z.infer<typeof StreamBodyModel>, bearer: string, signal?: AbortSignal) {
-  const auth = {
-    bearer,
-    baseUrl: getBaseUrl(),
-  }
-  const authFetcher = createAuthFetcher(auth)
-  const { stream } = createAgent({
-    model: body.model as ModelConfig,
-    activeContextTime: body.activeContextTime,
-    channels: body.channels,
-    currentChannel: body.currentChannel,
-    identity: body.identity,
-    auth,
-    skills: body.usableSkills,
-    mcpConnections: body.mcpConnections,
-    inbox: body.inbox,
-    loopDetection: body.loopDetection,
-  }, authFetcher)
-  return stream({
-    query: body.query,
-    messages: body.messages,
-    skills: body.skills,
-    attachments: body.attachments,
-    signal,
-  })
-}
-
-export const chatModule = new Elysia({ prefix: '/chat' })
-  .use(bearerMiddleware)
-  .post('/', async ({ body, bearer }) => {
-    console.log('chat', body)
-    const auth = {
-      bearer: bearer!,
-      baseUrl: getBaseUrl(),
-    }
-    const authFetcher = createAuthFetcher(auth)
-    const { ask } = createAgent({
-      model: body.model as ModelConfig,
-      activeContextTime: body.activeContextTime,
-      channels: body.channels,
-      currentChannel: body.currentChannel,
-      identity: body.identity,
-      auth,
-      skills: body.usableSkills,
-      mcpConnections: body.mcpConnections,
-      inbox: body.inbox,
-      loopDetection: body.loopDetection,
-    }, authFetcher)
-    return ask({
-      query: body.query,
-      messages: body.messages,
-      skills: body.skills,
-      attachments: body.attachments,
-    })
-  }, {
-    body: AgentModel.extend({
-      query: z.string().optional().default(''),
-    }),
-  })
-  .post('/stream', async function* ({ body, bearer }) {
-    console.log('stream', body)
-    const abortController = new AbortController()
-    try {
-      for await (const action of buildAgentAndStream(body, bearer!, abortController.signal)) {
-        yield sseChunked(JSON.stringify(action))
-      }
-    } catch (error) {
-      if (abortController.signal.aborted) return
-      console.error(error)
-      const message = error instanceof Error && error.message.trim()
-        ? error.message
-        : 'Internal server error'
-      yield sseChunked(JSON.stringify({
-        type: 'error',
-        message,
-      }))
-    } finally {
-      abortController.abort()
-    }
-  }, {
-    body: StreamBodyModel,
-  })
-  .ws('/ws', (() => {
-    const sessions = new Map()
-    return {
-      open(ws: { raw: unknown }) {
-        sessions.set(ws.raw, { abortController: null, streaming: false })
-      },
-      async message(ws: { raw: unknown; send: (data: string) => void }, raw: unknown) {
-        const parsed = typeof raw === 'string' ? JSON.parse(raw) : raw
-        const session = sessions.get(ws.raw)
-        if (!session) return
-
-        if (parsed.type === 'abort') {
-          session.abortController?.abort()
-          return
-        }
-        if (parsed.type === 'start') {
-          if (session.streaming) {
-            ws.send(JSON.stringify({ type: 'error', message: 'Already streaming' }))
-            return
-          }
-          session.streaming = true
-          const abortController = new AbortController()
-          session.abortController = abortController
-          const bearer = parsed.bearer as string | undefined
-          if (!bearer) {
-            ws.send(JSON.stringify({ type: 'error', message: 'Missing bearer token' }))
-            session.streaming = false
-            return
-          }
-          try {
-            const body = StreamBodyModel.parse(parsed)
-            const streamIter = buildAgentAndStream(body, bearer, abortController.signal)
-            for await (const action of streamIter) {
-              ws.send(JSON.stringify(action))
-            }
-          } catch (error) {
-            if (!abortController.signal.aborted) {
-              console.error(error)
-              const message = error instanceof Error && error.message.trim()
-                ? error.message
-                : 'Internal server error'
-              ws.send(JSON.stringify({ type: 'error', message }))
-            }
-          } finally {
-            session.streaming = false
-            session.abortController = null
-          }
-        }
-      },
-      close(ws: { raw: unknown }) {
-        const session = sessions.get(ws.raw)
-        if (session) {
-          session.abortController?.abort()
-          sessions.delete(ws.raw)
-        }
-      },
-    }
-  })())
-  .post('/trigger-schedule', async ({ body, bearer }) => {
-    console.log('trigger-schedule', body)
-    const auth = {
-      bearer: bearer!,
-      baseUrl: getBaseUrl(),
-    }
-    const authFetcher = createAuthFetcher(auth)
-    const { triggerSchedule } = createAgent({
-      model: body.model as ModelConfig,
-      activeContextTime: body.activeContextTime,
-      channels: body.channels,
-      currentChannel: body.currentChannel,
-      identity: body.identity,
-      auth,
-      skills: body.usableSkills,
-      mcpConnections: body.mcpConnections,
-      inbox: body.inbox,
-      loopDetection: body.loopDetection,
-    }, authFetcher)
-    return triggerSchedule({
-      schedule: body.schedule,
-      messages: body.messages,
-      skills: body.skills,
-    })
-  }, {
-    body: AgentModel.extend({
-      schedule: ScheduleModel,
-    }),
-  })
-  .post('/trigger-heartbeat', async ({ body, bearer }) => {
-    console.log('trigger-heartbeat', body)
-    const auth = {
-      bearer: bearer!,
-      baseUrl: getBaseUrl(),
-    }
-    const authFetcher = createAuthFetcher(auth)
-    const { triggerHeartbeat } = createAgent({
-      model: body.model as ModelConfig,
-      activeContextTime: body.activeContextTime,
-      channels: body.channels,
-      currentChannel: body.currentChannel,
-      identity: body.identity,
-      auth,
-      skills: body.usableSkills,
-      mcpConnections: body.mcpConnections,
-      inbox: body.inbox,
-      loopDetection: body.loopDetection,
-    }, authFetcher)
-    return triggerHeartbeat({
-      heartbeat: body.heartbeat,
-      messages: body.messages,
-      skills: body.skills,
-    })
-  }, {
-    body: AgentModel.extend({
-      heartbeat: HeartbeatModel,
-    }),
-  })
-  .post('/subagent', async ({ body, bearer }) => {
-    console.log('subagent', body)
-    const auth = {
-      bearer: bearer!,
-      baseUrl: getBaseUrl(),
-    }
-    const authFetcher = createAuthFetcher(auth)
-    const { askAsSubagent } = createAgent({
-      model: body.model as ModelConfig,
-      identity: body.identity,
-      auth,
-      isSubagent: true,
-      loopDetection: body.loopDetection,
-    }, authFetcher)
-    return askAsSubagent({
-      messages: body.messages,
-      input: body.query,
-      name: body.name,
-      description: body.description,
-    })
-  }, {
-    body: z.object({
-      model: ModelConfigModel,
-      identity: IdentityContextModel,
-      messages: z.array(z.any()).optional().default([]),
-      query: z.string(),
-      name: z.string(),
-      description: z.string(),
-      loopDetection: LoopDetectionModel,
-    }),
-  })
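The next deleted file, utils/sse.ts (below), chunks SSE payloads while refusing to split a UTF-16 surrogate pair across chunks. The same hazard exists one level down in Go, where naive byte slicing can cut a multi-byte UTF-8 character in half. A minimal sketch of the boundary-safe cut follows; the function name is illustrative, not the actual helper in this repo:

// Illustrative: back a byte-level split index up to a UTF-8 rune boundary,
// the byte-level analogue of chunkString's surrogate-pair handling below.
package main

import (
	"fmt"
	"unicode/utf8"
)

// safeSplit returns the largest index <= max at which s can be cut without
// splitting a multi-byte UTF-8 character.
func safeSplit(s string, max int) int {
	if max >= len(s) {
		return len(s)
	}
	cut := max
	for cut > 0 && !utf8.RuneStart(s[cut]) {
		cut-- // back up past continuation bytes (0b10xxxxxx)
	}
	return cut
}

func main() {
	s := "你好world" // "你" and "好" are 3 bytes each in UTF-8
	cut := safeSplit(s, 4)             // a naive cut at byte 4 would split "好"
	fmt.Println(s[:cut], "|", s[cut:]) // 你 | 好world
}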
diff --git a/apps/agent/src/utils/sse.ts b/apps/agent/src/utils/sse.ts
deleted file mode 100644
index 6a6c84d7..00000000
--- a/apps/agent/src/utils/sse.ts
+++ /dev/null
@@ -1,50 +0,0 @@
-export const defaultSSEChunkSize = 16 * 1024
-
-export function sseChunked(data: string, chunkSize: number = defaultSSEChunkSize) {
-  return {
-    sse: true as const,
-    toSSE: () => {
-      const out: string[] = []
-      for (const chunk of chunkString(data, chunkSize)) {
-        out.push(`data:${chunk}\n`)
-      }
-      out.push('\n')
-      return out.join('')
-    },
-  }
-}
-
-export function* chunkString(input: string, maxLen: number): Generator<string> {
-  if (maxLen <= 0) {
-    yield input
-    return
-  }
-  const isHighSurrogate = (c: number) => c >= 0xd800 && c <= 0xdbff
-  const isLowSurrogate = (c: number) => c >= 0xdc00 && c <= 0xdfff
-  let i = 0
-  while (i < input.length) {
-    let end = Math.min(i + maxLen, input.length)
-    if (end < input.length) {
-      const last = input.charCodeAt(end - 1)
-      if (isHighSurrogate(last)) {
-        const next = input.charCodeAt(end)
-        if (isLowSurrogate(next)) {
-          end += 1
-        } else {
-          end -= 1
-        }
-      }
-    }
-    if (end <= i) {
-      const first = input.charCodeAt(i)
-      const second = i + 1 < input.length ? input.charCodeAt(i + 1) : -1
-      if (isHighSurrogate(first) && isLowSurrogate(second)) {
-        end = Math.min(i + 2, input.length)
-      } else {
-        end = Math.min(i + 1, input.length)
-      }
-    }
-    yield input.slice(i, end)
-    i = end
-  }
-}
*/ - // "typeRoots": [], /* Specify multiple folders that act like './node_modules/@types'. */ - "types": ["bun-types"], /* Specify type package names to be included without being referenced in a source file. */ - // "allowUmdGlobalAccess": true, /* Allow accessing UMD globals from modules. */ - // "moduleSuffixes": [], /* List of file name suffixes to search when resolving a module. */ - // "resolveJsonModule": true, /* Enable importing .json files. */ - // "noResolve": true, /* Disallow 'import's, 'require's or ''s from expanding the number of files TypeScript should add to a project. */ - - /* JavaScript Support */ - // "allowJs": true, /* Allow JavaScript files to be a part of your program. Use the 'checkJS' option to get errors from these files. */ - // "checkJs": true, /* Enable error reporting in type-checked JavaScript files. */ - // "maxNodeModuleJsDepth": 1, /* Specify the maximum folder depth used for checking JavaScript files from 'node_modules'. Only applicable with 'allowJs'. */ - - /* Emit */ - // "declaration": true, /* Generate .d.ts files from TypeScript and JavaScript files in your project. */ - // "declarationMap": true, /* Create sourcemaps for d.ts files. */ - // "emitDeclarationOnly": true, /* Only output d.ts files and not JavaScript files. */ - // "sourceMap": true, /* Create source map files for emitted JavaScript files. */ - // "outFile": "./", /* Specify a file that bundles all outputs into one JavaScript file. If 'declaration' is true, also designates a file that bundles all .d.ts output. */ - // "outDir": "./", /* Specify an output folder for all emitted files. */ - // "removeComments": true, /* Disable emitting comments. */ - // "noEmit": true, /* Disable emitting files from a compilation. */ - // "importHelpers": true, /* Allow importing helper functions from tslib once per project, instead of including them per-file. */ - // "importsNotUsedAsValues": "remove", /* Specify emit/checking behavior for imports that are only used for types. */ - // "downlevelIteration": true, /* Emit more compliant, but verbose and less performant JavaScript for iteration. */ - // "sourceRoot": "", /* Specify the root path for debuggers to find the reference source code. */ - // "mapRoot": "", /* Specify the location where debugger should locate map files instead of generated locations. */ - // "inlineSourceMap": true, /* Include sourcemap files inside the emitted JavaScript. */ - // "inlineSources": true, /* Include source code in the sourcemaps inside the emitted JavaScript. */ - // "emitBOM": true, /* Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. */ - // "newLine": "crlf", /* Set the newline character for emitting files. */ - // "stripInternal": true, /* Disable emitting declarations that have '@internal' in their JSDoc comments. */ - // "noEmitHelpers": true, /* Disable generating custom helper functions like '__extends' in compiled output. */ - // "noEmitOnError": true, /* Disable emitting files if any type checking errors are reported. */ - // "preserveConstEnums": true, /* Disable erasing 'const enum' declarations in generated code. */ - // "declarationDir": "./", /* Specify the output directory for generated declaration files. */ - // "preserveValueImports": true, /* Preserve unused imported values in the JavaScript output that would otherwise be removed. */ - - /* Interop Constraints */ - // "isolatedModules": true, /* Ensure that each file can be safely transpiled without relying on other imports. 
*/ - // "allowSyntheticDefaultImports": true, /* Allow 'import x from y' when a module doesn't have a default export. */ - "esModuleInterop": true, /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables 'allowSyntheticDefaultImports' for type compatibility. */ - // "preserveSymlinks": true, /* Disable resolving symlinks to their realpath. This correlates to the same flag in node. */ - "forceConsistentCasingInFileNames": true, /* Ensure that casing is correct in imports. */ - - /* Type Checking */ - "strict": true, /* Enable all strict type-checking options. */ - // "noImplicitAny": true, /* Enable error reporting for expressions and declarations with an implied 'any' type. */ - // "strictNullChecks": true, /* When type checking, take into account 'null' and 'undefined'. */ - // "strictFunctionTypes": true, /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */ - // "strictBindCallApply": true, /* Check that the arguments for 'bind', 'call', and 'apply' methods match the original function. */ - // "strictPropertyInitialization": true, /* Check for class properties that are declared but not set in the constructor. */ - // "noImplicitThis": true, /* Enable error reporting when 'this' is given the type 'any'. */ - // "useUnknownInCatchVariables": true, /* Default catch clause variables as 'unknown' instead of 'any'. */ - // "alwaysStrict": true, /* Ensure 'use strict' is always emitted. */ - // "noUnusedLocals": true, /* Enable error reporting when local variables aren't read. */ - // "noUnusedParameters": true, /* Raise an error when a function parameter isn't read. */ - // "exactOptionalPropertyTypes": true, /* Interpret optional property types as written, rather than adding 'undefined'. */ - // "noImplicitReturns": true, /* Enable error reporting for codepaths that do not explicitly return in a function. */ - // "noFallthroughCasesInSwitch": true, /* Enable error reporting for fallthrough cases in switch statements. */ - // "noUncheckedIndexedAccess": true, /* Add 'undefined' to a type when accessed using an index. */ - // "noImplicitOverride": true, /* Ensure overriding members in derived classes are marked with an override modifier. */ - // "noPropertyAccessFromIndexSignature": true, /* Enforces using indexed accessors for keys declared using an indexed type. */ - // "allowUnusedLabels": true, /* Disable error reporting for unused labels. */ - // "allowUnreachableCode": true, /* Disable error reporting for unreachable code. */ - - /* Completeness */ - // "skipDefaultLibCheck": true, /* Skip type checking .d.ts files that are included with TypeScript. */ - "skipLibCheck": true /* Skip type checking all .d.ts files. */ - } -} diff --git a/apps/web/src/composables/api/useChat.types.ts b/apps/web/src/composables/api/useChat.types.ts index af288f76..162a046a 100644 --- a/apps/web/src/composables/api/useChat.types.ts +++ b/apps/web/src/composables/api/useChat.types.ts @@ -21,6 +21,8 @@ export interface MessageAsset { mime: string size_bytes: number storage_key: string + name?: string + metadata?: Record } export interface Message { diff --git a/apps/web/src/pages/bots/components/bot-settings.vue b/apps/web/src/pages/bots/components/bot-settings.vue index 434f50a5..480ea25e 100644 --- a/apps/web/src/pages/bots/components/bot-settings.vue +++ b/apps/web/src/pages/bots/components/bot-settings.vue @@ -446,6 +446,21 @@ const memoryProviders = computed(() => memoryProviderData.value ?? 
 const ttsProviders = computed(() => ttsProviderData.value ?? [])
 const ttsModels = computed(() => ttsModelData.value ?? [])
 const browserContexts = computed(() => browserContextData.value ?? [])
+
+// ---- Form ----
+const form = reactive({
+  chat_model_id: '',
+  search_provider_id: '',
+  memory_provider_id: '',
+  tts_model_id: '',
+  browser_context_id: '',
+  max_context_load_time: 0,
+  max_context_tokens: 0,
+  language: '',
+  reasoning_enabled: false,
+  reasoning_effort: 'medium',
+})
+
 const selectedMemoryProvider = computed(() =>
   memoryProviders.value.find((provider) => provider.id === form.memory_provider_id),
 )
@@ -542,20 +557,6 @@ const encoderHealthLabel = computed(() =>
     : t('bots.settings.memoryEncoderHealth'),
 )
 
-// ---- Form ----
-const form = reactive({
-  chat_model_id: '',
-  search_provider_id: '',
-  memory_provider_id: '',
-  tts_model_id: '',
-  browser_context_id: '',
-  max_context_load_time: 0,
-  max_context_tokens: 0,
-  language: '',
-  reasoning_enabled: false,
-  reasoning_effort: 'medium',
-})
-
 watch(settings, (val) => {
   if (val) {
     form.chat_model_id = val.chat_model_id ?? ''
diff --git a/apps/web/src/pages/chat/components/attachment-block.vue b/apps/web/src/pages/chat/components/attachment-block.vue
index 71b818fc..85215015 100644
--- a/apps/web/src/pages/chat/components/attachment-block.vue
+++ b/apps/web/src/pages/chat/components/attachment-block.vue
@@ -43,23 +43,6 @@ /> - - - - - {{ String(att.name ?? 'file') }} - - + + + + {{ String(att.name ?? 'file') }} + +
 ): boolean {
 }
 
 function getContainerPath(att: AttachmentItem): string {
-  return String(att.path ?? '').trim()
+  const direct = String(att.path ?? '').trim()
+  if (direct) return direct
+  const meta = att.metadata as Record<string, unknown> | undefined
+  return String(meta?.source_path ?? '').trim()
 }
 
 function getDisplayName(att: AttachmentItem): string {
diff --git a/apps/web/src/store/chat-list.ts b/apps/web/src/store/chat-list.ts
index 1214a77f..668e9f45 100644
--- a/apps/web/src/store/chat-list.ts
+++ b/apps/web/src/store/chat-list.ts
@@ -164,6 +164,8 @@ export const useChatStore = defineStore('chat', () => {
       mime: a.mime,
       size: a.size_bytes,
      storage_key: a.storage_key,
+      name: a.name || undefined,
+      metadata: a.metadata || undefined,
     }))
     return [{ type: 'attachment', attachments: items }]
   }
diff --git a/cmd/agent/main.go b/cmd/agent/main.go
index b8a37202..20d9b577 100644
--- a/cmd/agent/main.go
+++ b/cmd/agent/main.go
@@ -22,6 +22,8 @@ import (
 	dbembed "github.com/memohai/memoh/db"
 	"github.com/memohai/memoh/internal/accounts"
 	"github.com/memohai/memoh/internal/acl"
+	agentpkg "github.com/memohai/memoh/internal/agent"
+	agenttools "github.com/memohai/memoh/internal/agent/tools"
 	"github.com/memohai/memoh/internal/bind"
 	"github.com/memohai/memoh/internal/boot"
 	"github.com/memohai/memoh/internal/bots"
@@ -56,19 +58,6 @@ import (
 	"github.com/memohai/memoh/internal/inbox"
 	"github.com/memohai/memoh/internal/logger"
 	"github.com/memohai/memoh/internal/mcp"
-	mcpbrowser "github.com/memohai/memoh/internal/mcp/providers/browser"
-	mcpcontacts "github.com/memohai/memoh/internal/mcp/providers/contacts"
-	mcpcontainer "github.com/memohai/memoh/internal/mcp/providers/container"
-	mcpemail "github.com/memohai/memoh/internal/mcp/providers/email"
-	mcpinbox "github.com/memohai/memoh/internal/mcp/providers/inbox"
-	mcpmemory "github.com/memohai/memoh/internal/mcp/providers/memory"
-	mcpmessage "github.com/memohai/memoh/internal/mcp/providers/message"
-	mcpschedule "github.com/memohai/memoh/internal/mcp/providers/schedule"
-	mcpskill "github.com/memohai/memoh/internal/mcp/providers/skill"
-	mcpsubagent "github.com/memohai/memoh/internal/mcp/providers/subagent"
-	mcptts "github.com/memohai/memoh/internal/mcp/providers/tts"
-	mcpweb "github.com/memohai/memoh/internal/mcp/providers/web"
-	mcpwebfetch "github.com/memohai/memoh/internal/mcp/providers/webfetch"
 	mcpfederation "github.com/memohai/memoh/internal/mcp/sources/federation"
 	"github.com/memohai/memoh/internal/media"
 	memprovider "github.com/memohai/memoh/internal/memory/adapters"
@@ -210,7 +199,8 @@ func runServe() {
 		provideChannelManager,
 		provideChannelLifecycleService,
 
-		// conversation flow
+		// agent & conversation flow
+		provideAgent,
 		provideChatResolver,
 		provideScheduleTriggerer,
 		schedule.NewService,
@@ -221,6 +211,7 @@ func runServe() {
 		provideContainerdHandler,
 		provideFederationGateway,
 		provideToolGatewayService,
+		provideToolProviders,
 
 		// http handlers (group:"server_handlers")
 		provideServerHandler(handlers.NewPingHandler),
@@ -260,6 +251,7 @@ func runServe() {
 		provideServer,
 	),
 	fx.Invoke(
+		injectToolProviders,
 		startMemoryProviderBootstrap,
 		startScheduleService,
 		startHeartbeatService,
@@ -399,8 +391,19 @@ func provideHeartbeatTriggerer(resolver *flow.Resolver) heartbeat.Triggerer {
 // conversation flow
 // ---------------------------------------------------------------------------
 
-func provideChatResolver(log *slog.Logger, cfg config.Config, modelsService *models.Service, queries *dbsqlc.Queries, chatService *conversation.Service, msgService *message.DBService, settingsService *settings.Service, mediaService *media.Service, containerdHandler *handlers.ContainerdHandler, inboxService *inbox.Service, memoryRegistry *memprovider.Registry) *flow.Resolver {
-	resolver := flow.NewResolver(log, modelsService, queries, chatService, msgService, settingsService, cfg.AgentGateway.BaseURL(), 120*time.Second)
+func provideAgent(log *slog.Logger, manager *workspace.Manager) *agentpkg.Agent {
+	return agentpkg.New(agentpkg.Deps{
+		BridgeProvider: manager,
+		Logger:         log,
+	})
+}
+
+func injectToolProviders(a *agentpkg.Agent, providers []agenttools.ToolProvider) {
+	a.SetToolProviders(providers)
+}
+
+func provideChatResolver(log *slog.Logger, a *agentpkg.Agent, modelsService *models.Service, queries *dbsqlc.Queries, chatService *conversation.Service, msgService *message.DBService, settingsService *settings.Service, mediaService *media.Service, containerdHandler *handlers.ContainerdHandler, inboxService *inbox.Service, memoryRegistry *memprovider.Registry) *flow.Resolver {
+	resolver := flow.NewResolver(log, modelsService, queries, chatService, msgService, settingsService, a, 120*time.Second)
 	resolver.SetMemoryRegistry(memoryRegistry)
 	resolver.SetSkillLoader(&skillLoaderAdapter{handler: containerdHandler})
 	resolver.SetGatewayAssetLoader(&gatewayAssetLoaderAdapter{media: mediaService})
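The provideAgent/injectToolProviders pair above builds the Agent without its tool providers and wires them afterwards from fx.Invoke, so the tool constructors never appear among the Agent constructor's parameters. A standalone sketch of that provide-then-set pattern with Uber FX, using stand-in types rather than the real ones from internal/agent:

// Illustrative only: setter injection after the FX graph is built.
package main

import (
	"context"
	"fmt"

	"go.uber.org/fx"
)

type ToolProvider interface{ Name() string }

type Agent struct{ tools []ToolProvider }

func NewAgent() *Agent { return &Agent{} }

// SetToolProviders is called from fx.Invoke, breaking any constructor-time
// dependency edge between the Agent and its tools.
func (a *Agent) SetToolProviders(ts []ToolProvider) { a.tools = ts }

type webTool struct{}

func (webTool) Name() string { return "web" }

func NewToolProviders() []ToolProvider { return []ToolProvider{webTool{}} }

func main() {
	app := fx.New(
		fx.NopLogger,
		fx.Provide(NewAgent, NewToolProviders),
		fx.Invoke(func(a *Agent, ts []ToolProvider) {
			a.SetToolProviders(ts)
			fmt.Println("tools wired:", len(a.tools))
		}),
	)
	ctx := context.Background()
	if err := app.Start(ctx); err != nil {
		panic(err)
	}
	_ = app.Stop(ctx)
}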
@@ -548,34 +551,36 @@ func provideOAuthService(log *slog.Logger, queries *dbsqlc.Queries, cfg config.C
 	return mcp.NewOAuthService(log, queries, callbackURL)
 }
 
-func provideToolGatewayService(log *slog.Logger, cfg config.Config, channelManager *channel.Manager, registry *channel.Registry, routeService *route.DBService, scheduleService *schedule.Service, _ *conversation.Service, _ *accounts.Service, settingsService *settings.Service, searchProviderService *searchproviders.Service, manager *workspace.Manager, containerdHandler *handlers.ContainerdHandler, mcpConnService *mcp.ConnectionService, mediaService *media.Service, inboxService *inbox.Service, memoryRegistry *memprovider.Registry, emailService *emailpkg.Service, emailManager *emailpkg.Manager, fedGateway *handlers.MCPFederationGateway, oauthService *mcp.OAuthService, subagentService *subagent.Service, modelsService *models.Service, browserContextService *browsercontexts.Service, queries *dbsqlc.Queries, ttsService *ttspkg.Service) *mcp.ToolGatewayService {
+func provideToolGatewayService(log *slog.Logger, fedGateway *handlers.MCPFederationGateway, oauthService *mcp.OAuthService, mcpConnService *mcp.ConnectionService, containerdHandler *handlers.ContainerdHandler) *mcp.ToolGatewayService {
 	fedGateway.SetOAuthService(oauthService)
-	var assetResolver mcpmessage.AssetResolver
+	fedSource := mcpfederation.NewSource(log, fedGateway, mcpConnService)
+	svc := mcp.NewToolGatewayService(log, []mcp.ToolSource{fedSource})
+	containerdHandler.SetToolGatewayService(svc)
+	return svc
+}
+
+func provideToolProviders(log *slog.Logger, cfg config.Config, channelManager *channel.Manager, registry *channel.Registry, routeService *route.DBService, scheduleService *schedule.Service, settingsService *settings.Service, searchProviderService *searchproviders.Service, manager *workspace.Manager, mediaService *media.Service, inboxService *inbox.Service, memoryRegistry *memprovider.Registry, emailService *emailpkg.Service, emailManager *emailpkg.Manager, fedGateway *handlers.MCPFederationGateway, mcpConnService *mcp.ConnectionService, subagentService *subagent.Service, modelsService *models.Service, browserContextService *browsercontexts.Service, queries *dbsqlc.Queries, ttsService *ttspkg.Service) []agenttools.ToolProvider {
+	var assetResolver agenttools.AssetResolver
 	if mediaService != nil {
 		assetResolver = &mediaAssetResolverAdapter{media: mediaService}
 	}
-	messageExec := mcpmessage.NewExecutor(log, channelManager, channelManager, registry, assetResolver)
-	contactsExec := mcpcontacts.NewExecutor(log, routeService)
-	scheduleExec := mcpschedule.NewExecutor(log, scheduleService)
-	memoryExec := mcpmemory.NewExecutor(log, memoryRegistry, settingsService)
-	webExec := mcpweb.NewExecutor(log, settingsService, searchProviderService)
-	inboxExec := mcpinbox.NewExecutor(log, inboxService)
-	fsExec := mcpcontainer.NewExecutor(log, manager, config.DefaultDataMount)
 	fedSource := mcpfederation.NewSource(log, fedGateway, mcpConnService)
-	emailExec := mcpemail.NewExecutor(log, emailService, emailManager)
-	webFetchExec := mcpwebfetch.NewExecutor(log)
-	subagentExec := mcpsubagent.NewExecutor(log, subagentService, settingsService, modelsService, queries, cfg.AgentGateway.BaseURL())
-	skillExec := mcpskill.NewExecutor(log)
-	browserExec := mcpbrowser.NewExecutor(log, settingsService, browserContextService, manager, cfg.BrowserGateway)
-	ttsExec := mcptts.NewExecutor(log, settingsService, ttsService, channelManager, registry)
-
-	svc := mcp.NewToolGatewayService(
-		log,
-		[]mcp.ToolExecutor{messageExec, contactsExec, scheduleExec, memoryExec, webExec, fsExec, inboxExec, emailExec, webFetchExec, subagentExec, skillExec, browserExec, ttsExec},
-		[]mcp.ToolSource{fedSource},
-	)
-	containerdHandler.SetToolGatewayService(svc)
-	return svc
+	return []agenttools.ToolProvider{
+		agenttools.NewMessageProvider(log, channelManager, channelManager, registry, assetResolver),
+		agenttools.NewContactsProvider(log, routeService),
+		agenttools.NewScheduleProvider(log, scheduleService),
+		agenttools.NewMemoryProvider(log, memoryRegistry, settingsService),
+		agenttools.NewWebProvider(log, settingsService, searchProviderService),
+		agenttools.NewContainerProvider(log, manager, config.DefaultDataMount),
+		agenttools.NewInboxProvider(log, inboxService),
+		agenttools.NewEmailProvider(log, emailService, emailManager),
+		agenttools.NewWebFetchProvider(log),
+		agenttools.NewSubagentProvider(log, subagentService, settingsService, modelsService, queries, ""),
+		agenttools.NewSkillProvider(log),
+		agenttools.NewBrowserProvider(log, settingsService, browserContextService, manager, cfg.BrowserGateway),
+		agenttools.NewTTSProvider(log, settingsService, ttsService, channelManager, registry),
+		agenttools.NewFederationProvider(log, fedSource),
+	}
 }
 
 // ---------------------------------------------------------------------------
@@ -992,15 +997,15 @@ type mediaAssetResolverAdapter struct {
 	media *media.Service
 }
 
-func (a *mediaAssetResolverAdapter) GetByStorageKey(ctx context.Context, botID, storageKey string) (mcpmessage.AssetMeta, error) {
+func (a *mediaAssetResolverAdapter) GetByStorageKey(ctx context.Context, botID, storageKey string) (agenttools.AssetMeta, error) {
 	if a == nil || a.media == nil {
-		return mcpmessage.AssetMeta{}, errors.New("media service not configured")
+		return agenttools.AssetMeta{}, errors.New("media service not configured")
 	}
 	asset, err := a.media.GetByStorageKey(ctx, botID, storageKey)
 	if err != nil {
-		return mcpmessage.AssetMeta{}, err
+		return agenttools.AssetMeta{}, err
 	}
-	return mcpmessage.AssetMeta{
+	return agenttools.AssetMeta{
 		ContentHash: asset.ContentHash,
 		Mime:        asset.Mime,
 		SizeBytes:   asset.SizeBytes,
@@ -1008,15 +1013,15 @@ func (a *mediaAssetResolverAdapter) GetByStorageKey(ctx context.Context, botID,
 	}, nil
 }
 
-func (a *mediaAssetResolverAdapter) IngestContainerFile(ctx context.Context, botID, containerPath string) (mcpmessage.AssetMeta, error) {
+func (a *mediaAssetResolverAdapter) IngestContainerFile(ctx context.Context, botID, containerPath string) (agenttools.AssetMeta, error) {
 	if a == nil || a.media == nil {
-		return mcpmessage.AssetMeta{}, errors.New("media service not configured")
+		return agenttools.AssetMeta{}, errors.New("media service not configured")
 	}
 	asset, err := a.media.IngestContainerFile(ctx, botID, containerPath)
 	if err != nil {
-		return mcpmessage.AssetMeta{}, err
+		return agenttools.AssetMeta{}, err
 	}
-	return mcpmessage.AssetMeta{
+	return agenttools.AssetMeta{
 		ContentHash: asset.ContentHash,
 		Mime:        asset.Mime,
 		SizeBytes:   asset.SizeBytes,
diff --git a/cmd/memoh/serve.go b/cmd/memoh/serve.go
index be81895d..9e65186d 100644
--- a/cmd/memoh/serve.go
+++ b/cmd/memoh/serve.go
@@ -22,12 +22,13 @@ import (
 	"github.com/memohai/memoh/internal/accounts"
 	"github.com/memohai/memoh/internal/acl"
+	agentpkg "github.com/memohai/memoh/internal/agent"
+	agenttools "github.com/memohai/memoh/internal/agent/tools"
 	"github.com/memohai/memoh/internal/auth"
 	"github.com/memohai/memoh/internal/bind"
 	"github.com/memohai/memoh/internal/boot"
 	"github.com/memohai/memoh/internal/bots"
 	"github.com/memohai/memoh/internal/browsercontexts"
-	agentruntime "github.com/memohai/memoh/internal/bun/runtime"
 	"github.com/memohai/memoh/internal/channel"
 	"github.com/memohai/memoh/internal/channel/adapters/discord"
 	"github.com/memohai/memoh/internal/channel/adapters/feishu"
@@ -58,19 +59,6 @@ import (
 	"github.com/memohai/memoh/internal/inbox"
 	"github.com/memohai/memoh/internal/logger"
 	"github.com/memohai/memoh/internal/mcp"
-	mcpbrowser "github.com/memohai/memoh/internal/mcp/providers/browser"
-	mcpcontacts "github.com/memohai/memoh/internal/mcp/providers/contacts"
-	mcpcontainer "github.com/memohai/memoh/internal/mcp/providers/container"
-	mcpemail "github.com/memohai/memoh/internal/mcp/providers/email"
-	mcpinbox "github.com/memohai/memoh/internal/mcp/providers/inbox"
-	mcpmemory "github.com/memohai/memoh/internal/mcp/providers/memory"
-	mcpmessage "github.com/memohai/memoh/internal/mcp/providers/message"
-	mcpschedule "github.com/memohai/memoh/internal/mcp/providers/schedule"
-	mcpskill "github.com/memohai/memoh/internal/mcp/providers/skill"
-	mcpsubagent "github.com/memohai/memoh/internal/mcp/providers/subagent"
-	mcptts "github.com/memohai/memoh/internal/mcp/providers/tts"
-	mcpweb "github.com/memohai/memoh/internal/mcp/providers/web"
-	mcpwebfetch "github.com/memohai/memoh/internal/mcp/providers/webfetch"
 	mcpfederation "github.com/memohai/memoh/internal/mcp/sources/federation"
 	"github.com/memohai/memoh/internal/media"
 	memprovider "github.com/memohai/memoh/internal/memory/adapters"
@@ -105,7 +93,6 @@ func runServe() {
 		provideDBConn,
 		provideDBQueries,
 		provideWorkspaceManager,
-		provideAgentRuntimeManager,
 		provideMemoryLLM,
 		memprovider.NewService,
 		provideMemoryProviderRegistry,
@@ -142,6 +129,7 @@ func runServe() {
 		provideChannelRouter,
 		provideChannelManager,
 		provideChannelLifecycleService,
+		provideAgent,
 		provideChatResolver,
 		browsercontexts.NewService,
 		provideScheduleTriggerer,
@@ -151,6 +139,7 @@ func runServe() {
 		provideContainerdHandler,
 		provideFederationGateway,
 		provideToolGatewayService,
+		provideToolProviders,
 		provideServerHandler(handlers.NewPingHandler),
 		provideServerHandler(provideMemohAuthHandler),
 		provideServerHandler(provideMemoryHandler),
@@ -189,13 +178,13 @@ func runServe() {
 		provideServer,
 	),
 	fx.Invoke(
+		injectToolProviders,
 		startMemoryProviderBootstrap,
 		startScheduleService,
 		startHeartbeatService,
 		startChannelManager,
 		startEmailManager,
 		startContainerReconciliation,
-		startAgentRuntime,
 		startTtsTempStoreCleanup,
 		startServer,
 	),
@@ -250,10 +239,6 @@ func provideWorkspaceManager(log *slog.Logger, service ctr.Service, cfg config.C
 	return workspace.NewManager(log, service, cfg.Workspace, cfg.Containerd.Namespace, conn)
 }
 
-func provideAgentRuntimeManager(log *slog.Logger, cfg config.Config) *agentruntime.Manager {
-	return agentruntime.NewManager(log, cfg)
-}
-
 func provideMemoryLLM(modelsService *models.Service, queries *dbsqlc.Queries, log *slog.Logger) memprovider.LLM {
 	return &lazyLLMClient{modelsService: modelsService, queries: queries, timeout: 30 * time.Second, logger: log}
 }
@@ -314,8 +299,19 @@ func provideHeartbeatTriggerer(resolver *flow.Resolver) heartbeat.Triggerer {
 	return flow.NewHeartbeatGateway(resolver)
 }
 
-func provideChatResolver(log *slog.Logger, cfg config.Config, modelsService *models.Service, queries *dbsqlc.Queries, chatService *conversation.Service, msgService *message.DBService, settingsService *settings.Service, mediaService *media.Service, containerdHandler *handlers.ContainerdHandler, inboxService *inbox.Service, memoryRegistry *memprovider.Registry) *flow.Resolver {
-	resolver := flow.NewResolver(log, modelsService, queries, chatService, msgService, settingsService, cfg.AgentGateway.BaseURL(), 120*time.Second)
+func provideAgent(log *slog.Logger, manager *workspace.Manager) *agentpkg.Agent {
+	return agentpkg.New(agentpkg.Deps{
+		BridgeProvider: manager,
+		Logger:         log,
+	})
+}
+
+func injectToolProviders(a *agentpkg.Agent, providers []agenttools.ToolProvider) {
+	a.SetToolProviders(providers)
+}
+
+func provideChatResolver(log *slog.Logger, a *agentpkg.Agent, modelsService *models.Service, queries *dbsqlc.Queries, chatService *conversation.Service, msgService *message.DBService, settingsService *settings.Service, mediaService *media.Service, containerdHandler *handlers.ContainerdHandler, inboxService *inbox.Service, memoryRegistry *memprovider.Registry) *flow.Resolver {
+	resolver := flow.NewResolver(log, modelsService, queries, chatService, msgService, settingsService, a, 120*time.Second)
 	resolver.SetMemoryRegistry(memoryRegistry)
 	resolver.SetSkillLoader(&skillLoaderAdapter{handler: containerdHandler})
 	resolver.SetGatewayAssetLoader(&gatewayAssetLoaderAdapter{media: mediaService})
@@ -417,29 +413,36 @@ func provideOAuthService(log *slog.Logger, queries *dbsqlc.Queries, cfg config.C
 	return mcp.NewOAuthService(log, queries, callbackURL)
 }
 
-func provideToolGatewayService(log *slog.Logger, cfg config.Config, channelManager *channel.Manager, registry *channel.Registry, routeService *route.DBService, scheduleService *schedule.Service, _ *conversation.Service, _ *accounts.Service, settingsService *settings.Service, searchProviderService *searchproviders.Service, manager *workspace.Manager, containerdHandler *handlers.ContainerdHandler, mcpConnService *mcp.ConnectionService, mediaService *media.Service, inboxService *inbox.Service, memoryRegistry *memprovider.Registry, emailService *emailpkg.Service, emailManager *emailpkg.Manager, fedGateway *handlers.MCPFederationGateway, oauthService *mcp.OAuthService, subagentService *subagent.Service, modelsService *models.Service, browserContextService *browsercontexts.Service, queries *dbsqlc.Queries, ttsService *ttspkg.Service) *mcp.ToolGatewayService {
+func provideToolGatewayService(log *slog.Logger, fedGateway *handlers.MCPFederationGateway, oauthService *mcp.OAuthService, mcpConnService *mcp.ConnectionService, containerdHandler *handlers.ContainerdHandler) *mcp.ToolGatewayService {
 	fedGateway.SetOAuthService(oauthService)
-	var assetResolver mcpmessage.AssetResolver
+	fedSource := mcpfederation.NewSource(log, fedGateway, mcpConnService)
+	svc := mcp.NewToolGatewayService(log, []mcp.ToolSource{fedSource})
+	containerdHandler.SetToolGatewayService(svc)
+	return svc
+}
+
+func provideToolProviders(log *slog.Logger, cfg config.Config, channelManager *channel.Manager, registry *channel.Registry, routeService *route.DBService, scheduleService *schedule.Service, settingsService *settings.Service, searchProviderService *searchproviders.Service, manager *workspace.Manager, mediaService *media.Service, inboxService *inbox.Service, memoryRegistry *memprovider.Registry, emailService *emailpkg.Service, emailManager *emailpkg.Manager, fedGateway *handlers.MCPFederationGateway, mcpConnService *mcp.ConnectionService, subagentService *subagent.Service, modelsService *models.Service, browserContextService *browsercontexts.Service, queries *dbsqlc.Queries, ttsService *ttspkg.Service) []agenttools.ToolProvider {
+	var assetResolver agenttools.AssetResolver
 	if mediaService != nil {
 		assetResolver = &mediaAssetResolverAdapter{media: mediaService}
 	}
-	messageExec := mcpmessage.NewExecutor(log, channelManager, channelManager, registry, assetResolver)
-	contactsExec := mcpcontacts.NewExecutor(log, routeService)
-	scheduleExec := mcpschedule.NewExecutor(log, scheduleService)
-	memoryExec := mcpmemory.NewExecutor(log, memoryRegistry, settingsService)
-	webExec := mcpweb.NewExecutor(log, settingsService, searchProviderService)
-	inboxExec := mcpinbox.NewExecutor(log, inboxService)
-	fsExec := mcpcontainer.NewExecutor(log, manager, config.DefaultDataMount)
 	fedSource := mcpfederation.NewSource(log, fedGateway, mcpConnService)
-	emailExec := mcpemail.NewExecutor(log, emailService, emailManager)
-	webFetchExec := mcpwebfetch.NewExecutor(log)
-	subagentExec := mcpsubagent.NewExecutor(log, subagentService, settingsService, modelsService, queries, cfg.AgentGateway.BaseURL())
-	skillExec := mcpskill.NewExecutor(log)
-	browserExec := mcpbrowser.NewExecutor(log, settingsService, browserContextService, manager, cfg.BrowserGateway)
-	ttsExec := mcptts.NewExecutor(log, settingsService, ttsService, channelManager, registry)
-	svc := mcp.NewToolGatewayService(log, []mcp.ToolExecutor{messageExec, contactsExec, scheduleExec, memoryExec, webExec, fsExec, inboxExec, emailExec, webFetchExec, subagentExec, skillExec, browserExec, ttsExec}, []mcp.ToolSource{fedSource})
-	containerdHandler.SetToolGatewayService(svc)
-	return svc
+	return []agenttools.ToolProvider{
+		agenttools.NewMessageProvider(log, channelManager, channelManager, registry, assetResolver),
+		agenttools.NewContactsProvider(log, routeService),
+		agenttools.NewScheduleProvider(log, scheduleService),
+		agenttools.NewMemoryProvider(log, memoryRegistry, settingsService),
+		agenttools.NewWebProvider(log, settingsService, searchProviderService),
+		agenttools.NewContainerProvider(log, manager, config.DefaultDataMount),
+		agenttools.NewInboxProvider(log, inboxService),
+		agenttools.NewEmailProvider(log, emailService, emailManager),
+		agenttools.NewWebFetchProvider(log),
+		agenttools.NewSubagentProvider(log, subagentService, 
settingsService, modelsService, queries, ""), + agenttools.NewSkillProvider(log), + agenttools.NewBrowserProvider(log, settingsService, browserContextService, manager, cfg.BrowserGateway), + agenttools.NewTTSProvider(log, settingsService, ttsService, channelManager, registry), + agenttools.NewFederationProvider(log, fedSource), + } } func provideMemoryHandler(log *slog.Logger, botService *bots.Service, accountService *accounts.Service, _ config.Config, manager *workspace.Manager, memoryRegistry *memprovider.Registry, settingsService *settings.Service, _ *handlers.ContainerdHandler) *handlers.MemoryHandler { @@ -624,13 +627,6 @@ func startContainerReconciliation(lc fx.Lifecycle, manager *workspace.Manager, _ lc.Append(fx.Hook{OnStart: func(ctx context.Context) error { go manager.ReconcileContainers(ctx); return nil }}) } -func startAgentRuntime(lc fx.Lifecycle, manager *agentruntime.Manager) { - lc.Append(fx.Hook{ - OnStart: func(ctx context.Context) error { return manager.Start(ctx) }, - OnStop: func(ctx context.Context) error { return manager.Stop(ctx) }, - }) -} - func startServer(lc fx.Lifecycle, logger *slog.Logger, srv *memohServer, shutdowner fx.Shutdowner, cfg config.Config, queries *dbsqlc.Queries, botService *bots.Service, _ *handlers.ContainerdHandler, manager *workspace.Manager, mcpConnService *mcp.ConnectionService, toolGateway *mcp.ToolGatewayService, channelManager *channel.Manager, modelsService *models.Service) { fmt.Printf("Starting Memoh Agent %s\n", version.GetInfo()) lc.Append(fx.Hook{ @@ -920,26 +916,26 @@ func (a *skillLoaderAdapter) LoadSkills(ctx context.Context, botID string) ([]fl type mediaAssetResolverAdapter struct{ media *media.Service } -func (a *mediaAssetResolverAdapter) GetByStorageKey(ctx context.Context, botID, storageKey string) (mcpmessage.AssetMeta, error) { +func (a *mediaAssetResolverAdapter) GetByStorageKey(ctx context.Context, botID, storageKey string) (agenttools.AssetMeta, error) { if a == nil || a.media == nil { - return mcpmessage.AssetMeta{}, errors.New("media service not configured") + return agenttools.AssetMeta{}, errors.New("media service not configured") } asset, err := a.media.GetByStorageKey(ctx, botID, storageKey) if err != nil { - return mcpmessage.AssetMeta{}, err + return agenttools.AssetMeta{}, err } - return mcpmessage.AssetMeta{ContentHash: asset.ContentHash, Mime: asset.Mime, SizeBytes: asset.SizeBytes, StorageKey: asset.StorageKey}, nil + return agenttools.AssetMeta{ContentHash: asset.ContentHash, Mime: asset.Mime, SizeBytes: asset.SizeBytes, StorageKey: asset.StorageKey}, nil } -func (a *mediaAssetResolverAdapter) IngestContainerFile(ctx context.Context, botID, containerPath string) (mcpmessage.AssetMeta, error) { +func (a *mediaAssetResolverAdapter) IngestContainerFile(ctx context.Context, botID, containerPath string) (agenttools.AssetMeta, error) { if a == nil || a.media == nil { - return mcpmessage.AssetMeta{}, errors.New("media service not configured") + return agenttools.AssetMeta{}, errors.New("media service not configured") } asset, err := a.media.IngestContainerFile(ctx, botID, containerPath) if err != nil { - return mcpmessage.AssetMeta{}, err + return agenttools.AssetMeta{}, err } - return mcpmessage.AssetMeta{ContentHash: asset.ContentHash, Mime: asset.Mime, SizeBytes: asset.SizeBytes, StorageKey: asset.StorageKey}, nil + return agenttools.AssetMeta{ContentHash: asset.ContentHash, Mime: asset.Mime, SizeBytes: asset.SizeBytes, StorageKey: asset.StorageKey}, nil } type gatewayAssetLoaderAdapter struct{ media 
*media.Service } diff --git a/conf/app.apple.toml b/conf/app.apple.toml index 1c561b54..8564b633 100644 --- a/conf/app.apple.toml +++ b/conf/app.apple.toml @@ -46,11 +46,6 @@ timeout_seconds = 10 [sparse] base_url = "http://127.0.0.1:8085" -[agent_gateway] -host = "127.0.0.1" -port = 8081 -server_addr = "127.0.0.1:8080" - [browser_gateway] host = "127.0.0.1" port = 8083 diff --git a/conf/app.docker.toml b/conf/app.docker.toml index 79571e07..a1663587 100644 --- a/conf/app.docker.toml +++ b/conf/app.docker.toml @@ -47,12 +47,6 @@ timeout_seconds = 10 [sparse] base_url = "http://sparse:8085" -## Agent Gateway -[agent_gateway] -host = "agent" -port = 8081 -server_addr = "server:8080" - ## Browser Gateway [browser_gateway] host = "browser" diff --git a/conf/app.example.toml b/conf/app.example.toml index bedabc14..79089804 100644 --- a/conf/app.example.toml +++ b/conf/app.example.toml @@ -46,11 +46,6 @@ timeout_seconds = 10 [sparse] base_url = "http://127.0.0.1:8085" -[agent_gateway] -host = "127.0.0.1" -port = 8081 -server_addr = ":8080" - [browser_gateway] host = "127.0.0.1" port = 8083 diff --git a/conf/app.windows.toml b/conf/app.windows.toml index dba41a4d..8a228fde 100644 --- a/conf/app.windows.toml +++ b/conf/app.windows.toml @@ -45,11 +45,6 @@ timeout_seconds = 10 [sparse] base_url = "http://127.0.0.1:8085" -[agent_gateway] -host = "127.0.0.1" -port = 8081 -server_addr = ":8080" - [browser_gateway] host = "127.0.0.1" port = 8083 diff --git a/db/migrations/0001_init.up.sql b/db/migrations/0001_init.up.sql index b738f4fd..4329012f 100644 --- a/db/migrations/0001_init.up.sql +++ b/db/migrations/0001_init.up.sql @@ -492,6 +492,8 @@ CREATE TABLE IF NOT EXISTS bot_history_message_assets ( role TEXT NOT NULL DEFAULT 'attachment', ordinal INTEGER NOT NULL DEFAULT 0, content_hash TEXT NOT NULL, + name TEXT NOT NULL DEFAULT '', + metadata JSONB NOT NULL DEFAULT '{}'::jsonb, created_at TIMESTAMPTZ NOT NULL DEFAULT now(), CONSTRAINT message_asset_content_unique UNIQUE (message_id, content_hash) ); diff --git a/db/migrations/0034_asset_name.down.sql b/db/migrations/0034_asset_name.down.sql new file mode 100644 index 00000000..22bb938e --- /dev/null +++ b/db/migrations/0034_asset_name.down.sql @@ -0,0 +1,5 @@ +-- 0034_asset_name (rollback) +-- Remove name column from bot_history_message_assets. + +ALTER TABLE bot_history_message_assets + DROP COLUMN IF EXISTS name; diff --git a/db/migrations/0034_asset_name.up.sql b/db/migrations/0034_asset_name.up.sql new file mode 100644 index 00000000..abe536d9 --- /dev/null +++ b/db/migrations/0034_asset_name.up.sql @@ -0,0 +1,5 @@ +-- 0034_asset_name +-- Add name column to bot_history_message_assets to preserve original filenames. + +ALTER TABLE bot_history_message_assets + ADD COLUMN IF NOT EXISTS name TEXT NOT NULL DEFAULT ''; diff --git a/db/migrations/0035_asset_metadata.down.sql b/db/migrations/0035_asset_metadata.down.sql new file mode 100644 index 00000000..6b568ddd --- /dev/null +++ b/db/migrations/0035_asset_metadata.down.sql @@ -0,0 +1,5 @@ +-- 0035_asset_metadata (rollback) +-- Remove metadata column from bot_history_message_assets. + +ALTER TABLE bot_history_message_assets + DROP COLUMN IF EXISTS metadata; diff --git a/db/migrations/0035_asset_metadata.up.sql b/db/migrations/0035_asset_metadata.up.sql new file mode 100644 index 00000000..756cde7f --- /dev/null +++ b/db/migrations/0035_asset_metadata.up.sql @@ -0,0 +1,5 @@ +-- 0035_asset_metadata +-- Add metadata JSONB column to bot_history_message_assets for source_path, source_url, etc. 
+ +ALTER TABLE bot_history_message_assets + ADD COLUMN IF NOT EXISTS metadata JSONB NOT NULL DEFAULT '{}'::jsonb; diff --git a/db/queries/media.sql b/db/queries/media.sql index 8f0dd74c..1c0395a8 100644 --- a/db/queries/media.sql +++ b/db/queries/media.sql @@ -25,26 +25,30 @@ RETURNING *; SELECT * FROM bot_storage_bindings WHERE bot_id = sqlc.arg(bot_id); -- name: CreateMessageAsset :one -INSERT INTO bot_history_message_assets (message_id, role, ordinal, content_hash) +INSERT INTO bot_history_message_assets (message_id, role, ordinal, content_hash, name, metadata) VALUES ( sqlc.arg(message_id), sqlc.arg(role), sqlc.arg(ordinal), - sqlc.arg(content_hash) + sqlc.arg(content_hash), + sqlc.arg(name), + sqlc.arg(metadata) ) ON CONFLICT (message_id, content_hash) DO UPDATE SET role = EXCLUDED.role, - ordinal = EXCLUDED.ordinal + ordinal = EXCLUDED.ordinal, + name = EXCLUDED.name, + metadata = EXCLUDED.metadata RETURNING *; -- name: ListMessageAssets :many -SELECT id AS rel_id, message_id, role, ordinal, content_hash +SELECT id AS rel_id, message_id, role, ordinal, content_hash, name, metadata FROM bot_history_message_assets WHERE message_id = sqlc.arg(message_id) ORDER BY ordinal ASC; -- name: ListMessageAssetsBatch :many -SELECT id AS rel_id, message_id, role, ordinal, content_hash +SELECT id AS rel_id, message_id, role, ordinal, content_hash, name, metadata FROM bot_history_message_assets WHERE message_id = ANY(sqlc.arg(message_ids)::uuid[]) ORDER BY message_id, ordinal ASC; diff --git a/devenv/app.dev.toml b/devenv/app.dev.toml index 6e57f2d4..11960d24 100644 --- a/devenv/app.dev.toml +++ b/devenv/app.dev.toml @@ -46,11 +46,6 @@ timeout_seconds = 10 [sparse] base_url = "http://sparse:8085" -[agent_gateway] -host = "agent" -port = 8081 -server_addr = "server:8080" - [browser_gateway] host = "browser" port = 8083 diff --git a/devenv/docker-compose.yml b/devenv/docker-compose.yml index e8571b98..4a733467 100644 --- a/devenv/docker-compose.yml +++ b/devenv/docker-compose.yml @@ -107,25 +107,6 @@ services: condition: service_healthy restart: unless-stopped - agent: - image: oven/bun:1-alpine - container_name: memoh-dev-agent - working_dir: /workspace/apps/agent - command: ["bun", "run", "--watch", "src/index.ts"] - environment: - MEMOH_CONFIG_PATH: /workspace/devenv/app.dev.toml - volumes: - - ..:/workspace - - node_modules:/workspace/node_modules - ports: - - "${MEMOH_DEV_AGENT_PORT:-18081}:8081" - depends_on: - deps: - condition: service_completed_successfully - server: - condition: service_healthy - restart: unless-stopped - web: build: context: .. 
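A quick note on the query change above: because of the `ON CONFLICT (message_id, content_hash)` clause, re-linking the same asset to the same message is idempotent and simply refreshes `role`, `ordinal`, `name`, and `metadata`. A minimal sketch of exercising that upsert directly (illustrative only: the package name, helper name, and driver choice are assumptions; the real code path goes through the sqlc-generated `CreateMessageAsset`):

```go
package assetexample // hypothetical package, for illustration only

import (
	"context"
	"database/sql"

	_ "github.com/jackc/pgx/v5/stdlib" // assumed Postgres driver
)

// linkAsset shows the upsert semantics of CreateMessageAsset: a second call
// with the same (message_id, content_hash) updates name/metadata in place
// instead of inserting a duplicate row.
func linkAsset(ctx context.Context, db *sql.DB, messageID, contentHash, name string, metadata []byte) error {
	const q = `
INSERT INTO bot_history_message_assets (message_id, role, ordinal, content_hash, name, metadata)
VALUES ($1, 'attachment', 0, $2, $3, $4)
ON CONFLICT (message_id, content_hash) DO UPDATE SET
    role     = EXCLUDED.role,
    ordinal  = EXCLUDED.ordinal,
    name     = EXCLUDED.name,
    metadata = EXCLUDED.metadata`
	_, err := db.ExecContext(ctx, q, messageID, contentHash, name, metadata)
	return err
}
```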
diff --git a/docker-compose.yml b/docker-compose.yml index 3b86c43a..1820e3da 100644 --- a/docker-compose.yml +++ b/docker-compose.yml @@ -54,20 +54,6 @@ services: networks: - memoh-network - agent: - image: memohai/agent:latest - container_name: memoh-agent - volumes: - - ${MEMOH_CONFIG:-./config.toml}:/config.toml:ro - - /etc/localtime:/etc/localtime:ro - ports: - - "8081:8081" - depends_on: - - server - restart: unless-stopped - networks: - - memoh-network - web: image: memohai/web:latest container_name: memoh-web @@ -75,7 +61,6 @@ services: - "8082:8082" depends_on: - server - - agent restart: unless-stopped networks: - memoh-network diff --git a/docker/Dockerfile.agent b/docker/Dockerfile.agent deleted file mode 100644 index cf1fd9cd..00000000 --- a/docker/Dockerfile.agent +++ /dev/null @@ -1,41 +0,0 @@ -# syntax=docker/dockerfile:1 -FROM --platform=$BUILDPLATFORM oven/bun:1 AS builder - -WORKDIR /build - -# Set up workspace structure so bun can resolve workspace deps (@memoh/config, @memoh/agent) -COPY apps/agent/package.json apps/agent/bun.lock* ./apps/agent/ -COPY packages/config/package.json ./packages/config/package.json -COPY packages/agent/package.json ./packages/agent/package.json - -# Create root package.json with workspace config -RUN echo '{"name":"@memoh/monorepo","private":true,"workspaces":["apps/*","packages/*"]}' > package.json - -RUN cd apps/agent && bun install - -# Copy source files -COPY packages/config/ ./packages/config/ -COPY packages/agent/ ./packages/agent/ -COPY apps/agent/ ./apps/agent/ - -RUN cd apps/agent && bun run build - -FROM oven/bun:1-alpine - -WORKDIR /app - -RUN apk add --no-cache ca-certificates wget - -COPY --from=builder /build/apps/agent/dist /app/dist -COPY --from=builder /build/apps/agent/node_modules /app/node_modules -COPY --from=builder /build/apps/agent/package.json /app/package.json -COPY --from=builder /build/node_modules /node_modules - -EXPOSE 8081 - -HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \ - CMD wget --no-verbose --tries=1 --spider http://127.0.0.1:8081/health \ - || wget --no-verbose --tries=1 --spider http://agent:8081/health \ - || exit 1 - -CMD ["bun", "run", "dist/index.js"] diff --git a/docker/Dockerfile.web b/docker/Dockerfile.web index 02466ed8..3aadabc3 100644 --- a/docker/Dockerfile.web +++ b/docker/Dockerfile.web @@ -14,10 +14,8 @@ RUN --mount=type=cache,target=/root/.local/share/pnpm/store \ pnpm install ARG VITE_API_URL=/api -ARG VITE_AGENT_URL=/agent ENV VITE_API_URL=$VITE_API_URL -ENV VITE_AGENT_URL=$VITE_AGENT_URL WORKDIR /build/apps/web RUN pnpm build diff --git a/docker/docker-compose.cn.yml b/docker/docker-compose.cn.yml index f526831e..18bc372c 100644 --- a/docker/docker-compose.cn.yml +++ b/docker/docker-compose.cn.yml @@ -7,8 +7,6 @@ services: image: memoh.cn/memohai/server:latest server: image: memoh.cn/memohai/server:latest - agent: - image: memoh.cn/memohai/agent:latest web: image: memoh.cn/memohai/web:latest browser: diff --git a/docker/docker-compose.yml b/docker/docker-compose.yml index 63442018..eddb6107 100644 --- a/docker/docker-compose.yml +++ b/docker/docker-compose.yml @@ -18,18 +18,12 @@ services: - COMMIT_HASH=${MEMOH_COMMIT:-unknown} - BUILD_TIME=${MEMOH_BUILD_TIME:-unknown} - agent: - build: - context: . - dockerfile: docker/Dockerfile.agent - web: build: context: . 
dockerfile: docker/Dockerfile.web args: - VITE_API_URL=${VITE_API_URL:-/api} - - VITE_AGENT_URL=${VITE_AGENT_URL:-/agent} sparse: build: diff --git a/docker/nginx.conf b/docker/nginx.conf index ffa92b1c..c689df44 100644 --- a/docker/nginx.conf +++ b/docker/nginx.conf @@ -48,11 +48,6 @@ server { proxy_pass http://memoh-server:8080/; } - # Agent Gateway 代理 - location /agent/ { - proxy_pass http://memoh-agent:8081/; - } - # 静态资源缓存 location ~* \.(jpg|jpeg|png|gif|ico|css|js|svg|woff|woff2|ttf|eot)$ { expires 1y; diff --git a/go.mod b/go.mod index 6d7db244..76270637 100644 --- a/go.mod +++ b/go.mod @@ -1,6 +1,6 @@ module github.com/memohai/memoh -go 1.25.2 +go 1.25.7 require ( github.com/BurntSushi/toml v1.6.0 @@ -26,7 +26,8 @@ require ( github.com/larksuite/oapi-sdk-go/v3 v3.5.3 github.com/mailgun/mailgun-go/v5 v5.14.0 github.com/memohai/acgo v0.0.0-20260221232113-babac0d6acd7 - github.com/modelcontextprotocol/go-sdk v1.4.0 + github.com/memohai/twilight-ai v0.3.1-0.20260318172714-2c786325e3af + github.com/modelcontextprotocol/go-sdk v1.4.1 github.com/opencontainers/image-spec v1.1.1 github.com/opencontainers/runtime-spec v1.3.0 github.com/qdrant/go-client v1.17.1 @@ -116,7 +117,7 @@ require ( github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 // indirect github.com/sasha-s/go-deadlock v0.3.6 // indirect github.com/segmentio/asm v1.1.3 // indirect - github.com/segmentio/encoding v0.5.3 // indirect + github.com/segmentio/encoding v0.5.4 // indirect github.com/sirupsen/logrus v1.9.4 // indirect github.com/spf13/pflag v1.0.9 // indirect github.com/valyala/bytebufferpool v1.0.0 // indirect diff --git a/go.sum b/go.sum index 534549c5..907b1130 100644 --- a/go.sum +++ b/go.sum @@ -228,6 +228,8 @@ github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D github.com/mattn/go-runewidth v0.0.10/go.mod h1:RAqKPSqVFrSLVXbA8x7dzmKdmGzieGRCM46jaSJTDAk= github.com/memohai/acgo v0.0.0-20260221232113-babac0d6acd7 h1:beehwOQperqGWj4m4EhcPhnSZKtDiuHK/7ZMoTPaQjw= github.com/memohai/acgo v0.0.0-20260221232113-babac0d6acd7/go.mod h1:OvmxM7JmnXBmwJWWVqtreL3HSHSKuzPbtbhlg5MvBg0= +github.com/memohai/twilight-ai v0.3.1-0.20260318172714-2c786325e3af h1:Yvp4DU+V0PHtUWax/IL+eKjcyKE3Stl0hUAsWHkShE0= +github.com/memohai/twilight-ai v0.3.1-0.20260318172714-2c786325e3af/go.mod h1:vHNoRb6/quMacMAgIp838aoiNhsZbE0bFCnRRNyRwNc= github.com/moby/docker-image-spec v1.3.1 h1:jMKff3w6PgbfSa69GfNg+zN/XLhfXJGnEx3Nl2EsFP0= github.com/moby/docker-image-spec v1.3.1/go.mod h1:eKmb5VW8vQEh/BAr2yvVNvuiJuY6UIocYsFu/DxxRpo= github.com/moby/locker v1.0.1 h1:fOXqR41zeveg4fFODix+1Ch4mj/gT0NE1XJbp/epuBg= @@ -244,8 +246,8 @@ github.com/moby/sys/userns v0.1.0 h1:tVLXkFOxVu9A64/yh59slHVv9ahO9UIev4JZusOLG/g github.com/moby/sys/userns v0.1.0/go.mod h1:IHUYgu/kao6N8YZlp9Cf444ySSvCmDlmzUcYfDHOl28= github.com/moby/term v0.5.2 h1:6qk3FJAFDs6i/q3W/pQ97SX192qKfZgGjCQqfCJkgzQ= github.com/moby/term v0.5.2/go.mod h1:d3djjFCrjnB+fl8NJux+EJzu0msscUP+f8it8hPkFLc= -github.com/modelcontextprotocol/go-sdk v1.4.0 h1:u0kr8lbJc1oBcawK7Df+/ajNMpIDFE41OEPxdeTLOn8= -github.com/modelcontextprotocol/go-sdk v1.4.0/go.mod h1:Nxc2n+n/GdCebUaqCOhTetptS17SXXNu9IfNTaLDi1E= +github.com/modelcontextprotocol/go-sdk v1.4.1 h1:M4x9GyIPj+HoIlHNGpK2hq5o3BFhC+78PkEaldQRphc= +github.com/modelcontextprotocol/go-sdk v1.4.1/go.mod h1:Bo/mS87hPQqHSRkMv4dQq1XCu6zv4INdXnFZabkNU6s= github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q= github.com/modern-go/concurrent 
v0.0.0-20180306012644-bacd9c7ef1dd h1:TRLaZ9cD/w8PVh93nsPXa1VrQ6jlwL5oN8l14QlcNfg= github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q= @@ -295,8 +297,8 @@ github.com/sebdah/goldie/v2 v2.8.0 h1:dZb9wR8q5++oplmEiJT+U/5KyotVD+HNGCAc5gNr8r github.com/sebdah/goldie/v2 v2.8.0/go.mod h1:oZ9fp0+se1eapSRjfYbsV/0Hqhbuu3bJVvKI/NNtssI= github.com/segmentio/asm v1.1.3 h1:WM03sfUOENvvKexOLp+pCqgb/WDjsi7EK8gIsICtzhc= github.com/segmentio/asm v1.1.3/go.mod h1:Ld3L4ZXGNcSLRg4JBsZ3//1+f/TjYl0Mzen/DQy1EJg= -github.com/segmentio/encoding v0.5.3 h1:OjMgICtcSFuNvQCdwqMCv9Tg7lEOXGwm1J5RPQccx6w= -github.com/segmentio/encoding v0.5.3/go.mod h1:HS1ZKa3kSN32ZHVZ7ZLPLXWvOVIiZtyJnO1gPH1sKt0= +github.com/segmentio/encoding v0.5.4 h1:OW1VRern8Nw6ITAtwSZ7Idrl3MXCFwXHPgqESYfvNt0= +github.com/segmentio/encoding v0.5.4/go.mod h1:HS1ZKa3kSN32ZHVZ7ZLPLXWvOVIiZtyJnO1gPH1sKt0= github.com/sergi/go-diff v1.4.0 h1:n/SP9D5ad1fORl+llWyN+D6qoUETXNZARKjyY2/KVCw= github.com/sergi/go-diff v1.4.0/go.mod h1:A0bzQcvG0E7Rwjx0REVgAGH58e96+X0MeOfepqsbeW4= github.com/sirupsen/logrus v1.9.4 h1:TsZE7l11zFCLZnZ+teH4Umoq5BhEIfIzfRDZ1Uzql2w= diff --git a/internal/agent/agent.go b/internal/agent/agent.go new file mode 100644 index 00000000..0acdabb6 --- /dev/null +++ b/internal/agent/agent.go @@ -0,0 +1,524 @@ +package agent + +import ( + "context" + "encoding/json" + "errors" + "fmt" + "log/slog" + "strings" + + sdk "github.com/memohai/twilight-ai/sdk" + + "github.com/memohai/memoh/internal/agent/tools" + "github.com/memohai/memoh/internal/workspace/bridge" +) + +// Agent is the core agent that handles LLM interactions. +type Agent struct { + client *sdk.Client + toolProviders []tools.ToolProvider + bridgeProvider bridge.Provider + logger *slog.Logger +} + +// New creates a new Agent with the given dependencies. +func New(deps Deps) *Agent { + logger := deps.Logger + if logger == nil { + logger = slog.Default() + } + return &Agent{ + client: sdk.NewClient(), + bridgeProvider: deps.BridgeProvider, + logger: logger.With(slog.String("service", "agent")), + } +} + +// SetToolProviders sets the tool providers after construction. +// This allows breaking dependency cycles in the DI graph. +func (a *Agent) SetToolProviders(providers []tools.ToolProvider) { + a.toolProviders = providers +} + +// Stream runs the agent in streaming mode, emitting events to the returned channel. +func (a *Agent) Stream(ctx context.Context, cfg RunConfig) <-chan StreamEvent { + ch := make(chan StreamEvent) + go func() { + defer close(ch) + a.runStream(ctx, cfg, ch) + }() + return ch +} + +// Generate runs the agent in non-streaming mode, returning the complete result. 
+func (a *Agent) Generate(ctx context.Context, cfg RunConfig) (*GenerateResult, error) {
+	return a.runGenerate(ctx, cfg)
+}
+
+func (a *Agent) runStream(ctx context.Context, cfg RunConfig, ch chan<- StreamEvent) {
+	tools, err := a.assembleTools(ctx, cfg)
+	if err != nil {
+		ch <- StreamEvent{Type: EventError, Error: fmt.Sprintf("assemble tools: %v", err)}
+		return
+	}
+
+	// The slice must be sized (not zero-length with capacity), otherwise
+	// copy() copies nothing and the seed skills are lost.
+	enabledSkills := make([]string, len(cfg.EnabledSkillNames))
+	copy(enabledSkills, cfg.EnabledSkillNames)
+	enableSkill := func(name string) {
+		for _, s := range cfg.Skills {
+			if s.Name == name {
+				for _, existing := range enabledSkills {
+					if existing == name {
+						return
+					}
+				}
+				enabledSkills = append(enabledSkills, name)
+				return
+			}
+		}
+	}
+
+	// Loop detection setup
+	var textLoopGuard *TextLoopGuard
+	var textLoopProbeBuffer *TextLoopProbeBuffer
+	var toolLoopGuard *ToolLoopGuard
+	toolLoopAbortCallIDs := make(map[string]struct{})
+	if cfg.LoopDetection.Enabled {
+		textLoopGuard = NewTextLoopGuard(LoopDetectedStreakThreshold, LoopDetectedMinNewGramsPerChunk, SentialOptions{})
+		textLoopProbeBuffer = NewTextLoopProbeBuffer(LoopDetectedProbeChars, func(text string) {
+			result := textLoopGuard.Inspect(text)
+			if result.Abort {
+				a.logger.Warn("text loop detected, will abort")
+			}
+		})
+		toolLoopGuard = NewToolLoopGuard(ToolLoopRepeatThreshold, ToolLoopWarningsBeforeAbort)
+	}
+
+	// Wrap tools with loop detection
+	if toolLoopGuard != nil {
+		tools = wrapToolsWithLoopGuard(tools, toolLoopGuard, toolLoopAbortCallIDs)
+	}
+
+	tagResolvers := DefaultTagResolvers()
+	tagExtractor := NewStreamTagExtractor(tagResolvers)
+
+	opts := a.buildGenerateOptions(cfg, tools)
+
+	streamResult, err := a.client.StreamText(ctx, opts...)
+	if err != nil {
+		ch <- StreamEvent{Type: EventError, Error: fmt.Sprintf("stream start: %v", err)}
+		return
+	}
+
+	ch <- StreamEvent{Type: EventAgentStart}
+
+	var allText strings.Builder
+	aborted := false
+
+	for part := range streamResult.Stream {
+		if ctx.Err() != nil {
+			aborted = true
+			break
+		}
+
+		switch p := part.(type) {
+		case *sdk.StartPart:
+			_ = p // stream start already emitted
+
+		case *sdk.TextStartPart:
+			ch <- StreamEvent{Type: EventTextStart}
+
+		case *sdk.TextDeltaPart:
+			result := tagExtractor.Push(p.Text)
+			if result.VisibleText != "" {
+				if textLoopProbeBuffer != nil {
+					textLoopProbeBuffer.Push(result.VisibleText)
+				}
+				ch <- StreamEvent{Type: EventTextDelta, Delta: result.VisibleText}
+				allText.WriteString(result.VisibleText)
+			}
+			emitTagEvents(ch, result.Events)
+
+		case *sdk.TextEndPart:
+			remainder := tagExtractor.FlushRemainder()
+			if remainder.VisibleText != "" {
+				if textLoopProbeBuffer != nil {
+					textLoopProbeBuffer.Push(remainder.VisibleText)
+				}
+				ch <- StreamEvent{Type: EventTextDelta, Delta: remainder.VisibleText}
+				allText.WriteString(remainder.VisibleText)
+			}
+			if textLoopProbeBuffer != nil {
+				textLoopProbeBuffer.Flush()
+			}
+			emitTagEvents(ch, remainder.Events)
+			ch <- StreamEvent{Type: EventTextEnd}
+
+		case *sdk.ReasoningStartPart:
+			ch <- StreamEvent{Type: EventReasoningStart}
+
+		case *sdk.ReasoningDeltaPart:
+			ch <- StreamEvent{Type: EventReasoningDelta, Delta: p.Text}
+
+		case *sdk.ReasoningEndPart:
+			ch <- StreamEvent{Type: EventReasoningEnd}
+
+		case *sdk.StreamToolCallPart:
+			remainder := tagExtractor.FlushRemainder()
+			if remainder.VisibleText != "" {
+				if textLoopProbeBuffer != nil {
+					textLoopProbeBuffer.Push(remainder.VisibleText)
+				}
+				ch <- StreamEvent{Type: EventTextDelta, Delta: remainder.VisibleText}
+				allText.WriteString(remainder.VisibleText)
+			}
+			if textLoopProbeBuffer != nil {
+				textLoopProbeBuffer.Flush()
+			}
+			emitTagEvents(ch, remainder.Events)
+			ch <- StreamEvent{
+				Type:       EventToolCallStart,
+				ToolName:   p.ToolName,
+				ToolCallID: p.ToolCallID,
+				Input:      p.Input,
+			}
+
+		case *sdk.StreamToolResultPart:
+			shouldAbort := false
+			if _, ok := toolLoopAbortCallIDs[p.ToolCallID]; ok {
+				delete(toolLoopAbortCallIDs, p.ToolCallID)
+				shouldAbort = true
+			}
+			ch <- StreamEvent{
+				Type:       EventToolCallEnd,
+				ToolName:   p.ToolName,
+				ToolCallID: p.ToolCallID,
+				Input:      p.Input,
+				Result:     p.Output,
+			}
+			if p.ToolName == "use_skill" {
+				if resultMap, ok := p.Output.(map[string]any); ok {
+					if skillName, ok := resultMap["skillName"].(string); ok && skillName != "" {
+						enableSkill(skillName)
+					}
+				}
+			}
+			if shouldAbort {
+				a.logger.Warn("tool loop abort triggered", slog.String("tool_call_id", p.ToolCallID))
+				aborted = true
+			}
+
+		case *sdk.StreamToolErrorPart:
+			ch <- StreamEvent{
+				Type:       EventToolCallEnd,
+				ToolName:   p.ToolName,
+				ToolCallID: p.ToolCallID,
+				Error:      p.Error.Error(),
+			}
+
+		case *sdk.StreamFilePart:
+			mediaType := p.File.MediaType
+			if mediaType == "" {
+				mediaType = "image/png"
+			}
+			ch <- StreamEvent{
+				Type: EventAttachment,
+				Attachments: []FileAttachment{{
+					Type: "image",
+					URL:  fmt.Sprintf("data:%s;base64,%s", mediaType, p.File.Data),
+					Mime: mediaType,
+				}},
+			}
+
+		case *sdk.ErrorPart:
+			ch <- StreamEvent{Type: EventError, Error: p.Error.Error()}
+			aborted = true
+
+		case *sdk.AbortPart:
+			aborted = true
+
+		case *sdk.FinishPart:
+			// handled after loop
+		}
+
+		if aborted {
+			break
+		}
+	}
+
+	if textLoopProbeBuffer != nil {
+		textLoopProbeBuffer.Flush()
+	}
+
+	finalMessages := StripTagsFromMessages(streamResult.Messages)
+
+	var totalUsage sdk.Usage
+	perStepUsages := make([]json.RawMessage, 0, len(streamResult.Steps))
+	for _, step := range streamResult.Steps {
+		totalUsage.InputTokens += step.Usage.InputTokens
+		totalUsage.OutputTokens += step.Usage.OutputTokens
+		totalUsage.TotalTokens += step.Usage.TotalTokens
+		totalUsage.ReasoningTokens += step.Usage.ReasoningTokens
+		totalUsage.CachedInputTokens += step.Usage.CachedInputTokens
+		stepJSON, _ := json.Marshal(step.Usage)
+		perStepUsages = append(perStepUsages, stepJSON)
+	}
+	usageJSON, _ := json.Marshal(totalUsage)
+	usagesJSON, _ := json.Marshal(perStepUsages)
+
+	termEvent := StreamEvent{
+		Messages: mustMarshal(finalMessages),
+		Usage:    usageJSON,
+		Usages:   usagesJSON,
+		Skills:   enabledSkills,
+	}
+	if aborted {
+		termEvent.Type = EventAgentAbort
+	} else {
+		termEvent.Type = EventAgentEnd
+	}
+	ch <- termEvent
+}
+
+func (a *Agent) runGenerate(ctx context.Context, cfg RunConfig) (*GenerateResult, error) {
+	tools, err := a.assembleTools(ctx, cfg)
+	if err != nil {
+		return nil, fmt.Errorf("assemble tools: %w", err)
+	}
+
+	// Sized slice for the same reason as in runStream: a zero-length
+	// destination would make copy() a no-op.
+	enabledSkills := make([]string, len(cfg.EnabledSkillNames))
+	copy(enabledSkills, cfg.EnabledSkillNames)
+	enableSkill := func(name string) {
+		for _, s := range cfg.Skills {
+			if s.Name == name {
+				for _, existing := range enabledSkills {
+					if existing == name {
+						return
+					}
+				}
+				enabledSkills = append(enabledSkills, name)
+				return
+			}
+		}
+	}
+
+	var toolLoopGuard *ToolLoopGuard
+	var textLoopGuard *TextLoopGuard
+	toolLoopAbortCallIDs := make(map[string]struct{})
+	if cfg.LoopDetection.Enabled {
+		toolLoopGuard = NewToolLoopGuard(ToolLoopRepeatThreshold, ToolLoopWarningsBeforeAbort)
+		textLoopGuard = NewTextLoopGuard(LoopDetectedStreakThreshold, LoopDetectedMinNewGramsPerChunk,
SentialOptions{}) + } + + if toolLoopGuard != nil { + tools = wrapToolsWithLoopGuard(tools, toolLoopGuard, toolLoopAbortCallIDs) + } + + opts := a.buildGenerateOptions(cfg, tools) + opts = append(opts, + sdk.WithOnStep(func(step *sdk.StepResult) *sdk.GenerateParams { + if cfg.LoopDetection.Enabled { + if len(toolLoopAbortCallIDs) > 0 { + return nil // stop + } + if textLoopGuard != nil && isNonEmptyString(step.Text) { + result := textLoopGuard.Inspect(step.Text) + if result.Abort { + return nil // stop + } + } + } + for _, tr := range step.ToolResults { + if tr.ToolName == "use_skill" { + if resultMap, ok := tr.Output.(map[string]any); ok { + if skillName, ok := resultMap["skillName"].(string); ok && skillName != "" { + enableSkill(skillName) + } + } + } + } + return nil + }), + ) + + genResult, err := a.client.GenerateTextResult(ctx, opts...) + if err != nil { + return nil, fmt.Errorf("generate: %w", err) + } + + resolvers := DefaultTagResolvers() + cleanedText, events := ExtractTagsFromText(genResult.Text, resolvers) + + var attachments []FileAttachment + var reactions []ReactionItem + var speeches []SpeechItem + for _, ev := range events { + switch ev.Tag { + case "attachments": + for _, d := range ev.Data { + if att, ok := d.(FileAttachment); ok { + attachments = append(attachments, att) + } + } + case "reactions": + for _, d := range ev.Data { + if r, ok := d.(ReactionItem); ok { + reactions = append(reactions, r) + } + } + case "speech": + for _, d := range ev.Data { + if s, ok := d.(SpeechItem); ok { + speeches = append(speeches, s) + } + } + } + } + + finalMessages := StripTagsFromMessages(genResult.Messages) + + return &GenerateResult{ + Messages: finalMessages, + Text: cleanedText, + Attachments: attachments, + Reactions: reactions, + Speeches: speeches, + Skills: enabledSkills, + Usage: &genResult.Usage, + }, nil +} + +func (*Agent) buildGenerateOptions(cfg RunConfig, tools []sdk.Tool) []sdk.GenerateOption { + opts := []sdk.GenerateOption{ + sdk.WithModel(cfg.Model), + sdk.WithMessages(cfg.Messages), + sdk.WithSystem(cfg.System), + sdk.WithMaxSteps(-1), + } + if len(tools) > 0 { + opts = append(opts, sdk.WithTools(tools)) + } + opts = append(opts, BuildReasoningOptions(ModelConfig{ + ClientType: resolveClientType(cfg.Model), + ReasoningConfig: &ReasoningConfig{ + Enabled: cfg.ReasoningEffort != "", + Effort: cfg.ReasoningEffort, + }, + })...) + return opts +} + +func resolveClientType(model *sdk.Model) string { + if model == nil || model.Provider == nil { + return ClientTypeOpenAICompletions + } + name := model.Provider.Name() + switch { + case strings.Contains(name, "anthropic"): + return ClientTypeAnthropicMessages + case strings.Contains(name, "google"): + return ClientTypeGoogleGenerativeAI + case strings.Contains(name, "responses"): + return ClientTypeOpenAIResponses + default: + return ClientTypeOpenAICompletions + } +} + +// assembleTools collects tools from all registered ToolProviders. 
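Side note on `resolveClientType` above: it is a substring heuristic over the provider name, so any OpenAI-compatible proxy falls through to the completions client. A standalone sketch of the same mapping, runnable in isolation (the `clientTypeFor` helper is illustrative and stands in for the real function, which takes an `*sdk.Model`); `assembleTools` itself continues right below:

```go
package main

import (
	"fmt"
	"strings"
)

// clientTypeFor mirrors resolveClientType's substring heuristic;
// providerName stands in for model.Provider.Name().
func clientTypeFor(providerName string) string {
	switch {
	case strings.Contains(providerName, "anthropic"):
		return "anthropic-messages"
	case strings.Contains(providerName, "google"):
		return "google-generative-ai"
	case strings.Contains(providerName, "responses"):
		return "openai-responses"
	default:
		return "openai-completions" // OpenAI-compatible fallback
	}
}

func main() {
	for _, name := range []string{"anthropic.messages", "google.generativeai", "openai.responses", "my-proxy"} {
		fmt.Printf("%-22s -> %s\n", name, clientTypeFor(name))
	}
}
```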
+func (a *Agent) assembleTools(ctx context.Context, cfg RunConfig) ([]sdk.Tool, error) { + if len(a.toolProviders) == 0 { + return nil, nil + } + session := tools.SessionContext{ + BotID: cfg.Identity.BotID, + ChatID: cfg.Identity.ChatID, + ChannelIdentityID: cfg.Identity.ChannelIdentityID, + SessionToken: cfg.Identity.SessionToken, + CurrentPlatform: cfg.Identity.CurrentPlatform, + ReplyTarget: cfg.Identity.ReplyTarget, + IsSubagent: cfg.Identity.IsSubagent, + } + + var allTools []sdk.Tool + for _, provider := range a.toolProviders { + providerTools, err := provider.Tools(ctx, session) + if err != nil { + a.logger.Warn("tool provider failed", slog.Any("error", err)) + continue + } + allTools = append(allTools, providerTools...) + } + return allTools, nil +} + +func emitTagEvents(ch chan<- StreamEvent, events []TagEvent) { + for _, ev := range events { + switch ev.Tag { + case "attachments": + var atts []FileAttachment + for _, d := range ev.Data { + if att, ok := d.(FileAttachment); ok { + atts = append(atts, att) + } + } + if len(atts) > 0 { + ch <- StreamEvent{Type: EventAttachment, Attachments: atts} + } + case "reactions": + var reactions []ReactionItem + for _, d := range ev.Data { + if r, ok := d.(ReactionItem); ok { + reactions = append(reactions, r) + } + } + if len(reactions) > 0 { + ch <- StreamEvent{Type: EventReaction, Reactions: reactions} + } + case "speech": + var speeches []SpeechItem + for _, d := range ev.Data { + if s, ok := d.(SpeechItem); ok { + speeches = append(speeches, s) + } + } + if len(speeches) > 0 { + ch <- StreamEvent{Type: EventSpeech, Speeches: speeches} + } + } + } +} + +func wrapToolsWithLoopGuard(tools []sdk.Tool, guard *ToolLoopGuard, abortCallIDs map[string]struct{}) []sdk.Tool { + wrapped := make([]sdk.Tool, len(tools)) + for i, tool := range tools { + originalExecute := tool.Execute + toolName := tool.Name + wrapped[i] = tool + wrapped[i].Execute = func(ctx *sdk.ToolExecContext, input any) (any, error) { + warn, abort := guard.Guard(toolName, input) + if abort { + abortCallIDs[ctx.ToolCallID] = struct{}{} + return map[string]any{ + "isError": true, + "content": []map[string]any{{ + "type": "text", + "text": ToolLoopDetectedAbortMessage, + }}, + }, errors.New(ToolLoopDetectedAbortMessage) + } + if warn { + return map[string]any{ + ToolLoopWarningKey: true, + "content": []map[string]any{{ + "type": "text", + "text": ToolLoopWarningText, + }}, + }, nil + } + return originalExecute(ctx, input) + } + } + return wrapped +} diff --git a/internal/agent/config.go b/internal/agent/config.go new file mode 100644 index 00000000..14db6a6f --- /dev/null +++ b/internal/agent/config.go @@ -0,0 +1,13 @@ +package agent + +import ( + "log/slog" + + "github.com/memohai/memoh/internal/workspace/bridge" +) + +// Deps holds all service dependencies for the Agent. +type Deps struct { + BridgeProvider bridge.Provider + Logger *slog.Logger +} diff --git a/internal/agent/fs.go b/internal/agent/fs.go new file mode 100644 index 00000000..e6bb303c --- /dev/null +++ b/internal/agent/fs.go @@ -0,0 +1,73 @@ +package agent + +import ( + "context" + "fmt" + "strings" + + "github.com/memohai/memoh/internal/workspace/bridge" +) + +// FSClient provides file operations against a bot's container filesystem. +type FSClient struct { + provider bridge.Provider + botID string +} + +// NewFSClient creates a new container filesystem client. 
+func NewFSClient(provider bridge.Provider, botID string) *FSClient { + return &FSClient{provider: provider, botID: botID} +} + +// ReadText reads a text file from the container, returning its content as a string. +// Returns an empty string if the file does not exist or cannot be read. +func (f *FSClient) ReadText(ctx context.Context, path string) (string, error) { + if f.provider == nil { + return "", nil + } + client, err := f.provider.MCPClient(ctx, f.botID) + if err != nil { + return "", fmt.Errorf("mcp client: %w", err) + } + resp, err := client.ReadFile(ctx, path, 0, 0) + if err != nil { + return "", err + } + return resp.GetContent(), nil +} + +// ReadTextSafe reads a text file, returning empty string on any error. +func (f *FSClient) ReadTextSafe(ctx context.Context, path string) string { + content, _ := f.ReadText(ctx, path) + return content +} + +// LoadSystemFiles loads the standard set of system files from the bot container. +func (f *FSClient) LoadSystemFiles(ctx context.Context) []SystemFile { + home := "/data" + now := TimeNow() + pad := func(n int) string { return fmt.Sprintf("%02d", n) } + today := fmt.Sprintf("%d-%s-%s", now.Year(), pad(int(now.Month())), pad(now.Day())) + yesterday := now.AddDate(0, 0, -1) + yesterdayStr := fmt.Sprintf("%d-%s-%s", yesterday.Year(), pad(int(yesterday.Month())), pad(yesterday.Day())) + + filenames := []string{ + "IDENTITY.md", + "SOUL.md", + "TOOLS.md", + "MEMORY.md", + "PROFILES.md", + "memory/" + today + ".md", + "memory/" + yesterdayStr + ".md", + } + + files := make([]SystemFile, len(filenames)) + for i, name := range filenames { + content := f.ReadTextSafe(ctx, home+"/"+name) + files[i] = SystemFile{ + Filename: name, + Content: strings.TrimSpace(content), + } + } + return files +} diff --git a/internal/agent/model.go b/internal/agent/model.go new file mode 100644 index 00000000..a663df38 --- /dev/null +++ b/internal/agent/model.go @@ -0,0 +1,130 @@ +package agent + +import ( + anthropicmessages "github.com/memohai/twilight-ai/provider/anthropic/messages" + googlegenerative "github.com/memohai/twilight-ai/provider/google/generativeai" + openaicompletions "github.com/memohai/twilight-ai/provider/openai/completions" + openairesponses "github.com/memohai/twilight-ai/provider/openai/responses" + sdk "github.com/memohai/twilight-ai/sdk" +) + +// ClientType constants matching the database model configuration. +const ( + ClientTypeOpenAICompletions = "openai-completions" + ClientTypeOpenAIResponses = "openai-responses" + ClientTypeAnthropicMessages = "anthropic-messages" + ClientTypeGoogleGenerativeAI = "google-generative-ai" +) + +// Reasoning budget maps per client type. +var ( + anthropicBudget = map[string]int{"low": 5000, "medium": 16000, "high": 50000} + googleBudget = map[string]int{"low": 5000, "medium": 16000, "high": 50000} +) + +// CreateModel builds a Twilight AI SDK Model from the resolved model config. +func CreateModel(cfg ModelConfig) *sdk.Model { + switch cfg.ClientType { + case ClientTypeOpenAICompletions: + opts := []openaicompletions.Option{ + openaicompletions.WithAPIKey(cfg.APIKey), + } + if cfg.BaseURL != "" { + opts = append(opts, openaicompletions.WithBaseURL(cfg.BaseURL)) + } + p := openaicompletions.New(opts...) + return p.ChatModel(cfg.ModelID) + + case ClientTypeOpenAIResponses: + opts := []openairesponses.Option{ + openairesponses.WithAPIKey(cfg.APIKey), + } + if cfg.BaseURL != "" { + opts = append(opts, openairesponses.WithBaseURL(cfg.BaseURL)) + } + p := openairesponses.New(opts...) 
+ return p.ChatModel(cfg.ModelID) + + case ClientTypeAnthropicMessages: + opts := []anthropicmessages.Option{ + anthropicmessages.WithAPIKey(cfg.APIKey), + } + if cfg.BaseURL != "" { + opts = append(opts, anthropicmessages.WithBaseURL(cfg.BaseURL)) + } + if cfg.ReasoningConfig != nil && cfg.ReasoningConfig.Enabled { + budget := ReasoningBudgetTokens(ClientTypeAnthropicMessages, cfg.ReasoningConfig.Effort) + opts = append(opts, anthropicmessages.WithThinking(anthropicmessages.ThinkingConfig{ + Type: "enabled", + BudgetTokens: budget, + })) + } + p := anthropicmessages.New(opts...) + return p.ChatModel(cfg.ModelID) + + case ClientTypeGoogleGenerativeAI: + opts := []googlegenerative.Option{ + googlegenerative.WithAPIKey(cfg.APIKey), + } + if cfg.BaseURL != "" { + opts = append(opts, googlegenerative.WithBaseURL(cfg.BaseURL)) + } + p := googlegenerative.New(opts...) + return p.ChatModel(cfg.ModelID) + + default: + // OpenAI-compatible fallback + opts := []openaicompletions.Option{ + openaicompletions.WithAPIKey(cfg.APIKey), + } + if cfg.BaseURL != "" { + opts = append(opts, openaicompletions.WithBaseURL(cfg.BaseURL)) + } + p := openaicompletions.New(opts...) + return p.ChatModel(cfg.ModelID) + } +} + +// BuildReasoningOptions returns SDK generation options for reasoning/thinking. +func BuildReasoningOptions(cfg ModelConfig) []sdk.GenerateOption { + if cfg.ReasoningConfig == nil || !cfg.ReasoningConfig.Enabled { + return nil + } + effort := cfg.ReasoningConfig.Effort + if effort == "" { + effort = "medium" + } + + switch cfg.ClientType { + case ClientTypeAnthropicMessages: + // Anthropic uses thinking budget — no SDK option, handled by provider + return nil + case ClientTypeOpenAIResponses, ClientTypeOpenAICompletions: + return []sdk.GenerateOption{sdk.WithReasoningEffort(effort)} + case ClientTypeGoogleGenerativeAI: + return nil + default: + return []sdk.GenerateOption{sdk.WithReasoningEffort(effort)} + } +} + +// ReasoningBudgetTokens returns the token budget for extended thinking based on client type and effort. +func ReasoningBudgetTokens(clientType, effort string) int { + if effort == "" { + effort = "medium" + } + switch clientType { + case ClientTypeAnthropicMessages: + if b, ok := anthropicBudget[effort]; ok { + return b + } + return anthropicBudget["medium"] + case ClientTypeGoogleGenerativeAI: + if b, ok := googleBudget[effort]; ok { + return b + } + return googleBudget["medium"] + default: + return 0 + } +} diff --git a/internal/agent/prompt.go b/internal/agent/prompt.go new file mode 100644 index 00000000..7b4ba8e3 --- /dev/null +++ b/internal/agent/prompt.go @@ -0,0 +1,175 @@ +package agent + +import ( + "embed" + "fmt" + "strconv" + "strings" +) + +//go:embed prompts/*.md +var promptsFS embed.FS + +var ( + systemTmpl string + scheduleTmpl string + heartbeatTmpl string + subagentTmpl string +) + +func init() { + systemTmpl = mustReadPrompt("prompts/system.md") + scheduleTmpl = mustReadPrompt("prompts/schedule.md") + heartbeatTmpl = mustReadPrompt("prompts/heartbeat.md") + subagentTmpl = mustReadPrompt("prompts/subagent.md") +} + +func mustReadPrompt(name string) string { + data, err := promptsFS.ReadFile(name) + if err != nil { + panic(fmt.Sprintf("failed to read embedded prompt %s: %v", name, err)) + } + return string(data) +} + +// render replaces all {{key}} placeholders in tmpl with values from vars. 
+func render(tmpl string, vars map[string]string) string {
+	result := tmpl
+	for k, v := range vars {
+		result = strings.ReplaceAll(result, "{{"+k+"}}", v)
+	}
+	return strings.TrimSpace(result)
+}
+
+// GenerateSystemPrompt builds the complete system prompt from files, skills, and context.
+func GenerateSystemPrompt(params SystemPromptParams) string {
+	home := "/data"
+
+	basicTools := []string{
+		"- `read`: read file content",
+	}
+	if params.SupportsImageInput {
+		basicTools = append(basicTools, "- `read_media`: view the media")
+	}
+	basicTools = append(basicTools,
+		"- `write`: write file content",
+		"- `list`: list directory entries",
+		"- `edit`: replace exact text in a file",
+		"- `exec`: execute command",
+	)
+
+	skillsList := ""
+	if len(params.Skills) > 0 {
+		lines := make([]string, len(params.Skills))
+		for i, s := range params.Skills {
+			lines[i] = "- " + s.Name + ": " + s.Description
+		}
+		skillsList = strings.Join(lines, "\n")
+	}
+
+	var enabledSkillsSection strings.Builder
+	for _, s := range params.EnabledSkills {
+		enabledSkillsSection.WriteString("\n\n---\n\n" + formatSkillPrompt(s))
+	}
+
+	var fileSections strings.Builder
+	for _, f := range params.Files {
+		if f.Content == "" {
+			continue
+		}
+		fileSections.WriteString("\n\n" + formatSystemFile(f))
+	}
+
+	return render(systemTmpl, map[string]string{
+		"home":                 home,
+		"basicTools":           strings.Join(basicTools, "\n"),
+		"fileSections":         fileSections.String(),
+		"skillsCount":          strconv.Itoa(len(params.Skills)),
+		"skillsList":           skillsList,
+		"enabledSkillsSection": enabledSkillsSection.String(),
+		"inboxSection":         formatInbox(params.Inbox),
+	})
+}
+
+// SystemPromptParams holds all inputs for system prompt generation.
+type SystemPromptParams struct {
+	Skills             []SkillEntry
+	EnabledSkills      []SkillEntry
+	Files              []SystemFile
+	Inbox              []InboxItem
+	SupportsImageInput bool
+}
+
+// GenerateSchedulePrompt builds the user message for a scheduled task trigger.
+func GenerateSchedulePrompt(s Schedule) string {
+	maxCallsStr := "Unlimited"
+	if s.MaxCalls != nil {
+		maxCallsStr = strconv.Itoa(*s.MaxCalls)
+	}
+	return render(scheduleTmpl, map[string]string{
+		"name":        s.Name,
+		"description": s.Description,
+		"maxCalls":    maxCallsStr,
+		"pattern":     s.Pattern,
+		"command":     s.Command,
+	})
+}
+
+// GenerateHeartbeatPrompt builds the user message for a heartbeat trigger.
+func GenerateHeartbeatPrompt(interval int, checklist string) string {
+	checklistSection := ""
+	if strings.TrimSpace(checklist) != "" {
+		checklistSection = "\n## HEARTBEAT.md (checklist)\n\n" + strings.TrimSpace(checklist) + "\n"
+	}
+	return render(heartbeatTmpl, map[string]string{
+		"interval":         strconv.Itoa(interval),
+		"timeNow":          TimeNow().UTC().Format("2006-01-02T15:04:05Z"),
+		"checklistSection": checklistSection,
+	})
+}
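The `render` engine above is intentionally minimal: literal `{{key}}` replacement with no escaping, conditionals, or loops, and unknown placeholders pass through untouched. A self-contained illustration of that contract (the helper body is copied inline so the example runs on its own):

```go
package main

import (
	"fmt"
	"strings"
)

// render, copied from internal/agent/prompt.go: every {{key}} occurrence is
// replaced literally, unmatched placeholders are left in place, and the
// result is whitespace-trimmed.
func render(tmpl string, vars map[string]string) string {
	result := tmpl
	for k, v := range vars {
		result = strings.ReplaceAll(result, "{{"+k+"}}", v)
	}
	return strings.TrimSpace(result)
}

func main() {
	out := render("  Hello {{name}}, today is {{day}}. {{missing}} ",
		map[string]string{"name": "Memoh", "day": "Friday"})
	fmt.Println(out) // Hello Memoh, today is Friday. {{missing}}
}
```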
+// GenerateSubagentSystemPrompt builds the system prompt for a subagent.
+func GenerateSubagentSystemPrompt(name, description string) string {
+	return render(subagentTmpl, map[string]string{
+		"name":        name,
+		"description": description,
+	})
+}
+
+func formatSkillPrompt(skill SkillEntry) string {
+	return fmt.Sprintf("**`%s`**\n> %s\n\n%s", skill.Name, skill.Description, skill.Content)
+}
+
+func formatSystemFile(file SystemFile) string {
+	return fmt.Sprintf("## %s\n\n%s", file.Filename, file.Content)
+}
+
+func formatInbox(items []InboxItem) string {
+	if len(items) == 0 {
+		return ""
+	}
+
+	formatted := make([]map[string]any, len(items))
+	for i, item := range items {
+		formatted[i] = map[string]any{
+			"id":        item.ID,
+			"source":    item.Source,
+			"header":    item.Header,
+			"content":   item.Content,
+			"createdAt": item.CreatedAt,
+		}
+	}
+
+	var sb strings.Builder
+	fmt.Fprintf(&sb, "## Inbox (%d unread)\n\n", len(items))
+	sb.WriteString("These are messages from other channels — NOT from the current conversation. Use `send` or `react` if you want to respond to any of them.\n\n")
+	sb.WriteString("<inbox>\n")
+	sb.Write(mustMarshal(formatted))
+	sb.WriteString("\n</inbox>\n\n")
+	sb.WriteString("Use `search_inbox` to find older messages by keyword.")
+	return sb.String()
+}
diff --git a/internal/agent/prompts/heartbeat.md b/internal/agent/prompts/heartbeat.md
new file mode 100644
index 00000000..069cd4d6
--- /dev/null
+++ b/internal/agent/prompts/heartbeat.md
@@ -0,0 +1,10 @@
+** This is a heartbeat check automatically triggered by the system **
+---
+interval: every {{interval}} minutes
+time: {{timeNow}}
+---
+{{checklistSection}}
+
+Do not infer or repeat old tasks from prior chats.
+If nothing needs attention, reply HEARTBEAT_OK.
+If something needs attention, use the send tool to deliver alerts to the appropriate channel.
diff --git a/internal/agent/prompts/schedule.md b/internal/agent/prompts/schedule.md
new file mode 100644
index 00000000..a8456d1e
--- /dev/null
+++ b/internal/agent/prompts/schedule.md
@@ -0,0 +1,9 @@
+** This is a scheduled task automatically sent to you by the system **
+---
+schedule-name: {{name}}
+schedule-description: {{description}}
+max-calls: {{maxCalls}}
+cron-pattern: {{pattern}}
+---
+
+{{command}}
diff --git a/internal/agent/prompts/subagent.md b/internal/agent/prompts/subagent.md
new file mode 100644
index 00000000..5fec6f6c
--- /dev/null
+++ b/internal/agent/prompts/subagent.md
@@ -0,0 +1,5 @@
+---
+name: {{name}}
+---
+
+{{description}}
diff --git a/internal/agent/prompts/system.md b/internal/agent/prompts/system.md
new file mode 100644
index 00000000..d3b960a1
--- /dev/null
+++ b/internal/agent/prompts/system.md
@@ -0,0 +1,209 @@
+You just woke up.
+
+**Your text output IS your reply.** Whatever you write goes directly back to the person who messaged you. You do not need any tool to reply — just write.
+
+**`{{home}}` is your HOME** — you can read and write files there freely.
+
+## Basic Tools
+{{basicTools}}
+
+## Safety
+- Keep private data private
+- Don't run destructive commands without asking
+- When in doubt, ask
+
+## Core files
+- `IDENTITY.md`: Your identity and personality.
+- `SOUL.md`: Your soul and beliefs.
+- `TOOLS.md`: Your tools and methods.
+- `PROFILES.md`: Profiles of users and groups.
+- `MEMORY.md`: Your core memory.
+- `memory/YYYY-MM-DD.md`: Today's memory.
+
+## Memory
+
+You wake up fresh each session. These files are your continuity:
+
+- **Daily notes:** `memory/YYYY-MM-DD.md` (create `memory/` if needed) — raw logs of what happened
+- **Long-term:** `MEMORY.md` — your curated memories, like a human's long-term memory
+
+Use `search_memory` to recall earlier conversations beyond the current context window.
+
+### Memory Write Rules (IMPORTANT)
+
+For `memory/YYYY-MM-DD.md`, use canonical markdown entries:
+
+```
+## Entry mem_20260313_001
+
+```yaml
+id: mem_20260313_001
+created_at: 2026-03-13T13:34:49Z
+updated_at: 2026-03-13T13:34:49Z
+metadata:
+  topic: Notes
+```
+
+What happened / what to remember
+```
+
+Rules:
+- Only send NEW memory items (do not re-write old content).
+- Preserve the canonical entry structure for daily memory files.
+- When a memory is about a known user or group from `PROFILES.md`, include a stable profile link in `metadata` (for example `profile_ref`, plus identity fields when available).
+- Do not provide `hash` (backend generates it).
+- If plain text is unavoidable, write concise factual notes only.
+- `MEMORY.md` stays human-readable markdown (not JSON).
+
+## How to Respond
+
+**Direct reply (default):** When someone sends you a message in the current session, just write your response as plain text. This is the normal way to answer — your text output goes directly back to the person talking to you. Do NOT use `send` for this.
+
+**`send` tool:** ONLY for reaching out to a DIFFERENT channel or conversation — e.g. posting to another group, messaging a different person, or replying to an inbox item from another platform. Requires a `target` — use `get_contacts` to find available targets.
+
+**`react` tool:** Add or remove an emoji reaction on a specific message (any channel).
+
+**`speak` tool:** Send a voice message to a DIFFERENT channel. Synthesizes text and delivers as audio. Requires `target` — use `get_contacts` to find available targets. For speaking in the current conversation, use the `<speech>` block instead.
+
+### When to use `send`
+- A scheduled task tells you to notify or post somewhere.
+- You want to forward information to a different group or person.
+- You want to reply to an inbox message that came from another channel.
+- The user explicitly asks you to send a message to someone else or another channel.
+
+### When NOT to use `send`
+- The user is chatting with you and expects a reply — just respond directly.
+- The user asks a question, gives a command, or has a conversation — just respond directly.
+- The user asks you to search, summarize, compute, or do any task — do the work with tools, then write the result directly. Do NOT use `send` to deliver results back to the person who asked.
+- If you are unsure, respond directly. Only use `send` when the destination is clearly a different target.
+
+**Common mistake:** User says "search for X" → you search → then you use `send` to post the result back to the same conversation. This is WRONG. Just write the result as your reply.
+
+## Contacts
+You may receive messages from different people, bots, and channels. Use `get_contacts` to list all known contacts and conversations for your bot.
+It returns each route's platform, conversation type, and `target` (the value you pass to `send`).
+
+## Your Inbox
+Your inbox contains notifications from:
+- Group conversations where you were not directly mentioned.
+- Other connected platforms (email, etc.).
+
+Guidelines:
+- Not all messages need a response — be selective like a human would.
+- If you decide to reply to an inbox message, use `send` or `react` (since inbox messages come from other channels).
+- Sometimes an emoji reaction is better than a long reply.
+
+## Attachments
+
+**Receiving**: Uploaded files are saved to your workspace; the file path appears in the message header.
+
+**Sending via `send` tool**: Pass file paths or URLs in the `attachments` parameter. Example: `attachments: ["{{home}}/media/ab/file.jpg", "https://example.com/img.png"]`
+
+**Sending in direct responses**: Use this format:
+
+```
+<attachments>
+- {{home}}/path/to/file.pdf
+- {{home}}/path/to/video.mp4
+- https://example.com/image.png
+</attachments>
+```
+
+Rules:
+- One path or URL per line, prefixed by `- `
+- No extra text inside `<attachments>...</attachments>`
+- The block can appear anywhere in your response; it will be parsed and stripped from visible text
+
+## Reactions
+
+To react with an emoji to the message you are replying to, use this format in your direct response:
+
+```
+<reactions>
+- 👍
+</reactions>
+```
+
+Rules:
+- One emoji per line, prefixed by `- `
+- The block can appear anywhere in your response; it will be parsed and stripped from visible text
+- This reacts to the **source message** of the current conversation (the message you are responding to)
+- For reacting to messages in other channels or removing reactions, use the `react` tool instead
+
+## Speech
+
+To speak aloud in the current conversation (text-to-speech), use this format in your direct response:
+
+```
+<speech>
+The text you want to say aloud.
+</speech>
+```
+
+Rules:
+- Content is the text to synthesize (max 500 characters)
+- The block can appear anywhere in your response; it will be parsed and stripped from visible text
+- For sending voice to a DIFFERENT channel, use the `speak` tool instead
+
+## Schedule Tasks
+
+You can create and manage schedule tasks via cron.
+Use `schedule` to create a new schedule task, and fill `command` with natural language.
+When the cron pattern is valid, you will receive a schedule message with your `command`.
+
+When a scheduled task triggers, use `send` to deliver the result to the intended channel — do not respond directly, as there is no active conversation to reply to.
+
+## Heartbeat — Be Proactive
+
+You may receive periodic **heartbeat** messages — automatic system-triggered turns that let you proactively check on things without the user asking.
+
+### The HEARTBEAT_OK Contract
+- If nothing needs attention, reply with exactly `HEARTBEAT_OK`. The system will suppress this message — the user will not see it.
+- If something needs attention, use `send` to deliver alerts to the appropriate channel. Your text output in heartbeat turns is NOT sent to the user directly.
+
+### HEARTBEAT.md
+`{{home}}/HEARTBEAT.md` is your checklist file. The system will read it automatically and include its content in the heartbeat message. You are free to edit this file — add short checklists, reminders, or periodic tasks. Keep it small to limit token usage.
+
+### When to Reach Out (use `send`)
+- Important messages or notifications arrived
+- Upcoming events or deadlines (< 2 hours)
+- Something interesting or actionable you discovered
+- A monitored task changed status
+
+### When to Stay Quiet (`HEARTBEAT_OK`)
+- Late night hours unless truly urgent
+- Nothing new since last check
+- The user is clearly busy or in a conversation
+- You just checked recently and nothing changed
+
+### Proactive Work (no need to ask)
+During heartbeats you can freely:
+- Read, organize, and update your memory files
+- Check on ongoing projects (git status, file changes, etc.)
+- Update `HEARTBEAT.md` to refine your own checklist
+- Clean up or archive old notes
+
+### Heartbeat vs Schedule: When to Use Each
+- **Heartbeat**: batch multiple periodic checks together (inbox + calendar + notifications), timing can drift slightly, needs conversational context.
+- **Schedule (cron)**: exact timing matters, task needs isolation, one-shot reminders, output should go directly to a channel.
+
+**Tip:** Batch similar periodic checks into `HEARTBEAT.md` instead of creating multiple schedule tasks. Use schedule for precise timing and standalone tasks.
+
+## Subagent
+
+For complex tasks like:
+- Create a website
+- Research a topic
+- Generate a report
+- etc.
+
+You can create a subagent to help you with these tasks;
+`description` will be the system prompt for the subagent.
+{{fileSections}}
+
+## Skills
+{{skillsCount}} skills available via `use_skill`:
+{{skillsList}}
+{{enabledSkillsSection}}
+
+{{inboxSection}}
diff --git a/internal/agent/sential.go b/internal/agent/sential.go
new file mode 100644
index 00000000..51d5784d
--- /dev/null
+++ b/internal/agent/sential.go
@@ -0,0 +1,487 @@
+package agent
+
+import (
+	"crypto/sha256"
+	"encoding/hex"
+	"encoding/json"
+	"fmt"
+	"sort"
+	"strings"
+	"unicode"
+	"unicode/utf8"
+)
+
+// Loop detection constants matching the TypeScript implementation.
+const (
+	LoopDetectedAbortMessage        = "loop detected, stream aborted"
+	LoopDetectedStreakThreshold     = 3
+	LoopDetectedMinNewGramsPerChunk = 8
+	LoopDetectedProbeChars          = 256
+	ToolLoopDetectedAbortMessage    = "tool loop detected, stream aborted"
+	ToolLoopRepeatThreshold         = 5
+	ToolLoopWarningsBeforeAbort     = 1
+	ToolLoopWarningKey              = "__memoh_tool_loop_warning" //nolint:gosec // internal warning key, not a credential
+	ToolLoopWarningText             = "[MEMOH_TOOL_LOOP_WARNING] Repeated identical tool invocation (same tool + arguments) was detected more than 5 times. Stop looping this tool and either summarize current results or change strategy." //nolint:gosec // human-readable warning text, not a credential
+
+	defaultNgramSize           = 10
+	defaultWindowSize          = 1000
+	defaultOverlapThreshold    = 0.75
+	defaultConsecutiveHits     = 10
+	defaultMinNewGramsPerChunk = 1
+)
+
+// --- Sential: n-gram overlap detector ---
+
+// SentialOptions configures the n-gram overlap detector.
+type SentialOptions struct {
+	NgramSize        int
+	WindowSize       int
+	OverlapThreshold float64
+}
+
+// SentialResult is the output of an overlap inspection.
+type SentialResult struct {
+	Hit          bool
+	Overlap      float64
+	MatchedGrams int
+	NewGrams     int
+}
+
+// Sential detects text repetition via n-gram overlap.
+type Sential struct {
+	ngramSize        int
+	windowSize       int
+	overlapThreshold float64
+	windowChars      []rune
+	windowNgramQueue []string
+	historySet       map[string]struct{}
+	historyCounts    map[string]int
+}
+
+// NewSential creates a new n-gram overlap detector.
+func NewSential(opts SentialOptions) *Sential { + ngramSize := opts.NgramSize + if ngramSize <= 0 { + ngramSize = defaultNgramSize + } + windowSize := opts.WindowSize + if windowSize <= 0 { + windowSize = defaultWindowSize + } + overlapThreshold := opts.OverlapThreshold + if overlapThreshold <= 0 { + overlapThreshold = defaultOverlapThreshold + } + return &Sential{ + ngramSize: ngramSize, + windowSize: windowSize, + overlapThreshold: overlapThreshold, + historySet: make(map[string]struct{}), + historyCounts: make(map[string]int), + } +} + +func (s *Sential) addHistoryGram(gram string) { + s.historyCounts[gram]++ + if s.historyCounts[gram] == 1 { + s.historySet[gram] = struct{}{} + } +} + +func (s *Sential) removeHistoryGram(gram string) { + count := s.historyCounts[gram] + if count <= 1 { + delete(s.historyCounts, gram) + delete(s.historySet, gram) + return + } + s.historyCounts[gram] = count - 1 +} + +func (s *Sential) pushWindowChar(ch rune) { + s.windowChars = append(s.windowChars, ch) + if len(s.windowChars) >= s.ngramSize { + start := len(s.windowChars) - s.ngramSize + gram := string(s.windowChars[start : start+s.ngramSize]) + s.windowNgramQueue = append(s.windowNgramQueue, gram) + s.addHistoryGram(gram) + } + if len(s.windowChars) <= s.windowSize { + return + } + s.windowChars = s.windowChars[1:] + if len(s.windowNgramQueue) > 0 { + removed := s.windowNgramQueue[0] + s.windowNgramQueue = s.windowNgramQueue[1:] + s.removeHistoryGram(removed) + } +} + +// Inspect checks a chunk of text for n-gram overlap with the sliding window. +func (s *Sential) Inspect(text string) SentialResult { + incoming := []rune(text) + if len(incoming) == 0 { + return SentialResult{} + } + + contextSize := s.ngramSize - 1 + if contextSize < 0 { + contextSize = 0 + } + var contextChars []rune + if contextSize > 0 && len(s.windowChars) > 0 { + start := len(s.windowChars) - contextSize + if start < 0 { + start = 0 + } + contextChars = make([]rune, len(s.windowChars[start:])) + copy(contextChars, s.windowChars[start:]) + } + candidate := append([]rune{}, contextChars...) + candidate = append(candidate, incoming...) + contextLength := len(contextChars) + + matchedGrams := 0 + newGrams := 0 + if len(candidate) >= s.ngramSize { + for i := 0; i <= len(candidate)-s.ngramSize; i++ { + gramEndIndex := i + s.ngramSize - 1 + if gramEndIndex < contextLength { + continue + } + gram := string(candidate[i : i+s.ngramSize]) + newGrams++ + if _, ok := s.historySet[gram]; ok { + matchedGrams++ + } + } + } + + overlap := 0.0 + if newGrams > 0 { + overlap = float64(matchedGrams) / float64(newGrams) + } + hit := overlap > s.overlapThreshold + + for _, ch := range incoming { + s.pushWindowChar(ch) + } + + return SentialResult{ + Hit: hit, + Overlap: overlap, + MatchedGrams: matchedGrams, + NewGrams: newGrams, + } +} + +// Reset clears the detector state. +func (s *Sential) Reset() { + s.windowChars = nil + s.windowNgramQueue = nil + s.historySet = make(map[string]struct{}) + s.historyCounts = make(map[string]int) +} + +// --- Text Loop Guard --- + +// TextLoopGuardResult extends SentialResult with streak and abort tracking. +type TextLoopGuardResult struct { + SentialResult + Streak int + Abort bool +} + +// TextLoopGuard wraps Sential with consecutive-hit tracking. +type TextLoopGuard struct { + sential *Sential + consecutiveHitsToAbort int + minNewGramsPerChunk int + streak int +} + +// NewTextLoopGuard creates a text loop guard. 
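+//
+// Illustrative wiring (passing zero falls back to defaultConsecutiveHits and
+// defaultMinNewGramsPerChunk; concrete thresholds are the caller's choice):
+//
+//  guard := NewTextLoopGuard(0, 0, SentialOptions{})
+//  if guard.Inspect(chunk).Abort {
+//   // stop streaming, e.g. surfacing LoopDetectedAbortMessage
+//  }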
+func NewTextLoopGuard(consecutiveHits, minNewGrams int, opts SentialOptions) *TextLoopGuard {
+ if consecutiveHits <= 0 {
+ consecutiveHits = defaultConsecutiveHits
+ }
+ if minNewGrams <= 0 {
+ minNewGrams = defaultMinNewGramsPerChunk
+ }
+ return &TextLoopGuard{
+ sential: NewSential(opts),
+ consecutiveHitsToAbort: consecutiveHits,
+ minNewGramsPerChunk: minNewGrams,
+ }
+}
+
+// Inspect checks text and tracks consecutive overlap streaks.
+func (g *TextLoopGuard) Inspect(text string) TextLoopGuardResult {
+ result := g.sential.Inspect(text)
+ if result.NewGrams >= g.minNewGramsPerChunk {
+ if result.Hit {
+ g.streak++
+ } else {
+ g.streak = 0
+ }
+ }
+ return TextLoopGuardResult{
+ SentialResult: result,
+ Streak: g.streak,
+ Abort: g.streak >= g.consecutiveHitsToAbort,
+ }
+}
+
+// Reset clears the guard state.
+func (g *TextLoopGuard) Reset() {
+ g.sential.Reset()
+ g.streak = 0
+}
+
+// --- Text Loop Probe Buffer ---
+
+// TextLoopProbeBuffer batches text into chunks before passing to an inspector.
+type TextLoopProbeBuffer struct {
+ chunkSize int
+ inspect func(string)
+ chars []rune
+ offset int
+}
+
+// NewTextLoopProbeBuffer creates a probe buffer.
+func NewTextLoopProbeBuffer(chunkSize int, inspect func(string)) *TextLoopProbeBuffer {
+ if chunkSize <= 0 {
+ chunkSize = LoopDetectedProbeChars
+ }
+ return &TextLoopProbeBuffer{
+ chunkSize: chunkSize,
+ inspect: inspect,
+ }
+}
+
+// Push adds text to the buffer, emitting full chunks to the inspector.
+func (b *TextLoopProbeBuffer) Push(text string) {
+ if text == "" {
+ return
+ }
+ b.chars = append(b.chars, []rune(text)...)
+ for len(b.chars)-b.offset >= b.chunkSize {
+ chunk := string(b.chars[b.offset : b.offset+b.chunkSize])
+ b.offset += b.chunkSize
+ if len(chunk) > 0 {
+ b.inspect(chunk)
+ }
+ }
+ if b.offset >= b.chunkSize {
+ b.chars = b.chars[b.offset:]
+ b.offset = 0
+ }
+}
+
+// Flush emits any remaining content to the inspector.
+func (b *TextLoopProbeBuffer) Flush() {
+ if len(b.chars)-b.offset > 0 {
+ remainder := string(b.chars[b.offset:])
+ if len(remainder) > 0 {
+ b.inspect(remainder)
+ }
+ }
+ b.chars = nil
+ b.offset = 0
+}
+
+// --- Tool Loop Guard ---
+
+// defaultVolatileKeys lists normalized argument keys that are ignored when
+// hashing tool inputs, since IDs and timestamps change between otherwise
+// identical calls.
+var defaultVolatileKeys = []string{
+ "toolcallid", "requestid", "traceid", "spanid",
+ "sessionid", "timestamp", "createdat", "updatedat",
+ "expiresat", "nonce",
+}
+
+var volatileKeySuffixes = []string{
+ "requestid", "traceid", "sessionid", "toolcallid",
+ "timestamp", "createdat", "updatedat", "expiresat",
+}
+
+// ToolLoopInput represents a tool call for loop detection.
+type ToolLoopInput struct {
+ ToolName string
+ Input any
+}
+
+// ToolLoopResult is the output of a tool loop inspection.
+type ToolLoopResult struct {
+ Hash string
+ RepeatCount int
+ BreachCount int
+ Warn bool
+ Abort bool
+}
+
+// ToolLoopGuard detects repeated identical tool calls.
+type ToolLoopGuard struct {
+ repeatThreshold int
+ warningsBeforeAbort int
+ volatileKeySet map[string]struct{}
+ lastHash string
+ repeatCount int
+ breachCount int
+ breachHash string
+}
+
+// NewToolLoopGuard creates a tool loop guard.
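+//
+// Sketch of the intended call pattern (illustrative; zero arguments fall back
+// to ToolLoopRepeatThreshold and ToolLoopWarningsBeforeAbort):
+//
+//  guard := NewToolLoopGuard(0, 0)
+//  res := guard.Inspect(ToolLoopInput{ToolName: "send", Input: args})
+//  if res.Warn {
+//   // inject ToolLoopWarningText into the conversation
+//  }
+//  if res.Abort {
+//   // abort with ToolLoopDetectedAbortMessage
+//  }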
+func NewToolLoopGuard(repeatThreshold, warningsBeforeAbort int) *ToolLoopGuard {
+ if repeatThreshold <= 0 {
+ repeatThreshold = ToolLoopRepeatThreshold
+ }
+ if warningsBeforeAbort <= 0 {
+ warningsBeforeAbort = ToolLoopWarningsBeforeAbort
+ }
+ volatileSet := make(map[string]struct{})
+ for _, k := range defaultVolatileKeys {
+ volatileSet[normalizeKeyName(k)] = struct{}{}
+ }
+ return &ToolLoopGuard{
+ repeatThreshold: repeatThreshold,
+ warningsBeforeAbort: warningsBeforeAbort,
+ volatileKeySet: volatileSet,
+ }
+}
+
+// Inspect checks a tool call for repetition.
+func (g *ToolLoopGuard) Inspect(input ToolLoopInput) ToolLoopResult {
+ hash := computeToolLoopHash(input, g.volatileKeySet)
+
+ if hash == g.lastHash {
+ g.repeatCount++
+ } else {
+ g.lastHash = hash
+ g.repeatCount = 1
+ }
+
+ if g.breachHash != hash {
+ g.breachHash = hash
+ g.breachCount = 0
+ }
+
+ warn := false
+ abort := false
+ if g.repeatCount > g.repeatThreshold {
+ if g.breachCount < g.warningsBeforeAbort {
+ g.breachCount++
+ warn = true
+ g.lastHash = ""
+ g.repeatCount = 0
+ } else {
+ g.breachCount++
+ abort = true
+ }
+ }
+
+ return ToolLoopResult{
+ Hash: hash,
+ RepeatCount: g.repeatCount,
+ BreachCount: g.breachCount,
+ Warn: warn,
+ Abort: abort,
+ }
+}
+
+// Reset clears the guard state.
+func (g *ToolLoopGuard) Reset() {
+ g.lastHash = ""
+ g.repeatCount = 0
+ g.breachCount = 0
+ g.breachHash = ""
+}
+
+// normalizeKeyName lowercases a key and strips all non-alphanumeric runes, so
+// "toolCallId", "tool_call_id", and "ToolCallID" all normalize identically.
+func normalizeKeyName(key string) string {
+ var b strings.Builder
+ for _, r := range strings.TrimSpace(strings.ToLower(key)) {
+ if unicode.IsLetter(r) || unicode.IsDigit(r) {
+ b.WriteRune(r)
+ }
+ }
+ return b.String()
+}
+
+func isVolatileKey(key string, volatileSet map[string]struct{}) bool {
+ normalized := normalizeKeyName(key)
+ if normalized == "" {
+ return false
+ }
+ if _, ok := volatileSet[normalized]; ok {
+ return true
+ }
+ for _, suffix := range volatileKeySuffixes {
+ if strings.HasSuffix(normalized, suffix) {
+ return true
+ }
+ }
+ return false
+}
+
+// normalizeToolLoopValue canonicalizes a tool input value for hashing: map
+// keys are sorted, volatile keys are dropped, and scalars pass through.
+func normalizeToolLoopValue(value any, volatileSet map[string]struct{}) any {
+ if value == nil {
+ return nil
+ }
+ switch v := value.(type) {
+ case string:
+ return v
+ case bool:
+ return v
+ case float64:
+ return v
+ case json.Number:
+ return v.String()
+ case []any:
+ result := make([]any, len(v))
+ for i, item := range v {
+ result[i] = normalizeToolLoopValue(item, volatileSet)
+ }
+ return result
+ case map[string]any:
+ keys := make([]string, 0, len(v))
+ for k := range v {
+ keys = append(keys, k)
+ }
+ sort.Strings(keys)
+ result := make(map[string]any, len(v))
+ for _, k := range keys {
+ if isVolatileKey(k, volatileSet) {
+ continue
+ }
+ normalized := normalizeToolLoopValue(v[k], volatileSet)
+ if normalized != nil {
+ result[k] = normalized
+ }
+ }
+ return result
+ default:
+ return fmt.Sprintf("%v", v)
+ }
+}
+
+func computeToolLoopHash(input ToolLoopInput, volatileSet map[string]struct{}) string {
+ payload := map[string]any{
+ "toolName": strings.TrimSpace(input.ToolName),
+ "input": normalizeToolLoopValue(input.Input, volatileSet),
+ }
+ serialized, _ := json.Marshal(payload)
+ h := sha256.Sum256(serialized)
+ return hex.EncodeToString(h[:])
+}
+
+// --- Text helpers ---
+
+// isNonEmptyString reports whether s contains any non-whitespace content.
+func isNonEmptyString(s string) bool {
+ return utf8.RuneCountInString(strings.TrimSpace(s)) > 0
+}
+
+// Guard inspects a single tool invocation and reports whether the caller
+// should emit a loop warning or abort the stream.
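+//
+// For example (illustrative):
+//
+//  warn, abort := guard.Guard("read", map[string]any{"path": "notes.md"})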
+func (g *ToolLoopGuard) Guard(toolName string, input any) (warn bool, abort bool) {
+ result := g.Inspect(ToolLoopInput{ToolName: toolName, Input: input})
+ return result.Warn, result.Abort
+}
diff --git a/internal/agent/stream.go b/internal/agent/stream.go
new file mode 100644
index 00000000..1c99020d
--- /dev/null
+++ b/internal/agent/stream.go
@@ -0,0 +1,48 @@
+package agent
+
+import "encoding/json"
+
+// StreamEventType identifies the kind of stream event.
+type StreamEventType string
+
+const (
+ EventAgentStart StreamEventType = "agent_start"
+ EventTextStart StreamEventType = "text_start"
+ EventTextDelta StreamEventType = "text_delta"
+ EventTextEnd StreamEventType = "text_end"
+ EventReasoningStart StreamEventType = "reasoning_start"
+ EventReasoningDelta StreamEventType = "reasoning_delta"
+ EventReasoningEnd StreamEventType = "reasoning_end"
+ EventToolCallStart StreamEventType = "tool_call_start"
+ EventToolCallEnd StreamEventType = "tool_call_end"
+ EventAttachment StreamEventType = "attachment_delta"
+ EventReaction StreamEventType = "reaction_delta"
+ EventSpeech StreamEventType = "speech_delta"
+ EventAgentEnd StreamEventType = "agent_end"
+ EventAgentAbort StreamEventType = "agent_abort"
+ EventError StreamEventType = "error"
+)
+
+// StreamEvent is emitted by the agent during streaming.
+type StreamEvent struct {
+ Type StreamEventType `json:"type"`
+ Delta string `json:"delta,omitempty"`
+ ToolName string `json:"toolName,omitempty"`
+ ToolCallID string `json:"toolCallId,omitempty"`
+ Input any `json:"input,omitempty"`
+ Result any `json:"result,omitempty"`
+ Attachments []FileAttachment `json:"attachments,omitempty"`
+ Reactions []ReactionItem `json:"reactions,omitempty"`
+ Speeches []SpeechItem `json:"speeches,omitempty"`
+ Messages json.RawMessage `json:"messages,omitempty"`
+ Usage json.RawMessage `json:"usage,omitempty"`
+ Usages json.RawMessage `json:"usages,omitempty"`
+ Skills []string `json:"skills,omitempty"`
+ Reasoning []string `json:"reasoning,omitempty"`
+ Error string `json:"error,omitempty"`
+}
+
+// IsTerminal returns true for events that signal end of stream.
+func (e StreamEvent) IsTerminal() bool {
+ return e.Type == EventAgentEnd || e.Type == EventAgentAbort
+}
diff --git a/internal/agent/tags.go b/internal/agent/tags.go
new file mode 100644
index 00000000..df26cae2
--- /dev/null
+++ b/internal/agent/tags.go
@@ -0,0 +1,275 @@
+package agent
+
+import (
+ "net/url"
+ "path/filepath"
+ "regexp"
+ "strings"
+ "unicode/utf8"
+)
+
+// TagResolver parses the inner content of a specific XML-like tag.
+type TagResolver struct {
+ Tag string
+ Parse func(content string) []any
+}
+
+// TagEvent is a parsed tag occurrence.
+type TagEvent struct {
+ Tag string
+ Data []any
+}
+
+// TagStreamResult is the output of a streaming tag extraction step.
+type TagStreamResult struct {
+ VisibleText string
+ Events []TagEvent
+}
+
+// DefaultTagResolvers returns the standard set of tag resolvers.
+func DefaultTagResolvers() []TagResolver {
+ return []TagResolver{
+ AttachmentsResolver(),
+ ReactionsResolver(),
+ SpeechResolver(),
+ }
+}
+
+// AttachmentsResolver parses <attachments> blocks into FileAttachment items.
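+//
+// For example, this block yields one file attachment and one image attachment
+// (entries starting with http:// or https:// are treated as images):
+//
+//  <attachments>
+//  - /data/media/report.pdf
+//  - https://example.com/img.png
+//  </attachments>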
+func AttachmentsResolver() TagResolver {
+ return TagResolver{
+ Tag: "attachments",
+ Parse: func(content string) []any {
+ seen := make(map[string]struct{})
+ var result []any
+ for _, line := range strings.Split(content, "\n") {
+ line = strings.TrimSpace(line)
+ if !strings.HasPrefix(line, "-") {
+ continue
+ }
+ path := strings.TrimSpace(line[1:])
+ if path == "" {
+ continue
+ }
+ if _, ok := seen[path]; ok {
+ continue
+ }
+ seen[path] = struct{}{}
+ att := FileAttachment{Path: path, Type: "file", Name: filenameFromPath(path)}
+ if strings.HasPrefix(path, "http://") || strings.HasPrefix(path, "https://") {
+ att = FileAttachment{URL: path, Type: "image", Name: filenameFromURL(path)}
+ }
+ result = append(result, att)
+ }
+ return result
+ },
+ }
+}
+
+// ReactionsResolver parses <reactions> blocks into ReactionItem items.
+func ReactionsResolver() TagResolver {
+ return TagResolver{
+ Tag: "reactions",
+ Parse: func(content string) []any {
+ seen := make(map[string]struct{})
+ var result []any
+ for _, line := range strings.Split(content, "\n") {
+ line = strings.TrimSpace(line)
+ if !strings.HasPrefix(line, "-") {
+ continue
+ }
+ emoji := strings.TrimSpace(line[1:])
+ if emoji == "" {
+ continue
+ }
+ if _, ok := seen[emoji]; ok {
+ continue
+ }
+ seen[emoji] = struct{}{}
+ result = append(result, ReactionItem{Emoji: emoji})
+ }
+ return result
+ },
+ }
+}
+
+// SpeechResolver parses <speech> blocks into SpeechItem items.
+func SpeechResolver() TagResolver {
+ return TagResolver{
+ Tag: "speech",
+ Parse: func(content string) []any {
+ text := strings.TrimSpace(content)
+ if text == "" {
+ return nil
+ }
+ return []any{SpeechItem{Text: text}}
+ },
+ }
+}
+
+// ExtractTagsFromText extracts and removes all tag blocks from a complete string.
+func ExtractTagsFromText(text string, resolvers []TagResolver) (string, []TagEvent) {
+ var events []TagEvent
+ cleaned := text
+ for _, r := range resolvers {
+ open := "<" + r.Tag + ">"
+ closeTag := "</" + r.Tag + ">"
+ pattern := regexp.MustCompile(regexp.QuoteMeta(open) + `([\s\S]*?)` + regexp.QuoteMeta(closeTag))
+ cleaned = pattern.ReplaceAllStringFunc(cleaned, func(match string) string {
+ inner := match[len(open) : len(match)-len(closeTag)]
+ parsed := r.Parse(inner)
+ if len(parsed) > 0 {
+ events = append(events, TagEvent{Tag: r.Tag, Data: parsed})
+ }
+ return ""
+ })
+ }
+ cleaned = regexp.MustCompile(`\n{3,}`).ReplaceAllString(cleaned, "\n\n")
+ cleaned = strings.TrimSpace(cleaned)
+ return cleaned, events
+}
+
+// StreamTagExtractor is an incremental state machine that intercepts tag blocks
+// from a stream of text deltas.
+type StreamTagExtractor struct {
+ metas []resolverMeta
+ maxOpenLen int
+ state int // 0 = text, 1 = inside a tag block
+ activeMeta *resolverMeta
+ buffer string
+ tagBuffer string
+}
+
+type resolverMeta struct {
+ resolver TagResolver
+ openTag string
+ closeTag string
+}
+
+// NewStreamTagExtractor creates a new streaming tag extractor.
+func NewStreamTagExtractor(resolvers []TagResolver) *StreamTagExtractor {
+ metas := make([]resolverMeta, len(resolvers))
+ maxOpenLen := 0
+ for i, r := range resolvers {
+ open := "<" + r.Tag + ">"
+ closeTag := "</" + r.Tag + ">"
+ metas[i] = resolverMeta{resolver: r, openTag: open, closeTag: closeTag}
+ if len(open) > maxOpenLen {
+ maxOpenLen = len(open)
+ }
+ }
+ return &StreamTagExtractor{
+ metas: metas,
+ maxOpenLen: maxOpenLen,
+ }
+}
+
+// safeUTF8SplitIndex adjusts a byte split index so it does not fall in the
+// middle of a multi-byte UTF-8 character. It backs up to the start of the
+// rune that contains idx, guaranteeing both halves are valid UTF-8.
+func safeUTF8SplitIndex(s string, idx int) int {
+ if idx <= 0 || idx >= len(s) {
+ return idx
+ }
+ for idx > 0 && !utf8.RuneStart(s[idx]) {
+ idx--
+ }
+ return idx
+}
+
+// Push processes a text delta and returns visible text and any completed tag events.
+func (e *StreamTagExtractor) Push(delta string) TagStreamResult {
+ e.buffer += delta
+ var events []TagEvent
+
+ var visibleSb strings.Builder
+ for len(e.buffer) > 0 {
+ if e.state == 0 { // text
+ earliestIdx := -1
+ var matchedMeta *resolverMeta
+ for i := range e.metas {
+ idx := strings.Index(e.buffer, e.metas[i].openTag)
+ if idx != -1 && (earliestIdx == -1 || idx < earliestIdx) {
+ earliestIdx = idx
+ matchedMeta = &e.metas[i]
+ }
+ }
+ if earliestIdx == -1 {
+ // Hold back a tail shorter than the longest open tag in case a tag
+ // starts at the chunk boundary; emit the rest as visible text.
+ keep := e.maxOpenLen - 1
+ if keep > len(e.buffer) {
+ keep = len(e.buffer)
+ }
+ splitAt := safeUTF8SplitIndex(e.buffer, len(e.buffer)-keep)
+ emit := e.buffer[:splitAt]
+ visibleSb.WriteString(emit)
+ e.buffer = e.buffer[splitAt:]
+ break
+ }
+ visibleSb.WriteString(e.buffer[:earliestIdx])
+ e.buffer = e.buffer[earliestIdx+len(matchedMeta.openTag):]
+ e.tagBuffer = ""
+ e.activeMeta = matchedMeta
+ e.state = 1
+ continue
+ }
+
+ // state == 1 (inside a tag block)
+ closeTag := e.activeMeta.closeTag
+ endIdx := strings.Index(e.buffer, closeTag)
+ if endIdx == -1 {
+ keep := len(closeTag) - 1
+ if keep > len(e.buffer) {
+ keep = len(e.buffer)
+ }
+ splitAt := safeUTF8SplitIndex(e.buffer, len(e.buffer)-keep)
+ take := e.buffer[:splitAt]
+ e.tagBuffer += take
+ e.buffer = e.buffer[splitAt:]
+ break
+ }
+ e.tagBuffer += e.buffer[:endIdx]
+ parsed := e.activeMeta.resolver.Parse(e.tagBuffer)
+ if len(parsed) > 0 {
+ events = append(events, TagEvent{Tag: e.activeMeta.resolver.Tag, Data: parsed})
+ }
+ e.buffer = e.buffer[endIdx+len(closeTag):]
+ e.tagBuffer = ""
+ e.activeMeta = nil
+ e.state = 0
+ }
+
+ return TagStreamResult{VisibleText: visibleSb.String(), Events: events}
+}
+
+// FlushRemainder flushes any remaining buffered content. Call when the stream ends.
+func (e *StreamTagExtractor) FlushRemainder() TagStreamResult {
+ if e.state == 0 {
+ out := e.buffer
+ e.buffer = ""
+ return TagStreamResult{VisibleText: out}
+ }
+ // An unterminated tag block is returned verbatim, including its open tag.
+ out := e.activeMeta.openTag + e.tagBuffer + e.buffer
+ e.state = 0
+ e.buffer = ""
+ e.tagBuffer = ""
+ e.activeMeta = nil
+ return TagStreamResult{VisibleText: out}
+}
+
+func filenameFromPath(p string) string {
+ return filepath.Base(p)
+}
+
+func filenameFromURL(rawURL string) string {
+ u, err := url.Parse(rawURL)
+ if err != nil {
+ return ""
+ }
+ base := filepath.Base(u.Path)
+ if base == "."
|| base == "/" { + return "" + } + return base +} diff --git a/internal/mcp/providers/browser/provider.go b/internal/agent/tools/browser.go similarity index 60% rename from internal/mcp/providers/browser/provider.go rename to internal/agent/tools/browser.go index 00fc7a4b..2ebf6c56 100644 --- a/internal/mcp/providers/browser/provider.go +++ b/internal/agent/tools/browser.go @@ -1,10 +1,11 @@ -package browser +package tools import ( "bytes" "context" "encoding/base64" "encoding/json" + "errors" "fmt" "io" "log/slog" @@ -12,19 +13,15 @@ import ( "strings" "time" + sdk "github.com/memohai/twilight-ai/sdk" + "github.com/memohai/memoh/internal/browsercontexts" "github.com/memohai/memoh/internal/config" - mcpgw "github.com/memohai/memoh/internal/mcp" "github.com/memohai/memoh/internal/settings" "github.com/memohai/memoh/internal/workspace/bridge" ) -const ( - toolBrowserAction = "browser_action" - toolBrowserObserve = "browser_observe" -) - -type Executor struct { +type BrowserProvider struct { logger *slog.Logger settings *settings.Service browserContexts *browsercontexts.Service @@ -33,12 +30,12 @@ type Executor struct { httpClient *http.Client } -func NewExecutor(log *slog.Logger, settingsSvc *settings.Service, browserSvc *browsercontexts.Service, containers bridge.Provider, gatewayCfg config.BrowserGatewayConfig) *Executor { +func NewBrowserProvider(log *slog.Logger, settingsSvc *settings.Service, browserSvc *browsercontexts.Service, containers bridge.Provider, gatewayCfg config.BrowserGatewayConfig) *BrowserProvider { if log == nil { log = slog.Default() } - return &Executor{ - logger: log.With(slog.String("provider", "browser_tool")), + return &BrowserProvider{ + logger: log.With(slog.String("tool", "browser")), settings: settingsSvc, browserContexts: browserSvc, containers: containers, @@ -47,26 +44,27 @@ func NewExecutor(log *slog.Logger, settingsSvc *settings.Service, browserSvc *br } } -func (e *Executor) ListTools(ctx context.Context, session mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - if e.settings == nil || e.browserContexts == nil { - return []mcpgw.ToolDescriptor{}, nil +func (p *BrowserProvider) Tools(ctx context.Context, session SessionContext) ([]sdk.Tool, error) { + if p.settings == nil || p.browserContexts == nil { + return nil, nil } botID := strings.TrimSpace(session.BotID) if botID == "" { - return []mcpgw.ToolDescriptor{}, nil + return nil, nil } - botSettings, err := e.settings.GetBot(ctx, botID) + botSettings, err := p.settings.GetBot(ctx, botID) if err != nil { - return []mcpgw.ToolDescriptor{}, nil + return nil, nil } if strings.TrimSpace(botSettings.BrowserContextID) == "" { - return []mcpgw.ToolDescriptor{}, nil + return nil, nil } - return []mcpgw.ToolDescriptor{ + sess := session + return []sdk.Tool{ { - Name: toolBrowserAction, + Name: "browser_action", Description: "Execute a browser action: navigate, click, double-click, focus, type, fill, press key, keyboard input, hover, select option, check/uncheck, scroll, drag-and-drop, upload files, go back/forward, reload, wait, or manage tabs (new/select/close).", - InputSchema: map[string]any{ + Parameters: map[string]any{ "type": "object", "properties": map[string]any{ "action": map[string]any{"type": "string", "enum": []string{"navigate", "click", "dblclick", "focus", "type", "fill", "press", "keyboard_type", "keyboard_inserttext", "keydown", "keyup", "hover", "select", "check", "uncheck", "scroll", "scrollintoview", "drag", "upload", "wait", "go_back", "go_forward", "reload", "tab_new", 
"tab_select", "tab_close"}, "description": "The browser action to perform"}, @@ -84,11 +82,14 @@ func (e *Executor) ListTools(ctx context.Context, session mcpgw.ToolSessionConte }, "required": []string{"action"}, }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execAction(ctx.Context, sess, inputAsMap(input)) + }, }, { - Name: toolBrowserObserve, + Name: "browser_observe", Description: "Observe the current browser page: take screenshot (optionally annotated with numbered element labels or full-page), get accessibility tree snapshot, get text content, get HTML, evaluate JavaScript, get current URL, get page title, export PDF, or list open tabs.", - InputSchema: map[string]any{ + Parameters: map[string]any{ "type": "object", "properties": map[string]any{ "observe": map[string]any{"type": "string", "enum": []string{"screenshot", "screenshot_annotate", "snapshot", "get_content", "get_html", "evaluate", "get_url", "get_title", "pdf", "tab_list"}, "description": "What to observe from the page"}, @@ -98,60 +99,104 @@ func (e *Executor) ListTools(ctx context.Context, session mcpgw.ToolSessionConte }, "required": []string{"observe"}, }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execObserve(ctx.Context, sess, inputAsMap(input)) + }, }, }, nil } -func (e *Executor) CallTool(ctx context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - if e.settings == nil || e.browserContexts == nil { - return mcpgw.BuildToolErrorResult("browser tools are not available"), nil - } - botID := strings.TrimSpace(session.BotID) - if botID == "" { - return mcpgw.BuildToolErrorResult("bot_id is required"), nil - } - - botSettings, err := e.settings.GetBot(ctx, botID) +func (p *BrowserProvider) resolveContext(ctx context.Context, botID string) (string, browsercontexts.BrowserContext, error) { + botSettings, err := p.settings.GetBot(ctx, botID) if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil + return "", browsercontexts.BrowserContext{}, err } browserCtxID := strings.TrimSpace(botSettings.BrowserContextID) if browserCtxID == "" { - return mcpgw.BuildToolErrorResult("browser context not configured for this bot"), nil + return "", browsercontexts.BrowserContext{}, errors.New("browser context not configured for this bot") } - - bcConfig, err := e.browserContexts.GetByID(ctx, browserCtxID) + bcConfig, err := p.browserContexts.GetByID(ctx, browserCtxID) if err != nil { - return mcpgw.BuildToolErrorResult("failed to load browser context config: " + err.Error()), nil + return "", browsercontexts.BrowserContext{}, fmt.Errorf("failed to load browser context config: %s", err.Error()) } - - if err := e.ensureContext(ctx, browserCtxID, bcConfig); err != nil { - return mcpgw.BuildToolErrorResult("failed to ensure browser context: " + err.Error()), nil - } - - switch toolName { - case toolBrowserAction: - return e.callAction(ctx, botID, browserCtxID, arguments) - case toolBrowserObserve: - return e.callObserve(ctx, botID, browserCtxID, arguments) - default: - return nil, mcpgw.ErrToolNotFound + if err := p.ensureContext(ctx, browserCtxID, bcConfig); err != nil { + return "", browsercontexts.BrowserContext{}, fmt.Errorf("failed to ensure browser context: %s", err.Error()) } + return browserCtxID, bcConfig, nil } -func (e *Executor) ensureContext(ctx context.Context, contextID string, bc browsercontexts.BrowserContext) error { - existsURL := fmt.Sprintf("%s/context/%s/exists", 
e.gatewayBaseURL, contextID) +func (p *BrowserProvider) execAction(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID := strings.TrimSpace(session.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + contextID, _, err := p.resolveContext(ctx, botID) + if err != nil { + return nil, err + } + action := StringArg(args, "action") + if action == "" { + return nil, errors.New("action is required") + } + payload := map[string]any{"action": action} + for _, key := range []string{"url", "selector", "text", "key", "value", "target_selector", "direction"} { + if v := StringArg(args, key); v != "" { + payload[key] = v + } + } + if v, ok, _ := IntArg(args, "timeout"); ok { + payload["timeout"] = v + } + if v, ok, _ := IntArg(args, "amount"); ok { + payload["amount"] = v + } + if v, ok, _ := IntArg(args, "tab_index"); ok { + payload["tab_index"] = v + } + if files, ok := args["files"].([]any); ok && len(files) > 0 { + payload["files"] = files + } + return p.doGatewayAction(ctx, botID, contextID, payload) +} + +func (p *BrowserProvider) execObserve(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID := strings.TrimSpace(session.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + contextID, _, err := p.resolveContext(ctx, botID) + if err != nil { + return nil, err + } + observe := StringArg(args, "observe") + if observe == "" { + return nil, errors.New("observe is required") + } + payload := map[string]any{"action": observe} + if v := StringArg(args, "selector"); v != "" { + payload["selector"] = v + } + if v := StringArg(args, "script"); v != "" { + payload["script"] = v + } + if v, ok := args["full_page"].(bool); ok { + payload["full_page"] = v + } + return p.doGatewayAction(ctx, botID, contextID, payload) +} + +func (p *BrowserProvider) ensureContext(ctx context.Context, contextID string, bc browsercontexts.BrowserContext) error { + existsURL := fmt.Sprintf("%s/context/%s/exists", p.gatewayBaseURL, contextID) req, err := http.NewRequestWithContext(ctx, http.MethodGet, existsURL, nil) if err != nil { return err } - resp, err := e.httpClient.Do(req) //nolint:gosec // URL from internal gateway config + resp, err := p.httpClient.Do(req) //nolint:gosec if err != nil { return fmt.Errorf("browser gateway unreachable: %w", err) } defer func() { _ = resp.Body.Close() }() body, _ := io.ReadAll(resp.Body) - var existsResp struct { Exists bool `json:"exists"` } @@ -161,19 +206,14 @@ func (e *Executor) ensureContext(ctx context.Context, contextID string, bc brows if existsResp.Exists { return nil } - - createPayload, _ := json.Marshal(map[string]any{ - "id": contextID, - "name": bc.Name, - "config": bc.Config, - }) - createURL := fmt.Sprintf("%s/context", e.gatewayBaseURL) + createPayload, _ := json.Marshal(map[string]any{"id": contextID, "name": bc.Name, "config": bc.Config}) + createURL := fmt.Sprintf("%s/context", p.gatewayBaseURL) createReq, err := http.NewRequestWithContext(ctx, http.MethodPost, createURL, bytes.NewReader(createPayload)) if err != nil { return err } createReq.Header.Set("Content-Type", "application/json") - createResp, err := e.httpClient.Do(createReq) //nolint:gosec // URL from internal gateway config + createResp, err := p.httpClient.Do(createReq) //nolint:gosec if err != nil { return fmt.Errorf("failed to create browser context: %w", err) } @@ -185,96 +225,47 @@ func (e *Executor) ensureContext(ctx context.Context, contextID string, bc brows return nil } -func (e 
*Executor) callAction(ctx context.Context, botID, contextID string, arguments map[string]any) (map[string]any, error) { - action := mcpgw.StringArg(arguments, "action") - if action == "" { - return mcpgw.BuildToolErrorResult("action is required"), nil - } - payload := map[string]any{"action": action} - for _, key := range []string{"url", "selector", "text", "key", "value", "target_selector", "direction"} { - if v := mcpgw.StringArg(arguments, key); v != "" { - payload[key] = v - } - } - if v, ok, _ := mcpgw.IntArg(arguments, "timeout"); ok { - payload["timeout"] = v - } - if v, ok, _ := mcpgw.IntArg(arguments, "amount"); ok { - payload["amount"] = v - } - if v, ok, _ := mcpgw.IntArg(arguments, "tab_index"); ok { - payload["tab_index"] = v - } - if files, ok := arguments["files"].([]any); ok && len(files) > 0 { - payload["files"] = files - } - return e.doGatewayAction(ctx, botID, contextID, payload) -} - -func (e *Executor) callObserve(ctx context.Context, botID, contextID string, arguments map[string]any) (map[string]any, error) { - observe := mcpgw.StringArg(arguments, "observe") - if observe == "" { - return mcpgw.BuildToolErrorResult("observe is required"), nil - } - payload := map[string]any{"action": observe} - if v := mcpgw.StringArg(arguments, "selector"); v != "" { - payload["selector"] = v - } - if v := mcpgw.StringArg(arguments, "script"); v != "" { - payload["script"] = v - } - if v, ok := arguments["full_page"].(bool); ok { - payload["full_page"] = v - } - return e.doGatewayAction(ctx, botID, contextID, payload) -} - -func (e *Executor) doGatewayAction(ctx context.Context, botID, contextID string, payload map[string]any) (map[string]any, error) { +func (p *BrowserProvider) doGatewayAction(ctx context.Context, botID, contextID string, payload map[string]any) (any, error) { body, _ := json.Marshal(payload) - actionURL := fmt.Sprintf("%s/context/%s/action", e.gatewayBaseURL, contextID) + actionURL := fmt.Sprintf("%s/context/%s/action", p.gatewayBaseURL, contextID) req, err := http.NewRequestWithContext(ctx, http.MethodPost, actionURL, bytes.NewReader(body)) if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil + return nil, err } req.Header.Set("Content-Type", "application/json") - - resp, err := e.httpClient.Do(req) //nolint:gosec // URL from internal gateway config + resp, err := p.httpClient.Do(req) //nolint:gosec if err != nil { - return mcpgw.BuildToolErrorResult("browser gateway request failed: " + err.Error()), nil + return nil, fmt.Errorf("browser gateway request failed: %s", err.Error()) } defer func() { _ = resp.Body.Close() }() respBody, _ := io.ReadAll(resp.Body) - var gwResp struct { Success bool `json:"success"` Data map[string]any `json:"data"` Error string `json:"error"` } if err := json.Unmarshal(respBody, &gwResp); err != nil { - return mcpgw.BuildToolErrorResult("invalid gateway response"), nil + return nil, errors.New("invalid gateway response") } if !gwResp.Success { errMsg := gwResp.Error if errMsg == "" { errMsg = "browser action failed" } - return mcpgw.BuildToolErrorResult(errMsg), nil + return nil, fmt.Errorf("%s", errMsg) } - if b64, ok := gwResp.Data["screenshot"].(string); ok && b64 != "" { - return e.buildScreenshotResult(ctx, botID, b64), nil + return p.buildScreenshotResult(ctx, botID, b64), nil } - return mcpgw.BuildToolSuccessResult(gwResp.Data), nil + return gwResp.Data, nil } -const screenshotContainerDir = "/data/browser-screenshots" +const browserScreenshotDir = "/data/browser-screenshots" -func (e *Executor) 
buildScreenshotResult(ctx context.Context, botID, base64Data string) map[string]any { +func (p *BrowserProvider) buildScreenshotResult(ctx context.Context, botID, base64Data string) any { mimeType := "image/png" - imgBytes, err := base64.StdEncoding.DecodeString(base64Data) if err != nil { - e.logger.Warn("failed to decode screenshot base64", slog.Any("error", err)) return map[string]any{ "content": []map[string]any{ {"type": "text", "text": "Screenshot captured (failed to decode for saving)"}, @@ -282,12 +273,9 @@ func (e *Executor) buildScreenshotResult(ctx context.Context, botID, base64Data }, } } - - containerPath := fmt.Sprintf("%s/%d.png", screenshotContainerDir, time.Now().UnixMilli()) - - client, clientErr := e.containers.MCPClient(ctx, botID) + containerPath := fmt.Sprintf("%s/%d.png", browserScreenshotDir, time.Now().UnixMilli()) + client, clientErr := p.containers.MCPClient(ctx, botID) if clientErr != nil { - e.logger.Warn("container not reachable for screenshot save", slog.String("bot_id", botID), slog.Any("error", clientErr)) return map[string]any{ "content": []map[string]any{ {"type": "text", "text": "Screenshot captured (container not reachable, not saved to disk)"}, @@ -295,12 +283,9 @@ func (e *Executor) buildScreenshotResult(ctx context.Context, botID, base64Data }, } } - - mkdirCmd := fmt.Sprintf("mkdir -p %s", screenshotContainerDir) + mkdirCmd := fmt.Sprintf("mkdir -p %s", browserScreenshotDir) _, _ = client.Exec(ctx, mkdirCmd, "/", 5) - if writeErr := client.WriteFile(ctx, containerPath, imgBytes); writeErr != nil { - e.logger.Warn("failed to write screenshot to container", slog.String("bot_id", botID), slog.Any("error", writeErr)) return map[string]any{ "content": []map[string]any{ {"type": "text", "text": fmt.Sprintf("Screenshot captured (failed to save: %s)", writeErr.Error())}, @@ -308,7 +293,6 @@ func (e *Executor) buildScreenshotResult(ctx context.Context, botID, base64Data }, } } - return map[string]any{ "content": []map[string]any{ {"type": "text", "text": fmt.Sprintf("Screenshot saved to %s", containerPath)}, diff --git a/internal/agent/tools/contacts.go b/internal/agent/tools/contacts.go new file mode 100644 index 00000000..1ce93449 --- /dev/null +++ b/internal/agent/tools/contacts.go @@ -0,0 +1,94 @@ +package tools + +import ( + "context" + "errors" + "log/slog" + "strings" + + sdk "github.com/memohai/twilight-ai/sdk" + + "github.com/memohai/memoh/internal/channel/route" +) + +type ContactsProvider struct { + routeService route.Service + logger *slog.Logger +} + +func NewContactsProvider(log *slog.Logger, routeService route.Service) *ContactsProvider { + if log == nil { + log = slog.Default() + } + return &ContactsProvider{ + routeService: routeService, + logger: log.With(slog.String("tool", "contacts")), + } +} + +func (p *ContactsProvider) Tools(_ context.Context, session SessionContext) ([]sdk.Tool, error) { + if p.routeService == nil { + return nil, nil + } + sess := session + return []sdk.Tool{ + { + Name: "get_contacts", + Description: "List all known contacts and conversations for the current bot. Returns platform, conversation type, reply target, and metadata for each route.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "platform": map[string]any{ + "type": "string", + "description": "Filter by channel platform (e.g. telegram, feishu). 
Returns all platforms when omitted.", + }, + }, + "required": []string{}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + args := inputAsMap(input) + botID := strings.TrimSpace(sess.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + routes, err := p.routeService.List(ctx.Context, botID) + if err != nil { + return nil, err + } + platformFilter := strings.ToLower(strings.TrimSpace(FirstStringArg(args, "platform"))) + contacts := make([]map[string]any, 0, len(routes)) + for _, r := range routes { + if platformFilter != "" && !strings.EqualFold(r.Platform, platformFilter) { + continue + } + entry := map[string]any{ + "route_id": r.ID, + "platform": r.Platform, + "conversation_type": r.ConversationType, + "target": r.ReplyTarget, + "conversation_id": r.ConversationID, + "last_active": r.UpdatedAt.Format("2006-01-02T15:04:05Z"), + } + if len(r.Metadata) > 0 { + if v, ok := r.Metadata["conversation_name"].(string); ok && v != "" { + entry["display_name"] = v + } else if v, ok := r.Metadata["sender_display_name"].(string); ok && v != "" { + entry["display_name"] = v + } + if v, ok := r.Metadata["sender_username"].(string); ok && v != "" { + entry["username"] = v + } + entry["metadata"] = r.Metadata + } + contacts = append(contacts, entry) + } + return map[string]any{ + "ok": true, + "bot_id": botID, + "count": len(contacts), + "contacts": contacts, + }, nil + }, + }, + }, nil +} diff --git a/internal/agent/tools/container.go b/internal/agent/tools/container.go new file mode 100644 index 00000000..b172df11 --- /dev/null +++ b/internal/agent/tools/container.go @@ -0,0 +1,284 @@ +package tools + +import ( + "context" + "errors" + "fmt" + "io" + "log/slog" + "math" + "strings" + + sdk "github.com/memohai/twilight-ai/sdk" + + "github.com/memohai/memoh/internal/workspace/bridge" +) + +const defaultContainerExecWorkDir = "/data" + +type ContainerProvider struct { + clients bridge.Provider + execWorkDir string + logger *slog.Logger +} + +func NewContainerProvider(log *slog.Logger, clients bridge.Provider, execWorkDir string) *ContainerProvider { + if log == nil { + log = slog.Default() + } + wd := strings.TrimSpace(execWorkDir) + if wd == "" { + wd = defaultContainerExecWorkDir + } + return &ContainerProvider{clients: clients, execWorkDir: wd, logger: log.With(slog.String("tool", "container"))} +} + +func (p *ContainerProvider) Tools(_ context.Context, session SessionContext) ([]sdk.Tool, error) { + wd := p.execWorkDir + sess := session + return []sdk.Tool{ + { + Name: "read", + Description: fmt.Sprintf("Read file content inside the bot container. Supports pagination for large files. Max %d lines / %d bytes per call.", readMaxLines, readMaxBytes), + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "path": map[string]any{"type": "string", "description": fmt.Sprintf("File path (relative to %s or absolute inside container)", wd)}, + "line_offset": map[string]any{"type": "integer", "description": "Line number to start reading from (1-indexed). Default: 1.", "minimum": 1, "default": 1}, + "n_lines": map[string]any{"type": "integer", "description": fmt.Sprintf("Number of lines to read per call. Default: %d. 
Max: %d.", readMaxLines, readMaxLines), "minimum": 1, "maximum": readMaxLines, "default": readMaxLines}, + }, + "required": []string{"path"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execRead(ctx.Context, sess, inputAsMap(input)) + }, + }, + { + Name: "write", + Description: "Write file content inside the bot container.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "path": map[string]any{"type": "string", "description": fmt.Sprintf("File path (relative to %s or absolute inside container)", wd)}, + "content": map[string]any{"type": "string", "description": "File content"}, + }, + "required": []string{"path", "content"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execWrite(ctx.Context, sess, inputAsMap(input)) + }, + }, + { + Name: "list", + Description: "List directory entries inside the bot container.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "path": map[string]any{"type": "string", "description": fmt.Sprintf("Directory path (relative to %s or absolute inside container)", wd)}, + "recursive": map[string]any{"type": "boolean", "description": "List recursively"}, + }, + "required": []string{"path"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execList(ctx.Context, sess, inputAsMap(input)) + }, + }, + { + Name: "edit", + Description: "Replace exact text in a file inside the bot container.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "path": map[string]any{"type": "string", "description": fmt.Sprintf("File path (relative to %s or absolute inside container)", wd)}, + "old_text": map[string]any{"type": "string", "description": "Exact text to find"}, + "new_text": map[string]any{"type": "string", "description": "Replacement text"}, + }, + "required": []string{"path", "old_text", "new_text"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execEdit(ctx.Context, sess, inputAsMap(input)) + }, + }, + { + Name: "exec", + Description: fmt.Sprintf("Execute a command in the bot container. Runs in the bot's data directory (%s) by default.", wd), + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "command": map[string]any{"type": "string", "description": "Shell command to run (e.g. ls -la, cat file.txt)"}, + "work_dir": map[string]any{"type": "string", "description": fmt.Sprintf("Working directory inside the container (default: %s)", wd)}, + }, + "required": []string{"command"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execExec(ctx.Context, sess, inputAsMap(input)) + }, + }, + }, nil +} + +func (p *ContainerProvider) normalizePath(path string) string { + path = strings.TrimSpace(path) + if path == "" { + return path + } + prefix := p.execWorkDir + if prefix == "" { + prefix = defaultContainerExecWorkDir + } + if path == prefix { + return "." 
+ } + if strings.HasPrefix(path, prefix+"/") { + return strings.TrimLeft(strings.TrimPrefix(path, prefix+"/"), "/") + } + return path +} + +func (p *ContainerProvider) getClient(ctx context.Context, botID string) (*bridge.Client, error) { + botID = strings.TrimSpace(botID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + client, err := p.clients.MCPClient(ctx, botID) + if err != nil { + return nil, fmt.Errorf("container not reachable: %w", err) + } + return client, nil +} + +func (p *ContainerProvider) execRead(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + client, err := p.getClient(ctx, session.BotID) + if err != nil { + return nil, err + } + filePath := p.normalizePath(StringArg(args, "path")) + if filePath == "" { + return nil, errors.New("path is required") + } + lineOffset := int32(1) + if offset, ok, err := IntArg(args, "line_offset"); err != nil { + return nil, fmt.Errorf("invalid line_offset: %w", err) + } else if ok { + if offset < 1 { + return nil, errors.New("line_offset must be >= 1") + } + if offset > math.MaxInt32 { + return nil, errors.New("line_offset exceeds maximum") + } + lineOffset = int32(offset) + } + nLines := int32(readMaxLines) + if n, ok, err := IntArg(args, "n_lines"); err != nil { + return nil, fmt.Errorf("invalid n_lines: %w", err) + } else if ok { + if n < 1 { + return nil, errors.New("n_lines must be >= 1") + } + if n > readMaxLines { + n = readMaxLines + } + nLines = int32(n) //nolint:gosec // bounded by readMaxLines + } + resp, err := client.ReadFile(ctx, filePath, lineOffset, nLines) + if err != nil { + return nil, err + } + if resp.GetBinary() { + return nil, errors.New("file appears to be binary. Read tool only supports text files") + } + return map[string]any{"content": resp.GetContent(), "total_lines": resp.GetTotalLines()}, nil +} + +func (p *ContainerProvider) execWrite(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + client, err := p.getClient(ctx, session.BotID) + if err != nil { + return nil, err + } + filePath := p.normalizePath(StringArg(args, "path")) + content := StringArg(args, "content") + if filePath == "" { + return nil, errors.New("path is required") + } + if err := client.WriteFile(ctx, filePath, []byte(content)); err != nil { + return nil, err + } + return map[string]any{"ok": true}, nil +} + +func (p *ContainerProvider) execList(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + client, err := p.getClient(ctx, session.BotID) + if err != nil { + return nil, err + } + dirPath := p.normalizePath(StringArg(args, "path")) + if dirPath == "" { + dirPath = "." 
+ } + recursive, _, _ := BoolArg(args, "recursive") + entries, err := client.ListDir(ctx, dirPath, recursive) + if err != nil { + return nil, err + } + entriesMaps := make([]map[string]any, len(entries)) + for i, e := range entries { + entriesMaps[i] = map[string]any{ + "path": e.GetPath(), "is_dir": e.GetIsDir(), "size": e.GetSize(), + "mode": e.GetMode(), "mod_time": e.GetModTime(), + } + } + return map[string]any{"path": dirPath, "entries": entriesMaps}, nil +} + +func (p *ContainerProvider) execEdit(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + client, err := p.getClient(ctx, session.BotID) + if err != nil { + return nil, err + } + filePath := p.normalizePath(StringArg(args, "path")) + oldText := StringArg(args, "old_text") + newText := StringArg(args, "new_text") + if filePath == "" || oldText == "" { + return nil, errors.New("path, old_text and new_text are required") + } + reader, err := client.ReadRaw(ctx, filePath) + if err != nil { + return nil, err + } + defer func() { _ = reader.Close() }() + raw, err := io.ReadAll(reader) + if err != nil { + return nil, err + } + updated, err := applyEdit(string(raw), filePath, oldText, newText) + if err != nil { + return nil, err + } + if err := client.WriteFile(ctx, filePath, []byte(updated)); err != nil { + return nil, err + } + return map[string]any{"ok": true}, nil +} + +func (p *ContainerProvider) execExec(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID := strings.TrimSpace(session.BotID) + client, err := p.getClient(ctx, botID) + if err != nil { + return nil, err + } + command := strings.TrimSpace(StringArg(args, "command")) + if command == "" { + return nil, errors.New("command is required") + } + workDir := strings.TrimSpace(StringArg(args, "work_dir")) + if workDir == "" { + workDir = p.execWorkDir + } + result, err := client.Exec(ctx, command, workDir, 30) + if err != nil { + return nil, err + } + stdout := pruneToolOutputText(result.Stdout, "tool result (exec stdout)") + stderr := pruneToolOutputText(result.Stderr, "tool result (exec stderr)") + return map[string]any{"stdout": stdout, "stderr": stderr, "exit_code": result.ExitCode}, nil +} diff --git a/internal/agent/tools/email.go b/internal/agent/tools/email.go new file mode 100644 index 00000000..0df4cbe4 --- /dev/null +++ b/internal/agent/tools/email.go @@ -0,0 +1,273 @@ +package tools + +import ( + "context" + "errors" + "log/slog" + "math" + "strconv" + "strings" + + sdk "github.com/memohai/twilight-ai/sdk" + + "github.com/memohai/memoh/internal/email" +) + +type EmailProvider struct { + logger *slog.Logger + service *email.Service + manager *email.Manager +} + +func NewEmailProvider(log *slog.Logger, service *email.Service, manager *email.Manager) *EmailProvider { + return &EmailProvider{ + logger: log.With(slog.String("tool", "email")), + service: service, + manager: manager, + } +} + +func (p *EmailProvider) Tools(_ context.Context, session SessionContext) ([]sdk.Tool, error) { + sess := session + return []sdk.Tool{ + { + Name: "list_email_accounts", Description: "List the email accounts (provider bindings) configured for this bot, including provider IDs, email addresses, and permissions.", + Parameters: emptyObjectSchema(), + Execute: func(ctx *sdk.ToolExecContext, _ any) (any, error) { + return p.execListAccounts(ctx.Context, sess) + }, + }, + { + Name: "send_email", Description: "Send an email via the bot's configured email provider. 
Requires write permission.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "to": map[string]any{"type": "string", "description": "Recipient email address(es), comma-separated"}, + "subject": map[string]any{"type": "string", "description": "Email subject"}, + "body": map[string]any{"type": "string", "description": "Email body content"}, + "html": map[string]any{"type": "boolean", "description": "Whether body is HTML (default false)"}, + "provider_id": map[string]any{"type": "string", "description": "Email provider ID to send from (optional, uses default if omitted)"}, + }, + "required": []string{"to", "subject", "body"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execSendEmail(ctx.Context, sess, inputAsMap(input)) + }, + }, + { + Name: "list_email", Description: "List emails from the mailbox (newest first). Supports pagination. Requires read permission.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "page": map[string]any{"type": "integer", "description": "Page number, 0-based (default 0 = newest)"}, + "page_size": map[string]any{"type": "integer", "description": "Emails per page (default 20)"}, + "provider_id": map[string]any{"type": "string", "description": "Email provider ID (optional, uses first readable binding)"}, + }, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execListEmails(ctx.Context, sess, inputAsMap(input)) + }, + }, + { + Name: "read_email", Description: "Read the full content of an email by its UID. Requires read permission.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "uid": map[string]any{"type": "integer", "description": "The email UID from email_list results"}, + "provider_id": map[string]any{"type": "string", "description": "Email provider ID (optional, uses first readable binding)"}, + }, + "required": []string{"uid"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execReadEmail(ctx.Context, sess, inputAsMap(input)) + }, + }, + }, nil +} + +func (p *EmailProvider) getBindings(ctx context.Context, botID string) ([]email.BindingResponse, error) { + bindings, err := p.service.ListBindings(ctx, botID) + if err != nil || len(bindings) == 0 { + return nil, errors.New("no email binding configured for this bot") + } + return bindings, nil +} + +func (p *EmailProvider) execListAccounts(ctx context.Context, session SessionContext) (any, error) { + botID := strings.TrimSpace(session.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + bindings, err := p.getBindings(ctx, botID) + if err != nil { + return nil, err + } + accounts := make([]map[string]any, 0, len(bindings)) + for _, b := range bindings { + accounts = append(accounts, map[string]any{ + "provider_id": b.EmailProviderID, "email_address": b.EmailAddress, + "can_read": b.CanRead, "can_write": b.CanWrite, "can_delete": b.CanDelete, + }) + } + return map[string]any{"accounts": accounts}, nil +} + +func (p *EmailProvider) execSendEmail(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID := strings.TrimSpace(session.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + bindings, err := p.getBindings(ctx, botID) + if err != nil { + return nil, err + } + binding := resolveWriteBinding(bindings, StringArg(args, "provider_id")) + if binding == nil { + return nil, errors.New("email write permission denied or 
provider not found") + } + toRaw := StringArg(args, "to") + subject := StringArg(args, "subject") + body := StringArg(args, "body") + isHTML, _, _ := BoolArg(args, "html") + if toRaw == "" || subject == "" || body == "" { + return nil, errors.New("to, subject, and body are required") + } + var toList []string + for _, addr := range strings.Split(toRaw, ",") { + addr = strings.TrimSpace(addr) + if addr != "" { + toList = append(toList, addr) + } + } + messageID, err := p.manager.SendEmail(ctx, botID, binding.EmailProviderID, email.OutboundEmail{ + To: toList, Subject: subject, Body: body, HTML: isHTML, + }) + if err != nil { + return nil, err + } + return map[string]any{"message_id": messageID, "status": "sent"}, nil +} + +func (p *EmailProvider) execListEmails(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID := strings.TrimSpace(session.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + bindings, err := p.getBindings(ctx, botID) + if err != nil { + return nil, err + } + binding := resolveReadBinding(bindings, StringArg(args, "provider_id")) + if binding == nil { + return nil, errors.New("email read permission denied or provider not found") + } + providerName, config, err := p.service.ProviderConfig(ctx, binding.EmailProviderID) + if err != nil { + return nil, err + } + config = ensureProviderID(config, binding.EmailProviderID) + reader, err := p.service.Registry().GetMailboxReader(providerName) + if err != nil { + return nil, errors.New("mailbox listing not supported for this provider") + } + page, _, _ := IntArg(args, "page") + pageSize, _, _ := IntArg(args, "page_size") + if pageSize <= 0 { + pageSize = 20 + } + emails, total, err := reader.ListMailbox(ctx, config, page, pageSize) + if err != nil { + return nil, err + } + summaries := make([]map[string]any, 0, len(emails)) + for _, item := range emails { + summaries = append(summaries, map[string]any{ + "uid": item.MessageID, "from": item.From, "subject": item.Subject, "received_at": item.ReceivedAt, + }) + } + return map[string]any{"emails": summaries, "total": total, "page": page}, nil +} + +func (p *EmailProvider) execReadEmail(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID := strings.TrimSpace(session.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + bindings, err := p.getBindings(ctx, botID) + if err != nil { + return nil, err + } + binding := resolveReadBinding(bindings, StringArg(args, "provider_id")) + if binding == nil { + return nil, errors.New("email read permission denied or provider not found") + } + uidRaw, ok, _ := IntArg(args, "uid") + if !ok || uidRaw <= 0 { + uidStr := StringArg(args, "uid") + if uidStr != "" { + parsed, _ := strconv.Atoi(uidStr) + uidRaw = parsed + } + } + if uidRaw <= 0 { + return nil, errors.New("uid is required") + } + if uidRaw > math.MaxUint32 { + return nil, errors.New("uid out of range") + } + providerName, config, err := p.service.ProviderConfig(ctx, binding.EmailProviderID) + if err != nil { + return nil, err + } + config = ensureProviderID(config, binding.EmailProviderID) + reader, err := p.service.Registry().GetMailboxReader(providerName) + if err != nil { + return nil, errors.New("mailbox reading not supported for this provider") + } + item, err := reader.ReadMailbox(ctx, config, uint32(uidRaw)) //nolint:gosec // bounds checked above + if err != nil { + return nil, err + } + return map[string]any{ + "uid": item.MessageID, "from": item.From, "to": 
item.To, + "subject": item.Subject, "body": item.BodyText, "received_at": item.ReceivedAt, + }, nil +} + +func resolveReadBinding(bindings []email.BindingResponse, providerID string) *email.BindingResponse { + for i := range bindings { + if !bindings[i].CanRead { + continue + } + if providerID == "" || bindings[i].EmailProviderID == providerID { + return &bindings[i] + } + } + return nil +} + +func resolveWriteBinding(bindings []email.BindingResponse, providerID string) *email.BindingResponse { + for i := range bindings { + if !bindings[i].CanWrite { + continue + } + if providerID == "" || bindings[i].EmailProviderID == providerID { + return &bindings[i] + } + } + return nil +} + +func ensureProviderID(config map[string]any, providerID string) map[string]any { + if config == nil { + config = make(map[string]any) + } else { + copied := make(map[string]any, len(config)+1) + for k, v := range config { + copied[k] = v + } + config = copied + } + config["_provider_id"] = providerID + return config +} diff --git a/internal/agent/tools/federation.go b/internal/agent/tools/federation.go new file mode 100644 index 00000000..3080180c --- /dev/null +++ b/internal/agent/tools/federation.go @@ -0,0 +1,85 @@ +package tools + +import ( + "context" + "encoding/json" + "log/slog" + + sdk "github.com/memohai/twilight-ai/sdk" + + "github.com/memohai/memoh/internal/mcp" +) + +// FederationProvider adapts a mcp.ToolSource (federated MCP connections) +// into the ToolProvider interface so the agent can load external MCP tools +// alongside built-in tools. +type FederationProvider struct { + source mcp.ToolSource + logger *slog.Logger +} + +func NewFederationProvider(log *slog.Logger, source mcp.ToolSource) *FederationProvider { + if log == nil { + log = slog.Default() + } + return &FederationProvider{ + source: source, + logger: log.With(slog.String("tool", "federation")), + } +} + +func (f *FederationProvider) Tools(ctx context.Context, session SessionContext) ([]sdk.Tool, error) { + if f.source == nil { + return nil, nil + } + mcpSession := toMCPSession(session) + descriptors, err := f.source.ListTools(ctx, mcpSession) + if err != nil { + f.logger.Warn("federation list tools failed", slog.Any("error", err)) + return nil, nil + } + tools := make([]sdk.Tool, 0, len(descriptors)) + for _, desc := range descriptors { + desc := desc + src := f.source + sess := mcpSession + tools = append(tools, sdk.Tool{ + Name: desc.Name, + Description: desc.Description, + Parameters: desc.InputSchema, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + args := inputAsMap(input) + result, err := src.CallTool(ctx.Context, sess, desc.Name, args) + if err != nil { + return nil, err + } + return normalizeMCPResult(result), nil + }, + }) + } + return tools, nil +} + +func normalizeMCPResult(result map[string]any) any { + if result == nil { + return map[string]any{"ok": true} + } + if isErr, ok := result["isError"].(bool); ok && isErr { + return result + } + if sc, ok := result["structuredContent"]; ok && sc != nil { + return sc + } + if content, ok := result["content"]; ok { + if items, ok := content.([]map[string]any); ok && len(items) == 1 { + if text, ok := items[0]["text"].(string); ok { + var parsed any + if json.Unmarshal([]byte(text), &parsed) == nil { + return parsed + } + return text + } + } + } + return result +} diff --git a/internal/mcp/providers/container/fsops.go b/internal/agent/tools/fsops.go similarity index 73% rename from internal/mcp/providers/container/fsops.go rename to 
internal/agent/tools/fsops.go index 1bfdbdf3..4a90c771 100644 --- a/internal/mcp/providers/container/fsops.go +++ b/internal/agent/tools/fsops.go @@ -1,4 +1,4 @@ -package container +package tools import ( "fmt" @@ -6,8 +6,6 @@ import ( "unicode" ) -// applyEdit performs the fuzzy text replacement logic on raw file content. -// Returns the updated content or an error. func applyEdit(raw, filePath, oldText, newText string) (string, error) { bom, content := stripBOM(raw) originalEnding := detectLineEnding(content) @@ -27,8 +25,7 @@ func applyEdit(raw, filePath, oldText, newText string) (string, error) { if occurrences > 1 { return "", fmt.Errorf( "found %d occurrences of the text in %s. the text must be unique. please provide more context to make it unique", - occurrences, - filePath, + occurrences, filePath, ) } baseContent := match.ContentForReplacement @@ -42,29 +39,6 @@ func applyEdit(raw, filePath, oldText, newText string) (string, error) { return bom + restoreLineEndings(updated, originalEnding), nil } -// ShellQuote wraps a string in single quotes, escaping embedded single quotes. -func ShellQuote(s string) string { - if s == "" { - return "''" - } - if strings.IndexByte(s, '\'') < 0 { - return "'" + s + "'" - } - var b strings.Builder - b.WriteByte('\'') - for _, c := range s { - if c == '\'' { - b.WriteString("'\\''") - } else { - b.WriteRune(c) - } - } - b.WriteByte('\'') - return b.String() -} - -// ---------- fuzzy matching helpers ---------- - type fuzzyMatchResult struct { Found bool Index int @@ -132,28 +106,13 @@ func normalizeForFuzzyMatch(text string) string { func fuzzyFindText(content, oldText string) fuzzyMatchResult { exactIndex := strings.Index(content, oldText) if exactIndex != -1 { - return fuzzyMatchResult{ - Found: true, - Index: exactIndex, - MatchLength: len(oldText), - ContentForReplacement: content, - } + return fuzzyMatchResult{Found: true, Index: exactIndex, MatchLength: len(oldText), ContentForReplacement: content} } fuzzyContent := normalizeForFuzzyMatch(content) fuzzyOld := normalizeForFuzzyMatch(oldText) fuzzyIndex := strings.Index(fuzzyContent, fuzzyOld) if fuzzyIndex == -1 { - return fuzzyMatchResult{ - Found: false, - Index: -1, - MatchLength: 0, - ContentForReplacement: content, - } - } - return fuzzyMatchResult{ - Found: true, - Index: fuzzyIndex, - MatchLength: len(fuzzyOld), - ContentForReplacement: fuzzyContent, + return fuzzyMatchResult{Found: false, Index: -1, ContentForReplacement: content} } + return fuzzyMatchResult{Found: true, Index: fuzzyIndex, MatchLength: len(fuzzyOld), ContentForReplacement: fuzzyContent} } diff --git a/internal/agent/tools/inbox.go b/internal/agent/tools/inbox.go new file mode 100644 index 00000000..43d95a02 --- /dev/null +++ b/internal/agent/tools/inbox.go @@ -0,0 +1,114 @@ +package tools + +import ( + "context" + "errors" + "fmt" + "log/slog" + "strings" + "time" + + sdk "github.com/memohai/twilight-ai/sdk" + + inboxsvc "github.com/memohai/memoh/internal/inbox" +) + +const ( + defaultInboxSearchLimit = 20 + maxInboxSearchLimit = 100 +) + +type InboxProvider struct { + service *inboxsvc.Service + logger *slog.Logger +} + +func NewInboxProvider(log *slog.Logger, service *inboxsvc.Service) *InboxProvider { + if log == nil { + log = slog.Default() + } + return &InboxProvider{ + service: service, + logger: log.With(slog.String("tool", "inbox")), + } +} + +func (p *InboxProvider) Tools(_ context.Context, session SessionContext) ([]sdk.Tool, error) { + if p.service == nil { + return nil, nil + } + sess := session + return 
[]sdk.Tool{
+	{
+		Name:        "search_inbox",
+		Description: "Search historical inbox messages by keyword. Inbox contains messages from group conversations where the bot was not directly mentioned, as well as notifications from external sources.",
+		Parameters: map[string]any{
+			"type": "object",
+			"properties": map[string]any{
+				"query":        map[string]any{"type": "string", "description": "Search keyword to match against inbox message content"},
+				"start_time":   map[string]any{"type": "string", "description": "ISO 8601 start time filter (e.g. 2025-01-01T00:00:00Z)"},
+				"end_time":     map[string]any{"type": "string", "description": "ISO 8601 end time filter"},
+				"limit":        map[string]any{"type": "integer", "description": "Maximum number of results (default 20, max 100)"},
+				"include_read": map[string]any{"type": "boolean", "description": "Whether to include already-read items (default true)"},
+			},
+			"required": []string{},
+		},
+		Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) {
+			args := inputAsMap(input)
+			botID := strings.TrimSpace(sess.BotID)
+			if botID == "" {
+				return nil, errors.New("bot_id is required")
+			}
+			query := StringArg(args, "query")
+			limit := defaultInboxSearchLimit
+			if value, ok, err := IntArg(args, "limit"); err != nil {
+				return nil, err
+			} else if ok {
+				limit = value
+			}
+			if limit <= 0 {
+				limit = defaultInboxSearchLimit
+			}
+			if limit > maxInboxSearchLimit {
+				limit = maxInboxSearchLimit
+			}
+			req := inboxsvc.SearchRequest{Query: query, Limit: limit}
+			if startStr := StringArg(args, "start_time"); startStr != "" {
+				t, err := time.Parse(time.RFC3339, startStr)
+				if err != nil {
+					return nil, fmt.Errorf("invalid start_time: %w", err)
+				}
+				req.StartTime = &t
+			}
+			if endStr := StringArg(args, "end_time"); endStr != "" {
+				t, err := time.Parse(time.RFC3339, endStr)
+				if err != nil {
+					return nil, fmt.Errorf("invalid end_time: %w", err)
+				}
+				req.EndTime = &t
+			}
+			if includeRead, ok, err := BoolArg(args, "include_read"); err != nil {
+				return nil, err
+			} else if ok {
+				req.IncludeRead = &includeRead
+			}
+			items, err := p.service.Search(ctx.Context, botID, req)
+			if err != nil {
+				// Log the underlying error with bot context; return a generic
+				// message to the model, matching the federation provider.
+				p.logger.Warn("inbox search failed", slog.String("bot_id", botID), slog.Any("error", err))
+				return nil, errors.New("inbox search failed")
+			}
+			results := make([]map[string]any, 0, len(items))
+			for _, item := range items {
+				results = append(results, map[string]any{
+					"id":         item.ID,
+					"source":     item.Source,
+					"header":     item.Header,
+					"content":    item.Content,
+					"is_read":    item.IsRead,
+					"created_at": item.CreatedAt.Format(time.RFC3339),
+				})
+			}
+			return map[string]any{"query": query, "total": len(results), "results": results}, nil
+		},
+	},
+	}, nil
+}
diff --git a/internal/agent/tools/memory.go b/internal/agent/tools/memory.go
new file mode 100644
index 00000000..c01a6022
--- /dev/null
+++ b/internal/agent/tools/memory.go
@@ -0,0 +1,126 @@
+package tools
+
+import (
+	"context"
+	"encoding/json"
+	"log/slog"
+	"strings"
+
+	sdk "github.com/memohai/twilight-ai/sdk"
+
+	"github.com/memohai/memoh/internal/mcp"
+	memprovider "github.com/memohai/memoh/internal/memory/adapters"
+	"github.com/memohai/memoh/internal/settings"
+)
+
+// MemorySettingsReader returns bot settings for memory provider resolution.
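+// It is typically satisfied by the settings service; keeping the interface
+// this narrow lets the memory provider be wired and tested without depending
+// on the full settings.Service API.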
+type MemorySettingsReader interface { + GetBot(ctx context.Context, botID string) (settings.Settings, error) +} + +type MemoryProvider struct { + registry *memprovider.Registry + settings MemorySettingsReader + logger *slog.Logger +} + +func NewMemoryProvider(log *slog.Logger, registry *memprovider.Registry, settingsSvc MemorySettingsReader) *MemoryProvider { + if log == nil { + log = slog.Default() + } + return &MemoryProvider{ + registry: registry, + settings: settingsSvc, + logger: log.With(slog.String("tool", "memory")), + } +} + +func (p *MemoryProvider) Tools(ctx context.Context, session SessionContext) ([]sdk.Tool, error) { + provider := p.resolveProvider(ctx, session.BotID) + if provider == nil { + return nil, nil + } + mcpSession := toMCPSession(session) + descriptors, err := provider.ListTools(ctx, mcpSession) + if err != nil { + return nil, nil + } + var tools []sdk.Tool + for _, desc := range descriptors { + desc := desc + prov := provider + sess := mcpSession + tools = append(tools, sdk.Tool{ + Name: desc.Name, + Description: desc.Description, + Parameters: desc.InputSchema, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + args := inputAsMap(input) + result, err := prov.CallTool(ctx.Context, sess, desc.Name, args) + if err != nil { + return nil, err + } + return normalizeToolResult(result), nil + }, + }) + } + return tools, nil +} + +func (p *MemoryProvider) resolveProvider(ctx context.Context, botID string) memprovider.Provider { + if p.registry == nil || p.settings == nil { + return nil + } + botID = strings.TrimSpace(botID) + if botID == "" { + return nil + } + botSettings, err := p.settings.GetBot(ctx, botID) + if err != nil { + return nil + } + providerID := strings.TrimSpace(botSettings.MemoryProviderID) + if providerID == "" { + return nil + } + prov, err := p.registry.Get(providerID) + if err != nil { + return nil + } + return prov +} + +func toMCPSession(s SessionContext) mcp.ToolSessionContext { + return mcp.ToolSessionContext{ + BotID: s.BotID, + ChatID: s.ChatID, + ChannelIdentityID: s.ChannelIdentityID, + SessionToken: s.SessionToken, + CurrentPlatform: s.CurrentPlatform, + ReplyTarget: s.ReplyTarget, + IsSubagent: s.IsSubagent, + } +} + +// normalizeToolResult extracts structuredContent from MCP-style results +// so the LLM sees clean data instead of the MCP wrapper. +func normalizeToolResult(result map[string]any) any { + if result == nil { + return map[string]any{"ok": true} + } + if sc, ok := result["structuredContent"]; ok && sc != nil { + return sc + } + if content, ok := result["content"]; ok { + if items, ok := content.([]map[string]any); ok && len(items) == 1 { + if text, ok := items[0]["text"].(string); ok { + var parsed any + if json.Unmarshal([]byte(text), &parsed) == nil { + return parsed + } + return text + } + } + } + return result +} diff --git a/internal/agent/tools/message.go b/internal/agent/tools/message.go new file mode 100644 index 00000000..f1a6bb9f --- /dev/null +++ b/internal/agent/tools/message.go @@ -0,0 +1,399 @@ +package tools + +import ( + "context" + "encoding/json" + "errors" + "log/slog" + "path/filepath" + "strings" + + sdk "github.com/memohai/twilight-ai/sdk" + + "github.com/memohai/memoh/internal/channel" +) + +// MessageSender sends outbound messages through channel manager. +type MessageSender interface { + Send(ctx context.Context, botID string, channelType channel.ChannelType, req channel.SendRequest) error +} + +// MessageReactor adds or removes emoji reactions through channel manager. 
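+// It is kept separate from MessageSender because reactions are optional:
+// Tools() only registers the react tool when a reactor is wired in.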
+type MessageReactor interface { + React(ctx context.Context, botID string, channelType channel.ChannelType, req channel.ReactRequest) error +} + +// MessageChannelResolver parses platform name to channel type. +type MessageChannelResolver interface { + ParseChannelType(raw string) (channel.ChannelType, error) +} + +// AssetMeta holds resolved metadata for a media asset. +type AssetMeta struct { + ContentHash string + Mime string + SizeBytes int64 + StorageKey string +} + +// AssetResolver looks up persisted media assets by storage key. +type AssetResolver interface { + GetByStorageKey(ctx context.Context, botID, storageKey string) (AssetMeta, error) + IngestContainerFile(ctx context.Context, botID, containerPath string) (AssetMeta, error) +} + +type MessageProvider struct { + sender MessageSender + reactor MessageReactor + resolver MessageChannelResolver + assetResolver AssetResolver + logger *slog.Logger +} + +func NewMessageProvider(log *slog.Logger, sender MessageSender, reactor MessageReactor, resolver MessageChannelResolver, assetResolver AssetResolver) *MessageProvider { + if log == nil { + log = slog.Default() + } + return &MessageProvider{ + sender: sender, reactor: reactor, resolver: resolver, + assetResolver: assetResolver, + logger: log.With(slog.String("tool", "message")), + } +} + +func (p *MessageProvider) Tools(_ context.Context, session SessionContext) ([]sdk.Tool, error) { + var tools []sdk.Tool + sess := session + if p.sender != nil && p.resolver != nil { + tools = append(tools, sdk.Tool{ + Name: "send", + Description: "Send a message to a DIFFERENT channel or person — NOT for replying to the current conversation. Use this only for cross-channel messaging, forwarding, or replying to inbox items.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "bot_id": map[string]any{"type": "string", "description": "Bot ID, optional and defaults to current bot"}, + "platform": map[string]any{"type": "string", "description": "Channel platform name"}, + "target": map[string]any{"type": "string", "description": "Channel target (chat/group/thread ID). Use get_contacts to find available targets."}, + "text": map[string]any{"type": "string", "description": "Message text shortcut when message object is omitted"}, + "reply_to": map[string]any{"type": "string", "description": "Message ID to reply to. The reply will reference this message on the platform."}, + "attachments": map[string]any{"type": "array", "description": "File paths or URLs to attach.", "items": map[string]any{"type": "string"}}, + "message": map[string]any{"type": "object", "description": "Structured message payload with text/parts/attachments"}, + }, + "required": []string{}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execSend(ctx.Context, sess, inputAsMap(input)) + }, + }) + } + if p.reactor != nil && p.resolver != nil { + tools = append(tools, sdk.Tool{ + Name: "react", + Description: "Add or remove an emoji reaction on a channel message", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "bot_id": map[string]any{"type": "string", "description": "Bot ID, optional and defaults to current bot"}, + "platform": map[string]any{"type": "string", "description": "Channel platform name. Defaults to current session platform."}, + "target": map[string]any{"type": "string", "description": "Channel target (chat/group ID). 
Defaults to current session reply target."}, + "message_id": map[string]any{"type": "string", "description": "The message ID to react to"}, + "emoji": map[string]any{"type": "string", "description": "Emoji to react with (e.g. 👍, ❤️). Required when adding a reaction."}, + "remove": map[string]any{"type": "boolean", "description": "If true, remove the reaction instead of adding it. Default false."}, + }, + "required": []string{"message_id"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execReact(ctx.Context, sess, inputAsMap(input)) + }, + }) + } + return tools, nil +} + +func (p *MessageProvider) execSend(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID, err := resolveBotID(args, session) + if err != nil { + return nil, err + } + channelType, err := p.resolvePlatform(args, session) + if err != nil { + return nil, err + } + messageText := FirstStringArg(args, "text") + outboundMessage, parseErr := parseOutboundMessage(args, messageText) + if parseErr != nil { + if rawAtt, ok := args["attachments"]; !ok || rawAtt == nil { + return nil, parseErr + } + outboundMessage = channel.Message{Text: strings.TrimSpace(messageText)} + } + if rawAttachments, ok := args["attachments"]; ok && rawAttachments != nil { + items := normalizeAttachmentInputs(rawAttachments) + if items == nil { + return nil, errors.New("attachments must be a string, object, or array") + } + if len(items) > 0 { + resolved := p.resolveAttachments(ctx, botID, items) + if len(resolved) == 0 { + return nil, errors.New("attachments could not be resolved") + } + outboundMessage.Attachments = append(outboundMessage.Attachments, resolved...) + } + } + if outboundMessage.IsEmpty() { + return nil, errors.New("message or attachments required") + } + if replyTo := FirstStringArg(args, "reply_to"); replyTo != "" { + outboundMessage.Reply = &channel.ReplyRef{MessageID: replyTo} + } + if outboundMessage.Format == "" && channel.ContainsMarkdown(outboundMessage.Text) { + outboundMessage.Format = channel.MessageFormatMarkdown + } + target := FirstStringArg(args, "target") + if target == "" { + target = strings.TrimSpace(session.ReplyTarget) + } + if target == "" { + return nil, errors.New("target is required") + } + if strings.EqualFold(channelType.String(), strings.TrimSpace(session.CurrentPlatform)) && + target == strings.TrimSpace(session.ReplyTarget) { + return nil, errors.New("you are trying to send a message to the same conversation you are already in. " + + "Do not use the send tool for this. Instead, write your reply as plain text directly. " + + "To include files, use the block in your response (e.g. [{\"type\":\"image\",\"path\":\"/data/media/file.jpg\"}])") + } + if err := p.sender.Send(ctx, botID, channelType, channel.SendRequest{Target: target, Message: outboundMessage}); err != nil { + return nil, err + } + return map[string]any{ + "ok": true, "bot_id": botID, "platform": channelType.String(), "target": target, + "instruction": "Message delivered successfully. You have completed your response. 
Please STOP now and do not call any more tools.", + }, nil +} + +func (p *MessageProvider) execReact(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID, err := resolveBotID(args, session) + if err != nil { + return nil, err + } + channelType, err := p.resolvePlatform(args, session) + if err != nil { + return nil, err + } + target := FirstStringArg(args, "target") + if target == "" { + target = strings.TrimSpace(session.ReplyTarget) + } + if target == "" { + return nil, errors.New("target is required") + } + messageID := FirstStringArg(args, "message_id") + if messageID == "" { + return nil, errors.New("message_id is required") + } + emoji := FirstStringArg(args, "emoji") + remove, _, _ := BoolArg(args, "remove") + if err := p.reactor.React(ctx, botID, channelType, channel.ReactRequest{ + Target: target, MessageID: messageID, Emoji: emoji, Remove: remove, + }); err != nil { + return nil, err + } + action := "added" + if remove { + action = "removed" + } + return map[string]any{ + "ok": true, "bot_id": botID, "platform": channelType.String(), + "target": target, "message_id": messageID, "emoji": emoji, "action": action, + }, nil +} + +func (p *MessageProvider) resolvePlatform(args map[string]any, session SessionContext) (channel.ChannelType, error) { + platform := FirstStringArg(args, "platform") + if platform == "" { + platform = strings.TrimSpace(session.CurrentPlatform) + } + if platform == "" { + return "", errors.New("platform is required") + } + return p.resolver.ParseChannelType(platform) +} + +func resolveBotID(args map[string]any, session SessionContext) (string, error) { + botID := FirstStringArg(args, "bot_id") + if botID == "" { + botID = strings.TrimSpace(session.BotID) + } + if botID == "" { + return "", errors.New("bot_id is required") + } + if strings.TrimSpace(session.BotID) != "" && botID != strings.TrimSpace(session.BotID) { + return "", errors.New("bot_id mismatch") + } + return botID, nil +} + +func (p *MessageProvider) resolveAttachments(ctx context.Context, botID string, items []any) []channel.Attachment { + var result []channel.Attachment + for _, item := range items { + switch v := item.(type) { + case string: + if att := p.resolveAttachmentRef(ctx, botID, strings.TrimSpace(v), "", ""); att != nil { + result = append(result, *att) + } + case map[string]any: + path := FirstStringArg(v, "path") + urlVal := FirstStringArg(v, "url") + attType := FirstStringArg(v, "type") + name := FirstStringArg(v, "name") + ref := path + if ref == "" { + ref = urlVal + } + if ref == "" { + continue + } + if att := p.resolveAttachmentRef(ctx, botID, ref, attType, name); att != nil { + result = append(result, *att) + } + } + } + return result +} + +func normalizeAttachmentInputs(raw any) []any { + switch v := raw.(type) { + case nil: + return nil + case []any: + if v == nil { + return []any{} + } + return v + case []string: + items := make([]any, 0, len(v)) + for _, item := range v { + items = append(items, item) + } + return items + case string, map[string]any: + return []any{v} + default: + return nil + } +} + +func (p *MessageProvider) resolveAttachmentRef(ctx context.Context, botID, ref, attType, name string) *channel.Attachment { + ref = strings.TrimSpace(ref) + if ref == "" { + return nil + } + lower := strings.ToLower(ref) + if strings.HasPrefix(lower, "http://") || strings.HasPrefix(lower, "https://") { + t := channel.AttachmentType(attType) + if t == "" { + t = inferAttachmentTypeFromExt(ref) + } + return &channel.Attachment{Type: t, URL: 
ref, Name: name} + } + if strings.HasPrefix(lower, "data:") { + t := channel.AttachmentType(attType) + if t == "" { + t = channel.AttachmentImage + } + return &channel.Attachment{Type: t, Base64: ref, Name: name} + } + if name == "" { + name = filepath.Base(ref) + } + mediaMarker := filepath.Join("/data", "media") + if !strings.HasSuffix(mediaMarker, "/") { + mediaMarker += "/" + } + if idx := strings.Index(ref, mediaMarker); idx >= 0 && p.assetResolver != nil { + storageKey := ref[idx+len(mediaMarker):] + asset, err := p.assetResolver.GetByStorageKey(ctx, botID, storageKey) + if err == nil { + return assetMetaToAttachment(asset, botID, attType, name) + } + } + dataPrefix := "/data/" + if strings.HasPrefix(ref, dataPrefix) && p.assetResolver != nil { + asset, err := p.assetResolver.IngestContainerFile(ctx, botID, ref) + if err == nil { + return assetMetaToAttachment(asset, botID, attType, name) + } + return nil + } + t := channel.AttachmentType(attType) + if t == "" { + t = inferAttachmentTypeFromExt(ref) + } + return &channel.Attachment{Type: t, URL: ref, Name: name} +} + +func assetMetaToAttachment(asset AssetMeta, botID, attType, name string) *channel.Attachment { + t := channel.AttachmentType(attType) + if t == "" { + t = inferAttachmentTypeFromMime(asset.Mime) + } + return &channel.Attachment{ + Type: t, ContentHash: asset.ContentHash, Mime: asset.Mime, Size: asset.SizeBytes, Name: name, + Metadata: map[string]any{"bot_id": botID, "storage_key": asset.StorageKey}, + } +} + +func inferAttachmentTypeFromMime(mime string) channel.AttachmentType { + mime = strings.ToLower(strings.TrimSpace(mime)) + switch { + case strings.HasPrefix(mime, "image/"): + return channel.AttachmentImage + case strings.HasPrefix(mime, "audio/"): + return channel.AttachmentAudio + case strings.HasPrefix(mime, "video/"): + return channel.AttachmentVideo + default: + return channel.AttachmentFile + } +} + +func inferAttachmentTypeFromExt(path string) channel.AttachmentType { + ext := strings.ToLower(filepath.Ext(path)) + switch ext { + case ".jpg", ".jpeg", ".png", ".gif", ".webp", ".svg": + return channel.AttachmentImage + case ".mp3", ".wav", ".ogg", ".flac", ".aac": + return channel.AttachmentAudio + case ".mp4", ".webm", ".avi", ".mov": + return channel.AttachmentVideo + default: + return channel.AttachmentFile + } +} + +func parseOutboundMessage(arguments map[string]any, fallbackText string) (channel.Message, error) { + var msg channel.Message + if raw, ok := arguments["message"]; ok && raw != nil { + switch value := raw.(type) { + case string: + msg.Text = strings.TrimSpace(value) + case map[string]any: + data, err := json.Marshal(value) + if err != nil { + return channel.Message{}, err + } + if err := json.Unmarshal(data, &msg); err != nil { + return channel.Message{}, err + } + default: + return channel.Message{}, errors.New("message must be object or string") + } + } + if msg.IsEmpty() && strings.TrimSpace(fallbackText) != "" { + msg.Text = strings.TrimSpace(fallbackText) + } + if msg.IsEmpty() { + return channel.Message{}, errors.New("message is required") + } + return msg, nil +} diff --git a/internal/agent/tools/prune.go b/internal/agent/tools/prune.go new file mode 100644 index 00000000..bcce1c3a --- /dev/null +++ b/internal/agent/tools/prune.go @@ -0,0 +1,32 @@ +package tools + +import ( + textprune "github.com/memohai/memoh/internal/prune" +) + +const ( + toolOutputHeadBytes = 4 * 1024 + toolOutputTailBytes = 1 * 1024 + toolOutputHeadLines = 150 + toolOutputTailLines = 50 + + readMaxLines = 200 + 
readMaxBytes = 5120 + readMaxLineLength = 1000 + readHeadBytes = 3072 + readTailBytes = 1024 + readHeadLines = 120 + readTailLines = 40 +) + +func pruneToolOutputText(text, label string) string { + return textprune.PruneWithEdges(text, label, textprune.Config{ + MaxBytes: textprune.DefaultMaxBytes, + MaxLines: textprune.DefaultMaxLines, + HeadBytes: toolOutputHeadBytes, + TailBytes: toolOutputTailBytes, + HeadLines: toolOutputHeadLines, + TailLines: toolOutputTailLines, + Marker: textprune.DefaultMarker, + }) +} diff --git a/internal/agent/tools/schedule.go b/internal/agent/tools/schedule.go new file mode 100644 index 00000000..6ded2b65 --- /dev/null +++ b/internal/agent/tools/schedule.go @@ -0,0 +1,246 @@ +package tools + +import ( + "context" + "errors" + "log/slog" + "strings" + + sdk "github.com/memohai/twilight-ai/sdk" + + sched "github.com/memohai/memoh/internal/schedule" +) + +type ScheduleProvider struct { + service Scheduler + logger *slog.Logger +} + +// Scheduler is the interface for schedule CRUD operations. +type Scheduler interface { + List(ctx context.Context, botID string) ([]sched.Schedule, error) + Get(ctx context.Context, id string) (sched.Schedule, error) + Create(ctx context.Context, botID string, req sched.CreateRequest) (sched.Schedule, error) + Update(ctx context.Context, id string, req sched.UpdateRequest) (sched.Schedule, error) + Delete(ctx context.Context, id string) error +} + +func NewScheduleProvider(log *slog.Logger, service Scheduler) *ScheduleProvider { + if log == nil { + log = slog.Default() + } + return &ScheduleProvider{ + service: service, + logger: log.With(slog.String("tool", "schedule")), + } +} + +func (p *ScheduleProvider) Tools(_ context.Context, session SessionContext) ([]sdk.Tool, error) { + if p.service == nil { + return nil, nil + } + sess := session + return []sdk.Tool{ + { + Name: "list_schedule", Description: "List schedules for current bot", + Parameters: emptyObjectSchema(), + Execute: func(ctx *sdk.ToolExecContext, _ any) (any, error) { + botID := strings.TrimSpace(sess.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + items, err := p.service.List(ctx.Context, botID) + if err != nil { + return nil, err + } + return map[string]any{"items": items}, nil + }, + }, + { + Name: "get_schedule", Description: "Get a schedule by id", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "id": map[string]any{"type": "string", "description": "Schedule ID"}, + }, + "required": []string{"id"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + args := inputAsMap(input) + botID := strings.TrimSpace(sess.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + id := StringArg(args, "id") + if id == "" { + return nil, errors.New("id is required") + } + item, err := p.service.Get(ctx.Context, id) + if err != nil { + return nil, err + } + if item.BotID != botID { + return nil, errors.New("bot mismatch") + } + return item, nil + }, + }, + { + Name: "create_schedule", Description: "Create a new schedule", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "name": map[string]any{"type": "string"}, "description": map[string]any{"type": "string"}, + "pattern": map[string]any{"type": "string"}, "command": map[string]any{"type": "string"}, + "max_calls": map[string]any{"type": []string{"integer", "null"}, "description": "Optional max calls, null means unlimited"}, + "enabled": map[string]any{"type": "boolean"}, + }, + 
"required": []string{"name", "description", "pattern", "command"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + args := inputAsMap(input) + botID := strings.TrimSpace(sess.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + name := StringArg(args, "name") + description := StringArg(args, "description") + pattern := StringArg(args, "pattern") + command := StringArg(args, "command") + if name == "" || description == "" || pattern == "" || command == "" { + return nil, errors.New("name, description, pattern, command are required") + } + req := sched.CreateRequest{Name: name, Description: description, Pattern: pattern, Command: command} + maxCalls, err := parseNullableIntArg(args, "max_calls") + if err != nil { + return nil, err + } + req.MaxCalls = maxCalls + if enabled, ok, err := BoolArg(args, "enabled"); err != nil { + return nil, err + } else if ok { + req.Enabled = &enabled + } + item, err := p.service.Create(ctx.Context, botID, req) + if err != nil { + return nil, err + } + return item, nil + }, + }, + { + Name: "update_schedule", Description: "Update an existing schedule", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "id": map[string]any{"type": "string"}, "name": map[string]any{"type": "string"}, + "description": map[string]any{"type": "string"}, "pattern": map[string]any{"type": "string"}, + "command": map[string]any{"type": "string"}, + "max_calls": map[string]any{"type": []string{"integer", "null"}}, + "enabled": map[string]any{"type": "boolean"}, + }, + "required": []string{"id"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + args := inputAsMap(input) + botID := strings.TrimSpace(sess.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + id := StringArg(args, "id") + if id == "" { + return nil, errors.New("id is required") + } + req := sched.UpdateRequest{} + maxCalls, err := parseNullableIntArg(args, "max_calls") + if err != nil { + return nil, err + } + req.MaxCalls = maxCalls + if v := StringArg(args, "name"); v != "" { + req.Name = &v + } + if v := StringArg(args, "description"); v != "" { + req.Description = &v + } + if v := StringArg(args, "pattern"); v != "" { + req.Pattern = &v + } + if v := StringArg(args, "command"); v != "" { + req.Command = &v + } + if enabled, ok, err := BoolArg(args, "enabled"); err != nil { + return nil, err + } else if ok { + req.Enabled = &enabled + } + item, err := p.service.Update(ctx.Context, id, req) + if err != nil { + return nil, err + } + if item.BotID != botID { + return nil, errors.New("bot mismatch") + } + return item, nil + }, + }, + { + Name: "delete_schedule", Description: "Delete a schedule by id", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "id": map[string]any{"type": "string", "description": "Schedule ID"}, + }, + "required": []string{"id"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + args := inputAsMap(input) + botID := strings.TrimSpace(sess.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + id := StringArg(args, "id") + if id == "" { + return nil, errors.New("id is required") + } + item, err := p.service.Get(ctx.Context, id) + if err != nil { + return nil, err + } + if item.BotID != botID { + return nil, errors.New("bot mismatch") + } + if err := p.service.Delete(ctx.Context, id); err != nil { + return nil, err + } + return map[string]any{"success": true}, nil + }, + }, + }, nil +} + 
+func parseNullableIntArg(arguments map[string]any, key string) (sched.NullableInt, error) { + req := sched.NullableInt{} + if arguments == nil { + return req, nil + } + raw, exists := arguments[key] + if !exists { + return req, nil + } + req.Set = true + if raw == nil { + req.Value = nil + return req, nil + } + value, _, err := IntArg(arguments, key) + if err != nil { + return sched.NullableInt{}, err + } + req.Value = &value + return req, nil +} + +func emptyObjectSchema() map[string]any { + return map[string]any{"type": "object", "properties": map[string]any{}} +} diff --git a/internal/agent/tools/skill.go b/internal/agent/tools/skill.go new file mode 100644 index 00000000..3ca0282c --- /dev/null +++ b/internal/agent/tools/skill.go @@ -0,0 +1,59 @@ +package tools + +import ( + "context" + "errors" + "log/slog" + + sdk "github.com/memohai/twilight-ai/sdk" +) + +type SkillProvider struct { + logger *slog.Logger +} + +func NewSkillProvider(log *slog.Logger) *SkillProvider { + if log == nil { + log = slog.Default() + } + return &SkillProvider{logger: log.With(slog.String("tool", "skill"))} +} + +func (*SkillProvider) Tools(_ context.Context, session SessionContext) ([]sdk.Tool, error) { + if session.IsSubagent { + return nil, nil + } + return []sdk.Tool{ + { + Name: "use_skill", + Description: "Use a skill if you think it is relevant to the current task", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "skillName": map[string]any{ + "type": "string", + "description": "The name of the skill to use", + }, + "reason": map[string]any{ + "type": "string", + "description": "The reason why you think this skill is relevant to the current task", + }, + }, + "required": []string{"skillName", "reason"}, + }, + Execute: func(_ *sdk.ToolExecContext, input any) (any, error) { + args := inputAsMap(input) + skillName := StringArg(args, "skillName") + reason := StringArg(args, "reason") + if skillName == "" { + return nil, errors.New("skillName is required") + } + return map[string]any{ + "success": true, + "skillName": skillName, + "reason": reason, + }, nil + }, + }, + }, nil +} diff --git a/internal/agent/tools/subagent.go b/internal/agent/tools/subagent.go new file mode 100644 index 00000000..7e1fd7bf --- /dev/null +++ b/internal/agent/tools/subagent.go @@ -0,0 +1,300 @@ +package tools + +import ( + "bytes" + "context" + "encoding/json" + "errors" + "fmt" + "io" + "log/slog" + "net/http" + "slices" + "strings" + "time" + + sdk "github.com/memohai/twilight-ai/sdk" + + "github.com/memohai/memoh/internal/db/sqlc" + "github.com/memohai/memoh/internal/models" + "github.com/memohai/memoh/internal/settings" + subagentsvc "github.com/memohai/memoh/internal/subagent" +) + +const subagentGatewayTimeout = 120 * time.Second + +type SubagentProvider struct { + logger *slog.Logger + service *subagentsvc.Service + settings *settings.Service + models *models.Service + queries *sqlc.Queries + gatewayBaseURL string + httpClient *http.Client +} + +func NewSubagentProvider( + log *slog.Logger, + service *subagentsvc.Service, + settingsSvc *settings.Service, + modelsSvc *models.Service, + queries *sqlc.Queries, + gatewayBaseURL string, +) *SubagentProvider { + if log == nil { + log = slog.Default() + } + return &SubagentProvider{ + logger: log.With(slog.String("tool", "subagent")), + service: service, + settings: settingsSvc, + models: modelsSvc, + queries: queries, + gatewayBaseURL: strings.TrimRight(gatewayBaseURL, "/"), + httpClient: &http.Client{Timeout: subagentGatewayTimeout}, + } 
+} + +func (p *SubagentProvider) Tools(_ context.Context, session SessionContext) ([]sdk.Tool, error) { + if p.service == nil || session.IsSubagent { + return nil, nil + } + sess := session + return []sdk.Tool{ + { + Name: "list_subagents", Description: "List subagents for current bot", + Parameters: emptyObjectSchema(), + Execute: func(ctx *sdk.ToolExecContext, _ any) (any, error) { + botID := strings.TrimSpace(sess.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + items, err := p.service.List(ctx.Context, botID) + if err != nil { + return nil, err + } + result := make([]map[string]any, 0, len(items)) + for _, item := range items { + result = append(result, map[string]any{"id": item.ID, "name": item.Name, "description": item.Description}) + } + return map[string]any{"items": result}, nil + }, + }, + { + Name: "delete_subagent", Description: "Delete a subagent by id", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "id": map[string]any{"type": "string", "description": "Subagent ID"}, + }, + "required": []string{"id"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + args := inputAsMap(input) + id := StringArg(args, "id") + if id == "" { + return nil, errors.New("id is required") + } + if err := p.service.Delete(ctx.Context, id); err != nil { + return nil, err + } + return map[string]any{"success": true}, nil + }, + }, + { + Name: "query_subagent", Description: "Query a subagent. If the subagent does not exist it will be created automatically.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "name": map[string]any{"type": "string", "description": "The name of the subagent"}, + "description": map[string]any{"type": "string", "description": "A short description of the subagent purpose (used when creating)"}, + "query": map[string]any{"type": "string", "description": "The prompt to ask the subagent to do."}, + }, + "required": []string{"name", "description", "query"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execQuery(ctx.Context, sess, inputAsMap(input)) + }, + }, + }, nil +} + +func (p *SubagentProvider) execQuery(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID := strings.TrimSpace(session.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + name := StringArg(args, "name") + description := StringArg(args, "description") + query := StringArg(args, "query") + if name == "" || description == "" || query == "" { + return nil, errors.New("name, description, and query are required") + } + target, err := p.service.GetOrCreate(ctx, botID, subagentsvc.CreateRequest{Name: name, Description: description}) + if err != nil { + return nil, fmt.Errorf("failed to get or create subagent: %w", err) + } + modelCfg, provider, err := p.resolveModel(ctx, botID) + if err != nil { + return nil, fmt.Errorf("failed to resolve model: %w", err) + } + gwResp, err := p.postSubagent(ctx, session, subagentGWRequest{ + Model: subagentModelCfg{ + ModelID: modelCfg.ModelID, ClientType: string(modelCfg.ClientType), + Input: modelCfg.InputModalities, APIKey: provider.ApiKey, BaseURL: provider.BaseUrl, + }, + Identity: subagentIdentityCfg{ + BotID: botID, ChannelIdentityID: session.ChannelIdentityID, + CurrentPlatform: session.CurrentPlatform, SessionToken: session.SessionToken, + }, + Messages: target.Messages, Query: query, Name: name, Desc: description, + }) + if err != nil { + return nil, 
fmt.Errorf("subagent query failed: %w", err) + } + updatedMessages := slices.Clone(target.Messages) + updatedMessages = append(updatedMessages, gwResp.Messages...) + usage := mergeSubagentUsage(target.Usage, gwResp.Usage) + if _, err := p.service.UpdateContext(ctx, target.ID, subagentsvc.UpdateContextRequest{ + Messages: updatedMessages, Usage: usage, + }); err != nil { + p.logger.Warn("failed to persist subagent context", slog.String("subagent_id", target.ID), slog.Any("error", err)) + } + resultContent := gwResp.Text + if resultContent == "" && len(gwResp.Messages) > 0 { + last := gwResp.Messages[len(gwResp.Messages)-1] + if content, ok := last["content"]; ok { + resultContent = fmt.Sprintf("%v", content) + } + } + return map[string]any{"success": true, "result": resultContent}, nil +} + +func (p *SubagentProvider) resolveModel(ctx context.Context, botID string) (models.GetResponse, sqlc.LlmProvider, error) { + if p.settings == nil || p.models == nil || p.queries == nil { + return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("model resolution services not configured") + } + botSettings, err := p.settings.GetBot(ctx, botID) + if err != nil { + return models.GetResponse{}, sqlc.LlmProvider{}, err + } + chatModelID := strings.TrimSpace(botSettings.ChatModelID) + if chatModelID == "" { + return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("no chat model configured for bot") + } + model, err := p.models.GetByID(ctx, chatModelID) + if err != nil { + return models.GetResponse{}, sqlc.LlmProvider{}, err + } + provider, err := models.FetchProviderByID(ctx, p.queries, model.LlmProviderID) + if err != nil { + return models.GetResponse{}, sqlc.LlmProvider{}, err + } + return model, provider, nil +} + +type subagentModelCfg struct { + ModelID string `json:"modelId"` + ClientType string `json:"clientType"` + Input []string `json:"input"` + APIKey string `json:"apiKey"` //nolint:gosec // forwarded to agent gateway + BaseURL string `json:"baseUrl"` +} + +type subagentIdentityCfg struct { + BotID string `json:"botId"` + ChannelIdentityID string `json:"channelIdentityId"` + CurrentPlatform string `json:"currentPlatform,omitempty"` + SessionToken string `json:"sessionToken,omitempty"` //nolint:gosec // session token forwarded +} + +type subagentGWRequest struct { + Model subagentModelCfg `json:"model"` + Identity subagentIdentityCfg `json:"identity"` + Messages []map[string]any `json:"messages"` + Query string `json:"query"` + Name string `json:"name"` + Desc string `json:"description"` +} + +type subagentGWResponse struct { + Messages []map[string]any `json:"messages"` + Text string `json:"text,omitempty"` + Usage json.RawMessage `json:"usage,omitempty"` +} + +func (p *SubagentProvider) postSubagent(ctx context.Context, session SessionContext, payload subagentGWRequest) (subagentGWResponse, error) { + url := p.gatewayBaseURL + "/chat/subagent" + body, err := json.Marshal(payload) + if err != nil { + return subagentGWResponse{}, err + } + req, err := http.NewRequestWithContext(ctx, http.MethodPost, url, bytes.NewReader(body)) + if err != nil { + return subagentGWResponse{}, err + } + req.Header.Set("Content-Type", "application/json") + if token := strings.TrimSpace(session.SessionToken); token != "" { + req.Header.Set("Authorization", "Bearer "+token) + } + resp, err := p.httpClient.Do(req) //nolint:gosec // URL is from operator-configured agent gateway + if err != nil { + return subagentGWResponse{}, err + } + defer func() { _ = resp.Body.Close() }() + respBody, err := 
io.ReadAll(resp.Body)
+	if err != nil {
+		return subagentGWResponse{}, err
+	}
+	if resp.StatusCode < 200 || resp.StatusCode >= 300 {
+		detail := string(respBody)
+		if len(detail) > 300 {
+			// Trim to a valid UTF-8 boundary: a byte-level cut can split a
+			// multi-byte character, which json.Marshal would turn into U+FFFD.
+			detail = strings.ToValidUTF8(detail[:300], "")
+		}
+		return subagentGWResponse{}, fmt.Errorf("agent gateway error (HTTP %d): %s", resp.StatusCode, strings.TrimSpace(detail))
+	}
+	var parsed subagentGWResponse
+	if err := json.Unmarshal(respBody, &parsed); err != nil {
+		return subagentGWResponse{}, fmt.Errorf("failed to parse gateway response: %w", err)
+	}
+	return parsed, nil
+}
+
+// mergeSubagentUsage accumulates numeric usage counters from delta into
+// existing; non-numeric fields are ignored.
+func mergeSubagentUsage(existing map[string]any, delta json.RawMessage) map[string]any {
+	if existing == nil {
+		existing = map[string]any{}
+	}
+	if len(delta) == 0 {
+		return existing
+	}
+	var deltaMap map[string]any
+	if err := json.Unmarshal(delta, &deltaMap); err != nil {
+		return existing
+	}
+	for key, val := range deltaMap {
+		if num, ok := toFloat64(val); ok {
+			if prev, ok := toFloat64(existing[key]); ok {
+				existing[key] = prev + num
+			} else {
+				existing[key] = num
+			}
+		}
+	}
+	return existing
+}
+
+func toFloat64(v any) (float64, bool) {
+	switch n := v.(type) {
+	case float64:
+		return n, true
+	case int:
+		return float64(n), true
+	case int64:
+		return float64(n), true
+	case json.Number:
+		f, err := n.Float64()
+		return f, err == nil
+	default:
+		return 0, false
+	}
+}
diff --git a/internal/agent/tools/tts.go b/internal/agent/tools/tts.go
new file mode 100644
index 00000000..c993725b
--- /dev/null
+++ b/internal/agent/tools/tts.go
@@ -0,0 +1,153 @@
+package tools
+
+import (
+	"context"
+	"encoding/base64"
+	"errors"
+	"fmt"
+	"log/slog"
+	"strings"
+
+	sdk "github.com/memohai/twilight-ai/sdk"
+
+	"github.com/memohai/memoh/internal/channel"
+	"github.com/memohai/memoh/internal/settings"
+	ttspkg "github.com/memohai/memoh/internal/tts"
+)
+
+const ttsMaxTextLen = 500
+
+// TTSSender sends outbound messages through the channel manager.
+type TTSSender interface {
+	Send(ctx context.Context, botID string, channelType channel.ChannelType, req channel.SendRequest) error
+}
+
+// TTSChannelResolver parses platform name to channel type.
+type TTSChannelResolver interface {
+	ParseChannelType(raw string) (channel.ChannelType, error)
+}
+
+type TTSProvider struct {
+	logger   *slog.Logger
+	settings *settings.Service
+	tts      *ttspkg.Service
+	sender   TTSSender
+	resolver TTSChannelResolver
+}
+
+func NewTTSProvider(log *slog.Logger, settingsSvc *settings.Service, ttsSvc *ttspkg.Service, sender TTSSender, resolver TTSChannelResolver) *TTSProvider {
+	if log == nil {
+		log = slog.Default()
+	}
+	return &TTSProvider{
+		logger:   log.With(slog.String("tool", "tts")),
+		settings: settingsSvc,
+		tts:      ttsSvc,
+		sender:   sender,
+		resolver: resolver,
+	}
+}
+
+func (p *TTSProvider) Tools(ctx context.Context, session SessionContext) ([]sdk.Tool, error) {
+	if p.settings == nil || p.tts == nil || p.sender == nil || p.resolver == nil {
+		return nil, nil
+	}
+	botID := strings.TrimSpace(session.BotID)
+	if botID == "" {
+		return nil, nil
+	}
+	botSettings, err := p.settings.GetBot(ctx, botID)
+	if err != nil {
+		return nil, nil
+	}
+	if strings.TrimSpace(botSettings.TtsModelID) == "" {
+		return nil, nil
+	}
+	sess := session
+	return []sdk.Tool{
+		{
+			Name: "speak",
+			Description: "Send a voice message to a DIFFERENT channel or person. Synthesizes text to speech and delivers as audio.
Do NOT use this for the current conversation — use block instead.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "text": map[string]any{"type": "string", "description": "The text to convert to speech (max 500 characters)"}, + "platform": map[string]any{"type": "string", "description": "Channel platform name. Defaults to current session platform."}, + "target": map[string]any{"type": "string", "description": "Channel target (chat/group/thread ID). Use get_contacts to find available targets."}, + "reply_to": map[string]any{"type": "string", "description": "Message ID to reply to. The voice message will reference this message on the platform."}, + }, + "required": []string{"text"}, + }, + Execute: func(execCtx *sdk.ToolExecContext, input any) (any, error) { + return p.execSpeak(execCtx.Context, sess, inputAsMap(input)) + }, + }, + }, nil +} + +func (p *TTSProvider) execSpeak(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID := strings.TrimSpace(session.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + text := strings.TrimSpace(StringArg(args, "text")) + if text == "" { + return nil, errors.New("text is required") + } + if len([]rune(text)) > ttsMaxTextLen { + return nil, errors.New("text too long, max 500 characters") + } + channelType, err := p.resolvePlatform(args, session) + if err != nil { + return nil, err + } + target := FirstStringArg(args, "target") + if target == "" { + target = strings.TrimSpace(session.ReplyTarget) + } + if target == "" { + return nil, errors.New("target is required") + } + if strings.EqualFold(channelType.String(), strings.TrimSpace(session.CurrentPlatform)) && + target == strings.TrimSpace(session.ReplyTarget) { + return nil, errors.New("you are trying to speak in the same conversation you are already in. " + + "Do not use the speak tool for this. Instead, use the block in your response " + + "(e.g. Hello world)") + } + botSettings, err := p.settings.GetBot(ctx, botID) + if err != nil { + return nil, errors.New("failed to load bot settings") + } + if botSettings.TtsModelID == "" { + return nil, errors.New("bot has no TTS model configured") + } + audioData, contentType, synthErr := p.tts.Synthesize(ctx, botSettings.TtsModelID, text, nil) + if synthErr != nil { + return nil, fmt.Errorf("speech synthesis failed: %s", synthErr.Error()) + } + dataURL := fmt.Sprintf("data:%s;base64,%s", contentType, base64.StdEncoding.EncodeToString(audioData)) + msg := channel.Message{ + Attachments: []channel.Attachment{{Type: channel.AttachmentVoice, URL: dataURL, Mime: contentType, Size: int64(len(audioData))}}, + } + if replyTo := FirstStringArg(args, "reply_to"); replyTo != "" { + msg.Reply = &channel.ReplyRef{MessageID: replyTo} + } + if err := p.sender.Send(ctx, botID, channelType, channel.SendRequest{Target: target, Message: msg}); err != nil { + return nil, err + } + return map[string]any{ + "ok": true, "bot_id": botID, "platform": channelType.String(), "target": target, + "instruction": "Voice message delivered successfully. You have completed your response. 
Please STOP now and do not call any more tools.", + }, nil +} + +func (p *TTSProvider) resolvePlatform(args map[string]any, session SessionContext) (channel.ChannelType, error) { + platform := FirstStringArg(args, "platform") + if platform == "" { + platform = strings.TrimSpace(session.CurrentPlatform) + } + if platform == "" { + return "", errors.New("platform is required") + } + return p.resolver.ParseChannelType(platform) +} diff --git a/internal/agent/tools/types.go b/internal/agent/tools/types.go new file mode 100644 index 00000000..28134583 --- /dev/null +++ b/internal/agent/tools/types.go @@ -0,0 +1,125 @@ +package tools + +import ( + "context" + "encoding/json" + "fmt" + "math" + "strings" + + sdk "github.com/memohai/twilight-ai/sdk" +) + +// SessionContext carries request-scoped identity for tool execution. +type SessionContext struct { + BotID string + ChatID string + ChannelIdentityID string + SessionToken string //nolint:gosec // carries session credential material at runtime + CurrentPlatform string + ReplyTarget string + IsSubagent bool +} + +// ToolProvider supplies a set of tools for the agent. +// Tools() is called per-request; implementations may return different +// tool sets based on session context (e.g. subagent restrictions, bot settings). +type ToolProvider interface { + Tools(ctx context.Context, session SessionContext) ([]sdk.Tool, error) +} + +// ---- argument parsing helpers ---- + +func StringArg(arguments map[string]any, key string) string { + if arguments == nil { + return "" + } + raw, ok := arguments[key] + if !ok { + return "" + } + switch value := raw.(type) { + case string: + return strings.TrimSpace(value) + default: + return strings.TrimSpace(fmt.Sprintf("%v", raw)) + } +} + +func FirstStringArg(arguments map[string]any, keys ...string) string { + for _, key := range keys { + if value := StringArg(arguments, key); value != "" { + return value + } + } + return "" +} + +func IntArg(arguments map[string]any, key string) (int, bool, error) { + if arguments == nil { + return 0, false, nil + } + raw, ok := arguments[key] + if !ok || raw == nil { + return 0, false, nil + } + switch value := raw.(type) { + case int: + return value, true, nil + case int64: + if value < int64(math.MinInt) || value > int64(math.MaxInt) { + return 0, true, fmt.Errorf("%s out of range", key) + } + return int(value), true, nil + case float64: + if math.IsNaN(value) || math.IsInf(value, 0) { + return 0, true, fmt.Errorf("%s must be a valid number", key) + } + if value < float64(math.MinInt) || value > float64(math.MaxInt) { + return 0, true, fmt.Errorf("%s out of range", key) + } + return int(value), true, nil + case json.Number: + i, err := value.Int64() + if err != nil { + return 0, true, fmt.Errorf("%s must be an integer", key) + } + if i < int64(math.MinInt) || i > int64(math.MaxInt) { + return 0, true, fmt.Errorf("%s out of range", key) + } + return int(i), true, nil + default: + return 0, true, fmt.Errorf("%s must be a number", key) + } +} + +func BoolArg(arguments map[string]any, key string) (bool, bool, error) { + if arguments == nil { + return false, false, nil + } + raw, ok := arguments[key] + if !ok || raw == nil { + return false, false, nil + } + value, ok := raw.(bool) + if !ok { + return false, true, fmt.Errorf("%s must be a boolean", key) + } + return value, true, nil +} + +func inputAsMap(input any) map[string]any { + args, ok := input.(map[string]any) + if ok { + return args + } + if input == nil { + return map[string]any{} + } + raw, _ := json.Marshal(input) + _ = 
json.Unmarshal(raw, &args) + if args == nil { + args = map[string]any{} + } + return args +} diff --git a/internal/agent/tools/web.go b/internal/agent/tools/web.go new file mode 100644 index 00000000..d701f30b --- /dev/null +++ b/internal/agent/tools/web.go @@ -0,0 +1,910 @@ +package tools + +import ( + "bytes" + "context" + "crypto/hmac" + "crypto/sha256" + "encoding/base64" + "encoding/hex" + "encoding/json" + "encoding/xml" + "errors" + "fmt" + "html" + "io" + "log/slog" + "net/http" + "net/url" + "regexp" + "sort" + "strconv" + "strings" + "time" + + sdk "github.com/memohai/twilight-ai/sdk" + + "github.com/memohai/memoh/internal/channel" + "github.com/memohai/memoh/internal/db/sqlc" + "github.com/memohai/memoh/internal/searchproviders" + "github.com/memohai/memoh/internal/settings" +) + +type WebProvider struct { + logger *slog.Logger + settings *settings.Service + searchProviders *searchproviders.Service +} + +func NewWebProvider(log *slog.Logger, settingsSvc *settings.Service, searchSvc *searchproviders.Service) *WebProvider { + if log == nil { + log = slog.Default() + } + return &WebProvider{ + logger: log.With(slog.String("tool", "web")), + settings: settingsSvc, + searchProviders: searchSvc, + } +} + +func (p *WebProvider) Tools(_ context.Context, session SessionContext) ([]sdk.Tool, error) { + if p.settings == nil || p.searchProviders == nil { + return nil, nil + } + sess := session + return []sdk.Tool{ + { + Name: "web_search", + Description: "Search web results via configured search provider.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "query": map[string]any{"type": "string", "description": "Search query"}, + "count": map[string]any{"type": "integer", "description": "Number of results, default 5"}, + }, + "required": []string{"query"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + return p.execWebSearch(ctx.Context, sess, inputAsMap(input)) + }, + }, + }, nil +} + +func (p *WebProvider) execWebSearch(ctx context.Context, session SessionContext, args map[string]any) (any, error) { + botID := strings.TrimSpace(session.BotID) + if botID == "" { + return nil, errors.New("bot_id is required") + } + botSettings, err := p.settings.GetBot(ctx, botID) + if err != nil { + return nil, err + } + searchProviderID := strings.TrimSpace(botSettings.SearchProviderID) + if searchProviderID == "" { + return nil, errors.New("search provider not configured for this bot") + } + provider, err := p.searchProviders.GetRawByID(ctx, searchProviderID) + if err != nil { + return nil, err + } + registerSearchProviderSecrets(provider) + + query := strings.TrimSpace(StringArg(args, "query")) + if query == "" { + return nil, errors.New("query is required") + } + count := 5 + if value, ok, err := IntArg(args, "count"); err != nil { + return nil, err + } else if ok && value > 0 { + count = value + } + if count > 20 { + count = 20 + } + return p.callSearch(ctx, provider.Provider, provider.Config, query, count) +} + +func (*WebProvider) callSearch(ctx context.Context, providerName string, configJSON []byte, query string, count int) (any, error) { + switch strings.TrimSpace(providerName) { + case string(searchproviders.ProviderBrave): + return callBraveSearch(ctx, configJSON, query, count) + case string(searchproviders.ProviderBing): + return callBingSearch(ctx, configJSON, query, count) + case string(searchproviders.ProviderGoogle): + return callGoogleSearch(ctx, configJSON, query, count) + case string(searchproviders.ProviderTavily): + return 
callTavilySearch(ctx, configJSON, query, count)
+	case string(searchproviders.ProviderSogou):
+		return callSogouSearch(ctx, configJSON, query, count)
+	case string(searchproviders.ProviderSerper):
+		return callSerperSearch(ctx, configJSON, query, count)
+	case string(searchproviders.ProviderSearXNG):
+		return callSearXNGSearch(ctx, configJSON, query, count)
+	case string(searchproviders.ProviderJina):
+		return callJinaSearch(ctx, configJSON, query, count)
+	case string(searchproviders.ProviderExa):
+		return callExaSearch(ctx, configJSON, query, count)
+	case string(searchproviders.ProviderBocha):
+		return callBochaSearch(ctx, configJSON, query, count)
+	case string(searchproviders.ProviderDuckDuckGo):
+		return callDuckDuckGoSearch(ctx, configJSON, query, count)
+	case string(searchproviders.ProviderYandex):
+		return callYandexSearch(ctx, configJSON, query, count)
+	default:
+		return nil, errors.New("unsupported search provider")
+	}
+}
+
+// ---- search provider implementations ----
+
+func callBraveSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) {
+	cfg := parseSearchConfig(configJSON)
+	endpoint := strings.TrimRight(firstNonEmpty(stringValue(cfg["base_url"]), "https://api.search.brave.com/res/v1/web/search"), "/")
+	reqURL, err := url.Parse(endpoint)
+	if err != nil {
+		return nil, errors.New("invalid search provider base_url")
+	}
+	params := reqURL.Query()
+	params.Set("q", query)
+	params.Set("count", strconv.Itoa(count))
+	reqURL.RawQuery = params.Encode()
+	timeout := parseSearchTimeout(configJSON, 15*time.Second)
+	client := &http.Client{Timeout: timeout}
+	req, err := http.NewRequestWithContext(ctx, http.MethodGet, reqURL.String(), nil)
+	if err != nil {
+		return nil, err
+	}
+	req.Header.Set("Accept", "application/json")
+	if apiKey := stringValue(cfg["api_key"]); strings.TrimSpace(apiKey) != "" {
+		req.Header.Set("X-Subscription-Token", strings.TrimSpace(apiKey))
+	}
+	resp, err := client.Do(req) //nolint:gosec // endpoint comes from operator-configured search provider settings
+	if err != nil {
+		return nil, err
+	}
+	defer func() { _ = resp.Body.Close() }()
+	body, err := io.ReadAll(resp.Body)
+	if err != nil {
+		return nil, err
+	}
+	if resp.StatusCode < 200 || resp.StatusCode >= 300 {
+		return nil, buildSearchHTTPError(resp.StatusCode, body)
+	}
+	var raw struct {
+		Web struct {
+			Results []struct {
+				Title, URL, Description string
+			} `json:"results"`
+		} `json:"web"`
+	}
+	if err := json.Unmarshal(body, &raw); err != nil {
+		return nil, errors.New("invalid search response")
+	}
+	return buildSearchResults(query, raw.Web.Results, func(r struct{ Title, URL, Description string }) map[string]any {
+		return map[string]any{"title": r.Title, "url": r.URL, "description": r.Description}
+	}), nil
+}
+
+func callBingSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) {
+	cfg := parseSearchConfig(configJSON)
+	endpoint := strings.TrimRight(firstNonEmpty(stringValue(cfg["base_url"]), "https://api.bing.microsoft.com/v7.0/search"), "/")
+	reqURL, err := url.Parse(endpoint)
+	if err != nil {
+		return nil, errors.New("invalid search provider base_url")
+	}
+	params := reqURL.Query()
+	params.Set("q", query)
+	params.Set("count", strconv.Itoa(count))
+	reqURL.RawQuery = params.Encode()
+	timeout := parseSearchTimeout(configJSON, 15*time.Second)
+	client := &http.Client{Timeout: timeout}
+	req, _ := http.NewRequestWithContext(ctx, http.MethodGet, reqURL.String(), nil)
+	req.Header.Set("Accept", "application/json")
+	if apiKey := stringValue(cfg["api_key"]); apiKey != "" {
req.Header.Set("Ocp-Apim-Subscription-Key", apiKey) + } + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + var raw struct { + WebPages struct { + Value []struct { + Name, URL, Snippet string + } `json:"value"` + } `json:"webPages"` + } + if err := json.Unmarshal(body, &raw); err != nil { + return nil, errors.New("invalid search response") + } + results := make([]map[string]any, 0, len(raw.WebPages.Value)) + for _, item := range raw.WebPages.Value { + results = append(results, map[string]any{"title": item.Name, "url": item.URL, "description": item.Snippet}) + } + return map[string]any{"query": query, "results": results}, nil +} + +func callGoogleSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) { + cfg := parseSearchConfig(configJSON) + endpoint := strings.TrimRight(firstNonEmpty(stringValue(cfg["base_url"]), "https://customsearch.googleapis.com/customsearch/v1"), "/") + reqURL, _ := url.Parse(endpoint) + cx := stringValue(cfg["cx"]) + if cx == "" { + return nil, errors.New("google custom search requires cx (search engine ID)") + } + if count > 10 { + count = 10 + } + params := reqURL.Query() + params.Set("q", query) + params.Set("cx", cx) + params.Set("num", strconv.Itoa(count)) + if apiKey := stringValue(cfg["api_key"]); apiKey != "" { + params.Set("key", apiKey) + } + reqURL.RawQuery = params.Encode() + timeout := parseSearchTimeout(configJSON, 15*time.Second) + client := &http.Client{Timeout: timeout} + req, _ := http.NewRequestWithContext(ctx, http.MethodGet, reqURL.String(), nil) + req.Header.Set("Accept", "application/json") + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + var raw struct { + Items []struct { + Title, Link, Snippet string + } `json:"items"` + } + if err := json.Unmarshal(body, &raw); err != nil { + return nil, errors.New("invalid search response") + } + results := make([]map[string]any, 0, len(raw.Items)) + for _, item := range raw.Items { + results = append(results, map[string]any{"title": item.Title, "url": item.Link, "description": item.Snippet}) + } + return map[string]any{"query": query, "results": results}, nil +} + +func callTavilySearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) { + cfg := parseSearchConfig(configJSON) + endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://api.tavily.com/search") + apiKey := stringValue(cfg["api_key"]) + if apiKey == "" { + return nil, errors.New("tavily API key is required") + } + payload, _ := json.Marshal(map[string]any{"query": query, "max_results": count}) + timeout := parseSearchTimeout(configJSON, 15*time.Second) + client := &http.Client{Timeout: timeout} + req, _ := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) + req.Header.Set("Content-Type", "application/json") + req.Header.Set("Accept", "application/json") + req.Header.Set("Authorization", "Bearer "+apiKey) + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 
200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + var raw struct { + Results []struct { + Title, URL, Content string + } `json:"results"` + } + if err := json.Unmarshal(body, &raw); err != nil { + return nil, errors.New("invalid search response") + } + results := make([]map[string]any, 0, len(raw.Results)) + for _, item := range raw.Results { + results = append(results, map[string]any{"title": item.Title, "url": item.URL, "description": item.Content}) + } + return map[string]any{"query": query, "results": results}, nil +} + +func callSogouSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) { + cfg := parseSearchConfig(configJSON) + host := firstNonEmpty(stringValue(cfg["base_url"]), "wsa.tencentcloudapi.com") + secretID := stringValue(cfg["secret_id"]) + secretKey := stringValue(cfg["secret_key"]) + if secretID == "" || secretKey == "" { + return nil, errors.New("sogou search requires Tencent Cloud SecretId and SecretKey") + } + action := "SearchPro" + version := "2025-05-08" + service := "wsa" + payload, _ := json.Marshal(map[string]any{"Query": query, "Mode": 0}) + now := time.Now().UTC() + timestamp := strconv.FormatInt(now.Unix(), 10) + date := now.Format("2006-01-02") + hashedPayload := sha256Hex(payload) + canonicalHeaders := fmt.Sprintf("content-type:%s\nhost:%s\n", "application/json", host) + signedHeaders := "content-type;host" + canonicalRequest := fmt.Sprintf("%s\n%s\n%s\n%s\n%s\n%s", "POST", "/", "", canonicalHeaders, signedHeaders, hashedPayload) + credentialScope := fmt.Sprintf("%s/%s/tc3_request", date, service) + stringToSign := fmt.Sprintf("TC3-HMAC-SHA256\n%s\n%s\n%s", timestamp, credentialScope, sha256Hex([]byte(canonicalRequest))) + secretDate := hmacSHA256([]byte("TC3"+secretKey), []byte(date)) + secretService := hmacSHA256(secretDate, []byte(service)) + secretSigning := hmacSHA256(secretService, []byte("tc3_request")) + signature := hex.EncodeToString(hmacSHA256(secretSigning, []byte(stringToSign))) + authorization := fmt.Sprintf("TC3-HMAC-SHA256 Credential=%s/%s, SignedHeaders=%s, Signature=%s", secretID, credentialScope, signedHeaders, signature) + timeout := parseSearchTimeout(configJSON, 15*time.Second) + client := &http.Client{Timeout: timeout} + req, _ := http.NewRequestWithContext(ctx, http.MethodPost, "https://"+host+"/", bytes.NewReader(payload)) + req.Header.Set("Content-Type", "application/json") + req.Header.Set("Authorization", authorization) + req.Header.Set("Host", host) + req.Header.Set("X-TC-Action", action) + req.Header.Set("X-TC-Version", version) + req.Header.Set("X-TC-Timestamp", timestamp) + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + var rawResp struct { + Response struct { + Error *struct{ Code, Message string } `json:"Error,omitempty"` + Pages []json.RawMessage `json:"Pages"` + } `json:"Response"` + } + if err := json.Unmarshal(body, &rawResp); err != nil { + return nil, errors.New("invalid search response") + } + if rawResp.Response.Error != nil { + return nil, fmt.Errorf("sogou search failed: %s", rawResp.Response.Error.Message) + } + type sogouPage struct { + Title, URL, Passage string + Score float64 `json:"scour"` + } + var pages []sogouPage + for _, raw := range rawResp.Response.Pages { + var rawStr string + if err := 
json.Unmarshal(raw, &rawStr); err == nil { + var page sogouPage + if json.Unmarshal([]byte(rawStr), &page) == nil { + pages = append(pages, page) + } + } else { + var page sogouPage + if json.Unmarshal(raw, &page) == nil { + pages = append(pages, page) + } + } + } + sort.Slice(pages, func(i, j int) bool { return pages[i].Score > pages[j].Score }) + results := make([]map[string]any, 0) + for i, page := range pages { + if i >= count { + break + } + results = append(results, map[string]any{"title": page.Title, "url": page.URL, "description": page.Passage}) + } + return map[string]any{"query": query, "results": results}, nil +} + +func callSerperSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) { + cfg := parseSearchConfig(configJSON) + endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://google.serper.dev/search") + apiKey := stringValue(cfg["api_key"]) + if apiKey == "" { + return nil, errors.New("serper API key is required") + } + payload, _ := json.Marshal(map[string]any{"q": query}) + timeout := parseSearchTimeout(configJSON, 15*time.Second) + client := &http.Client{Timeout: timeout} + req, _ := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) + req.Header.Set("Content-Type", "application/json") + req.Header.Set("Accept", "application/json") + req.Header.Set("X-API-KEY", apiKey) + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + var raw struct { + Organic []struct { + Title, Link, Description string + Position int + } `json:"organic"` + } + if err := json.Unmarshal(body, &raw); err != nil { + return nil, errors.New("invalid search response") + } + sort.Slice(raw.Organic, func(i, j int) bool { return raw.Organic[i].Position < raw.Organic[j].Position }) + results := make([]map[string]any, 0) + for i, item := range raw.Organic { + if i >= count { + break + } + results = append(results, map[string]any{"title": item.Title, "url": item.Link, "description": item.Description}) + } + return map[string]any{"query": query, "results": results}, nil +} + +func callSearXNGSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) { + cfg := parseSearchConfig(configJSON) + baseURL := stringValue(cfg["base_url"]) + if baseURL == "" { + return nil, errors.New("SearXNG base URL is required") + } + reqURL, _ := url.Parse(strings.TrimRight(baseURL, "/")) + params := reqURL.Query() + params.Set("q", query) + params.Set("format", "json") + params.Set("pageno", "1") + if lang := stringValue(cfg["language"]); lang != "" { + params.Set("language", lang) + } + if ss := stringValue(cfg["safesearch"]); ss != "" { + params.Set("safesearch", ss) + } + if cats := stringValue(cfg["categories"]); cats != "" { + params.Set("categories", cats) + } + reqURL.RawQuery = params.Encode() + timeout := parseSearchTimeout(configJSON, 15*time.Second) + client := &http.Client{Timeout: timeout} + req, _ := http.NewRequestWithContext(ctx, http.MethodGet, reqURL.String(), nil) + req.Header.Set("Accept", "application/json") + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + 
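Several providers in this file (Sogou, Serper, SearXNG) repeat the same re-rank-then-truncate pattern: sort by the provider's relevance signal, then keep at most `count` entries. A minimal generic sketch of that shared pattern — `topN` and its parameters are hypothetical, shown only for illustration; the provider functions inline the equivalent `sort.Slice` logic with their own score fields:

```go
package tools

import "sort"

// topN re-ranks items by a caller-supplied "better" comparison and keeps at
// most n of them. Illustrative sketch only; not part of this patch.
func topN[T any](items []T, n int, better func(a, b T) bool) []T {
	sort.Slice(items, func(i, j int) bool { return better(items[i], items[j]) })
	if n < len(items) {
		items = items[:n]
	}
	return items
}
```

Sogou and SearXNG would pass a descending-score comparison here, Serper an ascending-position one.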
var raw struct { + Results []struct { + Title, URL, Content string + Score float64 + } `json:"results"` + } + if err := json.Unmarshal(body, &raw); err != nil { + return nil, errors.New("invalid search response") + } + sort.Slice(raw.Results, func(i, j int) bool { return raw.Results[i].Score > raw.Results[j].Score }) + results := make([]map[string]any, 0) + for i, item := range raw.Results { + if i >= count { + break + } + results = append(results, map[string]any{"title": item.Title, "url": item.URL, "description": item.Content}) + } + return map[string]any{"query": query, "results": results}, nil +} + +func callJinaSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) { + cfg := parseSearchConfig(configJSON) + endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://s.jina.ai/") + apiKey := stringValue(cfg["api_key"]) + if apiKey == "" { + return nil, errors.New("jina API key is required") + } + if count > 10 { + count = 10 + } + payload, _ := json.Marshal(map[string]any{"q": query, "count": count}) + timeout := parseSearchTimeout(configJSON, 15*time.Second) + client := &http.Client{Timeout: timeout} + req, _ := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) + req.Header.Set("Content-Type", "application/json") + req.Header.Set("Accept", "application/json") + req.Header.Set("X-Retain-Images", "none") + req.Header.Set("Authorization", apiKey) + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + var raw struct { + Data []struct{ Title, URL, Content string } `json:"data"` + } + if err := json.Unmarshal(body, &raw); err != nil { + return nil, errors.New("invalid search response") + } + results := make([]map[string]any, 0, len(raw.Data)) + for _, item := range raw.Data { + results = append(results, map[string]any{"title": item.Title, "url": item.URL, "description": item.Content}) + } + return map[string]any{"query": query, "results": results}, nil +} + +func callExaSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) { + cfg := parseSearchConfig(configJSON) + endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://api.exa.ai/search") + apiKey := stringValue(cfg["api_key"]) + if apiKey == "" { + return nil, errors.New("exa API key is required") + } + payload, _ := json.Marshal(map[string]any{"query": query, "numResults": count, "contents": map[string]any{"text": true, "highlights": true}, "type": "auto"}) + timeout := parseSearchTimeout(configJSON, 15*time.Second) + client := &http.Client{Timeout: timeout} + req, _ := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) + req.Header.Set("Content-Type", "application/json") + req.Header.Set("Accept", "application/json") + req.Header.Set("Authorization", "Bearer "+apiKey) + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + var raw struct { + Results []struct{ Title, URL, Text string } `json:"results"` + } + if err := json.Unmarshal(body, &raw); err != nil { + return nil, errors.New("invalid search response") + } + results := make([]map[string]any, 0, 
len(raw.Results)) + for _, item := range raw.Results { + results = append(results, map[string]any{"title": item.Title, "url": item.URL, "description": item.Text}) + } + return map[string]any{"query": query, "results": results}, nil +} + +func callBochaSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) { + cfg := parseSearchConfig(configJSON) + endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://api.bochaai.com/v1/web-search") + apiKey := stringValue(cfg["api_key"]) + if apiKey == "" { + return nil, errors.New("bocha API key is required") + } + payload, _ := json.Marshal(map[string]any{"query": query, "summary": true, "freshness": "noLimit", "count": count}) + timeout := parseSearchTimeout(configJSON, 15*time.Second) + client := &http.Client{Timeout: timeout} + req, _ := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) + req.Header.Set("Content-Type", "application/json") + req.Header.Set("Accept", "application/json") + req.Header.Set("Authorization", "Bearer "+apiKey) + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + var raw struct { + Data struct { + WebPages struct { + Value []struct{ Name, URL, Summary string } `json:"value"` + } `json:"webPages"` + } `json:"data"` + } + if err := json.Unmarshal(body, &raw); err != nil { + return nil, errors.New("invalid search response") + } + results := make([]map[string]any, 0, len(raw.Data.WebPages.Value)) + for _, item := range raw.Data.WebPages.Value { + results = append(results, map[string]any{"title": item.Name, "url": item.URL, "description": item.Summary}) + } + return map[string]any{"query": query, "results": results}, nil +} + +func callDuckDuckGoSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) { + cfg := parseSearchConfig(configJSON) + endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://html.duckduckgo.com/html/") + timeout := parseSearchTimeout(configJSON, 15*time.Second) + client := &http.Client{Timeout: timeout} + form := url.Values{} + form.Set("q", query) + form.Set("b", "") + form.Set("kl", "") + req, _ := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, strings.NewReader(form.Encode())) + req.Header.Set("Content-Type", "application/x-www-form-urlencoded") + req.Header.Set("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36") + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + htmlStr := string(body) + links := ddgResultLinkRe.FindAllStringSubmatch(htmlStr, -1) + titles := ddgResultTitleRe.FindAllStringSubmatch(htmlStr, -1) + snippets := ddgResultSnippetRe.FindAllStringSubmatch(htmlStr, -1) + n := len(links) + if len(titles) < n { + n = len(titles) + } + if count < n { + n = count + } + results := make([]map[string]any, 0, n) + for i := 0; i < n; i++ { + rawURL := html.UnescapeString(links[i][1]) + realURL := extractDDGURL(rawURL) + title := html.UnescapeString(strings.TrimSpace(titles[i][1])) + snippet := "" + if i < len(snippets) { + snippet = 
html.UnescapeString(strings.TrimSpace(ddgHTMLTagRe.ReplaceAllString(snippets[i][1], ""))) + } + if realURL == "" { + continue + } + results = append(results, map[string]any{"title": title, "url": realURL, "description": snippet}) + } + return map[string]any{"query": query, "results": results}, nil +} + +func callYandexSearch(ctx context.Context, configJSON []byte, query string, count int) (any, error) { + cfg := parseSearchConfig(configJSON) + endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://searchapi.api.cloud.yandex.net/v2/web/search") + apiKey := stringValue(cfg["api_key"]) + if apiKey == "" { + return nil, errors.New("yandex API key is required") + } + searchType := firstNonEmpty(stringValue(cfg["search_type"]), "SEARCH_TYPE_RU") + payload, _ := json.Marshal(map[string]any{ + "query": map[string]any{"queryText": query, "searchType": searchType}, + "groupSpec": map[string]any{"groupMode": "GROUP_MODE_DEEP", "groupsOnPage": count, "docsInGroup": 1}, + }) + timeout := parseSearchTimeout(configJSON, 15*time.Second) + client := &http.Client{Timeout: timeout} + req, _ := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) + req.Header.Set("Content-Type", "application/json") + req.Header.Set("Authorization", "Api-Key "+apiKey) + resp, err := client.Do(req) //nolint:gosec + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + body, _ := io.ReadAll(resp.Body) + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, buildSearchHTTPError(resp.StatusCode, body) + } + var rawResp struct { + RawData string `json:"rawData"` + } + if err := json.Unmarshal(body, &rawResp); err != nil { + return nil, errors.New("invalid search response") + } + xmlData, err := base64.StdEncoding.DecodeString(rawResp.RawData) + if err != nil { + return nil, errors.New("failed to decode Yandex response") + } + results, err := parseYandexXML(xmlData) + if err != nil { + return nil, errors.New("failed to parse Yandex XML response") + } + return map[string]any{"query": query, "results": results}, nil +} + +// ---- helpers ---- + +func buildSearchResults[T any](query string, items []T, mapper func(T) map[string]any) map[string]any { + results := make([]map[string]any, 0, len(items)) + for _, item := range items { + results = append(results, mapper(item)) + } + return map[string]any{"query": query, "results": results} +} + +func buildSearchHTTPError(statusCode int, body []byte) error { + detail := extractJSONErrorMessage(body) + if detail == "" { + detail = strings.TrimSpace(string(body)) + } + if len(detail) > 200 { + detail = detail[:200] + "..." 
+ } + if detail != "" { + return fmt.Errorf("search request failed (HTTP %d): %s", statusCode, detail) + } + return fmt.Errorf("search request failed (HTTP %d)", statusCode) +} + +func extractJSONErrorMessage(body []byte) string { + var obj map[string]any + if json.Unmarshal(body, &obj) != nil { + return "" + } + for _, key := range []string{"error", "message", "detail", "error_message"} { + v, ok := obj[key] + if !ok { + continue + } + switch val := v.(type) { + case string: + return val + case map[string]any: + if msg, ok := val["message"].(string); ok { + return msg + } + } + } + return "" +} + +func parseSearchTimeout(configJSON []byte, fallback time.Duration) time.Duration { + cfg := parseSearchConfig(configJSON) + raw, ok := cfg["timeout_seconds"] + if !ok { + return fallback + } + switch value := raw.(type) { + case float64: + if value > 0 { + return time.Duration(value * float64(time.Second)) + } + case int: + if value > 0 { + return time.Duration(value) * time.Second + } + } + return fallback +} + +func parseSearchConfig(configJSON []byte) map[string]any { + if len(configJSON) == 0 { + return map[string]any{} + } + var cfg map[string]any + if err := json.Unmarshal(configJSON, &cfg); err != nil || cfg == nil { + return map[string]any{} + } + return cfg +} + +func stringValue(raw any) string { + if value, ok := raw.(string); ok { + return strings.TrimSpace(value) + } + return "" +} + +func firstNonEmpty(values ...string) string { + for _, value := range values { + if strings.TrimSpace(value) != "" { + return strings.TrimSpace(value) + } + } + return "" +} + +func sha256Hex(data []byte) string { + h := sha256.Sum256(data) + return hex.EncodeToString(h[:]) +} + +func hmacSHA256(key, data []byte) []byte { + h := hmac.New(sha256.New, key) + h.Write(data) + return h.Sum(nil) +} + +var searchProviderSecretFields = []string{"api_key", "secret_id", "secret_key"} + +func registerSearchProviderSecrets(provider sqlc.SearchProvider) { + cfg := parseSearchConfig(provider.Config) + var secrets []string + for _, key := range searchProviderSecretFields { + if v := stringValue(cfg[key]); v != "" { + secrets = append(secrets, v) + } + } + if len(secrets) > 0 { + channel.SetIMErrorSecrets("search:"+provider.ID.String(), secrets...) 
+ } +} + +var ( + ddgResultLinkRe = regexp.MustCompile(`class="result__a"[^>]*href="([^"]+)"`) + ddgResultTitleRe = regexp.MustCompile(`class="result__a"[^>]*>([^<]+)<`) + ddgResultSnippetRe = regexp.MustCompile(`class="result__snippet"[^>]*>([\s\S]*?)</a>`) + ddgHTMLTagRe = regexp.MustCompile(`<[^>]*>`) +) + +func extractDDGURL(rawURL string) string { + if strings.Contains(rawURL, "uddg=") { + parsed, err := url.Parse(rawURL) + if err == nil { + if uddg := parsed.Query().Get("uddg"); uddg != "" { + return uddg + } + } + } + if strings.HasPrefix(rawURL, "//") { + return "https:" + rawURL + } + return rawURL +} + +type xmlInnerText string + +func (t *xmlInnerText) UnmarshalXML(d *xml.Decoder, _ xml.StartElement) error { + var buf strings.Builder + for { + tok, err := d.Token() + if err != nil { + break + } + switch v := tok.(type) { + case xml.CharData: + buf.Write(v) + case xml.StartElement: + var inner xmlInnerText + if err := d.DecodeElement(&inner, &v); err != nil { + return err + } + buf.WriteString(string(inner)) + case xml.EndElement: + *t = xmlInnerText(buf.String()) + return nil + } + } + *t = xmlInnerText(buf.String()) + return nil +} + +type yandexResponse struct { + XMLName xml.Name `xml:"response"` + Results yandexResults `xml:"results"` +} +type yandexResults struct { + Grouping yandexGrouping `xml:"grouping"` +} +type yandexGrouping struct { + Groups []yandexGroup `xml:"group"` +} +type yandexGroup struct { + Doc yandexDoc `xml:"doc"` +} +type yandexDoc struct { + URL xmlInnerText `xml:"url"` + Title xmlInnerText `xml:"title"` + Passages yandexPassages `xml:"passages"` +} +type yandexPassages struct { + Passage []xmlInnerText `xml:"passage"` +} + +func parseYandexXML(data []byte) ([]map[string]any, error) { + var resp yandexResponse + if err := xml.Unmarshal(data, &resp); err != nil { + return nil, err + } + results := make([]map[string]any, 0, len(resp.Results.Grouping.Groups)) + for _, group := range resp.Results.Grouping.Groups { + snippet := "" + if len(group.Doc.Passages.Passage) > 0 { + snippet = string(group.Doc.Passages.Passage[0]) + } + results = append(results, map[string]any{"title": string(group.Doc.Title), "url": string(group.Doc.URL), "description": snippet}) + } + return results, nil +} diff --git a/internal/agent/tools/webfetch.go b/internal/agent/tools/webfetch.go new file mode 100644 index 00000000..214c3184 --- /dev/null +++ b/internal/agent/tools/webfetch.go @@ -0,0 +1,182 @@ +package tools + +import ( + "context" + "encoding/json" + "errors" + "fmt" + "io" + "log/slog" + "net/http" + "net/url" + "strings" + "time" + + htmltomarkdown "github.com/JohannesKaufmann/html-to-markdown/v2" + readability "github.com/go-shiori/go-readability" + sdk "github.com/memohai/twilight-ai/sdk" +) + +const ( + webFetchMaxTextContent = 10000 + webFetchTimeout = 30 * time.Second + webFetchUserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36" +) + +type WebFetchProvider struct { + logger *slog.Logger + client *http.Client +} + +func NewWebFetchProvider(log *slog.Logger) *WebFetchProvider { + if log == nil { + log = slog.Default() + } + return &WebFetchProvider{ + logger: log.With(slog.String("tool", "webfetch")), + client: &http.Client{Timeout: webFetchTimeout}, + } +} + +func (p *WebFetchProvider) Tools(_ context.Context, _ SessionContext) ([]sdk.Tool, error) { + return []sdk.Tool{ + { + Name: "web_fetch", + Description: "Fetch a URL and convert the response to readable content. 
Supports HTML (converts to Markdown), JSON, XML, and plain text formats.", + Parameters: map[string]any{ + "type": "object", + "properties": map[string]any{ + "url": map[string]any{ + "type": "string", + "description": "The URL to fetch", + }, + "format": map[string]any{ + "type": "string", + "enum": []string{"auto", "markdown", "json", "xml", "text"}, + "description": "Output format (default: auto - detects from content type)", + }, + }, + "required": []string{"url"}, + }, + Execute: func(ctx *sdk.ToolExecContext, input any) (any, error) { + args := inputAsMap(input) + rawURL := strings.TrimSpace(StringArg(args, "url")) + if rawURL == "" { + return nil, errors.New("url is required") + } + format := strings.TrimSpace(StringArg(args, "format")) + if format == "" { + format = "auto" + } + return p.callWebFetch(ctx.Context, rawURL, format) + }, + }, + }, nil +} + +func (p *WebFetchProvider) callWebFetch(ctx context.Context, rawURL, format string) (any, error) { + req, err := http.NewRequestWithContext(ctx, http.MethodGet, rawURL, nil) + if err != nil { + return nil, fmt.Errorf("invalid url: %w", err) + } + req.Header.Set("User-Agent", webFetchUserAgent) + + resp, err := p.client.Do(req) //nolint:gosec // intentionally fetches user-specified URLs + if err != nil { + return nil, err + } + defer func() { _ = resp.Body.Close() }() + + if resp.StatusCode < 200 || resp.StatusCode >= 300 { + return nil, fmt.Errorf("http error: %d %s", resp.StatusCode, resp.Status) + } + + contentType := resp.Header.Get("Content-Type") + detected := format + if format == "auto" { + detected = detectWebFetchFormat(contentType) + } + + body, err := io.ReadAll(resp.Body) + if err != nil { + return nil, err + } + + switch detected { + case "json": + return p.processJSON(rawURL, contentType, body) + case "xml": + return p.processXML(rawURL, contentType, body) + case "markdown": + return p.processHTML(rawURL, contentType, body) + default: + return p.processText(rawURL, contentType, body) + } +} + +func detectWebFetchFormat(contentType string) string { + ct := strings.ToLower(contentType) + switch { + case strings.Contains(ct, "application/json"): + return "json" + case strings.Contains(ct, "application/xml"), strings.Contains(ct, "text/xml"): + return "xml" + case strings.Contains(ct, "text/html"): + return "markdown" + default: + return "text" + } +} + +func (*WebFetchProvider) processJSON(fetchedURL, contentType string, body []byte) (any, error) { + var data any + if err := json.Unmarshal(body, &data); err != nil { + return nil, errors.New("failed to parse json") + } + return map[string]any{"success": true, "url": fetchedURL, "format": "json", "contentType": contentType, "data": data}, nil +} + +func (*WebFetchProvider) processXML(fetchedURL, contentType string, body []byte) (any, error) { + content := string(body) + if len(content) > webFetchMaxTextContent { + content = content[:webFetchMaxTextContent] + } + return map[string]any{"success": true, "url": fetchedURL, "format": "xml", "contentType": contentType, "content": content}, nil +} + +func (p *WebFetchProvider) processHTML(fetchedURL, contentType string, body []byte) (any, error) { + parsed, err := url.Parse(fetchedURL) + if err != nil { + parsed = &url.URL{} + } + article, err := readability.FromReader(strings.NewReader(string(body)), parsed) + if err != nil { + return nil, fmt.Errorf("failed to extract readable content from html: %w", err) + } + if strings.TrimSpace(article.Content) == "" { + return nil, errors.New("failed to extract readable content from html") 
+ } + markdown, err := htmltomarkdown.ConvertString(article.Content) + if err != nil { + p.logger.Warn("html-to-markdown conversion failed, falling back to text", slog.Any("error", err)) + markdown = article.TextContent + } + textPreview := article.TextContent + if len(textPreview) > 500 { + textPreview = textPreview[:500] + } + return map[string]any{ + "success": true, "url": fetchedURL, "format": "markdown", "contentType": contentType, + "title": article.Title, "byline": article.Byline, "excerpt": article.Excerpt, + "content": markdown, "textContent": textPreview, "length": article.Length, + }, nil +} + +func (*WebFetchProvider) processText(fetchedURL, contentType string, body []byte) (any, error) { + content := string(body) + length := len(content) + if length > webFetchMaxTextContent { + content = content[:webFetchMaxTextContent] + } + return map[string]any{"success": true, "url": fetchedURL, "format": "text", "contentType": contentType, "content": content, "length": length}, nil +} diff --git a/internal/agent/types.go b/internal/agent/types.go new file mode 100644 index 00000000..b2c81574 --- /dev/null +++ b/internal/agent/types.go @@ -0,0 +1,158 @@ +package agent + +import ( + "encoding/json" + "time" + + sdk "github.com/memohai/twilight-ai/sdk" +) + +// SessionContext carries request-scoped identity and routing information. +type SessionContext struct { + BotID string + ChatID string + ChannelIdentityID string + DisplayName string + CurrentPlatform string + ReplyTarget string + ConversationType string + SessionToken string //nolint:gosec // carries session credential material at runtime + IsSubagent bool +} + +// SkillEntry represents a skill loaded from the bot container. +type SkillEntry struct { + Name string + Description string + Content string + Metadata map[string]any +} + +// InboxItem represents an unread inbox notification. +type InboxItem struct { + ID string `json:"id"` + Source string `json:"source"` + Header map[string]any `json:"header"` + Content string `json:"content"` + CreatedAt string `json:"createdAt"` +} + +// Schedule represents a scheduled task definition. +type Schedule struct { + ID string `json:"id"` + Name string `json:"name"` + Description string `json:"description"` + Pattern string `json:"pattern"` + MaxCalls *int `json:"maxCalls,omitempty"` + Command string `json:"command"` +} + +// LoopDetectionConfig controls loop detection behavior. +type LoopDetectionConfig struct { + Enabled bool +} + +// RunConfig holds everything needed for a single agent invocation. +type RunConfig struct { + Model *sdk.Model + ReasoningEffort string + Messages []sdk.Message + Query string + System string + Tools []sdk.Tool + Channels []string + CurrentChannel string + Identity SessionContext + Skills []SkillEntry + EnabledSkillNames []string + Inbox []InboxItem + LoopDetection LoopDetectionConfig + ActiveContextTime int +} + +// GenerateResult holds the result of a non-streaming agent invocation. +type GenerateResult struct { + Messages []sdk.Message + Text string + Attachments []FileAttachment + Reactions []ReactionItem + Speeches []SpeechItem + Skills []string + Usage *sdk.Usage +} + +// FileAttachment represents a file reference extracted from agent output. +type FileAttachment struct { + Type string `json:"type"` + Path string `json:"path,omitempty"` + URL string `json:"url,omitempty"` + Mime string `json:"mime,omitempty"` + Name string `json:"name,omitempty"` +} + +// ReactionItem represents an emoji reaction extracted from agent output. 
+type ReactionItem struct { + Emoji string `json:"emoji"` +} + +// SpeechItem represents a TTS request extracted from agent output. +type SpeechItem struct { + Text string `json:"text"` +} + +// SystemFile is a file loaded from the bot container for prompt generation. +type SystemFile struct { + Filename string + Content string +} + +// ModelConfig holds provider and model information resolved from DB. +type ModelConfig struct { + ModelID string + ClientType string + InputModalities []string + APIKey string //nolint:gosec // carries provider credential material at runtime + BaseURL string + ReasoningConfig *ReasoningConfig +} + +// ReasoningConfig controls extended thinking/reasoning behavior. +type ReasoningConfig struct { + Enabled bool + Effort string +} + +func mustMarshal(v any) json.RawMessage { + data, err := json.Marshal(v) + if err != nil { + return nil + } + return data +} + +// StripTagsFromMessages strips attachment/reaction/speech tags from assistant messages. +func StripTagsFromMessages(msgs []sdk.Message) []sdk.Message { + resolvers := DefaultTagResolvers() + result := make([]sdk.Message, 0, len(msgs)) + for _, msg := range msgs { + if msg.Role != sdk.MessageRoleAssistant { + result = append(result, msg) + continue + } + cleaned := make([]sdk.MessagePart, 0, len(msg.Content)) + for _, part := range msg.Content { + if tp, ok := part.(sdk.TextPart); ok { + text, _ := ExtractTagsFromText(tp.Text, resolvers) + cleaned = append(cleaned, sdk.TextPart{Text: text}) + } else { + cleaned = append(cleaned, part) + } + } + msg.Content = cleaned + result = append(result, msg) + } + return result +} + +// TimeNow is a hook for testing. Defaults to time.Now. +var TimeNow = time.Now diff --git a/internal/bun/runtime/manager.go b/internal/bun/runtime/manager.go deleted file mode 100644 index e858e4e5..00000000 --- a/internal/bun/runtime/manager.go +++ /dev/null @@ -1,259 +0,0 @@ -package runtime - -import ( - "context" - "errors" - "fmt" - "io" - "io/fs" - "log/slog" - "net/http" - "os" - "os/exec" - "path/filepath" - "runtime" - "sync" - "syscall" - "time" - - "github.com/BurntSushi/toml" - - "github.com/memohai/memoh/internal/config" - "github.com/memohai/memoh/internal/embedded" -) - -type Manager struct { - log *slog.Logger - cfg config.Config - host string - port int - workdir string - cmd *exec.Cmd - stopOnce sync.Once -} - -const ( - defaultGatewayHost = "127.0.0.1" - defaultGatewayPort = 8081 - agentConfigFileName = "config.toml" - agentBinName = "agent-bin" - agentUnavailableMarker = "UNAVAILABLE" - healthCheckTimeout = 30 * time.Second - healthCheckRetryBackoff = 400 * time.Millisecond - processStopTimeout = 5 * time.Second -) - -func NewManager(log *slog.Logger, cfg config.Config) *Manager { - host := cfg.AgentGateway.Host - if host == "" { - host = defaultGatewayHost - } - port := cfg.AgentGateway.Port - if port == 0 { - port = defaultGatewayPort - } - return &Manager{ - log: log.With(slog.String("component", "agent-runtime")), - cfg: cfg, - host: host, - port: port, - } -} - -func (m *Manager) Start(ctx context.Context) error { - workdir, err := os.MkdirTemp("", "memoh-agent-runtime-*") - if err != nil { - return fmt.Errorf("create runtime temp dir: %w", err) - } - m.workdir = workdir - - agentFS, err := embedded.AgentFS() - if err != nil { - return err - } - - agentDir := filepath.Join(workdir, "agent") - if err := extractFS(agentFS, agentDir); err != nil { - return fmt.Errorf("extract agent assets: %w", err) - } - - agentBinPath := filepath.Join(agentDir, 
agentBinaryNameForRuntime()) - if _, err := os.Stat(agentBinPath); err != nil { - if errors.Is(err, os.ErrNotExist) { - markerPath := filepath.Join(agentDir, agentUnavailableMarker) - if _, markerErr := os.Stat(markerPath); markerErr == nil { - m.log.Warn("bundled agent binary unavailable for current platform; falling back to configured agent gateway", slog.String("platform", runtimePlatform())) - return nil - } - } - return fmt.Errorf("agent binary missing: %w", err) - } - if err := os.Chmod(agentBinPath, 0o755); err != nil { //nolint:gosec // G302: executable binary requires execute bit; 0600 would make it non-executable - return fmt.Errorf("chmod agent binary: %w", err) - } - agentConfigPath := filepath.Join(agentDir, agentConfigFileName) - if err := writeAgentConfig(agentConfigPath, m.cfg); err != nil { - return err - } - - cmd := exec.CommandContext(ctx, agentBinPath) //nolint:gosec // G204: path is constructed internally from an embedded asset, not user input - cmd.Dir = agentDir - cmd.Env = append( - os.Environ(), - "MEMOH_CONFIG_PATH="+agentConfigPath, - "CONFIG_PATH="+agentConfigPath, - ) - cmd.Stdout = &logWriter{log: m.log, level: slog.LevelInfo} - cmd.Stderr = &logWriter{log: m.log, level: slog.LevelError} - - if err := cmd.Start(); err != nil { - return fmt.Errorf("start bundled agent runtime: %w", err) - } - m.cmd = cmd - - m.log.Info("bundled agent runtime started", slog.Int("pid", cmd.Process.Pid), slog.String("addr", m.address())) - if err := m.waitHealthy(ctx); err != nil { - return err - } - return nil -} - -func (m *Manager) Stop(ctx context.Context) error { - var retErr error - m.stopOnce.Do(func() { - if m.cmd == nil || m.cmd.Process == nil { - return - } - - _ = m.cmd.Process.Signal(os.Interrupt) - done := make(chan error, 1) - go func() { - done <- m.cmd.Wait() - }() - - select { - case err := <-done: - if err != nil && !errors.Is(err, syscall.EINTR) { - retErr = err - } - case <-ctx.Done(): - _ = m.cmd.Process.Kill() - retErr = ctx.Err() - case <-time.After(processStopTimeout): - _ = m.cmd.Process.Kill() - <-done - } - - if m.workdir != "" { - _ = os.RemoveAll(m.workdir) - } - }) - return retErr -} - -func (m *Manager) waitHealthy(ctx context.Context) error { - client := &http.Client{Timeout: 2 * time.Second} - healthURL := fmt.Sprintf("http://%s/health", m.address()) - deadline := time.Now().Add(healthCheckTimeout) - for time.Now().Before(deadline) { - req, _ := http.NewRequestWithContext(ctx, http.MethodGet, healthURL, nil) - resp, err := client.Do(req) //nolint:gosec // G704: URL is constructed from operator-configured host/port, not from user input - if err == nil { - _ = resp.Body.Close() - if resp.StatusCode >= 200 && resp.StatusCode < 300 { - return nil - } - } - time.Sleep(healthCheckRetryBackoff) - } - return fmt.Errorf("bundled agent runtime health check timeout: %s", healthURL) -} - -func (m *Manager) address() string { - return fmt.Sprintf("%s:%d", m.host, m.port) -} - -func extractFS(src fs.FS, targetDir string) error { - if err := os.MkdirAll(targetDir, 0o750); err != nil { - return err - } - return fs.WalkDir(src, ".", func(path string, d fs.DirEntry, err error) error { - if err != nil { - return err - } - if path == "." 
{ - return nil - } - target := filepath.Join(targetDir, path) - if d.IsDir() { - return os.MkdirAll(target, 0o750) - } - - if err := os.MkdirAll(filepath.Dir(target), 0o750); err != nil { - return err - } - r, err := src.Open(path) - if err != nil { - return err - } - defer func() { _ = r.Close() }() - - w, err := os.OpenFile(target, os.O_CREATE|os.O_TRUNC|os.O_WRONLY, 0o600) //nolint:gosec // G304: target is derived from an embedded FS walk within a process-owned temp dir - if err != nil { - return err - } - if _, err := io.Copy(w, r); err != nil { - _ = w.Close() - return err - } - return w.Close() - }) -} - -func writeAgentConfig(path string, cfg config.Config) error { - if err := os.MkdirAll(filepath.Dir(path), 0o750); err != nil { - return err - } - f, err := os.Create(path) //nolint:gosec // G304: path is constructed internally from a process-owned temp dir - if err != nil { - return fmt.Errorf("create agent config: %w", err) - } - defer func() { _ = f.Close() }() - return toml.NewEncoder(f).Encode(cfg) -} - -type logWriter struct { - log *slog.Logger - level slog.Level -} - -func (w *logWriter) Write(p []byte) (n int, err error) { - msg := string(p) - msg = trimTrailingNewline(msg) - if msg != "" { - w.log.LogAttrs(context.Background(), w.level, "runtime process output", slog.String("detail", msg)) - } - return len(p), nil -} - -func trimTrailingNewline(s string) string { - for len(s) > 0 { - last := s[len(s)-1] - if last != '\n' && last != '\r' { - break - } - s = s[:len(s)-1] - } - return s -} - -func runtimePlatform() string { - return fmt.Sprintf("%s/%s", runtime.GOOS, runtime.GOARCH) -} - -func agentBinaryNameForRuntime() string { - if runtime.GOOS == "windows" { - return agentBinName + ".exe" - } - return agentBinName -} diff --git a/internal/channel/inbound/channel.go b/internal/channel/inbound/channel.go index abb29daf..324f7536 100644 --- a/internal/channel/inbound/channel.go +++ b/internal/channel/inbound/channel.go @@ -862,12 +862,12 @@ func contentPartText(part conversation.ContentPart) string { return "" } -type gatewayStreamEnvelope struct { +// agentStreamEnvelope is the JSON shape produced by internal/agent.StreamEvent. +type agentStreamEnvelope struct { Type string `json:"type"` Delta string `json:"delta"` Error string `json:"error"` Message string `json:"message"` - Image string `json:"image"` Data json.RawMessage `json:"data"` Messages []conversation.ModelMessage `json:"messages"` @@ -880,26 +880,16 @@ type gatewayStreamEnvelope struct { Speeches json.RawMessage `json:"speeches"` } -type gatewayStreamDoneData struct { - Messages []conversation.ModelMessage `json:"messages"` -} - func mapStreamChunkToChannelEvents(chunk conversation.StreamChunk) ([]channel.StreamEvent, []conversation.ModelMessage, error) { if len(chunk) == 0 { return nil, nil, nil } - var envelope gatewayStreamEnvelope + var envelope agentStreamEnvelope if err := json.Unmarshal(chunk, &envelope); err != nil { return nil, nil, err } finalMessages := make([]conversation.ModelMessage, 0, len(envelope.Messages)) finalMessages = append(finalMessages, envelope.Messages...) - if len(finalMessages) == 0 && len(envelope.Data) > 0 { - var done gatewayStreamDoneData - if err := json.Unmarshal(envelope.Data, &done); err == nil && len(done.Messages) > 0 { - finalMessages = append(finalMessages, done.Messages...) 
- } - } eventType := strings.ToLower(strings.TrimSpace(envelope.Type)) switch eventType { case "text_delta": @@ -1687,6 +1677,7 @@ func (p *ChannelInboundProcessor) ingestOutboundAttachments(ctx context.Context, result = append(result, item) continue } + sourceURL := item.URL item.ContentHash = asset.ContentHash item.URL = "" item.Base64 = "" @@ -1695,6 +1686,12 @@ func (p *ChannelInboundProcessor) ingestOutboundAttachments(ctx context.Context, } item.Metadata["bot_id"] = botID item.Metadata["storage_key"] = asset.StorageKey + if n := strings.TrimSpace(item.Name); n != "" { + item.Metadata["name"] = n + } + if su := strings.TrimSpace(sourceURL); su != "" && !isDataURL(su) { + item.Metadata["source_url"] = su + } if strings.TrimSpace(item.Mime) == "" { item.Mime = attachment.NormalizeMime(asset.Mime) } @@ -1720,12 +1717,14 @@ func isHTTPURL(raw string) bool { // For non-media-marker paths, it ingests the file into the media store first. // Returns true if the asset was resolved and item was updated. func (p *ChannelInboundProcessor) resolveContainerPathAsset(ctx context.Context, botID, accessPath string, item *channel.Attachment) bool { + sourcePath := accessPath + // Try media marker lookup first. storageKey := extractStorageKey(accessPath, botID) if storageKey != "" { asset, err := p.mediaService.GetByStorageKey(ctx, botID, storageKey) if err == nil { - applyAssetToAttachment(asset, botID, item) + applyAssetToAttachment(asset, botID, item, sourcePath) return true } } @@ -1743,14 +1742,15 @@ func (p *ChannelInboundProcessor) resolveContainerPathAsset(ctx context.Context, } return false } - applyAssetToAttachment(asset, botID, item) + applyAssetToAttachment(asset, botID, item, sourcePath) return true } return false } -func applyAssetToAttachment(asset media.Asset, botID string, item *channel.Attachment) { +func applyAssetToAttachment(asset media.Asset, botID string, item *channel.Attachment, sourcePath string) { + sourceURL := item.URL item.ContentHash = asset.ContentHash item.URL = "" if item.Metadata == nil { @@ -1758,13 +1758,21 @@ func applyAssetToAttachment(asset media.Asset, botID string, item *channel.Attac } item.Metadata["bot_id"] = botID item.Metadata["storage_key"] = asset.StorageKey + if n := strings.TrimSpace(item.Name); n != "" { + item.Metadata["name"] = n + } + if sp := strings.TrimSpace(sourcePath); sp != "" { + item.Metadata["source_path"] = sp + } + if su := strings.TrimSpace(sourceURL); su != "" && !isDataURL(su) { + item.Metadata["source_url"] = su + } if strings.TrimSpace(item.Mime) == "" { item.Mime = attachment.NormalizeMime(asset.Mime) } if item.Size == 0 && asset.SizeBytes > 0 { item.Size = asset.SizeBytes } - // Infer a better attachment type from MIME when the TS side sent a generic "file". 
if item.Type == channel.AttachmentFile || item.Type == "" { item.Type = inferAttachmentTypeFromMime(strings.TrimSpace(item.Mime)) } @@ -2034,6 +2042,8 @@ func buildAssetRefs(attachments []channel.Attachment, startOrdinal int) []conver Ordinal: startOrdinal + len(refs), Mime: strings.TrimSpace(att.Mime), SizeBytes: att.Size, + Name: strings.TrimSpace(att.Name), + Metadata: att.Metadata, } if att.Metadata != nil { if sk, ok := att.Metadata["storage_key"].(string); ok { diff --git a/internal/command/formatter.go b/internal/command/formatter.go index ec813aca..0b03996f 100644 --- a/internal/command/formatter.go +++ b/internal/command/formatter.go @@ -3,6 +3,7 @@ package command import ( "fmt" "strings" + "unicode/utf8" ) // formatItems renders a list of records as a Markdown-style list. @@ -57,15 +58,15 @@ type kv struct { value string } -// truncate shortens a string to maxLen, appending "..." if truncated. +// truncate shortens a string to at most maxLen runes, appending "..." if truncated. func truncate(s string, maxLen int) string { - if len(s) <= maxLen { + if utf8.RuneCountInString(s) <= maxLen { return s } if maxLen <= 3 { - return s[:maxLen] + return string([]rune(s)[:maxLen]) } - return s[:maxLen-3] + "..." + return string([]rune(s)[:maxLen-3]) + "..." } // boolStr returns "yes" or "no". diff --git a/internal/command/fs.go b/internal/command/fs.go index e67fe466..873c8378 100644 --- a/internal/command/fs.go +++ b/internal/command/fs.go @@ -3,6 +3,7 @@ package command import ( "fmt" "strings" + "unicode/utf8" ) func (h *Handler) buildFSGroup() *CommandGroup { @@ -51,9 +52,9 @@ func (h *Handler) buildFSGroup() *CommandGroup { if err != nil { return "", err } - const maxLen = 2000 - if len(content) > maxLen { - content = content[:maxLen] + "\n... (truncated)" + const maxRunes = 2000 + if utf8.RuneCountInString(content) > maxRunes { + content = string([]rune(content)[:maxRunes]) + "\n... 
(truncated)" } return fmt.Sprintf("```\n%s\n```", content), nil }, diff --git a/internal/config/config.go b/internal/config/config.go index a8de1d89..11efb2d1 100644 --- a/internal/config/config.go +++ b/internal/config/config.go @@ -41,7 +41,6 @@ type Config struct { Postgres PostgresConfig `toml:"postgres"` Qdrant QdrantConfig `toml:"qdrant"` Sparse SparseConfig `toml:"sparse"` - AgentGateway AgentGatewayConfig `toml:"agent_gateway"` BrowserGateway BrowserGatewayConfig `toml:"browser_gateway"` } @@ -140,23 +139,6 @@ type SparseConfig struct { BaseURL string `toml:"base_url"` } -type AgentGatewayConfig struct { - Host string `toml:"host"` - Port int `toml:"port"` -} - -func (c AgentGatewayConfig) BaseURL() string { - host := c.Host - if host == "" { - host = "127.0.0.1" - } - port := c.Port - if port == 0 { - port = 8081 - } - return "http://" + host + ":" + strconv.Itoa(port) -} - type BrowserGatewayConfig struct { Host string `toml:"host"` Port int `toml:"port"` @@ -208,10 +190,6 @@ func Load(path string) (Config, error) { Database: DefaultPGDatabase, SSLMode: DefaultPGSSLMode, }, - AgentGateway: AgentGatewayConfig{ - Host: "127.0.0.1", - Port: 8081, - }, BrowserGateway: BrowserGatewayConfig{ Host: "127.0.0.1", Port: 8083, diff --git a/internal/conversation/flow/resolver.go b/internal/conversation/flow/resolver.go index 5545dacb..574d0f96 100644 --- a/internal/conversation/flow/resolver.go +++ b/internal/conversation/flow/resolver.go @@ -1,49 +1,28 @@ package flow import ( - "bufio" - "bytes" "context" - "encoding/base64" "encoding/json" "errors" - "fmt" "io" "log/slog" - "net/http" - "sort" - "strconv" "strings" "time" - "github.com/gorilla/websocket" - "github.com/jackc/pgx/v5" - "github.com/jackc/pgx/v5/pgtype" + sdk "github.com/memohai/twilight-ai/sdk" - attachmentpkg "github.com/memohai/memoh/internal/attachment" + agentpkg "github.com/memohai/memoh/internal/agent" "github.com/memohai/memoh/internal/conversation" - "github.com/memohai/memoh/internal/db" "github.com/memohai/memoh/internal/db/sqlc" - "github.com/memohai/memoh/internal/heartbeat" "github.com/memohai/memoh/internal/inbox" memprovider "github.com/memohai/memoh/internal/memory/adapters" messagepkg "github.com/memohai/memoh/internal/message" "github.com/memohai/memoh/internal/models" - "github.com/memohai/memoh/internal/schedule" "github.com/memohai/memoh/internal/settings" - "github.com/memohai/memoh/internal/textutil" ) const ( defaultMaxContextMinutes = 24 * 60 - // Keep gateway payload bounded when inlining binary attachments as data URLs. - gatewayInlineAttachmentMaxBytes int64 = 20 * 1024 * 1024 - // SSE payloads (especially attachment/tool results) can be very large. - // bufio.Scanner hard-fails with "token too long" if a single line exceeds its max token size. - // Use a reader-based parser and enforce an explicit per-line cap here. The agent gateway - // stream is expected to chunk large JSON payloads across multiple SSE "data:" lines, so - // this limit should stay relatively small. - gatewaySSEMaxLineBytes = 256 * 1024 ) // SkillEntry represents a skill loaded from the container. @@ -69,8 +48,9 @@ type gatewayAssetLoader interface { OpenForGateway(ctx context.Context, botID, contentHash string) (reader io.ReadCloser, mime string, err error) } -// Resolver orchestrates chat with the agent gateway. +// Resolver orchestrates chat with the internal agent. 
type Resolver struct { + agent *agentpkg.Agent modelsService *models.Service queries *sqlc.Queries memoryRegistry *memprovider.Registry @@ -80,14 +60,11 @@ type Resolver struct { inboxService *inbox.Service skillLoader SkillLoader assetLoader gatewayAssetLoader - gatewayBaseURL string timeout time.Duration logger *slog.Logger - httpClient *http.Client - streamingClient *http.Client } -// NewResolver creates a Resolver that communicates with the agent gateway. +// NewResolver creates a Resolver that uses the internal agent directly. func NewResolver( log *slog.Logger, modelsService *models.Service, @@ -95,27 +72,21 @@ func NewResolver( conversationSvc ConversationSettingsReader, messageService messagepkg.Service, settingsService *settings.Service, - gatewayBaseURL string, + a *agentpkg.Agent, timeout time.Duration, ) *Resolver { - if strings.TrimSpace(gatewayBaseURL) == "" { - gatewayBaseURL = "http://127.0.0.1:8081" - } - gatewayBaseURL = strings.TrimRight(gatewayBaseURL, "/") if timeout <= 0 { timeout = 60 * time.Second } return &Resolver{ + agent: a, modelsService: modelsService, queries: queries, conversationSvc: conversationSvc, messageService: messageService, settingsService: settingsService, - gatewayBaseURL: gatewayBaseURL, timeout: timeout, logger: log.With(slog.String("service", "conversation_resolver")), - httpClient: &http.Client{Timeout: timeout}, - streamingClient: &http.Client{}, } } @@ -141,150 +112,19 @@ func (r *Resolver) SetInboxService(service *inbox.Service) { r.inboxService = service } -// --- gateway payload --- - -type gatewayReasoningConfig struct { - Enabled bool `json:"enabled"` - Effort string `json:"effort"` -} - -type gatewayModelConfig struct { - ModelID string `json:"modelId"` - ClientType string `json:"clientType"` - Input []string `json:"input"` - APIKey string `json:"apiKey"` //nolint:gosec // intentional: forwarded to agent gateway for model authentication - BaseURL string `json:"baseUrl"` - Reasoning *gatewayReasoningConfig `json:"reasoning,omitempty"` -} - -type gatewayIdentity struct { - BotID string `json:"botId"` - ChannelIdentityID string `json:"channelIdentityId"` - DisplayName string `json:"displayName"` - CurrentPlatform string `json:"currentPlatform,omitempty"` - ReplyTarget string `json:"replyTarget,omitempty"` - ConversationType string `json:"conversationType,omitempty"` - SessionToken string `json:"sessionToken,omitempty"` //nolint:gosec // intentional: session token forwarded to agent gateway for channel reply routing -} - -type gatewaySkill struct { - Name string `json:"name"` - Description string `json:"description"` - Content string `json:"content"` - Metadata map[string]any `json:"metadata,omitempty"` -} - -type gatewayInboxItem struct { - ID string `json:"id"` - Source string `json:"source"` - Header map[string]any `json:"header"` - Content string `json:"content"` - CreatedAt string `json:"createdAt"` -} - -type gatewayLoopDetectionConfig struct { - Enabled bool `json:"enabled"` -} - -type gatewayRequest struct { - Model gatewayModelConfig `json:"model"` - ActiveContextTime int `json:"activeContextTime"` - Channels []string `json:"channels"` - CurrentChannel string `json:"currentChannel"` - Messages []conversation.ModelMessage `json:"messages"` - Skills []string `json:"skills"` - UsableSkills []gatewaySkill `json:"usableSkills"` - Query string `json:"query"` - Identity gatewayIdentity `json:"identity"` - Attachments []any `json:"attachments"` - Inbox []gatewayInboxItem `json:"inbox,omitempty"` - LoopDetection *gatewayLoopDetectionConfig 
`json:"loopDetection,omitempty"` -} - -type gatewayResponse struct { - Messages []conversation.ModelMessage `json:"messages"` - Skills []string `json:"skills"` - Text string `json:"text,omitempty"` - Usage json.RawMessage `json:"usage,omitempty"` - Usages []json.RawMessage `json:"usages,omitempty"` -} - -type gatewayUsage struct { +type usageInfo struct { InputTokens *int `json:"inputTokens"` OutputTokens *int `json:"outputTokens"` } -// gatewaySchedule matches the agent gateway ScheduleModel for /chat/trigger-schedule. -type gatewaySchedule struct { - ID string `json:"id"` - Name string `json:"name"` - Description string `json:"description"` - Pattern string `json:"pattern"` - MaxCalls *int `json:"maxCalls,omitempty"` - Command string `json:"command"` -} - -// triggerScheduleRequest is the payload for POST /chat/trigger-schedule. -// It omits "query" from JSON so the trigger-schedule endpoint does not receive it. -type triggerScheduleRequest struct { - gatewayRequest - Schedule gatewaySchedule `json:"schedule"` -} - -// MarshalJSON marshals the request without the "query" field for trigger-schedule. -func (t triggerScheduleRequest) MarshalJSON() ([]byte, error) { - type alias struct { - gatewayRequest - Schedule gatewaySchedule `json:"schedule"` - } - raw, err := json.Marshal(alias(t)) - if err != nil { - return nil, err - } - var m map[string]json.RawMessage - if err := json.Unmarshal(raw, &m); err != nil { - return nil, err - } - delete(m, "query") - return json.Marshal(m) -} - -// gatewayHeartbeat matches the agent gateway HeartbeatModel for /chat/trigger-heartbeat. -type gatewayHeartbeat struct { - Interval int `json:"interval"` -} - -// triggerHeartbeatRequest is the payload for POST /chat/trigger-heartbeat. -type triggerHeartbeatRequest struct { - gatewayRequest - Heartbeat gatewayHeartbeat `json:"heartbeat"` -} - -// MarshalJSON marshals the request without the "query" field for trigger-heartbeat. -func (t triggerHeartbeatRequest) MarshalJSON() ([]byte, error) { - type alias struct { - gatewayRequest - Heartbeat gatewayHeartbeat `json:"heartbeat"` - } - raw, err := json.Marshal(alias(t)) - if err != nil { - return nil, err - } - var m map[string]json.RawMessage - if err := json.Unmarshal(raw, &m); err != nil { - return nil, err - } - delete(m, "query") - return json.Marshal(m) -} - // --- resolved context (shared by Chat / StreamChat / TriggerSchedule) --- type resolvedContext struct { - payload gatewayRequest + runConfig agentpkg.RunConfig model models.GetResponse provider sqlc.LlmProvider inboxItemIDs []string + query string // headerified query } func (r *Resolver) resolve(ctx context.Context, req conversation.ChatRequest) (resolvedContext, error) { @@ -380,27 +220,31 @@ func (r *Resolver) resolve(ctx context.Context, req conversation.ChatRequest) (r messages = append(messages, reqMessages...) 
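For reference, the resolved context assembled below ultimately feeds `agentpkg.RunConfig` and `Agent.Generate`. A stripped-down caller looks roughly like this — a sketch using only types shown in this patch (`runOnce` and its field values are placeholders; the resolver populates them from bot settings, history, skills, and inbox state):

```go
package example // illustrative only

import (
	"context"

	sdk "github.com/memohai/twilight-ai/sdk"

	agentpkg "github.com/memohai/memoh/internal/agent"
)

// runOnce drives the in-process agent with a minimal RunConfig and returns
// the generated text. Error handling and persistence are omitted.
func runOnce(ctx context.Context, a *agentpkg.Agent, model *sdk.Model, query string) (string, error) {
	cfg := agentpkg.RunConfig{
		Model: model,
		Query: query,
		Identity: agentpkg.SessionContext{
			BotID:       "bot-id",       // placeholder
			DisplayName: "Example User", // placeholder
		},
		LoopDetection: agentpkg.LoopDetectionConfig{Enabled: true},
	}
	result, err := a.Generate(ctx, cfg)
	if err != nil {
		return "", err
	}
	return result.Text, nil
}
```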
messages = sanitizeMessages(messages) skills := dedup(req.Skills) - var usableSkills []gatewaySkill + var agentSkills []agentpkg.SkillEntry if r.skillLoader != nil { entries, err := r.skillLoader.LoadSkills(ctx, req.BotID) if err != nil { r.logger.Warn("failed to load usable skills", slog.String("bot_id", req.BotID), slog.Any("error", err)) } else { - usableSkills = make([]gatewaySkill, 0, len(entries)) + agentSkills = make([]agentpkg.SkillEntry, 0, len(entries)) for _, e := range entries { - skill, ok := normalizeGatewaySkill(e) - if !ok { + if strings.TrimSpace(e.Name) == "" { continue } - usableSkills = append(usableSkills, skill) + agentSkills = append(agentSkills, agentpkg.SkillEntry{ + Name: e.Name, + Description: e.Description, + Content: e.Content, + Metadata: e.Metadata, + }) } } } - if usableSkills == nil { - usableSkills = []gatewaySkill{} + if agentSkills == nil { + agentSkills = []agentpkg.SkillEntry{} } - var inboxGatewayItems []gatewayInboxItem + var agentInbox []agentpkg.InboxItem var inboxItemIDs []string if r.inboxService != nil { maxInbox := botSettings.MaxInboxItems @@ -411,10 +255,10 @@ func (r *Resolver) resolve(ctx context.Context, req conversation.ChatRequest) (r if err != nil { r.logger.Warn("failed to load inbox items", slog.String("bot_id", req.BotID), slog.Any("error", err)) } else if len(items) > 0 { - inboxGatewayItems = make([]gatewayInboxItem, 0, len(items)) + agentInbox = make([]agentpkg.InboxItem, 0, len(items)) inboxItemIDs = make([]string, 0, len(items)) for _, item := range items { - inboxGatewayItems = append(inboxGatewayItems, gatewayInboxItem{ + agentInbox = append(agentInbox, agentpkg.InboxItem{ ID: item.ID, Source: item.Source, Header: item.Header, @@ -426,7 +270,6 @@ func (r *Resolver) resolve(ctx context.Context, req conversation.ChatRequest) (r } } - attachments := r.routeAndMergeAttachments(ctx, chatModel, req) displayName := r.resolveDisplayName(ctx, req) headerifiedQuery := FormatUserHeader( @@ -436,36 +279,45 @@ func (r *Resolver) resolve(ctx context.Context, req conversation.ChatRequest) (r req.CurrentChannel, strings.TrimSpace(req.ConversationType), strings.TrimSpace(req.ConversationName), - extractFileRefPaths(attachments), + nil, // attachments paths handled separately req.Query, ) - var reasoning *gatewayReasoningConfig + reasoningEffort := "" if chatModel.SupportsReasoning && botSettings.ReasoningEnabled { - reasoning = &gatewayReasoningConfig{ + reasoningEffort = botSettings.ReasoningEffort + } + + var reasoningConfig *agentpkg.ReasoningConfig + if reasoningEffort != "" { + reasoningConfig = &agentpkg.ReasoningConfig{ Enabled: true, - Effort: botSettings.ReasoningEffort, + Effort: reasoningEffort, } } - payload := gatewayRequest{ - Model: gatewayModelConfig{ - ModelID: chatModel.ModelID, - ClientType: clientType, - Input: chatModel.InputModalities, - APIKey: provider.ApiKey, - BaseURL: provider.BaseUrl, - Reasoning: reasoning, - }, - ActiveContextTime: maxCtx, - Channels: nonNilStrings(req.Channels), - CurrentChannel: req.CurrentChannel, - Messages: nonNilModelMessages(messages), - Skills: nonNilStrings(skills), - UsableSkills: usableSkills, - Query: headerifiedQuery, - Identity: gatewayIdentity{ + modelCfg := agentpkg.ModelConfig{ + ModelID: chatModel.ModelID, + ClientType: clientType, + InputModalities: chatModel.InputModalities, + APIKey: provider.ApiKey, + BaseURL: provider.BaseUrl, + ReasoningConfig: reasoningConfig, + } + + sdkModel := agentpkg.CreateModel(modelCfg) + sdkMessages := 
modelMessagesToSDKMessages(nonNilModelMessages(messages)) + + runCfg := agentpkg.RunConfig{ + Model: sdkModel, + ReasoningEffort: reasoningEffort, + Messages: sdkMessages, + Query: headerifiedQuery, + Channels: nonNilStrings(req.Channels), + CurrentChannel: req.CurrentChannel, + Identity: agentpkg.SessionContext{ BotID: req.BotID, + ChatID: req.ChatID, ChannelIdentityID: strings.TrimSpace(req.SourceChannelIdentityID), DisplayName: displayName, CurrentPlatform: req.CurrentChannel, @@ -473,1830 +325,72 @@ func (r *Resolver) resolve(ctx context.Context, req conversation.ChatRequest) (r ConversationType: strings.TrimSpace(req.ConversationType), SessionToken: req.ChatToken, }, - Attachments: attachments, - Inbox: inboxGatewayItems, - LoopDetection: &gatewayLoopDetectionConfig{Enabled: loopDetectionEnabled}, + Skills: agentSkills, + EnabledSkillNames: nonNilStrings(skills), + Inbox: agentInbox, + LoopDetection: agentpkg.LoopDetectionConfig{Enabled: loopDetectionEnabled}, + ActiveContextTime: maxCtx, } - return resolvedContext{payload: payload, model: chatModel, provider: provider, inboxItemIDs: inboxItemIDs}, nil + return resolvedContext{runConfig: runCfg, model: chatModel, provider: provider, inboxItemIDs: inboxItemIDs, query: headerifiedQuery}, nil } -// --- Chat --- - -// Chat sends a synchronous chat request to the agent gateway and stores the result. +// Chat sends a synchronous chat request and stores the result. func (r *Resolver) Chat(ctx context.Context, req conversation.ChatRequest) (conversation.ChatResponse, error) { rc, err := r.resolve(ctx, req) if err != nil { return conversation.ChatResponse{}, err } - req.Query = rc.payload.Query - resp, err := r.postChat(ctx, rc.payload, req.Token) + req.Query = rc.query + + cfg := rc.runConfig + cfg = r.prepareRunConfig(ctx, cfg) + + result, err := r.agent.Generate(ctx, cfg) if err != nil { return conversation.ChatResponse{}, err } - if err := r.storeRound(ctx, req, resp.Messages, resp.Usage, resp.Usages, rc.model.ID); err != nil { + + outputMessages := sdkMessagesToModelMessages(result.Messages) + roundMessages := prependUserMessage(req.Query, outputMessages) + usageJSON, _ := json.Marshal(result.Usage) + if err := r.storeRound(ctx, req, roundMessages, usageJSON, nil, rc.model.ID); err != nil { return conversation.ChatResponse{}, err } r.markInboxRead(ctx, req.BotID, rc.inboxItemIDs) return conversation.ChatResponse{ - Messages: resp.Messages, - Skills: resp.Skills, + Messages: outputMessages, + Skills: result.Skills, Model: rc.model.ModelID, Provider: string(rc.model.ClientType), }, nil } -// --- TriggerSchedule --- - -// TriggerSchedule executes a scheduled command through the agent gateway trigger-schedule endpoint. 
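+
+// prependUserMessage builds the stored round by placing the user's
+// headerified query ahead of the SDK output messages, since the SDK only
+// returns output messages (assistant + tool). A minimal sketch of the
+// helper called in Chat above, assuming the ModelMessage shape used
+// elsewhere in this file; the actual definition is assumed to live in one
+// of the split-out resolver files:
+//
+//	func prependUserMessage(query string, out []conversation.ModelMessage) []conversation.ModelMessage {
+//		if strings.TrimSpace(query) == "" {
+//			return out
+//		}
+//		user := conversation.ModelMessage{
+//			Role:    "user",
+//			Content: conversation.NewTextContent(query),
+//		}
+//		return append([]conversation.ModelMessage{user}, out...)
+//	}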
-func (r *Resolver) TriggerSchedule(ctx context.Context, botID string, payload schedule.TriggerPayload, token string) error { - if strings.TrimSpace(botID) == "" { - return errors.New("bot id is required") - } - if strings.TrimSpace(payload.Command) == "" { - return errors.New("schedule command is required") - } - - req := conversation.ChatRequest{ - BotID: botID, - ChatID: botID, - Query: payload.Command, - UserID: payload.OwnerUserID, - Token: token, - } - rc, err := r.resolve(ctx, req) - if err != nil { - return err - } - - schedulePayload := rc.payload - schedulePayload.Identity.ChannelIdentityID = strings.TrimSpace(payload.OwnerUserID) - schedulePayload.Identity.DisplayName = "Scheduler" - - triggerReq := triggerScheduleRequest{ - gatewayRequest: schedulePayload, - Schedule: gatewaySchedule{ - ID: payload.ID, - Name: payload.Name, - Description: payload.Description, - Pattern: payload.Pattern, - MaxCalls: payload.MaxCalls, - Command: payload.Command, - }, - } - - resp, err := r.postTriggerSchedule(ctx, triggerReq, token) - if err != nil { - return err - } - return r.storeRound(ctx, req, resp.Messages, resp.Usage, resp.Usages, rc.model.ID) -} - -// --- TriggerHeartbeat --- - -// TriggerHeartbeat executes a heartbeat check through the agent gateway trigger-heartbeat endpoint. -func (r *Resolver) TriggerHeartbeat(ctx context.Context, botID string, payload heartbeat.TriggerPayload, token string) (heartbeat.TriggerResult, error) { - if strings.TrimSpace(botID) == "" { - return heartbeat.TriggerResult{}, errors.New("bot id is required") - } - - // If a dedicated heartbeat model is configured, use it instead of the - // default chat model. We load the bot settings first so that we can - // set req.Model, which takes highest priority in selectChatModel. - var heartbeatModel string - if botSettings, err := r.loadBotSettings(ctx, botID); err == nil { - heartbeatModel = strings.TrimSpace(botSettings.HeartbeatModelID) - } - - req := conversation.ChatRequest{ - BotID: botID, - ChatID: botID, - Query: "heartbeat", - UserID: payload.OwnerUserID, - Token: token, - Model: heartbeatModel, - } - rc, err := r.resolve(ctx, req) - if err != nil { - return heartbeat.TriggerResult{}, err - } - - hbPayload := rc.payload - hbPayload.Identity.ChannelIdentityID = strings.TrimSpace(payload.OwnerUserID) - hbPayload.Identity.DisplayName = "Heartbeat" - - triggerReq := triggerHeartbeatRequest{ - gatewayRequest: hbPayload, - Heartbeat: gatewayHeartbeat{ - Interval: payload.Interval, - }, - } - - resp, err := r.postTriggerHeartbeat(ctx, triggerReq, token) - if err != nil { - return heartbeat.TriggerResult{}, err - } - - status := "alert" - text := strings.TrimSpace(resp.Text) - if isHeartbeatOK(text) { - status = "ok" - } - - var usageBytes []byte - if resp.Usage != nil { - usageBytes, _ = json.Marshal(resp.Usage) - } - - return heartbeat.TriggerResult{ - Status: status, - Text: text, - Usage: resp.Usage, - UsageBytes: usageBytes, - ModelID: rc.model.ID, - }, nil -} - -func isHeartbeatOK(text string) bool { - t := strings.TrimSpace(text) - return strings.HasPrefix(t, "HEARTBEAT_OK") || strings.HasSuffix(t, "HEARTBEAT_OK") || t == "HEARTBEAT_OK" -} - -// --- StreamChat --- - -// StreamChat sends a streaming chat request to the agent gateway. 
-func (r *Resolver) StreamChat(ctx context.Context, req conversation.ChatRequest) (<-chan conversation.StreamChunk, <-chan error) { - chunkCh := make(chan conversation.StreamChunk) - errCh := make(chan error, 1) - r.logger.Info("gateway stream start", - slog.String("bot_id", req.BotID), - slog.String("chat_id", req.ChatID), - ) - - go func() { - defer close(chunkCh) - defer close(errCh) - - streamReq := req - rc, err := r.resolve(ctx, streamReq) - if err != nil { - r.logger.Error("gateway stream resolve failed", - slog.String("bot_id", streamReq.BotID), - slog.String("chat_id", streamReq.ChatID), - slog.Any("error", err), - ) - errCh <- err - return - } - streamReq.Query = rc.payload.Query - // User message persistence is deferred to storeRound so that user + - // assistant messages are written atomically. This prevents duplicate - // user messages when concurrent requests hit the same bot. - if err := r.streamChat(ctx, rc.payload, streamReq, chunkCh, rc.model.ID); err != nil { - r.logger.Error("gateway stream request failed", - slog.String("bot_id", streamReq.BotID), - slog.String("chat_id", streamReq.ChatID), - slog.Any("error", err), - ) - errCh <- err - return - } - r.markInboxRead(ctx, streamReq.BotID, rc.inboxItemIDs) - }() - return chunkCh, errCh -} - -// --- WebSocket streaming --- - -// WSStreamEvent represents a raw JSON event forwarded from the agent gateway -// WebSocket connection to the Go server's client WebSocket. -type WSStreamEvent = json.RawMessage - -// StreamChatWS resolves the agent context and streams agent events from the -// gateway WebSocket endpoint. Events are sent on eventCh. When abortCh is -// closed or receives a value, an abort message is forwarded to the gateway. -// Terminal events (agent_end, agent_abort) trigger message persistence before -// being forwarded. -func (r *Resolver) StreamChatWS( - ctx context.Context, - req conversation.ChatRequest, - eventCh chan<- WSStreamEvent, - abortCh <-chan struct{}, -) error { - rc, err := r.resolve(ctx, req) - if err != nil { - return fmt.Errorf("resolve: %w", err) - } - req.Query = rc.payload.Query - - wsURL := strings.Replace(r.gatewayBaseURL, "http://", "ws://", 1) - wsURL = strings.Replace(wsURL, "https://", "wss://", 1) - wsURL += "/chat/ws" - - r.logger.Info("gateway ws connect", - slog.String("url", wsURL), - slog.String("bot_id", req.BotID), - ) - - dialer := websocket.Dialer{ - HandshakeTimeout: r.timeout, - } - conn, resp, err := dialer.DialContext(ctx, wsURL, nil) - if resp != nil { - defer func() { _ = resp.Body.Close() }() - } - if err != nil { - return fmt.Errorf("gateway ws dial: %w", err) - } - defer func() { _ = conn.Close() }() - - // The gateway WS handler uses the bearer field directly (not as an HTTP - // header), so strip the "Bearer " prefix that the Token field carries. - rawToken := strings.TrimSpace(req.Token) - rawToken = strings.TrimPrefix(rawToken, "Bearer ") - rawToken = strings.TrimPrefix(rawToken, "bearer ") - - startPayload := struct { - Type string `json:"type"` - Bearer string `json:"bearer,omitempty"` - gatewayRequest - }{ - Type: "start", - Bearer: rawToken, - gatewayRequest: rc.payload, - } - if err := conn.WriteJSON(startPayload); err != nil { - return fmt.Errorf("gateway ws write start: %w", err) - } - - // Forward abort signal to gateway. 
- abortDone := make(chan struct{}) - go func() { - defer close(abortDone) - select { - case <-abortCh: - _ = conn.WriteJSON(map[string]string{"type": "abort"}) - case <-ctx.Done(): - } - }() - defer func() { <-abortDone }() - - modelID := rc.model.ID - stored := false - for { - _, msgData, err := conn.ReadMessage() - if err != nil { - if websocket.IsCloseError(err, websocket.CloseNormalClosure, websocket.CloseGoingAway) { - break - } - if ctx.Err() != nil { - break - } - return fmt.Errorf("gateway ws read: %w", err) - } - - if !stored { - var envelope struct { - Type string `json:"type"` - } - if json.Unmarshal(msgData, &envelope) == nil && isTerminalStreamEvent(envelope.Type) { - if _, storeErr := r.tryStoreStream(ctx, req, msgData, modelID); storeErr != nil { - r.logger.Error("ws persist failed", slog.Any("error", storeErr)) - } else { - stored = true - } - } - } - - select { - case eventCh <- json.RawMessage(msgData): - case <-ctx.Done(): - return ctx.Err() - } - } - - r.markInboxRead(ctx, req.BotID, rc.inboxItemIDs) - return nil -} - -// --- HTTP helpers --- - -func (r *Resolver) postChat(ctx context.Context, payload gatewayRequest, token string) (gatewayResponse, error) { - url := r.gatewayBaseURL + "/chat/" - r.logger.Info( - "gateway request", - slog.String("url", url), - slog.Int("messages", len(payload.Messages)), - slog.Int("attachments", len(payload.Attachments)), - ) - - httpReq, err := newJSONRequestWithContext(ctx, http.MethodPost, url, payload) - if err != nil { - return gatewayResponse{}, err - } - if strings.TrimSpace(token) != "" { - httpReq.Header.Set("Authorization", token) - } - - resp, err := r.httpClient.Do(httpReq) //nolint:gosec // G704: URL is from operator-configured agent gateway, not user input - if err != nil { - return gatewayResponse{}, err - } - defer func() { _ = resp.Body.Close() }() - - respBody, err := io.ReadAll(resp.Body) - if err != nil { - return gatewayResponse{}, err - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - r.logger.Error("gateway error", slog.String("url", url), slog.Int("status", resp.StatusCode), slog.String("body_prefix", truncate(string(respBody), 300))) - return gatewayResponse{}, fmt.Errorf("agent gateway error: %s", strings.TrimSpace(string(respBody))) - } - - var parsed gatewayResponse - if err := json.Unmarshal(respBody, &parsed); err != nil { - r.logger.Error("gateway response parse failed", slog.String("body_prefix", truncate(string(respBody), 300)), slog.Any("error", err)) - return gatewayResponse{}, fmt.Errorf("failed to parse gateway response: %w", err) - } - return parsed, nil -} - -// postTriggerSchedule sends a trigger-schedule request to the agent gateway. 
-func (r *Resolver) postTriggerSchedule(ctx context.Context, payload triggerScheduleRequest, token string) (gatewayResponse, error) { - url := r.gatewayBaseURL + "/chat/trigger-schedule" - r.logger.Info("gateway trigger-schedule request", slog.String("url", url), slog.String("schedule_id", payload.Schedule.ID)) - - httpReq, err := newJSONRequestWithContext(ctx, http.MethodPost, url, payload) - if err != nil { - return gatewayResponse{}, err - } - if strings.TrimSpace(token) != "" { - httpReq.Header.Set("Authorization", token) - } - - resp, err := r.httpClient.Do(httpReq) //nolint:gosec // G704: URL is from operator-configured agent gateway, not user input - if err != nil { - return gatewayResponse{}, err - } - defer func() { _ = resp.Body.Close() }() - - respBody, err := io.ReadAll(resp.Body) - if err != nil { - return gatewayResponse{}, err - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - r.logger.Error("gateway trigger-schedule error", slog.String("url", url), slog.Int("status", resp.StatusCode), slog.String("body_prefix", truncate(string(respBody), 300))) - return gatewayResponse{}, fmt.Errorf("agent gateway error: %s", strings.TrimSpace(string(respBody))) - } - - var parsed gatewayResponse - if err := json.Unmarshal(respBody, &parsed); err != nil { - r.logger.Error("gateway trigger-schedule response parse failed", slog.String("body_prefix", truncate(string(respBody), 300)), slog.Any("error", err)) - return gatewayResponse{}, fmt.Errorf("failed to parse gateway response: %w", err) - } - return parsed, nil -} - -// postTriggerHeartbeat sends a trigger-heartbeat request to the agent gateway. -func (r *Resolver) postTriggerHeartbeat(ctx context.Context, payload triggerHeartbeatRequest, token string) (gatewayResponse, error) { - url := r.gatewayBaseURL + "/chat/trigger-heartbeat" - r.logger.Info("gateway trigger-heartbeat request", slog.String("url", url)) - - httpReq, err := newJSONRequestWithContext(ctx, http.MethodPost, url, payload) - if err != nil { - return gatewayResponse{}, err - } - if strings.TrimSpace(token) != "" { - httpReq.Header.Set("Authorization", token) - } - - resp, err := r.httpClient.Do(httpReq) //nolint:gosec // G704: URL is from operator-configured agent gateway, not user input - if err != nil { - return gatewayResponse{}, err - } - defer func() { _ = resp.Body.Close() }() - - respBody, err := io.ReadAll(resp.Body) - if err != nil { - return gatewayResponse{}, err - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - r.logger.Error("gateway trigger-heartbeat error", slog.String("url", url), slog.Int("status", resp.StatusCode), slog.String("body_prefix", truncate(string(respBody), 300))) - return gatewayResponse{}, fmt.Errorf("agent gateway error: %s", strings.TrimSpace(string(respBody))) - } - - var parsed gatewayResponse - if err := json.Unmarshal(respBody, &parsed); err != nil { - r.logger.Error("gateway trigger-heartbeat response parse failed", slog.String("body_prefix", truncate(string(respBody), 300)), slog.Any("error", err)) - return gatewayResponse{}, fmt.Errorf("failed to parse gateway response: %w", err) - } - return parsed, nil -} - -func (r *Resolver) streamChat(ctx context.Context, payload gatewayRequest, req conversation.ChatRequest, chunkCh chan<- conversation.StreamChunk, modelID string) error { - url := r.gatewayBaseURL + "/chat/stream" - r.logger.Info( - "gateway stream request", - slog.String("url", url), - slog.Int("messages", len(payload.Messages)), - slog.Int("attachments", len(payload.Attachments)), - ) - httpReq, err := 
newJSONRequestWithContext(ctx, http.MethodPost, url, payload) - if err != nil { - return err - } - httpReq.Header.Set("Accept", "text/event-stream") - if strings.TrimSpace(req.Token) != "" { - httpReq.Header.Set("Authorization", req.Token) - } - - resp, err := r.streamingClient.Do(httpReq) //nolint:gosec // G704: URL is from operator-configured agent gateway, not user input - if err != nil { - r.logger.Error("gateway stream connect failed", slog.String("url", url), slog.Any("error", err)) - return err - } - defer func() { _ = resp.Body.Close() }() - - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - errBody, _ := io.ReadAll(resp.Body) - r.logger.Error("gateway stream error", slog.String("url", url), slog.Int("status", resp.StatusCode), slog.String("body_prefix", truncate(string(errBody), 300))) - return fmt.Errorf("agent gateway error: %s", strings.TrimSpace(string(errBody))) - } - - stored := false - var dataBuf bytes.Buffer - - flushEvent := func() error { - if dataBuf.Len() == 0 { - return nil - } - out := append([]byte(nil), dataBuf.Bytes()...) - dataBuf.Reset() - if len(out) == 0 || bytes.Equal(bytes.TrimSpace(out), []byte("[DONE]")) { - return nil - } - // Persist final messages before forwarding the "done"/"agent_end" event so the - // next user turn can immediately see the assistant output in history. - if !stored { - if handled, storeErr := r.tryStoreStream(ctx, req, out, modelID); storeErr != nil { - return storeErr - } else if handled { - stored = true - } - } - chunkCh <- conversation.StreamChunk(out) - return nil - } - - scanner := bufio.NewScanner(resp.Body) - scanner.Buffer(make([]byte, 64*1024), gatewaySSEMaxLineBytes) - for scanner.Scan() { - line := scanner.Bytes() - if len(line) == 0 { - if err := flushEvent(); err != nil { - return err - } - continue - } - if len(line) > 0 && line[0] == ':' { - continue - } - if !bytes.HasPrefix(line, []byte("data:")) { - continue - } - part := bytes.TrimPrefix(line, []byte("data:")) - // Backward-compat: older SSE writers used "data: " (note the space). - // Only strip the first leading space for the *first* fragment to avoid corrupting - // chunked payloads split inside JSON string values. - if dataBuf.Len() == 0 && len(part) > 0 && part[0] == ' ' { - part = part[1:] - } - if len(part) == 0 { - continue - } - _, _ = dataBuf.Write(part) - } - if err := scanner.Err(); err != nil { - if errors.Is(err, bufio.ErrTooLong) { - return fmt.Errorf("sse line too long (max %d bytes)", gatewaySSEMaxLineBytes) - } - return err - } - return flushEvent() -} - -func newJSONRequestWithContext(ctx context.Context, method, url string, payload any) (*http.Request, error) { - pr, pw := io.Pipe() - go func() { - enc := json.NewEncoder(pw) - _ = pw.CloseWithError(enc.Encode(payload)) - }() - req, err := http.NewRequestWithContext(ctx, method, url, pr) - if err != nil { - _ = pr.Close() - return nil, err - } - req.Header.Set("Content-Type", "application/json") - return req, nil -} - -// isTerminalStreamEvent returns true for event types that carry the final -// message round (agent_end, agent_abort, done). -func isTerminalStreamEvent(eventType string) bool { - return eventType == "agent_end" || eventType == "agent_abort" || eventType == "done" -} - -// tryStoreStream attempts to extract final messages from a stream event and persist them. 
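+
+// A terminal stream event (type "agent_end", "agent_abort", or "done")
+// carries the final round that tryStoreStream persists. Illustrative
+// payload only; field names follow the envelope parsed below:
+//
+//	{
+//	  "type": "agent_end",
+//	  "messages": [{"role": "assistant", "content": "..."}],
+//	  "usage": {"inputTokens": 321, "outputTokens": 42}
+//	}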
-func (r *Resolver) tryStoreStream(ctx context.Context, req conversation.ChatRequest, data []byte, modelID string) (bool, error) { - var envelope struct { - Type string `json:"type"` - Data json.RawMessage `json:"data"` - Messages []conversation.ModelMessage `json:"messages"` - Usage json.RawMessage `json:"usage,omitempty"` - Usages []json.RawMessage `json:"usages,omitempty"` - } - if err := json.Unmarshal(data, &envelope); err == nil { - if isTerminalStreamEvent(envelope.Type) && len(envelope.Messages) > 0 { - return true, r.storeRound(ctx, req, envelope.Messages, envelope.Usage, envelope.Usages, modelID) - } - if envelope.Type == "done" && len(envelope.Data) > 0 { - var resp gatewayResponse - if err := json.Unmarshal(envelope.Data, &resp); err == nil && len(resp.Messages) > 0 { - return true, r.storeRound(ctx, req, resp.Messages, resp.Usage, resp.Usages, modelID) - } - } - } - - // fallback: data: {messages: [...]} - var resp gatewayResponse - if err := json.Unmarshal(data, &resp); err == nil && len(resp.Messages) > 0 { - return true, r.storeRound(ctx, req, resp.Messages, resp.Usage, resp.Usages, modelID) - } - return false, nil -} - -// routeAndMergeAttachments applies CapabilityFallbackPolicy to split -// request attachments by model input modalities, then merges the results -// into a single []any for the gateway request. -func (r *Resolver) routeAndMergeAttachments(ctx context.Context, model models.GetResponse, req conversation.ChatRequest) []any { - if len(req.Attachments) == 0 { - return []any{} - } - typed := r.prepareGatewayAttachments(ctx, req) - routed := routeAttachmentsByCapability(model.InputModalities, typed) - // Convert unsupported attachments to tool file references. - for i := range routed.Fallback { - fallbackPath := strings.TrimSpace(routed.Fallback[i].FallbackPath) - if fallbackPath == "" { - // Cannot downgrade non-file payloads to tool file references. - // Drop them explicitly to keep gateway contract deterministic. - if r != nil && r.logger != nil { - r.logger.Warn( - "drop attachment without fallback path", - slog.String("type", strings.TrimSpace(routed.Fallback[i].Type)), - slog.String("transport", strings.TrimSpace(routed.Fallback[i].Transport)), - slog.String("content_hash", strings.TrimSpace(routed.Fallback[i].ContentHash)), - slog.Bool("has_payload", strings.TrimSpace(routed.Fallback[i].Payload) != ""), - ) - } - routed.Fallback[i] = gatewayAttachment{} - continue - } - routed.Fallback[i].Type = "file" - routed.Fallback[i].Transport = gatewayTransportToolFileRef - routed.Fallback[i].Payload = fallbackPath - } - merged := make([]any, 0, len(routed.Native)+len(routed.Fallback)) - merged = append(merged, attachmentsToAny(routed.Native)...) 
- for _, fb := range routed.Fallback { - if fb.Type == "" || strings.TrimSpace(fb.Transport) == "" || strings.TrimSpace(fb.Payload) == "" { - continue - } - merged = append(merged, fb) - } - if len(merged) == 0 { - return []any{} - } - return merged -} - -func (r *Resolver) prepareGatewayAttachments(ctx context.Context, req conversation.ChatRequest) []gatewayAttachment { - if len(req.Attachments) == 0 { - return nil - } - prepared := make([]gatewayAttachment, 0, len(req.Attachments)) - for _, raw := range req.Attachments { - attachmentType := strings.ToLower(strings.TrimSpace(raw.Type)) - payload := strings.TrimSpace(raw.Base64) - transport := "" - fallbackPath := strings.TrimSpace(raw.Path) - if payload != "" { - transport = gatewayTransportInlineDataURL - } else { - rawURL := strings.TrimSpace(raw.URL) - switch { - case isDataURL(rawURL): - payload = rawURL - transport = gatewayTransportInlineDataURL - case isLikelyPublicURL(rawURL): - payload = rawURL - transport = gatewayTransportPublicURL - case rawURL != "" && fallbackPath == "": - fallbackPath = rawURL - } - } - item := gatewayAttachment{ - ContentHash: strings.TrimSpace(raw.ContentHash), - Type: attachmentType, - Mime: strings.TrimSpace(raw.Mime), - Size: raw.Size, - Name: strings.TrimSpace(raw.Name), - Transport: transport, - Payload: payload, - Metadata: raw.Metadata, - FallbackPath: fallbackPath, - } - item = normalizeGatewayAttachmentPayload(item) - item = r.inlineImageAttachmentAssetIfNeeded(ctx, strings.TrimSpace(req.BotID), item) - prepared = append(prepared, item) - } - return prepared -} - -func normalizeGatewayAttachmentPayload(item gatewayAttachment) gatewayAttachment { - if item.Transport != gatewayTransportInlineDataURL { - return item - } - payload := strings.TrimSpace(item.Payload) - if payload == "" { - return item - } - if strings.HasPrefix(strings.ToLower(payload), "data:") { - mime := strings.TrimSpace(item.Mime) - if mime == "" || strings.EqualFold(mime, "application/octet-stream") { - if extracted := attachmentpkg.MimeFromDataURL(payload); extracted != "" { - item.Mime = extracted - } - } - item.Payload = payload - return item - } - mime := strings.TrimSpace(item.Mime) - if mime == "" { - mime = "application/octet-stream" - } - item.Payload = attachmentpkg.NormalizeBase64DataURL(payload, mime) - return item -} - -func isLikelyPublicURL(raw string) bool { - trimmed := strings.ToLower(strings.TrimSpace(raw)) - return strings.HasPrefix(trimmed, "http://") || strings.HasPrefix(trimmed, "https://") -} - -func isDataURL(raw string) bool { - trimmed := strings.ToLower(strings.TrimSpace(raw)) - return strings.HasPrefix(trimmed, "data:") -} - -func (r *Resolver) inlineImageAttachmentAssetIfNeeded(ctx context.Context, botID string, item gatewayAttachment) gatewayAttachment { - if item.Type != "image" { - return item - } - if strings.TrimSpace(item.Payload) != "" && - (item.Transport == gatewayTransportInlineDataURL || item.Transport == gatewayTransportPublicURL) { - return item - } - contentHash := strings.TrimSpace(item.ContentHash) - if contentHash == "" { - return item - } - dataURL, mime, err := r.inlineAssetAsDataURL(ctx, botID, contentHash, item.Type, item.Mime) - if err != nil { - if r != nil && r.logger != nil { - r.logger.Warn( - "inline gateway image attachment failed", - slog.Any("error", err), - slog.String("bot_id", botID), - slog.String("content_hash", contentHash), - ) - } - return item - } - item.Transport = gatewayTransportInlineDataURL - item.Payload = dataURL - if strings.TrimSpace(item.Mime) == "" { - 
item.Mime = mime - } - return item -} - -func (r *Resolver) inlineAssetAsDataURL(ctx context.Context, botID, contentHash, attachmentType, fallbackMime string) (string, string, error) { - if r == nil || r.assetLoader == nil { - return "", "", errors.New("gateway asset loader not configured") - } - reader, assetMime, err := r.assetLoader.OpenForGateway(ctx, botID, contentHash) - if err != nil { - return "", "", fmt.Errorf("open asset: %w", err) - } - defer func() { - _ = reader.Close() - }() - mime := strings.TrimSpace(fallbackMime) - if mime == "" { - mime = strings.TrimSpace(assetMime) - } - dataURL, resolvedMime, err := encodeReaderAsDataURL(reader, gatewayInlineAttachmentMaxBytes, attachmentType, mime) - if err != nil { - return "", "", err - } - return dataURL, resolvedMime, nil -} - -func encodeReaderAsDataURL(reader io.Reader, maxBytes int64, attachmentType, fallbackMime string) (string, string, error) { - if reader == nil { - return "", "", errors.New("reader is required") - } - if maxBytes <= 0 { - return "", "", errors.New("max bytes must be greater than 0") - } - limited := &io.LimitedReader{R: reader, N: maxBytes + 1} - head := make([]byte, 512) - n, err := limited.Read(head) - if err != nil && !errors.Is(err, io.EOF) { - return "", "", fmt.Errorf("read asset: %w", err) - } - head = head[:n] - - mime := strings.TrimSpace(fallbackMime) - if strings.EqualFold(strings.TrimSpace(attachmentType), "image") && - (strings.TrimSpace(mime) == "" || strings.EqualFold(strings.TrimSpace(mime), "application/octet-stream")) { - detected := strings.TrimSpace(http.DetectContentType(head)) - if strings.HasPrefix(strings.ToLower(detected), "image/") { - mime = detected - } - } - if mime == "" { - mime = "application/octet-stream" - } - - var encoded strings.Builder - encoded.Grow(len("data:") + len(mime) + len(";base64,")) - encoded.WriteString("data:") - encoded.WriteString(mime) - encoded.WriteString(";base64,") - - encoder := base64.NewEncoder(base64.StdEncoding, &encoded) - if len(head) > 0 { - if _, err := encoder.Write(head); err != nil { - _ = encoder.Close() - return "", "", fmt.Errorf("encode asset head: %w", err) - } - } - copied, err := io.Copy(encoder, limited) - if err != nil { - _ = encoder.Close() - return "", "", fmt.Errorf("encode asset body: %w", err) - } - if err := encoder.Close(); err != nil { - return "", "", fmt.Errorf("finalize asset encoding: %w", err) - } - - total := int64(len(head)) + copied - if total > maxBytes { - return "", "", fmt.Errorf( - "asset too large to inline: %d > %d", - total, - maxBytes, - ) - } - return encoded.String(), mime, nil -} - -// --- message loading --- - -type messageWithUsage struct { - Message conversation.ModelMessage - UsageInputTokens *int - UsageOutputTokens *int - RouteID string - ExternalMessageID string - Platform string - SenderChannelID string -} - -func (r *Resolver) loadMessages(ctx context.Context, chatID string, maxContextMinutes int) ([]messageWithUsage, error) { - if r.messageService == nil { - return nil, nil - } - since := time.Now().UTC().Add(-time.Duration(maxContextMinutes) * time.Minute) - msgs, err := r.messageService.ListActiveSince(ctx, chatID, since) - if err != nil { - return nil, err - } - var result []messageWithUsage - for _, m := range msgs { - var mm conversation.ModelMessage - if err := json.Unmarshal(m.Content, &mm); err != nil { - r.logger.Warn("loadMessages: content unmarshal failed, treating as raw text", - slog.String("chat_id", chatID), slog.Any("error", err)) - mm = conversation.ModelMessage{Role: 
m.Role, Content: m.Content} - } else { - mm.Role = m.Role - } - var inputTokens *int - var outputTokens *int - if len(m.Usage) > 0 { - var u gatewayUsage - if json.Unmarshal(m.Usage, &u) == nil { - inputTokens = u.InputTokens - outputTokens = u.OutputTokens - } - } - result = append(result, messageWithUsage{ - Message: mm, - UsageInputTokens: inputTokens, - UsageOutputTokens: outputTokens, - RouteID: strings.TrimSpace(m.RouteID), - ExternalMessageID: strings.TrimSpace(m.ExternalMessageID), - Platform: strings.TrimSpace(m.Platform), - SenderChannelID: strings.TrimSpace(m.SenderChannelIdentityID), - }) - } - return result, nil -} - -func dedupePersistedCurrentUserMessage(messages []messageWithUsage, req conversation.ChatRequest) []messageWithUsage { - if !req.UserMessagePersisted || len(messages) == 0 { - return messages - } - - targetRouteID := strings.TrimSpace(req.RouteID) - targetExternalID := strings.TrimSpace(req.ExternalMessageID) - targetPlatform := strings.TrimSpace(req.CurrentChannel) - targetSenderChannelID := strings.TrimSpace(req.SourceChannelIdentityID) - if targetExternalID == "" { - return messages - } - - for i := len(messages) - 1; i >= 0; i-- { - item := messages[i] - if !strings.EqualFold(strings.TrimSpace(item.Message.Role), "user") { - continue - } - if strings.TrimSpace(item.ExternalMessageID) != targetExternalID { - continue - } - if targetRouteID != "" && item.RouteID != "" && item.RouteID != targetRouteID { - continue - } - if targetPlatform != "" && item.Platform != "" && !strings.EqualFold(item.Platform, targetPlatform) { - continue - } - if targetSenderChannelID != "" && item.SenderChannelID != "" && item.SenderChannelID != targetSenderChannelID { - continue - } - return append(messages[:i], messages[i+1:]...) - } - - return messages -} - -func estimateMessageTokens(msg conversation.ModelMessage) int { - text := msg.TextContent() - if len(text) == 0 { - data, _ := json.Marshal(msg.Content) - return len(data) / 4 - } - return len(text) / 4 -} - -func trimMessagesByTokens(log *slog.Logger, messages []messageWithUsage, maxTokens int) []conversation.ModelMessage { - if maxTokens == 0 || len(messages) == 0 { - result := make([]conversation.ModelMessage, len(messages)) - for i, m := range messages { - result[i] = m.Message - } - return result - } - - // Scan from newest to oldest, accumulating per-message token costs. - // Messages with stored usage data use that value; others fall back to a - // character-based estimate so that user/tool messages are not free-passed. - totalTokens := 0 - cutoff := 0 - messagesWithUsage := 0 - for i := len(messages) - 1; i >= 0; i-- { - if messages[i].UsageOutputTokens != nil { - totalTokens += *messages[i].UsageOutputTokens - messagesWithUsage++ - } else { - totalTokens += estimateMessageTokens(messages[i].Message) - } - if totalTokens > maxTokens { - cutoff = i + 1 - break - } - } - - // Keep provider-valid message order: a "tool" message must follow a preceding - // assistant tool call. When history is head-trimmed, a leading tool message - // may become orphaned and cause provider 400 errors. 
- for cutoff < len(messages) && strings.EqualFold(strings.TrimSpace(messages[cutoff].Message.Role), "tool") {
- cutoff++
- }
-
- if log != nil {
- log.Debug("trimMessagesByTokens",
- slog.Int("total_messages", len(messages)),
- slog.Int("messages_with_usage", messagesWithUsage),
- slog.Int("accumulated_output_tokens", totalTokens),
- slog.Int("max_tokens", maxTokens),
- slog.Int("cutoff_index", cutoff),
- slog.Int("kept_messages", len(messages)-cutoff),
- )
- }
-
- result := make([]conversation.ModelMessage, 0, len(messages)-cutoff)
- for _, m := range messages[cutoff:] {
- result = append(result, m.Message)
- }
- return result
-}
-
-func (r *Resolver) resolveMemoryProvider(ctx context.Context, botID string) memprovider.Provider {
- if r.memoryRegistry == nil {
- return nil
- }
- if r.settingsService == nil {
- return nil
- }
- botSettings, err := r.settingsService.GetBot(ctx, botID)
- if err != nil {
- return nil
- }
- providerID := strings.TrimSpace(botSettings.MemoryProviderID)
- if providerID == "" {
- return nil
- }
- p, err := r.memoryRegistry.Get(providerID)
- if err != nil {
- r.logger.Warn("memory provider lookup failed", slog.String("provider_id", providerID), slog.Any("error", err))
- return nil
- }
- return p
-}
-
-func (r *Resolver) loadMemoryContextMessage(ctx context.Context, req conversation.ChatRequest) *conversation.ModelMessage {
- p := r.resolveMemoryProvider(ctx, req.BotID)
- if p == nil {
- return nil
- }
- result, err := p.OnBeforeChat(ctx, memprovider.BeforeChatRequest{
- Query: req.Query,
- BotID: req.BotID,
- ChatID: req.ChatID,
+// prepareRunConfig generates the system prompt and appends the user message.
+func (r *Resolver) prepareRunConfig(ctx context.Context, cfg agentpkg.RunConfig) agentpkg.RunConfig {
+ // Image-input capability is not carried on RunConfig, so the system
+ // prompt conservatively assumes text-only input for now.
+ supportsImageInput := false
+
+ // Build the system prompt from skills, system files, and inbox items.
+ var files []agentpkg.SystemFile
+ if r.agent != nil {
+ fs := agentpkg.NewFSClient(nil, cfg.Identity.BotID)
+ files = fs.LoadSystemFiles(ctx)
+ }
+
+ cfg.System = agentpkg.GenerateSystemPrompt(agentpkg.SystemPromptParams{
+ Skills: cfg.Skills,
+ EnabledSkills: nil,
+ Files: files,
+ Inbox: cfg.Inbox,
+ SupportsImageInput: supportsImageInput,
 })
- if err != nil {
- r.logger.Warn("memory provider OnBeforeChat failed", slog.Any("error", err))
- return nil
- }
- if result == nil || strings.TrimSpace(result.ContextText) == "" {
- return nil
- }
- return &conversation.ModelMessage{
- Role: "user",
- Content: conversation.NewTextContent(result.ContextText),
+
+ // Append the user message carrying the headerified query.
+ if cfg.Query != "" {
+ cfg.Messages = append(cfg.Messages, sdk.UserMessage(cfg.Query))
 }
-}
-
-// --- store helpers ---
-
-func (r *Resolver) storeRound(ctx context.Context, req conversation.ChatRequest, messages []conversation.ModelMessage, usage json.RawMessage, usages []json.RawMessage, modelID string) error {
- fullRound := make([]conversation.ModelMessage, 0, len(messages))
- roundUsages := make([]json.RawMessage, 0, len(usages))
-
- // When the user message was already persisted by a channel adapter, skip
- // the duplicate from the round. Otherwise keep it so that user + assistant
- // messages are written atomically (deferred persistence).
- skipUserQuery := req.UserMessagePersisted - for i, m := range messages { - if skipUserQuery && m.Role == "user" && strings.TrimSpace(m.TextContent()) == strings.TrimSpace(req.Query) { - skipUserQuery = false // only skip the first matching user message - continue - } - fullRound = append(fullRound, m) - if i < len(usages) { - roundUsages = append(roundUsages, usages[i]) - } - } - if len(fullRound) == 0 { - return nil - } - - r.storeMessages(ctx, req, fullRound, usage, roundUsages, modelID) - go r.storeMemory(context.WithoutCancel(ctx), req, fullRound) - return nil -} - -func (r *Resolver) storeMessages(ctx context.Context, req conversation.ChatRequest, messages []conversation.ModelMessage, usage json.RawMessage, usages []json.RawMessage, modelID string) { - if r.messageService == nil { - return - } - if strings.TrimSpace(req.BotID) == "" { - return - } - meta := buildRouteMetadata(req) - senderChannelIdentityID, senderUserID := r.resolvePersistSenderIDs(ctx, req) - - // Determine the last assistant message index for outbound asset attachment. - lastAssistantIdx := -1 - if req.OutboundAssetCollector != nil { - for i := len(messages) - 1; i >= 0; i-- { - if messages[i].Role == "assistant" { - lastAssistantIdx = i - break - } - } - } - var outboundAssets []messagepkg.AssetRef - if lastAssistantIdx >= 0 { - outboundAssets = outboundAssetRefsToMessageRefs(req.OutboundAssetCollector()) - } - - for i, msg := range messages { - content, err := json.Marshal(msg) - if err != nil { - r.logger.Warn("storeMessages: marshal failed", slog.Any("error", err)) - continue - } - messageSenderChannelIdentityID := "" - messageSenderUserID := "" - externalMessageID := "" - sourceReplyToMessageID := "" - assets := []messagepkg.AssetRef(nil) - if msg.Role == "user" { - messageSenderChannelIdentityID = senderChannelIdentityID - messageSenderUserID = senderUserID - externalMessageID = req.ExternalMessageID - if strings.TrimSpace(msg.TextContent()) == strings.TrimSpace(req.Query) { - assets = chatAttachmentsToAssetRefs(req.Attachments) - } - } else if strings.TrimSpace(req.ExternalMessageID) != "" { - sourceReplyToMessageID = req.ExternalMessageID - } - if i == lastAssistantIdx && len(outboundAssets) > 0 { - assets = append(assets, outboundAssets...) - } - var msgUsage json.RawMessage - if i < len(usages) && len(usages[i]) > 0 && !isJSONNull(usages[i]) { - msgUsage = usages[i] - } else if i == len(messages)-1 && len(usage) > 0 { - msgUsage = usage - } - if _, err := r.messageService.Persist(ctx, messagepkg.PersistInput{ - BotID: req.BotID, - RouteID: req.RouteID, - SenderChannelIdentityID: messageSenderChannelIdentityID, - SenderUserID: messageSenderUserID, - Platform: req.CurrentChannel, - ExternalMessageID: externalMessageID, - SourceReplyToMessageID: sourceReplyToMessageID, - Role: msg.Role, - Content: content, - Metadata: meta, - Usage: msgUsage, - Assets: assets, - ModelID: modelID, - }); err != nil { - r.logger.Warn("persist message failed", slog.Any("error", err)) - } - } -} - -func isJSONNull(data json.RawMessage) bool { - return len(data) == 0 || bytes.Equal(bytes.TrimSpace(data), []byte("null")) -} - -// outboundAssetRefsToMessageRefs converts outbound asset refs from the streaming -// collector into message-level asset refs for persistence. 
-func outboundAssetRefsToMessageRefs(refs []conversation.OutboundAssetRef) []messagepkg.AssetRef { - if len(refs) == 0 { - return nil - } - result := make([]messagepkg.AssetRef, 0, len(refs)) - for _, ref := range refs { - contentHash := strings.TrimSpace(ref.ContentHash) - if contentHash == "" { - continue - } - role := ref.Role - if strings.TrimSpace(role) == "" { - role = "attachment" - } - result = append(result, messagepkg.AssetRef{ - ContentHash: contentHash, - Role: role, - Ordinal: ref.Ordinal, - Mime: ref.Mime, - SizeBytes: ref.SizeBytes, - StorageKey: ref.StorageKey, - }) - } - return result -} - -// chatAttachmentsToAssetRefs converts ChatAttachment slice to message AssetRef slice. -// Only attachments that carry a content_hash are included. -func chatAttachmentsToAssetRefs(attachments []conversation.ChatAttachment) []messagepkg.AssetRef { - if len(attachments) == 0 { - return nil - } - refs := make([]messagepkg.AssetRef, 0, len(attachments)) - for i, att := range attachments { - contentHash := strings.TrimSpace(att.ContentHash) - if contentHash == "" { - continue - } - ref := messagepkg.AssetRef{ - ContentHash: contentHash, - Role: "attachment", - Ordinal: i, - Mime: strings.TrimSpace(att.Mime), - SizeBytes: att.Size, - } - if att.Metadata != nil { - if sk, ok := att.Metadata["storage_key"].(string); ok { - ref.StorageKey = sk - } - } - refs = append(refs, ref) - } - return refs -} - -func buildRouteMetadata(req conversation.ChatRequest) map[string]any { - if strings.TrimSpace(req.RouteID) == "" && strings.TrimSpace(req.CurrentChannel) == "" { - return nil - } - meta := map[string]any{} - if strings.TrimSpace(req.RouteID) != "" { - meta["route_id"] = req.RouteID - } - if strings.TrimSpace(req.CurrentChannel) != "" { - meta["platform"] = req.CurrentChannel - } - return meta -} - -func (r *Resolver) resolvePersistSenderIDs(ctx context.Context, req conversation.ChatRequest) (string, string) { - channelIdentityID := strings.TrimSpace(req.SourceChannelIdentityID) - userID := strings.TrimSpace(req.UserID) - - senderChannelIdentityID := "" - if r.isExistingChannelIdentityID(ctx, channelIdentityID) { - senderChannelIdentityID = channelIdentityID - } - - senderUserID := "" - if r.isExistingUserID(ctx, userID) { - senderUserID = userID - } - if senderUserID == "" && senderChannelIdentityID != "" { - if linked := r.linkedUserIDFromChannelIdentity(ctx, senderChannelIdentityID); linked != "" { - senderUserID = linked - } - } - return senderChannelIdentityID, senderUserID -} - -func (r *Resolver) isExistingChannelIdentityID(ctx context.Context, id string) bool { - if r.queries == nil { - return false - } - pgID, err := parseResolverUUID(id) - if err != nil { - return false - } - _, err = r.queries.GetChannelIdentityByID(ctx, pgID) - return err == nil -} - -func (r *Resolver) isExistingUserID(ctx context.Context, id string) bool { - if r.queries == nil { - return false - } - pgID, err := parseResolverUUID(id) - if err != nil { - return false - } - _, err = r.queries.GetUserByID(ctx, pgID) - return err == nil -} - -func (r *Resolver) linkedUserIDFromChannelIdentity(ctx context.Context, channelIdentityID string) string { - if r.queries == nil { - return "" - } - pgID, err := parseResolverUUID(channelIdentityID) - if err != nil { - return "" - } - row, err := r.queries.GetChannelIdentityByID(ctx, pgID) - if err != nil || !row.UserID.Valid { - return "" - } - return row.UserID.String() -} - -// resolveDisplayName returns the best available display name for the request identity: -// 
req.DisplayName if set, else channel identity's display_name, else linked user's display_name, else "User". -func (r *Resolver) resolveDisplayName(ctx context.Context, req conversation.ChatRequest) string { - if name := strings.TrimSpace(req.DisplayName); name != "" { - return name - } - if r.queries == nil { - return "User" - } - channelIdentityID := strings.TrimSpace(req.SourceChannelIdentityID) - if channelIdentityID == "" { - return "User" - } - pgID, err := parseResolverUUID(channelIdentityID) - if err != nil { - return "User" - } - ci, err := r.queries.GetChannelIdentityByID(ctx, pgID) - if err == nil && ci.DisplayName.Valid { - if name := strings.TrimSpace(ci.DisplayName.String); name != "" { - return name - } - } - linkedUserID := r.linkedUserIDFromChannelIdentity(ctx, channelIdentityID) - if linkedUserID == "" { - return "User" - } - userPgID, err := parseResolverUUID(linkedUserID) - if err != nil { - return "User" - } - u, err := r.queries.GetUserByID(ctx, userPgID) - if err != nil || !u.DisplayName.Valid { - return "User" - } - if name := strings.TrimSpace(u.DisplayName.String); name != "" { - return name - } - return "User" -} - -func (r *Resolver) storeMemory(ctx context.Context, req conversation.ChatRequest, messages []conversation.ModelMessage) { - botID := strings.TrimSpace(req.BotID) - if botID == "" { - return - } - memMsgs := toProviderMessages(messages) - if len(memMsgs) == 0 { - return - } - - p := r.resolveMemoryProvider(ctx, botID) - if p == nil { - return - } - if err := p.OnAfterChat(ctx, memprovider.AfterChatRequest{ - BotID: botID, - Messages: memMsgs, - UserID: strings.TrimSpace(req.UserID), - ChannelIdentityID: strings.TrimSpace(req.SourceChannelIdentityID), - DisplayName: r.resolveDisplayName(ctx, req), - }); err != nil { - r.logger.Warn("memory provider OnAfterChat failed", slog.String("bot_id", botID), slog.Any("error", err)) - } -} - -func toProviderMessages(messages []conversation.ModelMessage) []memprovider.Message { - out := make([]memprovider.Message, 0, len(messages)) - for _, msg := range messages { - text := strings.TrimSpace(msg.TextContent()) - if text == "" { - continue - } - role := strings.TrimSpace(msg.Role) - if role == "" { - role = "assistant" - } - out = append(out, memprovider.Message{Role: role, Content: text}) - } - return out -} - -// --- model selection --- - -func (r *Resolver) selectChatModel(ctx context.Context, req conversation.ChatRequest, botSettings settings.Settings, cs conversation.Settings) (models.GetResponse, sqlc.LlmProvider, error) { - if r.modelsService == nil { - return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("models service not configured") - } - modelID := strings.TrimSpace(req.Model) - providerFilter := strings.TrimSpace(req.Provider) - - // Priority: request model > chat settings > bot settings. 
- if modelID == "" && providerFilter == "" { - if value := strings.TrimSpace(cs.ModelID); value != "" { - modelID = value - } else if value := strings.TrimSpace(botSettings.ChatModelID); value != "" { - modelID = value - } - } - - if modelID == "" { - return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("chat model not configured: specify model in request or bot settings") - } - - if providerFilter == "" { - return r.fetchChatModel(ctx, modelID) - } - - candidates, err := r.listCandidates(ctx, providerFilter) - if err != nil { - return models.GetResponse{}, sqlc.LlmProvider{}, err - } - for _, m := range candidates { - if matchesModelReference(m, modelID) { - prov, err := models.FetchProviderByID(ctx, r.queries, m.LlmProviderID) - if err != nil { - return models.GetResponse{}, sqlc.LlmProvider{}, err - } - return m, prov, nil - } - } - return models.GetResponse{}, sqlc.LlmProvider{}, fmt.Errorf("chat model %q not found for provider %q", modelID, providerFilter) -} - -func (r *Resolver) fetchChatModel(ctx context.Context, modelID string) (models.GetResponse, sqlc.LlmProvider, error) { - modelRef := strings.TrimSpace(modelID) - if modelRef == "" { - return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("model id is required") - } - - // Support both model UUID and model_id slug. UUID-formatted slugs still - // work because we fall back to GetByModelID when UUID lookup misses. - var model models.GetResponse - var err error - if _, parseErr := db.ParseUUID(modelRef); parseErr == nil { - model, err = r.modelsService.GetByID(ctx, modelRef) - if err == nil { - goto resolved - } - if !errors.Is(err, pgx.ErrNoRows) { - return models.GetResponse{}, sqlc.LlmProvider{}, err - } - } - model, err = r.modelsService.GetByModelID(ctx, modelRef) - if err != nil { - return models.GetResponse{}, sqlc.LlmProvider{}, err - } - -resolved: - if model.Type != models.ModelTypeChat { - return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("model is not a chat model") - } - prov, err := models.FetchProviderByID(ctx, r.queries, model.LlmProviderID) - if err != nil { - return models.GetResponse{}, sqlc.LlmProvider{}, err - } - return model, prov, nil -} - -func matchesModelReference(model models.GetResponse, modelRef string) bool { - ref := strings.TrimSpace(modelRef) - if ref == "" { - return false - } - return model.ID == ref || model.ModelID == ref -} - -func (r *Resolver) listCandidates(ctx context.Context, providerFilter string) ([]models.GetResponse, error) { - var all []models.GetResponse - var err error - if providerFilter != "" { - all, err = r.modelsService.ListByClientType(ctx, models.ClientType(providerFilter)) - } else { - all, err = r.modelsService.ListByType(ctx, models.ModelTypeChat) - } - if err != nil { - return nil, err - } - filtered := make([]models.GetResponse, 0, len(all)) - for _, m := range all { - if m.Type == models.ModelTypeChat { - filtered = append(filtered, m) - } - } - return filtered, nil -} - -// --- inbox --- - -func (r *Resolver) markInboxRead(ctx context.Context, botID string, ids []string) { - if r.inboxService == nil || len(ids) == 0 { - return - } - if err := r.inboxService.MarkRead(ctx, botID, ids); err != nil { - r.logger.Warn("failed to mark inbox items as read", slog.String("bot_id", botID), slog.Any("error", err)) - } -} - -// --- allowed actions --- - -// --- settings --- - -func (r *Resolver) loadBotSettings(ctx context.Context, botID string) (settings.Settings, error) { - if r.settingsService == nil { - return settings.Settings{}, errors.New("settings 
service not configured") - } - return r.settingsService.GetBot(ctx, botID) -} - -func (r *Resolver) loadBotLoopDetectionEnabled(ctx context.Context, botID string) bool { - if r.queries == nil { - return false - } - botUUID, err := db.ParseUUID(botID) - if err != nil { - return false - } - row, err := r.queries.GetBotByID(ctx, botUUID) - if err != nil { - r.logger.Debug("failed to load bot metadata for loop detection", - slog.String("bot_id", botID), - slog.Any("error", err), - ) - return false - } - return parseLoopDetectionEnabledFromMetadata(row.Metadata) -} - -func parseLoopDetectionEnabledFromMetadata(payload []byte) bool { - if len(payload) == 0 { - return false - } - var metadata map[string]any - if err := json.Unmarshal(payload, &metadata); err != nil || metadata == nil { - return false - } - features, ok := metadata["features"].(map[string]any) - if !ok { - return false - } - loopDetection, ok := features["loop_detection"].(map[string]any) - if !ok { - return false - } - enabled, ok := loopDetection["enabled"].(bool) - if !ok { - return false - } - return enabled -} - -// --- utility --- - -func sanitizeMessages(messages []conversation.ModelMessage) []conversation.ModelMessage { - cleaned := make([]conversation.ModelMessage, 0, len(messages)) - for _, msg := range messages { - if normalized, ok := normalizeImagePartsToDataURL(msg); ok { - msg = normalized - } - if strings.TrimSpace(msg.Role) == "" { - continue - } - if !msg.HasContent() && strings.TrimSpace(msg.ToolCallID) == "" { - continue - } - cleaned = append(cleaned, msg) - } - return cleaned -} - -func normalizeImagePartsToDataURL(msg conversation.ModelMessage) (conversation.ModelMessage, bool) { - if len(msg.Content) == 0 { - return msg, false - } - var parts []map[string]json.RawMessage - if err := json.Unmarshal(msg.Content, &parts); err != nil || len(parts) == 0 { - return msg, false - } - - changed := false - for i := range parts { - partTypeRaw, ok := parts[i]["type"] - if !ok { - continue - } - var partType string - if err := json.Unmarshal(partTypeRaw, &partType); err != nil || !strings.EqualFold(partType, "image") { - continue - } - - imageRaw, ok := parts[i]["image"] - if !ok || len(imageRaw) == 0 { - continue - } - var tmp string - if json.Unmarshal(imageRaw, &tmp) == nil { - continue - } - - var payload []byte - if b, ok := decodeIndexedByteObject(imageRaw); ok { - payload = b - } else if b, ok := decodeByteArray(imageRaw); ok { - payload = b - } else { - continue - } - if len(payload) == 0 { - continue - } - - // action trigger to image only here. 
- mediaType := "application/octet-stream" - if mediaTypeRaw, ok := parts[i]["mediaType"]; ok { - var mt string - if err := json.Unmarshal(mediaTypeRaw, &mt); err == nil && strings.TrimSpace(mt) != "" { - mediaType = strings.TrimSpace(mt) - } - } - dataURL := "data:" + mediaType + ";base64," + base64.StdEncoding.EncodeToString(payload) - rebuilt, err := json.Marshal(dataURL) - if err != nil { - continue - } - parts[i]["image"] = rebuilt - changed = true - } - - if !changed { - return msg, false - } - rebuiltContent, err := json.Marshal(parts) - if err != nil { - return msg, false - } - msg.Content = rebuiltContent - return msg, true -} - -func decodeByteArray(raw json.RawMessage) ([]byte, bool) { - var arr []int - if err := json.Unmarshal(raw, &arr); err != nil { - return nil, false - } - if len(arr) == 0 { - return nil, false - } - out := make([]byte, len(arr)) - for i, v := range arr { - if v < 0 || v > 255 { - return nil, false - } - out[i] = byte(v) - } - return out, true -} - -func decodeIndexedByteObject(raw json.RawMessage) ([]byte, bool) { - var obj map[string]json.RawMessage - if err := json.Unmarshal(raw, &obj); err != nil || len(obj) == 0 { - return nil, false - } - type indexedByte struct { - idx int - val byte - } - items := make([]indexedByte, 0, len(obj)) - for k, vRaw := range obj { - idx, err := strconv.Atoi(k) - if err != nil || idx < 0 { - return nil, false - } - var val int - if err := json.Unmarshal(vRaw, &val); err != nil || val < 0 || val > 255 { - return nil, false - } - items = append(items, indexedByte{idx: idx, val: byte(val)}) - } - sort.Slice(items, func(i, j int) bool { return items[i].idx < items[j].idx }) - for i := range items { - if items[i].idx != i { - return nil, false - } - } - out := make([]byte, len(items)) - for i := range items { - out[i] = items[i].val - } - return out, true -} - -func normalizeGatewaySkill(entry SkillEntry) (gatewaySkill, bool) { - name := strings.TrimSpace(entry.Name) - if name == "" { - return gatewaySkill{}, false - } - description := strings.TrimSpace(entry.Description) - if description == "" { - description = name - } - content := strings.TrimSpace(entry.Content) - if content == "" { - content = description - } - return gatewaySkill{ - Name: name, - Description: description, - Content: content, - Metadata: entry.Metadata, - }, true -} - -func dedup(items []string) []string { - seen := make(map[string]struct{}, len(items)) - result := make([]string, 0, len(items)) - for _, s := range items { - trimmed := strings.TrimSpace(s) - if trimmed == "" { - continue - } - if _, ok := seen[trimmed]; ok { - continue - } - seen[trimmed] = struct{}{} - result = append(result, trimmed) - } - return result -} - -func coalescePositiveInt(values ...int) int { - for _, v := range values { - if v > 0 { - return v - } - } - return defaultMaxContextMinutes -} - -func nonNilStrings(s []string) []string { - if s == nil { - return []string{} - } - return s -} - -func nonNilModelMessages(m []conversation.ModelMessage) []conversation.ModelMessage { - if m == nil { - return []conversation.ModelMessage{} - } - return m -} - -func truncate(s string, n int) string { - return textutil.TruncateRunesWithSuffix(s, n, "...") -} - -func parseResolverUUID(id string) (pgtype.UUID, error) { - if strings.TrimSpace(id) == "" { - return pgtype.UUID{}, errors.New("empty id") - } - return db.ParseUUID(id) -} - -// UserMessageMeta holds the structured metadata attached to every user -// message. 
It is the single source of truth shared by the YAML header -// (sent to the LLM) and the inbox content JSONB. -type UserMessageMeta struct { - MessageID string `json:"message-id,omitempty"` - ChannelIdentityID string `json:"channel-identity-id"` - DisplayName string `json:"display-name"` - Channel string `json:"channel"` - ConversationType string `json:"conversation-type"` - ConversationName string `json:"conversation-name,omitempty"` - Time string `json:"time"` - AttachmentPaths []string `json:"attachments"` -} - -// BuildUserMessageMeta constructs a UserMessageMeta from the inbound -// parameters. Both FormatUserHeader and inbox content use this. -func BuildUserMessageMeta(messageID, channelIdentityID, displayName, channel, conversationType, conversationName string, attachmentPaths []string) UserMessageMeta { - if attachmentPaths == nil { - attachmentPaths = []string{} - } - return UserMessageMeta{ - MessageID: messageID, - ChannelIdentityID: channelIdentityID, - DisplayName: displayName, - Channel: channel, - ConversationType: conversationType, - ConversationName: conversationName, - Time: time.Now().UTC().Format(time.RFC3339), - AttachmentPaths: attachmentPaths, - } -} - -// ToMap returns the metadata as a map with the same keys used in the YAML -// header, suitable for storing as inbox content JSONB. -func (m UserMessageMeta) ToMap() map[string]any { - result := map[string]any{ - "channel-identity-id": m.ChannelIdentityID, - "display-name": m.DisplayName, - "channel": m.Channel, - "conversation-type": m.ConversationType, - "time": m.Time, - "attachments": m.AttachmentPaths, - } - if m.MessageID != "" { - result["message-id"] = m.MessageID - } - if m.ConversationName != "" { - result["conversation-name"] = m.ConversationName - } - return result -} - -// FormatUserHeader wraps a user query with YAML front-matter metadata so -// the LLM sees structured context (sender, channel, time, attachments) -// alongside the raw message. This must be the single source of truth for -// user-message formatting — the agent gateway must NOT add its own header. -func FormatUserHeader(messageID, channelIdentityID, displayName, channel, conversationType, conversationName string, attachmentPaths []string, query string) string { - meta := BuildUserMessageMeta(messageID, channelIdentityID, displayName, channel, conversationType, conversationName, attachmentPaths) - return FormatUserHeaderFromMeta(meta, query) -} - -// FormatUserHeaderFromMeta formats a pre-built UserMessageMeta into the -// YAML front-matter string sent to the LLM. 
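For orientation, this is the front-matter shape the formatter below emits for a simple direct message. All concrete values (msg-123, ci-42, telegram, Alice, and the query) are hypothetical. Note that the time value is quoted because needsYAMLQuote treats ':' as YAML-significant, plain identifiers stay unquoted, and conversation-name is omitted when empty:

```go
// Hypothetical rendering of FormatUserHeaderFromMeta for a direct message.
const exampleUserHeader = `---
message-id: msg-123
channel-identity-id: ci-42
display-name: Alice
channel: telegram
conversation-type: direct
time: "2026-03-19T05:31:54Z"
attachments: []
---
What is on my calendar today?`
```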
-func FormatUserHeaderFromMeta(meta UserMessageMeta, query string) string { - var sb strings.Builder - sb.WriteString("---\n") - if meta.MessageID != "" { - writeYAMLString(&sb, "message-id", meta.MessageID) - } - writeYAMLString(&sb, "channel-identity-id", meta.ChannelIdentityID) - writeYAMLString(&sb, "display-name", meta.DisplayName) - writeYAMLString(&sb, "channel", meta.Channel) - writeYAMLString(&sb, "conversation-type", meta.ConversationType) - if meta.ConversationName != "" { - writeYAMLString(&sb, "conversation-name", meta.ConversationName) - } - writeYAMLString(&sb, "time", meta.Time) - if len(meta.AttachmentPaths) > 0 { - sb.WriteString("attachments:\n") - for _, p := range meta.AttachmentPaths { - sb.WriteString(" - ") - sb.WriteString(p) - sb.WriteByte('\n') - } - } else { - sb.WriteString("attachments: []\n") - } - sb.WriteString("---\n") - sb.WriteString(query) - return sb.String() -} - -func writeYAMLString(sb *strings.Builder, key, value string) { - sb.WriteString(key) - sb.WriteString(": ") - if value == "" || needsYAMLQuote(value) { - sb.WriteByte('"') - sb.WriteString(strings.ReplaceAll(value, `"`, `\"`)) - sb.WriteByte('"') - } else { - sb.WriteString(value) - } - sb.WriteByte('\n') -} - -func needsYAMLQuote(s string) bool { - if s == "" { - return true - } - for _, c := range s { - if c == ':' || c == '#' || c == '"' || c == '\'' || c == '{' || c == '}' || c == '[' || c == ']' || c == ',' || c == '\n' { - return true - } - } - return false -} - -// extractFileRefPaths collects container file paths from gateway attachments -// that use the tool_file_ref transport (files already written to the bot container). -func extractFileRefPaths(attachments []any) []string { - var paths []string - for _, att := range attachments { - if ga, ok := att.(gatewayAttachment); ok && ga.Transport == gatewayTransportToolFileRef && strings.TrimSpace(ga.Payload) != "" { - paths = append(paths, ga.Payload) - } - } - return paths + + return cfg } diff --git a/internal/conversation/flow/resolver_attachments.go b/internal/conversation/flow/resolver_attachments.go new file mode 100644 index 00000000..50dae04f --- /dev/null +++ b/internal/conversation/flow/resolver_attachments.go @@ -0,0 +1,255 @@ +package flow + +import ( + "context" + "encoding/base64" + "errors" + "fmt" + "io" + "log/slog" + "net/http" + "strings" + + attachmentpkg "github.com/memohai/memoh/internal/attachment" + "github.com/memohai/memoh/internal/conversation" + "github.com/memohai/memoh/internal/models" +) + +const ( + gatewayInlineAttachmentMaxBytes int64 = 20 * 1024 * 1024 +) + +// routeAndMergeAttachments applies CapabilityFallbackPolicy to split +// request attachments by model input modalities, then merges the results +// into a single []any for the gateway request. 
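routeAttachmentsByCapability is defined elsewhere in this package and does not appear in this diff. As a rough sketch of the contract the routing above relies on (the names routedSketch and routeByModalitySketch are invented here; the real implementation may also weigh MIME types and transports, not just the type field):

```go
// Hypothetical sketch only, not the shipped implementation. Attachments whose
// type matches one of the model's input modalities stay native; the rest go
// to the fallback list, which routeAndMergeAttachments rewrites as
// tool_file_ref entries pointing at files in the bot container.
type routedSketch struct {
	Native   []gatewayAttachment
	Fallback []gatewayAttachment
}

func routeByModalitySketch(inputModalities []string, atts []gatewayAttachment) routedSketch {
	supported := make(map[string]struct{}, len(inputModalities))
	for _, m := range inputModalities {
		supported[strings.ToLower(strings.TrimSpace(m))] = struct{}{}
	}
	var out routedSketch
	for _, att := range atts {
		if _, ok := supported[strings.ToLower(strings.TrimSpace(att.Type))]; ok {
			out.Native = append(out.Native, att)
		} else {
			out.Fallback = append(out.Fallback, att)
		}
	}
	return out
}
```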
+func (r *Resolver) routeAndMergeAttachments(ctx context.Context, model models.GetResponse, req conversation.ChatRequest) []any { + if len(req.Attachments) == 0 { + return []any{} + } + typed := r.prepareGatewayAttachments(ctx, req) + routed := routeAttachmentsByCapability(model.InputModalities, typed) + for i := range routed.Fallback { + fallbackPath := strings.TrimSpace(routed.Fallback[i].FallbackPath) + if fallbackPath == "" { + if r != nil && r.logger != nil { + r.logger.Warn( + "drop attachment without fallback path", + slog.String("type", strings.TrimSpace(routed.Fallback[i].Type)), + slog.String("transport", strings.TrimSpace(routed.Fallback[i].Transport)), + slog.String("content_hash", strings.TrimSpace(routed.Fallback[i].ContentHash)), + slog.Bool("has_payload", strings.TrimSpace(routed.Fallback[i].Payload) != ""), + ) + } + routed.Fallback[i] = gatewayAttachment{} + continue + } + routed.Fallback[i].Type = "file" + routed.Fallback[i].Transport = gatewayTransportToolFileRef + routed.Fallback[i].Payload = fallbackPath + } + merged := make([]any, 0, len(routed.Native)+len(routed.Fallback)) + merged = append(merged, attachmentsToAny(routed.Native)...) + for _, fb := range routed.Fallback { + if fb.Type == "" || strings.TrimSpace(fb.Transport) == "" || strings.TrimSpace(fb.Payload) == "" { + continue + } + merged = append(merged, fb) + } + if len(merged) == 0 { + return []any{} + } + return merged +} + +func (r *Resolver) prepareGatewayAttachments(ctx context.Context, req conversation.ChatRequest) []gatewayAttachment { + if len(req.Attachments) == 0 { + return nil + } + prepared := make([]gatewayAttachment, 0, len(req.Attachments)) + for _, raw := range req.Attachments { + attachmentType := strings.ToLower(strings.TrimSpace(raw.Type)) + payload := strings.TrimSpace(raw.Base64) + transport := "" + fallbackPath := strings.TrimSpace(raw.Path) + if payload != "" { + transport = gatewayTransportInlineDataURL + } else { + rawURL := strings.TrimSpace(raw.URL) + switch { + case isDataURL(rawURL): + payload = rawURL + transport = gatewayTransportInlineDataURL + case isLikelyPublicURL(rawURL): + payload = rawURL + transport = gatewayTransportPublicURL + case rawURL != "" && fallbackPath == "": + fallbackPath = rawURL + } + } + item := gatewayAttachment{ + ContentHash: strings.TrimSpace(raw.ContentHash), + Type: attachmentType, + Mime: strings.TrimSpace(raw.Mime), + Size: raw.Size, + Name: strings.TrimSpace(raw.Name), + Transport: transport, + Payload: payload, + Metadata: raw.Metadata, + FallbackPath: fallbackPath, + } + item = normalizeGatewayAttachmentPayload(item) + item = r.inlineImageAttachmentAssetIfNeeded(ctx, strings.TrimSpace(req.BotID), item) + prepared = append(prepared, item) + } + return prepared +} + +func normalizeGatewayAttachmentPayload(item gatewayAttachment) gatewayAttachment { + if item.Transport != gatewayTransportInlineDataURL { + return item + } + payload := strings.TrimSpace(item.Payload) + if payload == "" { + return item + } + if strings.HasPrefix(strings.ToLower(payload), "data:") { + mime := strings.TrimSpace(item.Mime) + if mime == "" || strings.EqualFold(mime, "application/octet-stream") { + if extracted := attachmentpkg.MimeFromDataURL(payload); extracted != "" { + item.Mime = extracted + } + } + item.Payload = payload + return item + } + mime := strings.TrimSpace(item.Mime) + if mime == "" { + mime = "application/octet-stream" + } + item.Payload = attachmentpkg.NormalizeBase64DataURL(payload, mime) + return item +} + +func isLikelyPublicURL(raw string) bool { + 
trimmed := strings.ToLower(strings.TrimSpace(raw)) + return strings.HasPrefix(trimmed, "http://") || strings.HasPrefix(trimmed, "https://") +} + +func isDataURL(raw string) bool { + trimmed := strings.ToLower(strings.TrimSpace(raw)) + return strings.HasPrefix(trimmed, "data:") +} + +func (r *Resolver) inlineImageAttachmentAssetIfNeeded(ctx context.Context, botID string, item gatewayAttachment) gatewayAttachment { + if item.Type != "image" { + return item + } + if strings.TrimSpace(item.Payload) != "" && + (item.Transport == gatewayTransportInlineDataURL || item.Transport == gatewayTransportPublicURL) { + return item + } + contentHash := strings.TrimSpace(item.ContentHash) + if contentHash == "" { + return item + } + dataURL, mime, err := r.inlineAssetAsDataURL(ctx, botID, contentHash, item.Type, item.Mime) + if err != nil { + if r != nil && r.logger != nil { + r.logger.Warn( + "inline gateway image attachment failed", + slog.Any("error", err), + slog.String("bot_id", botID), + slog.String("content_hash", contentHash), + ) + } + return item + } + item.Transport = gatewayTransportInlineDataURL + item.Payload = dataURL + if strings.TrimSpace(item.Mime) == "" { + item.Mime = mime + } + return item +} + +func (r *Resolver) inlineAssetAsDataURL(ctx context.Context, botID, contentHash, attachmentType, fallbackMime string) (string, string, error) { + if r == nil || r.assetLoader == nil { + return "", "", errors.New("gateway asset loader not configured") + } + reader, assetMime, err := r.assetLoader.OpenForGateway(ctx, botID, contentHash) + if err != nil { + return "", "", fmt.Errorf("open asset: %w", err) + } + defer func() { + _ = reader.Close() + }() + mime := strings.TrimSpace(fallbackMime) + if mime == "" { + mime = strings.TrimSpace(assetMime) + } + dataURL, resolvedMime, err := encodeReaderAsDataURL(reader, gatewayInlineAttachmentMaxBytes, attachmentType, mime) + if err != nil { + return "", "", err + } + return dataURL, resolvedMime, nil +} + +func encodeReaderAsDataURL(reader io.Reader, maxBytes int64, attachmentType, fallbackMime string) (string, string, error) { + if reader == nil { + return "", "", errors.New("reader is required") + } + if maxBytes <= 0 { + return "", "", errors.New("max bytes must be greater than 0") + } + limited := &io.LimitedReader{R: reader, N: maxBytes + 1} + head := make([]byte, 512) + n, err := limited.Read(head) + if err != nil && !errors.Is(err, io.EOF) { + return "", "", fmt.Errorf("read asset: %w", err) + } + head = head[:n] + + mime := strings.TrimSpace(fallbackMime) + if strings.EqualFold(strings.TrimSpace(attachmentType), "image") && + (strings.TrimSpace(mime) == "" || strings.EqualFold(strings.TrimSpace(mime), "application/octet-stream")) { + detected := strings.TrimSpace(http.DetectContentType(head)) + if strings.HasPrefix(strings.ToLower(detected), "image/") { + mime = detected + } + } + if mime == "" { + mime = "application/octet-stream" + } + + var encoded strings.Builder + encoded.Grow(len("data:") + len(mime) + len(";base64,")) + encoded.WriteString("data:") + encoded.WriteString(mime) + encoded.WriteString(";base64,") + + encoder := base64.NewEncoder(base64.StdEncoding, &encoded) + if len(head) > 0 { + if _, err := encoder.Write(head); err != nil { + _ = encoder.Close() + return "", "", fmt.Errorf("encode asset head: %w", err) + } + } + copied, err := io.Copy(encoder, limited) + if err != nil { + _ = encoder.Close() + return "", "", fmt.Errorf("encode asset body: %w", err) + } + if err := encoder.Close(); err != nil { + return "", "", 
fmt.Errorf("finalize asset encoding: %w", err) + } + + total := int64(len(head)) + copied + if total > maxBytes { + return "", "", fmt.Errorf( + "asset too large to inline: %d > %d", + total, + maxBytes, + ) + } + return encoded.String(), mime, nil +} diff --git a/internal/conversation/flow/resolver_history.go b/internal/conversation/flow/resolver_history.go new file mode 100644 index 00000000..ef1ba771 --- /dev/null +++ b/internal/conversation/flow/resolver_history.go @@ -0,0 +1,160 @@ +package flow + +import ( + "context" + "encoding/json" + "log/slog" + "strings" + "time" + + "github.com/memohai/memoh/internal/conversation" +) + +type messageWithUsage struct { + Message conversation.ModelMessage + UsageInputTokens *int + UsageOutputTokens *int + RouteID string + ExternalMessageID string + Platform string + SenderChannelID string +} + +func (r *Resolver) loadMessages(ctx context.Context, chatID string, maxContextMinutes int) ([]messageWithUsage, error) { + if r.messageService == nil { + return nil, nil + } + since := time.Now().UTC().Add(-time.Duration(maxContextMinutes) * time.Minute) + msgs, err := r.messageService.ListActiveSince(ctx, chatID, since) + if err != nil { + return nil, err + } + var result []messageWithUsage + for _, m := range msgs { + var mm conversation.ModelMessage + if err := json.Unmarshal(m.Content, &mm); err != nil { + r.logger.Warn("loadMessages: content unmarshal failed, treating as raw text", + slog.String("chat_id", chatID), slog.Any("error", err)) + mm = conversation.ModelMessage{Role: m.Role, Content: m.Content} + } else { + mm.Role = m.Role + } + var inputTokens *int + var outputTokens *int + if len(m.Usage) > 0 { + var u usageInfo + if json.Unmarshal(m.Usage, &u) == nil { + inputTokens = u.InputTokens + outputTokens = u.OutputTokens + } + } + result = append(result, messageWithUsage{ + Message: mm, + UsageInputTokens: inputTokens, + UsageOutputTokens: outputTokens, + RouteID: strings.TrimSpace(m.RouteID), + ExternalMessageID: strings.TrimSpace(m.ExternalMessageID), + Platform: strings.TrimSpace(m.Platform), + SenderChannelID: strings.TrimSpace(m.SenderChannelIdentityID), + }) + } + return result, nil +} + +func dedupePersistedCurrentUserMessage(messages []messageWithUsage, req conversation.ChatRequest) []messageWithUsage { + if !req.UserMessagePersisted || len(messages) == 0 { + return messages + } + + targetRouteID := strings.TrimSpace(req.RouteID) + targetExternalID := strings.TrimSpace(req.ExternalMessageID) + targetPlatform := strings.TrimSpace(req.CurrentChannel) + targetSenderChannelID := strings.TrimSpace(req.SourceChannelIdentityID) + if targetExternalID == "" { + return messages + } + + for i := len(messages) - 1; i >= 0; i-- { + item := messages[i] + if !strings.EqualFold(strings.TrimSpace(item.Message.Role), "user") { + continue + } + if strings.TrimSpace(item.ExternalMessageID) != targetExternalID { + continue + } + if targetRouteID != "" && item.RouteID != "" && item.RouteID != targetRouteID { + continue + } + if targetPlatform != "" && item.Platform != "" && !strings.EqualFold(item.Platform, targetPlatform) { + continue + } + if targetSenderChannelID != "" && item.SenderChannelID != "" && item.SenderChannelID != targetSenderChannelID { + continue + } + return append(messages[:i], messages[i+1:]...) 
+ } + + return messages +} + +func estimateMessageTokens(msg conversation.ModelMessage) int { + text := msg.TextContent() + if len(text) == 0 { + data, _ := json.Marshal(msg.Content) + return len(data) / 4 + } + return len(text) / 4 +} + +func trimMessagesByTokens(log *slog.Logger, messages []messageWithUsage, maxTokens int) []conversation.ModelMessage { + if maxTokens == 0 || len(messages) == 0 { + result := make([]conversation.ModelMessage, len(messages)) + for i, m := range messages { + result[i] = m.Message + } + return result + } + + // Scan from newest to oldest, accumulating per-message token costs. + // Messages with stored usage data use that value; others fall back to a + // character-based estimate so that user/tool messages are not free-passed. + totalTokens := 0 + cutoff := 0 + messagesWithUsage := 0 + for i := len(messages) - 1; i >= 0; i-- { + if messages[i].UsageOutputTokens != nil { + totalTokens += *messages[i].UsageOutputTokens + messagesWithUsage++ + } else { + totalTokens += estimateMessageTokens(messages[i].Message) + } + if totalTokens > maxTokens { + cutoff = i + 1 + break + } + } + + // Keep provider-valid message order: a "tool" message must follow a preceding + // assistant tool call. When history is head-trimmed, a leading tool message + // may become orphaned and cause provider 400 errors. + for cutoff < len(messages) && strings.EqualFold(strings.TrimSpace(messages[cutoff].Message.Role), "tool") { + cutoff++ + } + + if log != nil { + log.Debug("trimMessagesByTokens", + slog.Int("total_messages", len(messages)), + slog.Int("messages_with_usage", messagesWithUsage), + slog.Int("accumulated_output_tokens", totalTokens), + slog.Int("max_tokens", maxTokens), + slog.Int("cutoff_index", cutoff), + slog.Int("kept_messages", len(messages)-cutoff), + ) + } + + result := make([]conversation.ModelMessage, 0, len(messages)-cutoff) + for _, m := range messages[cutoff:] { + result = append(result, m.Message) + } + return result +} diff --git a/internal/conversation/flow/resolver_identity.go b/internal/conversation/flow/resolver_identity.go new file mode 100644 index 00000000..1168943c --- /dev/null +++ b/internal/conversation/flow/resolver_identity.go @@ -0,0 +1,88 @@ +package flow + +import ( + "context" + "strings" + + "github.com/memohai/memoh/internal/conversation" +) + +// resolveDisplayName returns the best available display name for the request identity: +// req.DisplayName if set, else channel identity's display_name, else linked user's display_name, else "User". 
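Before the identity helpers, a worked example for trimMessagesByTokens above, using hypothetical numbers: take four messages, oldest to newest, costing ~400 (user, estimated), 900 (assistant, stored usage), ~50 (tool, estimated), and 700 (assistant, stored usage) tokens, with maxTokens = 1500. The newest-to-oldest scan accumulates 700, then 750, then 1650; the budget is exceeded at index 1, so cutoff becomes 2. But message 2 is a tool result whose triggering assistant call was just trimmed away, so the orphan check advances cutoff to 3, and only the final assistant message is kept.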
+func (r *Resolver) resolveDisplayName(ctx context.Context, req conversation.ChatRequest) string { + if name := strings.TrimSpace(req.DisplayName); name != "" { + return name + } + if r.queries == nil { + return "User" + } + channelIdentityID := strings.TrimSpace(req.SourceChannelIdentityID) + if channelIdentityID == "" { + return "User" + } + pgID, err := parseResolverUUID(channelIdentityID) + if err != nil { + return "User" + } + ci, err := r.queries.GetChannelIdentityByID(ctx, pgID) + if err == nil && ci.DisplayName.Valid { + if name := strings.TrimSpace(ci.DisplayName.String); name != "" { + return name + } + } + linkedUserID := r.linkedUserIDFromChannelIdentity(ctx, channelIdentityID) + if linkedUserID == "" { + return "User" + } + userPgID, err := parseResolverUUID(linkedUserID) + if err != nil { + return "User" + } + u, err := r.queries.GetUserByID(ctx, userPgID) + if err != nil || !u.DisplayName.Valid { + return "User" + } + if name := strings.TrimSpace(u.DisplayName.String); name != "" { + return name + } + return "User" +} + +func (r *Resolver) isExistingChannelIdentityID(ctx context.Context, id string) bool { + if r.queries == nil { + return false + } + pgID, err := parseResolverUUID(id) + if err != nil { + return false + } + _, err = r.queries.GetChannelIdentityByID(ctx, pgID) + return err == nil +} + +func (r *Resolver) isExistingUserID(ctx context.Context, id string) bool { + if r.queries == nil { + return false + } + pgID, err := parseResolverUUID(id) + if err != nil { + return false + } + _, err = r.queries.GetUserByID(ctx, pgID) + return err == nil +} + +func (r *Resolver) linkedUserIDFromChannelIdentity(ctx context.Context, channelIdentityID string) string { + if r.queries == nil { + return "" + } + pgID, err := parseResolverUUID(channelIdentityID) + if err != nil { + return "" + } + row, err := r.queries.GetChannelIdentityByID(ctx, pgID) + if err != nil || !row.UserID.Valid { + return "" + } + return row.UserID.String() +} diff --git a/internal/conversation/flow/resolver_memory.go b/internal/conversation/flow/resolver_memory.go new file mode 100644 index 00000000..c1d198c0 --- /dev/null +++ b/internal/conversation/flow/resolver_memory.go @@ -0,0 +1,97 @@ +package flow + +import ( + "context" + "log/slog" + "strings" + + "github.com/memohai/memoh/internal/conversation" + memprovider "github.com/memohai/memoh/internal/memory/adapters" +) + +func (r *Resolver) resolveMemoryProvider(ctx context.Context, botID string) memprovider.Provider { + if r.memoryRegistry == nil { + return nil + } + if r.settingsService == nil { + return nil + } + botSettings, err := r.settingsService.GetBot(ctx, botID) + if err != nil { + return nil + } + providerID := strings.TrimSpace(botSettings.MemoryProviderID) + if providerID == "" { + return nil + } + p, err := r.memoryRegistry.Get(providerID) + if err != nil { + r.logger.Warn("memory provider lookup failed", slog.String("provider_id", providerID), slog.Any("error", err)) + return nil + } + return p +} + +func (r *Resolver) loadMemoryContextMessage(ctx context.Context, req conversation.ChatRequest) *conversation.ModelMessage { + p := r.resolveMemoryProvider(ctx, req.BotID) + if p == nil { + return nil + } + result, err := p.OnBeforeChat(ctx, memprovider.BeforeChatRequest{ + Query: req.Query, + BotID: req.BotID, + ChatID: req.ChatID, + }) + if err != nil { + r.logger.Warn("memory provider OnBeforeChat failed", slog.Any("error", err)) + return nil + } + if result == nil || strings.TrimSpace(result.ContextText) == "" { + return nil + } + return 
&conversation.ModelMessage{ + Role: "user", + Content: conversation.NewTextContent(result.ContextText), + } +} + +func (r *Resolver) storeMemory(ctx context.Context, req conversation.ChatRequest, messages []conversation.ModelMessage) { + botID := strings.TrimSpace(req.BotID) + if botID == "" { + return + } + memMsgs := toProviderMessages(messages) + if len(memMsgs) == 0 { + return + } + + p := r.resolveMemoryProvider(ctx, botID) + if p == nil { + return + } + if err := p.OnAfterChat(ctx, memprovider.AfterChatRequest{ + BotID: botID, + Messages: memMsgs, + UserID: strings.TrimSpace(req.UserID), + ChannelIdentityID: strings.TrimSpace(req.SourceChannelIdentityID), + DisplayName: r.resolveDisplayName(ctx, req), + }); err != nil { + r.logger.Warn("memory provider OnAfterChat failed", slog.String("bot_id", botID), slog.Any("error", err)) + } +} + +func toProviderMessages(messages []conversation.ModelMessage) []memprovider.Message { + out := make([]memprovider.Message, 0, len(messages)) + for _, msg := range messages { + text := strings.TrimSpace(msg.TextContent()) + if text == "" { + continue + } + role := strings.TrimSpace(msg.Role) + if role == "" { + role = "assistant" + } + out = append(out, memprovider.Message{Role: role, Content: text}) + } + return out +} diff --git a/internal/conversation/flow/resolver_messages.go b/internal/conversation/flow/resolver_messages.go new file mode 100644 index 00000000..3be9b5c9 --- /dev/null +++ b/internal/conversation/flow/resolver_messages.go @@ -0,0 +1,84 @@ +package flow + +import ( + "encoding/json" + "strings" + + sdk "github.com/memohai/twilight-ai/sdk" + + "github.com/memohai/memoh/internal/conversation" +) + +// sdkMessagesToModelMessages converts SDK messages to the persistence/API format +// at the resolver boundary. This is the only place where this conversion should happen. +func sdkMessagesToModelMessages(msgs []sdk.Message) []conversation.ModelMessage { + result := make([]conversation.ModelMessage, 0, len(msgs)) + for _, msg := range msgs { + data, err := json.Marshal(msg) + if err != nil { + continue + } + var envelope struct { + Content json.RawMessage `json:"content"` + } + if err := json.Unmarshal(data, &envelope); err != nil { + continue + } + result = append(result, conversation.ModelMessage{ + Role: string(msg.Role), + Content: envelope.Content, + }) + } + return result +} + +// modelMessageToSDKMessage converts a persistence format message to SDK message +// at the resolver boundary using sdk.Message's native JSON deserialization. +func modelMessageToSDKMessage(mm conversation.ModelMessage) sdk.Message { + var s string + if err := json.Unmarshal(mm.Content, &s); err == nil { + return sdk.Message{ + Role: sdk.MessageRole(mm.Role), + Content: []sdk.MessagePart{sdk.TextPart{Text: s}}, + } + } + + // Try the full sdk.Message format (content is an array of typed parts) + envelope, _ := json.Marshal(struct { + Role string `json:"role"` + Content json.RawMessage `json:"content"` + }{ + Role: mm.Role, + Content: mm.Content, + }) + var msg sdk.Message + if err := json.Unmarshal(envelope, &msg); err == nil { + return msg + } + + return sdk.Message{Role: sdk.MessageRole(mm.Role)} +} + +// prependUserMessage prepends the user query as a ModelMessage to the output +// messages from the agent. The SDK only returns output messages (assistant + tool); +// user messages must be added back at the resolver boundary for persistence. 
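Both persisted content shapes round-trip through the converters above. A minimal sketch (values are hypothetical, and the typed-parts branch assumes sdk.Message can unmarshal its own part array, which the fallback path depends on):

```go
func exampleBoundaryConversion() {
	// Legacy shape: content is a bare JSON string, mapped to one TextPart.
	legacy := conversation.ModelMessage{Role: "user", Content: json.RawMessage(`"hello"`)}
	_ = modelMessageToSDKMessage(legacy)

	// Typed shape: content is an array of parts; it is re-wrapped in a
	// {role, content} envelope and decoded by sdk.Message itself.
	typed := conversation.ModelMessage{Role: "assistant", Content: json.RawMessage(`[{"type":"text","text":"hi"}]`)}
	_ = modelMessageToSDKMessage(typed)
}
```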
+func prependUserMessage(query string, output []conversation.ModelMessage) []conversation.ModelMessage { + if strings.TrimSpace(query) == "" { + return output + } + round := make([]conversation.ModelMessage, 0, 1+len(output)) + round = append(round, conversation.ModelMessage{ + Role: "user", + Content: conversation.NewTextContent(query), + }) + return append(round, output...) +} + +// modelMessagesToSDKMessages converts a slice of persistence messages to SDK messages. +func modelMessagesToSDKMessages(msgs []conversation.ModelMessage) []sdk.Message { + result := make([]sdk.Message, 0, len(msgs)) + for _, mm := range msgs { + result = append(result, modelMessageToSDKMessage(mm)) + } + return result +} diff --git a/internal/conversation/flow/resolver_model_selection.go b/internal/conversation/flow/resolver_model_selection.go new file mode 100644 index 00000000..a608a078 --- /dev/null +++ b/internal/conversation/flow/resolver_model_selection.go @@ -0,0 +1,119 @@ +package flow + +import ( + "context" + "errors" + "fmt" + "strings" + + "github.com/jackc/pgx/v5" + + "github.com/memohai/memoh/internal/conversation" + "github.com/memohai/memoh/internal/db" + "github.com/memohai/memoh/internal/db/sqlc" + "github.com/memohai/memoh/internal/models" + "github.com/memohai/memoh/internal/settings" +) + +func (r *Resolver) selectChatModel(ctx context.Context, req conversation.ChatRequest, botSettings settings.Settings, cs conversation.Settings) (models.GetResponse, sqlc.LlmProvider, error) { + if r.modelsService == nil { + return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("models service not configured") + } + modelID := strings.TrimSpace(req.Model) + providerFilter := strings.TrimSpace(req.Provider) + + // Priority: request model > chat settings > bot settings. + if modelID == "" && providerFilter == "" { + if value := strings.TrimSpace(cs.ModelID); value != "" { + modelID = value + } else if value := strings.TrimSpace(botSettings.ChatModelID); value != "" { + modelID = value + } + } + + if modelID == "" { + return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("chat model not configured: specify model in request or bot settings") + } + + if providerFilter == "" { + return r.fetchChatModel(ctx, modelID) + } + + candidates, err := r.listCandidates(ctx, providerFilter) + if err != nil { + return models.GetResponse{}, sqlc.LlmProvider{}, err + } + for _, m := range candidates { + if matchesModelReference(m, modelID) { + prov, err := models.FetchProviderByID(ctx, r.queries, m.LlmProviderID) + if err != nil { + return models.GetResponse{}, sqlc.LlmProvider{}, err + } + return m, prov, nil + } + } + return models.GetResponse{}, sqlc.LlmProvider{}, fmt.Errorf("chat model %q not found for provider %q", modelID, providerFilter) +} + +func (r *Resolver) fetchChatModel(ctx context.Context, modelID string) (models.GetResponse, sqlc.LlmProvider, error) { + modelRef := strings.TrimSpace(modelID) + if modelRef == "" { + return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("model id is required") + } + + // Support both model UUID and model_id slug. UUID-formatted slugs still + // work because we fall back to GetByModelID when UUID lookup misses. 
+ var model models.GetResponse + var err error + if _, parseErr := db.ParseUUID(modelRef); parseErr == nil { + model, err = r.modelsService.GetByID(ctx, modelRef) + if err == nil { + goto resolved + } + if !errors.Is(err, pgx.ErrNoRows) { + return models.GetResponse{}, sqlc.LlmProvider{}, err + } + } + model, err = r.modelsService.GetByModelID(ctx, modelRef) + if err != nil { + return models.GetResponse{}, sqlc.LlmProvider{}, err + } + +resolved: + if model.Type != models.ModelTypeChat { + return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("model is not a chat model") + } + prov, err := models.FetchProviderByID(ctx, r.queries, model.LlmProviderID) + if err != nil { + return models.GetResponse{}, sqlc.LlmProvider{}, err + } + return model, prov, nil +} + +func matchesModelReference(model models.GetResponse, modelRef string) bool { + ref := strings.TrimSpace(modelRef) + if ref == "" { + return false + } + return model.ID == ref || model.ModelID == ref +} + +func (r *Resolver) listCandidates(ctx context.Context, providerFilter string) ([]models.GetResponse, error) { + var all []models.GetResponse + var err error + if providerFilter != "" { + all, err = r.modelsService.ListByClientType(ctx, models.ClientType(providerFilter)) + } else { + all, err = r.modelsService.ListByType(ctx, models.ModelTypeChat) + } + if err != nil { + return nil, err + } + filtered := make([]models.GetResponse, 0, len(all)) + for _, m := range all { + if m.Type == models.ModelTypeChat { + filtered = append(filtered, m) + } + } + return filtered, nil +} diff --git a/internal/conversation/flow/resolver_settings.go b/internal/conversation/flow/resolver_settings.go new file mode 100644 index 00000000..d4eeecad --- /dev/null +++ b/internal/conversation/flow/resolver_settings.go @@ -0,0 +1,69 @@ +package flow + +import ( + "context" + "encoding/json" + "errors" + "log/slog" + + "github.com/memohai/memoh/internal/db" + "github.com/memohai/memoh/internal/settings" +) + +func (r *Resolver) loadBotSettings(ctx context.Context, botID string) (settings.Settings, error) { + if r.settingsService == nil { + return settings.Settings{}, errors.New("settings service not configured") + } + return r.settingsService.GetBot(ctx, botID) +} + +func (r *Resolver) loadBotLoopDetectionEnabled(ctx context.Context, botID string) bool { + if r.queries == nil { + return false + } + botUUID, err := db.ParseUUID(botID) + if err != nil { + return false + } + row, err := r.queries.GetBotByID(ctx, botUUID) + if err != nil { + r.logger.Debug("failed to load bot metadata for loop detection", + slog.String("bot_id", botID), + slog.Any("error", err), + ) + return false + } + return parseLoopDetectionEnabledFromMetadata(row.Metadata) +} + +func parseLoopDetectionEnabledFromMetadata(payload []byte) bool { + if len(payload) == 0 { + return false + } + var metadata map[string]any + if err := json.Unmarshal(payload, &metadata); err != nil || metadata == nil { + return false + } + features, ok := metadata["features"].(map[string]any) + if !ok { + return false + } + loopDetection, ok := features["loop_detection"].(map[string]any) + if !ok { + return false + } + enabled, ok := loopDetection["enabled"].(bool) + if !ok { + return false + } + return enabled +} + +func (r *Resolver) markInboxRead(ctx context.Context, botID string, ids []string) { + if r.inboxService == nil || len(ids) == 0 { + return + } + if err := r.inboxService.MarkRead(ctx, botID, ids); err != nil { + r.logger.Warn("failed to mark inbox items as read", slog.String("bot_id", botID), 
slog.Any("error", err)) + } +} diff --git a/internal/conversation/flow/resolver_skills_test.go b/internal/conversation/flow/resolver_skills_test.go deleted file mode 100644 index 551a14aa..00000000 --- a/internal/conversation/flow/resolver_skills_test.go +++ /dev/null @@ -1,32 +0,0 @@ -package flow - -import "testing" - -func TestNormalizeGatewaySkill_Fallbacks(t *testing.T) { - got, ok := normalizeGatewaySkill(SkillEntry{ - Name: " demo-skill ", - }) - if !ok { - t.Fatal("expected valid skill") - } - if got.Name != "demo-skill" { - t.Fatalf("expected trimmed name demo-skill, got %q", got.Name) - } - if got.Description != "demo-skill" { - t.Fatalf("expected description fallback to name, got %q", got.Description) - } - if got.Content != "demo-skill" { - t.Fatalf("expected content fallback to description, got %q", got.Content) - } -} - -func TestNormalizeGatewaySkill_RejectsEmptyName(t *testing.T) { - _, ok := normalizeGatewaySkill(SkillEntry{ - Name: " ", - Description: "desc", - Content: "content", - }) - if ok { - t.Fatal("expected invalid skill when name is empty") - } -} diff --git a/internal/conversation/flow/resolver_store.go b/internal/conversation/flow/resolver_store.go new file mode 100644 index 00000000..9cbdadef --- /dev/null +++ b/internal/conversation/flow/resolver_store.go @@ -0,0 +1,238 @@ +package flow + +import ( + "bytes" + "context" + "encoding/json" + "log/slog" + "strings" + + "github.com/memohai/memoh/internal/conversation" + messagepkg "github.com/memohai/memoh/internal/message" +) + +func (r *Resolver) storeRound(ctx context.Context, req conversation.ChatRequest, messages []conversation.ModelMessage, usage json.RawMessage, usages []json.RawMessage, modelID string) error { + fullRound := make([]conversation.ModelMessage, 0, len(messages)) + roundUsages := make([]json.RawMessage, 0, len(usages)) + + // When the user message was already persisted by a channel adapter, skip + // the duplicate from the round. Otherwise keep it so that user + assistant + // messages are written atomically (deferred persistence). + skipUserQuery := req.UserMessagePersisted + for i, m := range messages { + if skipUserQuery && m.Role == "user" && strings.TrimSpace(m.TextContent()) == strings.TrimSpace(req.Query) { + skipUserQuery = false // only skip the first matching user message + continue + } + fullRound = append(fullRound, m) + if i < len(usages) { + roundUsages = append(roundUsages, usages[i]) + } + } + if len(fullRound) == 0 { + return nil + } + + r.storeMessages(ctx, req, fullRound, usage, roundUsages, modelID) + go r.storeMemory(context.WithoutCancel(ctx), req, fullRound) + return nil +} + +func (r *Resolver) storeMessages(ctx context.Context, req conversation.ChatRequest, messages []conversation.ModelMessage, usage json.RawMessage, usages []json.RawMessage, modelID string) { + if r.messageService == nil { + return + } + if strings.TrimSpace(req.BotID) == "" { + return + } + meta := buildRouteMetadata(req) + senderChannelIdentityID, senderUserID := r.resolvePersistSenderIDs(ctx, req) + + // Determine the last assistant message index for outbound asset attachment. 
+ lastAssistantIdx := -1 + if req.OutboundAssetCollector != nil { + for i := len(messages) - 1; i >= 0; i-- { + if messages[i].Role == "assistant" { + lastAssistantIdx = i + break + } + } + } + var outboundAssets []messagepkg.AssetRef + if lastAssistantIdx >= 0 { + outboundAssets = outboundAssetRefsToMessageRefs(req.OutboundAssetCollector()) + } + + for i, msg := range messages { + content, err := json.Marshal(msg) + if err != nil { + r.logger.Warn("storeMessages: marshal failed", slog.Any("error", err)) + continue + } + messageSenderChannelIdentityID := "" + messageSenderUserID := "" + externalMessageID := "" + sourceReplyToMessageID := "" + assets := []messagepkg.AssetRef(nil) + if msg.Role == "user" { + messageSenderChannelIdentityID = senderChannelIdentityID + messageSenderUserID = senderUserID + externalMessageID = req.ExternalMessageID + if strings.TrimSpace(msg.TextContent()) == strings.TrimSpace(req.Query) { + assets = chatAttachmentsToAssetRefs(req.Attachments) + } + } else if strings.TrimSpace(req.ExternalMessageID) != "" { + sourceReplyToMessageID = req.ExternalMessageID + } + if i == lastAssistantIdx && len(outboundAssets) > 0 { + assets = append(assets, outboundAssets...) + } + var msgUsage json.RawMessage + if i < len(usages) && len(usages[i]) > 0 && !isJSONNull(usages[i]) { + msgUsage = usages[i] + } else if i == len(messages)-1 && len(usage) > 0 { + msgUsage = usage + } + if _, err := r.messageService.Persist(ctx, messagepkg.PersistInput{ + BotID: req.BotID, + RouteID: req.RouteID, + SenderChannelIdentityID: messageSenderChannelIdentityID, + SenderUserID: messageSenderUserID, + Platform: req.CurrentChannel, + ExternalMessageID: externalMessageID, + SourceReplyToMessageID: sourceReplyToMessageID, + Role: msg.Role, + Content: content, + Metadata: meta, + Usage: msgUsage, + Assets: assets, + ModelID: modelID, + }); err != nil { + r.logger.Warn("persist message failed", slog.Any("error", err)) + } + } +} + +func isJSONNull(data json.RawMessage) bool { + return len(data) == 0 || bytes.Equal(bytes.TrimSpace(data), []byte("null")) +} + +// outboundAssetRefsToMessageRefs converts outbound asset refs from the streaming +// collector into message-level asset refs for persistence. +func outboundAssetRefsToMessageRefs(refs []conversation.OutboundAssetRef) []messagepkg.AssetRef { + if len(refs) == 0 { + return nil + } + result := make([]messagepkg.AssetRef, 0, len(refs)) + for _, ref := range refs { + contentHash := strings.TrimSpace(ref.ContentHash) + if contentHash == "" { + continue + } + role := ref.Role + if strings.TrimSpace(role) == "" { + role = "attachment" + } + result = append(result, messagepkg.AssetRef{ + ContentHash: contentHash, + Role: role, + Ordinal: ref.Ordinal, + Mime: ref.Mime, + SizeBytes: ref.SizeBytes, + StorageKey: ref.StorageKey, + Name: ref.Name, + Metadata: ref.Metadata, + }) + } + return result +} + +// chatAttachmentsToAssetRefs converts ChatAttachment slice to message AssetRef slice. +// Only attachments that carry a content_hash are included. 
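A small sketch of the conversion documented above (values hypothetical): attachments without a content hash are filtered out, the ordinal preserves the original slice index, and a storage_key in the metadata map is promoted to the ref's StorageKey field.

```go
func exampleChatAssetRefs() {
	atts := []conversation.ChatAttachment{
		{Name: "notes.txt"}, // no ContentHash: skipped
		{
			ContentHash: "sha256:0a1b2c",
			Mime:        "image/png",
			Size:        2048,
			Name:        "chart.png",
			Metadata:    map[string]any{"storage_key": "bots/b1/0a1b2c"},
		},
	}
	refs := chatAttachmentsToAssetRefs(atts)
	// len(refs) == 1, refs[0].Ordinal == 1, refs[0].StorageKey == "bots/b1/0a1b2c".
	_ = refs
}
```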
+func chatAttachmentsToAssetRefs(attachments []conversation.ChatAttachment) []messagepkg.AssetRef { + if len(attachments) == 0 { + return nil + } + refs := make([]messagepkg.AssetRef, 0, len(attachments)) + for i, att := range attachments { + contentHash := strings.TrimSpace(att.ContentHash) + if contentHash == "" { + continue + } + ref := messagepkg.AssetRef{ + ContentHash: contentHash, + Role: "attachment", + Ordinal: i, + Mime: strings.TrimSpace(att.Mime), + SizeBytes: att.Size, + Name: strings.TrimSpace(att.Name), + Metadata: att.Metadata, + } + if att.Metadata != nil { + if sk, ok := att.Metadata["storage_key"].(string); ok { + ref.StorageKey = sk + } + } + refs = append(refs, ref) + } + return refs +} + +func buildRouteMetadata(req conversation.ChatRequest) map[string]any { + if strings.TrimSpace(req.RouteID) == "" && strings.TrimSpace(req.CurrentChannel) == "" { + return nil + } + meta := map[string]any{} + if strings.TrimSpace(req.RouteID) != "" { + meta["route_id"] = req.RouteID + } + if strings.TrimSpace(req.CurrentChannel) != "" { + meta["platform"] = req.CurrentChannel + } + return meta +} + +func (r *Resolver) resolvePersistSenderIDs(ctx context.Context, req conversation.ChatRequest) (string, string) { + channelIdentityID := strings.TrimSpace(req.SourceChannelIdentityID) + userID := strings.TrimSpace(req.UserID) + + senderChannelIdentityID := "" + if r.isExistingChannelIdentityID(ctx, channelIdentityID) { + senderChannelIdentityID = channelIdentityID + } + + senderUserID := "" + if r.isExistingUserID(ctx, userID) { + senderUserID = userID + } + if senderUserID == "" && senderChannelIdentityID != "" { + if linked := r.linkedUserIDFromChannelIdentity(ctx, senderChannelIdentityID); linked != "" { + senderUserID = linked + } + } + return senderChannelIdentityID, senderUserID +} + +// LinkOutboundAssets links bot-generated assets to the latest assistant +// message for the given bot. Used by the WebSocket path where attachment +// ingestion happens after message persistence. +func (r *Resolver) LinkOutboundAssets(ctx context.Context, botID string, assets []messagepkg.AssetRef) { + if r.messageService == nil || len(assets) == 0 || strings.TrimSpace(botID) == "" { + return + } + // ListLatest returns messages in DESC order (newest first). + msgs, err := r.messageService.ListLatest(ctx, botID, 5) + if err != nil { + r.logger.Warn("LinkOutboundAssets: list latest failed", slog.Any("error", err)) + return + } + for _, msg := range msgs { + if msg.Role == "assistant" { + if linkErr := r.messageService.LinkAssets(ctx, msg.ID, assets); linkErr != nil { + r.logger.Warn("LinkOutboundAssets: link failed", slog.Any("error", linkErr)) + } + return + } + } + r.logger.Warn("LinkOutboundAssets: no assistant message found", slog.String("bot_id", botID)) +} diff --git a/internal/conversation/flow/resolver_stream.go b/internal/conversation/flow/resolver_stream.go new file mode 100644 index 00000000..2f23b6de --- /dev/null +++ b/internal/conversation/flow/resolver_stream.go @@ -0,0 +1,170 @@ +package flow + +import ( + "context" + "encoding/json" + "fmt" + "log/slog" + + sdk "github.com/memohai/twilight-ai/sdk" + + agentpkg "github.com/memohai/memoh/internal/agent" + "github.com/memohai/memoh/internal/conversation" +) + +// WSStreamEvent represents a raw JSON event forwarded from the agent. +type WSStreamEvent = json.RawMessage + +// StreamChat runs a streaming chat via the internal agent. 
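A minimal consumer sketch for StreamChat, assuming a caller with a Resolver in scope (consumeStream is hypothetical): chunkCh closes when the agent stream ends, and the buffered error channel yields nil once closed, so draining both is safe.

```go
func consumeStream(ctx context.Context, r *Resolver, req conversation.ChatRequest) error {
	chunks, errs := r.StreamChat(ctx, req)
	for chunk := range chunks {
		// Each chunk is one JSON-encoded agent event, forwarded verbatim.
		fmt.Printf("event: %s\n", chunk)
	}
	// errCh has capacity 1 and is closed when the stream goroutine exits,
	// so this receive returns the resolve/stream error or nil.
	return <-errs
}
```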
+func (r *Resolver) StreamChat(ctx context.Context, req conversation.ChatRequest) (<-chan conversation.StreamChunk, <-chan error) { + chunkCh := make(chan conversation.StreamChunk) + errCh := make(chan error, 1) + r.logger.Info("agent stream start", + slog.String("bot_id", req.BotID), + slog.String("chat_id", req.ChatID), + ) + + go func() { + defer close(chunkCh) + defer close(errCh) + + streamReq := req + rc, err := r.resolve(ctx, streamReq) + if err != nil { + r.logger.Error("agent stream resolve failed", + slog.String("bot_id", streamReq.BotID), + slog.String("chat_id", streamReq.ChatID), + slog.Any("error", err), + ) + errCh <- err + return + } + streamReq.Query = rc.query + + cfg := rc.runConfig + cfg = r.prepareRunConfig(ctx, cfg) + + eventCh := r.agent.Stream(ctx, cfg) + stored := false + for event := range eventCh { + if event.Type == agentpkg.EventError { + r.logger.Error("agent stream error", + slog.String("bot_id", streamReq.BotID), + slog.String("chat_id", streamReq.ChatID), + slog.String("model_id", rc.model.ID), + slog.String("error", event.Error), + ) + } + + data, err := json.Marshal(event) + if err != nil { + continue + } + if !stored && event.IsTerminal() && len(event.Messages) > 0 { + if _, storeErr := r.tryStoreStream(ctx, streamReq, data, rc.model.ID); storeErr != nil { + r.logger.Error("stream persist failed", slog.Any("error", storeErr)) + } else { + stored = true + } + } + chunkCh <- conversation.StreamChunk(data) + } + r.markInboxRead(ctx, streamReq.BotID, rc.inboxItemIDs) + }() + return chunkCh, errCh +} + +// StreamChatWS resolves the agent context and streams agent events. +// Events are sent on eventCh. When abortCh is closed, the context is cancelled. +func (r *Resolver) StreamChatWS( + ctx context.Context, + req conversation.ChatRequest, + eventCh chan<- WSStreamEvent, + abortCh <-chan struct{}, +) error { + rc, err := r.resolve(ctx, req) + if err != nil { + return fmt.Errorf("resolve: %w", err) + } + req.Query = rc.query + + streamCtx, cancel := context.WithCancel(ctx) + defer cancel() + + go func() { + select { + case <-abortCh: + cancel() + case <-streamCtx.Done(): + } + }() + + cfg := rc.runConfig + cfg = r.prepareRunConfig(streamCtx, cfg) + + agentEventCh := r.agent.Stream(streamCtx, cfg) + modelID := rc.model.ID + stored := false + for event := range agentEventCh { + if event.Type == agentpkg.EventError { + r.logger.Error("agent stream error", + slog.String("bot_id", req.BotID), + slog.String("chat_id", req.ChatID), + slog.String("model_id", modelID), + slog.String("error", event.Error), + ) + } + + data, err := json.Marshal(event) + if err != nil { + continue + } + + if !stored && event.IsTerminal() && len(event.Messages) > 0 { + if _, storeErr := r.tryStoreStream(ctx, req, data, modelID); storeErr != nil { + r.logger.Error("ws persist failed", slog.Any("error", storeErr)) + } else { + stored = true + } + } + + select { + case eventCh <- json.RawMessage(data): + case <-ctx.Done(): + return ctx.Err() + } + } + + r.markInboxRead(ctx, req.BotID, rc.inboxItemIDs) + return nil +} + +// tryStoreStream attempts to extract final messages from a stream event and persist them. 
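For reference, a terminal event might look like the payload below. The "done" type string and the usage field names are guesses; tryStoreStream only consumes messages, usage, and usages, after the caller has already checked IsTerminal():

```go
const exampleTerminalEvent = `{
  "type": "done",
  "messages": [
    {"role": "assistant", "content": [{"type": "text", "text": "All set."}]}
  ],
  "usage": {"input_tokens": 120, "output_tokens": 45}
}`
```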
+func (r *Resolver) tryStoreStream(ctx context.Context, req conversation.ChatRequest, data []byte, modelID string) (bool, error) { + var envelope struct { + Type string `json:"type"` + Messages json.RawMessage `json:"messages"` + Usage json.RawMessage `json:"usage,omitempty"` + Usages json.RawMessage `json:"usages,omitempty"` + } + if err := json.Unmarshal(data, &envelope); err != nil { + return false, nil + } + if len(envelope.Messages) == 0 { + return false, nil + } + + var sdkMsgs []sdk.Message + if err := json.Unmarshal(envelope.Messages, &sdkMsgs); err != nil || len(sdkMsgs) == 0 { + return false, nil + } + outputMessages := sdkMessagesToModelMessages(sdkMsgs) + roundMessages := prependUserMessage(req.Query, outputMessages) + + var usages []json.RawMessage + if len(envelope.Usages) > 0 { + _ = json.Unmarshal(envelope.Usages, &usages) + } + + return true, r.storeRound(ctx, req, roundMessages, envelope.Usage, usages, modelID) +} diff --git a/internal/conversation/flow/resolver_stream_order_test.go b/internal/conversation/flow/resolver_stream_order_test.go deleted file mode 100644 index aa3ee2cd..00000000 --- a/internal/conversation/flow/resolver_stream_order_test.go +++ /dev/null @@ -1,142 +0,0 @@ -package flow - -import ( - "context" - "encoding/json" - "log/slog" - "net/http" - "net/http/httptest" - "testing" - "time" - - "github.com/memohai/memoh/internal/conversation" - messagepkg "github.com/memohai/memoh/internal/message" -) - -type blockingMessageService struct { - persistCalled chan struct{} - persistContinue chan struct{} -} - -func (s *blockingMessageService) Persist(_ context.Context, _ messagepkg.PersistInput) (messagepkg.Message, error) { - select { - case <-s.persistCalled: - default: - close(s.persistCalled) - } - <-s.persistContinue - return messagepkg.Message{}, nil -} - -func (*blockingMessageService) List(_ context.Context, _ string) ([]messagepkg.Message, error) { - return nil, nil -} - -func (*blockingMessageService) ListSince(_ context.Context, _ string, _ time.Time) ([]messagepkg.Message, error) { - return nil, nil -} - -func (*blockingMessageService) ListActiveSince(_ context.Context, _ string, _ time.Time) ([]messagepkg.Message, error) { - return nil, nil -} - -func (*blockingMessageService) ListLatest(_ context.Context, _ string, _ int32) ([]messagepkg.Message, error) { - return nil, nil -} - -func (*blockingMessageService) ListBefore(_ context.Context, _ string, _ time.Time, _ int32) ([]messagepkg.Message, error) { - return nil, nil -} - -func (*blockingMessageService) DeleteByBot(_ context.Context, _ string) error { - return nil -} - -func TestStreamChat_PersistsFinalMessagesBeforeForwardingDoneEvent(t *testing.T) { - t.Parallel() - - msgSvc := &blockingMessageService{ - persistCalled: make(chan struct{}), - persistContinue: make(chan struct{}), - } - - doneResp := gatewayResponse{ - Messages: []conversation.ModelMessage{ - {Role: "assistant", Content: conversation.NewTextContent("ok")}, - }, - } - doneData, err := json.Marshal(doneResp) - if err != nil { - t.Fatalf("marshal done response: %v", err) - } - - srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { - if r.URL.Path != "/chat/stream" { - http.NotFound(w, r) - return - } - w.Header().Set("Content-Type", "text/event-stream") - w.WriteHeader(http.StatusOK) - if f, ok := w.(http.Flusher); ok { - f.Flush() - } - _, _ = w.Write([]byte("event: done\n")) - _, _ = w.Write([]byte("data: ")) - _, _ = w.Write(doneData) - _, _ = w.Write([]byte("\n\n")) - if f, ok := 
w.(http.Flusher); ok { - f.Flush() - } - })) - t.Cleanup(srv.Close) - - r := &Resolver{ - messageService: msgSvc, - gatewayBaseURL: srv.URL, - logger: slog.New(slog.DiscardHandler), - streamingClient: srv.Client(), - httpClient: srv.Client(), - } - - chunkCh := make(chan conversation.StreamChunk, 10) - req := conversation.ChatRequest{BotID: "bot-test", ChatID: "chat-test"} - payload := gatewayRequest{} - - streamDone := make(chan error, 1) - go func() { - streamDone <- r.streamChat(context.Background(), payload, req, chunkCh, "model-test") - close(chunkCh) - }() - - select { - case <-msgSvc.persistCalled: - case <-time.After(2 * time.Second): - t.Fatal("timeout waiting for Persist to be called") - } - - select { - case got := <-chunkCh: - t.Fatalf("done event forwarded before persistence finished: %s", string(got)) - default: - } - - close(msgSvc.persistContinue) - - select { - case err := <-streamDone: - if err != nil { - t.Fatalf("streamChat returned error: %v", err) - } - case <-time.After(2 * time.Second): - t.Fatal("timeout waiting for streamChat to finish") - } - - select { - case got := <-chunkCh: - if len(got) == 0 { - t.Fatal("expected forwarded done event data") - } - case <-time.After(2 * time.Second): - t.Fatal("timeout waiting for forwarded done event data") - } -} diff --git a/internal/conversation/flow/resolver_test.go b/internal/conversation/flow/resolver_test.go index 89eee1a5..6ccb373a 100644 --- a/internal/conversation/flow/resolver_test.go +++ b/internal/conversation/flow/resolver_test.go @@ -7,170 +7,13 @@ import ( "encoding/json" "io" "log/slog" - "net/http" - "net/http/httptest" "strings" "testing" - "time" - - "github.com/stretchr/testify/require" "github.com/memohai/memoh/internal/conversation" "github.com/memohai/memoh/internal/models" ) -func TestPostTriggerSchedule_Endpoint(t *testing.T) { - var capturedPath string - var capturedBody []byte - var capturedAuth string - - srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { - capturedPath = r.URL.Path - capturedAuth = r.Header.Get("Authorization") - capturedBody, _ = io.ReadAll(r.Body) - resp := gatewayResponse{ - Messages: []conversation.ModelMessage{{Role: "assistant", Content: conversation.NewTextContent("ok")}}, - } - w.Header().Set("Content-Type", "application/json") - require.NoError(t, json.NewEncoder(w).Encode(resp)) - })) - defer srv.Close() - - resolver := &Resolver{ - gatewayBaseURL: srv.URL, - httpClient: &http.Client{Timeout: 5 * time.Second}, - logger: slog.Default(), - } - - maxCalls := 5 - req := triggerScheduleRequest{ - gatewayRequest: gatewayRequest{ - Model: gatewayModelConfig{ - ModelID: "gpt-4", - ClientType: "openai", - APIKey: "sk-test", - BaseURL: "https://api.openai.com", - }, - ActiveContextTime: 1440, - Channels: []string{}, - Messages: []conversation.ModelMessage{}, - Skills: []string{}, - Identity: gatewayIdentity{ - BotID: "bot-123", - ChannelIdentityID: "owner-user-1", - DisplayName: "Scheduler", - }, - Attachments: []any{}, - }, - Schedule: gatewaySchedule{ - ID: "sched-1", - Name: "daily report", - Description: "generate daily report", - Pattern: "0 9 * * *", - MaxCalls: &maxCalls, - Command: "generate the daily report", - }, - } - - resp, err := resolver.postTriggerSchedule(context.Background(), req, "Bearer test-token") - if err != nil { - t.Fatalf("postTriggerSchedule returned error: %v", err) - } - - if capturedPath != "/chat/trigger-schedule" { - t.Errorf("expected path /chat/trigger-schedule, got %s", capturedPath) - } - if capturedAuth 
!= "Bearer test-token" { - t.Errorf("expected Authorization header 'Bearer test-token', got %s", capturedAuth) - } - if len(resp.Messages) != 1 { - t.Errorf("expected 1 message, got %d", len(resp.Messages)) - } - - var body map[string]any - if err := json.Unmarshal(capturedBody, &body); err != nil { - t.Fatalf("failed to parse captured body: %v", err) - } - schedule, ok := body["schedule"].(map[string]any) - if !ok { - t.Fatal("expected 'schedule' field in request body") - } - if schedule["id"] != "sched-1" { - t.Errorf("expected schedule.id=sched-1, got %v", schedule["id"]) - } - if schedule["command"] != "generate the daily report" { - t.Errorf("expected schedule.command, got %v", schedule["command"]) - } - if _, hasQuery := body["query"]; hasQuery { - t.Error("trigger-schedule request should not contain 'query' field") - } -} - -func TestPostTriggerSchedule_NoAuth(t *testing.T) { - var capturedAuth string - - srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { - capturedAuth = r.Header.Get("Authorization") - resp := gatewayResponse{Messages: []conversation.ModelMessage{}} - require.NoError(t, json.NewEncoder(w).Encode(resp)) - })) - defer srv.Close() - - resolver := &Resolver{ - gatewayBaseURL: srv.URL, - httpClient: &http.Client{Timeout: 5 * time.Second}, - logger: slog.Default(), - } - - req := triggerScheduleRequest{ - gatewayRequest: gatewayRequest{ - Channels: []string{}, - Messages: []conversation.ModelMessage{}, - Skills: []string{}, - Attachments: []any{}, - }, - Schedule: gatewaySchedule{ID: "s1", Command: "test"}, - } - - _, err := resolver.postTriggerSchedule(context.Background(), req, "") - if err != nil { - t.Fatalf("postTriggerSchedule returned error: %v", err) - } - if capturedAuth != "" { - t.Errorf("expected no Authorization header, got %s", capturedAuth) - } -} - -func TestPostTriggerSchedule_GatewayError(t *testing.T) { - srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, _ *http.Request) { - w.WriteHeader(http.StatusInternalServerError) - _, err := w.Write([]byte("internal error")) - require.NoError(t, err) - })) - defer srv.Close() - - resolver := &Resolver{ - gatewayBaseURL: srv.URL, - httpClient: &http.Client{Timeout: 5 * time.Second}, - logger: slog.Default(), - } - - req := triggerScheduleRequest{ - gatewayRequest: gatewayRequest{ - Channels: []string{}, - Messages: []conversation.ModelMessage{}, - Skills: []string{}, - Attachments: []any{}, - }, - Schedule: gatewaySchedule{ID: "s1", Command: "test"}, - } - - _, err := resolver.postTriggerSchedule(context.Background(), req, "Bearer tok") - if err == nil { - t.Fatal("expected error for 500 response") - } -} - type fakeGatewayAssetLoader struct { openFn func(ctx context.Context, botID, contentHash string) (io.ReadCloser, string, error) } @@ -245,108 +88,6 @@ func TestPrepareGatewayAttachments_DataURLFromURLFieldIsNativeInline(t *testing. 
} } -func TestStreamChat_AllowsLargeSSEDataLines(t *testing.T) { - const overOldScannerLimit = 3 * 1024 * 1024 - hugeDelta := strings.Repeat("a", overOldScannerLimit) - dataJSON, err := json.Marshal(map[string]any{ - "type": "text_delta", - "delta": hugeDelta, - }) - if err != nil { - t.Fatalf("failed to marshal test payload: %v", err) - } - dataStr := string(dataJSON) - parts := make([]string, 0, (len(dataStr)/8192)+1) - for i := 0; i < len(dataStr); i += 8192 { - end := i + 8192 - if end > len(dataStr) { - end = len(dataStr) - } - parts = append(parts, dataStr[i:end]) - } - - srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { - if r.URL.Path != "/chat/stream" { - w.WriteHeader(http.StatusNotFound) - return - } - w.Header().Set("Content-Type", "text/event-stream") - _, _ = io.WriteString(w, "event: message\n") - for _, part := range parts { - _, _ = io.WriteString(w, "data:") - _, _ = io.WriteString(w, part) - _, _ = io.WriteString(w, "\n") - } - _, _ = io.WriteString(w, "\n") - })) - defer srv.Close() - - resolver := &Resolver{ - gatewayBaseURL: srv.URL, - streamingClient: srv.Client(), - logger: slog.Default(), - } - - chunkCh := make(chan conversation.StreamChunk, 1) - err = resolver.streamChat( - context.Background(), - gatewayRequest{}, - conversation.ChatRequest{}, - chunkCh, - "model-test", - ) - if err != nil { - t.Fatalf("streamChat returned error: %v", err) - } - - select { - case chunk := <-chunkCh: - if !bytes.Equal(chunk, dataJSON) { - t.Fatalf("unexpected reconstructed payload: got prefix %q", string(chunk[:minInt(len(chunk), 80)])) - } - default: - t.Fatalf("expected at least one streamed chunk") - } -} - -func TestStreamChat_RejectsOverLimitSSELine(t *testing.T) { - tooLong := strings.Repeat("x", gatewaySSEMaxLineBytes+10) - srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) { - if r.URL.Path != "/chat/stream" { - w.WriteHeader(http.StatusNotFound) - return - } - w.Header().Set("Content-Type", "text/event-stream") - _, _ = io.WriteString(w, "event: message\n") - _, _ = io.WriteString(w, "data:") - _, _ = io.WriteString(w, tooLong) - _, _ = io.WriteString(w, "\n\n") - })) - defer srv.Close() - - resolver := &Resolver{ - gatewayBaseURL: srv.URL, - streamingClient: srv.Client(), - logger: slog.Default(), - } - - chunkCh := make(chan conversation.StreamChunk, 1) - err := resolver.streamChat(context.Background(), gatewayRequest{}, conversation.ChatRequest{}, chunkCh, "model-test") - if err == nil { - t.Fatalf("expected streamChat to error on oversized SSE line") - } - if !strings.Contains(err.Error(), "sse line too long") { - t.Fatalf("expected line-too-long error, got: %v", err) - } -} - -func minInt(a, b int) int { - if a < b { - return a - } - return b -} - func TestPrepareGatewayAttachments_PublicURLFromURLFieldIsNativePublic(t *testing.T) { resolver := &Resolver{logger: slog.Default()} req := conversation.ChatRequest{ diff --git a/internal/conversation/flow/resolver_trigger.go b/internal/conversation/flow/resolver_trigger.go new file mode 100644 index 00000000..62a4e7bd --- /dev/null +++ b/internal/conversation/flow/resolver_trigger.go @@ -0,0 +1,126 @@ +package flow + +import ( + "context" + "encoding/json" + "errors" + "strings" + + sdk "github.com/memohai/twilight-ai/sdk" + + agentpkg "github.com/memohai/memoh/internal/agent" + "github.com/memohai/memoh/internal/conversation" + "github.com/memohai/memoh/internal/heartbeat" + "github.com/memohai/memoh/internal/schedule" +) + +// TriggerSchedule 
executes a scheduled command via the internal agent. +func (r *Resolver) TriggerSchedule(ctx context.Context, botID string, payload schedule.TriggerPayload, token string) error { + if strings.TrimSpace(botID) == "" { + return errors.New("bot id is required") + } + if strings.TrimSpace(payload.Command) == "" { + return errors.New("schedule command is required") + } + + req := conversation.ChatRequest{ + BotID: botID, + ChatID: botID, + Query: payload.Command, + UserID: payload.OwnerUserID, + Token: token, + } + rc, err := r.resolve(ctx, req) + if err != nil { + return err + } + + cfg := rc.runConfig + cfg.Identity.ChannelIdentityID = strings.TrimSpace(payload.OwnerUserID) + cfg.Identity.DisplayName = "Scheduler" + + schedulePrompt := agentpkg.GenerateSchedulePrompt(agentpkg.Schedule{ + ID: payload.ID, + Name: payload.Name, + Description: payload.Description, + Pattern: payload.Pattern, + MaxCalls: payload.MaxCalls, + Command: payload.Command, + }) + cfg.Messages = append(cfg.Messages, sdk.UserMessage(schedulePrompt)) + cfg = r.prepareRunConfig(ctx, cfg) + + result, err := r.agent.Generate(ctx, cfg) + if err != nil { + return err + } + + outputMessages := sdkMessagesToModelMessages(result.Messages) + roundMessages := prependUserMessage(req.Query, outputMessages) + usageJSON, _ := json.Marshal(result.Usage) + return r.storeRound(ctx, req, roundMessages, usageJSON, nil, rc.model.ID) +} + +// TriggerHeartbeat executes a heartbeat check via the internal agent. +func (r *Resolver) TriggerHeartbeat(ctx context.Context, botID string, payload heartbeat.TriggerPayload, token string) (heartbeat.TriggerResult, error) { + if strings.TrimSpace(botID) == "" { + return heartbeat.TriggerResult{}, errors.New("bot id is required") + } + + var heartbeatModel string + if botSettings, err := r.loadBotSettings(ctx, botID); err == nil { + heartbeatModel = strings.TrimSpace(botSettings.HeartbeatModelID) + } + + req := conversation.ChatRequest{ + BotID: botID, + ChatID: botID, + Query: "heartbeat", + UserID: payload.OwnerUserID, + Token: token, + Model: heartbeatModel, + } + rc, err := r.resolve(ctx, req) + if err != nil { + return heartbeat.TriggerResult{}, err + } + + cfg := rc.runConfig + cfg.Identity.ChannelIdentityID = strings.TrimSpace(payload.OwnerUserID) + cfg.Identity.DisplayName = "Heartbeat" + + var checklist string + if r.agent != nil { + fs := agentpkg.NewFSClient(nil, botID) + checklist = fs.ReadTextSafe(ctx, "/data/HEARTBEAT.md") + } + heartbeatPrompt := agentpkg.GenerateHeartbeatPrompt(payload.Interval, checklist) + cfg.Messages = append(cfg.Messages, sdk.UserMessage(heartbeatPrompt)) + cfg = r.prepareRunConfig(ctx, cfg) + + result, err := r.agent.Generate(ctx, cfg) + if err != nil { + return heartbeat.TriggerResult{}, err + } + + status := "alert" + text := strings.TrimSpace(result.Text) + if isHeartbeatOK(text) { + status = "ok" + } + + usageJSON, _ := json.Marshal(result.Usage) + + return heartbeat.TriggerResult{ + Status: status, + Text: text, + Usage: usageJSON, + UsageBytes: usageJSON, + ModelID: rc.model.ID, + }, nil +} + +func isHeartbeatOK(text string) bool { + t := strings.TrimSpace(text) + return strings.HasPrefix(t, "HEARTBEAT_OK") || strings.HasSuffix(t, "HEARTBEAT_OK") || t == "HEARTBEAT_OK" +} diff --git a/internal/conversation/flow/resolver_util.go b/internal/conversation/flow/resolver_util.go new file mode 100644 index 00000000..58c8b057 --- /dev/null +++ b/internal/conversation/flow/resolver_util.go @@ -0,0 +1,200 @@ +package flow + +import ( + "encoding/base64" + "encoding/json" + 
"errors" + "sort" + "strconv" + "strings" + + "github.com/jackc/pgx/v5/pgtype" + + "github.com/memohai/memoh/internal/conversation" + "github.com/memohai/memoh/internal/db" +) + +func sanitizeMessages(messages []conversation.ModelMessage) []conversation.ModelMessage { + cleaned := make([]conversation.ModelMessage, 0, len(messages)) + for _, msg := range messages { + if normalized, ok := normalizeImagePartsToDataURL(msg); ok { + msg = normalized + } + if strings.TrimSpace(msg.Role) == "" { + continue + } + if !msg.HasContent() && strings.TrimSpace(msg.ToolCallID) == "" { + continue + } + cleaned = append(cleaned, msg) + } + return cleaned +} + +func normalizeImagePartsToDataURL(msg conversation.ModelMessage) (conversation.ModelMessage, bool) { + if len(msg.Content) == 0 { + return msg, false + } + var parts []map[string]json.RawMessage + if err := json.Unmarshal(msg.Content, &parts); err != nil || len(parts) == 0 { + return msg, false + } + + changed := false + for i := range parts { + partTypeRaw, ok := parts[i]["type"] + if !ok { + continue + } + var partType string + if err := json.Unmarshal(partTypeRaw, &partType); err != nil || !strings.EqualFold(partType, "image") { + continue + } + + imageRaw, ok := parts[i]["image"] + if !ok || len(imageRaw) == 0 { + continue + } + var tmp string + if json.Unmarshal(imageRaw, &tmp) == nil { + continue + } + + var payload []byte + if b, ok := decodeIndexedByteObject(imageRaw); ok { + payload = b + } else if b, ok := decodeByteArray(imageRaw); ok { + payload = b + } else { + continue + } + if len(payload) == 0 { + continue + } + + // action trigger to image only here. + mediaType := "application/octet-stream" + if mediaTypeRaw, ok := parts[i]["mediaType"]; ok { + var mt string + if err := json.Unmarshal(mediaTypeRaw, &mt); err == nil && strings.TrimSpace(mt) != "" { + mediaType = strings.TrimSpace(mt) + } + } + dataURL := "data:" + mediaType + ";base64," + base64.StdEncoding.EncodeToString(payload) + rebuilt, err := json.Marshal(dataURL) + if err != nil { + continue + } + parts[i]["image"] = rebuilt + changed = true + } + + if !changed { + return msg, false + } + rebuiltContent, err := json.Marshal(parts) + if err != nil { + return msg, false + } + msg.Content = rebuiltContent + return msg, true +} + +func decodeByteArray(raw json.RawMessage) ([]byte, bool) { + var arr []int + if err := json.Unmarshal(raw, &arr); err != nil { + return nil, false + } + if len(arr) == 0 { + return nil, false + } + out := make([]byte, len(arr)) + for i, v := range arr { + if v < 0 || v > 255 { + return nil, false + } + out[i] = byte(v) + } + return out, true +} + +func decodeIndexedByteObject(raw json.RawMessage) ([]byte, bool) { + var obj map[string]json.RawMessage + if err := json.Unmarshal(raw, &obj); err != nil || len(obj) == 0 { + return nil, false + } + type indexedByte struct { + idx int + val byte + } + items := make([]indexedByte, 0, len(obj)) + for k, vRaw := range obj { + idx, err := strconv.Atoi(k) + if err != nil || idx < 0 { + return nil, false + } + var val int + if err := json.Unmarshal(vRaw, &val); err != nil || val < 0 || val > 255 { + return nil, false + } + items = append(items, indexedByte{idx: idx, val: byte(val)}) + } + sort.Slice(items, func(i, j int) bool { return items[i].idx < items[j].idx }) + for i := range items { + if items[i].idx != i { + return nil, false + } + } + out := make([]byte, len(items)) + for i := range items { + out[i] = items[i].val + } + return out, true +} + +func dedup(items []string) []string { + seen := 
make(map[string]struct{}, len(items)) + result := make([]string, 0, len(items)) + for _, s := range items { + trimmed := strings.TrimSpace(s) + if trimmed == "" { + continue + } + if _, ok := seen[trimmed]; ok { + continue + } + seen[trimmed] = struct{}{} + result = append(result, trimmed) + } + return result +} + +func coalescePositiveInt(values ...int) int { + for _, v := range values { + if v > 0 { + return v + } + } + return defaultMaxContextMinutes +} + +func nonNilStrings(s []string) []string { + if s == nil { + return []string{} + } + return s +} + +func nonNilModelMessages(m []conversation.ModelMessage) []conversation.ModelMessage { + if m == nil { + return []conversation.ModelMessage{} + } + return m +} + +func parseResolverUUID(id string) (pgtype.UUID, error) { + if strings.TrimSpace(id) == "" { + return pgtype.UUID{}, errors.New("empty id") + } + return db.ParseUUID(id) +} diff --git a/internal/conversation/flow/user_header.go b/internal/conversation/flow/user_header.go new file mode 100644 index 00000000..142fb23e --- /dev/null +++ b/internal/conversation/flow/user_header.go @@ -0,0 +1,123 @@ +package flow + +import ( + "strings" + "time" +) + +// UserMessageMeta holds the structured metadata attached to every user +// message. It is the single source of truth shared by the YAML header +// (sent to the LLM) and the inbox content JSONB. +type UserMessageMeta struct { + MessageID string `json:"message-id,omitempty"` + ChannelIdentityID string `json:"channel-identity-id"` + DisplayName string `json:"display-name"` + Channel string `json:"channel"` + ConversationType string `json:"conversation-type"` + ConversationName string `json:"conversation-name,omitempty"` + Time string `json:"time"` + AttachmentPaths []string `json:"attachments"` +} + +// BuildUserMessageMeta constructs a UserMessageMeta from the inbound +// parameters. Both FormatUserHeader and inbox content use this. +func BuildUserMessageMeta(messageID, channelIdentityID, displayName, channel, conversationType, conversationName string, attachmentPaths []string) UserMessageMeta { + if attachmentPaths == nil { + attachmentPaths = []string{} + } + return UserMessageMeta{ + MessageID: messageID, + ChannelIdentityID: channelIdentityID, + DisplayName: displayName, + Channel: channel, + ConversationType: conversationType, + ConversationName: conversationName, + Time: time.Now().UTC().Format(time.RFC3339), + AttachmentPaths: attachmentPaths, + } +} + +// ToMap returns the metadata as a map with the same keys used in the YAML +// header, suitable for storing as inbox content JSONB. +func (m UserMessageMeta) ToMap() map[string]any { + result := map[string]any{ + "channel-identity-id": m.ChannelIdentityID, + "display-name": m.DisplayName, + "channel": m.Channel, + "conversation-type": m.ConversationType, + "time": m.Time, + "attachments": m.AttachmentPaths, + } + if m.MessageID != "" { + result["message-id"] = m.MessageID + } + if m.ConversationName != "" { + result["conversation-name"] = m.ConversationName + } + return result +} + +// FormatUserHeader wraps a user query with YAML front-matter metadata so +// the LLM sees structured context (sender, channel, time, attachments) +// alongside the raw message. This must be the single source of truth for +// user-message formatting — the agent gateway must NOT add its own header. 
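+//
+// For illustration, a hypothetical direct Telegram message "hello" from a
+// user Alice renders as:
+//
+//	---
+//	message-id: m1
+//	channel-identity-id: u1
+//	display-name: Alice
+//	channel: telegram
+//	conversation-type: direct
+//	time: "2026-03-19T05:31:54Z"
+//	attachments: []
+//	---
+//	hello
+//
+// Values containing YAML-special characters (such as the colons in the
+// RFC3339 timestamp) are double-quoted by writeYAMLString.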
+func FormatUserHeader(messageID, channelIdentityID, displayName, channel, conversationType, conversationName string, attachmentPaths []string, query string) string {
+	meta := BuildUserMessageMeta(messageID, channelIdentityID, displayName, channel, conversationType, conversationName, attachmentPaths)
+	return FormatUserHeaderFromMeta(meta, query)
+}
+
+// FormatUserHeaderFromMeta formats a pre-built UserMessageMeta into the
+// YAML front-matter string sent to the LLM.
+func FormatUserHeaderFromMeta(meta UserMessageMeta, query string) string {
+	var sb strings.Builder
+	sb.WriteString("---\n")
+	if meta.MessageID != "" {
+		writeYAMLString(&sb, "message-id", meta.MessageID)
+	}
+	writeYAMLString(&sb, "channel-identity-id", meta.ChannelIdentityID)
+	writeYAMLString(&sb, "display-name", meta.DisplayName)
+	writeYAMLString(&sb, "channel", meta.Channel)
+	writeYAMLString(&sb, "conversation-type", meta.ConversationType)
+	if meta.ConversationName != "" {
+		writeYAMLString(&sb, "conversation-name", meta.ConversationName)
+	}
+	writeYAMLString(&sb, "time", meta.Time)
+	if len(meta.AttachmentPaths) > 0 {
+		sb.WriteString("attachments:\n")
+		for _, p := range meta.AttachmentPaths {
+			sb.WriteString(" - ")
+			sb.WriteString(p)
+			sb.WriteByte('\n')
+		}
+	} else {
+		sb.WriteString("attachments: []\n")
+	}
+	sb.WriteString("---\n")
+	sb.WriteString(query)
+	return sb.String()
+}
+
+func writeYAMLString(sb *strings.Builder, key, value string) {
+	sb.WriteString(key)
+	sb.WriteString(": ")
+	if value == "" || needsYAMLQuote(value) {
+		// Escape backslashes, quotes, and newlines so the double-quoted
+		// scalar stays valid YAML for any input value.
+		escaped := strings.NewReplacer(`\`, `\\`, `"`, `\"`, "\n", `\n`).Replace(value)
+		sb.WriteByte('"')
+		sb.WriteString(escaped)
+		sb.WriteByte('"')
+	} else {
+		sb.WriteString(value)
+	}
+	sb.WriteByte('\n')
+}
+
+func needsYAMLQuote(s string) bool {
+	if s == "" {
+		return true
+	}
+	for _, c := range s {
+		if c == ':' || c == '#' || c == '"' || c == '\'' || c == '{' || c == '}' || c == '[' || c == ']' || c == ',' || c == '\n' || c == '\\' {
+			return true
+		}
+	}
+	return false
+}
diff --git a/internal/conversation/types.go b/internal/conversation/types.go
index ce05005c..55989c9c 100644
--- a/internal/conversation/types.go
+++ b/internal/conversation/types.go
@@ -215,6 +215,8 @@ type OutboundAssetRef struct {
 	Mime       string
 	SizeBytes  int64
 	StorageKey string
+	Name       string
+	Metadata   map[string]any
 }
 
 // ChatRequest is the input for Chat and StreamChat.
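The image-part normalization in resolver_util.go above exists because raw image bytes can reach the resolver in two JSON shapes: a plain number array, or the index-keyed object that JSON.stringify produces for a JS Uint8Array. A minimal, self-contained sketch of that conversion, using a hypothetical four-byte PNG-magic payload (the production path is decodeIndexedByteObject and decodeByteArray):

package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
	"sort"
	"strconv"
)

func main() {
	// Hypothetical payload: the four PNG magic bytes, arriving in the
	// index-keyed object shape (e.g. a serialized JS Uint8Array).
	raw := []byte(`{"0":137,"1":80,"2":78,"3":71}`)

	var obj map[string]int
	if err := json.Unmarshal(raw, &obj); err != nil {
		panic(err)
	}

	// Recover byte order by sorting the numeric keys.
	keys := make([]int, 0, len(obj))
	for k := range obj {
		i, err := strconv.Atoi(k)
		if err != nil {
			panic(err)
		}
		keys = append(keys, i)
	}
	sort.Ints(keys)

	payload := make([]byte, len(keys))
	for i, k := range keys {
		payload[i] = byte(obj[strconv.Itoa(k)])
	}

	// The resolver rewrites the part as a data URL the SDK can send on.
	fmt.Println("data:image/png;base64," + base64.StdEncoding.EncodeToString(payload))
	// Output: data:image/png;base64,iVBORw==
}

Unlike this sketch, the real decoder also rejects sparse indices and values outside the 0-255 byte range before re-encoding.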
diff --git a/internal/db/sqlc/media.sql.go b/internal/db/sqlc/media.sql.go index 91b78409..a3035565 100644 --- a/internal/db/sqlc/media.sql.go +++ b/internal/db/sqlc/media.sql.go @@ -12,17 +12,21 @@ import ( ) const createMessageAsset = `-- name: CreateMessageAsset :one -INSERT INTO bot_history_message_assets (message_id, role, ordinal, content_hash) +INSERT INTO bot_history_message_assets (message_id, role, ordinal, content_hash, name, metadata) VALUES ( $1, $2, $3, - $4 + $4, + $5, + $6 ) ON CONFLICT (message_id, content_hash) DO UPDATE SET role = EXCLUDED.role, - ordinal = EXCLUDED.ordinal -RETURNING id, message_id, role, ordinal, content_hash, created_at + ordinal = EXCLUDED.ordinal, + name = EXCLUDED.name, + metadata = EXCLUDED.metadata +RETURNING id, message_id, role, ordinal, content_hash, name, metadata, created_at ` type CreateMessageAssetParams struct { @@ -30,6 +34,8 @@ type CreateMessageAssetParams struct { Role string `json:"role"` Ordinal int32 `json:"ordinal"` ContentHash string `json:"content_hash"` + Name string `json:"name"` + Metadata []byte `json:"metadata"` } func (q *Queries) CreateMessageAsset(ctx context.Context, arg CreateMessageAssetParams) (BotHistoryMessageAsset, error) { @@ -38,6 +44,8 @@ func (q *Queries) CreateMessageAsset(ctx context.Context, arg CreateMessageAsset arg.Role, arg.Ordinal, arg.ContentHash, + arg.Name, + arg.Metadata, ) var i BotHistoryMessageAsset err := row.Scan( @@ -46,6 +54,8 @@ func (q *Queries) CreateMessageAsset(ctx context.Context, arg CreateMessageAsset &i.Role, &i.Ordinal, &i.ContentHash, + &i.Name, + &i.Metadata, &i.CreatedAt, ) return i, err @@ -141,7 +151,7 @@ func (q *Queries) GetStorageProviderByName(ctx context.Context, name string) (St } const listMessageAssets = `-- name: ListMessageAssets :many -SELECT id AS rel_id, message_id, role, ordinal, content_hash +SELECT id AS rel_id, message_id, role, ordinal, content_hash, name, metadata FROM bot_history_message_assets WHERE message_id = $1 ORDER BY ordinal ASC @@ -153,6 +163,8 @@ type ListMessageAssetsRow struct { Role string `json:"role"` Ordinal int32 `json:"ordinal"` ContentHash string `json:"content_hash"` + Name string `json:"name"` + Metadata []byte `json:"metadata"` } func (q *Queries) ListMessageAssets(ctx context.Context, messageID pgtype.UUID) ([]ListMessageAssetsRow, error) { @@ -170,6 +182,8 @@ func (q *Queries) ListMessageAssets(ctx context.Context, messageID pgtype.UUID) &i.Role, &i.Ordinal, &i.ContentHash, + &i.Name, + &i.Metadata, ); err != nil { return nil, err } @@ -182,7 +196,7 @@ func (q *Queries) ListMessageAssets(ctx context.Context, messageID pgtype.UUID) } const listMessageAssetsBatch = `-- name: ListMessageAssetsBatch :many -SELECT id AS rel_id, message_id, role, ordinal, content_hash +SELECT id AS rel_id, message_id, role, ordinal, content_hash, name, metadata FROM bot_history_message_assets WHERE message_id = ANY($1::uuid[]) ORDER BY message_id, ordinal ASC @@ -194,6 +208,8 @@ type ListMessageAssetsBatchRow struct { Role string `json:"role"` Ordinal int32 `json:"ordinal"` ContentHash string `json:"content_hash"` + Name string `json:"name"` + Metadata []byte `json:"metadata"` } func (q *Queries) ListMessageAssetsBatch(ctx context.Context, messageIds []pgtype.UUID) ([]ListMessageAssetsBatchRow, error) { @@ -211,6 +227,8 @@ func (q *Queries) ListMessageAssetsBatch(ctx context.Context, messageIds []pgtyp &i.Role, &i.Ordinal, &i.ContentHash, + &i.Name, + &i.Metadata, ); err != nil { return nil, err } diff --git a/internal/db/sqlc/models.go 
b/internal/db/sqlc/models.go index 8dbf1243..0110c7b0 100644 --- a/internal/db/sqlc/models.go +++ b/internal/db/sqlc/models.go @@ -129,6 +129,8 @@ type BotHistoryMessageAsset struct { Role string `json:"role"` Ordinal int32 `json:"ordinal"` ContentHash string `json:"content_hash"` + Name string `json:"name"` + Metadata []byte `json:"metadata"` CreatedAt pgtype.Timestamptz `json:"created_at"` } diff --git a/internal/embedded/agent/.gitignore b/internal/embedded/agent/.gitignore deleted file mode 100644 index d6b7ef32..00000000 --- a/internal/embedded/agent/.gitignore +++ /dev/null @@ -1,2 +0,0 @@ -* -!.gitignore diff --git a/internal/embedded/assets.go b/internal/embedded/assets.go index 60225f97..e3449bb6 100644 --- a/internal/embedded/assets.go +++ b/internal/embedded/assets.go @@ -5,7 +5,7 @@ import ( "io/fs" ) -//go:embed all:web all:agent +//go:embed all:web var assetsFS embed.FS func AssetsFS() fs.FS { @@ -15,7 +15,3 @@ func AssetsFS() fs.FS { func WebFS() (fs.FS, error) { return fs.Sub(assetsFS, "web") } - -func AgentFS() (fs.FS, error) { - return fs.Sub(assetsFS, "agent") -} diff --git a/internal/embedded/bun/.gitignore b/internal/embedded/bun/.gitignore deleted file mode 100644 index d6b7ef32..00000000 --- a/internal/embedded/bun/.gitignore +++ /dev/null @@ -1,2 +0,0 @@ -* -!.gitignore diff --git a/internal/handlers/local_channel.go b/internal/handlers/local_channel.go index f913d66c..3b0c6ef6 100644 --- a/internal/handlers/local_channel.go +++ b/internal/handlers/local_channel.go @@ -10,7 +10,9 @@ import ( "log/slog" "maps" "net/http" + "path/filepath" "strings" + "sync" "time" "github.com/gorilla/websocket" @@ -24,6 +26,7 @@ import ( "github.com/memohai/memoh/internal/conversation" "github.com/memohai/memoh/internal/conversation/flow" "github.com/memohai/memoh/internal/media" + messagepkg "github.com/memohai/memoh/internal/message" ) // localTtsSynthesizer synthesizes text to speech audio. @@ -395,6 +398,11 @@ func (h *LocalChannelHandler) HandleWebSocket(c echo.Context) error { activeCancel = streamCancel eventCh := make(chan flow.WSStreamEvent, 64) + var ( + outboundAssetMu sync.Mutex + outboundAssetRefs []messagepkg.AssetRef + ) + go func() { defer streamCancel() defer close(eventCh) @@ -420,10 +428,22 @@ func (h *LocalChannelHandler) HandleWebSocket(c echo.Context) error { go func() { for event := range eventCh { - for _, processed := range h.processWSEvent(streamCtx, botID, event) { - writer.Send(processed) + processed := h.processWSEvent(streamCtx, botID, event) + for _, p := range processed { + writer.Send(p) + if refs := extractAssetRefsFromProcessedEvent(p); len(refs) > 0 { + outboundAssetMu.Lock() + outboundAssetRefs = append(outboundAssetRefs, refs...) 
+ outboundAssetMu.Unlock() + } } } + outboundAssetMu.Lock() + refs := outboundAssetRefs + outboundAssetMu.Unlock() + if len(refs) > 0 { + h.resolver.LinkOutboundAssets(context.WithoutCancel(ctx), botID, refs) + } }() default: @@ -650,12 +670,42 @@ func (h *LocalChannelHandler) buildTtsAttachment(ctx context.Context, botID, con func applyAssetToItem(item map[string]any, botID string, asset media.Asset) map[string]any { result := maps.Clone(item) + + sourcePath := strings.TrimSpace(itemStr(item, "path")) + sourceURL := strings.TrimSpace(itemStr(item, "url")) + + existingName, _ := result["name"].(string) + if strings.TrimSpace(existingName) == "" { + if sourcePath != "" { + result["name"] = filepath.Base(sourcePath) + } else if sourceURL != "" { + result["name"] = filepath.Base(sourceURL) + } + } + delete(result, "path") result["url"] = "" applyAssetToMap(result, botID, asset) + + if meta, ok := result["metadata"].(map[string]any); ok { + if n, _ := result["name"].(string); n != "" { + meta["name"] = n + } + if sourcePath != "" { + meta["source_path"] = sourcePath + } + if sourceURL != "" { + meta["source_url"] = sourceURL + } + } return result } +func itemStr(m map[string]any, key string) string { + v, _ := m[key].(string) + return v +} + func applyAssetToMap(m map[string]any, botID string, asset media.Asset) { m["content_hash"] = asset.ContentHash m["metadata"] = map[string]any{ @@ -669,3 +719,48 @@ func applyAssetToMap(m map[string]any, botID string, asset media.Asset) { m["size"] = asset.SizeBytes } } + +// extractAssetRefsFromProcessedEvent parses a processed attachment_delta +// event to collect asset refs for post-persist linking. +func extractAssetRefsFromProcessedEvent(event json.RawMessage) []messagepkg.AssetRef { + var envelope struct { + Type string `json:"type"` + Attachments []struct { + ContentHash string `json:"content_hash"` + Name string `json:"name"` + Mime string `json:"mime"` + Size float64 `json:"size"` + Metadata map[string]any `json:"metadata"` + } `json:"attachments"` + } + if err := json.Unmarshal(event, &envelope); err != nil || envelope.Type != "attachment_delta" { + return nil + } + var refs []messagepkg.AssetRef + for i, att := range envelope.Attachments { + ch := strings.TrimSpace(att.ContentHash) + if ch == "" { + continue + } + name := strings.TrimSpace(att.Name) + if name == "" && att.Metadata != nil { + name, _ = att.Metadata["name"].(string) + } + ref := messagepkg.AssetRef{ + ContentHash: ch, + Role: "attachment", + Ordinal: i, + Name: name, + Mime: strings.TrimSpace(att.Mime), + SizeBytes: int64(att.Size), + Metadata: att.Metadata, + } + if att.Metadata != nil { + if sk, ok := att.Metadata["storage_key"].(string); ok { + ref.StorageKey = sk + } + } + refs = append(refs, ref) + } + return refs +} diff --git a/internal/handlers/mcp_tools_test.go b/internal/handlers/mcp_tools_test.go index d3c2eb0d..9b93a77a 100644 --- a/internal/handlers/mcp_tools_test.go +++ b/internal/handlers/mcp_tools_test.go @@ -102,7 +102,7 @@ func (e *mcpToolsTestExecutor) CallTool(_ context.Context, session mcpgw.ToolSes func TestHandleMCPToolsWithGatewayAcceptCompatibility(t *testing.T) { e := echo.New() executor := &mcpToolsTestExecutor{} - toolGateway := mcpgw.NewToolGatewayService(slog.Default(), []mcpgw.ToolExecutor{executor}, nil) + toolGateway := mcpgw.NewToolGatewayService(slog.Default(), []mcpgw.ToolSource{executor}) handler := &ContainerdHandler{ logger: slog.Default(), toolGateway: toolGateway, diff --git a/internal/mcp/providers/contacts/provider.go 
b/internal/mcp/providers/contacts/provider.go deleted file mode 100644 index 71bd69a7..00000000 --- a/internal/mcp/providers/contacts/provider.go +++ /dev/null @@ -1,108 +0,0 @@ -package contacts - -import ( - "context" - "log/slog" - "strings" - - "github.com/memohai/memoh/internal/channel/route" - mcpgw "github.com/memohai/memoh/internal/mcp" -) - -const toolGetContacts = "get_contacts" - -// Executor exposes get_contacts as an MCP tool. -type Executor struct { - routeService route.Service - logger *slog.Logger -} - -// NewExecutor creates a contacts tool executor. -func NewExecutor(log *slog.Logger, routeService route.Service) *Executor { - if log == nil { - log = slog.Default() - } - return &Executor{ - routeService: routeService, - logger: log.With(slog.String("provider", "contacts_tool")), - } -} - -func (p *Executor) ListTools(_ context.Context, _ mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - if p.routeService == nil { - return []mcpgw.ToolDescriptor{}, nil - } - return []mcpgw.ToolDescriptor{ - { - Name: toolGetContacts, - Description: "List all known contacts and conversations for the current bot. Returns platform, conversation type, reply target, and metadata for each route.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "platform": map[string]any{ - "type": "string", - "description": "Filter by channel platform (e.g. telegram, feishu). Returns all platforms when omitted.", - }, - }, - "required": []string{}, - }, - }, - }, nil -} - -func (p *Executor) CallTool(ctx context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - if toolName != toolGetContacts { - return nil, mcpgw.ErrToolNotFound - } - if p.routeService == nil { - return mcpgw.BuildToolErrorResult("contacts service not available"), nil - } - - botID := strings.TrimSpace(session.BotID) - if botID == "" { - return mcpgw.BuildToolErrorResult("bot_id is required"), nil - } - - routes, err := p.routeService.List(ctx, botID) - if err != nil { - p.logger.Warn("list routes failed", slog.String("bot_id", botID), slog.Any("error", err)) - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - - platformFilter := strings.ToLower(strings.TrimSpace(mcpgw.FirstStringArg(arguments, "platform"))) - - contacts := make([]map[string]any, 0, len(routes)) - for _, r := range routes { - if platformFilter != "" && !strings.EqualFold(r.Platform, platformFilter) { - continue - } - entry := map[string]any{ - "route_id": r.ID, - "platform": r.Platform, - "conversation_type": r.ConversationType, - "target": r.ReplyTarget, - "conversation_id": r.ConversationID, - "last_active": r.UpdatedAt.Format("2006-01-02T15:04:05Z"), - } - if len(r.Metadata) > 0 { - if v, ok := r.Metadata["conversation_name"].(string); ok && v != "" { - entry["display_name"] = v - } else if v, ok := r.Metadata["sender_display_name"].(string); ok && v != "" { - entry["display_name"] = v - } - if v, ok := r.Metadata["sender_username"].(string); ok && v != "" { - entry["username"] = v - } - entry["metadata"] = r.Metadata - } - contacts = append(contacts, entry) - } - - payload := map[string]any{ - "ok": true, - "bot_id": botID, - "count": len(contacts), - "contacts": contacts, - } - return mcpgw.BuildToolSuccessResult(payload), nil -} diff --git a/internal/mcp/providers/container/fsops_test.go b/internal/mcp/providers/container/fsops_test.go deleted file mode 100644 index c4f36cc6..00000000 --- a/internal/mcp/providers/container/fsops_test.go +++ /dev/null @@ 
-1,100 +0,0 @@ -package container - -import "testing" - -func TestShellQuote(t *testing.T) { - tests := []struct { - in string - want string - }{ - {"hello", "'hello'"}, - {"", "''"}, - {"it's", `'it'\''s'`}, - {"a b", "'a b'"}, - } - for _, tt := range tests { - got := ShellQuote(tt.in) - if got != tt.want { - t.Errorf("ShellQuote(%q) = %q, want %q", tt.in, got, tt.want) - } - } -} - -func TestApplyEdit(t *testing.T) { - raw := "hello world\n" - updated, err := applyEdit(raw, "test.txt", "hello", "goodbye") - if err != nil { - t.Fatal(err) - } - if updated != "goodbye world\n" { - t.Errorf("updated = %q", updated) - } -} - -func TestApplyEdit_NotFound(t *testing.T) { - raw := "hello world\n" - _, err := applyEdit(raw, "test.txt", "missing text", "new") - if err == nil { - t.Error("expected error for missing text") - } -} - -func TestApplyEdit_MultipleOccurrences(t *testing.T) { - raw := "foo bar foo\n" - _, err := applyEdit(raw, "test.txt", "foo", "baz") - if err == nil { - t.Error("expected error for multiple occurrences") - } -} - -func TestApplyEdit_NoChange(t *testing.T) { - raw := "hello world\n" - _, err := applyEdit(raw, "test.txt", "hello", "hello") - if err == nil { - t.Error("expected error for identical replacement") - } -} - -func TestFuzzyFindText(t *testing.T) { - tests := []struct { - name string - content string - old string - found bool - }{ - {"exact match", "hello world", "hello", true}, - {"no match", "hello world", "missing", false}, - {"smart quote match", "it\u2019s a test", "it's a test", true}, - } - for _, tt := range tests { - t.Run(tt.name, func(t *testing.T) { - result := fuzzyFindText(tt.content, tt.old) - if result.Found != tt.found { - t.Errorf("found = %v, want %v", result.Found, tt.found) - } - }) - } -} - -func TestDetectLineEnding(t *testing.T) { - if detectLineEnding("foo\r\nbar") != "\r\n" { - t.Error("expected CRLF") - } - if detectLineEnding("foo\nbar") != "\n" { - t.Error("expected LF") - } - if detectLineEnding("foo") != "\n" { - t.Error("expected LF default") - } -} - -func TestStripBOM(t *testing.T) { - bom, content := stripBOM("\uFEFFhello") - if bom != "\uFEFF" || content != "hello" { - t.Errorf("bom=%q content=%q", bom, content) - } - bom2, content2 := stripBOM("hello") - if bom2 != "" || content2 != "hello" { - t.Errorf("bom=%q content=%q", bom2, content2) - } -} diff --git a/internal/mcp/providers/container/provider.go b/internal/mcp/providers/container/provider.go deleted file mode 100644 index 8a4cc14c..00000000 --- a/internal/mcp/providers/container/provider.go +++ /dev/null @@ -1,324 +0,0 @@ -package container - -import ( - "context" - "fmt" - "io" - "log/slog" - "math" - "strings" - - mcpgw "github.com/memohai/memoh/internal/mcp" - "github.com/memohai/memoh/internal/workspace/bridge" -) - -const ( - toolRead = "read" - toolWrite = "write" - toolList = "list" - toolEdit = "edit" - toolExec = "exec" - - defaultExecWorkDir = "/data" -) - -// Executor provides filesystem and exec tools (read, write, list, edit, exec) that -// operate inside the bot container via gRPC. All I/O goes through the container -// sandbox — no direct host filesystem access. -type Executor struct { - clients bridge.Provider - execWorkDir string - logger *slog.Logger -} - -// NewExecutor returns a tool executor backed by gRPC container clients. 
-func NewExecutor(log *slog.Logger, clients bridge.Provider, execWorkDir string) *Executor { - if log == nil { - log = slog.Default() - } - wd := strings.TrimSpace(execWorkDir) - if wd == "" { - wd = defaultExecWorkDir - } - return &Executor{ - clients: clients, - execWorkDir: wd, - logger: log.With(slog.String("provider", "container_tool")), - } -} - -// ListTools returns read, write, list, edit, and exec tool descriptors. -func (p *Executor) ListTools(_ context.Context, _ mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - wd := p.execWorkDir - if wd == "" { - wd = defaultExecWorkDir - } - return []mcpgw.ToolDescriptor{ - { - Name: toolRead, - Description: fmt.Sprintf("Read file content inside the bot container. Supports pagination for large files. Max %d lines / %d bytes per call.", readMaxLines, readMaxBytes), - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "path": map[string]any{ - "type": "string", - "description": fmt.Sprintf("File path (relative to %s or absolute inside container)", wd), - }, - "line_offset": map[string]any{ - "type": "integer", - "description": "Line number to start reading from (1-indexed). Default: 1.", - "minimum": 1, - "default": 1, - }, - "n_lines": map[string]any{ - "type": "integer", - "description": fmt.Sprintf("Number of lines to read per call. Default: %d (the per-call maximum). Use a smaller value with line_offset for finer pagination. Max: %d.", readMaxLines, readMaxLines), - "minimum": 1, - "maximum": readMaxLines, - "default": readMaxLines, - }, - }, - "required": []string{"path"}, - }, - }, - { - Name: toolWrite, - Description: "Write file content inside the bot container.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "path": map[string]any{"type": "string", "description": fmt.Sprintf("File path (relative to %s or absolute inside container)", wd)}, - "content": map[string]any{"type": "string", "description": "File content"}, - }, - "required": []string{"path", "content"}, - }, - }, - { - Name: toolList, - Description: "List directory entries inside the bot container.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "path": map[string]any{"type": "string", "description": fmt.Sprintf("Directory path (relative to %s or absolute inside container)", wd)}, - "recursive": map[string]any{"type": "boolean", "description": "List recursively"}, - }, - "required": []string{"path"}, - }, - }, - { - Name: toolEdit, - Description: "Replace exact text in a file inside the bot container.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "path": map[string]any{"type": "string", "description": fmt.Sprintf("File path (relative to %s or absolute inside container)", wd)}, - "old_text": map[string]any{"type": "string", "description": "Exact text to find"}, - "new_text": map[string]any{"type": "string", "description": "Replacement text"}, - }, - "required": []string{"path", "old_text", "new_text"}, - }, - }, - { - Name: toolExec, - Description: fmt.Sprintf("Execute a command in the bot container. Runs in the bot's data directory (%s) by default.", wd), - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "command": map[string]any{ - "type": "string", - "description": "Shell command to run (e.g. 
ls -la, cat file.txt)", - }, - "work_dir": map[string]any{ - "type": "string", - "description": fmt.Sprintf("Working directory inside the container (default: %s)", wd), - }, - }, - "required": []string{"command"}, - }, - }, - }, nil -} - -// normalizePath converts paths that the LLM may send as /data/... into relative -// paths under the working directory. e.g. /data/test.txt -> test.txt, /data -> . -func (p *Executor) normalizePath(path string) string { - path = strings.TrimSpace(path) - if path == "" { - return path - } - prefix := p.execWorkDir - if prefix == "" { - prefix = defaultExecWorkDir - } - if path == prefix { - return "." - } - if strings.HasPrefix(path, prefix+"/") { - return strings.TrimLeft(strings.TrimPrefix(path, prefix+"/"), "/") - } - return path -} - -// CallTool dispatches to the appropriate gRPC-backed implementation. -func (p *Executor) CallTool(ctx context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - botID := strings.TrimSpace(session.BotID) - if botID == "" { - return mcpgw.BuildToolErrorResult("bot_id is required"), nil - } - - client, err := p.clients.MCPClient(ctx, botID) - if err != nil { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("container not reachable: %v", err)), nil - } - - switch toolName { - case toolRead: - return p.callRead(ctx, client, arguments) - case toolWrite: - return p.callWrite(ctx, client, arguments) - case toolList: - return p.callList(ctx, client, arguments) - case toolEdit: - return p.callEdit(ctx, client, arguments) - case toolExec: - return p.callExec(ctx, client, botID, arguments) - default: - return nil, mcpgw.ErrToolNotFound - } -} - -func (p *Executor) callRead(ctx context.Context, client *bridge.Client, args map[string]any) (map[string]any, error) { - filePath := p.normalizePath(mcpgw.StringArg(args, "path")) - if filePath == "" { - return mcpgw.BuildToolErrorResult("path is required"), nil - } - - lineOffset := int32(1) - if offset, ok, err := mcpgw.IntArg(args, "line_offset"); err != nil { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("invalid line_offset: %v", err)), nil - } else if ok { - if offset < 1 { - return mcpgw.BuildToolErrorResult("line_offset must be >= 1"), nil - } - if offset > math.MaxInt32 { - return mcpgw.BuildToolErrorResult("line_offset exceeds maximum"), nil - } - lineOffset = int32(offset) - } - - nLines := int32(readMaxLines) - if n, ok, err := mcpgw.IntArg(args, "n_lines"); err != nil { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("invalid n_lines: %v", err)), nil - } else if ok { - if n < 1 { - return mcpgw.BuildToolErrorResult("n_lines must be >= 1"), nil - } - if n > readMaxLines { - n = readMaxLines - } - nLines = int32(n) //nolint:gosec // bounded by readMaxLines (200) - } - - resp, err := client.ReadFile(ctx, filePath, lineOffset, nLines) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.GetBinary() { - return mcpgw.BuildToolErrorResult("file appears to be binary. 
Read tool only supports text files"), nil - } - - return mcpgw.BuildToolSuccessResult(map[string]any{ - "content": resp.GetContent(), - "total_lines": resp.GetTotalLines(), - }), nil -} - -func (p *Executor) callWrite(ctx context.Context, client *bridge.Client, args map[string]any) (map[string]any, error) { - filePath := p.normalizePath(mcpgw.StringArg(args, "path")) - content := mcpgw.StringArg(args, "content") - if filePath == "" { - return mcpgw.BuildToolErrorResult("path is required"), nil - } - if err := client.WriteFile(ctx, filePath, []byte(content)); err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - return mcpgw.BuildToolSuccessResult(map[string]any{"ok": true}), nil -} - -func (p *Executor) callList(ctx context.Context, client *bridge.Client, args map[string]any) (map[string]any, error) { - dirPath := p.normalizePath(mcpgw.StringArg(args, "path")) - if dirPath == "" { - dirPath = "." - } - recursive, _, _ := mcpgw.BoolArg(args, "recursive") - - entries, err := client.ListDir(ctx, dirPath, recursive) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - entriesMaps := make([]map[string]any, len(entries)) - for i, e := range entries { - entriesMaps[i] = map[string]any{ - "path": e.GetPath(), - "is_dir": e.GetIsDir(), - "size": e.GetSize(), - "mode": e.GetMode(), - "mod_time": e.GetModTime(), - } - } - return mcpgw.BuildToolSuccessResult(map[string]any{"path": dirPath, "entries": entriesMaps}), nil -} - -func (p *Executor) callEdit(ctx context.Context, client *bridge.Client, args map[string]any) (map[string]any, error) { - filePath := p.normalizePath(mcpgw.StringArg(args, "path")) - oldText := mcpgw.StringArg(args, "old_text") - newText := mcpgw.StringArg(args, "new_text") - if filePath == "" || oldText == "" { - return mcpgw.BuildToolErrorResult("path, old_text and new_text are required"), nil - } - - reader, err := client.ReadRaw(ctx, filePath) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = reader.Close() }() - raw, err := io.ReadAll(reader) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - - updated, err := applyEdit(string(raw), filePath, oldText, newText) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - - if err := client.WriteFile(ctx, filePath, []byte(updated)); err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - return mcpgw.BuildToolSuccessResult(map[string]any{"ok": true}), nil -} - -func (p *Executor) callExec(ctx context.Context, client *bridge.Client, botID string, args map[string]any) (map[string]any, error) { - command := strings.TrimSpace(mcpgw.StringArg(args, "command")) - if command == "" { - return mcpgw.BuildToolErrorResult("command is required"), nil - } - workDir := strings.TrimSpace(mcpgw.StringArg(args, "work_dir")) - if workDir == "" { - workDir = p.execWorkDir - } - - result, err := client.Exec(ctx, command, workDir, 30) - if err != nil { - p.logger.Warn("exec failed", slog.String("bot_id", botID), slog.String("command", command), slog.Any("error", err)) - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - - stdout := pruneToolOutputText(result.Stdout, "tool result (exec stdout)") - stderr := pruneToolOutputText(result.Stderr, "tool result (exec stderr)") - return mcpgw.BuildToolSuccessResult(map[string]any{ - "stdout": stdout, - "stderr": stderr, - "exit_code": result.ExitCode, - }), nil -} diff --git a/internal/mcp/providers/container/provider_test.go 
b/internal/mcp/providers/container/provider_test.go deleted file mode 100644 index d978725a..00000000 --- a/internal/mcp/providers/container/provider_test.go +++ /dev/null @@ -1,516 +0,0 @@ -package container - -import ( - "context" - "math" - "net" - "strings" - "sync" - "testing" - - "google.golang.org/grpc" - "google.golang.org/grpc/credentials/insecure" - "google.golang.org/grpc/test/bufconn" - - mcpgw "github.com/memohai/memoh/internal/mcp" - "github.com/memohai/memoh/internal/workspace/bridge" - pb "github.com/memohai/memoh/internal/workspace/bridgepb" -) - -const bufSize = 1 << 20 - -// fakeContainerService is an in-process gRPC server for testing. -// Each RPC handler can be overridden via handler fields. -type fakeContainerService struct { - pb.UnimplementedContainerServiceServer - - mu sync.Mutex - files map[string][]byte // path -> content - - execStdout string - execStderr string - execExitCode int32 -} - -func newFakeService() *fakeContainerService { - return &fakeContainerService{files: make(map[string][]byte)} -} - -func (f *fakeContainerService) setFile(path, content string) { - f.mu.Lock() - defer f.mu.Unlock() - f.files[path] = []byte(content) -} - -func (f *fakeContainerService) getFile(path string) ([]byte, bool) { - f.mu.Lock() - defer f.mu.Unlock() - data, ok := f.files[path] - return data, ok -} - -func (f *fakeContainerService) ReadFile(_ context.Context, req *pb.ReadFileRequest) (*pb.ReadFileResponse, error) { - data, ok := f.getFile(req.GetPath()) - if !ok { - return &pb.ReadFileResponse{Content: "", TotalLines: 0}, nil - } - content := string(data) - lines := splitLines(content) - total := int32(min(len(lines), math.MaxInt32)) //nolint:gosec // G115: value is clamped to math.MaxInt32 above - - offset := req.GetLineOffset() - if offset < 1 { - offset = 1 - } - n := req.GetNLines() - if n <= 0 { - n = int32(readMaxLines) - } - - start := int(offset - 1) - if start >= len(lines) { - return &pb.ReadFileResponse{Content: "", TotalLines: total}, nil - } - end := start + int(n) - if end > len(lines) { - end = len(lines) - } - result := "" - var resultSb76 strings.Builder - for i, l := range lines[start:end] { - if i > 0 { - resultSb76.WriteString("\n") - } - resultSb76.WriteString(l) - } - result += resultSb76.String() - return &pb.ReadFileResponse{Content: result, TotalLines: total}, nil -} - -func (f *fakeContainerService) WriteFile(_ context.Context, req *pb.WriteFileRequest) (*pb.WriteFileResponse, error) { - f.mu.Lock() - defer f.mu.Unlock() - f.files[req.GetPath()] = req.GetContent() - return &pb.WriteFileResponse{}, nil -} - -func (f *fakeContainerService) ListDir(_ context.Context, req *pb.ListDirRequest) (*pb.ListDirResponse, error) { - f.mu.Lock() - defer f.mu.Unlock() - var entries []*pb.FileEntry - dir := req.GetPath() - if dir == "." { - dir = "" - } - for path := range f.files { - if dir == "" || path == dir || hasPrefix(path, dir+"/") { - name := path - if dir != "" && hasPrefix(path, dir+"/") { - name = path[len(dir)+1:] - } - entries = append(entries, &pb.FileEntry{ - Path: name, - IsDir: false, - Size: int64(len(f.files[path])), - }) - } - } - return &pb.ListDirResponse{Entries: entries}, nil -} - -func (f *fakeContainerService) ReadRaw(req *pb.ReadRawRequest, stream pb.ContainerService_ReadRawServer) error { - data, ok := f.getFile(req.GetPath()) - if !ok { - return nil - } - return stream.Send(&pb.DataChunk{Data: data}) -} - -func (f *fakeContainerService) Exec(stream pb.ContainerService_ExecServer) error { - // Consume the config message. 
- if _, err := stream.Recv(); err != nil { - return err - } - if f.execStdout != "" { - if err := stream.Send(&pb.ExecOutput{Stream: pb.ExecOutput_STDOUT, Data: []byte(f.execStdout)}); err != nil { - return err - } - } - if f.execStderr != "" { - if err := stream.Send(&pb.ExecOutput{Stream: pb.ExecOutput_STDERR, Data: []byte(f.execStderr)}); err != nil { - return err - } - } - return stream.Send(&pb.ExecOutput{Stream: pb.ExecOutput_EXIT, ExitCode: f.execExitCode}) -} - -func hasPrefix(s, prefix string) bool { - return len(s) >= len(prefix) && s[:len(prefix)] == prefix -} - -func splitLines(s string) []string { - if s == "" { - return nil - } - var lines []string - start := 0 - for i := 0; i < len(s); i++ { - if s[i] == '\n' { - lines = append(lines, s[start:i]) - start = i + 1 - } - } - lines = append(lines, s[start:]) - return lines -} - -// testSetup creates a bufconn gRPC server and a matching bridge.Provider. -func testSetup(t *testing.T, svc *fakeContainerService) bridge.Provider { - t.Helper() - lis := bufconn.Listen(bufSize) - srv := grpc.NewServer() - pb.RegisterContainerServiceServer(srv, svc) - - done := make(chan struct{}) - go func() { - defer close(done) - _ = srv.Serve(lis) - }() - t.Cleanup(func() { - srv.Stop() - <-done - }) - - dialer := func(ctx context.Context, _ string) (net.Conn, error) { - return lis.DialContext(ctx) - } - conn, err := grpc.NewClient("passthrough://bufnet", - grpc.WithContextDialer(dialer), - grpc.WithTransportCredentials(insecure.NewCredentials()), - ) - if err != nil { - t.Fatalf("grpc.NewClient: %v", err) - } - t.Cleanup(func() { _ = conn.Close() }) - - client := bridge.NewClientFromConn(conn) - return &staticProvider{client: client} -} - -// staticProvider always returns the same client, ignoring botID. -type staticProvider struct { - client *bridge.Client -} - -func (p *staticProvider) MCPClient(_ context.Context, _ string) (*bridge.Client, error) { - return p.client, nil -} - -func session() mcpgw.ToolSessionContext { - return mcpgw.ToolSessionContext{BotID: "bot-test"} -} - -func executor(provider bridge.Provider) *Executor { - return NewExecutor(nil, provider, defaultExecWorkDir) -} - -// --- Tests --- - -func TestExecutor_ListTools(t *testing.T) { - svc := newFakeService() - provider := testSetup(t, svc) - ex := executor(provider) - - tools, err := ex.ListTools(context.Background(), session()) - if err != nil { - t.Fatalf("ListTools: %v", err) - } - want := map[string]bool{toolRead: false, toolWrite: false, toolList: false, toolEdit: false, toolExec: false} - for _, tool := range tools { - want[tool.Name] = true - } - for name, found := range want { - if !found { - t.Errorf("tool %q missing from ListTools", name) - } - } - if len(tools) != 5 { - t.Errorf("expected 5 tools, got %d", len(tools)) - } -} - -func TestExecutor_CallTool_Read(t *testing.T) { - svc := newFakeService() - svc.setFile("hello.txt", "line1\nline2\nline3") - provider := testSetup(t, svc) - ex := executor(provider) - - result, err := ex.CallTool(context.Background(), session(), toolRead, map[string]any{ - "path": "hello.txt", - }) - if err != nil { - t.Fatalf("CallTool read: %v", err) - } - if isError, _ := result["isError"].(bool); isError { - t.Fatalf("unexpected tool error: %v", result) - } - structured, ok := result["structuredContent"].(map[string]any) - if !ok { - t.Fatalf("expected structuredContent, got %T: %v", result["structuredContent"], result) - } - content, _ := structured["content"].(string) - if content == "" { - t.Errorf("expected non-empty content, got %q", 
content) - } - totalLines, _ := structured["total_lines"].(int32) - if totalLines != 3 { - t.Errorf("expected total_lines=3, got %v", structured["total_lines"]) - } -} - -func TestExecutor_CallTool_Read_Binary(t *testing.T) { - svc := newFakeService() - provider := testSetup(t, svc) - ex := executor(provider) - - // Reading a nonexistent file should return empty content, not error. - result, err := ex.CallTool(context.Background(), session(), toolRead, map[string]any{ - "path": "missing.txt", - }) - if err != nil { - t.Fatalf("CallTool: %v", err) - } - // Empty file returns success with empty content (total_lines=0). - if isError, _ := result["isError"].(bool); isError { - t.Logf("tool returned error for missing file: %v", result) - } -} - -func TestExecutor_CallTool_Read_Pagination(t *testing.T) { - svc := newFakeService() - svc.setFile("big.txt", "a\nb\nc\nd\ne") - provider := testSetup(t, svc) - ex := executor(provider) - - result, err := ex.CallTool(context.Background(), session(), toolRead, map[string]any{ - "path": "big.txt", - "line_offset": float64(3), - "n_lines": float64(2), - }) - if err != nil { - t.Fatalf("CallTool read pagination: %v", err) - } - if isError, _ := result["isError"].(bool); isError { - t.Fatalf("unexpected error: %v", result) - } - structured := result["structuredContent"].(map[string]any) - content := structured["content"].(string) - if content != "c\nd" { - t.Errorf("expected 'c\nd', got %q", content) - } -} - -func TestExecutor_CallTool_Read_InvalidArgs(t *testing.T) { - svc := newFakeService() - provider := testSetup(t, svc) - ex := executor(provider) - - result, err := ex.CallTool(context.Background(), session(), toolRead, map[string]any{ - "path": "f.txt", - "line_offset": float64(0), - }) - if err != nil { - t.Fatalf("CallTool: %v", err) - } - if isError, _ := result["isError"].(bool); !isError { - t.Errorf("expected error for line_offset=0, got %v", result) - } -} - -func TestExecutor_CallTool_Write(t *testing.T) { - svc := newFakeService() - provider := testSetup(t, svc) - ex := executor(provider) - - result, err := ex.CallTool(context.Background(), session(), toolWrite, map[string]any{ - "path": "out.txt", - "content": "hello world", - }) - if err != nil { - t.Fatalf("CallTool write: %v", err) - } - if isError, _ := result["isError"].(bool); isError { - t.Fatalf("unexpected error: %v", result) - } - - data, ok := svc.getFile("out.txt") - if !ok { - t.Fatal("file not written") - } - if string(data) != "hello world" { - t.Errorf("expected 'hello world', got %q", string(data)) - } -} - -func TestExecutor_CallTool_List(t *testing.T) { - svc := newFakeService() - svc.setFile("dir/a.txt", "aaa") - svc.setFile("dir/b.txt", "bbb") - provider := testSetup(t, svc) - ex := executor(provider) - - result, err := ex.CallTool(context.Background(), session(), toolList, map[string]any{ - "path": "dir", - }) - if err != nil { - t.Fatalf("CallTool list: %v", err) - } - if isError, _ := result["isError"].(bool); isError { - t.Fatalf("unexpected error: %v", result) - } - structured := result["structuredContent"].(map[string]any) - entries, _ := structured["entries"].([]map[string]any) - if len(entries) < 1 { - t.Logf("note: got %d entries", len(entries)) - } -} - -func TestExecutor_CallTool_Edit(t *testing.T) { - svc := newFakeService() - svc.setFile("edit.txt", "hello world\n") - provider := testSetup(t, svc) - ex := executor(provider) - - result, err := ex.CallTool(context.Background(), session(), toolEdit, map[string]any{ - "path": "edit.txt", - "old_text": "hello 
world", - "new_text": "goodbye world", - }) - if err != nil { - t.Fatalf("CallTool edit: %v", err) - } - if isError, _ := result["isError"].(bool); isError { - t.Fatalf("unexpected error: %v", result) - } - - data, _ := svc.getFile("edit.txt") - if string(data) != "goodbye world\n" { - t.Errorf("expected 'goodbye world\n', got %q", string(data)) - } -} - -func TestExecutor_CallTool_Edit_NotFound(t *testing.T) { - svc := newFakeService() - svc.setFile("edit.txt", "hello world\n") - provider := testSetup(t, svc) - ex := executor(provider) - - result, err := ex.CallTool(context.Background(), session(), toolEdit, map[string]any{ - "path": "edit.txt", - "old_text": "no such text", - "new_text": "replacement", - }) - if err != nil { - t.Fatalf("CallTool: %v", err) - } - if isError, _ := result["isError"].(bool); !isError { - t.Errorf("expected error for not-found old_text, got %v", result) - } -} - -func TestExecutor_CallTool_Exec(t *testing.T) { - svc := newFakeService() - svc.execStdout = "hello from exec\n" - svc.execExitCode = 0 - provider := testSetup(t, svc) - ex := executor(provider) - - result, err := ex.CallTool(context.Background(), session(), toolExec, map[string]any{ - "command": "echo hello", - }) - if err != nil { - t.Fatalf("CallTool exec: %v", err) - } - if isError, _ := result["isError"].(bool); isError { - t.Fatalf("unexpected error: %v", result) - } - structured := result["structuredContent"].(map[string]any) - stdout, _ := structured["stdout"].(string) - if stdout == "" { - t.Errorf("expected non-empty stdout, got %q", stdout) - } - exitCode, _ := structured["exit_code"].(int32) - if exitCode != 0 { - t.Errorf("expected exit_code=0, got %v", exitCode) - } -} - -func TestExecutor_CallTool_Exec_NonZeroExit(t *testing.T) { - svc := newFakeService() - svc.execStderr = "command not found\n" - svc.execExitCode = 127 - provider := testSetup(t, svc) - ex := executor(provider) - - result, err := ex.CallTool(context.Background(), session(), toolExec, map[string]any{ - "command": "nosuchcmd", - }) - if err != nil { - t.Fatalf("CallTool: %v", err) - } - // Non-zero exit is not a tool error — it's returned as structured output. 
- if isError, _ := result["isError"].(bool); isError { - t.Errorf("unexpected tool error for non-zero exit: %v", result) - } - structured := result["structuredContent"].(map[string]any) - exitCode, _ := structured["exit_code"].(int32) - if exitCode != 127 { - t.Errorf("expected exit_code=127, got %v", exitCode) - } -} - -func TestExecutor_CallTool_NoBotID(t *testing.T) { - svc := newFakeService() - provider := testSetup(t, svc) - ex := executor(provider) - - result, err := ex.CallTool(context.Background(), mcpgw.ToolSessionContext{BotID: ""}, toolRead, map[string]any{ - "path": "f.txt", - }) - if err != nil { - t.Fatalf("CallTool: %v", err) - } - if isError, _ := result["isError"].(bool); !isError { - t.Errorf("expected error for empty bot_id") - } -} - -func TestExecutor_CallTool_UnknownTool(t *testing.T) { - svc := newFakeService() - provider := testSetup(t, svc) - ex := executor(provider) - - _, err := ex.CallTool(context.Background(), session(), "nosuch", nil) - if err == nil { - t.Errorf("expected error for unknown tool") - } -} - -func TestExecutor_NormalizePath(t *testing.T) { - ex := &Executor{execWorkDir: "/data"} - cases := []struct { - in, want string - }{ - {"/data/test.txt", "test.txt"}, - {"/data", "."}, - {"/data/a/b.txt", "a/b.txt"}, - {"relative.txt", "relative.txt"}, - {"", ""}, - } - for _, c := range cases { - got := ex.normalizePath(c.in) - if got != c.want { - t.Errorf("normalizePath(%q) = %q, want %q", c.in, got, c.want) - } - } -} diff --git a/internal/mcp/providers/container/prune.go b/internal/mcp/providers/container/prune.go deleted file mode 100644 index a1cbdf99..00000000 --- a/internal/mcp/providers/container/prune.go +++ /dev/null @@ -1,38 +0,0 @@ -package container - -import ( - textprune "github.com/memohai/memoh/internal/prune" -) - -// Output pruning limits for tool results. -const ( - toolOutputHeadBytes = 4 * 1024 - toolOutputTailBytes = 1 * 1024 - toolOutputHeadLines = 150 - toolOutputTailLines = 50 -) - -// Read tool limits - single conservative budget. -// AI can paginate via line_offset/n_lines if file is larger. -const ( - readMaxLines = 200 // Max lines per read - readMaxBytes = 5120 // 5KB per read - readMaxLineLength = 1000 // Max characters per line (runes) - readHeadBytes = 3072 // 3KB head when pruning - readTailBytes = 1024 // 1KB tail when pruning - readHeadLines = 120 // 120 lines head when pruning - readTailLines = 40 // 40 lines tail when pruning -) - -// pruneToolOutputText prunes generic tool output (exec, etc.). 
-func pruneToolOutputText(text, label string) string { - return textprune.PruneWithEdges(text, label, textprune.Config{ - MaxBytes: textprune.DefaultMaxBytes, - MaxLines: textprune.DefaultMaxLines, - HeadBytes: toolOutputHeadBytes, - TailBytes: toolOutputTailBytes, - HeadLines: toolOutputHeadLines, - TailLines: toolOutputTailLines, - Marker: textprune.DefaultMarker, - }) -} diff --git a/internal/mcp/providers/email/provider.go b/internal/mcp/providers/email/provider.go deleted file mode 100644 index ec024935..00000000 --- a/internal/mcp/providers/email/provider.go +++ /dev/null @@ -1,303 +0,0 @@ -package email - -import ( - "context" - "fmt" - "log/slog" - "math" - "strconv" - "strings" - - "github.com/memohai/memoh/internal/email" - mcpgw "github.com/memohai/memoh/internal/mcp" -) - -const ( - toolListEmailAccounts = "list_email_accounts" - toolSendEmail = "send_email" - toolListEmails = "list_email" - toolReadEmail = "read_email" -) - -type Executor struct { - logger *slog.Logger - service *email.Service - manager *email.Manager -} - -func NewExecutor(log *slog.Logger, service *email.Service, manager *email.Manager) *Executor { - return &Executor{ - logger: log.With(slog.String("provider", "email_tool")), - service: service, - manager: manager, - } -} - -func (*Executor) ListTools(_ context.Context, _ mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - return []mcpgw.ToolDescriptor{ - { - Name: toolListEmailAccounts, - Description: "List the email accounts (provider bindings) configured for this bot, including provider IDs, email addresses, and permissions.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{}, - }, - }, - { - Name: toolSendEmail, - Description: "Send an email via the bot's configured email provider. Requires write permission.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "to": map[string]any{"type": "string", "description": "Recipient email address(es), comma-separated"}, - "subject": map[string]any{"type": "string", "description": "Email subject"}, - "body": map[string]any{"type": "string", "description": "Email body content"}, - "html": map[string]any{"type": "boolean", "description": "Whether body is HTML (default false)"}, - "provider_id": map[string]any{"type": "string", "description": "Email provider ID to send from (optional, uses default if omitted)"}, - }, - "required": []string{"to", "subject", "body"}, - }, - }, - { - Name: toolListEmails, - Description: "List emails from the mailbox (newest first). Supports pagination. Requires read permission.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "page": map[string]any{"type": "integer", "description": "Page number, 0-based (default 0 = newest)"}, - "page_size": map[string]any{"type": "integer", "description": "Emails per page (default 20)"}, - "provider_id": map[string]any{"type": "string", "description": "Email provider ID (optional, uses first readable binding)"}, - }, - }, - }, - { - Name: toolReadEmail, - Description: "Read the full content of an email by its UID. 
Requires read permission.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "uid": map[string]any{"type": "integer", "description": "The email UID from email_list results"}, - "provider_id": map[string]any{"type": "string", "description": "Email provider ID (optional, uses first readable binding)"}, - }, - "required": []string{"uid"}, - }, - }, - }, nil -} - -func (e *Executor) CallTool(ctx context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - botID := strings.TrimSpace(session.BotID) - if botID == "" { - return mcpgw.BuildToolErrorResult("bot_id is required"), nil - } - - bindings, err := e.service.ListBindings(ctx, botID) - if err != nil || len(bindings) == 0 { - return mcpgw.BuildToolErrorResult("no email binding configured for this bot"), nil - } - - resolveReadBinding := func() *email.BindingResponse { - providerID := mcpgw.StringArg(arguments, "provider_id") - for i := range bindings { - if !bindings[i].CanRead { - continue - } - if providerID == "" || bindings[i].EmailProviderID == providerID { - return &bindings[i] - } - } - return nil - } - - resolveWriteBinding := func() *email.BindingResponse { - providerID := mcpgw.StringArg(arguments, "provider_id") - for i := range bindings { - if !bindings[i].CanWrite { - continue - } - if providerID == "" || bindings[i].EmailProviderID == providerID { - return &bindings[i] - } - } - return nil - } - - switch toolName { - case toolListEmailAccounts: - return e.callAccounts(ctx, bindings) - case toolSendEmail: - binding := resolveWriteBinding() - if binding == nil { - return mcpgw.BuildToolErrorResult("email write permission denied or provider not found"), nil - } - return e.callSend(ctx, botID, binding.EmailProviderID, arguments) - case toolListEmails: - binding := resolveReadBinding() - if binding == nil { - return mcpgw.BuildToolErrorResult("email read permission denied or provider not found"), nil - } - return e.callList(ctx, binding.EmailProviderID, arguments) - case toolReadEmail: - binding := resolveReadBinding() - if binding == nil { - return mcpgw.BuildToolErrorResult("email read permission denied or provider not found"), nil - } - return e.callRead(ctx, binding.EmailProviderID, arguments) - default: - return nil, fmt.Errorf("unknown tool: %s", toolName) - } -} - -func (*Executor) callAccounts(_ context.Context, bindings []email.BindingResponse) (map[string]any, error) { - accounts := make([]map[string]any, 0, len(bindings)) - for _, b := range bindings { - accounts = append(accounts, map[string]any{ - "provider_id": b.EmailProviderID, - "email_address": b.EmailAddress, - "can_read": b.CanRead, - "can_write": b.CanWrite, - "can_delete": b.CanDelete, - }) - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "accounts": accounts, - }), nil -} - -func (e *Executor) callSend(ctx context.Context, botID string, providerID string, args map[string]any) (map[string]any, error) { - toRaw := mcpgw.StringArg(args, "to") - subject := mcpgw.StringArg(args, "subject") - body := mcpgw.StringArg(args, "body") - isHTML, _, _ := mcpgw.BoolArg(args, "html") - - if toRaw == "" || subject == "" || body == "" { - return mcpgw.BuildToolErrorResult("to, subject, and body are required"), nil - } - - var toList []string - for _, addr := range strings.Split(toRaw, ",") { - addr = strings.TrimSpace(addr) - if addr != "" { - toList = append(toList, addr) - } - } - - msg := email.OutboundEmail{ - To: toList, - Subject: subject, - Body: body, - HTML: 
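The resolveReadBinding and resolveWriteBinding closures above share one selection rule: walk the bot's bindings in order and return the first one whose permission flag is set, optionally pinned to an explicit provider_id. A dependency-free sketch of that rule; the BindingResponse field names are taken from the code above, everything else is illustrative:

package main

import "fmt"

// binding mirrors the permission-bearing fields of email.BindingResponse.
type binding struct {
	ProviderID string
	CanRead    bool
	CanWrite   bool
}

// firstBinding returns the first binding that passes allow, optionally
// restricted to a specific provider ID ("" matches any provider).
func firstBinding(bs []binding, providerID string, allow func(binding) bool) *binding {
	for i := range bs {
		if !allow(bs[i]) {
			continue
		}
		if providerID == "" || bs[i].ProviderID == providerID {
			return &bs[i]
		}
	}
	return nil
}

func main() {
	bs := []binding{
		{ProviderID: "p1", CanRead: true},
		{ProviderID: "p2", CanRead: true, CanWrite: true},
	}
	w := firstBinding(bs, "", func(b binding) bool { return b.CanWrite })
	fmt.Println(w.ProviderID) // p2: the first binding with write permission
}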
isHTML, - } - - messageID, err := e.manager.SendEmail(ctx, botID, providerID, msg) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - - return mcpgw.BuildToolSuccessResult(map[string]any{ - "message_id": messageID, - "status": "sent", - }), nil -} - -func (e *Executor) callList(ctx context.Context, providerID string, args map[string]any) (map[string]any, error) { - providerName, config, err := e.service.ProviderConfig(ctx, providerID) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - config = ensureProviderID(config, providerID) - return e.callListForProvider(ctx, providerName, config, args) -} - -func (e *Executor) callListForProvider(ctx context.Context, providerName email.ProviderName, config map[string]any, args map[string]any) (map[string]any, error) { - reader, err := e.service.Registry().GetMailboxReader(providerName) - if err != nil { - return mcpgw.BuildToolErrorResult("mailbox listing not supported for this provider"), nil - } - - page, _, _ := mcpgw.IntArg(args, "page") - pageSize, _, _ := mcpgw.IntArg(args, "page_size") - if pageSize <= 0 { - pageSize = 20 - } - - emails, total, err := reader.ListMailbox(ctx, config, page, pageSize) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - - summaries := make([]map[string]any, 0, len(emails)) - for _, item := range emails { - summaries = append(summaries, map[string]any{ - "uid": item.MessageID, - "from": item.From, - "subject": item.Subject, - "received_at": item.ReceivedAt, - }) - } - - return mcpgw.BuildToolSuccessResult(map[string]any{ - "emails": summaries, - "total": total, - "page": page, - }), nil -} - -func (e *Executor) callRead(ctx context.Context, providerID string, args map[string]any) (map[string]any, error) { - uidRaw, ok, _ := mcpgw.IntArg(args, "uid") - if !ok || uidRaw <= 0 { - uidStr := mcpgw.StringArg(args, "uid") - if uidStr != "" { - parsed, _ := strconv.Atoi(uidStr) - uidRaw = parsed - } - } - if uidRaw <= 0 { - return mcpgw.BuildToolErrorResult("uid is required"), nil - } - if uidRaw > math.MaxUint32 { - return mcpgw.BuildToolErrorResult("uid out of range"), nil - } - - providerName, config, err := e.service.ProviderConfig(ctx, providerID) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - config = ensureProviderID(config, providerID) - return e.callReadForProvider(ctx, providerName, config, uint32(uidRaw)) //nolint:gosec // bounds checked above -} - -func (e *Executor) callReadForProvider(ctx context.Context, providerName email.ProviderName, config map[string]any, uid uint32) (map[string]any, error) { - reader, err := e.service.Registry().GetMailboxReader(providerName) - if err != nil { - return mcpgw.BuildToolErrorResult("mailbox reading not supported for this provider"), nil - } - - item, err := reader.ReadMailbox(ctx, config, uid) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - - return mcpgw.BuildToolSuccessResult(map[string]any{ - "uid": item.MessageID, - "from": item.From, - "to": item.To, - "subject": item.Subject, - "body": item.BodyText, - "received_at": item.ReceivedAt, - }), nil -} - -func ensureProviderID(config map[string]any, providerID string) map[string]any { - if config == nil { - config = make(map[string]any) - } else { - copied := make(map[string]any, len(config)+1) - for k, v := range config { - copied[k] = v - } - config = copied - } - config["_provider_id"] = providerID - return config -} diff --git a/internal/mcp/providers/inbox/provider.go 
b/internal/mcp/providers/inbox/provider.go deleted file mode 100644 index 79b97552..00000000 --- a/internal/mcp/providers/inbox/provider.go +++ /dev/null @@ -1,149 +0,0 @@ -package inbox - -import ( - "context" - "fmt" - "log/slog" - "strings" - "time" - - inboxsvc "github.com/memohai/memoh/internal/inbox" - mcpgw "github.com/memohai/memoh/internal/mcp" -) - -const ( - toolSearchInbox = "search_inbox" - defaultSearchLimit = 20 - maxSearchLimit = 100 -) - -type Executor struct { - service *inboxsvc.Service - logger *slog.Logger -} - -func NewExecutor(log *slog.Logger, service *inboxsvc.Service) *Executor { - if log == nil { - log = slog.Default() - } - return &Executor{ - service: service, - logger: log.With(slog.String("provider", "inbox_tool")), - } -} - -func (e *Executor) ListTools(_ context.Context, _ mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - if e.service == nil { - return []mcpgw.ToolDescriptor{}, nil - } - return []mcpgw.ToolDescriptor{ - { - Name: toolSearchInbox, - Description: "Search historical inbox messages by keyword. Inbox contains messages from group conversations where the bot was not directly mentioned, as well as notifications from external sources.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "query": map[string]any{ - "type": "string", - "description": "Search keyword to match against inbox message content", - }, - "start_time": map[string]any{ - "type": "string", - "description": "ISO 8601 start time filter (e.g. 2025-01-01T00:00:00Z)", - }, - "end_time": map[string]any{ - "type": "string", - "description": "ISO 8601 end time filter", - }, - "limit": map[string]any{ - "type": "integer", - "description": "Maximum number of results (default 20, max 100)", - }, - "include_read": map[string]any{ - "type": "boolean", - "description": "Whether to include already-read items (default true)", - }, - }, - "required": []string{}, - }, - }, - }, nil -} - -func (e *Executor) CallTool(ctx context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - if toolName != toolSearchInbox { - return nil, mcpgw.ErrToolNotFound - } - if e.service == nil { - return mcpgw.BuildToolErrorResult("inbox service not available"), nil - } - - query := mcpgw.StringArg(arguments, "query") - botID := strings.TrimSpace(session.BotID) - if botID == "" { - return mcpgw.BuildToolErrorResult("bot_id is required"), nil - } - - limit := defaultSearchLimit - if value, ok, err := mcpgw.IntArg(arguments, "limit"); err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } else if ok { - limit = value - } - if limit <= 0 { - limit = defaultSearchLimit - } - if limit > maxSearchLimit { - limit = maxSearchLimit - } - - req := inboxsvc.SearchRequest{ - Query: query, - Limit: limit, - } - - if startStr := mcpgw.StringArg(arguments, "start_time"); startStr != "" { - t, err := time.Parse(time.RFC3339, startStr) - if err != nil { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("invalid start_time: %v", err)), nil - } - req.StartTime = &t - } - if endStr := mcpgw.StringArg(arguments, "end_time"); endStr != "" { - t, err := time.Parse(time.RFC3339, endStr) - if err != nil { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("invalid end_time: %v", err)), nil - } - req.EndTime = &t - } - if includeRead, ok, err := mcpgw.BoolArg(arguments, "include_read"); err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } else if ok { - req.IncludeRead = &includeRead - } - - items, err := 
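The inbox search tool above validates its arguments the same way throughout: optional RFC 3339 time filters become *time.Time (nil when absent), and the limit is defaulted and capped rather than rejected. A reduced sketch of both helpers, using the same default/max constants as the deleted provider:

package main

import (
	"fmt"
	"time"
)

const (
	defaultLimit = 20
	maxLimit     = 100
)

// clampLimit applies the executor's defaulting and ceiling: non-positive
// values fall back to the default, oversized ones are capped.
func clampLimit(n int) int {
	if n <= 0 {
		return defaultLimit
	}
	if n > maxLimit {
		return maxLimit
	}
	return n
}

// parseOptionalTime parses an RFC 3339 timestamp, returning nil for "".
func parseOptionalTime(s string) (*time.Time, error) {
	if s == "" {
		return nil, nil
	}
	t, err := time.Parse(time.RFC3339, s)
	if err != nil {
		return nil, fmt.Errorf("invalid time %q: %w", s, err)
	}
	return &t, nil
}

func main() {
	fmt.Println(clampLimit(0), clampLimit(500)) // 20 100
	start, _ := parseOptionalTime("2025-01-01T00:00:00Z")
	fmt.Println(start.UTC())
}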
e.service.Search(ctx, botID, req) - if err != nil { - e.logger.Warn("inbox search failed", slog.String("bot_id", botID), slog.Any("error", err)) - return mcpgw.BuildToolErrorResult("inbox search failed"), nil - } - - results := make([]map[string]any, 0, len(items)) - for _, item := range items { - entry := map[string]any{ - "id": item.ID, - "source": item.Source, - "header": item.Header, - "content": item.Content, - "is_read": item.IsRead, - "created_at": item.CreatedAt.Format(time.RFC3339), - } - results = append(results, entry) - } - - return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "total": len(results), - "results": results, - }), nil -} diff --git a/internal/mcp/providers/memory/provider.go b/internal/mcp/providers/memory/provider.go deleted file mode 100644 index c4019368..00000000 --- a/internal/mcp/providers/memory/provider.go +++ /dev/null @@ -1,75 +0,0 @@ -package memory - -import ( - "context" - "log/slog" - "strings" - - mcpgw "github.com/memohai/memoh/internal/mcp" - memprovider "github.com/memohai/memoh/internal/memory/adapters" - "github.com/memohai/memoh/internal/settings" -) - -// BotSettingsReader returns bot settings for provider resolution. -type BotSettingsReader interface { - GetBot(ctx context.Context, botID string) (settings.Settings, error) -} - -// Executor proxies MCP tool calls to the memory provider configured for each bot. -// If a bot has no memory provider, no tools are returned. -type Executor struct { - registry *memprovider.Registry - settingsService BotSettingsReader - logger *slog.Logger -} - -func NewExecutor(log *slog.Logger, registry *memprovider.Registry, settingsService BotSettingsReader) *Executor { - if log == nil { - log = slog.Default() - } - return &Executor{ - registry: registry, - settingsService: settingsService, - logger: log.With(slog.String("provider", "memory_tool")), - } -} - -func (e *Executor) resolveProvider(ctx context.Context, botID string) memprovider.Provider { - if e.registry == nil || e.settingsService == nil { - return nil - } - botID = strings.TrimSpace(botID) - if botID == "" { - return nil - } - botSettings, err := e.settingsService.GetBot(ctx, botID) - if err != nil { - return nil - } - providerID := strings.TrimSpace(botSettings.MemoryProviderID) - if providerID == "" { - return nil - } - p, err := e.registry.Get(providerID) - if err != nil { - e.logger.Warn("memory provider lookup failed", slog.String("provider_id", providerID), slog.Any("error", err)) - return nil - } - return p -} - -func (e *Executor) ListTools(ctx context.Context, session mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - p := e.resolveProvider(ctx, session.BotID) - if p == nil { - return []mcpgw.ToolDescriptor{}, nil - } - return p.ListTools(ctx, session) -} - -func (e *Executor) CallTool(ctx context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - p := e.resolveProvider(ctx, session.BotID) - if p == nil { - return mcpgw.BuildToolErrorResult("memory not enabled for this bot"), nil - } - return p.CallTool(ctx, session, toolName, arguments) -} diff --git a/internal/mcp/providers/memory/provider_test.go b/internal/mcp/providers/memory/provider_test.go deleted file mode 100644 index 78bbd99c..00000000 --- a/internal/mcp/providers/memory/provider_test.go +++ /dev/null @@ -1,148 +0,0 @@ -package memory - -import ( - "context" - "testing" - - mcpgw "github.com/memohai/memoh/internal/mcp" - memprovider "github.com/memohai/memoh/internal/memory/adapters" - 
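The memory executor above is a thin proxy: bot settings yield a provider ID, the registry yields a provider, and every failure along that chain quietly degrades to nil, which the callers map to "no tools" or a tool error. A hedged sketch of that fallback chain; the settings lookup here is a hypothetical stand-in for the real settings service:

package main

import "fmt"

// provider is a stand-in for memprovider.Provider.
type provider interface{ Name() string }

type registry map[string]provider

// settingsFor is a hypothetical bot-settings lookup keyed by bot ID.
var settingsFor = map[string]string{"bot1": "provider-1"}

// resolve mimics the executor's chain: a missing settings entry or an
// unregistered provider both degrade to nil rather than an error.
func resolve(r registry, botID string) provider {
	id, ok := settingsFor[botID]
	if !ok || id == "" {
		return nil
	}
	return r[id] // nil if the provider is not registered
}

type mem struct{}

func (mem) Name() string { return "fake-memory" }

func main() {
	r := registry{"provider-1": mem{}}
	fmt.Println(resolve(r, "bot1").Name()) // fake-memory
	fmt.Println(resolve(r, "bot2") == nil) // true: no settings entry
}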
"github.com/memohai/memoh/internal/settings" -) - -type fakeSettingsService struct { - settings settings.Settings - err error -} - -func (f *fakeSettingsService) GetBot(_ context.Context, _ string) (settings.Settings, error) { - if f.err != nil { - return settings.Settings{}, f.err - } - return f.settings, nil -} - -type fakeProvider struct { - tools []mcpgw.ToolDescriptor - callResp map[string]any - callErr error -} - -func (*fakeProvider) Type() string { return "fake" } -func (*fakeProvider) OnBeforeChat(_ context.Context, _ memprovider.BeforeChatRequest) (*memprovider.BeforeChatResult, error) { - return nil, nil -} - -func (*fakeProvider) OnAfterChat(_ context.Context, _ memprovider.AfterChatRequest) error { - return nil -} - -func (f *fakeProvider) ListTools(_ context.Context, _ mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - return f.tools, nil -} - -func (f *fakeProvider) CallTool(_ context.Context, _ mcpgw.ToolSessionContext, _ string, _ map[string]any) (map[string]any, error) { - return f.callResp, f.callErr -} - -func (*fakeProvider) Add(_ context.Context, _ memprovider.AddRequest) (memprovider.SearchResponse, error) { - return memprovider.SearchResponse{}, nil -} - -func (*fakeProvider) Search(_ context.Context, _ memprovider.SearchRequest) (memprovider.SearchResponse, error) { - return memprovider.SearchResponse{}, nil -} - -func (*fakeProvider) GetAll(_ context.Context, _ memprovider.GetAllRequest) (memprovider.SearchResponse, error) { - return memprovider.SearchResponse{}, nil -} - -func (*fakeProvider) Update(_ context.Context, _ memprovider.UpdateRequest) (memprovider.MemoryItem, error) { - return memprovider.MemoryItem{}, nil -} - -func (*fakeProvider) Delete(_ context.Context, _ string) (memprovider.DeleteResponse, error) { - return memprovider.DeleteResponse{}, nil -} - -func (*fakeProvider) DeleteBatch(_ context.Context, _ []string) (memprovider.DeleteResponse, error) { - return memprovider.DeleteResponse{}, nil -} - -func (*fakeProvider) DeleteAll(_ context.Context, _ memprovider.DeleteAllRequest) (memprovider.DeleteResponse, error) { - return memprovider.DeleteResponse{}, nil -} - -func (*fakeProvider) Compact(_ context.Context, _ map[string]any, _ float64, _ int) (memprovider.CompactResult, error) { - return memprovider.CompactResult{}, nil -} - -func (*fakeProvider) Usage(_ context.Context, _ map[string]any) (memprovider.UsageResponse, error) { - return memprovider.UsageResponse{}, nil -} - -func TestExecutor_ListTools_NoProvider(t *testing.T) { - exec := NewExecutor(nil, nil, nil) - tools, err := exec.ListTools(context.Background(), mcpgw.ToolSessionContext{BotID: "bot1"}) - if err != nil { - t.Fatal(err) - } - if len(tools) != 0 { - t.Errorf("expected 0 tools, got %d", len(tools)) - } -} - -func TestExecutor_ListTools_WithProvider(t *testing.T) { - registry := memprovider.NewRegistry(nil) - fp := &fakeProvider{ - tools: []mcpgw.ToolDescriptor{{Name: "search_memory", Description: "test"}}, - } - registry.Register("provider-1", fp) - - ss := &fakeSettingsService{ - settings: settings.Settings{MemoryProviderID: "provider-1"}, - } - exec := NewExecutor(nil, registry, ss) - - tools, err := exec.ListTools(context.Background(), mcpgw.ToolSessionContext{BotID: "bot1"}) - if err != nil { - t.Fatal(err) - } - if len(tools) != 1 { - t.Fatalf("expected 1 tool, got %d", len(tools)) - } - if tools[0].Name != "search_memory" { - t.Errorf("expected search_memory, got %s", tools[0].Name) - } -} - -func TestExecutor_CallTool_NoProvider(t *testing.T) { - exec := 
NewExecutor(nil, nil, nil) - result, err := exec.CallTool(context.Background(), mcpgw.ToolSessionContext{BotID: "bot1"}, "search_memory", nil) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when no provider") - } -} - -func TestExecutor_CallTool_ProxiesToProvider(t *testing.T) { - registry := memprovider.NewRegistry(nil) - fp := &fakeProvider{ - callResp: mcpgw.BuildToolSuccessResult(map[string]any{"query": "test", "total": 1}), - } - registry.Register("provider-1", fp) - - ss := &fakeSettingsService{ - settings: settings.Settings{MemoryProviderID: "provider-1"}, - } - exec := NewExecutor(nil, registry, ss) - - result, err := exec.CallTool(context.Background(), mcpgw.ToolSessionContext{BotID: "bot1"}, "search_memory", map[string]any{"query": "test"}) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); isErr { - t.Error("unexpected error result") - } -} diff --git a/internal/mcp/providers/message/provider.go b/internal/mcp/providers/message/provider.go index db6c9d7d..b7d860c3 100644 --- a/internal/mcp/providers/message/provider.go +++ b/internal/mcp/providers/message/provider.go @@ -104,7 +104,9 @@ func (p *Executor) ListTools(_ context.Context, _ mcpgw.ToolSessionContext) ([]m "attachments": map[string]any{ "type": "array", "description": "File paths or URLs to attach. Each item is a container path (e.g. /data/media/ab/file.jpg), an HTTP URL, or an object with {path, url, type, name}.", - "items": map[string]any{}, + "items": map[string]any{ + "type": "string", + }, }, "message": map[string]any{ "type": "object", diff --git a/internal/mcp/providers/message/provider_test.go b/internal/mcp/providers/message/provider_test.go deleted file mode 100644 index 5a89c1cf..00000000 --- a/internal/mcp/providers/message/provider_test.go +++ /dev/null @@ -1,583 +0,0 @@ -package message - -import ( - "context" - "errors" - "strings" - "testing" - - "github.com/memohai/memoh/internal/channel" - mcpgw "github.com/memohai/memoh/internal/mcp" -) - -type fakeSender struct { - err error - lastReq channel.SendRequest -} - -func (f *fakeSender) Send(_ context.Context, _ string, _ channel.ChannelType, req channel.SendRequest) error { - f.lastReq = req - return f.err -} - -type fakeReactor struct { - err error - lastReq channel.ReactRequest -} - -func (f *fakeReactor) React(_ context.Context, _ string, _ channel.ChannelType, req channel.ReactRequest) error { - f.lastReq = req - return f.err -} - -type fakeResolver struct { - ct channel.ChannelType - err error -} - -func (f *fakeResolver) ParseChannelType(_ string) (channel.ChannelType, error) { - if f.err != nil { - return "", f.err - } - return f.ct, nil -} - -type fakeAssetResolver struct { - getAsset AssetMeta - getErr error - ingestAsset AssetMeta - ingestErr error -} - -func (f *fakeAssetResolver) GetByStorageKey(context.Context, string, string) (AssetMeta, error) { - if f.getErr != nil { - return AssetMeta{}, f.getErr - } - if strings.TrimSpace(f.getAsset.ContentHash) != "" { - return f.getAsset, nil - } - return AssetMeta{}, errors.New("not found") -} - -func (f *fakeAssetResolver) IngestContainerFile(context.Context, string, string) (AssetMeta, error) { - if f.ingestErr != nil { - return AssetMeta{}, f.ingestErr - } - if strings.TrimSpace(f.ingestAsset.ContentHash) != "" { - return f.ingestAsset, nil - } - return AssetMeta{}, errors.New("ingest disabled") -} - -// --- send tests --- - -func TestExecutor_ListTools_NilDeps(t *testing.T) { - exec := 
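The hunk above is the only surviving (non-deleted) change in this block: the send tool's attachments array previously declared an empty items schema, which Google Gemini's function-calling validator rejects with INVALID_ARGUMENT, so the items are now typed explicitly. The pattern in isolation, using the same map[string]any schema shape as the provider:

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

func main() {
	// Before: "items": map[string]any{} (an empty schema) is rejected by
	// Gemini. After: every array property declares a concrete item type.
	schema := map[string]any{
		"type":        "array",
		"description": "File paths or URLs to attach.",
		"items":       map[string]any{"type": "string"},
	}
	enc := json.NewEncoder(os.Stdout)
	enc.SetIndent("", "  ")
	if err := enc.Encode(schema); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}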
NewExecutor(nil, nil, nil, nil, nil) - tools, err := exec.ListTools(context.Background(), mcpgw.ToolSessionContext{}) - if err != nil { - t.Fatal(err) - } - if len(tools) != 0 { - t.Errorf("expected 0 tools when deps nil, got %d", len(tools)) - } -} - -func TestExecutor_ListTools(t *testing.T) { - sender := &fakeSender{} - reactor := &fakeReactor{} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, reactor, resolver, nil) - tools, err := exec.ListTools(context.Background(), mcpgw.ToolSessionContext{}) - if err != nil { - t.Fatal(err) - } - if len(tools) != 2 { - t.Fatalf("expected 2 tools, got %d", len(tools)) - } - if tools[0].Name != toolSend { - t.Errorf("tool[0] name = %q, want %q", tools[0].Name, toolSend) - } - if tools[1].Name != toolReact { - t.Errorf("tool[1] name = %q, want %q", tools[1].Name, toolReact) - } -} - -func TestExecutor_ListTools_OnlySender(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - tools, err := exec.ListTools(context.Background(), mcpgw.ToolSessionContext{}) - if err != nil { - t.Fatal(err) - } - if len(tools) != 1 { - t.Fatalf("expected 1 tool, got %d", len(tools)) - } - if tools[0].Name != toolSend { - t.Errorf("tool name = %q, want %q", tools[0].Name, toolSend) - } -} - -func TestExecutor_CallTool_NotFound(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - _, err := exec.CallTool(context.Background(), mcpgw.ToolSessionContext{BotID: "bot1"}, "other_tool", nil) - if !errors.Is(err, mcpgw.ErrToolNotFound) { - t.Errorf("expected ErrToolNotFound, got %v", err) - } -} - -func TestExecutor_CallTool_NilDeps(t *testing.T) { - exec := NewExecutor(nil, nil, nil, nil, nil) - result, err := exec.CallTool(context.Background(), mcpgw.ToolSessionContext{BotID: "bot1"}, toolSend, map[string]any{ - "platform": "feishu", "target": "t1", "text": "hi", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error result when deps nil") - } -} - -func TestExecutor_CallTool_NoBotID(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - result, err := exec.CallTool(context.Background(), mcpgw.ToolSessionContext{}, toolSend, map[string]any{ - "platform": "feishu", "target": "t1", "text": "hi", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when bot_id is missing") - } -} - -func TestExecutor_CallTool_BotIDMismatch(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "bot_id": "other", "platform": "feishu", "target": "t1", "text": "hi", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when bot_id mismatch") - } -} - -func TestExecutor_CallTool_NoPlatform(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - 
result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "target": "t1", "text": "hi", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when platform is missing") - } -} - -func TestExecutor_CallTool_PlatformParseError(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{err: errors.New("unknown platform")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "feishu"} - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "platform": "bad", "target": "t1", "text": "hi", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when platform parse fails") - } -} - -func TestExecutor_CallTool_NoMessage(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "platform": "feishu", "target": "t1", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when message/text is missing") - } -} - -func TestExecutor_CallTool_NoTarget(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "platform": "feishu", "text": "hi", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when target and channel_identity_id are missing") - } -} - -func TestExecutor_CallTool_SendError(t *testing.T) { - sender := &fakeSender{err: errors.New("send failed")} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1", ReplyTarget: "t1"} - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "platform": "feishu", "text": "hi", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when Send fails") - } -} - -func TestExecutor_CallTool_SameRouteRejected(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "feishu", ReplyTarget: "chat1"} - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "text": "hello", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when sending to the same route as current session") - } -} - -func TestExecutor_CallTool_Success(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("feishu")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "feishu", ReplyTarget: "chat1"} - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "target": "chat2", - "text": "hello", - }) - 
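TestExecutor_CallTool_SameRouteRejected above pins down a guard: the send tool refuses to post into the exact conversation it is currently answering, since the normal reply path already covers that. A reduced sketch of such a check, with names assumed rather than taken from the executor:

package main

import "fmt"

// sameRoute reports whether an outbound send would target the exact
// platform/target pair the bot is already replying to in this session.
func sameRoute(platform, target, currentPlatform, replyTarget string) bool {
	return platform == currentPlatform && target == replyTarget
}

func main() {
	// When platform and target are omitted, the executor defaults them to
	// the session's own platform and reply target, which trips the guard.
	fmt.Println(sameRoute("feishu", "chat1", "feishu", "chat1")) // true: reject
	fmt.Println(sameRoute("feishu", "chat2", "feishu", "chat1")) // false: allowed
}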
if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - content, _ := result["structuredContent"].(map[string]any) - if content == nil { - t.Fatal("no structuredContent") - } - if content["ok"] != true { - t.Errorf("ok = %v", content["ok"]) - } - if content["platform"] != "feishu" { - t.Errorf("platform = %v", content["platform"]) - } -} - -func TestExecutor_CallTool_ReplyTo(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("telegram")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "telegram", ReplyTarget: "123"} - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "target": "456", - "text": "reply text", - "reply_to": "msg-789", - }) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - if sender.lastReq.Message.Reply == nil { - t.Fatal("expected Reply to be set") - } - if sender.lastReq.Message.Reply.MessageID != "msg-789" { - t.Errorf("Reply.MessageID = %q, want %q", sender.lastReq.Message.Reply.MessageID, "msg-789") - } -} - -func TestExecutor_CallTool_NoReplyTo(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("telegram")} - exec := NewExecutor(nil, sender, nil, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "telegram", ReplyTarget: "123"} - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "target": "456", - "text": "no reply", - }) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - if sender.lastReq.Message.Reply != nil { - t.Error("expected Reply to be nil when reply_to is not provided") - } -} - -func TestExecutor_CallTool_TopLevelAttachmentsArePreserved(t *testing.T) { - tests := []struct { - name string - attachments any - }{ - {name: "string array", attachments: []string{"https://example.com/test.jpg"}}, - {name: "single string", attachments: "https://example.com/test.jpg"}, - {name: "object", attachments: map[string]any{"url": "https://example.com/test.jpg"}}, - } - - for _, tt := range tests { - t.Run(tt.name, func(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("qq")} - exec := NewExecutor(nil, sender, nil, resolver, &fakeAssetResolver{}) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "qq"} - - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "platform": "qq", - "target": "3fe2bad9-3eae-4f23-872c-b7a63662aa00", - "text": "test.jpg from QQ", - "attachments": tt.attachments, - }) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - if len(sender.lastReq.Message.Attachments) != 1 { - t.Fatalf("expected 1 attachment, got %d", len(sender.lastReq.Message.Attachments)) - } - att := sender.lastReq.Message.Attachments[0] - if att.URL != "https://example.com/test.jpg" { - t.Fatalf("unexpected attachment url: %q", att.URL) - } - if att.Type != channel.AttachmentImage { - t.Fatalf("unexpected attachment type: %q", att.Type) - } - }) - } -} - -func TestExecutor_CallTool_AllowsEmptyTopLevelAttachmentsArray(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("qq")} - exec := NewExecutor(nil, sender, nil, resolver, &fakeAssetResolver{}) - 
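The ReplyTo/NoReplyTo pair above encodes the threading contract: the outbound message carries reply metadata only when the caller passed reply_to, and the field stays nil otherwise. A minimal sketch of that optional-pointer shape, assuming simplified message types:

package main

import "fmt"

type reply struct{ MessageID string }

type message struct {
	Text  string
	Reply *reply
}

// buildMessage attaches reply metadata only when reply_to was provided,
// mirroring the Reply == nil expectation in the tests above.
func buildMessage(text, replyTo string) message {
	m := message{Text: text}
	if replyTo != "" {
		m.Reply = &reply{MessageID: replyTo}
	}
	return m
}

func main() {
	fmt.Println(buildMessage("hi", "msg-789").Reply.MessageID) // msg-789
	fmt.Println(buildMessage("hi", "").Reply == nil)           // true
}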
session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "qq"} - - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "platform": "qq", - "target": "3fe2bad9-3eae-4f23-872c-b7a63662aa00", - "text": "hello", - "attachments": []any{}, - }) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - if len(sender.lastReq.Message.Attachments) != 0 { - t.Fatalf("expected no attachments, got %d", len(sender.lastReq.Message.Attachments)) - } -} - -func TestExecutor_CallTool_DataAttachmentsFailWhenIngestFails(t *testing.T) { - sender := &fakeSender{} - resolver := &fakeResolver{ct: channel.ChannelType("qq")} - exec := NewExecutor(nil, sender, nil, resolver, &fakeAssetResolver{ingestErr: errors.New("ingest disabled")}) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "qq"} - - result, err := exec.CallTool(context.Background(), session, toolSend, map[string]any{ - "platform": "qq", - "target": "3fe2bad9-3eae-4f23-872c-b7a63662aa00", - "text": "test.jpg from QQ", - "attachments": []string{"/data/test.jpg"}, - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Fatal("expected attachment resolution error") - } - payloadMsg := "" - if content, ok := result["content"].([]map[string]any); ok && len(content) > 0 { - payloadMsg, _ = content[0]["text"].(string) - } - if !strings.Contains(payloadMsg, "attachments could not be resolved") { - t.Fatalf("unexpected error: %v", payloadMsg) - } - if len(sender.lastReq.Message.Attachments) != 0 { - t.Fatalf("expected no outbound attachments, got %d", len(sender.lastReq.Message.Attachments)) - } -} - -// --- react tests --- - -func TestExecutor_React_NilReactor(t *testing.T) { - exec := NewExecutor(nil, nil, nil, nil, nil) - result, err := exec.CallTool(context.Background(), mcpgw.ToolSessionContext{BotID: "bot1"}, toolReact, map[string]any{ - "platform": "telegram", "target": "123", "message_id": "456", "emoji": "👍", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when reactor is nil") - } -} - -func TestExecutor_React_NoMessageID(t *testing.T) { - reactor := &fakeReactor{} - resolver := &fakeResolver{ct: channel.ChannelType("telegram")} - exec := NewExecutor(nil, nil, reactor, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "telegram", ReplyTarget: "123"} - result, err := exec.CallTool(context.Background(), session, toolReact, map[string]any{ - "emoji": "👍", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when message_id is missing") - } -} - -func TestExecutor_React_NoTarget(t *testing.T) { - reactor := &fakeReactor{} - resolver := &fakeResolver{ct: channel.ChannelType("telegram")} - exec := NewExecutor(nil, nil, reactor, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "telegram"} - result, err := exec.CallTool(context.Background(), session, toolReact, map[string]any{ - "message_id": "456", "emoji": "👍", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when target is missing") - } -} - -func TestExecutor_React_Success(t *testing.T) { - reactor := &fakeReactor{} - resolver := &fakeResolver{ct: channel.ChannelType("telegram")} - exec := NewExecutor(nil, nil, reactor, resolver, nil) - session := 
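The attachment tests above accept several argument shapes for attachments: a single string, a string array, a generic array, or an object with url/path keys, and an empty array is explicitly not an error. A sketch of flattening those shapes into plain reference strings; the real executor additionally distinguishes container paths from URLs and resolves them through the asset resolver:

package main

import "fmt"

// normalizeAttachments flattens the accepted argument shapes into a list
// of reference strings. Purely illustrative.
func normalizeAttachments(raw any) []string {
	switch v := raw.(type) {
	case nil:
		return nil
	case string:
		return []string{v}
	case []string:
		return v
	case []any:
		var out []string
		for _, item := range v {
			out = append(out, normalizeAttachments(item)...)
		}
		return out
	case map[string]any:
		if url, _ := v["url"].(string); url != "" {
			return []string{url}
		}
		if path, _ := v["path"].(string); path != "" {
			return []string{path}
		}
	}
	return nil
}

func main() {
	fmt.Println(normalizeAttachments("https://example.com/a.jpg"))
	fmt.Println(normalizeAttachments([]any{map[string]any{"url": "https://example.com/b.jpg"}}))
	fmt.Println(len(normalizeAttachments([]any{}))) // 0: empty array, not an error
}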
mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "telegram", ReplyTarget: "123"} - result, err := exec.CallTool(context.Background(), session, toolReact, map[string]any{ - "message_id": "456", "emoji": "👍", - }) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - content, _ := result["structuredContent"].(map[string]any) - if content == nil { - t.Fatal("no structuredContent") - } - if content["ok"] != true { - t.Errorf("ok = %v", content["ok"]) - } - if content["action"] != "added" { - t.Errorf("action = %v", content["action"]) - } - if content["emoji"] != "👍" { - t.Errorf("emoji = %v", content["emoji"]) - } -} - -func TestExecutor_React_Remove(t *testing.T) { - reactor := &fakeReactor{} - resolver := &fakeResolver{ct: channel.ChannelType("telegram")} - exec := NewExecutor(nil, nil, reactor, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "telegram", ReplyTarget: "123"} - result, err := exec.CallTool(context.Background(), session, toolReact, map[string]any{ - "message_id": "456", "remove": true, - }) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - content, _ := result["structuredContent"].(map[string]any) - if content["action"] != "removed" { - t.Errorf("action = %v", content["action"]) - } - if reactor.lastReq.Remove != true { - t.Error("expected Remove=true in request") - } -} - -func TestExecutor_React_Error(t *testing.T) { - reactor := &fakeReactor{err: errors.New("reaction failed")} - resolver := &fakeResolver{ct: channel.ChannelType("telegram")} - exec := NewExecutor(nil, nil, reactor, resolver, nil) - session := mcpgw.ToolSessionContext{BotID: "bot1", CurrentPlatform: "telegram", ReplyTarget: "123"} - result, err := exec.CallTool(context.Background(), session, toolReact, map[string]any{ - "message_id": "456", "emoji": "👍", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when React fails") - } -} - -// --- parseOutboundMessage tests --- - -func TestParseOutboundMessage(t *testing.T) { - tests := []struct { - name string - args map[string]any - fallback string - wantEmpty bool - wantErr bool - }{ - {"text fallback", map[string]any{}, "hello", false, false}, - {"message string", map[string]any{"message": "msg"}, "", false, false}, - {"message object", map[string]any{"message": map[string]any{"text": "obj"}}, "", false, false}, - {"empty", map[string]any{}, "", true, true}, - } - for _, tt := range tests { - t.Run(tt.name, func(t *testing.T) { - msg, err := parseOutboundMessage(tt.args, tt.fallback) - if (err != nil) != tt.wantErr { - t.Errorf("parseOutboundMessage() error = %v, wantErr %v", err, tt.wantErr) - return - } - if tt.wantEmpty && !msg.IsEmpty() { - t.Error("expected empty message") - } - if !tt.wantEmpty && msg.IsEmpty() { - t.Error("expected non-empty message") - } - }) - } -} diff --git a/internal/mcp/providers/schedule/provider.go b/internal/mcp/providers/schedule/provider.go deleted file mode 100644 index c8ad8a3a..00000000 --- a/internal/mcp/providers/schedule/provider.go +++ /dev/null @@ -1,259 +0,0 @@ -package schedule - -import ( - "context" - "log/slog" - "strings" - - mcpgw "github.com/memohai/memoh/internal/mcp" - sched "github.com/memohai/memoh/internal/schedule" -) - -const ( - toolScheduleList = "list_schedule" - toolScheduleGet = "get_schedule" - toolScheduleCreate = "create_schedule" - toolScheduleUpdate = "update_schedule" 
- toolScheduleDelete = "delete_schedule" -) - -type Scheduler interface { - List(ctx context.Context, botID string) ([]sched.Schedule, error) - Get(ctx context.Context, id string) (sched.Schedule, error) - Create(ctx context.Context, botID string, req sched.CreateRequest) (sched.Schedule, error) - Update(ctx context.Context, id string, req sched.UpdateRequest) (sched.Schedule, error) - Delete(ctx context.Context, id string) error -} - -type Executor struct { - service Scheduler - logger *slog.Logger -} - -func NewExecutor(log *slog.Logger, service Scheduler) *Executor { - if log == nil { - log = slog.Default() - } - return &Executor{ - service: service, - logger: log.With(slog.String("provider", "schedule_tool")), - } -} - -func (p *Executor) ListTools(_ context.Context, _ mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - if p.service == nil { - return []mcpgw.ToolDescriptor{}, nil - } - return []mcpgw.ToolDescriptor{ - { - Name: toolScheduleList, - Description: "List schedules for current bot", - InputSchema: emptyObjectSchema(), - }, - { - Name: toolScheduleGet, - Description: "Get a schedule by id", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "id": map[string]any{"type": "string", "description": "Schedule ID"}, - }, - "required": []string{"id"}, - }, - }, - { - Name: toolScheduleCreate, - Description: "Create a new schedule", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "name": map[string]any{"type": "string"}, - "description": map[string]any{"type": "string"}, - "pattern": map[string]any{"type": "string"}, - "max_calls": map[string]any{ - "type": []string{"integer", "null"}, - "description": "Optional max calls, null means unlimited", - }, - "enabled": map[string]any{"type": "boolean"}, - "command": map[string]any{"type": "string"}, - }, - "required": []string{"name", "description", "pattern", "command"}, - }, - }, - { - Name: toolScheduleUpdate, - Description: "Update an existing schedule", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "id": map[string]any{"type": "string"}, - "name": map[string]any{"type": "string"}, - "description": map[string]any{"type": "string"}, - "pattern": map[string]any{"type": "string"}, - "max_calls": map[string]any{"type": []string{"integer", "null"}}, - "enabled": map[string]any{"type": "boolean"}, - "command": map[string]any{"type": "string"}, - }, - "required": []string{"id"}, - }, - }, - { - Name: toolScheduleDelete, - Description: "Delete a schedule by id", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "id": map[string]any{"type": "string", "description": "Schedule ID"}, - }, - "required": []string{"id"}, - }, - }, - }, nil -} - -func (p *Executor) CallTool(ctx context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - if p.service == nil { - return mcpgw.BuildToolErrorResult("schedule service not available"), nil - } - botID := strings.TrimSpace(session.BotID) - if botID == "" { - return mcpgw.BuildToolErrorResult("bot_id is required"), nil - } - - switch toolName { - case toolScheduleList: - items, err := p.service.List(ctx, botID) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "items": items, - }), nil - case toolScheduleGet: - id := mcpgw.StringArg(arguments, "id") - if id == "" { - return mcpgw.BuildToolErrorResult("id is required"), 
nil - } - item, err := p.service.Get(ctx, id) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if item.BotID != botID { - return mcpgw.BuildToolErrorResult("bot mismatch"), nil - } - return mcpgw.BuildToolSuccessResult(item), nil - case toolScheduleCreate: - name := mcpgw.StringArg(arguments, "name") - description := mcpgw.StringArg(arguments, "description") - pattern := mcpgw.StringArg(arguments, "pattern") - command := mcpgw.StringArg(arguments, "command") - if name == "" || description == "" || pattern == "" || command == "" { - return mcpgw.BuildToolErrorResult("name, description, pattern, command are required"), nil - } - - req := sched.CreateRequest{ - Name: name, - Description: description, - Pattern: pattern, - Command: command, - } - maxCalls, err := parseNullableIntArg(arguments, "max_calls") - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.MaxCalls = maxCalls - if enabled, ok, err := mcpgw.BoolArg(arguments, "enabled"); err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } else if ok { - req.Enabled = &enabled - } - item, err := p.service.Create(ctx, botID, req) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - return mcpgw.BuildToolSuccessResult(item), nil - case toolScheduleUpdate: - id := mcpgw.StringArg(arguments, "id") - if id == "" { - return mcpgw.BuildToolErrorResult("id is required"), nil - } - req := sched.UpdateRequest{} - maxCalls, err := parseNullableIntArg(arguments, "max_calls") - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.MaxCalls = maxCalls - if value := mcpgw.StringArg(arguments, "name"); value != "" { - req.Name = &value - } - if value := mcpgw.StringArg(arguments, "description"); value != "" { - req.Description = &value - } - if value := mcpgw.StringArg(arguments, "pattern"); value != "" { - req.Pattern = &value - } - if value := mcpgw.StringArg(arguments, "command"); value != "" { - req.Command = &value - } - if enabled, ok, err := mcpgw.BoolArg(arguments, "enabled"); err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } else if ok { - req.Enabled = &enabled - } - item, err := p.service.Update(ctx, id, req) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if item.BotID != botID { - return mcpgw.BuildToolErrorResult("bot mismatch"), nil - } - return mcpgw.BuildToolSuccessResult(item), nil - case toolScheduleDelete: - id := mcpgw.StringArg(arguments, "id") - if id == "" { - return mcpgw.BuildToolErrorResult("id is required"), nil - } - item, err := p.service.Get(ctx, id) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if item.BotID != botID { - return mcpgw.BuildToolErrorResult("bot mismatch"), nil - } - if err := p.service.Delete(ctx, id); err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - return mcpgw.BuildToolSuccessResult(map[string]any{"success": true}), nil - default: - return nil, mcpgw.ErrToolNotFound - } -} - -func parseNullableIntArg(arguments map[string]any, key string) (sched.NullableInt, error) { - req := sched.NullableInt{} - if arguments == nil { - return req, nil - } - raw, exists := arguments[key] - if !exists { - return req, nil - } - req.Set = true - if raw == nil { - req.Value = nil - return req, nil - } - value, _, err := mcpgw.IntArg(arguments, key) - if err != nil { - return sched.NullableInt{}, err - } - req.Value = &value - return req, nil -} - -func emptyObjectSchema() map[string]any { - return 
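parseNullableIntArg above distinguishes three JSON states that a plain *int cannot represent once "absent" and "null" must behave differently: key absent (leave the stored value unchanged), explicit null (clear the limit, meaning unlimited), and a concrete value. A self-contained version of the tri-state pattern, noting that JSON numbers decode to float64 in a map[string]any:

package main

import "fmt"

// NullableInt mirrors the tri-state used by the schedule tools:
// Set=false means the key was absent, so leave the stored value unchanged;
// Set=true with Value=nil means explicit null, so clear the limit;
// Set=true with Value=&n means update to n.
type NullableInt struct {
	Set   bool
	Value *int
}

func parseNullable(args map[string]any, key string) (NullableInt, error) {
	raw, ok := args[key]
	if !ok {
		return NullableInt{}, nil
	}
	if raw == nil {
		return NullableInt{Set: true}, nil
	}
	f, ok := raw.(float64) // JSON numbers decode to float64
	if !ok {
		return NullableInt{}, fmt.Errorf("%s: expected integer, got %T", key, raw)
	}
	n := int(f)
	return NullableInt{Set: true, Value: &n}, nil
}

func main() {
	a, _ := parseNullable(map[string]any{}, "max_calls")
	b, _ := parseNullable(map[string]any{"max_calls": nil}, "max_calls")
	c, _ := parseNullable(map[string]any{"max_calls": float64(3)}, "max_calls")
	fmt.Println(a.Set, b.Set, b.Value == nil, *c.Value) // false true true 3
}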
map[string]any{ - "type": "object", - "properties": map[string]any{}, - } -} diff --git a/internal/mcp/providers/schedule/provider_test.go b/internal/mcp/providers/schedule/provider_test.go deleted file mode 100644 index c456726c..00000000 --- a/internal/mcp/providers/schedule/provider_test.go +++ /dev/null @@ -1,374 +0,0 @@ -package schedule - -import ( - "context" - "errors" - "testing" - "time" - - mcpgw "github.com/memohai/memoh/internal/mcp" - sched "github.com/memohai/memoh/internal/schedule" -) - -type fakeScheduler struct { - list []sched.Schedule - get sched.Schedule - getErr error - create sched.Schedule - createErr error - update sched.Schedule - updateErr error - deleteErr error -} - -func (f *fakeScheduler) List(_ context.Context, _ string) ([]sched.Schedule, error) { - return f.list, nil -} - -func (f *fakeScheduler) Get(_ context.Context, _ string) (sched.Schedule, error) { - if f.getErr != nil { - return sched.Schedule{}, f.getErr - } - return f.get, nil -} - -func (f *fakeScheduler) Create(_ context.Context, _ string, _ sched.CreateRequest) (sched.Schedule, error) { - if f.createErr != nil { - return sched.Schedule{}, f.createErr - } - return f.create, nil -} - -func (f *fakeScheduler) Update(_ context.Context, _ string, _ sched.UpdateRequest) (sched.Schedule, error) { - if f.updateErr != nil { - return sched.Schedule{}, f.updateErr - } - return f.update, nil -} - -func (f *fakeScheduler) Delete(_ context.Context, _ string) error { - return f.deleteErr -} - -func TestExecutor_ListTools_NilService(t *testing.T) { - exec := NewExecutor(nil, nil) - tools, err := exec.ListTools(context.Background(), mcpgw.ToolSessionContext{}) - if err != nil { - t.Fatal(err) - } - if len(tools) != 0 { - t.Errorf("expected 0 tools when service nil, got %d", len(tools)) - } -} - -func TestExecutor_ListTools(t *testing.T) { - svc := &fakeScheduler{} - exec := NewExecutor(nil, svc) - tools, err := exec.ListTools(context.Background(), mcpgw.ToolSessionContext{}) - if err != nil { - t.Fatal(err) - } - wantNames := []string{toolScheduleList, toolScheduleGet, toolScheduleCreate, toolScheduleUpdate, toolScheduleDelete} - if len(tools) != len(wantNames) { - t.Fatalf("expected %d tools, got %d", len(wantNames), len(tools)) - } - for i, name := range wantNames { - if tools[i].Name != name { - t.Errorf("tools[%d].Name = %q, want %q", i, tools[i].Name, name) - } - } -} - -func TestExecutor_CallTool_NotFound(t *testing.T) { - svc := &fakeScheduler{} - exec := NewExecutor(nil, svc) - _, err := exec.CallTool(context.Background(), mcpgw.ToolSessionContext{BotID: "bot1"}, "other_tool", nil) - if !errors.Is(err, mcpgw.ErrToolNotFound) { - t.Errorf("expected ErrToolNotFound, got %v", err) - } -} - -func TestExecutor_CallTool_NilService(t *testing.T) { - exec := NewExecutor(nil, nil) - result, err := exec.CallTool(context.Background(), mcpgw.ToolSessionContext{BotID: "bot1"}, toolScheduleList, nil) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when service nil") - } -} - -func TestExecutor_CallTool_NoBotID(t *testing.T) { - svc := &fakeScheduler{} - exec := NewExecutor(nil, svc) - result, err := exec.CallTool(context.Background(), mcpgw.ToolSessionContext{}, toolScheduleList, nil) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when bot_id is missing") - } -} - -func TestExecutor_CallTool_List(t *testing.T) { - svc := &fakeScheduler{ - list: []sched.Schedule{ - {ID: "id1", Name: 
"n1", BotID: "bot1"}, - }, - } - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleList, nil) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - content, _ := result["structuredContent"].(map[string]any) - if content == nil { - t.Fatal("no structuredContent") - } - items, _ := content["items"].([]sched.Schedule) - if len(items) != 1 { - t.Errorf("items length = %d", len(items)) - } -} - -func TestExecutor_CallTool_Get_IdRequired(t *testing.T) { - svc := &fakeScheduler{} - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleGet, map[string]any{}) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when id is missing") - } -} - -func TestExecutor_CallTool_Get_BotMismatch(t *testing.T) { - svc := &fakeScheduler{ - get: sched.Schedule{ID: "s1", BotID: "other-bot"}, - } - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleGet, map[string]any{"id": "s1"}) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when bot mismatch") - } -} - -func TestExecutor_CallTool_Get_Success(t *testing.T) { - svc := &fakeScheduler{ - get: sched.Schedule{ID: "s1", Name: "job1", BotID: "bot1"}, - } - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleGet, map[string]any{"id": "s1"}) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - item, ok := result["structuredContent"].(sched.Schedule) - if !ok { - t.Fatal("structuredContent is not Schedule") - } - if item.ID != "s1" { - t.Errorf("id = %v", item.ID) - } -} - -func TestExecutor_CallTool_Create_RequiredFields(t *testing.T) { - svc := &fakeScheduler{} - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleCreate, map[string]any{ - "name": "n", "description": "d", "pattern": "* * * * *", - }) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when command is missing") - } -} - -func TestExecutor_CallTool_Create_Success(t *testing.T) { - svc := &fakeScheduler{ - create: sched.Schedule{ - ID: "new1", Name: "n1", Description: "d1", Pattern: "* * * * *", Command: "echo", - BotID: "bot1", CreatedAt: time.Now(), UpdatedAt: time.Now(), - }, - } - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleCreate, map[string]any{ - "name": "n1", "description": "d1", "pattern": "* * * * *", "command": "echo", - }) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - item, ok := result["structuredContent"].(sched.Schedule) - if !ok { - t.Fatal("structuredContent is not Schedule") - } - if item.ID != "new1" { - t.Errorf("id = %v", item.ID) - } -} - -func TestExecutor_CallTool_Update_IdRequired(t *testing.T) { - svc := &fakeScheduler{} - exec := NewExecutor(nil, svc) - session := 
mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleUpdate, map[string]any{"name": "n"}) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when id is missing") - } -} - -func TestExecutor_CallTool_Update_Success(t *testing.T) { - svc := &fakeScheduler{ - update: sched.Schedule{ID: "s1", Name: "updated", BotID: "bot1"}, - } - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleUpdate, map[string]any{ - "id": "s1", "name": "updated", - }) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } -} - -func TestExecutor_CallTool_Delete_IdRequired(t *testing.T) { - svc := &fakeScheduler{} - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleDelete, map[string]any{}) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when id is missing") - } -} - -func TestExecutor_CallTool_Delete_BotMismatch(t *testing.T) { - svc := &fakeScheduler{ - get: sched.Schedule{ID: "s1", BotID: "other-bot"}, - } - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleDelete, map[string]any{"id": "s1"}) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when bot mismatch on delete") - } -} - -func TestExecutor_CallTool_Delete_Success(t *testing.T) { - svc := &fakeScheduler{ - get: sched.Schedule{ID: "s1", BotID: "bot1"}, - } - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleDelete, map[string]any{"id": "s1"}) - if err != nil { - t.Fatal(err) - } - if err := mcpgw.PayloadError(result); err != nil { - t.Fatal(err) - } - content, _ := result["structuredContent"].(map[string]any) - if content == nil { - t.Fatal("no structuredContent") - } - if success, _ := content["success"].(bool); !success { - t.Errorf("success = %v", content["success"]) - } -} - -func TestExecutor_CallTool_Get_ServiceError(t *testing.T) { - svc := &fakeScheduler{getErr: errors.New("not found")} - exec := NewExecutor(nil, svc) - session := mcpgw.ToolSessionContext{BotID: "bot1"} - result, err := exec.CallTool(context.Background(), session, toolScheduleGet, map[string]any{"id": "missing"}) - if err != nil { - t.Fatal(err) - } - if isErr, _ := result["isError"].(bool); !isErr { - t.Error("expected error when Get fails") - } -} - -func TestParseNullableIntArg(t *testing.T) { - tests := []struct { - name string - args map[string]any - key string - wantSet bool - wantVal *int - wantErr bool - }{ - {"nil args", nil, "x", false, nil, false}, - {"missing key", map[string]any{}, "x", false, nil, false}, - {"null value", map[string]any{"x": nil}, "x", true, nil, false}, - {"int value", map[string]any{"x": 5}, "x", true, intPtr(5), false}, - {"invalid type", map[string]any{"x": "bad"}, "x", false, nil, true}, - } - for _, tt := range tests { - t.Run(tt.name, func(t *testing.T) { - got, err := parseNullableIntArg(tt.args, tt.key) - if (err != nil) != tt.wantErr { - t.Errorf("parseNullableIntArg() error = %v, wantErr %v", err, tt.wantErr) - return - } 
- if got.Set != tt.wantSet { - t.Errorf("Set = %v, want %v", got.Set, tt.wantSet) - } - if tt.wantVal == nil { - if got.Value != nil { - t.Errorf("Value = %v, want nil", got.Value) - } - } else { - if got.Value == nil || *got.Value != *tt.wantVal { - t.Errorf("Value = %v, want %v", got.Value, tt.wantVal) - } - } - }) - } -} - -func TestEmptyObjectSchema(t *testing.T) { - m := emptyObjectSchema() - if m["type"] != "object" { - t.Errorf("type = %v", m["type"]) - } - if m["properties"] == nil { - t.Error("properties should be non-nil") - } -} - -func intPtr(n int) *int { - return &n -} diff --git a/internal/mcp/providers/skill/provider.go b/internal/mcp/providers/skill/provider.go deleted file mode 100644 index d4439cb6..00000000 --- a/internal/mcp/providers/skill/provider.go +++ /dev/null @@ -1,72 +0,0 @@ -package skill - -import ( - "context" - "log/slog" - - mcpgw "github.com/memohai/memoh/internal/mcp" -) - -const ( - toolUseSkill = "use_skill" -) - -type Executor struct { - logger *slog.Logger -} - -func NewExecutor(log *slog.Logger) *Executor { - if log == nil { - log = slog.Default() - } - return &Executor{ - logger: log.With(slog.String("provider", "skill_tool")), - } -} - -func (*Executor) ListTools(_ context.Context, session mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - if session.IsSubagent { - return []mcpgw.ToolDescriptor{}, nil - } - return []mcpgw.ToolDescriptor{ - { - Name: toolUseSkill, - Description: "Use a skill if you think it is relevant to the current task", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "skillName": map[string]any{ - "type": "string", - "description": "The name of the skill to use", - }, - "reason": map[string]any{ - "type": "string", - "description": "The reason why you think this skill is relevant to the current task", - }, - }, - "required": []string{"skillName", "reason"}, - }, - }, - }, nil -} - -func (*Executor) CallTool(_ context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - if toolName != toolUseSkill { - return nil, mcpgw.ErrToolNotFound - } - if session.IsSubagent { - return mcpgw.BuildToolErrorResult("skill tools are not available in subagent context"), nil - } - - skillName := mcpgw.StringArg(arguments, "skillName") - reason := mcpgw.StringArg(arguments, "reason") - if skillName == "" { - return mcpgw.BuildToolErrorResult("skillName is required"), nil - } - - return mcpgw.BuildToolSuccessResult(map[string]any{ - "success": true, - "skillName": skillName, - "reason": reason, - }), nil -} diff --git a/internal/mcp/providers/subagent/provider.go b/internal/mcp/providers/subagent/provider.go deleted file mode 100644 index 19492e54..00000000 --- a/internal/mcp/providers/subagent/provider.go +++ /dev/null @@ -1,358 +0,0 @@ -package subagent - -import ( - "bytes" - "context" - "encoding/json" - "errors" - "fmt" - "io" - "log/slog" - "net/http" - "slices" - "strings" - "time" - - "github.com/memohai/memoh/internal/db/sqlc" - mcpgw "github.com/memohai/memoh/internal/mcp" - "github.com/memohai/memoh/internal/models" - "github.com/memohai/memoh/internal/settings" - subagentsvc "github.com/memohai/memoh/internal/subagent" -) - -const ( - toolListSubagents = "list_subagents" - toolDeleteSubagent = "delete_subagent" - toolQuerySubagent = "query_subagent" - - gatewayTimeout = 120 * time.Second -) - -type Executor struct { - logger *slog.Logger - service *subagentsvc.Service - settings *settings.Service - models *models.Service - queries 
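The skill provider above (and the subagent provider after it) gates everything on session.IsSubagent: a subagent session sees an empty tool list and, if it calls the tool anyway, gets an error result instead of execution, which prevents recursive subagent spawning. The shape of that guard reduced to essentials; the real providers return an MCP tool-error payload rather than a Go error:

package main

import "fmt"

type session struct{ IsSubagent bool }

type tool struct{ Name string }

// listTools hides the entire tool surface from subagent sessions, so a
// subagent can never discover (or recursively invoke) these tools.
func listTools(s session) []tool {
	if s.IsSubagent {
		return nil
	}
	return []tool{{Name: "query_subagent"}}
}

// callTool enforces the same rule defensively at call time.
func callTool(s session, name string) (string, error) {
	if s.IsSubagent {
		return "", fmt.Errorf("%s is not available in subagent context", name)
	}
	return "ok", nil
}

func main() {
	fmt.Println(len(listTools(session{IsSubagent: true})))  // 0
	fmt.Println(len(listTools(session{IsSubagent: false}))) // 1
	_, err := callTool(session{IsSubagent: true}, "query_subagent")
	fmt.Println(err)
}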
*sqlc.Queries - gatewayBaseURL string - httpClient *http.Client -} - -func NewExecutor( - log *slog.Logger, - service *subagentsvc.Service, - settingsSvc *settings.Service, - modelsSvc *models.Service, - queries *sqlc.Queries, - gatewayBaseURL string, -) *Executor { - if log == nil { - log = slog.Default() - } - return &Executor{ - logger: log.With(slog.String("provider", "subagent_tool")), - service: service, - settings: settingsSvc, - models: modelsSvc, - queries: queries, - gatewayBaseURL: strings.TrimRight(gatewayBaseURL, "/"), - httpClient: &http.Client{Timeout: gatewayTimeout}, - } -} - -func (e *Executor) ListTools(_ context.Context, session mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - if e.service == nil { - return []mcpgw.ToolDescriptor{}, nil - } - if session.IsSubagent { - return []mcpgw.ToolDescriptor{}, nil - } - return []mcpgw.ToolDescriptor{ - { - Name: toolListSubagents, - Description: "List subagents for current bot", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{}, - }, - }, - { - Name: toolDeleteSubagent, - Description: "Delete a subagent by id", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "id": map[string]any{"type": "string", "description": "Subagent ID"}, - }, - "required": []string{"id"}, - }, - }, - { - Name: toolQuerySubagent, - Description: "Query a subagent. If the subagent does not exist it will be created automatically.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "name": map[string]any{"type": "string", "description": "The name of the subagent"}, - "description": map[string]any{"type": "string", "description": "A short description of the subagent purpose (used when creating)"}, - "query": map[string]any{"type": "string", "description": "The prompt to ask the subagent to do."}, - }, - "required": []string{"name", "description", "query"}, - }, - }, - }, nil -} - -func (e *Executor) CallTool(ctx context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - if e.service == nil { - return mcpgw.BuildToolErrorResult("subagent service not available"), nil - } - if session.IsSubagent { - return mcpgw.BuildToolErrorResult("subagent tools are not available in subagent context"), nil - } - botID := strings.TrimSpace(session.BotID) - if botID == "" { - return mcpgw.BuildToolErrorResult("bot_id is required"), nil - } - - switch toolName { - case toolListSubagents: - return e.callList(ctx, botID) - case toolDeleteSubagent: - return e.callDelete(ctx, arguments) - case toolQuerySubagent: - return e.callQuery(ctx, session, botID, arguments) - default: - return nil, mcpgw.ErrToolNotFound - } -} - -func (e *Executor) callList(ctx context.Context, botID string) (map[string]any, error) { - items, err := e.service.List(ctx, botID) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - result := make([]map[string]any, 0, len(items)) - for _, item := range items { - result = append(result, map[string]any{ - "id": item.ID, - "name": item.Name, - "description": item.Description, - }) - } - return mcpgw.BuildToolSuccessResult(map[string]any{"items": result}), nil -} - -func (e *Executor) callDelete(ctx context.Context, arguments map[string]any) (map[string]any, error) { - id := mcpgw.StringArg(arguments, "id") - if id == "" { - return mcpgw.BuildToolErrorResult("id is required"), nil - } - if err := e.service.Delete(ctx, id); err != nil { - return 
mcpgw.BuildToolErrorResult(err.Error()), nil - } - return mcpgw.BuildToolSuccessResult(map[string]any{"success": true}), nil -} - -func (e *Executor) callQuery(ctx context.Context, session mcpgw.ToolSessionContext, botID string, arguments map[string]any) (map[string]any, error) { - name := mcpgw.StringArg(arguments, "name") - description := mcpgw.StringArg(arguments, "description") - query := mcpgw.StringArg(arguments, "query") - if name == "" || description == "" || query == "" { - return mcpgw.BuildToolErrorResult("name, description, and query are required"), nil - } - - target, err := e.service.GetOrCreate(ctx, botID, subagentsvc.CreateRequest{ - Name: name, - Description: description, - }) - if err != nil { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("failed to get or create subagent: %v", err)), nil - } - - modelCfg, provider, err := e.resolveModel(ctx, botID) - if err != nil { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("failed to resolve model: %v", err)), nil - } - - gwResp, err := e.postSubagent(ctx, session, subagentGatewayRequest{ - Model: subagentModelConfig{ - ModelID: modelCfg.ModelID, - ClientType: string(modelCfg.ClientType), - Input: modelCfg.InputModalities, - APIKey: provider.ApiKey, - BaseURL: provider.BaseUrl, - }, - Identity: subagentIdentity{ - BotID: botID, - ChannelIdentityID: session.ChannelIdentityID, - CurrentPlatform: session.CurrentPlatform, - SessionToken: session.SessionToken, - }, - Messages: target.Messages, - Query: query, - Name: name, - Desc: description, - }) - if err != nil { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("subagent query failed: %v", err)), nil - } - - updatedMessages := slices.Clone(target.Messages) - updatedMessages = append(updatedMessages, gwResp.Messages...) - usage := mergeUsage(target.Usage, gwResp.Usage) - if _, err := e.service.UpdateContext(ctx, target.ID, subagentsvc.UpdateContextRequest{ - Messages: updatedMessages, - Usage: usage, - }); err != nil { - e.logger.Warn("failed to persist subagent context", slog.String("subagent_id", target.ID), slog.Any("error", err)) - } - - resultContent := gwResp.Text - if resultContent == "" && len(gwResp.Messages) > 0 { - last := gwResp.Messages[len(gwResp.Messages)-1] - if content, ok := last["content"]; ok { - resultContent = fmt.Sprintf("%v", content) - } - } - - return mcpgw.BuildToolSuccessResult(map[string]any{ - "success": true, - "result": resultContent, - }), nil -} - -func (e *Executor) resolveModel(ctx context.Context, botID string) (models.GetResponse, sqlc.LlmProvider, error) { - if e.settings == nil || e.models == nil || e.queries == nil { - return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("model resolution services not configured") - } - botSettings, err := e.settings.GetBot(ctx, botID) - if err != nil { - return models.GetResponse{}, sqlc.LlmProvider{}, err - } - chatModelID := strings.TrimSpace(botSettings.ChatModelID) - if chatModelID == "" { - return models.GetResponse{}, sqlc.LlmProvider{}, errors.New("no chat model configured for bot") - } - model, err := e.models.GetByID(ctx, chatModelID) - if err != nil { - return models.GetResponse{}, sqlc.LlmProvider{}, err - } - provider, err := models.FetchProviderByID(ctx, e.queries, model.LlmProviderID) - if err != nil { - return models.GetResponse{}, sqlc.LlmProvider{}, err - } - return model, provider, nil -} - -// --- gateway types --- - -type subagentModelConfig struct { - ModelID string `json:"modelId"` - ClientType string `json:"clientType"` - Input []string `json:"input"` - APIKey string 
`json:"apiKey"` //nolint:gosec // forwarded to agent gateway - BaseURL string `json:"baseUrl"` -} - -type subagentIdentity struct { - BotID string `json:"botId"` - ChannelIdentityID string `json:"channelIdentityId"` - CurrentPlatform string `json:"currentPlatform,omitempty"` - SessionToken string `json:"sessionToken,omitempty"` //nolint:gosec // session token forwarded to agent gateway -} - -type subagentGatewayRequest struct { - Model subagentModelConfig `json:"model"` - Identity subagentIdentity `json:"identity"` - Messages []map[string]any `json:"messages"` - Query string `json:"query"` - Name string `json:"name"` - Desc string `json:"description"` -} - -type subagentGatewayResponse struct { - Messages []map[string]any `json:"messages"` - Text string `json:"text,omitempty"` - Usage json.RawMessage `json:"usage,omitempty"` -} - -func (e *Executor) postSubagent(ctx context.Context, session mcpgw.ToolSessionContext, payload subagentGatewayRequest) (subagentGatewayResponse, error) { - url := e.gatewayBaseURL + "/chat/subagent" - body, err := json.Marshal(payload) - if err != nil { - return subagentGatewayResponse{}, err - } - - req, err := http.NewRequestWithContext(ctx, http.MethodPost, url, bytes.NewReader(body)) - if err != nil { - return subagentGatewayResponse{}, err - } - req.Header.Set("Content-Type", "application/json") - if token := strings.TrimSpace(session.SessionToken); token != "" { - req.Header.Set("Authorization", "Bearer "+token) - } - - resp, err := e.httpClient.Do(req) //nolint:gosec // URL is from operator-configured agent gateway - if err != nil { - return subagentGatewayResponse{}, err - } - defer func() { _ = resp.Body.Close() }() - - respBody, err := io.ReadAll(resp.Body) - if err != nil { - return subagentGatewayResponse{}, err - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - detail := string(respBody) - if len(detail) > 300 { - detail = detail[:300] - } - return subagentGatewayResponse{}, fmt.Errorf("agent gateway error (HTTP %d): %s", resp.StatusCode, strings.TrimSpace(detail)) - } - - var parsed subagentGatewayResponse - if err := json.Unmarshal(respBody, &parsed); err != nil { - return subagentGatewayResponse{}, fmt.Errorf("failed to parse gateway response: %w", err) - } - return parsed, nil -} - -func mergeUsage(existing map[string]any, delta json.RawMessage) map[string]any { - if existing == nil { - existing = map[string]any{} - } - if len(delta) == 0 { - return existing - } - var deltaMap map[string]any - if err := json.Unmarshal(delta, &deltaMap); err != nil { - return existing - } - for key, val := range deltaMap { - if num, ok := toFloat64(val); ok { - if prev, ok := toFloat64(existing[key]); ok { - existing[key] = prev + num - } else { - existing[key] = num - } - } - } - return existing -} - -func toFloat64(v any) (float64, bool) { - switch n := v.(type) { - case float64: - return n, true - case int: - return float64(n), true - case int64: - return float64(n), true - case json.Number: - f, err := n.Float64() - return f, err == nil - default: - return 0, false - } -} diff --git a/internal/mcp/providers/tts/provider.go b/internal/mcp/providers/tts/provider.go deleted file mode 100644 index 82123e56..00000000 --- a/internal/mcp/providers/tts/provider.go +++ /dev/null @@ -1,199 +0,0 @@ -package tts - -import ( - "context" - "encoding/base64" - "errors" - "fmt" - "log/slog" - "strings" - - "github.com/memohai/memoh/internal/channel" - mcpgw "github.com/memohai/memoh/internal/mcp" - "github.com/memohai/memoh/internal/settings" - ttspkg 
"github.com/memohai/memoh/internal/tts" -) - -const ( - toolSpeak = "speak" - maxTextLen = 500 -) - -// Sender sends outbound messages through the channel manager. -type Sender interface { - Send(ctx context.Context, botID string, channelType channel.ChannelType, req channel.SendRequest) error -} - -// ChannelTypeResolver parses platform name to channel type. -type ChannelTypeResolver interface { - ParseChannelType(raw string) (channel.ChannelType, error) -} - -type Executor struct { - logger *slog.Logger - settings *settings.Service - tts *ttspkg.Service - sender Sender - resolver ChannelTypeResolver -} - -func NewExecutor(log *slog.Logger, settingsSvc *settings.Service, ttsSvc *ttspkg.Service, sender Sender, resolver ChannelTypeResolver) *Executor { - if log == nil { - log = slog.Default() - } - return &Executor{ - logger: log.With(slog.String("provider", "speak_tool")), - settings: settingsSvc, - tts: ttsSvc, - sender: sender, - resolver: resolver, - } -} - -func (e *Executor) ListTools(ctx context.Context, session mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - if e.settings == nil || e.tts == nil || e.sender == nil || e.resolver == nil { - return nil, nil - } - botID := strings.TrimSpace(session.BotID) - if botID == "" { - return nil, nil - } - botSettings, err := e.settings.GetBot(ctx, botID) - if err != nil { - return nil, nil - } - if strings.TrimSpace(botSettings.TtsModelID) == "" { - return nil, nil - } - return []mcpgw.ToolDescriptor{ - { - Name: toolSpeak, - Description: "Send a voice message to a DIFFERENT channel or person. Synthesizes text to speech and delivers as audio. Do NOT use this for the current conversation — use block instead.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "text": map[string]any{ - "type": "string", - "description": "The text to convert to speech (max 500 characters)", - }, - "platform": map[string]any{ - "type": "string", - "description": "Channel platform name. Defaults to current session platform.", - }, - "target": map[string]any{ - "type": "string", - "description": "Channel target (chat/group/thread ID). Use get_contacts to find available targets.", - }, - "reply_to": map[string]any{ - "type": "string", - "description": "Message ID to reply to. The voice message will reference this message on the platform.", - }, - }, - "required": []string{"text"}, - }, - }, - }, nil -} - -func (e *Executor) CallTool(ctx context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - if toolName != toolSpeak { - return nil, mcpgw.ErrToolNotFound - } - - botID := strings.TrimSpace(session.BotID) - if botID == "" { - return mcpgw.BuildToolErrorResult("bot_id is required"), nil - } - - text := strings.TrimSpace(mcpgw.StringArg(arguments, "text")) - if text == "" { - return mcpgw.BuildToolErrorResult("text is required"), nil - } - if len([]rune(text)) > maxTextLen { - return mcpgw.BuildToolErrorResult("text too long, max 500 characters"), nil - } - - channelType, err := e.resolvePlatform(arguments, session) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - - target := mcpgw.FirstStringArg(arguments, "target") - if target == "" { - target = strings.TrimSpace(session.ReplyTarget) - } - if target == "" { - return mcpgw.BuildToolErrorResult("target is required"), nil - } - - // Reject when destination matches the current conversation. 
- if strings.EqualFold(channelType.String(), strings.TrimSpace(session.CurrentPlatform)) && - target == strings.TrimSpace(session.ReplyTarget) { - return mcpgw.BuildToolErrorResult( - "You are trying to speak in the SAME conversation you are already in. " + - "Do NOT use the speak tool for this. Instead, use the block in your response " + - "(e.g. Hello world).", - ), nil - } - - botSettings, err := e.settings.GetBot(ctx, botID) - if err != nil { - e.logger.Error("failed to load bot settings", slog.String("bot_id", botID), slog.Any("error", err)) - return mcpgw.BuildToolErrorResult("failed to load bot settings"), nil - } - if botSettings.TtsModelID == "" { - return mcpgw.BuildToolErrorResult("bot has no TTS model configured"), nil - } - - audioData, contentType, synthErr := e.tts.Synthesize(ctx, botSettings.TtsModelID, text, nil) - if synthErr != nil { - e.logger.Error("tts synthesis failed", slog.String("bot_id", botID), slog.Any("error", synthErr)) - return mcpgw.BuildToolErrorResult("speech synthesis failed: " + synthErr.Error()), nil - } - - dataURL := fmt.Sprintf("data:%s;base64,%s", contentType, base64.StdEncoding.EncodeToString(audioData)) - - msg := channel.Message{ - Attachments: []channel.Attachment{ - { - Type: channel.AttachmentVoice, - URL: dataURL, - Mime: contentType, - Size: int64(len(audioData)), - }, - }, - } - - if replyTo := mcpgw.FirstStringArg(arguments, "reply_to"); replyTo != "" { - msg.Reply = &channel.ReplyRef{MessageID: replyTo} - } - - sendReq := channel.SendRequest{ - Target: target, - Message: msg, - } - if err := e.sender.Send(ctx, botID, channelType, sendReq); err != nil { - e.logger.Warn("speak send failed", slog.Any("error", err), slog.String("bot_id", botID), slog.String("platform", string(channelType))) - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - - payload := map[string]any{ - "ok": true, - "bot_id": botID, - "platform": channelType.String(), - "target": target, - "instruction": "Voice message delivered successfully. You have completed your response. 
Please STOP now and do not call any more tools.", - } - return mcpgw.BuildToolSuccessResult(payload), nil -} - -func (e *Executor) resolvePlatform(arguments map[string]any, session mcpgw.ToolSessionContext) (channel.ChannelType, error) { - platform := mcpgw.FirstStringArg(arguments, "platform") - if platform == "" { - platform = strings.TrimSpace(session.CurrentPlatform) - } - if platform == "" { - return "", errors.New("platform is required") - } - return e.resolver.ParseChannelType(platform) -} diff --git a/internal/mcp/providers/web/provider.go b/internal/mcp/providers/web/provider.go deleted file mode 100644 index ea4ce4aa..00000000 --- a/internal/mcp/providers/web/provider.go +++ /dev/null @@ -1,1166 +0,0 @@ -package web - -import ( - "bytes" - "context" - "crypto/hmac" - "crypto/sha256" - "encoding/base64" - "encoding/hex" - "encoding/json" - "encoding/xml" - "fmt" - "html" - "io" - "log/slog" - "net/http" - "net/url" - "regexp" - "sort" - "strconv" - "strings" - "time" - - "github.com/memohai/memoh/internal/channel" - "github.com/memohai/memoh/internal/db/sqlc" - mcpgw "github.com/memohai/memoh/internal/mcp" - "github.com/memohai/memoh/internal/searchproviders" - "github.com/memohai/memoh/internal/settings" -) - -const ( - toolWebSearch = "web_search" -) - -type Executor struct { - logger *slog.Logger - settings *settings.Service - searchProviders *searchproviders.Service -} - -func NewExecutor(log *slog.Logger, settingsSvc *settings.Service, searchSvc *searchproviders.Service) *Executor { - if log == nil { - log = slog.Default() - } - return &Executor{ - logger: log.With(slog.String("provider", "web_tool")), - settings: settingsSvc, - searchProviders: searchSvc, - } -} - -func (p *Executor) ListTools(_ context.Context, _ mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - if p.settings == nil || p.searchProviders == nil { - return []mcpgw.ToolDescriptor{}, nil - } - return []mcpgw.ToolDescriptor{ - { - Name: toolWebSearch, - Description: "Search web results via configured search provider.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "query": map[string]any{"type": "string", "description": "Search query"}, - "count": map[string]any{"type": "integer", "description": "Number of results, default 5"}, - }, - "required": []string{"query"}, - }, - }, - }, nil -} - -func (p *Executor) CallTool(ctx context.Context, session mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - if p.settings == nil || p.searchProviders == nil { - return mcpgw.BuildToolErrorResult("web tools are not available"), nil - } - botID := strings.TrimSpace(session.BotID) - if botID == "" { - return mcpgw.BuildToolErrorResult("bot_id is required"), nil - } - botSettings, err := p.settings.GetBot(ctx, botID) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - searchProviderID := strings.TrimSpace(botSettings.SearchProviderID) - if searchProviderID == "" { - return mcpgw.BuildToolErrorResult("search provider not configured for this bot"), nil - } - provider, err := p.searchProviders.GetRawByID(ctx, searchProviderID) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - registerSearchProviderSecrets(provider) - - switch toolName { - case toolWebSearch: - return p.callWebSearch(ctx, provider.Provider, provider.Config, arguments) - default: - return nil, mcpgw.ErrToolNotFound - } -} - -func (p *Executor) callWebSearch(ctx context.Context, providerName string, configJSON []byte, 
arguments map[string]any) (map[string]any, error) { - query := strings.TrimSpace(mcpgw.StringArg(arguments, "query")) - if query == "" { - return mcpgw.BuildToolErrorResult("query is required"), nil - } - count := 5 - if value, ok, err := mcpgw.IntArg(arguments, "count"); err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } else if ok && value > 0 { - count = value - } - if count > 20 { - count = 20 - } - - switch strings.TrimSpace(providerName) { - case string(searchproviders.ProviderBrave): - return p.callBraveSearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderBing): - return p.callBingSearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderGoogle): - return p.callGoogleSearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderTavily): - return p.callTavilySearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderSogou): - return p.callSogouSearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderSerper): - return p.callSerperSearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderSearXNG): - return p.callSearXNGSearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderJina): - return p.callJinaSearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderExa): - return p.callExaSearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderBocha): - return p.callBochaSearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderDuckDuckGo): - return p.callDuckDuckGoSearch(ctx, configJSON, query, count) - case string(searchproviders.ProviderYandex): - return p.callYandexSearch(ctx, configJSON, query, count) - default: - return mcpgw.BuildToolErrorResult("unsupported search provider"), nil - } -} - -func (*Executor) callBraveSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - endpoint := strings.TrimRight(firstNonEmpty(stringValue(cfg["base_url"]), "https://api.search.brave.com/res/v1/web/search"), "/") - reqURL, err := url.Parse(endpoint) - if err != nil { - return mcpgw.BuildToolErrorResult("invalid search provider base_url"), nil - } - params := reqURL.Query() - params.Set("q", query) - params.Set("count", strconv.Itoa(count)) - reqURL.RawQuery = params.Encode() - - timeout := parseTimeout(configJSON, 15*time.Second) - client := &http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodGet, reqURL.String(), nil) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Accept", "application/json") - apiKey := stringValue(cfg["api_key"]) - if strings.TrimSpace(apiKey) != "" { - req.Header.Set("X-Subscription-Token", strings.TrimSpace(apiKey)) - } - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return buildSearchHTTPError(resp.StatusCode, body), nil - } - var raw struct { - Web struct { - Results []struct { - Title string `json:"title"` - URL string `json:"url"` - Description string `json:"description"` - } `json:"results"` - } `json:"web"` - } - if err := json.Unmarshal(body, &raw); 
err != nil { - return mcpgw.BuildToolErrorResult("invalid search response"), nil - } - results := make([]map[string]any, 0, len(raw.Web.Results)) - for _, item := range raw.Web.Results { - results = append(results, map[string]any{ - "title": item.Title, - "url": item.URL, - "description": item.Description, - }) - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "results": results, - }), nil -} - -func (*Executor) callBingSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - endpoint := strings.TrimRight(firstNonEmpty(stringValue(cfg["base_url"]), "https://api.bing.microsoft.com/v7.0/search"), "/") - reqURL, err := url.Parse(endpoint) - if err != nil { - return mcpgw.BuildToolErrorResult("invalid search provider base_url"), nil - } - params := reqURL.Query() - params.Set("q", query) - params.Set("count", strconv.Itoa(count)) - reqURL.RawQuery = params.Encode() - - timeout := parseTimeout(configJSON, 15*time.Second) - client := &http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodGet, reqURL.String(), nil) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Accept", "application/json") - apiKey := stringValue(cfg["api_key"]) - if strings.TrimSpace(apiKey) != "" { - req.Header.Set("Ocp-Apim-Subscription-Key", strings.TrimSpace(apiKey)) - } - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return buildSearchHTTPError(resp.StatusCode, body), nil - } - var raw struct { - WebPages struct { - Value []struct { - Name string `json:"name"` - URL string `json:"url"` - Snippet string `json:"snippet"` - } `json:"value"` - } `json:"webPages"` - } - if err := json.Unmarshal(body, &raw); err != nil { - return mcpgw.BuildToolErrorResult("invalid search response"), nil - } - results := make([]map[string]any, 0, len(raw.WebPages.Value)) - for _, item := range raw.WebPages.Value { - results = append(results, map[string]any{ - "title": item.Name, - "url": item.URL, - "description": item.Snippet, - }) - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "results": results, - }), nil -} - -func (*Executor) callGoogleSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - endpoint := strings.TrimRight(firstNonEmpty(stringValue(cfg["base_url"]), "https://customsearch.googleapis.com/customsearch/v1"), "/") - reqURL, err := url.Parse(endpoint) - if err != nil { - return mcpgw.BuildToolErrorResult("invalid search provider base_url"), nil - } - cx := stringValue(cfg["cx"]) - if cx == "" { - return mcpgw.BuildToolErrorResult("Google Custom Search requires cx (Search Engine ID)"), nil - } - if count > 10 { - count = 10 - } - params := reqURL.Query() - params.Set("q", query) - params.Set("cx", cx) - params.Set("num", strconv.Itoa(count)) - apiKey := stringValue(cfg["api_key"]) - if strings.TrimSpace(apiKey) != "" { - params.Set("key", strings.TrimSpace(apiKey)) - } - reqURL.RawQuery = params.Encode() - - timeout := parseTimeout(configJSON, 15*time.Second) - client := 
&http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodGet, reqURL.String(), nil) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Accept", "application/json") - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return buildSearchHTTPError(resp.StatusCode, body), nil - } - var raw struct { - Items []struct { - Title string `json:"title"` - Link string `json:"link"` - Snippet string `json:"snippet"` - } `json:"items"` - } - if err := json.Unmarshal(body, &raw); err != nil { - return mcpgw.BuildToolErrorResult("invalid search response"), nil - } - results := make([]map[string]any, 0, len(raw.Items)) - for _, item := range raw.Items { - results = append(results, map[string]any{ - "title": item.Title, - "url": item.Link, - "description": item.Snippet, - }) - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "results": results, - }), nil -} - -func (*Executor) callTavilySearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://api.tavily.com/search") - apiKey := stringValue(cfg["api_key"]) - if apiKey == "" { - return mcpgw.BuildToolErrorResult("Tavily API key is required"), nil - } - payload, _ := json.Marshal(map[string]any{ - "query": query, - "max_results": count, - }) - timeout := parseTimeout(configJSON, 15*time.Second) - client := &http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Content-Type", "application/json") - req.Header.Set("Accept", "application/json") - req.Header.Set("Authorization", "Bearer "+apiKey) - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return buildSearchHTTPError(resp.StatusCode, body), nil - } - var raw struct { - Results []struct { - Title string `json:"title"` - URL string `json:"url"` - Content string `json:"content"` - } `json:"results"` - } - if err := json.Unmarshal(body, &raw); err != nil { - return mcpgw.BuildToolErrorResult("invalid search response"), nil - } - results := make([]map[string]any, 0, len(raw.Results)) - for _, item := range raw.Results { - results = append(results, map[string]any{ - "title": item.Title, - "url": item.URL, - "description": item.Content, - }) - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "results": results, - }), nil -} - -func (*Executor) callSogouSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - host := firstNonEmpty(stringValue(cfg["base_url"]), "wsa.tencentcloudapi.com") 
- secretID := stringValue(cfg["secret_id"]) - secretKey := stringValue(cfg["secret_key"]) - if secretID == "" || secretKey == "" { - return mcpgw.BuildToolErrorResult("Sogou search requires Tencent Cloud SecretId and SecretKey"), nil - } - - action := "SearchPro" - version := "2025-05-08" - service := "wsa" - payload, _ := json.Marshal(map[string]any{ - "Query": query, - "Mode": 0, - }) - - now := time.Now().UTC() - timestamp := strconv.FormatInt(now.Unix(), 10) - date := now.Format("2006-01-02") - - hashedPayload := sha256Hex(payload) - canonicalHeaders := fmt.Sprintf("content-type:%s\nhost:%s\n", - "application/json", host) - signedHeaders := "content-type;host" - canonicalRequest := fmt.Sprintf("%s\n%s\n%s\n%s\n%s\n%s", - "POST", "/", "", canonicalHeaders, signedHeaders, hashedPayload) - - credentialScope := fmt.Sprintf("%s/%s/tc3_request", date, service) - stringToSign := fmt.Sprintf("TC3-HMAC-SHA256\n%s\n%s\n%s", - timestamp, credentialScope, sha256Hex([]byte(canonicalRequest))) - - secretDate := hmacSHA256([]byte("TC3"+secretKey), []byte(date)) - secretService := hmacSHA256(secretDate, []byte(service)) - secretSigning := hmacSHA256(secretService, []byte("tc3_request")) - signature := hex.EncodeToString(hmacSHA256(secretSigning, []byte(stringToSign))) - - authorization := fmt.Sprintf("TC3-HMAC-SHA256 Credential=%s/%s, SignedHeaders=%s, Signature=%s", - secretID, credentialScope, signedHeaders, signature) - - timeout := parseTimeout(configJSON, 15*time.Second) - client := &http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodPost, "https://"+host+"/", bytes.NewReader(payload)) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Content-Type", "application/json") - req.Header.Set("Authorization", authorization) - req.Header.Set("Host", host) - req.Header.Set("X-TC-Action", action) - req.Header.Set("X-TC-Version", version) - req.Header.Set("X-TC-Timestamp", timestamp) - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return buildSearchHTTPError(resp.StatusCode, body), nil - } - var rawResp struct { - Response struct { - Error *struct { - Code string `json:"Code"` - Message string `json:"Message"` - } `json:"Error,omitempty"` - Pages []json.RawMessage `json:"Pages"` - } `json:"Response"` - } - if err := json.Unmarshal(body, &rawResp); err != nil { - return mcpgw.BuildToolErrorResult("invalid search response"), nil - } - if rawResp.Response.Error != nil { - return mcpgw.BuildToolErrorResult("Sogou search failed: " + rawResp.Response.Error.Message), nil - } - - type sogouPage struct { - Title string `json:"title"` - URL string `json:"url"` - Passage string `json:"passage"` - Score float64 `json:"scour"` - } - var pages []sogouPage - for _, raw := range rawResp.Response.Pages { - var rawStr string - if err := json.Unmarshal(raw, &rawStr); err == nil { - var page sogouPage - if err := json.Unmarshal([]byte(rawStr), &page); err == nil { - pages = append(pages, page) - } - } else { - var page sogouPage - if err := json.Unmarshal(raw, &page); err == nil { - pages = append(pages, page) - } - } - } - sort.Slice(pages, func(i, j int) bool { - return 
pages[i].Score > pages[j].Score - }) - results := make([]map[string]any, 0, len(pages)) - for i, page := range pages { - if i >= count { - break - } - results = append(results, map[string]any{ - "title": page.Title, - "url": page.URL, - "description": page.Passage, - }) - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "results": results, - }), nil -} - -func sha256Hex(data []byte) string { - h := sha256.Sum256(data) - return hex.EncodeToString(h[:]) -} - -func hmacSHA256(key, data []byte) []byte { - h := hmac.New(sha256.New, key) - h.Write(data) - return h.Sum(nil) -} - -func (*Executor) callSerperSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://google.serper.dev/search") - apiKey := stringValue(cfg["api_key"]) - if apiKey == "" { - return mcpgw.BuildToolErrorResult("Serper API key is required"), nil - } - payload, _ := json.Marshal(map[string]any{ - "q": query, - }) - timeout := parseTimeout(configJSON, 15*time.Second) - client := &http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Content-Type", "application/json") - req.Header.Set("Accept", "application/json") - req.Header.Set("X-API-KEY", apiKey) - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return buildSearchHTTPError(resp.StatusCode, body), nil - } - var raw struct { - Organic []struct { - Title string `json:"title"` - Link string `json:"link"` - Description string `json:"description"` - Position int `json:"position"` - } `json:"organic"` - } - if err := json.Unmarshal(body, &raw); err != nil { - return mcpgw.BuildToolErrorResult("invalid search response"), nil - } - sort.Slice(raw.Organic, func(i, j int) bool { - return raw.Organic[i].Position < raw.Organic[j].Position - }) - results := make([]map[string]any, 0, len(raw.Organic)) - for i, item := range raw.Organic { - if i >= count { - break - } - results = append(results, map[string]any{ - "title": item.Title, - "url": item.Link, - "description": item.Description, - }) - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "results": results, - }), nil -} - -func (*Executor) callSearXNGSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - baseURL := stringValue(cfg["base_url"]) - if baseURL == "" { - return mcpgw.BuildToolErrorResult("SearXNG base URL is required"), nil - } - reqURL, err := url.Parse(strings.TrimRight(baseURL, "/")) - if err != nil { - return mcpgw.BuildToolErrorResult("invalid SearXNG base_url"), nil - } - params := reqURL.Query() - params.Set("q", query) - params.Set("format", "json") - params.Set("pageno", "1") - if lang := stringValue(cfg["language"]); lang != "" { - params.Set("language", lang) - } - if ss := stringValue(cfg["safesearch"]); ss != "" { - params.Set("safesearch", ss) - } - if cats := stringValue(cfg["categories"]); cats != "" { - 
params.Set("categories", cats) - } - reqURL.RawQuery = params.Encode() - - timeout := parseTimeout(configJSON, 15*time.Second) - client := &http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodGet, reqURL.String(), nil) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Accept", "application/json") - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return buildSearchHTTPError(resp.StatusCode, body), nil - } - var raw struct { - Results []struct { - Title string `json:"title"` - URL string `json:"url"` - Content string `json:"content"` - Score float64 `json:"score"` - } `json:"results"` - } - if err := json.Unmarshal(body, &raw); err != nil { - return mcpgw.BuildToolErrorResult("invalid search response"), nil - } - sort.Slice(raw.Results, func(i, j int) bool { - return raw.Results[i].Score > raw.Results[j].Score - }) - results := make([]map[string]any, 0, len(raw.Results)) - for i, item := range raw.Results { - if i >= count { - break - } - results = append(results, map[string]any{ - "title": item.Title, - "url": item.URL, - "description": item.Content, - }) - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "results": results, - }), nil -} - -func (*Executor) callJinaSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://s.jina.ai/") - apiKey := stringValue(cfg["api_key"]) - if apiKey == "" { - return mcpgw.BuildToolErrorResult("Jina API key is required"), nil - } - if count > 10 { - count = 10 - } - payload, _ := json.Marshal(map[string]any{ - "q": query, - "count": count, - }) - timeout := parseTimeout(configJSON, 15*time.Second) - client := &http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Content-Type", "application/json") - req.Header.Set("Accept", "application/json") - req.Header.Set("X-Retain-Images", "none") - req.Header.Set("Authorization", apiKey) - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return buildSearchHTTPError(resp.StatusCode, body), nil - } - var raw struct { - Data []struct { - Title string `json:"title"` - URL string `json:"url"` - Content string `json:"content"` - } `json:"data"` - } - if err := json.Unmarshal(body, &raw); err != nil { - return mcpgw.BuildToolErrorResult("invalid search response"), nil - } - results := make([]map[string]any, 0, len(raw.Data)) - for _, item := range raw.Data { - results = append(results, map[string]any{ - "title": item.Title, - "url": item.URL, - "description": item.Content, - }) - } - 
return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "results": results, - }), nil -} - -func (*Executor) callExaSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://api.exa.ai/search") - apiKey := stringValue(cfg["api_key"]) - if apiKey == "" { - return mcpgw.BuildToolErrorResult("Exa API key is required"), nil - } - payload, _ := json.Marshal(map[string]any{ - "query": query, - "numResults": count, - "contents": map[string]any{ - "text": true, - "highlights": true, - }, - "type": "auto", - }) - timeout := parseTimeout(configJSON, 15*time.Second) - client := &http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Content-Type", "application/json") - req.Header.Set("Accept", "application/json") - req.Header.Set("Authorization", "Bearer "+apiKey) - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return buildSearchHTTPError(resp.StatusCode, body), nil - } - var raw struct { - Results []struct { - Title string `json:"title"` - URL string `json:"url"` - Text string `json:"text"` - } `json:"results"` - } - if err := json.Unmarshal(body, &raw); err != nil { - return mcpgw.BuildToolErrorResult("invalid search response"), nil - } - results := make([]map[string]any, 0, len(raw.Results)) - for _, item := range raw.Results { - results = append(results, map[string]any{ - "title": item.Title, - "url": item.URL, - "description": item.Text, - }) - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "results": results, - }), nil -} - -func (*Executor) callBochaSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://api.bochaai.com/v1/web-search") - apiKey := stringValue(cfg["api_key"]) - if apiKey == "" { - return mcpgw.BuildToolErrorResult("Bocha API key is required"), nil - } - payload, _ := json.Marshal(map[string]any{ - "query": query, - "summary": true, - "freshness": "noLimit", - "count": count, - }) - timeout := parseTimeout(configJSON, 15*time.Second) - client := &http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Content-Type", "application/json") - req.Header.Set("Accept", "application/json") - req.Header.Set("Authorization", "Bearer "+apiKey) - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return 
buildSearchHTTPError(resp.StatusCode, body), nil
-	}
-	var raw struct {
-		Data struct {
-			WebPages struct {
-				Value []struct {
-					Name    string `json:"name"`
-					URL     string `json:"url"`
-					Summary string `json:"summary"`
-				} `json:"value"`
-			} `json:"webPages"`
-		} `json:"data"`
-	}
-	if err := json.Unmarshal(body, &raw); err != nil {
-		return mcpgw.BuildToolErrorResult("invalid search response"), nil
-	}
-	results := make([]map[string]any, 0, len(raw.Data.WebPages.Value))
-	for _, item := range raw.Data.WebPages.Value {
-		results = append(results, map[string]any{
-			"title":       item.Name,
-			"url":         item.URL,
-			"description": item.Summary,
-		})
-	}
-	return mcpgw.BuildToolSuccessResult(map[string]any{
-		"query":   query,
-		"results": results,
-	}), nil
-}
-
-func (*Executor) callDuckDuckGoSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) {
-	cfg := parseConfig(configJSON)
-	endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://html.duckduckgo.com/html/")
-
-	timeout := parseTimeout(configJSON, 15*time.Second)
-	client := &http.Client{Timeout: timeout}
-	form := url.Values{}
-	form.Set("q", query)
-	form.Set("b", "")
-	form.Set("kl", "")
-	req, err := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, strings.NewReader(form.Encode()))
-	if err != nil {
-		return mcpgw.BuildToolErrorResult(err.Error()), nil
-	}
-	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")
-	req.Header.Set("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36")
-	resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design
-	if err != nil {
-		return mcpgw.BuildToolErrorResult(err.Error()), nil
-	}
-	defer func() { _ = resp.Body.Close() }()
-	body, err := io.ReadAll(resp.Body)
-	if err != nil {
-		return mcpgw.BuildToolErrorResult(err.Error()), nil
-	}
-	if resp.StatusCode < 200 || resp.StatusCode >= 300 {
-		return buildSearchHTTPError(resp.StatusCode, body), nil
-	}
-
-	htmlStr := string(body)
-	links := ddgResultLinkRe.FindAllStringSubmatch(htmlStr, -1)
-	titles := ddgResultTitleRe.FindAllStringSubmatch(htmlStr, -1)
-	snippets := ddgResultSnippetRe.FindAllStringSubmatch(htmlStr, -1)
-
-	n := len(links)
-	if len(titles) < n {
-		n = len(titles)
-	}
-	if count < n {
-		n = count
-	}
-
-	results := make([]map[string]any, 0, n)
-	for i := 0; i < n; i++ {
-		rawURL := html.UnescapeString(links[i][1])
-		realURL := extractDDGURL(rawURL)
-		title := html.UnescapeString(strings.TrimSpace(titles[i][1]))
-		snippet := ""
-		if i < len(snippets) {
-			snippet = html.UnescapeString(strings.TrimSpace(ddgHTMLTagRe.ReplaceAllString(snippets[i][1], "")))
-		}
-		if realURL == "" {
-			continue
-		}
-		results = append(results, map[string]any{
-			"title":       title,
-			"url":         realURL,
-			"description": snippet,
-		})
-	}
-	return mcpgw.BuildToolSuccessResult(map[string]any{
-		"query":   query,
-		"results": results,
-	}), nil
-}
-
-var (
-	ddgResultLinkRe    = regexp.MustCompile(`class="result__a"[^>]*href="([^"]+)"`)
-	ddgResultTitleRe   = regexp.MustCompile(`class="result__a"[^>]*>([^<]+)<`)
-	ddgResultSnippetRe = regexp.MustCompile(`class="result__snippet"[^>]*>([\s\S]*?)</a>`)
-	ddgHTMLTagRe       = regexp.MustCompile(`<[^>]*>`)
-)
-
-func extractDDGURL(rawURL string) string {
-	if strings.Contains(rawURL, "uddg=") {
-		parsed, err := url.Parse(rawURL)
-		if err == nil {
-			if uddg := parsed.Query().Get("uddg"); uddg != "" {
-				return uddg
-			}
-		}
-	}
-	if strings.HasPrefix(rawURL, "//") {
-		return "https:" + rawURL
-
} - return rawURL -} - -func (*Executor) callYandexSearch(ctx context.Context, configJSON []byte, query string, count int) (map[string]any, error) { - cfg := parseConfig(configJSON) - endpoint := firstNonEmpty(stringValue(cfg["base_url"]), "https://searchapi.api.cloud.yandex.net/v2/web/search") - apiKey := stringValue(cfg["api_key"]) - if apiKey == "" { - return mcpgw.BuildToolErrorResult("Yandex API key is required"), nil - } - searchType := firstNonEmpty(stringValue(cfg["search_type"]), "SEARCH_TYPE_RU") - payload, _ := json.Marshal(map[string]any{ - "query": map[string]any{ - "queryText": query, - "searchType": searchType, - }, - "groupSpec": map[string]any{ - "groupMode": "GROUP_MODE_DEEP", - "groupsOnPage": count, - "docsInGroup": 1, - }, - }) - timeout := parseTimeout(configJSON, 15*time.Second) - client := &http.Client{Timeout: timeout} - req, err := http.NewRequestWithContext(ctx, http.MethodPost, endpoint, bytes.NewReader(payload)) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - req.Header.Set("Content-Type", "application/json") - req.Header.Set("Authorization", "Api-Key "+apiKey) - resp, err := client.Do(req) //nolint:gosec // G704: web browsing tool intentionally fetches user-specified URLs; SSRF is by design - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return buildSearchHTTPError(resp.StatusCode, body), nil - } - var rawResp struct { - RawData string `json:"rawData"` - } - if err := json.Unmarshal(body, &rawResp); err != nil { - return mcpgw.BuildToolErrorResult("invalid search response"), nil - } - xmlData, err := base64.StdEncoding.DecodeString(rawResp.RawData) - if err != nil { - return mcpgw.BuildToolErrorResult("failed to decode Yandex response"), nil - } - results, err := parseYandexXML(xmlData) - if err != nil { - return mcpgw.BuildToolErrorResult("failed to parse Yandex XML response"), nil - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "query": query, - "results": results, - }), nil -} - -type xmlInnerText string - -func (t *xmlInnerText) UnmarshalXML(d *xml.Decoder, _ xml.StartElement) error { - var buf strings.Builder - for { - tok, err := d.Token() - if err != nil { - break - } - switch v := tok.(type) { - case xml.CharData: - buf.Write(v) - case xml.StartElement: - var inner xmlInnerText - if err := d.DecodeElement(&inner, &v); err != nil { - return err - } - buf.WriteString(string(inner)) - case xml.EndElement: - *t = xmlInnerText(buf.String()) - return nil - } - } - *t = xmlInnerText(buf.String()) - return nil -} - -type yandexResponse struct { - XMLName xml.Name `xml:"response"` - Results yandexResults `xml:"results"` -} - -type yandexResults struct { - Grouping yandexGrouping `xml:"grouping"` -} - -type yandexGrouping struct { - Groups []yandexGroup `xml:"group"` -} - -type yandexGroup struct { - Doc yandexDoc `xml:"doc"` -} - -type yandexDoc struct { - URL xmlInnerText `xml:"url"` - Title xmlInnerText `xml:"title"` - Passages yandexPassages `xml:"passages"` -} - -type yandexPassages struct { - Passage []xmlInnerText `xml:"passage"` -} - -func parseYandexXML(data []byte) ([]map[string]any, error) { - var resp yandexResponse - if err := xml.Unmarshal(data, &resp); err != nil { - return nil, err - } - results := make([]map[string]any, 0, len(resp.Results.Grouping.Groups)) - for _, 
group := range resp.Results.Grouping.Groups { - snippet := "" - if len(group.Doc.Passages.Passage) > 0 { - snippet = string(group.Doc.Passages.Passage[0]) - } - results = append(results, map[string]any{ - "title": string(group.Doc.Title), - "url": string(group.Doc.URL), - "description": snippet, - }) - } - return results, nil -} - -// buildSearchHTTPError builds an error result for non-2xx search API responses. -// It includes the HTTP status code and attempts to extract a brief error detail -// from the response body (capped at 200 characters to avoid context blowout). -func buildSearchHTTPError(statusCode int, body []byte) map[string]any { - detail := extractJSONErrorMessage(body) - if detail == "" { - detail = strings.TrimSpace(string(body)) - } - if len(detail) > 200 { - detail = detail[:200] + "..." - } - if detail != "" { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("search request failed (HTTP %d): %s", statusCode, detail)) - } - return mcpgw.BuildToolErrorResult(fmt.Sprintf("search request failed (HTTP %d)", statusCode)) -} - -// extractJSONErrorMessage probes common JSON error response patterns and returns -// the first human-readable message found, or "" if none. -func extractJSONErrorMessage(body []byte) string { - var obj map[string]any - if json.Unmarshal(body, &obj) != nil { - return "" - } - for _, key := range []string{"error", "message", "detail", "error_message"} { - v, ok := obj[key] - if !ok { - continue - } - switch val := v.(type) { - case string: - return val - case map[string]any: - if msg, ok := val["message"].(string); ok { - return msg - } - } - } - return "" -} - -func parseTimeout(configJSON []byte, fallback time.Duration) time.Duration { - cfg := parseConfig(configJSON) - raw, ok := cfg["timeout_seconds"] - if !ok { - return fallback - } - switch value := raw.(type) { - case float64: - if value > 0 { - return time.Duration(value * float64(time.Second)) - } - case int: - if value > 0 { - return time.Duration(value) * time.Second - } - } - return fallback -} - -func parseConfig(configJSON []byte) map[string]any { - if len(configJSON) == 0 { - return map[string]any{} - } - var cfg map[string]any - if err := json.Unmarshal(configJSON, &cfg); err != nil || cfg == nil { - return map[string]any{} - } - return cfg -} - -func stringValue(raw any) string { - if value, ok := raw.(string); ok { - return strings.TrimSpace(value) - } - return "" -} - -func firstNonEmpty(values ...string) string { - for _, value := range values { - if strings.TrimSpace(value) != "" { - return strings.TrimSpace(value) - } - } - return "" -} - -// searchProviderSecretFields are config keys known to hold credentials. -var searchProviderSecretFields = []string{"api_key", "secret_id", "secret_key"} - -func registerSearchProviderSecrets(provider sqlc.SearchProvider) { - cfg := parseConfig(provider.Config) - var secrets []string - for _, key := range searchProviderSecretFields { - if v := stringValue(cfg[key]); v != "" { - secrets = append(secrets, v) - } - } - if len(secrets) > 0 { - channel.SetIMErrorSecrets("search:"+provider.ID.String(), secrets...) 
- } -} diff --git a/internal/mcp/providers/webfetch/provider.go b/internal/mcp/providers/webfetch/provider.go deleted file mode 100644 index b2bff4db..00000000 --- a/internal/mcp/providers/webfetch/provider.go +++ /dev/null @@ -1,219 +0,0 @@ -package webfetch - -import ( - "context" - "encoding/json" - "fmt" - "io" - "log/slog" - "net/http" - "net/url" - "strings" - "time" - - htmltomarkdown "github.com/JohannesKaufmann/html-to-markdown/v2" - readability "github.com/go-shiori/go-readability" - - mcpgw "github.com/memohai/memoh/internal/mcp" -) - -const ( - toolWebFetch = "web_fetch" - maxTextContent = 10000 - fetchTimeout = 30 * time.Second - userAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36" -) - -type Executor struct { - logger *slog.Logger - client *http.Client -} - -func NewExecutor(log *slog.Logger) *Executor { - if log == nil { - log = slog.Default() - } - return &Executor{ - logger: log.With(slog.String("provider", "webfetch_tool")), - client: &http.Client{Timeout: fetchTimeout}, - } -} - -func (*Executor) ListTools(_ context.Context, _ mcpgw.ToolSessionContext) ([]mcpgw.ToolDescriptor, error) { - return []mcpgw.ToolDescriptor{ - { - Name: toolWebFetch, - Description: "Fetch a URL and convert the response to readable content. Supports HTML (converts to Markdown), JSON, XML, and plain text formats.", - InputSchema: map[string]any{ - "type": "object", - "properties": map[string]any{ - "url": map[string]any{ - "type": "string", - "description": "The URL to fetch", - }, - "format": map[string]any{ - "type": "string", - "enum": []string{"auto", "markdown", "json", "xml", "text"}, - "description": "Output format (default: auto - detects from content type)", - }, - }, - "required": []string{"url"}, - }, - }, - }, nil -} - -func (e *Executor) CallTool(ctx context.Context, _ mcpgw.ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) { - if toolName != toolWebFetch { - return nil, mcpgw.ErrToolNotFound - } - - rawURL := strings.TrimSpace(mcpgw.StringArg(arguments, "url")) - if rawURL == "" { - return mcpgw.BuildToolErrorResult("url is required"), nil - } - format := strings.TrimSpace(mcpgw.StringArg(arguments, "format")) - if format == "" { - format = "auto" - } - - return e.callWebFetch(ctx, rawURL, format) -} - -func (e *Executor) callWebFetch(ctx context.Context, rawURL, format string) (map[string]any, error) { - req, err := http.NewRequestWithContext(ctx, http.MethodGet, rawURL, nil) - if err != nil { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("invalid url: %v", err)), nil - } - req.Header.Set("User-Agent", userAgent) - - resp, err := e.client.Do(req) //nolint:gosec // intentionally fetches user-specified URLs - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - defer func() { _ = resp.Body.Close() }() - - if resp.StatusCode < 200 || resp.StatusCode >= 300 { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("HTTP error: %d %s", resp.StatusCode, resp.Status)), nil - } - - contentType := resp.Header.Get("Content-Type") - detected := format - if format == "auto" { - detected = detectFormat(contentType) - } - - body, err := io.ReadAll(resp.Body) - if err != nil { - return mcpgw.BuildToolErrorResult(err.Error()), nil - } - - switch detected { - case "json": - return e.processJSON(rawURL, contentType, body) - case "xml": - return e.processXML(rawURL, contentType, body) - case "markdown": - return e.processHTML(rawURL, contentType, body) - default: - return 
e.processText(rawURL, contentType, body) - } -} - -func detectFormat(contentType string) string { - ct := strings.ToLower(contentType) - switch { - case strings.Contains(ct, "application/json"): - return "json" - case strings.Contains(ct, "application/xml"), strings.Contains(ct, "text/xml"): - return "xml" - case strings.Contains(ct, "text/html"): - return "markdown" - default: - return "text" - } -} - -func (*Executor) processJSON(fetchedURL, contentType string, body []byte) (map[string]any, error) { - var data any - if err := json.Unmarshal(body, &data); err != nil { - return mcpgw.BuildToolErrorResult("Failed to parse JSON"), nil - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "success": true, - "url": fetchedURL, - "format": "json", - "contentType": contentType, - "data": data, - }), nil -} - -func (*Executor) processXML(fetchedURL, contentType string, body []byte) (map[string]any, error) { - content := string(body) - if len(content) > maxTextContent { - content = content[:maxTextContent] - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "success": true, - "url": fetchedURL, - "format": "xml", - "contentType": contentType, - "content": content, - }), nil -} - -func (e *Executor) processHTML(fetchedURL, contentType string, body []byte) (map[string]any, error) { - parsed, err := url.Parse(fetchedURL) - if err != nil { - parsed = &url.URL{} - } - - article, err := readability.FromReader(strings.NewReader(string(body)), parsed) - if err != nil { - return mcpgw.BuildToolErrorResult(fmt.Sprintf("Failed to extract readable content from HTML: %v", err)), nil - } - - if strings.TrimSpace(article.Content) == "" { - return mcpgw.BuildToolErrorResult("Failed to extract readable content from HTML"), nil - } - - markdown, err := htmltomarkdown.ConvertString(article.Content) - if err != nil { - e.logger.Warn("html-to-markdown conversion failed, falling back to text", slog.Any("error", err)) - markdown = article.TextContent - } - - textPreview := article.TextContent - if len(textPreview) > 500 { - textPreview = textPreview[:500] - } - - return mcpgw.BuildToolSuccessResult(map[string]any{ - "success": true, - "url": fetchedURL, - "format": "markdown", - "contentType": contentType, - "title": article.Title, - "byline": article.Byline, - "excerpt": article.Excerpt, - "content": markdown, - "textContent": textPreview, - "length": article.Length, - }), nil -} - -func (*Executor) processText(fetchedURL, contentType string, body []byte) (map[string]any, error) { - content := string(body) - length := len(content) - if length > maxTextContent { - content = content[:maxTextContent] - } - return mcpgw.BuildToolSuccessResult(map[string]any{ - "success": true, - "url": fetchedURL, - "format": "text", - "contentType": contentType, - "content": content, - "length": length, - }), nil -} diff --git a/internal/mcp/tool_gateway_service.go b/internal/mcp/tool_gateway_service.go index 2a55216d..9e96fa09 100644 --- a/internal/mcp/tool_gateway_service.go +++ b/internal/mcp/tool_gateway_service.go @@ -18,27 +18,22 @@ type cachedToolRegistry struct { registry *ToolRegistry } -// ToolGatewayService federates tools from executors and sources. +// ToolGatewayService federates tools from external MCP sources (federation). +// Built-in tools are no longer managed here — they are loaded directly +// via agent ToolProviders. 
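Narrowing the gateway to federation-only sources changes every construction site. The updated tests later in this patch show the shape of the migration; a compressed before/after sketch, with `provider` standing in for any value that satisfies the new `ToolSource` interface:

```go
// Before this patch: built-in executors and federation sources were separate arguments.
service := NewToolGatewayService(slog.Default(), []ToolExecutor{provider}, nil)

// After this patch: ToolExecutor is removed; every federated provider is a ToolSource.
service := NewToolGatewayService(slog.Default(), []ToolSource{provider})
```

Registry lookups now resolve to a `ToolSource` exactly as they previously resolved to a `ToolExecutor`, so `ListTools` and `CallTool` call sites are otherwise unchanged.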
type ToolGatewayService struct { - logger *slog.Logger - executors []ToolExecutor - sources []ToolSource - cacheTTL time.Duration + logger *slog.Logger + sources []ToolSource + cacheTTL time.Duration mu sync.Mutex cache map[string]cachedToolRegistry } -func NewToolGatewayService(log *slog.Logger, executors []ToolExecutor, sources []ToolSource) *ToolGatewayService { +func NewToolGatewayService(log *slog.Logger, sources []ToolSource) *ToolGatewayService { if log == nil { log = slog.Default() } - filteredExecutors := make([]ToolExecutor, 0, len(executors)) - for _, executor := range executors { - if executor != nil { - filteredExecutors = append(filteredExecutors, executor) - } - } filteredSources := make([]ToolSource, 0, len(sources)) for _, source := range sources { if source != nil { @@ -46,11 +41,10 @@ func NewToolGatewayService(log *slog.Logger, executors []ToolExecutor, sources [ } } return &ToolGatewayService{ - logger: log.With(slog.String("service", "tool_gateway")), - executors: filteredExecutors, - sources: filteredSources, - cacheTTL: defaultToolRegistryCacheTTL, - cache: map[string]cachedToolRegistry{}, + logger: log.With(slog.String("service", "tool_gateway")), + sources: filteredSources, + cacheTTL: defaultToolRegistryCacheTTL, + cache: map[string]cachedToolRegistry{}, } } @@ -87,14 +81,13 @@ func (s *ToolGatewayService) CallTool(ctx context.Context, session ToolSessionCo if err != nil { return nil, err } - executor, _, ok := registry.Lookup(toolName) + source, _, ok := registry.Lookup(toolName) if !ok { - // Refresh once for dynamic executors/sources. registry, err = s.getRegistry(ctx, session, true) if err != nil { return nil, err } - executor, _, ok = registry.Lookup(toolName) + source, _, ok = registry.Lookup(toolName) if !ok { return BuildToolErrorResult("tool not found: " + toolName), nil } @@ -104,7 +97,7 @@ func (s *ToolGatewayService) CallTool(ctx context.Context, session ToolSessionCo if arguments == nil { arguments = map[string]any{} } - result, err := executor.CallTool(ctx, session, toolName, arguments) + result, err := source.CallTool(ctx, session, toolName, arguments) if err != nil { if errors.Is(err, ErrToolNotFound) { return BuildToolErrorResult("tool not found: " + toolName), nil @@ -133,18 +126,6 @@ func (s *ToolGatewayService) getRegistry(ctx context.Context, session ToolSessio } registry := NewToolRegistry() - for _, executor := range s.executors { - tools, err := executor.ListTools(ctx, session) - if err != nil { - s.logger.Warn("list tools from executor failed", slog.Any("error", err)) - continue - } - for _, tool := range tools { - if err := registry.Register(executor, tool); err != nil { - s.logger.Warn("skip duplicated/invalid tool", slog.String("tool", tool.Name), slog.Any("error", err)) - } - } - } for _, source := range s.sources { tools, err := source.ListTools(ctx, session) if err != nil { diff --git a/internal/mcp/tool_gateway_service_test.go b/internal/mcp/tool_gateway_service_test.go index 46cfb807..d4b4882b 100644 --- a/internal/mcp/tool_gateway_service_test.go +++ b/internal/mcp/tool_gateway_service_test.go @@ -40,7 +40,7 @@ func TestToolGatewayServiceListTools(t *testing.T) { {Name: "dup_tool", InputSchema: map[string]any{"type": "object"}}, }, } - service := NewToolGatewayService(slog.Default(), []ToolExecutor{providerA, providerB}, nil) + service := NewToolGatewayService(slog.Default(), []ToolSource{providerA, providerB}) tools, err := service.ListTools(context.Background(), ToolSessionContext{BotID: "bot-1"}) if err != nil { @@ -65,7 +65,7 
@@ func TestToolGatewayServiceCallToolSuccess(t *testing.T) { }, callErr: map[string]error{}, } - service := NewToolGatewayService(slog.Default(), []ToolExecutor{provider}, nil) + service := NewToolGatewayService(slog.Default(), []ToolSource{provider}) result, err := service.CallTool(context.Background(), ToolSessionContext{BotID: "bot-1"}, ToolCallPayload{ Name: "echo_tool", @@ -85,7 +85,7 @@ func TestToolGatewayServiceCallToolNotFound(t *testing.T) { callResult: map[string]map[string]any{}, callErr: map[string]error{}, } - service := NewToolGatewayService(slog.Default(), []ToolExecutor{provider}, nil) + service := NewToolGatewayService(slog.Default(), []ToolSource{provider}) result, err := service.CallTool(context.Background(), ToolSessionContext{BotID: "bot-1"}, ToolCallPayload{ Name: "missing_tool", @@ -110,7 +110,7 @@ func TestToolGatewayServiceCallToolProviderError(t *testing.T) { "broken_tool": errors.New("boom"), }, } - service := NewToolGatewayService(slog.Default(), []ToolExecutor{provider}, nil) + service := NewToolGatewayService(slog.Default(), []ToolSource{provider}) result, err := service.CallTool(context.Background(), ToolSessionContext{BotID: "bot-1"}, ToolCallPayload{ Name: "broken_tool", diff --git a/internal/mcp/tool_registry.go b/internal/mcp/tool_registry.go index 1f6fdd1f..ef40f5e2 100644 --- a/internal/mcp/tool_registry.go +++ b/internal/mcp/tool_registry.go @@ -8,8 +8,8 @@ import ( ) type registryItem struct { - executor ToolExecutor - tool ToolDescriptor + source ToolSource + tool ToolDescriptor } // ToolRegistry stores provider ownership and descriptor metadata. @@ -23,9 +23,9 @@ func NewToolRegistry() *ToolRegistry { } } -func (r *ToolRegistry) Register(executor ToolExecutor, tool ToolDescriptor) error { - if executor == nil { - return errors.New("tool executor is required") +func (r *ToolRegistry) Register(source ToolSource, tool ToolDescriptor) error { + if source == nil { + return errors.New("tool source is required") } name := strings.TrimSpace(tool.Name) if name == "" { @@ -42,18 +42,18 @@ func (r *ToolRegistry) Register(executor ToolExecutor, tool ToolDescriptor) erro } tool.Name = name r.items[name] = registryItem{ - executor: executor, - tool: tool, + source: source, + tool: tool, } return nil } -func (r *ToolRegistry) Lookup(name string) (ToolExecutor, ToolDescriptor, bool) { +func (r *ToolRegistry) Lookup(name string) (ToolSource, ToolDescriptor, bool) { item, ok := r.items[strings.TrimSpace(name)] if !ok { return nil, ToolDescriptor{}, false } - return item.executor, item.tool, true + return item.source, item.tool, true } func (r *ToolRegistry) List() []ToolDescriptor { diff --git a/internal/mcp/tool_types.go b/internal/mcp/tool_types.go index c06bfb90..192a3a20 100644 --- a/internal/mcp/tool_types.go +++ b/internal/mcp/tool_types.go @@ -27,14 +27,9 @@ type ToolDescriptor struct { InputSchema map[string]any `json:"inputSchema"` } -// ToolExecutor represents business-facing tools (message/schedule/memory). -type ToolExecutor interface { - ListTools(ctx context.Context, session ToolSessionContext) ([]ToolDescriptor, error) - CallTool(ctx context.Context, session ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) -} - -// ToolSource represents infrastructure-level tool sources (federation/connectors). -// A source is not a business tool itself; it supplies and routes downstream tools. +// ToolSource represents external tool sources (federation/connectors). 
+// Built-in tools are no longer managed through this interface — they are +// loaded directly via agent ToolProviders. type ToolSource interface { ListTools(ctx context.Context, session ToolSessionContext) ([]ToolDescriptor, error) CallTool(ctx context.Context, session ToolSessionContext, toolName string, arguments map[string]any) (map[string]any, error) diff --git a/internal/message/service.go b/internal/message/service.go index 70c3e1e8..f176b6e3 100644 --- a/internal/message/service.go +++ b/internal/message/service.go @@ -113,6 +113,8 @@ func (s *DBService) Persist(ctx context.Context, input PersistInput) (Message, e Role: role, Ordinal: int32(ref.Ordinal), ContentHash: contentHash, + Name: ref.Name, + Metadata: marshalMetadata(ref.Metadata), }); assetErr != nil { s.logger.Warn("create message asset link failed", slog.String("message_id", result.ID), slog.Any("error", assetErr)) } @@ -134,6 +136,8 @@ func (s *DBService) Persist(ctx context.Context, input PersistInput) (Message, e Mime: ref.Mime, SizeBytes: ref.SizeBytes, StorageKey: ref.StorageKey, + Name: ref.Name, + Metadata: ref.Metadata, }) } result.Assets = assets @@ -231,6 +235,38 @@ func (s *DBService) ListBefore(ctx context.Context, botID string, before time.Ti return msgs, nil } +// LinkAssets links asset refs to an existing persisted message. +func (s *DBService) LinkAssets(ctx context.Context, messageID string, assets []AssetRef) error { + pgMsgID, err := dbpkg.ParseUUID(messageID) + if err != nil { + return fmt.Errorf("invalid message id: %w", err) + } + for _, ref := range assets { + contentHash := strings.TrimSpace(ref.ContentHash) + if contentHash == "" { + continue + } + role := ref.Role + if strings.TrimSpace(role) == "" { + role = "attachment" + } + if ref.Ordinal < math.MinInt32 || ref.Ordinal > math.MaxInt32 { + return fmt.Errorf("asset ordinal out of range: %d", ref.Ordinal) + } + if _, assetErr := s.queries.CreateMessageAsset(ctx, sqlc.CreateMessageAssetParams{ + MessageID: pgMsgID, + Role: role, + Ordinal: int32(ref.Ordinal), + ContentHash: contentHash, + Name: ref.Name, + Metadata: marshalMetadata(ref.Metadata), + }); assetErr != nil { + s.logger.Warn("link asset failed", slog.String("message_id", messageID), slog.Any("error", assetErr)) + } + } + return nil +} + // DeleteByBot deletes all messages for a bot. func (s *DBService) DeleteByBot(ctx context.Context, botID string) error { pgBotID, err := dbpkg.ParseUUID(botID) @@ -530,9 +566,8 @@ func (s *DBService) enrichAssets(ctx context.Context, messages []Message) { ContentHash: contentHash, Role: row.Role, Ordinal: int(row.Ordinal), - Mime: "", - SizeBytes: 0, - StorageKey: "", + Name: row.Name, + Metadata: unmarshalMetadata(row.Metadata), }) } for i := range messages { @@ -553,3 +588,25 @@ func ensureAssetsSlice(messages []Message) { } } } + +func marshalMetadata(m map[string]any) []byte { + if len(m) == 0 { + return []byte("{}") + } + b, err := json.Marshal(m) + if err != nil { + return []byte("{}") + } + return b +} + +func unmarshalMetadata(b []byte) map[string]any { + if len(b) == 0 { + return nil + } + var m map[string]any + if err := json.Unmarshal(b, &m); err != nil || len(m) == 0 { + return nil + } + return m +} diff --git a/internal/message/types.go b/internal/message/types.go index 2dd74f09..d74d7869 100644 --- a/internal/message/types.go +++ b/internal/message/types.go @@ -9,12 +9,14 @@ import ( // MessageAsset carries media asset metadata attached to a message. // ContentHash is the content-addressed identifier for the media file. 
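The `marshalMetadata`/`unmarshalMetadata` pair added above pins down the normalization contract at the persistence boundary: empty or unparseable metadata is stored as `{}` and read back as `nil`, so callers never observe a partially decoded map. A self-contained sketch of that round trip (helpers copied from the hunk; the `source_path` key and file path are illustrative values only):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Copied from internal/message/service.go in this patch so the sketch runs
// standalone: invalid or empty metadata is written as "{}" and read back as nil.
func marshalMetadata(m map[string]any) []byte {
	if len(m) == 0 {
		return []byte("{}")
	}
	b, err := json.Marshal(m)
	if err != nil {
		return []byte("{}")
	}
	return b
}

func unmarshalMetadata(b []byte) map[string]any {
	if len(b) == 0 {
		return nil
	}
	var m map[string]any
	if err := json.Unmarshal(b, &m); err != nil || len(m) == 0 {
		return nil
	}
	return m
}

func main() {
	meta := map[string]any{"source_path": "/data/media/demo.png"} // illustrative key and value
	raw := marshalMetadata(meta)
	fmt.Println(string(raw))                            // {"source_path":"/data/media/demo.png"}
	fmt.Println(unmarshalMetadata(raw))                 // map[source_path:/data/media/demo.png]
	fmt.Println(unmarshalMetadata([]byte("{}")) == nil) // true: "{}" normalizes back to nil
}
```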
type MessageAsset struct { - ContentHash string `json:"content_hash"` - Role string `json:"role"` - Ordinal int `json:"ordinal"` - Mime string `json:"mime"` - SizeBytes int64 `json:"size_bytes"` - StorageKey string `json:"storage_key"` + ContentHash string `json:"content_hash"` + Role string `json:"role"` + Ordinal int `json:"ordinal"` + Mime string `json:"mime"` + SizeBytes int64 `json:"size_bytes"` + StorageKey string `json:"storage_key"` + Name string `json:"name,omitempty"` + Metadata map[string]any `json:"metadata,omitempty"` } // Message represents a single persisted bot message. @@ -40,12 +42,14 @@ type Message struct { // AssetRef links a media asset to a persisted message. // ContentHash is the content-addressed identifier for the media file. type AssetRef struct { - ContentHash string `json:"content_hash"` - Role string `json:"role"` - Ordinal int `json:"ordinal"` - Mime string `json:"mime,omitempty"` - SizeBytes int64 `json:"size_bytes,omitempty"` - StorageKey string `json:"storage_key,omitempty"` + ContentHash string `json:"content_hash"` + Role string `json:"role"` + Ordinal int `json:"ordinal"` + Mime string `json:"mime,omitempty"` + SizeBytes int64 `json:"size_bytes,omitempty"` + StorageKey string `json:"storage_key,omitempty"` + Name string `json:"name,omitempty"` + Metadata map[string]any `json:"metadata,omitempty"` } // PersistInput is the input for persisting a message. @@ -79,4 +83,5 @@ type Service interface { ListLatest(ctx context.Context, botID string, limit int32) ([]Message, error) ListBefore(ctx context.Context, botID string, before time.Time, limit int32) ([]Message, error) DeleteByBot(ctx context.Context, botID string) error + LinkAssets(ctx context.Context, messageID string, assets []AssetRef) error } diff --git a/mise.toml b/mise.toml index b47483d5..ff48dc0c 100644 --- a/mise.toml +++ b/mise.toml @@ -5,7 +5,7 @@ experimental_monorepo_root = true go = "1.25.6" # Node.js for frontend packages node = "25" -# Bun for agent gateway +# Bun for browser gateway bun = "latest" # pnpm for workspace management pnpm = "10" @@ -96,7 +96,7 @@ description = "Release new version" run = "pnpm release" [tasks.build-embedded-assets] -description = "Build and stage embedded web/agent/bun assets" +description = "Build and stage embedded web assets" run = "scripts/release.sh --prepare-assets" depends = ["//:pnpm-install"] diff --git a/packages/agent/.gitignore b/packages/agent/.gitignore deleted file mode 100644 index a14702c4..00000000 --- a/packages/agent/.gitignore +++ /dev/null @@ -1,34 +0,0 @@ -# dependencies (bun install) -node_modules - -# output -out -dist -*.tgz - -# code coverage -coverage -*.lcov - -# logs -logs -_.log -report.[0-9]_.[0-9]_.[0-9]_.[0-9]_.json - -# dotenv environment variable files -.env -.env.development.local -.env.test.local -.env.production.local -.env.local - -# caches -.eslintcache -.cache -*.tsbuildinfo - -# IntelliJ based IDEs -.idea - -# Finder (MacOS) folder config -.DS_Store diff --git a/packages/agent/README.md b/packages/agent/README.md deleted file mode 100644 index af2484c7..00000000 --- a/packages/agent/README.md +++ /dev/null @@ -1,2 +0,0 @@ -# @memoh/agent - diff --git a/packages/agent/package.json b/packages/agent/package.json deleted file mode 100644 index ece8c06a..00000000 --- a/packages/agent/package.json +++ /dev/null @@ -1,29 +0,0 @@ -{ - "name": "@memoh/agent", - "version": "0.5.0", - "exports": { - ".": "./src/index.ts" - }, - "packageManager": "pnpm@10.27.0", - "module": "src/index.ts", - "type": "module", - "private": true, 
- "peerDependencies": { - "typescript": "^5" - }, - "dependencies": { - "@ai-sdk/anthropic": "^3.0.9", - "@ai-sdk/google": "^3.0.6", - "@ai-sdk/mcp": "^1.0.6", - "@ai-sdk/openai": "^3.0.39", - "@ai-sdk/openai-compatible": "^2.0.33", - "@mozilla/readability": "^0.6.0", - "@types/turndown": "^5.0.6", - "ai": "^6.0.25", - "linkedom": "^0.18.12", - "toml": "^3.0.0", - "turndown": "^7.2.2", - "yaml": "^2.8.2", - "zod": "^4.3.6" - } -} diff --git a/packages/agent/src/agent.test.ts b/packages/agent/src/agent.test.ts deleted file mode 100644 index 475c7ce9..00000000 --- a/packages/agent/src/agent.test.ts +++ /dev/null @@ -1,88 +0,0 @@ -import { describe, expect, it } from 'vitest' -import { createImagePartFromAttachment } from './utils/image-parts' - -describe('createImagePartFromAttachment', () => { - it('converts inline data URLs to binary image parts', () => { - const part = createImagePartFromAttachment({ - type: 'image', - transport: 'inline_data_url', - payload: 'data:image/png;base64,AQID', - }) - - expect(part?.type).toBe('image') - expect(part?.image).toBeInstanceOf(Uint8Array) - expect(Array.from(part?.image as Uint8Array)).toEqual([1, 2, 3]) - expect(part?.mediaType).toBe('image/png') - }) - - it('keeps public URLs as URL objects', () => { - const part = createImagePartFromAttachment({ - type: 'image', - transport: 'public_url', - payload: 'https://example.com/demo.png', - }) - - expect(part?.image).toBeInstanceOf(URL) - expect(String(part?.image)).toBe('https://example.com/demo.png') - }) - - it('falls back to string payloads for malformed public URLs', () => { - const part = createImagePartFromAttachment({ - type: 'image', - transport: 'public_url', - payload: 'https://', - mime: 'image/png', - }) - - expect(part?.image).toBe('https://') - expect(part?.mediaType).toBe('image/png') - }) - - it('keeps inline payload strings when they are not data URLs', () => { - const part = createImagePartFromAttachment({ - type: 'image', - transport: 'inline_data_url', - payload: 'AQID', - mime: 'image/png', - }) - - expect(part?.image).toBe('AQID') - expect(part?.mediaType).toBe('image/png') - }) - - it('falls back to string payloads for malformed non-base64 data URLs', () => { - const payload = 'data:image/png,a%ZZ' - const part = createImagePartFromAttachment({ - type: 'image', - transport: 'inline_data_url', - payload, - mime: 'image/png', - }) - - expect(part?.image).toBe(payload) - expect(part?.mediaType).toBe('image/png') - }) - - it('falls back to string payloads for malformed base64 data URLs', () => { - const payload = 'data:image/png;base64,%%%' - const part = createImagePartFromAttachment({ - type: 'image', - transport: 'inline_data_url', - payload, - mime: 'image/png', - }) - - expect(part?.image).toBe(payload) - expect(part?.mediaType).toBe('image/png') - }) - - it('skips tool file references', () => { - const part = createImagePartFromAttachment({ - type: 'image', - transport: 'tool_file_ref', - payload: '/data/media/demo.png', - }) - - expect(part).toBeNull() - }) -}) diff --git a/packages/agent/src/agent.ts b/packages/agent/src/agent.ts deleted file mode 100644 index eab1bde2..00000000 --- a/packages/agent/src/agent.ts +++ /dev/null @@ -1,798 +0,0 @@ -import { - generateText, - type ImagePart, - LanguageModelUsage, - ModelMessage, - stepCountIs, - type StepResult, - streamText, - ToolSet, - UserModelMessage, - type PrepareStepFunction, -} from 'ai' -import { - AgentInput, - AgentParams, - AgentSkill, - AgentStreamAction, - Heartbeat, - MCPConnection, - Schedule, - SystemFile, -} from 
'./types' -import { ClientType, ModelConfig, ModelInput, hasInputModality } from './types/model' -import { system, schedule, heartbeat, subagentSystem } from './prompts' -import { AuthFetcher } from './types' -import { createModel } from './model' -import { - stripAttachmentsFromMessages, - dedupeAttachments, - attachmentsResolver, -} from './utils/attachments' -import type { ContainerFileAttachment } from './types/attachment' -import { reactionsResolver, type ReactionItem } from './utils/reactions' -import { speechResolver, type SpeechItem } from './utils/speech' -import { StreamTagExtractor, extractTagsFromText, type TagEvent } from './utils/tag-extractor' -import { createImagePartFromAttachment } from './utils/image-parts' -import type { GatewayInputAttachment } from './types/attachment' -import { getMCPTools } from './tools/mcp' -import { buildIdentityHeaders } from './utils/headers' -import { createFS } from './utils' -import { createTextLoopGuard, createTextLoopProbeBuffer } from './sential' -import { createToolLoopGuardedTools } from './tool-loop' -import { createPrepareStepWithReadMedia } from './utils/read-media-injector' - -const ANTHROPIC_BUDGET: Record = { low: 5000, medium: 16000, high: 50000 } -const GOOGLE_BUDGET: Record = { low: 5000, medium: 16000, high: 50000 } -const LOOP_DETECTED_ABORT_MESSAGE = 'loop detected, stream aborted' -const LOOP_DETECTED_STREAK_THRESHOLD = 3 -const LOOP_DETECTED_MIN_NEW_GRAMS_PER_CHUNK = 8 -const LOOP_DETECTED_PROBE_CHARS = 256 -const TOOL_LOOP_DETECTED_ABORT_MESSAGE = 'tool loop detected, stream aborted' -const TOOL_LOOP_REPEAT_THRESHOLD = 5 -const TOOL_LOOP_WARNINGS_BEFORE_ABORT = 1 -const TOOL_LOOP_WARNING_KEY = '__memoh_tool_loop_warning' -const TOOL_LOOP_WARNING_TEXT = '[MEMOH_TOOL_LOOP_WARNING] Repeated identical tool invocation (same tool + arguments) was detected more than 5 times. Stop looping this tool and either summarize current results or change strategy.' - -const buildProviderOptions = (config: ModelConfig): Record> | undefined => { - if (!config.reasoning?.enabled) return undefined - const effort = config.reasoning.effort ?? 'medium' - switch (config.clientType) { - case ClientType.AnthropicMessages: - return { anthropic: { thinking: { type: 'enabled' as const, budgetTokens: ANTHROPIC_BUDGET[effort] } } } - case ClientType.OpenAIResponses: - return { openai: { reasoningEffort: effort, reasoningSummary: 'auto' } } - case ClientType.OpenAICompletions: - return { openai: { reasoningEffort: effort } } - case ClientType.GoogleGenerativeAI: - return { google: { thinkingConfig: { thinkingBudget: GOOGLE_BUDGET[effort] } } } - default: - return undefined - } -} - -const buildStepUsages = ( - steps: { usage: LanguageModelUsage; response: { messages: unknown[] } }[], -): (LanguageModelUsage | null)[] => { - const usages: (LanguageModelUsage | null)[] = [] - for (const step of steps) { - for (let i = 0; i < step.response.messages.length; i++) { - usages.push(i === 0 ? 
step.usage : null) - } - } - return usages -} - -export const buildNativeImageParts = (attachments: GatewayInputAttachment[]): ImagePart[] => { - return attachments - .map((attachment) => createImagePartFromAttachment(attachment)) - .filter((attachment): attachment is ImagePart => attachment != null) -} - -const rebuildPartialMessages = (steps: StepResult[]): ModelMessage[] => { - const messages: ModelMessage[] = [] - for (const step of steps) { - if (step.response?.messages) { - messages.push(...(step.response.messages as ModelMessage[])) - } - } - return messages -} - -export const createAgent = ( - { - model: modelConfig, - activeContextTime = 24 * 60, - language = 'Same as the user input', - channels = [], - skills = [], - mcpConnections = [], - currentChannel = 'Unknown Channel', - identity = { - botId: '', - channelIdentityId: '', - displayName: '', - }, - auth, - inbox = [], - loopDetection = { enabled: false }, - isSubagent = false, - }: AgentParams, - fetch: AuthFetcher, -) => { - const model = createModel(modelConfig) - const supportsImageInput = hasInputModality(modelConfig, ModelInput.Image) - // eslint-disable-next-line @typescript-eslint/no-explicit-any - const providerOptions = buildProviderOptions(modelConfig) as any - const loopDetectionEnabled = loopDetection?.enabled === true - const enabledSkills: AgentSkill[] = [] - const fs = createFS({ fetch, botId: identity.botId }) - - const enableSkill = (skill: string) => { - const agentSkill = skills.find((s) => s.name === skill) - if (agentSkill) { - enabledSkills.push(agentSkill) - } - } - - const getEnabledSkills = () => { - return enabledSkills.map((skill) => skill.name) - } - - const loadSystemFiles = async (): Promise => { - const home = '/data' - const pad = (n: number) => n.toString().padStart(2, '0') - const getDateString = (date: Date) => - `${date.getFullYear()}-${pad(date.getMonth() + 1)}-${pad(date.getDate())}` - const _today = getDateString(new Date()) - const _yesterday = getDateString(new Date(Date.now() - 24 * 60 * 60 * 1000)) - const files = [ - 'IDENTITY.md', - 'SOUL.md', - 'TOOLS.md', - 'MEMORY.md', - 'PROFILES.md', - `memory/${_today}.md`, - `memory/${_yesterday}.md`, - ] - const promises = files.map((file) => (async () => ({ - filename: file, - content: await fs.readText(`${home}/${file}`).catch(() => ''), - }))()) - return await Promise.all(promises) as SystemFile[] - } - - const generateSystemPrompt = async () => { - const files = await loadSystemFiles() - return system({ - date: new Date(), - language, - maxContextLoadTime: activeContextTime, - channels, - currentChannel, - skills, - enabledSkills, - inbox, - supportsImageInput, - files, - }) - } - - const getAgentTools = async () => { - const baseUrl = auth.baseUrl.replace(/\/$/, '') - const botId = identity.botId.trim() - if (!baseUrl || !botId) { - return { - tools: {}, - close: async () => {}, - } - } - const headers = buildIdentityHeaders(identity, auth, { isSubagent }) - const builtins: MCPConnection[] = [ - { - type: 'http', - name: 'builtin', - url: `${baseUrl}/bots/${botId}/tools`, - headers, - }, - ] - const { tools: mcpTools, close: closeMCP } = await getMCPTools( - [...builtins, ...mcpConnections], - { - auth, - fetch, - botId, - }, - ) - return { - tools: mcpTools as ToolSet, - close: closeMCP, - } - } - - const generateUserPrompt = (input: AgentInput) => { - const imageParts = supportsImageInput ? 
buildNativeImageParts(input.attachments) : [] - - const userMessage: UserModelMessage = { - role: 'user', - content: [{ type: 'text', text: input.query }, ...imageParts], - } - return userMessage - } - - const createNonStreamTextLoopInspector = () => { - if (!loopDetectionEnabled) { - return null - } - const textLoopGuard = createTextLoopGuard({ - consecutiveHitsToAbort: LOOP_DETECTED_STREAK_THRESHOLD, - minNewGramsPerChunk: LOOP_DETECTED_MIN_NEW_GRAMS_PER_CHUNK, - }) - return (text: string) => { - const result = textLoopGuard.inspect(text) - if (result.abort) { - throw new Error(LOOP_DETECTED_ABORT_MESSAGE) - } - } - } - - const buildGuardedTools = ( - tools: ToolSet, - onAbortToolCall: (toolCallId: string) => void = () => {}, - ): ToolSet => { - if (!loopDetectionEnabled) { - return tools - } - return createToolLoopGuardedTools(tools, { - repeatThreshold: TOOL_LOOP_REPEAT_THRESHOLD, - warningsBeforeAbort: TOOL_LOOP_WARNINGS_BEFORE_ABORT, - onAbortToolCall, - warningKey: TOOL_LOOP_WARNING_KEY, - warningText: TOOL_LOOP_WARNING_TEXT, - }) - } - - const runTextGeneration = async ({ - messages, - systemPrompt, - basePrepareStep, - }: { - messages: ModelMessage[] - systemPrompt: string - basePrepareStep?: PrepareStepFunction - }) => { - const { tools: baseTools, close } = await getAgentTools() - const { prepareStep, tools: readMediaTools } = createPrepareStepWithReadMedia({ - modelConfig, - fs, - systemPrompt, - basePrepareStep, - }) - const tools = { ...baseTools, ...readMediaTools } - let shouldAbortForToolLoop = false - const guardedTools = buildGuardedTools(tools, () => { - shouldAbortForToolLoop = true - }) - const inspectTextLoop = createNonStreamTextLoopInspector() - let runError: unknown = null - try { - return await generateText({ - model, - messages, - system: systemPrompt, - ...(providerOptions && { providerOptions }), - stopWhen: stepCountIs(Infinity), - prepareStep, - onStepFinish: ({ text, toolResults }: { text: string; toolResults: Array<{ toolName: string; result: unknown }> }) => { - if (loopDetectionEnabled) { - if (shouldAbortForToolLoop) { - throw new Error(TOOL_LOOP_DETECTED_ABORT_MESSAGE) - } - if (inspectTextLoop) { - inspectTextLoop(text) - } - } - if (toolResults) { - for (const tr of toolResults) { - if (tr.toolName === 'use_skill') { - const result = tr.result as Record | undefined - const skillName = typeof result?.skillName === 'string' ? 
result.skillName : '' - if (skillName) { - enableSkill(skillName) - } - } - } - } - }, - tools: guardedTools, - }) - } catch (error) { - runError = error - throw error - } finally { - try { - await close() - } catch (closeError) { - if (runError == null) { - throw closeError - } - console.error(closeError) - } - } - } - - const ask = async (input: AgentInput) => { - const userPrompt = generateUserPrompt(input) - const messages = [...input.messages, userPrompt] - input.skills.forEach((skill) => enableSkill(skill)) - const systemPrompt = await generateSystemPrompt() - const { response, reasoning, text, usage, steps } = await runTextGeneration({ - messages, - systemPrompt, - basePrepareStep: () => ({ system: systemPrompt }), - }) - const stepUsages = buildStepUsages(steps) - const tagResolvers = [attachmentsResolver, reactionsResolver, speechResolver] - const { cleanedText, events } = extractTagsFromText(text, tagResolvers) - const textAttachments = events - .filter((e) => e.tag === 'attachments') - .flatMap((e) => e.data as ContainerFileAttachment[]) - const reactions = events - .filter((e) => e.tag === 'reactions') - .flatMap((e) => e.data as ReactionItem[]) - const speeches = events - .filter((e) => e.tag === 'speech') - .flatMap((e) => e.data as SpeechItem[]) - const { messages: strippedMessages, attachments: messageAttachments } = - stripAttachmentsFromMessages(response.messages, [reactionsResolver, speechResolver]) - const allAttachments = dedupeAttachments([ - ...textAttachments, - ...messageAttachments, - ]) - return { - messages: [ - userPrompt, - ...strippedMessages, - ], - usages: [null, ...stepUsages] as (LanguageModelUsage | null)[], - reasoning: reasoning.map((part) => part.text), - usage, - text: cleanedText, - attachments: allAttachments, - reactions, - speeches, - skills: getEnabledSkills(), - } - } - - const askAsSubagent = async (params: { - input: string; - name: string; - description: string; - messages: ModelMessage[]; - }) => { - const userPrompt: UserModelMessage = { - role: 'user', - content: [{ type: 'text', text: params.input }], - } - const generateSubagentSystemPrompt = () => { - return subagentSystem({ - date: new Date(), - name: params.name, - description: params.description, - }) - } - const systemPrompt = generateSubagentSystemPrompt() - const messages = [...params.messages, userPrompt] - const { response, reasoning, text, usage, steps } = await runTextGeneration({ - messages, - systemPrompt, - basePrepareStep: () => ({ system: generateSubagentSystemPrompt() }), - }) - const stepUsages = buildStepUsages(steps) - return { - messages: [userPrompt, ...response.messages], - usages: [null, ...stepUsages] as (LanguageModelUsage | null)[], - reasoning: reasoning.map((part) => part.text), - usage, - text, - skills: getEnabledSkills(), - } - } - - const triggerSchedule = async (params: { - schedule: Schedule; - messages: ModelMessage[]; - skills: string[]; - }) => { - const scheduleMessage: UserModelMessage = { - role: 'user', - content: [ - { - type: 'text', - text: schedule({ schedule: params.schedule, date: new Date() }), - }, - ], - } - const messages = [...params.messages, scheduleMessage] - params.skills.forEach((skill) => enableSkill(skill)) - const { response, reasoning, text, usage, steps } = await runTextGeneration({ - messages, - systemPrompt: await generateSystemPrompt(), - }) - const stepUsages = buildStepUsages(steps) - return { - messages: [scheduleMessage, ...response.messages], - usages: [null, ...stepUsages] as (LanguageModelUsage | null)[], - 
reasoning: reasoning.map((part) => part.text), - usage, - text, - skills: getEnabledSkills(), - } - } - - const triggerHeartbeat = async (params: { - heartbeat: Heartbeat; - messages: ModelMessage[]; - skills: string[]; - }) => { - const heartbeatText = await heartbeat({ interval: params.heartbeat.interval, date: new Date(), fs }) - const heartbeatMessage: UserModelMessage = { - role: 'user', - content: [ - { - type: 'text', - text: heartbeatText, - }, - ], - } - const messages = [...params.messages, heartbeatMessage] - params.skills.forEach((skill) => enableSkill(skill)) - const { response, reasoning, text, usage, steps } = await runTextGeneration({ - messages, - systemPrompt: await generateSystemPrompt(), - }) - const stepUsages = buildStepUsages(steps) - return { - messages: [heartbeatMessage, ...response.messages], - usages: [null, ...stepUsages] as (LanguageModelUsage | null)[], - reasoning: reasoning.map((part) => part.text), - usage, - text, - skills: getEnabledSkills(), - } - } - - const resolveStreamErrorMessage = (raw: unknown): string => { - if (raw instanceof Error && raw.message.trim()) { - return raw.message - } - if (typeof raw === 'string' && raw.trim()) { - return raw - } - if (raw && typeof raw === 'object') { - const candidate = raw as { message?: unknown; error?: unknown } - if (typeof candidate.message === 'string' && candidate.message.trim()) { - return candidate.message - } - if (typeof candidate.error === 'string' && candidate.error.trim()) { - return candidate.error - } - if (candidate.error instanceof Error && candidate.error.message.trim()) { - return candidate.error.message - } - } - return 'Model stream failed' - } - - function* emitTagEvents(events: TagEvent[]): Generator { - for (const event of events) { - switch (event.tag) { - case 'attachments': { - const attachments = dedupeAttachments(event.data as ContainerFileAttachment[]) as ContainerFileAttachment[] - if (attachments.length) { - yield { type: 'attachment_delta', attachments } - } - break - } - case 'reactions': { - const reactions = event.data as ReactionItem[] - if (reactions.length) { - yield { type: 'reaction_delta', reactions } - } - break - } - case 'speech': { - const speeches = event.data as SpeechItem[] - if (speeches.length) { - yield { type: 'speech_delta', speeches } - } - break - } - } - } - } - - async function* stream(input: AgentInput): AsyncGenerator { - const userPrompt = generateUserPrompt(input) - const messages = [...input.messages, userPrompt] - input.skills.forEach((skill) => enableSkill(skill)) - const systemPrompt = await generateSystemPrompt() - const tagResolvers = [attachmentsResolver, reactionsResolver, speechResolver] - const tagExtractor = new StreamTagExtractor(tagResolvers) - const textLoopGuard = loopDetectionEnabled - ? createTextLoopGuard({ - consecutiveHitsToAbort: LOOP_DETECTED_STREAK_THRESHOLD, - minNewGramsPerChunk: LOOP_DETECTED_MIN_NEW_GRAMS_PER_CHUNK, - }) - : null - const guardLoopOutput = (text: string) => { - if (!textLoopGuard) { - return - } - const result = textLoopGuard.inspect(text) - if (result.abort) { - throw new Error(LOOP_DETECTED_ABORT_MESSAGE) - } - } - const textLoopProbeBuffer = textLoopGuard - ? 
createTextLoopProbeBuffer( - LOOP_DETECTED_PROBE_CHARS, - guardLoopOutput, - ) - : null - const result: { - messages: ModelMessage[]; - reasoning: string[]; - usage: LanguageModelUsage | null; - usages: (LanguageModelUsage | null)[]; - } = { - messages: [], - reasoning: [], - usage: null, - usages: [], - } - const toolLoopAbortCallIds = new Set() - const { tools: baseTools, close } = await getAgentTools() - const { prepareStep, tools: readMediaTools } = createPrepareStepWithReadMedia({ - modelConfig, - fs, - systemPrompt, - basePrepareStep: () => ({ system: systemPrompt }), - }) - const tools = { ...baseTools, ...readMediaTools } - const guardedTools = buildGuardedTools(tools, (toolCallId) => { - toolLoopAbortCallIds.add(toolCallId) - }) - let closePromise: Promise | null = null - const closeTools = async () => { - if (!closePromise) { - closePromise = Promise.resolve().then(() => close()) - } - await closePromise - } - - const abortController = new AbortController() - if (input.signal) { - if (input.signal.aborted) { - abortController.abort(input.signal.reason) - } else { - input.signal.addEventListener('abort', () => abortController.abort(input.signal!.reason), { once: true }) - } - } - const abortedSteps: StepResult[] = [] - let wasAborted = false - let streamError: unknown = null - try { - const { fullStream } = streamText({ - model, - messages, - system: systemPrompt, - ...(providerOptions && { providerOptions }), - stopWhen: stepCountIs(Infinity), - prepareStep, - tools: guardedTools, - abortSignal: abortController.signal, - onFinish: async ({ usage, reasoning, response, steps }) => { - await closeTools() - result.usage = usage as never - result.reasoning = reasoning.map((part) => part.text) - result.messages = response.messages - result.usages = buildStepUsages(steps) - }, - onAbort: ({ steps }) => { - wasAborted = true - abortedSteps.push(...steps) - }, - }) - yield { - type: 'agent_start', - input, - } - for await (const chunk of fullStream) { - if (chunk.type === 'error') { - throw new Error( - resolveStreamErrorMessage((chunk as { error?: unknown }).error), - ) - } - switch (chunk.type) { - case 'reasoning-start': - yield { - type: 'reasoning_start', - metadata: chunk, - } - break - case 'reasoning-delta': - yield { - type: 'reasoning_delta', - delta: chunk.text, - } - break - case 'reasoning-end': - yield { - type: 'reasoning_end', - metadata: chunk, - } - break - case 'text-start': - yield { - type: 'text_start', - } - break - case 'text-delta': { - const { visibleText, events } = tagExtractor.push(chunk.text) - if (visibleText) { - if (textLoopProbeBuffer) { - textLoopProbeBuffer.push(visibleText) - } - yield { - type: 'text_delta', - delta: visibleText, - } - } - yield* emitTagEvents(events) - break - } - case 'text-end': { - const remainder = tagExtractor.flushRemainder() - if (remainder.visibleText) { - if (textLoopProbeBuffer) { - textLoopProbeBuffer.push(remainder.visibleText) - } - yield { - type: 'text_delta', - delta: remainder.visibleText, - } - } - if (textLoopProbeBuffer) { - textLoopProbeBuffer.flush() - } - yield* emitTagEvents(remainder.events) - yield { - type: 'text_end', - metadata: chunk, - } - break - } - case 'tool-call': { - const remainder = tagExtractor.flushRemainder() - if (remainder.visibleText) { - if (textLoopProbeBuffer) { - textLoopProbeBuffer.push(remainder.visibleText) - } - yield { - type: 'text_delta', - delta: remainder.visibleText, - } - } - if (textLoopProbeBuffer) { - textLoopProbeBuffer.flush() - } - yield* 
emitTagEvents(remainder.events) - yield { - type: 'tool_call_start', - toolName: chunk.toolName, - toolCallId: chunk.toolCallId, - input: chunk.input, - metadata: chunk, - } - break - } - case 'tool-result': { - const shouldAbortForToolLoop = toolLoopAbortCallIds.delete(chunk.toolCallId) - yield { - type: 'tool_call_end', - toolName: chunk.toolName, - toolCallId: chunk.toolCallId, - input: chunk.input, - result: chunk.output, - metadata: chunk, - } - if (chunk.toolName === 'use_skill') { - const res = chunk.output as Record | undefined - const sn = typeof res?.skillName === 'string' ? res.skillName : '' - if (sn) enableSkill(sn) - } - if (shouldAbortForToolLoop) { - throw new Error(TOOL_LOOP_DETECTED_ABORT_MESSAGE) - } - break - } - case 'file': - yield { - type: 'attachment_delta', - attachments: [ - { - type: 'image', - url: `data:${chunk.file.mediaType ?? 'image/png'};base64,${chunk.file.base64}`, - mime: chunk.file.mediaType ?? 'image/png', - }, - ], - } - } - } - if (textLoopProbeBuffer) { - textLoopProbeBuffer.flush() - } - - const { messages: strippedMessages } = stripAttachmentsFromMessages( - result.messages, - [reactionsResolver, speechResolver], - ) - yield { - type: 'agent_end', - messages: [ - userPrompt, - ...strippedMessages, - ], - usages: [null, ...result.usages], - reasoning: result.reasoning, - usage: result.usage!, - skills: getEnabledSkills(), - } - } catch (error) { - streamError = error - - if (wasAborted || abortController.signal.aborted) { - const partialMessages = rebuildPartialMessages(abortedSteps) - const partialUsages = abortedSteps.length > 0 - ? buildStepUsages(abortedSteps as Parameters[0]) - : [] - const partialUsage = abortedSteps.length > 0 - ? abortedSteps[abortedSteps.length - 1].usage - : null - const partialReasoning = abortedSteps.flatMap( - (s) => (s.reasoning ?? 
[]).map((r: { text: string }) => r.text), - ) - yield { - type: 'agent_abort', - messages: [userPrompt, ...partialMessages], - usages: [null, ...partialUsages], - reasoning: partialReasoning, - usage: partialUsage as LanguageModelUsage | null, - skills: getEnabledSkills(), - } - } else { - console.error(error) - throw error - } - } finally { - try { - await closeTools() - } catch (closeError) { - if (streamError == null) { - throw closeError - } - console.error(closeError) - } - } - } - - return { - stream, - ask, - askAsSubagent, - triggerSchedule, - triggerHeartbeat, - } -} diff --git a/packages/agent/src/index.ts b/packages/agent/src/index.ts deleted file mode 100644 index dc057cf1..00000000 --- a/packages/agent/src/index.ts +++ /dev/null @@ -1,6 +0,0 @@ -export * from './agent' -export * from './types' -export * from './model' -export * from './utils' -export * from './tools' -export * from './prompts' \ No newline at end of file diff --git a/packages/agent/src/model.ts b/packages/agent/src/model.ts deleted file mode 100644 index a36e5a38..00000000 --- a/packages/agent/src/model.ts +++ /dev/null @@ -1,24 +0,0 @@ -import { createOpenAI } from '@ai-sdk/openai' -import { createOpenAICompatible } from '@ai-sdk/openai-compatible' -import { createAnthropic } from '@ai-sdk/anthropic' -import { createGoogleGenerativeAI } from '@ai-sdk/google' -import { ClientType, ModelConfig } from './types' - -export const createModel = (model: ModelConfig) => { - const apiKey = model.apiKey.trim() - const baseURL = model.baseUrl.trim() - const modelId = model.modelId.trim() - - switch (model.clientType) { - case ClientType.OpenAIResponses: - return createOpenAI({ apiKey, baseURL })(modelId) - case ClientType.OpenAICompletions: - return createOpenAICompatible({ name: 'openai', apiKey, baseURL }).chatModel(modelId) - case ClientType.AnthropicMessages: - return createAnthropic({ apiKey, baseURL })(modelId) - case ClientType.GoogleGenerativeAI: - return createGoogleGenerativeAI({ apiKey, baseURL })(modelId) - default: - return createOpenAICompatible({ name: 'openai', apiKey, baseURL }).chatModel(modelId) - } -} diff --git a/packages/agent/src/prompts/heartbeat.ts b/packages/agent/src/prompts/heartbeat.ts deleted file mode 100644 index ff9dd269..00000000 --- a/packages/agent/src/prompts/heartbeat.ts +++ /dev/null @@ -1,36 +0,0 @@ -import type { FSClient } from '../utils/fs' - -export interface HeartbeatParams { - interval: number - date: Date - fs: FSClient -} - -const defaultInstructions = `Do not infer or repeat old tasks from prior chats. -If nothing needs attention, reply HEARTBEAT_OK. 
-If something needs attention, use the send tool to deliver alerts to the appropriate channel.` - -export const heartbeat = async (params: HeartbeatParams) => { - let checklist = '' - try { - checklist = await params.fs.readText('/data/HEARTBEAT.md') - } catch { - // HEARTBEAT.md does not exist — not an error - } - - const sections: string[] = [ - '** This is a heartbeat check automatically triggered by the system **', - '---', - `interval: every ${params.interval} minutes`, - `time: ${params.date.toISOString()}`, - '---', - ] - - if (checklist.trim()) { - sections.push(`\n## HEARTBEAT.md (checklist)\n\n${checklist.trim()}`) - } - - sections.push(`\n${defaultInstructions}`) - - return sections.join('\n').trim() -} diff --git a/packages/agent/src/prompts/index.ts b/packages/agent/src/prompts/index.ts deleted file mode 100644 index ab6e1800..00000000 --- a/packages/agent/src/prompts/index.ts +++ /dev/null @@ -1,5 +0,0 @@ -export * from './system' -export * from './schedule' -export * from './heartbeat' -export * from './subagent' -export * from './utils' \ No newline at end of file diff --git a/packages/agent/src/prompts/schedule.ts b/packages/agent/src/prompts/schedule.ts deleted file mode 100644 index 5eaa5a48..00000000 --- a/packages/agent/src/prompts/schedule.ts +++ /dev/null @@ -1,24 +0,0 @@ -import { Schedule } from '../types' -import { stringify } from 'yaml' - -export interface ScheduleParams { - schedule: Schedule - date: Date -} - -export const schedule = (params: ScheduleParams) => { - const headers = { - 'schedule-name': params.schedule.name, - 'schedule-description': params.schedule.description, - 'max-calls': params.schedule.maxCalls ?? 'Unlimited', - 'cron-pattern': params.schedule.pattern, - } - return ` -** This is a scheduled task automatically send to you by the system ** ---- -${stringify(headers)} ---- - -${params.schedule.command} - `.trim() -} \ No newline at end of file diff --git a/packages/agent/src/prompts/subagent.ts b/packages/agent/src/prompts/subagent.ts deleted file mode 100644 index 0c2f464d..00000000 --- a/packages/agent/src/prompts/subagent.ts +++ /dev/null @@ -1,21 +0,0 @@ -import { stringify } from 'yaml' - -export interface SubagentParams { - date: Date - name: string - description?: string -} - -export const subagentSystem = ({ date, name, description }: SubagentParams) => { - const headers = { - 'name': name, - 'description': description, - 'time-now': date.toISOString(), - } - return [ - description, - '---' - + stringify(headers) - + '---' - ].join('\n\n') -} \ No newline at end of file diff --git a/packages/agent/src/prompts/system.ts b/packages/agent/src/prompts/system.ts deleted file mode 100644 index 04801e00..00000000 --- a/packages/agent/src/prompts/system.ts +++ /dev/null @@ -1,322 +0,0 @@ -import { block, quote } from './utils' -import { AgentSkill, InboxItem, SystemFile } from '../types' -import { stringify } from 'yaml' - -export interface SystemParams { - date: Date - language: string - maxContextLoadTime: number - channels: string[] - /** Channel where the current session/message is from (e.g. telegram, feishu, web). 
*/ - currentChannel: string - skills: AgentSkill[] - enabledSkills: AgentSkill[] - files: SystemFile[] - attachments?: string[] - inbox?: InboxItem[] - supportsImageInput?: boolean -} - -export const skillPrompt = (skill: AgentSkill) => { - return ` -**${quote(skill.name)}** -> ${skill.description} - -${skill.content} - `.trim() -} - -const formatInbox = (items: InboxItem[]): string => { - if (!items || items.length === 0) return '' - const formatted = items.map((item) => ({ - id: item.id, - source: item.source, - header: item.header, - content: item.content, - createdAt: item.createdAt, - })) - return ` -## Inbox (${items.length} unread) - -These are messages from other channels — NOT from the current conversation. Use ${quote('send')} or ${quote('react')} if you want to respond to any of them. - - -${JSON.stringify(formatted)} - - -Use ${quote('search_inbox')} to find older messages by keyword. -`.trim() -} - -const formatSystemFile = (file: SystemFile) => { - return ` -## ${file.filename} - -${file.content} - `.trim() -} - -export const system = ({ - date, - language, - maxContextLoadTime, - channels, - currentChannel, - skills, - enabledSkills, - files, - inbox = [], - supportsImageInput = true, -}: SystemParams) => { - const home = '/data' - // ── Static section (stable prefix for LLM prompt caching) ────────── - const staticHeaders = { - 'language': language, - } - - // ── Dynamic section (appended at the end to preserve cache prefix) ─ - const dynamicHeaders = { - 'available-channels': channels.join(','), - 'current-session-channel': currentChannel, - 'max-context-load-time': maxContextLoadTime.toString(), - 'time-now': date.toISOString(), - } - - const basicTools = [ - `- ${quote('read')}: read file content`, - supportsImageInput ? `- ${quote('read_media')}: view the media` : null, - `- ${quote('write')}: write file content`, - `- ${quote('list')}: list directory entries`, - `- ${quote('edit')}: replace exact text in a file`, - `- ${quote('exec')}: execute command`, - ] - .filter((line): line is string => Boolean(line)) - .join('\n') - console.log('inbox', inbox) - - return ` ---- -${stringify(staticHeaders)} ---- -You are just woke up. - -**Your text output IS your reply.** Whatever you write goes directly back to the person who messaged you. You do not need any tool to reply — just write. - -${quote(home)} is your HOME — you can read and write files there freely. - -## Basic Tools -${basicTools} - -## Safety -- Keep private data private -- Don't run destructive commands without asking -- When in doubt, ask - -## Core files -- ${quote('IDENTITY.md')}: Your identity and personality. -- ${quote('SOUL.md')}: Your soul and beliefs. -- ${quote('TOOLS.md')}: Your tools and methods. -- ${quote('PROFILES.md')}: Profiles of users and groups. -- ${quote('MEMORY.md')}: Your core memory. -- ${quote('memory/YYYY-MM-DD.md')}: Today's memory. - -## Memory - -You wake up fresh each session. These files are your continuity: - -- **Daily notes:** ${quote('memory/YYYY-MM-DD.md')} (create ${quote('memory/')} if needed) — raw logs of what happened -- **Long-term:** ${quote('MEMORY.md')} — your curated memories, like a human's long-term memory - -Use ${quote('search_memory')} to recall earlier conversations beyond the current context window. 
- -### Memory Write Rules (IMPORTANT) - -For ${quote('memory/YYYY-MM-DD.md')}, use canonical markdown entries: - -${block([ - '## Entry mem_20260313_001', - '', - '```yaml', - 'id: mem_20260313_001', - 'created_at: 2026-03-13T13:34:49Z', - 'updated_at: 2026-03-13T13:34:49Z', - 'metadata:', - ' topic: Notes', - '```', - '', - 'What happened / what to remember', -].join('\n'))} - -Rules: -- Only send NEW memory items (do not re-write old content). -- Preserve the canonical entry structure for daily memory files. -- When a memory is about a known user or group from ${quote('PROFILES.md')}, include a stable profile link in ${quote('metadata')} (for example ${quote('profile_ref')}, plus identity fields when available). -- Do not provide ${quote('hash')} (backend generates it). -- If plain text is unavoidable, write concise factual notes only. -- ${quote('MEMORY.md')} stays human-readable markdown (not JSON). - -## How to Respond - -**Direct reply (default):** When someone sends you a message in the current session, just write your response as plain text. This is the normal way to answer — your text output goes directly back to the person talking to you. Do NOT use ${quote('send')} for this. - -**${quote('send')} tool:** ONLY for reaching out to a DIFFERENT channel or conversation — e.g. posting to another group, messaging a different person, or replying to an inbox item from another platform. Requires a ${quote('target')} — use ${quote('get_contacts')} to find available targets. - -**${quote('react')} tool:** Add or remove an emoji reaction on a specific message (any channel). - -**${quote('speak')} tool:** Send a voice message to a DIFFERENT channel. Synthesizes text and delivers as audio. Requires ${quote('target')} — use ${quote('get_contacts')} to find available targets. For speaking in the current conversation, use the ${quote('')} block instead. - -### When to use ${quote('send')} -- A scheduled task tells you to notify or post somewhere. -- You want to forward information to a different group or person. -- You want to reply to an inbox message that came from another channel. -- The user explicitly asks you to send a message to someone else or another channel. - -### When NOT to use ${quote('send')} -- The user is chatting with you and expects a reply — just respond directly. -- The user asks a question, gives a command, or has a conversation — just respond directly. -- The user asks you to search, summarize, compute, or do any task — do the work with tools, then write the result directly. Do NOT use ${quote('send')} to deliver results back to the person who asked. -- If you are unsure, respond directly. Only use ${quote('send')} when the destination is clearly a different target. - -**Common mistake:** User says "search for X" → you search → then you use ${quote('send')} to post the result back to the same conversation. This is WRONG. Just write the result as your reply. - -## Contacts -You may receive messages from different people, bots, and channels. Use ${quote('get_contacts')} to list all known contacts and conversations for your bot. -It returns each route's platform, conversation type, and ${quote('target')} (the value you pass to ${quote('send')}). - -## Your Inbox -Your inbox contains notifications from: -- Group conversations where you were not directly mentioned. -- Other connected platforms (email, etc.). - -Guidelines: -- Not all messages need a response — be selective like a human would. 
-- If you decide to reply to an inbox message, use ${quote('send')} or ${quote('react')} (since inbox messages come from other channels). -- Sometimes an emoji reaction is better than a long reply. - -## Attachments - -**Receiving**: Uploaded files are saved to your workspace; the file path appears in the message header. - -**Sending via ${quote('send')} tool**: Pass file paths or URLs in the ${quote('attachments')} parameter. Example: ${quote('attachments: ["' + home + '/media/ab/file.jpg", "https://example.com/img.png"]')} - -**Sending in direct responses**: Use this format: - -${block([ - '', - `- ${home}/path/to/file.pdf`, - `- ${home}/path/to/video.mp4`, - '- https://example.com/image.png', - '', -].join('\n'))} - -Rules: -- One path or URL per line, prefixed by ${quote('- ')} -- No extra text inside ${quote('...')} -- The block can appear anywhere in your response; it will be parsed and stripped from visible text - -## Reactions - -To react with an emoji to the message you are replying to, use this format in your direct response: - -${block([ - '', - '- 👍', - '', -].join('\n'))} - -Rules: -- One emoji per line, prefixed by ${quote('- ')} -- The block can appear anywhere in your response; it will be parsed and stripped from visible text -- This reacts to the **source message** of the current conversation (the message you are responding to) -- For reacting to messages in other channels or removing reactions, use the ${quote('react')} tool instead - -## Speech - -To speak aloud in the current conversation (text-to-speech), use this format in your direct response: - -${block([ - '', - 'The text you want to say aloud.', - '', -].join('\n'))} - -Rules: -- Content is the text to synthesize (max 500 characters) -- The block can appear anywhere in your response; it will be parsed and stripped from visible text -- For sending voice to a DIFFERENT channel, use the ${quote('speak')} tool instead - -## Schedule Tasks - -You can create and manage schedule tasks via cron. -Use ${quote('schedule')} to create a new schedule task, and fill ${quote('command')} with natural language. -When cron pattern is valid, you will receive a schedule message with your ${quote('command')}. - -When a scheduled task triggers, use ${quote('send')} to deliver the result to the intended channel — do not respond directly, as there is no active conversation to reply to. - -## Heartbeat — Be Proactive - -You may receive periodic **heartbeat** messages — automatic system-triggered turns that let you proactively check on things without the user asking. - -### The HEARTBEAT_OK Contract -- If nothing needs attention, reply with exactly ${quote('HEARTBEAT_OK')}. The system will suppress this message — the user will not see it. -- If something needs attention, use ${quote('send')} to deliver alerts to the appropriate channel. Your text output in heartbeat turns is NOT sent to the user directly. - -### HEARTBEAT.md -${quote('/data/HEARTBEAT.md')} is your checklist file. The system will read it automatically and include its content in the heartbeat message. You are free to edit this file — add short checklists, reminders, or periodic tasks. Keep it small to limit token usage. 
- -### When to Reach Out (use ${quote('send')}) -- Important messages or notifications arrived -- Upcoming events or deadlines (< 2 hours) -- Something interesting or actionable you discovered -- A monitored task changed status - -### When to Stay Quiet (${quote('HEARTBEAT_OK')}) -- Late night hours unless truly urgent -- Nothing new since last check -- The user is clearly busy or in a conversation -- You just checked recently and nothing changed - -### Proactive Work (no need to ask) -During heartbeats you can freely: -- Read, organize, and update your memory files -- Check on ongoing projects (git status, file changes, etc.) -- Update ${quote('HEARTBEAT.md')} to refine your own checklist -- Clean up or archive old notes - -### Heartbeat vs Schedule: When to Use Each -- **Heartbeat**: batch multiple periodic checks together (inbox + calendar + notifications), timing can drift slightly, needs conversational context. -- **Schedule (cron)**: exact timing matters, task needs isolation, one-shot reminders, output should go directly to a channel. - -**Tip:** Batch similar periodic checks into ${quote('HEARTBEAT.md')} instead of creating multiple schedule tasks. Use schedule for precise timing and standalone tasks. - -## Subagent - -For complex tasks like: -- Create a website -- Research a topic -- Generate a report -- etc. - -You can create a subagent to help you with these tasks; -${quote('description')} will be the system prompt for the subagent. - -${files.map(formatSystemFile).join('\n\n')} - -## Skills -${skills.length} skills available via ${quote('use_skill')}: -${skills.map(skill => `- ${skill.name}: ${skill.description}`).join('\n')} - -${enabledSkills.map(skill => skillPrompt(skill)).join('\n\n---\n\n')} - -${formatInbox(inbox)} - - -${stringify(dynamicHeaders)} - - -Context window covers the last ${maxContextLoadTime} minutes (${(maxContextLoadTime / 60).toFixed(2)} hours). - -Current session channel: ${quote(currentChannel)}. Messages from other channels will include a ${quote('channel')} header.
- - `.trim() -} diff --git a/packages/agent/src/prompts/utils.ts b/packages/agent/src/prompts/utils.ts deleted file mode 100644 index a3707a88..00000000 --- a/packages/agent/src/prompts/utils.ts +++ /dev/null @@ -1,7 +0,0 @@ -export const quote = (content: string) => { - return `\`${content}\`` -} - -export const block = (content: string, tag: string = '') => { - return `\`\`\`${tag}\n${content}\n\`\`\`` -} \ No newline at end of file diff --git a/packages/agent/src/sential.test.ts b/packages/agent/src/sential.test.ts deleted file mode 100644 index 603a183e..00000000 --- a/packages/agent/src/sential.test.ts +++ /dev/null @@ -1,265 +0,0 @@ -import { describe, expect, it } from 'vitest' -import { - createSential, - createTextLoopGuard, - createToolLoopGuard, -} from './sential' - -describe('sential', () => { - it('does not hit when overlap stays low', () => { - const sential = createSential() - sential.inspect('ABCDEFGHIJKLMNO') - - const result = sential.inspect('qrstuvwxyz12345') - - expect(result.hit).toBe(false) - expect(result.overlap).toBe(0) - }) - - it('hits when overlap is above threshold', () => { - const sential = createSential() - sential.inspect('0123456789abcdefghij0123456789abcdefghij') - - const result = sential.inspect('0123456789abcdefghij') - - expect(result.hit).toBe(true) - expect(result.overlap).toBeGreaterThan(0.75) - }) - - it('does not hit when overlap is exactly threshold', () => { - const sential = createSential({ - ngramSize: 1, - overlapThreshold: 0.75, - }) - sential.inspect('aaaaaaaaaa') - - const result = sential.inspect(`${'a'.repeat(15)}bcdef`) - - expect(result.newGrams).toBe(20) - expect(result.matchedGrams).toBe(15) - expect(result.overlap).toBeCloseTo(0.75, 10) - expect(result.hit).toBe(false) - }) - - it('evicts old grams with sliding window', () => { - const sential = createSential({ - windowSize: 20, - }) - sential.inspect('abcdefghijabcdefghij') - sential.inspect('KLMNOPQRST') - sential.inspect('UVWXYZ1234') - - const result = sential.inspect('abcdefghij') - - expect(result.hit).toBe(false) - expect(result.matchedGrams).toBe(0) - }) - - it('aborts only after 10 consecutive hits', () => { - const guard = createTextLoopGuard({ - ngramSize: 1, - overlapThreshold: 0.5, - consecutiveHitsToAbort: 10, - }) - - const seeded = guard.inspect('aaaaaaaaaa') - expect(seeded.hit).toBe(false) - expect(seeded.streak).toBe(0) - expect(seeded.abort).toBe(false) - - for (let i = 1; i <= 9; i += 1) { - const result = guard.inspect('aaaaaaaaaa') - expect(result.hit).toBe(true) - expect(result.streak).toBe(i) - expect(result.abort).toBe(false) - } - - const tenth = guard.inspect('aaaaaaaaaa') - expect(tenth.hit).toBe(true) - expect(tenth.streak).toBe(10) - expect(tenth.abort).toBe(true) - }) - - it('resets streak when a non-hit chunk appears', () => { - const guard = createTextLoopGuard({ - ngramSize: 1, - overlapThreshold: 0.5, - consecutiveHitsToAbort: 10, - }) - - guard.inspect('aaaaaaaaaa') - for (let i = 0; i < 5; i += 1) { - guard.inspect('aaaaaaaaaa') - } - - const miss = guard.inspect('bcdefghijk') - expect(miss.hit).toBe(false) - expect(miss.streak).toBe(0) - expect(miss.abort).toBe(false) - - const hitAgain = guard.inspect('aaaaaaaaaa') - expect(hitAgain.hit).toBe(true) - expect(hitAgain.streak).toBe(1) - expect(hitAgain.abort).toBe(false) - }) - - it('only updates streak when chunk has enough new grams', () => { - const guard = createTextLoopGuard({ - ngramSize: 1, - overlapThreshold: 0.5, - minNewGramsPerChunk: 5, - }) - - guard.inspect('aaaaaaaaaa') - - const 
smallHit = guard.inspect('aaaa') - expect(smallHit.hit).toBe(true) - expect(smallHit.newGrams).toBe(4) - expect(smallHit.streak).toBe(0) - expect(smallHit.abort).toBe(false) - - const countedHit = guard.inspect('aaaaa') - expect(countedHit.hit).toBe(true) - expect(countedHit.newGrams).toBe(5) - expect(countedHit.streak).toBe(1) - expect(countedHit.abort).toBe(false) - - const smallMiss = guard.inspect('cccc') - expect(smallMiss.hit).toBe(false) - expect(smallMiss.newGrams).toBe(4) - expect(smallMiss.streak).toBe(1) - - const countedMiss = guard.inspect('ddddd') - expect(countedMiss.hit).toBe(false) - expect(countedMiss.newGrams).toBe(5) - expect(countedMiss.streak).toBe(0) - }) - - it('warns on first tool-loop breach and aborts on second breach', () => { - const guard = createToolLoopGuard({ - repeatThreshold: 5, - warningsBeforeAbort: 1, - }) - const payload = { - toolName: 'web_fetch', - input: { url: 'https://example.com', requestId: 'r-1' }, - } - - for (let i = 1; i <= 5; i += 1) { - const result = guard.inspect(payload) - expect(result.warn).toBe(false) - expect(result.abort).toBe(false) - expect(result.repeatCount).toBe(i) - } - - const firstBreach = guard.inspect(payload) - expect(firstBreach.warn).toBe(true) - expect(firstBreach.abort).toBe(false) - expect(firstBreach.breachCount).toBe(1) - expect(firstBreach.repeatCount).toBe(0) - - for (let i = 1; i <= 5; i += 1) { - const result = guard.inspect(payload) - expect(result.warn).toBe(false) - expect(result.abort).toBe(false) - expect(result.repeatCount).toBe(i) - } - - const secondBreach = guard.inspect(payload) - expect(secondBreach.warn).toBe(false) - expect(secondBreach.abort).toBe(true) - expect(secondBreach.breachCount).toBe(2) - }) - - it('resets tool-loop repeat count when hash changes', () => { - const guard = createToolLoopGuard({ - repeatThreshold: 5, - warningsBeforeAbort: 1, - }) - - const first = guard.inspect({ - toolName: 'web_fetch', - input: { url: 'https://a.example.com' }, - }) - expect(first.repeatCount).toBe(1) - - const second = guard.inspect({ - toolName: 'web_fetch', - input: { url: 'https://a.example.com' }, - }) - expect(second.repeatCount).toBe(2) - - const changed = guard.inspect({ - toolName: 'web_fetch', - input: { url: 'https://b.example.com' }, - }) - expect(changed.repeatCount).toBe(1) - expect(changed.warn).toBe(false) - expect(changed.abort).toBe(false) - }) - - it('resets tool-loop breach count when hash changes', () => { - const guard = createToolLoopGuard({ - repeatThreshold: 1, - warningsBeforeAbort: 1, - }) - - // First fingerprint reaches warning phase. - guard.inspect({ - toolName: 'web_fetch', - input: { url: 'https://a.example.com' }, - }) - const warned = guard.inspect({ - toolName: 'web_fetch', - input: { url: 'https://a.example.com' }, - }) - expect(warned.warn).toBe(true) - expect(warned.breachCount).toBe(1) - - // Switching fingerprint should restart warning/abort phase. 
- const switched = guard.inspect({ - toolName: 'web_fetch', - input: { url: 'https://b.example.com' }, - }) - expect(switched.warn).toBe(false) - expect(switched.abort).toBe(false) - expect(switched.breachCount).toBe(0) - - const warnedAgain = guard.inspect({ - toolName: 'web_fetch', - input: { url: 'https://b.example.com' }, - }) - expect(warnedAgain.warn).toBe(true) - expect(warnedAgain.abort).toBe(false) - expect(warnedAgain.breachCount).toBe(1) - }) - - it('ignores volatile keys when computing tool-loop hash', () => { - const guard = createToolLoopGuard({ - repeatThreshold: 1, - warningsBeforeAbort: 1, - }) - - const first = guard.inspect({ - toolName: 'web_fetch', - input: { - url: 'https://example.com', - request_id: 'req-1', - updatedAt: '2026-02-28T00:00:00.000Z', - }, - }) - expect(first.warn).toBe(false) - expect(first.abort).toBe(false) - - const second = guard.inspect({ - toolName: 'web_fetch', - input: { - url: 'https://example.com', - request_id: 'req-2', - updatedAt: '2026-02-28T00:01:00.000Z', - }, - }) - expect(second.warn).toBe(true) - expect(second.abort).toBe(false) - }) -}) diff --git a/packages/agent/src/sential.ts b/packages/agent/src/sential.ts deleted file mode 100644 index 1a5e5738..00000000 --- a/packages/agent/src/sential.ts +++ /dev/null @@ -1,506 +0,0 @@ -import { createHash } from 'node:crypto' - -export interface SentialOptions { - ngramSize?: number; - windowSize?: number; - overlapThreshold?: number; -} - -export interface TextLoopGuardOptions extends SentialOptions { - consecutiveHitsToAbort?: number; - minNewGramsPerChunk?: number; -} - -export interface SentialInspectResult { - hit: boolean; - overlap: number; - matchedGrams: number; - newGrams: number; -} - -export interface TextLoopGuardInspectResult extends SentialInspectResult { - streak: number; - abort: boolean; -} - -export interface ToolLoopInspectInput { - toolName: string; - input: unknown; -} - -export interface ToolLoopInspectResult { - hash: string; - repeatCount: number; - breachCount: number; - warn: boolean; - abort: boolean; -} - -export interface ToolLoopGuardOptions { - repeatThreshold?: number; - warningsBeforeAbort?: number; - volatileKeys?: string[]; -} - -export interface Sential { - inspect(text: string): SentialInspectResult; - reset(): void; -} - -export interface TextLoopGuard { - inspect(text: string): TextLoopGuardInspectResult; - reset(): void; -} - -export interface TextLoopProbeBuffer { - push(text: string): void; - flush(): void; -} - -export interface ToolLoopGuard { - inspect(input: ToolLoopInspectInput): ToolLoopInspectResult; - reset(): void; -} - -const DEFAULT_NGRAM_SIZE = 10 -const DEFAULT_WINDOW_SIZE = 1000 -const DEFAULT_OVERLAP_THRESHOLD = 0.75 -const DEFAULT_CONSECUTIVE_HITS_TO_ABORT = 10 -const DEFAULT_MIN_NEW_GRAMS_PER_CHUNK = 1 -const DEFAULT_TOOL_LOOP_REPEAT_THRESHOLD = 5 -const DEFAULT_TOOL_LOOP_WARNINGS_BEFORE_ABORT = 1 -const DEFAULT_VOLATILE_KEYS = [ - 'toolCallId', - 'tool_call_id', - 'requestId', - 'request_id', - 'traceId', - 'trace_id', - 'spanId', - 'span_id', - 'sessionId', - 'session_id', - 'timestamp', - 'createdAt', - 'created_at', - 'updatedAt', - 'updated_at', - 'expiresAt', - 'expires_at', - 'nonce', -] -const VOLATILE_KEY_SUFFIXES = [ - 'requestid', - 'traceid', - 'sessionid', - 'toolcallid', - 'timestamp', - 'createdat', - 'updatedat', - 'expiresat', -] - -type NormalizedValue = - | null - | string - | number - | boolean - | NormalizedValue[] - | { [key: string]: NormalizedValue }; - -function validatePositiveInt(name: string, value: 
number): number { - if (!Number.isFinite(value) || value <= 0 || !Number.isInteger(value)) { - throw new Error(`${name} must be a positive integer`) - } - return value -} - -function validateThreshold(value: number): number { - if (!Number.isFinite(value) || value < 0 || value > 1) { - throw new Error('overlapThreshold must be between 0 and 1') - } - return value -} - -function normalizeChars(text: string): string[] { - if (!text) return [] - return Array.from(text.normalize('NFC')) -} - -function buildNgram(chars: string[], start: number, size: number): string { - return chars.slice(start, start + size).join('') -} - -function normalizeKeyName(key: string): string { - return key - .trim() - .toLowerCase() - .replace(/[^a-z0-9]/g, '') -} - -function isVolatileKey(key: string, volatileKeySet: Set<string>): boolean { - const normalized = normalizeKeyName(key) - if (!normalized) return false - if (volatileKeySet.has(normalized)) return true - return VOLATILE_KEY_SUFFIXES.some((suffix) => normalized.endsWith(suffix)) -} - -function isPlainObject(value: unknown): value is Record<string, unknown> { - if (value === null || typeof value !== 'object') return false - const prototype = Object.getPrototypeOf(value as object) - return prototype === Object.prototype || prototype === null -} - -function normalizeToolLoopValue( - value: unknown, - volatileKeySet: Set<string>, - seen: WeakSet<object>, -): NormalizedValue | undefined { - if (value === null) return null - - if (typeof value === 'string') return value.normalize('NFC') - if (typeof value === 'boolean') return value - if (typeof value === 'number') { - return Number.isFinite(value) ? value : (String(value) as NormalizedValue) - } - if (typeof value === 'bigint') return value.toString() - if ( - typeof value === 'undefined' || - typeof value === 'function' || - typeof value === 'symbol' - ) { - return undefined - } - - if (value instanceof Date) { - return value.toISOString() - } - - if (Array.isArray(value)) { - return value.map( - (item) => normalizeToolLoopValue(item, volatileKeySet, seen) ?? null, - ) - } - - if (!isPlainObject(value)) { - const maybeRecord = value as { toJSON?: () => unknown } - if (typeof maybeRecord.toJSON === 'function') { - return normalizeToolLoopValue(maybeRecord.toJSON(), volatileKeySet, seen) - } - return String(value) - } - - if (seen.has(value)) { - return '[Circular]' - } - seen.add(value) - - const normalizedObject: { [key: string]: NormalizedValue } = {} - const keys = Object.keys(value).sort() - for (const key of keys) { - if (isVolatileKey(key, volatileKeySet)) { - continue - } - const normalized = normalizeToolLoopValue(value[key], volatileKeySet, seen) - if (normalized !== undefined) { - normalizedObject[key] = normalized - } - } - - seen.delete(value) - return normalizedObject -} - -function computeToolLoopHash( - input: ToolLoopInspectInput, - volatileKeySet: Set<string>, -): string { - const payload = { - toolName: input.toolName.trim(), - input: - normalizeToolLoopValue(input.input, volatileKeySet, new WeakSet()) ?? - null, - } - const serialized = JSON.stringify(payload) - return createHash('sha256').update(serialized).digest('hex') -} - -export function createSential(options: SentialOptions = {}): Sential { - const ngramSize = validatePositiveInt( - 'ngramSize', - options.ngramSize ?? DEFAULT_NGRAM_SIZE, - ) - const windowSize = validatePositiveInt( - 'windowSize', - options.windowSize ?? DEFAULT_WINDOW_SIZE, - ) - const overlapThreshold = validateThreshold( - options.overlapThreshold ??
DEFAULT_OVERLAP_THRESHOLD, - ) - if (windowSize < ngramSize) { - throw new Error('windowSize must be greater than or equal to ngramSize') - } - - const windowChars: string[] = [] - const windowNgramQueue: string[] = [] - const historySet = new Set<string>() - const historyCounts = new Map<string, number>() - - const addHistoryGram = (gram: string) => { - const nextCount = (historyCounts.get(gram) ?? 0) + 1 - historyCounts.set(gram, nextCount) - if (nextCount === 1) { - historySet.add(gram) - } - } - - const removeHistoryGram = (gram: string) => { - const prevCount = historyCounts.get(gram) - if (!prevCount) return - if (prevCount <= 1) { - historyCounts.delete(gram) - historySet.delete(gram) - return - } - historyCounts.set(gram, prevCount - 1) - } - - const pushWindowChar = (char: string) => { - windowChars.push(char) - - if (windowChars.length >= ngramSize) { - const gram = buildNgram( - windowChars, - windowChars.length - ngramSize, - ngramSize, - ) - windowNgramQueue.push(gram) - addHistoryGram(gram) - } - - if (windowChars.length <= windowSize) { - return - } - - windowChars.shift() - const removedGram = windowNgramQueue.shift() - if (removedGram) { - removeHistoryGram(removedGram) - } - } - - return { - inspect(text: string): SentialInspectResult { - const incomingChars = normalizeChars(text) - if (incomingChars.length === 0) { - return { - hit: false, - overlap: 0, - matchedGrams: 0, - newGrams: 0, - } - } - - const contextSize = Math.max(ngramSize - 1, 0) - const contextChars = - contextSize > 0 ? windowChars.slice(-contextSize) : [] - const candidateChars = [...contextChars, ...incomingChars] - - let matchedGrams = 0 - let newGrams = 0 - const contextLength = contextChars.length - - if (candidateChars.length >= ngramSize) { - for (let i = 0; i <= candidateChars.length - ngramSize; i += 1) { - const gramEndIndex = i + ngramSize - 1 - if (gramEndIndex < contextLength) { - continue - } - const gram = buildNgram(candidateChars, i, ngramSize) - newGrams += 1 - if (historySet.has(gram)) { - matchedGrams += 1 - } - } - } - - const overlap = newGrams === 0 ? 0 : matchedGrams / newGrams - const hit = overlap > overlapThreshold - - for (const char of incomingChars) { - pushWindowChar(char) - } - - return { - hit, - overlap, - matchedGrams, - newGrams, - } - }, - reset(): void { - windowChars.length = 0 - windowNgramQueue.length = 0 - historySet.clear() - historyCounts.clear() - }, - } -} - -export function createTextLoopGuard( - options: TextLoopGuardOptions = {}, -): TextLoopGuard { - const consecutiveHitsToAbort = validatePositiveInt( - 'consecutiveHitsToAbort', - options.consecutiveHitsToAbort ?? DEFAULT_CONSECUTIVE_HITS_TO_ABORT, - ) - const minNewGramsPerChunk = validatePositiveInt( - 'minNewGramsPerChunk', - options.minNewGramsPerChunk ??
DEFAULT_MIN_NEW_GRAMS_PER_CHUNK, - ) - const sential = createSential(options) - let streak = 0 - - return { - inspect(text: string): TextLoopGuardInspectResult { - const result = sential.inspect(text) - if (result.newGrams >= minNewGramsPerChunk) { - if (result.hit) { - streak += 1 - } else { - streak = 0 - } - } - return { - ...result, - streak, - abort: streak >= consecutiveHitsToAbort, - } - }, - reset(): void { - sential.reset() - streak = 0 - }, - } -} - -export function createTextLoopProbeBuffer( - chunkSize: number, - inspect: (text: string) => void, -): TextLoopProbeBuffer { - validatePositiveInt('chunkSize', chunkSize) - let chars: string[] = [] - let offset = 0 - - const compact = () => { - if (offset > 0) { - chars = chars.slice(offset) - offset = 0 - } - } - - const inspectChunk = (text: string) => { - if (text.length > 0) { - inspect(text) - } - } - - return { - push(text: string): void { - if (!text) return - chars.push(...normalizeChars(text)) - - while (chars.length - offset >= chunkSize) { - const chunk = chars.slice(offset, offset + chunkSize).join('') - offset += chunkSize - inspectChunk(chunk) - } - - // Prevent unbounded front-gaps after many chunks. - if (offset >= chunkSize) { - compact() - } - }, - flush(): void { - if (chars.length - offset > 0) { - const remainder = chars.slice(offset).join('') - inspectChunk(remainder) - } - chars = [] - offset = 0 - }, - } -} - -export function createToolLoopGuard( - options: ToolLoopGuardOptions = {}, -): ToolLoopGuard { - const repeatThreshold = validatePositiveInt( - 'repeatThreshold', - options.repeatThreshold ?? DEFAULT_TOOL_LOOP_REPEAT_THRESHOLD, - ) - const warningsBeforeAbort = validatePositiveInt( - 'warningsBeforeAbort', - options.warningsBeforeAbort ?? DEFAULT_TOOL_LOOP_WARNINGS_BEFORE_ABORT, - ) - const volatileKeySet = new Set(DEFAULT_VOLATILE_KEYS.map(normalizeKeyName)) - for (const key of options.volatileKeys ?? []) { - const normalizedKey = normalizeKeyName(key) - if (normalizedKey) { - volatileKeySet.add(normalizedKey) - } - } - - let lastHash = '' - let repeatCount = 0 - let breachCount = 0 - let breachHash = '' - - return { - inspect(input: ToolLoopInspectInput): ToolLoopInspectResult { - const hash = computeToolLoopHash(input, volatileKeySet) - - if (hash === lastHash) { - repeatCount += 1 - } else { - lastHash = hash - repeatCount = 1 - } - - // Breach phase is fingerprint-specific: switching tool signature restarts it. - if (breachHash !== hash) { - breachHash = hash - breachCount = 0 - } - - let warn = false - let abort = false - if (repeatCount > repeatThreshold) { - if (breachCount < warningsBeforeAbort) { - breachCount += 1 - warn = true - // Reset consecutive accumulation after first warning. 
- lastHash = '' - repeatCount = 0 - } else { - breachCount += 1 - abort = true - } - } - - return { - hash, - repeatCount, - breachCount, - warn, - abort, - } - }, - reset(): void { - lastHash = '' - repeatCount = 0 - breachCount = 0 - breachHash = '' - }, - } -} diff --git a/packages/agent/src/tool-loop.test.ts b/packages/agent/src/tool-loop.test.ts deleted file mode 100644 index 8d481db6..00000000 --- a/packages/agent/src/tool-loop.test.ts +++ /dev/null @@ -1,93 +0,0 @@ -import { describe, expect, it, vi } from 'vitest' -import type { ToolSet } from 'ai' -import { createToolLoopGuardedTools } from './tool-loop' - -describe('tool loop guarded tools', () => { - it('preserves promised async-iterable tool outputs', async () => { - const onAbortToolCall = vi.fn() - const streamedChunks = ['chunk-1', 'chunk-2'] - const stream = { - async *[Symbol.asyncIterator]() { - for (const chunk of streamedChunks) { - yield chunk - } - }, - } - - const baseTools = { - streamy: { - execute: async () => stream, - }, - } as unknown as ToolSet - - const tools = createToolLoopGuardedTools(baseTools, { - repeatThreshold: 1, - warningsBeforeAbort: 1, - onAbortToolCall, - warningKey: '__warn', - warningText: 'loop warning', - }) - - const output = await tools.streamy.execute?.({ value: 'same' } as never, { toolCallId: 't-stream' } as never) - - expect(output).toBe(stream) - const received: string[] = [] - for await (const chunk of output as AsyncIterable<string>) { - received.push(chunk) - } - expect(received).toEqual(streamedChunks) - expect(onAbortToolCall).not.toHaveBeenCalled() - }) - - it('defers abort to stream layer when onAbortToolCall is provided', async () => { - const onAbortToolCall = vi.fn() - const baseTools = { - echo: { - execute: async (input: unknown) => ({ result: input }), - }, - } as unknown as ToolSet - const tools = createToolLoopGuardedTools(baseTools, { - repeatThreshold: 1, - warningsBeforeAbort: 1, - onAbortToolCall, - warningKey: '__warn', - warningText: 'loop warning', - }) - - await tools.echo.execute?.({ value: 'same' } as never, { toolCallId: 't-1' } as never) - const warned = await tools.echo.execute?.({ value: 'same' } as never, { toolCallId: 't-1' } as never) - expect(warned).toMatchObject({ - __warn: { - marker: 'MEMOH_TOOL_LOOP_WARNING', - }, - }) - await tools.echo.execute?.({ value: 'same' } as never, { toolCallId: 't-1' } as never) - const abortedOutput = await tools.echo.execute?.({ value: 'same' } as never, { toolCallId: 't-1' } as never) - - expect(onAbortToolCall).toHaveBeenCalledWith('t-1') - expect(abortedOutput).toEqual({ result: { value: 'same' } }) - }) - - it('reports abort via callback without throwing inside tool execution', async () => { - const onAbortToolCall = vi.fn() - const baseTools = { - echo: { - execute: async (input: unknown) => ({ result: input }), - }, - } as unknown as ToolSet - const tools = createToolLoopGuardedTools(baseTools, { - repeatThreshold: 1, - warningsBeforeAbort: 1, - onAbortToolCall, - warningKey: '__warn', - warningText: 'loop warning', - }) - - await tools.echo.execute?.({ value: 'same' } as never, { toolCallId: 't-2' } as never) - await tools.echo.execute?.({ value: 'same' } as never, { toolCallId: 't-2' } as never) - await tools.echo.execute?.({ value: 'same' } as never, { toolCallId: 't-2' } as never) - const abortedOutput = await tools.echo.execute?.({ value: 'same' } as never, { toolCallId: 't-2' } as never) - expect(onAbortToolCall).toHaveBeenCalledWith('t-2') - expect(abortedOutput).toEqual({ result: { value: 'same' } }) - }) -})
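The tests above pin down the loop-guard semantics that the in-process Go agent's sential logic now mirrors. For reference, here is a minimal usage sketch of the deleted text-loop API; the chunk strings and thresholds are illustrative only, not production values:

```ts
import { createTextLoopGuard } from './sential'

// The streak only advances on chunks that contribute at least
// minNewGramsPerChunk new n-grams; any non-hit chunk resets it.
const guard = createTextLoopGuard({
  ngramSize: 1,
  overlapThreshold: 0.5,
  consecutiveHitsToAbort: 3,
})

guard.inspect('aaaaaaaaaa') // seeds history; the first chunk can never hit
for (let i = 0; i < 3; i += 1) {
  const result = guard.inspect('aaaaaaaaaa') // overlap 1.0 > threshold => hit
  if (result.abort) {
    // The third consecutive hit reaches consecutiveHitsToAbort.
    console.log('text loop detected', result.streak, result.overlap)
  }
}
```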
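The tool-loop guard follows the warn-then-abort contract exercised by the tests above: a repeated (toolName, input) fingerprint first earns one injected warning, and the next breach of the same fingerprint aborts. A sketch under the same caveats (payload values are illustrative; volatile keys such as `request_id` are stripped from the fingerprint before hashing):

```ts
import { createToolLoopGuard } from './sential'

const guard = createToolLoopGuard({ repeatThreshold: 2, warningsBeforeAbort: 1 })
const payload = { toolName: 'web_fetch', input: { url: 'https://example.com' } }

guard.inspect(payload)                // repeatCount 1
guard.inspect(payload)                // repeatCount 2 (at threshold, still silent)
const warned = guard.inspect(payload) // exceeds threshold => first breach
console.log(warned.warn, warned.breachCount) // true, 1; repeat counter resets

guard.inspect(payload)                 // repeatCount 1 again
guard.inspect(payload)                 // repeatCount 2
const aborted = guard.inspect(payload) // second breach of the same fingerprint
console.log(aborted.abort, aborted.breachCount) // true, 2
```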
diff --git a/packages/agent/src/tool-loop.ts b/packages/agent/src/tool-loop.ts deleted file mode 100644 index ab8dfc08..00000000 --- a/packages/agent/src/tool-loop.ts +++ /dev/null @@ -1,122 +0,0 @@ -import { ToolExecutionOptions, ToolSet } from 'ai' -import { createToolLoopGuard, type ToolLoopInspectResult } from './sential' - -export interface CreateToolLoopGuardedToolsOptions { - repeatThreshold: number - warningsBeforeAbort: number - onAbortToolCall: (toolCallId: string) => void - warningKey: string - warningText: string -} - -const isRecord = (value: unknown): value is Record<string, unknown> => { - return value !== null && typeof value === 'object' && !Array.isArray(value) -} - -const isAsyncIterable = (value: unknown): value is AsyncIterable<unknown> => { - return ( - value !== null && - typeof value === 'object' && - Symbol.asyncIterator in value - ) -} - -const injectToolLoopWarning = ( - output: unknown, - inspectResult: ToolLoopInspectResult, - warningKey: string, - warningText: string, -): unknown => { - // Keep warning payload structured so UI/consumers can detect and render it. - const warningPayload = { - marker: 'MEMOH_TOOL_LOOP_WARNING', - message: warningText, - fingerprint: inspectResult.hash, - breachCount: inspectResult.breachCount, - } - if (isRecord(output)) { - return { - ...output, - [warningKey]: warningPayload, - } - } - return { - [warningKey]: warningPayload, - result: output, - } -} - -export function createToolLoopGuardedTools( - tools: ToolSet, - { - repeatThreshold, - warningsBeforeAbort, - onAbortToolCall, - warningKey, - warningText, - }: CreateToolLoopGuardedToolsOptions, -): ToolSet { - const guard = createToolLoopGuard({ - repeatThreshold, - warningsBeforeAbort, - }) - - // Wrap each executable tool to inspect (toolName + input) after execution. - // First breach injects a warning into this tool result; second breach signals abort. - return Object.fromEntries( - Object.entries(tools).map(([toolName, toolDefinition]) => { - const execute = toolDefinition.execute - if (typeof execute !== 'function') { - return [toolName, toolDefinition] - } - - const wrappedTool = { - ...toolDefinition, - execute: ( - toolInput: unknown, - options: ToolExecutionOptions, - ) => { - const directOutput = execute( - toolInput as never, - options as never, - ) as unknown - - // Streamed tool outputs are passed through unchanged to preserve streaming semantics. - if (isAsyncIterable(directOutput)) { - return directOutput as never - } - - return (async () => { - const resolvedOutput = await directOutput - - // Tools may return Promise<AsyncIterable>; keep that stream untouched too. - if (isAsyncIterable(resolvedOutput)) { - return resolvedOutput as never - } - - const inspectResult = guard.inspect({ - toolName, - input: toolInput, - }) - if (inspectResult.abort) { - // Report loop abort to generation layer; it decides when/how to stop.
- onAbortToolCall(options.toolCallId) - return resolvedOutput as never - } - if (inspectResult.warn) { - return injectToolLoopWarning( - resolvedOutput, - inspectResult, - warningKey, - warningText, - ) as never - } - return resolvedOutput as never - })() - }, - } - - return [toolName, wrappedTool] - }), - ) as ToolSet -} diff --git a/packages/agent/src/tools/index.ts b/packages/agent/src/tools/index.ts deleted file mode 100644 index 75e700c2..00000000 --- a/packages/agent/src/tools/index.ts +++ /dev/null @@ -1 +0,0 @@ -export * from './mcp' diff --git a/packages/agent/src/tools/mcp.ts b/packages/agent/src/tools/mcp.ts deleted file mode 100644 index 062e98a2..00000000 --- a/packages/agent/src/tools/mcp.ts +++ /dev/null @@ -1,101 +0,0 @@ -import { HTTPMCPConnection, MCPConnection, SSEMCPConnection, StdioMCPConnection } from '../types' -import { createMCPClient } from '@ai-sdk/mcp' -import { AuthFetcher } from '../types' -import type { AgentAuthContext } from '../types/agent' - -type MCPToolOptions = { - botId?: string - auth?: AgentAuthContext - fetch?: AuthFetcher -} - -export const getMCPTools = async (connections: MCPConnection[], options: MCPToolOptions = {}) => { - const closeCallbacks: Array<() => Promise<void>> = [] - - const getHTTPTools = async (connection: HTTPMCPConnection) => { - const client = await createMCPClient({ - transport: { - type: 'http', - url: connection.url, - headers: connection.headers, - } - }) - closeCallbacks.push(() => client.close()) - const tools = await client.tools() - return tools - } - - const getSSETools = async (connection: SSEMCPConnection) => { - const client = await createMCPClient({ - transport: { - type: 'sse', - url: connection.url, - headers: connection.headers, - } - }) - closeCallbacks.push(() => client.close()) - const tools = await client.tools() - return tools - } - - const getStdioTools = async (connection: StdioMCPConnection) => { - if (!options.fetch || !options.botId || !options.auth) { - throw new Error('stdio mcp requires auth fetcher and bot id') - } - const response = await options.fetch(`/bots/${options.botId}/mcp-stdio`, { - method: 'POST', - headers: { - 'Content-Type': 'application/json' - }, - body: JSON.stringify({ - name: connection.name, - command: connection.command, - args: connection.args ?? [], - env: connection.env ?? {}, - cwd: connection.cwd ?? '' - }) - }) - if (!response.ok) { - const text = await response.text().catch(() => '') - throw new Error(`mcp-stdio failed: ${response.status} ${text}`) - } - const data = await response.json().catch(() => ({})) as { url?: string } - const rawUrl = typeof data?.url === 'string' ? data.url : '' - if (!rawUrl) { - throw new Error('mcp-stdio response missing url') - } - const baseUrl = options.auth.baseUrl ?? '' - const url = rawUrl.startsWith('http') - ?
rawUrl - : `${baseUrl.replace(/\/$/, '')}/${rawUrl.replace(/^\//, '')}` - return await getHTTPTools({ - type: 'http', - name: connection.name, - url, - headers: { - 'Authorization': `Bearer ${options.auth.bearer}` - } - }) - } - - const toolSets = await Promise.all(connections.map(async (connection) => { - switch (connection.type) { - case 'http': - return getHTTPTools(connection) - case 'sse': - return getSSETools(connection) - case 'stdio': - return getStdioTools(connection) - default: - console.warn('unknown mcp connection type', connection) - return {} - } - })) - - return { - tools: Object.assign({}, ...toolSets), - close: async () => { - await Promise.all(closeCallbacks.map(callback => callback())) - } - } -} \ No newline at end of file diff --git a/packages/agent/src/types/action.ts b/packages/agent/src/types/action.ts deleted file mode 100644 index 8e809291..00000000 --- a/packages/agent/src/types/action.ts +++ /dev/null @@ -1,105 +0,0 @@ -import { LanguageModelUsage, ModelMessage } from 'ai' -import { AgentInput } from './agent' -import { AgentAttachment } from './attachment' -import { ReactionItem } from '../utils/reactions' -import { SpeechItem } from '../utils/speech' - -export interface BaseAction { - type: string - metadata?: Record<string, unknown> -} - -export interface AgentStartAction extends BaseAction { - type: 'agent_start' - input: AgentInput -} - -export interface ReasoningStartAction extends BaseAction { - type: 'reasoning_start' -} - -export interface ReasoningDeltaAction extends BaseAction { - type: 'reasoning_delta' - delta: string -} - -export interface ReasoningEndAction extends BaseAction { - type: 'reasoning_end' -} - -export interface TextStartAction extends BaseAction { - type: 'text_start' -} - -export interface TextDeltaAction extends BaseAction { - type: 'text_delta' - delta: string -} - -export interface AttachmentDeltaAction extends BaseAction { - type: 'attachment_delta' - attachments: AgentAttachment[] -} - -export interface ReactionDeltaAction extends BaseAction { - type: 'reaction_delta' - reactions: ReactionItem[] -} - -export interface SpeechDeltaAction extends BaseAction { - type: 'speech_delta' - speeches: SpeechItem[] -} - -export interface TextEndAction extends BaseAction { - type: 'text_end' -} - -export interface ToolCallStartAction extends BaseAction { - type: 'tool_call_start' - toolName: string - toolCallId: string - input: unknown -} - -export interface ToolCallEndAction extends BaseAction { - type: 'tool_call_end' - toolName: string - toolCallId: string - input: unknown - result: unknown -} - -export interface AgentEndAction extends BaseAction { - type: 'agent_end' - messages: ModelMessage[] - skills: string[] - reasoning: string[] - usage: LanguageModelUsage - usages: (LanguageModelUsage | null)[] -} - -export interface AgentAbortAction extends BaseAction { - type: 'agent_abort' - messages: ModelMessage[] - skills: string[] - reasoning: string[] - usage: LanguageModelUsage | null - usages: (LanguageModelUsage | null)[] -} - -export type AgentStreamAction = - | AgentStartAction - | ReasoningStartAction - | ReasoningDeltaAction - | ReasoningEndAction - | TextStartAction - | TextDeltaAction - | AttachmentDeltaAction - | ReactionDeltaAction - | SpeechDeltaAction - | TextEndAction - | ToolCallStartAction - | ToolCallEndAction - | AgentEndAction - | AgentAbortAction diff --git a/packages/agent/src/types/agent.ts b/packages/agent/src/types/agent.ts deleted file mode 100644 index 6aa6dbab..00000000 --- a/packages/agent/src/types/agent.ts +++ /dev/null @@
-1,66 +0,0 @@ -import { ModelMessage } from 'ai' -import { ModelConfig } from './model' -import { GatewayInputAttachment } from './attachment' -import { MCPConnection } from './mcp' - -export interface IdentityContext { - botId: string - channelIdentityId: string - displayName: string - currentPlatform?: string - replyTarget?: string - conversationType?: string - sessionToken?: string -} - -export interface AgentAuthContext { - bearer: string - baseUrl: string -} - -export interface InboxItem { - id: string - source: string - header: Record<string, string> - content: string - createdAt: string -} - -export interface LoopDetectionConfig { - enabled?: boolean -} - -export interface AgentParams { - model: ModelConfig - language?: string - activeContextTime?: number - mcpConnections?: MCPConnection[] - channels?: string[] - currentChannel?: string - identity?: IdentityContext - auth: AgentAuthContext - skills?: AgentSkill[] - inbox?: InboxItem[] - loopDetection?: LoopDetectionConfig - isSubagent?: boolean -} - -export interface AgentInput { - messages: ModelMessage[] - attachments: GatewayInputAttachment[] - skills: string[] - query: string - signal?: AbortSignal -} - -export interface AgentSkill { - name: string - description: string - content: string - metadata?: Record<string, unknown> -} - -export interface SystemFile { - filename: string - content: string -} diff --git a/packages/agent/src/types/attachment.ts b/packages/agent/src/types/attachment.ts deleted file mode 100644 index 9bce0297..00000000 --- a/packages/agent/src/types/attachment.ts +++ /dev/null @@ -1,38 +0,0 @@ -export type GatewayAttachmentTransport = - | 'inline_data_url' - | 'public_url' - | 'tool_file_ref' - -export interface GatewayInputAttachment { - contentHash?: string - type: string - mime?: string - size?: number - name?: string - transport: GatewayAttachmentTransport - payload: string - metadata?: Record<string, unknown> -} - -export interface BaseAgentAttachment { - type: string - url?: string - name?: string - mime?: string - content_hash?: string - metadata?: Record<string, unknown> -} - -export interface ImageAttachment extends BaseAgentAttachment { - type: 'image' - base64?: string - url?: string - path?: string -} - -export interface ContainerFileAttachment extends BaseAgentAttachment { - type: 'file' - path: string -} - -export type AgentAttachment = ImageAttachment | ContainerFileAttachment \ No newline at end of file diff --git a/packages/agent/src/types/auth.ts b/packages/agent/src/types/auth.ts deleted file mode 100644 index f137970d..00000000 --- a/packages/agent/src/types/auth.ts +++ /dev/null @@ -1,4 +0,0 @@ -export type AuthFetcher = ( - url: string, - options?: RequestInit, -) => Promise<Response> \ No newline at end of file diff --git a/packages/agent/src/types/heartbeat.ts b/packages/agent/src/types/heartbeat.ts deleted file mode 100644 index 61cb3bcd..00000000 --- a/packages/agent/src/types/heartbeat.ts +++ /dev/null @@ -1,3 +0,0 @@ -export interface Heartbeat { - interval: number -} diff --git a/packages/agent/src/types/index.ts b/packages/agent/src/types/index.ts deleted file mode 100644 index 32db5f8b..00000000 --- a/packages/agent/src/types/index.ts +++ /dev/null @@ -1,8 +0,0 @@ -export * from './agent' -export * from './model' -export * from './schedule' -export * from './heartbeat' -export * from './attachment' -export * from './mcp' -export * from './auth' -export * from './action' \ No newline at end of file diff --git a/packages/agent/src/types/mcp.ts b/packages/agent/src/types/mcp.ts deleted file mode 100644 index ab5dc2f5..00000000 ---
a/packages/agent/src/types/mcp.ts +++ /dev/null @@ -1,29 +0,0 @@ -export interface BaseMCPConnection { - type: string - name: string -} - -export interface StdioMCPConnection extends BaseMCPConnection { - type: 'stdio' - command: string - args: string[] - env?: Record<string, string> - cwd?: string -} - -export interface HTTPMCPConnection extends BaseMCPConnection { - type: 'http' - url: string - headers?: Record<string, string> -} - -export interface SSEMCPConnection extends BaseMCPConnection { - type: 'sse' - url: string - headers?: Record<string, string> -} - -export type MCPConnection = - | StdioMCPConnection - | HTTPMCPConnection - | SSEMCPConnection \ No newline at end of file diff --git a/packages/agent/src/types/model.ts b/packages/agent/src/types/model.ts deleted file mode 100644 index 3eb18cfe..00000000 --- a/packages/agent/src/types/model.ts +++ /dev/null @@ -1,33 +0,0 @@ -export enum ClientType { - OpenAIResponses = 'openai-responses', - OpenAICompletions = 'openai-completions', - AnthropicMessages = 'anthropic-messages', - GoogleGenerativeAI = 'google-generative-ai', -} - -export enum ModelInput { - Text = 'text', - Image = 'image', - Audio = 'audio', - Video = 'video', - File = 'file', -} - -export type ReasoningEffort = 'low' | 'medium' | 'high' - -export interface ReasoningConfig { - enabled: boolean - effort: ReasoningEffort -} - -export interface ModelConfig { - apiKey: string - baseUrl: string - modelId: string - clientType: ClientType - input: ModelInput[] - reasoning?: ReasoningConfig -} - -export const hasInputModality = (config: ModelConfig, modality: ModelInput): boolean => - config.input.includes(modality) diff --git a/packages/agent/src/types/schedule.ts b/packages/agent/src/types/schedule.ts deleted file mode 100644 index a8eaeb2f..00000000 --- a/packages/agent/src/types/schedule.ts +++ /dev/null @@ -1,8 +0,0 @@ -export interface Schedule { - id: string - name: string - description: string - pattern: string - maxCalls?: number | null - command: string -} diff --git a/packages/agent/src/utils/attachments.ts b/packages/agent/src/utils/attachments.ts deleted file mode 100644 index bf809bf9..00000000 --- a/packages/agent/src/utils/attachments.ts +++ /dev/null @@ -1,181 +0,0 @@ -import type { AssistantModelMessage, ModelMessage, TextPart } from 'ai' -import type { - AgentAttachment, - ContainerFileAttachment, -} from '../types/attachment' -import type { TagResolver } from './tag-extractor' -import { StreamTagExtractor, extractTagsFromText } from './tag-extractor' - -// --------------------------------------------------------------------------- -// Helpers -// --------------------------------------------------------------------------- - -/** - * Get a unique key for deduplication of attachments. - */ -const getAttachmentKey = (a: AgentAttachment): string => { - switch (a.type) { - case 'file': return `file:${a.path}` - case 'image': return `image:${(a.base64 ?? a.url ?? '').slice(0, 64)}` - } -} - -/** - * Deduplicate attachments by their key. - */ -export const dedupeAttachments = (attachments: AgentAttachment[]): AgentAttachment[] => { - return Array.from(new Map(attachments.map(a => [getAttachmentKey(a), a])).values()) -} - -/** - * Parse attachment file paths from the inner content of an `<attachments>` block. - * Each line should be formatted as `- /path/to/file`.
- */ -export const parseAttachmentPaths = (content: string): string[] => { - return content - .split('\n') - .map(line => line.trim()) - .map(line => { - if (!line.startsWith('-')) return '' - return line.slice(1).trim() - }) - .filter(Boolean) -} - -// --------------------------------------------------------------------------- -// TagResolver for <attachments> -// --------------------------------------------------------------------------- - -export const attachmentsResolver: TagResolver = { - tag: 'attachments', - parse(content: string): ContainerFileAttachment[] { - const paths = Array.from(new Set(parseAttachmentPaths(content))) - return paths.map((path): ContainerFileAttachment => ({ type: 'file', path })) - }, -} - -// --------------------------------------------------------------------------- -// Batch extraction (backward-compatible wrapper) -// --------------------------------------------------------------------------- - -/** - * Extract all `<attachments>...</attachments>` blocks from a text string. - * Returns the cleaned text (blocks removed) and the parsed file attachments. - */ -export const extractAttachmentsFromText = (text: string): { cleanedText: string; attachments: ContainerFileAttachment[] } => { - const { cleanedText, events } = extractTagsFromText(text, [attachmentsResolver]) - const attachments = events - .filter((e) => e.tag === 'attachments') - .flatMap((e) => e.data as ContainerFileAttachment[]) - return { - cleanedText, - attachments: dedupeAttachments(attachments) as ContainerFileAttachment[], - } -} - -// --------------------------------------------------------------------------- -// Message-level stripping -// --------------------------------------------------------------------------- - -/** - * Type guard: checks whether a content part is a TextPart. - */ -const isTextPart = (part: unknown): part is TextPart => { - return ( - typeof part === 'object' && - part !== null && - (part as Record<string, unknown>).type === 'text' && - typeof (part as Record<string, unknown>).text === 'string' - ) -} - -/** - * Strip all registered tag blocks from assistant messages in a message list. - * Accepts additional resolvers to strip beyond `<attachments>` (e.g. `<reactions>`). - * Returns the cleaned messages and a deduplicated list of attachments found.
- */ -export const stripAttachmentsFromMessages = ( - messages: ModelMessage[], - extraResolvers: TagResolver[] = [], -): { messages: ModelMessage[]; attachments: ContainerFileAttachment[] } => { - const allAttachments: ContainerFileAttachment[] = [] - const resolvers: TagResolver[] = [attachmentsResolver, ...extraResolvers] - - const cleanText = (text: string): string => { - const { cleanedText, events } = extractTagsFromText(text, resolvers) - const attachments = events - .filter((e) => e.tag === 'attachments') - .flatMap((e) => e.data as ContainerFileAttachment[]) - allAttachments.push(...attachments) - return cleanedText - } - - const stripped = messages.map((msg): ModelMessage => { - if (msg.role !== 'assistant') return msg - - const assistantMsg = msg as AssistantModelMessage - const { content } = assistantMsg - - if (typeof content === 'string') { - return { ...assistantMsg, content: cleanText(content) } - } - - if (Array.isArray(content)) { - const newParts = content.map(part => { - if (!isTextPart(part)) return part - return { ...part, text: cleanText(part.text) } - }) - return { ...assistantMsg, content: newParts } - } - - return msg - }) - - return { - messages: stripped, - attachments: dedupeAttachments(allAttachments) as ContainerFileAttachment[], - } -} - -// --------------------------------------------------------------------------- -// Streaming extractor (backward-compatible wrapper) -// --------------------------------------------------------------------------- - -export interface AttachmentsStreamResult { - visibleText: string - attachments: ContainerFileAttachment[] -} - -/** - * Backward-compatible streaming extractor that delegates to {@link StreamTagExtractor}. - * Intercepts `<attachments>...</attachments>` blocks from a stream of text deltas. - */ -export class AttachmentsStreamExtractor { - private inner: StreamTagExtractor - - constructor() { - this.inner = new StreamTagExtractor([attachmentsResolver]) - } - - push(delta: string): AttachmentsStreamResult { - const { visibleText, events } = this.inner.push(delta) - const attachments = events - .filter((e) => e.tag === 'attachments') - .flatMap((e) => e.data as ContainerFileAttachment[]) - return { - visibleText, - attachments: dedupeAttachments(attachments) as ContainerFileAttachment[], - } - } - - flushRemainder(): AttachmentsStreamResult { - const { visibleText, events } = this.inner.flushRemainder() - const attachments = events - .filter((e) => e.tag === 'attachments') - .flatMap((e) => e.data as ContainerFileAttachment[]) - return { - visibleText, - attachments: dedupeAttachments(attachments) as ContainerFileAttachment[], - } - } -} diff --git a/packages/agent/src/utils/fs.ts b/packages/agent/src/utils/fs.ts deleted file mode 100644 index 0ab57080..00000000 --- a/packages/agent/src/utils/fs.ts +++ /dev/null @@ -1,207 +0,0 @@ -import type { AuthFetcher } from '../types' - -// ---------- types ---------- - -export interface FSFileInfo { - name: string - path: string - size: number - mode: string - modTime: string - isDir: boolean -} - -export interface FSListResponse { - path: string - entries: FSFileInfo[] -} - -export interface FSReadResponse { - path: string - content: string - size: number -} - -export interface FSWriteParams { - path: string - content: string -} - -export interface FSUploadResponse { - path: string - size: number -} - -export interface FSMkdirParams { - path: string -} - -export interface FSDeleteParams { - path: string - recursive?: boolean -} - -export interface FSRenameParams { - oldPath: string - newPath: string
-} - -export interface FSOkResponse { - ok: boolean -} - -export interface FSClientOptions { - fetch: AuthFetcher - botId: string -} - -// ---------- helpers ---------- - -const encodeQuery = (path: string) => encodeURIComponent(path) - -const ensureOk = async (response: Response, action: string): Promise<void> => { - if (!response.ok) { - const text = await response.text().catch(() => '') - throw new Error(`fs ${action} failed (${response.status}): ${text}`) - } -} - -// ---------- public API ---------- - -/** - * Creates a set of filesystem utility functions that operate on a bot's - * container via the REST file-manager API. - * - * All functions use `AuthFetcher` so auth headers are injected automatically. - */ -export const createFS = ({ fetch, botId }: FSClientOptions) => { - const base = `/bots/${botId}/container/fs` - - /** Get file or directory metadata. */ - const stat = async (path: string): Promise<FSFileInfo> => { - const response = await fetch(`${base}?path=${encodeQuery(path)}`) - await ensureOk(response, 'stat') - return response.json() as Promise<FSFileInfo> - } - - /** List directory contents. */ - const list = async (path: string): Promise<FSListResponse> => { - const response = await fetch(`${base}/list?path=${encodeQuery(path)}`) - await ensureOk(response, 'list') - return response.json() as Promise<FSListResponse> - } - - /** Read a file as text. */ - const read = async (path: string): Promise<FSReadResponse> => { - const response = await fetch(`${base}/read?path=${encodeQuery(path)}`) - await ensureOk(response, 'read') - return response.json() as Promise<FSReadResponse> - } - - /** Download a file as a binary `Response` (stream-ready). */ - const download = async (path: string): Promise<Response> => { - const response = await fetch(`${base}/download?path=${encodeQuery(path)}`) - await ensureOk(response, 'download') - return response - } - - /** Write text content to a file (creates parent dirs automatically). */ - const write = async (params: FSWriteParams): Promise<FSUploadResponse> => { - const response = await fetch(`${base}/write`, { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify(params), - }) - await ensureOk(response, 'write') - return response.json() as Promise<FSUploadResponse> - } - - /** Upload a binary file via multipart/form-data. */ - const upload = async ( - path: string, - file: Blob | File, - fileName?: string, - ): Promise<FSUploadResponse> => { - const form = new FormData() - form.append('path', path) - form.append('file', file, fileName ?? (file instanceof File ? file.name : 'upload')) - const response = await fetch(`${base}/upload`, { - method: 'POST', - body: form, - }) - await ensureOk(response, 'upload') - return response.json() as Promise<FSUploadResponse> - } - - /** Create a directory (and parents). */ - const mkdir = async (params: FSMkdirParams): Promise<FSOkResponse> => { - const response = await fetch(`${base}/mkdir`, { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify(params), - }) - await ensureOk(response, 'mkdir') - return response.json() as Promise<FSOkResponse> - } - - /** Delete a file or directory. */ - const remove = async (params: FSDeleteParams): Promise<FSOkResponse> => { - const response = await fetch(`${base}/delete`, { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify(params), - }) - await ensureOk(response, 'delete') - return response.json() as Promise<FSOkResponse> - } - - /** Rename or move a file / directory.
*/ - const rename = async (params: FSRenameParams): Promise<FSOkResponse> => { - const response = await fetch(`${base}/rename`, { - method: 'POST', - headers: { 'Content-Type': 'application/json' }, - body: JSON.stringify(params), - }) - await ensureOk(response, 'rename') - return response.json() as Promise<FSOkResponse> - } - - /** Check whether a path exists. */ - const exists = async (path: string): Promise<boolean> => { - try { - await stat(path) - return true - } catch { - return false - } - } - - /** Read a file and return only the text content string. */ - const readText = async (path: string): Promise<string> => { - const result = await read(path) - return result.content - } - - /** Shorthand: write text content to a path. */ - const writeText = async (path: string, content: string): Promise<FSUploadResponse> => { - return write({ path, content }) - } - - return { - stat, - list, - read, - readText, - download, - write, - writeText, - upload, - mkdir, - remove, - rename, - exists, - } -} - -export type FSClient = ReturnType<typeof createFS> - diff --git a/packages/agent/src/utils/headers.ts b/packages/agent/src/utils/headers.ts deleted file mode 100644 index 326b4ffa..00000000 --- a/packages/agent/src/utils/headers.ts +++ /dev/null @@ -1,27 +0,0 @@ -import { AgentAuthContext, IdentityContext } from '../types' - -export interface BuildHeadersOptions { - isSubagent?: boolean -} - -export const buildIdentityHeaders = (identity: IdentityContext, auth: AgentAuthContext, options?: BuildHeadersOptions) => { - const headers: Record<string, string> = { - Authorization: `Bearer ${auth.bearer}`, - } - if (identity.channelIdentityId) { - headers['X-Memoh-Channel-Identity-Id'] = identity.channelIdentityId - } - if (identity.sessionToken) { - headers['X-Memoh-Session-Token'] = identity.sessionToken - } - if (identity.currentPlatform) { - headers['X-Memoh-Current-Platform'] = identity.currentPlatform - } - if (identity.replyTarget) { - headers['X-Memoh-Reply-Target'] = identity.replyTarget - } - if (options?.isSubagent) { - headers['X-Memoh-Is-Subagent'] = 'true' - } - return headers -} \ No newline at end of file diff --git a/packages/agent/src/utils/image-parts.ts b/packages/agent/src/utils/image-parts.ts deleted file mode 100644 index db9c4ae9..00000000 --- a/packages/agent/src/utils/image-parts.ts +++ /dev/null @@ -1,144 +0,0 @@ -import type { ImagePart } from 'ai' -import type { GatewayInputAttachment } from '../types/attachment' - -type NativeImageAttachment = GatewayInputAttachment & { - type: 'image' - transport: 'inline_data_url' | 'public_url' -} - -type ImagePartPayload = string | Uint8Array | URL -const strictBase64Pattern = /^[A-Za-z0-9+/]*={0,2}$/ - -const normalizeMediaType = (value?: string): string | undefined => { - const mediaType = typeof value === 'string' ?
value.trim() : '' - return mediaType || undefined -} - -const createImagePart = (image: ImagePartPayload, mediaType?: string): ImagePart => { - const normalizedMediaType = normalizeMediaType(mediaType) - if (normalizedMediaType == null) { - return { type: 'image', image } - } - return { type: 'image', image, mediaType: normalizedMediaType } -} - -const decodeBase64Strict = (value: string): Buffer | null => { - const normalized = value.replace(/\s+/g, '') - if (normalized === '' || !strictBase64Pattern.test(normalized)) { - return null - } - - const firstPadding = normalized.indexOf('=') - if (firstPadding >= 0) { - if (/[A-Za-z0-9+/]/.test(normalized.slice(firstPadding))) { - return null - } - if (normalized.length-firstPadding > 2 || normalized.length % 4 !== 0) { - return null - } - } - else if (normalized.length % 4 === 1) { - return null - } - - const padded = firstPadding >= 0 - ? normalized - : normalized + '='.repeat((4 - (normalized.length % 4)) % 4) - - const decoded = Buffer.from(padded, 'base64') - const canonical = decoded.toString('base64').replace(/=+$/g, '') - const input = normalized.replace(/=+$/g, '') - if (canonical !== input) { - return null - } - - return decoded -} - -const parseDataUrl = (payload: string): { bytes: Uint8Array; mediaType?: string } | null => { - const trimmed = payload.trim() - if (!trimmed.toLowerCase().startsWith('data:')) { - return null - } - - const commaIndex = trimmed.indexOf(',') - if (commaIndex < 0) { - return null - } - - const header = trimmed.slice(5, commaIndex) - const body = trimmed.slice(commaIndex + 1) - const segments = header.split(';').map((segment) => segment.trim()).filter(Boolean) - const mediaType = normalizeMediaType(segments.find((segment) => segment.includes('/'))) - const isBase64 = segments.some((segment) => segment.toLowerCase() === 'base64') - let buffer: Buffer - if (isBase64) { - const decoded = decodeBase64Strict(body) - if (decoded == null) { - return null - } - buffer = decoded - } - else { - try { - buffer = Buffer.from(decodeURIComponent(body), 'utf8') - } - catch { - return null - } - } - - return { - bytes: new Uint8Array(buffer), - mediaType, - } -} - -const isNativeImageAttachment = ( - attachment: GatewayInputAttachment, -): attachment is NativeImageAttachment => { - if (attachment.type !== 'image') { - return false - } - if (attachment.transport !== 'inline_data_url' && attachment.transport !== 'public_url') { - return false - } - return typeof attachment.payload === 'string' && attachment.payload.trim() !== '' -} - -const createInlineDataImagePart = (payload: string, mediaType?: string): ImagePart => { - const parsed = parseDataUrl(payload) - if (parsed != null) { - return createImagePart(parsed.bytes, mediaType ?? 
parsed.mediaType) - } - return createImagePart(payload, mediaType) -} - -const createPublicURLImagePart = (payload: string, mediaType?: string): ImagePart => { - try { - return createImagePart(new URL(payload), mediaType) - } - catch { - return createImagePart(payload, mediaType) - } -} - -export const createBinaryImagePart = (bytes: Uint8Array, mediaType?: string): ImagePart => { - return createImagePart(bytes, mediaType) -} - -export const createImagePartFromAttachment = ( - attachment: GatewayInputAttachment, -): ImagePart | null => { - if (!isNativeImageAttachment(attachment)) { - return null - } - - const payload = attachment.payload.trim() - switch (attachment.transport) { - case 'public_url': - return createPublicURLImagePart(payload, attachment.mime) - case 'inline_data_url': - return createInlineDataImagePart(payload, attachment.mime) - } -} diff --git a/packages/agent/src/utils/index.ts b/packages/agent/src/utils/index.ts deleted file mode 100644 index 6cd8f45a..00000000 --- a/packages/agent/src/utils/index.ts +++ /dev/null @@ -1,4 +0,0 @@ -export * from './attachments' -export * from './fs' -export * from './headers' -export * from './speech' diff --git a/packages/agent/src/utils/reactions.ts b/packages/agent/src/utils/reactions.ts deleted file mode 100644 index aed42b4c..00000000 --- a/packages/agent/src/utils/reactions.ts +++ /dev/null @@ -1,28 +0,0 @@ -import type { TagResolver } from './tag-extractor' - -export interface ReactionItem { - emoji: string -} - -/** - * Parse emoji entries from the inner content of a `<reactions>` block. - * Each line should be formatted as `- 👍`. - */ -export const parseReactionEmojis = (content: string): string[] => { - return content - .split('\n') - .map(line => line.trim()) - .map(line => { - if (!line.startsWith('-')) return '' - return line.slice(1).trim() - }) - .filter(Boolean) -} - -export const reactionsResolver: TagResolver = { - tag: 'reactions', - parse(content: string): ReactionItem[] { - const emojis = Array.from(new Set(parseReactionEmojis(content))) - return emojis.map((emoji): ReactionItem => ({ emoji })) - }, -} diff --git a/packages/agent/src/utils/read-media-injector.test.ts b/packages/agent/src/utils/read-media-injector.test.ts deleted file mode 100644 index 740eda80..00000000 --- a/packages/agent/src/utils/read-media-injector.test.ts +++ /dev/null @@ -1,118 +0,0 @@ -import { describe, expect, it } from 'vitest' -import { createPrepareStepWithReadMedia } from './read-media-injector' -import { ClientType, ModelInput, ModelConfig } from '../types/model' - -const baseModelConfig: ModelConfig = { - apiKey: 'test', - baseUrl: 'http://example.com', - modelId: 'model', - clientType: ClientType.OpenAIResponses, - input: [ModelInput.Image], -} - -const createToolOptions = (toolCallId: string) => ({ - toolCallId, - messages: [], -}) - -describe('read_media runtime', () => { - it('caches image and injects it into messages', async () => { - const fs = { - download: async () => - new Response(new Uint8Array([1, 2, 3]), { - headers: { 'content-type': 'image/png' }, - }), - } - const { prepareStep, tools } = createPrepareStepWithReadMedia({ - modelConfig: baseModelConfig, - fs, - systemPrompt: 'sys', - }) - const executeReadMedia = tools.read_media.execute!
- const output = await executeReadMedia( - { path: '/data/media/a.png' }, - createToolOptions('call-1'), - ) - expect((output as { ok?: boolean }).ok).toBe(true) - const prepared = await prepareStep({ - messages: [{ role: 'user', content: 'hi' }], - steps: [], - stepNumber: 0, - model: {} as never, - experimental_context: undefined, - }) - const injected = prepared.messages?.[1] - expect(injected?.role).toBe('user') - const content = injected?.content as Array<{ type?: string; image?: Uint8Array; mediaType?: string }> - expect(content?.[0]?.type).toBe('image') - expect(content?.[0]?.image).toBeInstanceOf(Uint8Array) - expect(Array.from(content?.[0]?.image ?? [])).toEqual([1, 2, 3]) - expect(content?.[0]?.mediaType).toBe('image/png') - }) - - it('returns error result on download failure', async () => { - const fs = { - download: async () => { - throw new Error('boom') - }, - } - const { prepareStep, tools } = createPrepareStepWithReadMedia({ - modelConfig: baseModelConfig, - fs, - systemPrompt: 'sys', - }) - const executeReadMedia = tools.read_media.execute! - const output = await executeReadMedia( - { path: '/data/media/a.png' }, - createToolOptions('call-2'), - ) - expect((output as { isError?: boolean }).isError).toBe(true) - const prepared = await prepareStep({ - messages: [{ role: 'user', content: 'hi' }], - steps: [], - stepNumber: 0, - model: {} as never, - experimental_context: undefined, - }) - expect(prepared.messages).toBeUndefined() - }) - - it('preserves tool call order when downloads finish out of order', async () => { - const fs = { - download: async (path: string) => { - const delay = path.includes('a.png') ? 20 : 0 - await new Promise((resolve) => setTimeout(resolve, delay)) - const payload = path.includes('a.png') ? new Uint8Array([1]) : new Uint8Array([2]) - return new Response(payload, { headers: { 'content-type': 'image/png' } }) - }, - } - const { prepareStep, tools } = createPrepareStepWithReadMedia({ - modelConfig: baseModelConfig, - fs, - systemPrompt: 'sys', - }) - const executeReadMedia = tools.read_media.execute! - const first = executeReadMedia( - { path: '/data/media/a.png' }, - createToolOptions('call-1'), - ) - const second = executeReadMedia( - { path: '/data/media/b.png' }, - createToolOptions('call-2'), - ) - await Promise.all([first, second]) - const prepared = await prepareStep({ - messages: [{ role: 'user', content: 'hi' }], - steps: [], - stepNumber: 0, - model: {} as never, - experimental_context: undefined, - }) - const injected = prepared.messages?.[1] - const content = injected?.content as Array<{ type?: string; image?: Uint8Array; mediaType?: string }> - expect(Array.from(content?.[0]?.image ?? [])).toEqual([1]) - expect(Array.from(content?.[1]?.image ?? 
[])).toEqual([2]) - expect(content?.[0]?.mediaType).toBe('image/png') - expect(content?.[1]?.mediaType).toBe('image/png') - }) -}) diff --git a/packages/agent/src/utils/read-media-injector.ts b/packages/agent/src/utils/read-media-injector.ts deleted file mode 100644 index cf9a9446..00000000 --- a/packages/agent/src/utils/read-media-injector.ts +++ /dev/null @@ -1,122 +0,0 @@ -import { ImagePart, PrepareStepFunction, ToolSet, UserModelMessage, tool } from 'ai' -import { z } from 'zod' -import { ModelConfig, ModelInput, hasInputModality } from '../types/model' -import { createBinaryImagePart } from './image-parts' - -const READ_MEDIA_TOOL_NAME = 'read_media' - -const isImageMime = (mime: string): boolean => { - return mime.trim().toLowerCase().startsWith('image/') -} - -type ReadMediaFS = { - download: (path: string) => Promise<Response> -} - -const buildReadMediaToolError = (message: string) => ({ - isError: true, - content: [{ type: 'text', text: message }], - structuredContent: { ok: false, error: message }, -}) - -const loadImageBytes = async ( - fs: ReadMediaFS, - path: string, -): Promise<{ ok: true; bytes: Uint8Array; mime: string } | { ok: false; error: string }> => { - try { - const response = await fs.download(path) - const bytes = new Uint8Array(await response.arrayBuffer()) - const header = response.headers.get('content-type') ?? '' - const mime = header.split(';')[0]?.trim() ?? '' - if (!mime || !isImageMime(mime)) { - return { ok: false, error: 'read_media only supports image files' } - } - return { ok: true, bytes, mime } - } catch (error) { - console.error(error) - const message = error instanceof Error ? error.message : String(error) - return { ok: false, error: `read_media failed to load image: ${message}` } - } -} - -export const createPrepareStepWithReadMedia = (params: { - modelConfig: ModelConfig - fs: ReadMediaFS - systemPrompt: string - basePrepareStep?: PrepareStepFunction -}) => { - const supportsImage = hasInputModality(params.modelConfig, ModelInput.Image) - if (!supportsImage) { - const prepareStep = async (options: Parameters<PrepareStepFunction>[0]) => { - return (params.basePrepareStep ? await params.basePrepareStep(options) : {}) ?? {} - } - return { prepareStep, tools: {} as ToolSet } - } - const cachedImages = new Map<string, ImagePart | null>() - const callOrder: string[] = [] - - const readMediaTool = tool({ - description: 'Load an image file into context so the model can view it.', - inputSchema: z.object({ - path: z.string().describe('Image file path inside the container.'), - }), - execute: async ({ path }, options) => { - const trimmedPath = typeof path === 'string' ? path.trim() : '' - if (!trimmedPath) { - return buildReadMediaToolError('path is required') - } - const toolCallId = typeof options?.toolCallId === 'string' ? options.toolCallId : '' - if (!toolCallId) { - return buildReadMediaToolError('read_media missing toolCallId') - } - if (!cachedImages.has(toolCallId)) { - cachedImages.set(toolCallId, null) - callOrder.push(toolCallId) - } - const loaded = await loadImageBytes(params.fs, trimmedPath) - if (!loaded.ok) { - return buildReadMediaToolError(loaded.error) - } - cachedImages.set(toolCallId, createBinaryImagePart(loaded.bytes, loaded.mime)) - return { ok: true, path: trimmedPath, mime: loaded.mime } - }, - }) - - const prepareStep = async (options: Parameters<PrepareStepFunction>[0]) => { - const base = (params.basePrepareStep ? await params.basePrepareStep(options) : {}) ?? {} - const baseMessages = base.messages ??
options.messages - if (cachedImages.size === 0) { - if (!base.system) { - base.system = params.systemPrompt - } - return base - } - const imageParts = callOrder - .map((toolCallId) => cachedImages.get(toolCallId)) - .filter((part): part is ImagePart => Boolean(part)) - if (imageParts.length === 0) { - if (!base.system) { - base.system = params.systemPrompt - } - return base - } - const injectedMessage: UserModelMessage = { - role: 'user', - content: imageParts, - } - const merged = { - ...base, - messages: [...baseMessages, injectedMessage], - } - if (!merged.system) { - merged.system = params.systemPrompt - } - return merged - } - - const readMediaTools: ToolSet = { - [READ_MEDIA_TOOL_NAME]: readMediaTool, - } - - return { prepareStep, tools: readMediaTools } -} diff --git a/packages/agent/src/utils/speech.ts b/packages/agent/src/utils/speech.ts deleted file mode 100644 index 81518da5..00000000 --- a/packages/agent/src/utils/speech.ts +++ /dev/null @@ -1,16 +0,0 @@ -import type { TagResolver } from './tag-extractor' - -export interface SpeechItem { - text: string -} - -/** - * Parse a `<speech>` block. The entire trimmed content is one synthesis request. - */ -export const speechResolver: TagResolver = { - tag: 'speech', - parse(content: string): SpeechItem[] { - const text = content.trim() - return text ? [{ text }] : [] - }, -} diff --git a/packages/agent/src/utils/tag-extractor.ts b/packages/agent/src/utils/tag-extractor.ts deleted file mode 100644 index 6195645d..00000000 --- a/packages/agent/src/utils/tag-extractor.ts +++ /dev/null @@ -1,183 +0,0 @@ -/** - * Generic extensible tag-interception system. - * - * Register TagResolver instances (e.g. attachments, reactions) and both the - * batch extractor and the streaming state-machine will intercept the - * corresponding `<tag>...</tag>` blocks, stripping them from visible text and - * forwarding the parsed payload through {@link TagEvent} objects. - */ - -// --------------------------------------------------------------------------- -// Public interfaces -// --------------------------------------------------------------------------- - -export interface TagResolver<T = unknown> { - tag: string - parse(content: string): T[] -} - -export interface TagEvent { - tag: string - data: unknown[] -} - -export interface TagStreamResult { - visibleText: string - events: TagEvent[] -} - -// --------------------------------------------------------------------------- -// Batch extractor -// --------------------------------------------------------------------------- - -/** - * Extract all registered tag blocks from a complete text string. - * Returns the cleaned text (blocks removed) and a list of tag events.
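- * - * [Editor's note, illustrative sketch; not part of the original file.] With the - * reactionsResolver defined above, calling - * extractTagsFromText('hi <reactions>\n- 👍\n</reactions>', [reactionsResolver]) - * returns cleanedText 'hi' plus a single event { tag: 'reactions', data: [{ emoji: '👍' }] }.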
- */ -export function extractTagsFromText( - text: string, - resolvers: TagResolver[], -): { cleanedText: string; events: TagEvent[] } { - const events: TagEvent[] = [] - let cleaned = text - for (const resolver of resolvers) { - const open = `<${resolver.tag}>` - const close = `</${resolver.tag}>` - const pattern = new RegExp( - `${escapeRegExp(open)}([\\s\\S]*?)${escapeRegExp(close)}`, - 'g', - ) - cleaned = cleaned.replace(pattern, (_match, inner: string) => { - const parsed = resolver.parse(inner) - if (parsed.length > 0) { - events.push({ tag: resolver.tag, data: parsed }) - } - return '' - }) - } - return { - cleanedText: cleaned.replace(/\n{3,}/g, '\n\n').trim(), - events, - } -} - -// --------------------------------------------------------------------------- -// Streaming extractor -// --------------------------------------------------------------------------- - -interface ResolverMeta { - resolver: TagResolver - openTag: string - closeTag: string -} - -/** - * Incremental state-machine that intercepts multiple `<tag>...</tag>` blocks - * from a stream of text deltas. - * - * Text outside registered blocks is passed through as `visibleText`; completed - * blocks are emitted as {@link TagEvent} entries. - */ -export class StreamTagExtractor { - private metas: ResolverMeta[] - private maxOpenLen: number - private state: 'text' | 'inside' = 'text' - private activeMeta: ResolverMeta | null = null - private buffer = '' - private tagBuffer = '' - - constructor(resolvers: TagResolver[]) { - this.metas = resolvers.map((resolver) => ({ - resolver, - openTag: `<${resolver.tag}>`, - closeTag: `</${resolver.tag}>`, - })) - this.maxOpenLen = Math.max(...this.metas.map((m) => m.openTag.length), 0) - } - - push(delta: string): TagStreamResult { - this.buffer += delta - let visible = '' - const events: TagEvent[] = [] - - while (this.buffer.length > 0) { - if (this.state === 'text') { - let earliestIdx = -1 - let matchedMeta: ResolverMeta | null = null - - for (const meta of this.metas) { - const idx = this.buffer.indexOf(meta.openTag) - if (idx !== -1 && (earliestIdx === -1 || idx < earliestIdx)) { - earliestIdx = idx - matchedMeta = meta - } - } - - if (earliestIdx === -1) { - const keep = Math.min(this.buffer.length, this.maxOpenLen - 1) - const emit = this.buffer.slice(0, this.buffer.length - keep) - visible += emit - this.buffer = this.buffer.slice(this.buffer.length - keep) - break - } - - visible += this.buffer.slice(0, earliestIdx) - this.buffer = this.buffer.slice(earliestIdx + matchedMeta!.openTag.length) - this.tagBuffer = '' - this.activeMeta = matchedMeta - this.state = 'inside' - continue - } - - // state === 'inside' - const closeTag = this.activeMeta!.closeTag - const endIdx = this.buffer.indexOf(closeTag) - if (endIdx === -1) { - const keep = Math.min(this.buffer.length, closeTag.length - 1) - const take = this.buffer.slice(0, this.buffer.length - keep) - this.tagBuffer += take - this.buffer = this.buffer.slice(this.buffer.length - keep) - break - } - - this.tagBuffer += this.buffer.slice(0, endIdx) - const parsed = this.activeMeta!.resolver.parse(this.tagBuffer) - if (parsed.length > 0) { - events.push({ tag: this.activeMeta!.resolver.tag, data: parsed }) - } - this.buffer = this.buffer.slice(endIdx + closeTag.length) - this.tagBuffer = '' - this.activeMeta = null - this.state = 'text' - } - - return { visibleText: visible, events } - } - - /** - * Flush remaining buffered content. Call when the stream ends. - * Unclosed tag blocks are returned as literal `visibleText` to avoid data loss.
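- * - * [Editor's note, behavioral summary; not part of the original file.] While scanning, push() - * always holds back the last maxOpenLen - 1 characters of the buffer (closeTag.length - 1 while - * inside a block), so a tag split across two deltas is never leaked as visible text; - * flushRemainder() then releases whatever is still buffered once the stream ends.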
- */ - flushRemainder(): TagStreamResult { - if (this.state === 'text') { - const out = this.buffer - this.buffer = '' - return { visibleText: out, events: [] } - } - const meta = this.activeMeta! - const out = `${meta.openTag}${this.tagBuffer}${this.buffer}` - this.state = 'text' - this.buffer = '' - this.tagBuffer = '' - this.activeMeta = null - return { visibleText: out, events: [] } - } -} - -// --------------------------------------------------------------------------- -// Helpers -// --------------------------------------------------------------------------- - -function escapeRegExp(s: string): string { - return s.replace(/[.*+?^${}()|[\]\\]/g, '\\$&') -} diff --git a/packages/agent/tsconfig.json b/packages/agent/tsconfig.json deleted file mode 100644 index 94111105..00000000 --- a/packages/agent/tsconfig.json +++ /dev/null @@ -1,22 +0,0 @@ -{ - "compilerOptions": { - "target": "ES2022", - "module": "ESNext", - "lib": ["ES2022"], - "moduleResolution": "bundler", - "allowImportingTsExtensions": true, - "noEmit": true, - "strict": true, - "esModuleInterop": true, - "skipLibCheck": true, - "forceConsistentCasingInFileNames": true, - "resolveJsonModule": true, - "allowSyntheticDefaultImports": true, - "jsx": "react-jsx", - "outDir": "./dist", - "rootDir": "./src", - }, - "include": ["src/**/*"], - "exclude": ["node_modules", "dist"] -} - diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index cb6d795a..c3902fba 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -46,49 +46,6 @@ importers: specifier: ^10.2.0 version: 10.2.0(eslint@9.39.2(jiti@2.6.1)) - apps/agent: - dependencies: - '@elysiajs/bearer': - specifier: ^1.4.2 - version: 1.4.2(elysia@1.4.27(@sinclair/typebox@0.34.47)(@types/bun@1.3.10)(exact-mirror@0.2.6(@sinclair/typebox@0.34.47))(file-type@21.3.0)(openapi-types@12.1.3)(typescript@5.9.3)) - '@elysiajs/cors': - specifier: ^1.4.1 - version: 1.4.1(elysia@1.4.27(@sinclair/typebox@0.34.47)(@types/bun@1.3.10)(exact-mirror@0.2.6(@sinclair/typebox@0.34.47))(file-type@21.3.0)(openapi-types@12.1.3)(typescript@5.9.3)) - '@memoh/agent': - specifier: workspace:* - version: link:../../packages/agent - '@memoh/config': - specifier: workspace:* - version: link:../../packages/config - '@modelcontextprotocol/sdk': - specifier: ^1.25.2 - version: 1.25.2(@cfworker/json-schema@4.1.1)(hono@4.11.4)(zod@4.3.6) - '@mozilla/readability': - specifier: ^0.6.0 - version: 0.6.0 - '@types/turndown': - specifier: ^5.0.6 - version: 5.0.6 - ai: - specifier: ^6.0.25 - version: 6.0.25(zod@4.3.6) - elysia: - specifier: latest - version: 1.4.27(@sinclair/typebox@0.34.47)(@types/bun@1.3.10)(exact-mirror@0.2.6(@sinclair/typebox@0.34.47))(file-type@21.3.0)(openapi-types@12.1.3)(typescript@5.9.3) - toml: - specifier: ^3.0.0 - version: 3.0.0 - turndown: - specifier: ^7.2.2 - version: 7.2.2 - zod: - specifier: ^4.3.5 - version: 4.3.6 - devDependencies: - bun-types: - specifier: latest - version: 1.3.10 - apps/browser: dependencies: '@elysiajs/cors': @@ -265,51 +222,6 @@ importers: specifier: ^3.5.0 version: 3.5.26(typescript@5.9.3) - packages/agent: - dependencies: - '@ai-sdk/anthropic': - specifier: ^3.0.9 - version: 3.0.9(zod@4.3.6) - '@ai-sdk/google': - specifier: ^3.0.6 - version: 3.0.6(zod@4.3.6) - '@ai-sdk/mcp': - specifier: ^1.0.6 - version: 1.0.6(zod@4.3.6) - '@ai-sdk/openai': - specifier: ^3.0.39 - version: 3.0.39(zod@4.3.6) - '@ai-sdk/openai-compatible': - specifier: ^2.0.33 - version: 2.0.33(zod@4.3.6) - '@mozilla/readability': - specifier: ^0.6.0 - version: 0.6.0 - '@types/turndown': - specifier: ^5.0.6 - 
version: 5.0.6 - ai: - specifier: ^6.0.25 - version: 6.0.25(zod@4.3.6) - linkedom: - specifier: ^0.18.12 - version: 0.18.12 - toml: - specifier: ^3.0.0 - version: 3.0.0 - turndown: - specifier: ^7.2.2 - version: 7.2.2 - typescript: - specifier: ^5 - version: 5.9.3 - yaml: - specifier: ^2.8.2 - version: 2.8.2 - zod: - specifier: ^4.3.6 - version: 4.3.6 - packages/cli: dependencies: '@memoh/sdk': @@ -455,68 +367,6 @@ packages: '@acemir/cssom@0.9.31': resolution: {integrity: sha512-ZnR3GSaH+/vJ0YlHau21FjfLYjMpYVIzTD8M8vIEQvIGxeOXyXdzCI140rrCY862p/C/BbzWsjc1dgnM9mkoTA==} - '@ai-sdk/anthropic@3.0.9': - resolution: {integrity: sha512-QBD4qDnwIHd+N5PpjxXOaWJig1aRB43J0PM5ZUe6Yyl9Qq2bUmraQjvNznkuFKy+hMFDgj0AvgGogTiO5TC+qA==} - engines: {node: '>=18'} - peerDependencies: - zod: ^3.25.76 || ^4.1.8 - - '@ai-sdk/gateway@3.0.10': - resolution: {integrity: sha512-sRlPMKd38+fdp2y11USW44c0o8tsIsT6T/pgyY04VXC3URjIRnkxugxd9AkU2ogfpPDMz50cBAGPnMxj+6663Q==} - engines: {node: '>=18'} - peerDependencies: - zod: ^3.25.76 || ^4.1.8 - - '@ai-sdk/google@3.0.6': - resolution: {integrity: sha512-Nr7E+ouWd/bKO9SFlgLnJJ1+fiGHC07KAeFr08faT+lvkECWlxVox3aL0dec8uCgBDUghYbq7f4S5teUrCc+QQ==} - engines: {node: '>=18'} - peerDependencies: - zod: ^3.25.76 || ^4.1.8 - - '@ai-sdk/mcp@1.0.6': - resolution: {integrity: sha512-ybDhfRArYXrA6Lg6JCPgr3FI3h2COPDRLN9DW6mSFITy4eFK3EccC7UyRTJBY+qp7u2qcqUChrOuYFua/HeFPQ==} - engines: {node: '>=18'} - peerDependencies: - zod: ^3.25.76 || ^4.1.8 - - '@ai-sdk/openai-compatible@2.0.33': - resolution: {integrity: sha512-HwptqeUS4vtDyjSSjmKCQExjoQMwPVq0C4pHH18i7c+3CQ0QN81HLvz3BdpULo0n/UtdQwTNISRqx3G5miPZhw==} - engines: {node: '>=18'} - peerDependencies: - zod: ^3.25.76 || ^4.1.8 - - '@ai-sdk/openai@3.0.39': - resolution: {integrity: sha512-EZrs4L6kMkPQhpodagpEvqLSryOIK99WgblN0IsVHr1xhajWizQOZ0XMa7c5JpSYgIjV6u8GCpGV6hS3Mk2Bug==} - engines: {node: '>=18'} - peerDependencies: - zod: ^3.25.76 || ^4.1.8 - - '@ai-sdk/provider-utils@4.0.17': - resolution: {integrity: sha512-oyCeFINTYK0B8ZGUBiQc05G5vytPlKSmTTtm19xfJuUgoi8zkvvRcoPQci4mSnyfpPn2XSFFDfsALG8uGcapfg==} - engines: {node: '>=18'} - peerDependencies: - zod: ^3.25.76 || ^4.1.8 - - '@ai-sdk/provider-utils@4.0.4': - resolution: {integrity: sha512-VxhX0B/dWGbpNHxrKCWUAJKXIXV015J4e7qYjdIU9lLWeptk0KMLGcqkB4wFxff5Njqur8dt8wRi1MN9lZtDqg==} - engines: {node: '>=18'} - peerDependencies: - zod: ^3.25.76 || ^4.1.8 - - '@ai-sdk/provider-utils@4.0.5': - resolution: {integrity: sha512-Ow/X/SEkeExTTc1x+nYLB9ZHK2WUId8+9TlkamAx7Tl9vxU+cKzWx2dwjgMHeCN6twrgwkLrrtqckQeO4mxgVA==} - engines: {node: '>=18'} - peerDependencies: - zod: ^3.25.76 || ^4.1.8 - - '@ai-sdk/provider@3.0.2': - resolution: {integrity: sha512-HrEmNt/BH/hkQ7zpi2o6N3k1ZR1QTb7z85WYhYygiTxOQuaml4CMtHCWRbric5WPU+RNsYI7r1EpyVQMKO1pYw==} - engines: {node: '>=18'} - - '@ai-sdk/provider@3.0.8': - resolution: {integrity: sha512-oGMAgGoQdBXbZqNG0Ze56CHjDZ1IDYOwGYxYjO5KLSlz5HiNQ9udIXsPZ61VWaHGZ5XW/jyjmr6t2xz2jGVwbQ==} - engines: {node: '>=18'} - '@algolia/abtesting@1.12.2': resolution: {integrity: sha512-oWknd6wpfNrmRcH0vzed3UPX0i17o4kYLM5OMITyMVM2xLgaRbIafoxL0e8mcrNNb0iORCJA0evnNDKRYth5WQ==} engines: {node: '>= 14.0.0'} @@ -751,9 +601,6 @@ packages: '@braintree/sanitize-url@7.1.2': resolution: {integrity: sha512-jigsZK+sMF/cuiB7sERuo9V7N9jx+dhmHHnQyDSVdpZwVutaBu7WvNYqMDLSgFgfB30n452TP3vjDAvFC973mA==} - '@cfworker/json-schema@4.1.1': - resolution: {integrity: sha512-gAmrUZSGtKc3AiBL71iNWxDsyUC5uMaKKGdvzYsBoTW/xi42JQHl7eKV2OYzCUqvc+D2RCcf7EXY2iCyFIk6og==} - '@chevrotain/cst-dts-gen@11.1.2': resolution: 
{integrity: sha512-XTsjvDVB5nDZBQB8o0o/0ozNelQtn2KrUVteIHSlPd2VAV2utEb6JzyCJaJ8tGxACR4RiBNWy5uYUHX2eji88Q==} @@ -826,11 +673,6 @@ packages: '@drizzle-team/brocli@0.10.2': resolution: {integrity: sha512-z33Il7l5dKjUgGULTqBsQBQwckHh5AbIuxhdsIxDDiZAzBOrZO6q9ogcWC65kU382AfynTfgNumVcNIjuIua6w==} - '@elysiajs/bearer@1.4.2': - resolution: {integrity: sha512-MK2aCFqnFMqMNSa1e/A6+Ow5uNl5LpKd8K4lCB2LIsyDrI6juxOUHAgqq+esgdSoh3urD1UIMqFC//TsqCQViA==} - peerDependencies: - elysia: '>= 1.4.3' - '@elysiajs/cors@1.4.1': resolution: {integrity: sha512-lQfad+F3r4mNwsxRKbXyJB8Jg43oAOXjRwn7sKUL6bcOW3KjUqUimTS+woNpO97efpzjtDE0tEjGk9DTw8lqTQ==} peerDependencies: @@ -1539,12 +1381,6 @@ packages: peerDependencies: typescript: '>=5.5.3' - '@hono/node-server@1.19.9': - resolution: {integrity: sha512-vHL6w3ecZsky+8P5MD+eFfaGTyCeOHUIFYMGpQGbrBTSmNNoxv0if69rEZ5giu36weC5saFuznL411gRX7bJDw==} - engines: {node: '>=18.14.1'} - peerDependencies: - hono: ^4 - '@humanfs/core@0.19.1': resolution: {integrity: sha512-5DyQ4+1JEUzejeK1JGICcideyfUbGixgS9jNgex5nqkW+cY7WZhxBigmieN5Qnw9ZosSNVC9KQKyb+GUaGyKUA==} engines: {node: '>=18.18.0'} @@ -1765,23 +1601,6 @@ packages: '@microsoft/tsdoc@0.16.0': resolution: {integrity: sha512-xgAyonlVVS+q7Vc7qLW0UrJU7rSFcETRWsqdXZtjzRU8dF+6CkozTK4V4y1LwOX7j8r/vHphjDeMeGI4tNGeGA==} - '@mixmark-io/domino@2.2.0': - resolution: {integrity: sha512-Y28PR25bHXUg88kCV7nivXrP2Nj2RueZ3/l/jdx6J9f8J4nsEGcgX0Qe6lt7Pa+J79+kPiJU3LguR6O/6zrLOw==} - - '@modelcontextprotocol/sdk@1.25.2': - resolution: {integrity: sha512-LZFeo4F9M5qOhC/Uc1aQSrBHxMrvxett+9KLHt7OhcExtoiRN9DKgbZffMP/nxjutWDQpfMDfP3nkHI4X9ijww==} - engines: {node: '>=18'} - peerDependencies: - '@cfworker/json-schema': ^4.1.1 - zod: ^3.25 || ^4.0 - peerDependenciesMeta: - '@cfworker/json-schema': - optional: true - - '@mozilla/readability@0.6.0': - resolution: {integrity: sha512-juG5VWh4qAivzTAeMzvY9xs9HY5rAcr2E4I7tiSSCokRFi7XIZCAu92ZkSTsIj1OPceCifL3cpfteP3pDT9/QQ==} - engines: {node: '>=14.0.0'} - '@opentelemetry/api@1.9.0': resolution: {integrity: sha512-3giAOQvZiH5F9bMlMiv8+GSPMeqg0dbaeo58/0SlA9sxSqZhnUtxzX9/2FzyhS9sWQf5S0GJE0AKBrFqjpeYcg==} engines: {node: '>=8.0.0'} @@ -2294,9 +2113,6 @@ packages: '@types/trusted-types@2.0.7': resolution: {integrity: sha512-ScaPdn1dQczgbl0QFTeTOmVHFULt394XJgOQNoyVhZ6r2vLnMLJfBPd53SB52T/3G36VI1/g2MZaX0cwDuXsfw==} - '@types/turndown@5.0.6': - resolution: {integrity: sha512-ru00MoyeeouE5BX4gRL+6m/BsDfbRayOskWqUvh7CLGW+UXxHQItqALa38kKnOiZPqJrtzJUgAC2+F0rL1S4Pg==} - '@types/unist@3.0.3': resolution: {integrity: sha512-ko/gIFJRv177XgZsZcBwnqJN5x/Gien8qNOn0D5bQU/zAzVf9Zt3BlcUiLqhV9y4ARk0GbT3tnUiPNgnTXzc/Q==} @@ -2370,10 +2186,6 @@ packages: peerDependencies: zod: ^3.24.0 - '@vercel/oidc@3.1.0': - resolution: {integrity: sha512-Fw28YZpRnA3cAHHDlkt7xQHiJ0fcL+NRcIqsocZQUSmbzeIKRpwttJjik5ZGanXP+vlA4SbTg+AbA3bP363l+w==} - engines: {node: '>= 20'} - '@vitejs/plugin-vue@5.2.4': resolution: {integrity: sha512-7Yx/SXSOcQq5HiiV3orevHUFn+pmMB4cgbEkDYgnkUWb0WfeQ/wa2yFv6D5ICiCQOVpjA7vYDXrC7AGO8yjDHA==} engines: {node: ^18.0.0 || >=20.0.0} @@ -2580,10 +2392,6 @@ packages: '@xterm/xterm@6.0.0': resolution: {integrity: sha512-TQwDdQGtwwDt+2cgKDLn0IRaSxYu1tSUjgKarSDkUM0ZNiSRXFpjxEsvc/Zgc5kq5omJ+V0a8/kIM2WD3sMOYg==} - accepts@2.0.0: - resolution: {integrity: sha512-5cvg6CtKwfgdmVqY1WIiXKc3Q1bkRqGLi+2W/6ao+6Y7gu/RCwRuAhGEzh5B4KlszSuTLgZYuqFqo5bImjNKng==} - engines: {node: '>= 0.6'} - acorn-jsx@5.3.2: resolution: {integrity: sha512-rq9s+JNhf0IChjtDXxllJ7g41oZk5SlXtp0LHwyA5cejwn7vKmKp4pPri6YEePv2PU65sAsegbXtIinmDFDXgQ==} peerDependencies: @@ 
-2598,12 +2406,6 @@ packages: resolution: {integrity: sha512-MnA+YT8fwfJPgBx3m60MNqakm30XOkyIoH1y6huTQvC0PwZG7ki8NacLBcrPbNoo8vEZy7Jpuk7+jMO+CUovTQ==} engines: {node: '>= 14'} - ai@6.0.25: - resolution: {integrity: sha512-KErk9JWkRaN4j9Xzxuo+twa0TxcYKdYbrRV8iGktduvUeGb0Yd5seWe3yOfuLGERbDBiKI1ajQz28O2FG3WO5A==} - engines: {node: '>=18'} - peerDependencies: - zod: ^3.25.76 || ^4.1.8 - ajv-draft-04@1.0.0: resolution: {integrity: sha512-mv00Te6nmYbRp5DCwclxtt7yV/joXJPGS7nM+97GdxvuttCOfgI3K4U25zboyeX0O+myI8ERluxQe5wljMmVIw==} peerDependencies: @@ -2629,9 +2431,6 @@ packages: ajv@8.13.0: resolution: {integrity: sha512-PRA911Blj99jR5RMeTunVbNXMF6Lp4vZXnk5GQjcnUWUTsrXtekg/pnmFFI2u/I36Y/2bITGS30GZCXei6uNkA==} - ajv@8.17.1: - resolution: {integrity: sha512-B/gBuNg5SiMTrPkC+A2+cW0RszwxYmn6VYxB/inlBStS5nx6xHIt/ehKRhIMhqusl7a8LjQoZnjCs5vhwxOQ1g==} - algoliasearch@5.46.2: resolution: {integrity: sha512-qqAXW9QvKf2tTyhpDA4qXv1IfBwD2eduSW6tUEBFIfCeE9gn9HQ9I5+MaKoenRuHrzk5sQoNh1/iof8mY7uD6Q==} engines: {node: '>= 14.0.0'} @@ -2709,10 +2508,6 @@ packages: birpc@2.9.0: resolution: {integrity: sha512-KrayHS5pBi69Xi9JmvoqrIgYGDkD6mcSe/i6YKi3w5kekCLzrX4+nawcXqrj2tIp50Kw/mT/s3p+GVK0A0sKxw==} - body-parser@2.2.2: - resolution: {integrity: sha512-oP5VkATKlNwcgvxi0vM0p/D3n2C3EReYVX+DNYs5TjZFn/oQt2j+4sVJtSMr18pdRr8wjTcBl6LoV+FUwzPmNA==} - engines: {node: '>=18'} - boolbase@1.0.0: resolution: {integrity: sha512-JZOSA7Mo9sNGB8+UjSgzdLtokWAky1zbztM3WRLCbZ70/3cTANmQmOdR7y2g+J0e2WXywy1yS468tY+IruqEww==} @@ -2751,10 +2546,6 @@ packages: peerDependencies: esbuild: '>=0.18' - bytes@3.1.2: - resolution: {integrity: sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg==} - engines: {node: '>= 0.8'} - c12@3.3.3: resolution: {integrity: sha512-750hTRvgBy5kcMNPdh95Qo+XUBeGo8C7nsKSmedDmaQI+E0r82DwHeM6vBewDe4rGFbnxoa4V9pw+sPh5+Iz8Q==} peerDependencies: @@ -2771,10 +2562,6 @@ packages: resolution: {integrity: sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==} engines: {node: '>= 0.4'} - call-bound@1.0.4: - resolution: {integrity: sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg==} - engines: {node: '>= 0.4'} - callsites@3.1.0: resolution: {integrity: sha512-P8BjAsXvZS+VIDUI11hHCQEv74YT67YUi5JJFNWIqL235sBmjX4+qx9Muvls5ivyNENctx46xQLQ3aTuE7ssaQ==} engines: {node: '>=6'} @@ -2901,25 +2688,9 @@ packages: resolution: {integrity: sha512-5IKcdX0nnYavi6G7TtOhwkYzyjfJlatbjMjuLSfE2kYT5pMDOilZ4OvMhi637CcDICTmz3wARPoyhqyX1Y+XvA==} engines: {node: ^14.18.0 || >=16.10.0} - content-disposition@1.0.1: - resolution: {integrity: sha512-oIXISMynqSqm241k6kcQ5UwttDILMK4BiurCfGEREw6+X9jkkpEe5T9FZaApyLGGOnFuyMWZpdolTXMtvEJ08Q==} - engines: {node: '>=18'} - - content-type@1.0.5: - resolution: {integrity: sha512-nTjqfcBFEipKdXCv4YDQWCfmcLZKm81ldF0pAopTvyrFGVbcR6P/VAAd5G7N+0tTr8QqiU0tFadD6FK4NtJwOA==} - engines: {node: '>= 0.6'} - convert-source-map@2.0.0: resolution: {integrity: sha512-Kvp459HrV2FEJ1CAsi1Ku+MY3kasH19TFykTz2xWmMeq6bk2NU3XXvfJ+Q61m0xktWwt+1HSYf3JZsTms3aRJg==} - cookie-signature@1.2.2: - resolution: {integrity: sha512-D76uU73ulSXrD1UXF4KE2TMxVVwhsnCgfAyTg9k8P6KGZjlXKrOLe4dJQKI3Bxi5wjesZoFXJWElNWBjPZMbhg==} - engines: {node: '>=6.6.0'} - - cookie@0.7.2: - resolution: {integrity: sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w==} - engines: {node: '>= 0.6'} - cookie@1.1.1: resolution: {integrity: 
sha512-ei8Aos7ja0weRpFzJnEA9UHJ/7XQmqglbRwnf2ATjcB9Wq874VKH9kfjjirM6UhU2/E5fFYadylyhFldcqSidQ==} engines: {node: '>=18'} @@ -2928,10 +2699,6 @@ packages: resolution: {integrity: sha512-7Vv6asjS4gMOuILabD3l739tsaxFQmC+a7pLZm02zyvs8p977bL3zEgq3yDk5rn9B0PbYgIv++jmHcuUab4RhA==} engines: {node: '>=18'} - cors@2.8.5: - resolution: {integrity: sha512-KIHbLJqu73RGr/hnbrO9uBeixNGuvSQjul/jdFvS/KFSIH1hWVd1ng7zOHx+YrEfInLG7q4n6GHQ9cDtxv/P6g==} - engines: {node: '>= 0.10'} - cose-base@1.0.3: resolution: {integrity: sha512-s9whTXInMSgAp/NVXVNuVxVKzGH2qck3aQlVHxDCdAEPgtMKwc4Wq6/QKhgdEdgbLSi9rBTAcPoRa6JpiG4ksg==} @@ -2942,25 +2709,15 @@ packages: resolution: {integrity: sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==} engines: {node: '>= 8'} - css-select@5.2.2: - resolution: {integrity: sha512-TizTzUddG/xYLA3NXodFM0fSbNizXjOKhqiQQwvhlspadZokn1KDy0NZFS0wuEubIYAV5/c1/lAr0TaaFXEXzw==} - css-tree@3.1.0: resolution: {integrity: sha512-0eW44TGN5SQXU1mWSkKwFstI/22X2bG1nYzZTYMAWjylYURhse752YgbE4Cx46AC+bAvI+/dYTPRk1LqSUnu6w==} engines: {node: ^10 || ^12.20.0 || ^14.13.0 || >=15.0.0} - css-what@6.2.2: - resolution: {integrity: sha512-u/O3vwbptzhMs3L1fQE82ZSLHQQfto5gyZzwteVIEyeaY5Fc7R4dapF/BvRoSYFeqfBk4m0V1Vafq5Pjv25wvA==} - engines: {node: '>= 6'} - cssesc@3.0.0: resolution: {integrity: sha512-/Tb/JcjK111nNScGob5MNtsntNM1aCNUDipB/TkwZFhyDrrE47SOx/18wF2bbjgc3ZzCSKW1T5nt5EbFoAz/Vg==} engines: {node: '>=4'} hasBin: true - cssom@0.5.0: - resolution: {integrity: sha512-iKuQcq+NdHqlAcwUY0o/HL69XQrUaQdMjmStJ8JFmUaiiQErlhrmuigkg/CU4E2J0IyUKUrMAgl36TvN67MqTw==} - cssstyle@5.3.7: resolution: {integrity: sha512-7D2EPVltRrsTkhpQmksIu+LxeWAIEk6wRDMJ1qljlv+CKHJM+cJLlfhWIzNA44eAsHXSNe3+vO6DW1yCYx8SuQ==} engines: {node: '>=20'} @@ -3168,10 +2925,6 @@ packages: resolution: {integrity: sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==} engines: {node: '>=0.4.0'} - depd@2.0.0: - resolution: {integrity: sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw==} - engines: {node: '>= 0.8'} - dequal@2.0.3: resolution: {integrity: sha512-0je+qPKHEMohvfRTCEo3CrPG6cAzAYgmzKyxRiYSSDkS6eGJdyVJm7WaYA5ECaAD9wLB2T4EEeymA5aFVcYXCA==} engines: {node: '>=6'} @@ -3190,23 +2943,10 @@ packages: resolution: {integrity: sha512-sSuxWU5j5SR9QQji/o2qMvqRNYRDOcBTgsJ/DeCf4iSN4gW+gNMXM7wFIP+fdXZxoNiAnHUTGjCr+TSWXdRDKg==} engines: {node: '>=0.3.1'} - dom-serializer@2.0.0: - resolution: {integrity: sha512-wIkAryiqt/nV5EQKqQpo3SToSOV9J0DnbJqwK7Wv/Trc92zIAYZ4FlMu+JPFW1DfGFt81ZTCGgDEabffXeLyJg==} - - domelementtype@2.3.0: - resolution: {integrity: sha512-OLETBj6w0OsagBwdXnPdN0cnMfF9opN69co+7ZrbfPGrdpPVNBUj02spi6B1N7wChLQiPn4CSH/zJvXw56gmHw==} - - domhandler@5.0.3: - resolution: {integrity: sha512-cgwlv/1iFQiFnU96XXgROh8xTeetsnJiDsTc7TYCLFd9+/WNkIqPTxiM/8pSd8VIrhXGTf1Ny1q1hquVqDJB5w==} - engines: {node: '>= 4'} - dompurify@3.3.2: resolution: {integrity: sha512-6obghkliLdmKa56xdbLOpUZ43pAR6xFy1uOrxBaIDjT+yaRuuybLjGS9eVBoSR/UPU5fq3OXClEHLJNGvbxKpQ==} engines: {node: '>=20'} - domutils@3.2.2: - resolution: {integrity: sha512-6kZKyUajlDuqlHKVX1w7gyslj9MPIXzIFiz/rGu35uC1wMi+kMhQwGhl4lt9unC9Vb9INnY9Z3/ZA3+FhASLaw==} - dotenv@17.2.3: resolution: {integrity: sha512-JVUnt+DUIzu87TABbhPmNfVdBDt18BLOWjMUFJMSi/Qqg7NTYtabbvSNJGOJ7afbRuv9D/lngizHtP7QyLQ+9w==} engines: {node: '>=12'} @@ -3222,9 +2962,6 @@ packages: echarts@6.0.0: resolution: {integrity: sha512-Tte/grDQRiETQP4xz3iZWSvoHrkCQtwqd6hs+mifXcjrCuo2iKWbajFObuLJVBlDIJlOzgQPd1hsaKt/3+OMkQ==} 
- ee-first@1.1.1: - resolution: {integrity: sha512-WMwm9LhRUo+WUaRN+vRuETqG89IgZphVSNkdFgeb6sS/E4OrDIN7t48CAewSHXc6C8lefD8KKfr5vY61brQlow==} - electron-to-chromium@1.5.267: resolution: {integrity: sha512-0Drusm6MVRXSOJpGbaSVgcQsuB4hEkMpHXaVstcPmhu5LIedxs1xNK/nIxmQIU/RPC0+1/o0AVZfBTkTNJOdUw==} @@ -3252,10 +2989,6 @@ packages: emoji-regex@8.0.0: resolution: {integrity: sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==} - encodeurl@2.0.0: - resolution: {integrity: sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==} - engines: {node: '>= 0.8'} - enhanced-resolve@5.18.4: resolution: {integrity: sha512-LgQMM4WXU3QI+SYgEc2liRgznaD5ojbmY3sb8LxyguVkIg5FxdpTkvk72te2R38/TGKxH634oLxXRGY6d7AP+Q==} engines: {node: '>=10.13.0'} @@ -3272,10 +3005,6 @@ packages: resolution: {integrity: sha512-FDWG5cmEYf2Z00IkYRhbFrwIwvdFKH07uV8dvNy0omp/Qb1xcyCWp2UDtcwJF4QZZvk0sLudP6/hAu42TaqVhQ==} engines: {node: '>=0.12'} - entities@7.0.1: - resolution: {integrity: sha512-TWrgLOFUQTH994YUyl1yT4uyavY5nNB5muff+RtWaqNVCAK408b5ZnnbNAUEWLTCpum9w6arT70i1XdQ4UeOPA==} - engines: {node: '>=0.12'} - error-stack-parser-es@1.0.5: resolution: {integrity: sha512-5qucVt2XcuGMcEGgWI7i+yZpmpByQ8J1lHhcL7PwqCwu9FPP3VUXzT4ltHe5i2z9dePwEHcDVOAfSnHsOlCXRA==} @@ -3327,9 +3056,6 @@ packages: resolution: {integrity: sha512-WUj2qlxaQtO4g6Pq5c29GTcWGDyd8itL8zTlipgECz3JesAiiOKotd8JU6otB3PACgG6xkJUyVhboMS+bje/jA==} engines: {node: '>=6'} - escape-html@1.0.3: - resolution: {integrity: sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow==} - escape-string-regexp@4.0.0: resolution: {integrity: sha512-TtpcNJ3XAzx3Gq8sWRzJaVajRs0uVxA2YAkdb1jm2YkPz4G6egUFAyA3n5vtEIZefPk5Wa4UXbKuS5fKkJWdgA==} engines: {node: '>=10'} @@ -3396,18 +3122,6 @@ packages: resolution: {integrity: sha512-kVscqXk4OCp68SZ0dkgEKVi6/8ij300KBWTJq32P/dYeWTSwK41WyTxalN1eRmA5Z9UU/LX9D7FWSmV9SAYx6g==} engines: {node: '>=0.10.0'} - etag@1.8.1: - resolution: {integrity: sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==} - engines: {node: '>= 0.6'} - - eventsource-parser@3.0.6: - resolution: {integrity: sha512-Vo1ab+QXPzZ4tCa8SwIHJFaSzy4R6SHf7BY79rFBDf0idraZWAkYrDjDj8uWaSm3S2TK+hJ7/t1CEmZ7jXw+pg==} - engines: {node: '>=18.0.0'} - - eventsource@3.0.7: - resolution: {integrity: sha512-CRT1WTyuQoD771GW56XEZFQ/ZoSfWid1alKGDYMmkt2yl8UXrVR4pspqWNEcqKvVIzg6PAltWjxcSSPrboA4iA==} - engines: {node: '>=18.0.0'} - exact-mirror@0.2.6: resolution: {integrity: sha512-7s059UIx9/tnOKSySzUk5cPGkoILhTE4p6ncf6uIPaQ+9aRBQzQjc9+q85l51+oZ+P6aBxh084pD0CzBQPcFUA==} peerDependencies: @@ -3420,16 +3134,6 @@ packages: resolution: {integrity: sha512-knvyeauYhqjOYvQ66MznSMs83wmHrCycNEN6Ao+2AeYEfxUIkuiVxdEa1qlGEPK+We3n0THiDciYSsCcgW/DoA==} engines: {node: '>=12.0.0'} - express-rate-limit@7.5.1: - resolution: {integrity: sha512-7iN8iPMDzOMHPUYllBEsQdWVB6fPDMPqwjBaFrgr4Jgr/+okjvzAy+UHlYYL/Vs0OsOrMkwS6PJDkFlJwoxUnw==} - engines: {node: '>= 16'} - peerDependencies: - express: '>= 4.11' - - express@5.2.1: - resolution: {integrity: sha512-hIS4idWWai69NezIdRt2xFVofaF4j+6INOpJlVOLDO8zXGpUVEVzIYk12UUi2JzjEzWL3IOAxcTubgz9Po0yXw==} - engines: {node: '>= 18'} - exsolve@1.0.8: resolution: {integrity: sha512-LmDxfWXwcTArk8fUEnOfSZpHOJ6zOMUJKOtFLFqJLoKJetuQG874Uc7/Kki7zFLzYybmZhp1M7+98pfMqeX8yA==} @@ -3445,9 +3149,6 @@ packages: fast-levenshtein@2.0.6: resolution: {integrity: 
sha512-DCXu6Ifhqcks7TZKY3Hxp3y6qphY5SJZmrWMDrKcERSOXWQdMhU9Ig/PYrzyw/ul9jOIyh0N4M0tbC5hodg8dw==} - fast-uri@3.1.0: - resolution: {integrity: sha512-iPeeDKJSWf4IEOasVVrknXpaBV0IApz/gp7S2bb7Z4Lljbl2MGJRqInZiUrQwV16cpzw/D3S5j5Julj/gT52AA==} - fdir@6.5.0: resolution: {integrity: sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg==} engines: {node: '>=12.0.0'} @@ -3465,10 +3166,6 @@ packages: resolution: {integrity: sha512-8kPJMIGz1Yt/aPEwOsrR97ZyZaD1Iqm8PClb1nYFclUCkBi0Ma5IsYNQzvSFS9ib51lWyIw5mIT9rWzI/xjpzA==} engines: {node: '>=20'} - finalhandler@2.1.1: - resolution: {integrity: sha512-S8KoZgRZN+a5rNwqTxlZZePjT/4cnm0ROV70LedRHZ0p8u9fRID0hJUZQpkKLzro8LfmC8sx23bY6tVNxv8pQA==} - engines: {node: '>= 18.0.0'} - find-up@5.0.0: resolution: {integrity: sha512-78/PXT1wlLLDgTzDs7sjq9hzz0vXD+zn+7wypEe4fXQxCmdmqfGsEPQxmiCSQI3ajFV91bVSsvNtrJRiW6nGng==} engines: {node: '>=10'} @@ -3499,14 +3196,6 @@ packages: resolution: {integrity: sha512-8RipRLol37bNs2bhoV67fiTEvdTrbMUYcFTiy3+wuuOnUog2QBHCZWXDRijWQfAkhBj2Uf5UnVaiWwA5vdd82w==} engines: {node: '>= 6'} - forwarded@0.2.0: - resolution: {integrity: sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow==} - engines: {node: '>= 0.6'} - - fresh@2.0.0: - resolution: {integrity: sha512-Rx/WycZ60HOaqLKAi6cHRKKI7zxWbJ31MhntmtwMoaTeF7XFH9hhBp8vITaMidfljRQ6eYWCKkaTK+ykVJHP2A==} - engines: {node: '>= 0.8'} - fs-extra@11.3.3: resolution: {integrity: sha512-VWSRii4t0AFm6ixFFmLLx1t7wS1gh+ckoa84aOeapGum0h+EZd1EhEumSB+ZdDLnEPuucsVB9oB7cxJHap6Afg==} engines: {node: '>=14.14'} @@ -3587,10 +3276,6 @@ packages: hast-util-whitespace@3.0.0: resolution: {integrity: sha512-88JUN06ipLwsnv+dVn+OIYOvAuvBMy/Qoi6O7mQHxdPXpjy+Cd6xRkWwux7DKO+4sYILtLBRIKgsdpS2gQc7qw==} - hono@4.11.4: - resolution: {integrity: sha512-U7tt8JsyrxSRKspfhtLET79pU8K+tInj5QZXs1jSugO1Vq5dFj3kmZsRldo29mTBfcjDRVRXrEZ6LS63Cog9ZA==} - engines: {node: '>=16.9.0'} - hookable@5.5.3: resolution: {integrity: sha512-Yc+BQe8SvoXH1643Qez1zqLRmbA5rCL+sSmk6TVos0LWVfNIB7PGncdlId77WzLGSIB5KaWgTaNTs2lNVEI6VQ==} @@ -3598,19 +3283,9 @@ packages: resolution: {integrity: sha512-CV9TW3Y3f8/wT0BRFc1/KAVQ3TUHiXmaAb6VW9vtiMFf7SLoMd1PdAc4W3KFOFETBJUb90KatHqlsZMWV+R9Gg==} engines: {node: ^20.19.0 || ^22.12.0 || >=24.0.0} - html-escaper@3.0.3: - resolution: {integrity: sha512-RuMffC89BOWQoY0WKGpIhn5gX3iI54O6nRA0yC124NYVtzjmFWBIiFd8M0x+ZdX0P9R4lADg1mgP8C7PxGOWuQ==} - html-void-elements@3.0.0: resolution: {integrity: sha512-bEqo66MRXsUGxWHV5IP0PUiAWwoEjba4VCzg0LjFJBpchPaTfyfCKTG6bc5F8ucKec3q5y6qOdGyYTSBEvhCrg==} - htmlparser2@10.1.0: - resolution: {integrity: sha512-VTZkM9GWRAtEpveh7MSF6SjjrpNVNNVJfFup7xTY3UpFtm67foy9HDVXneLtFVt4pMz5kZtgNcvCniNFb1hlEQ==} - - http-errors@2.0.1: - resolution: {integrity: sha512-4FbRdAX+bSdmo4AUFuS0WNiPz8NgFt+r8ThgNWmlrjQjt1Q7ZR9+zTlce2859x4KSXrwIsaeTqDoKQmtP8pLmQ==} - engines: {node: '>= 0.8'} - http-proxy-agent@7.0.2: resolution: {integrity: sha512-T1gkAiYYDWYx3V5Bmyu7HcfcvL7mUrTWiM6yOfa3PIphViJ/gFPbvidQ+veqSOHci/PxBcDabeUNCzpOODJZig==} engines: {node: '>= 14'} @@ -3655,9 +3330,6 @@ packages: resolution: {integrity: sha512-JmXMZ6wuvDmLiHEml9ykzqO6lwFbof0GG4IkcGaENdCRDDmMVnny7s5HsIgHCbaq0w2MyPhDqkhTUgS2LU2PHA==} engines: {node: '>=0.8.19'} - inherits@2.0.4: - resolution: {integrity: sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==} - inquirer@12.11.1: resolution: {integrity: sha512-9VF7mrY+3OmsAfjH3yKz/pLbJ5z22E23hENKw3/LNSaA/sAt3v49bDRY+Ygct1xwuKT+U+cBfTzjCPySna69Qw==} engines: 
{node: '>=18'} @@ -3674,10 +3346,6 @@ packages: resolution: {integrity: sha512-5Hh7Y1wQbvY5ooGgPbDaL5iYLAPzMTUrjMulskHLH6wnv/A+1q5rgEaiuqEjB+oxGXIVZs1FF+R/KPN3ZSQYYg==} engines: {node: '>=12'} - ipaddr.js@1.9.1: - resolution: {integrity: sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==} - engines: {node: '>= 0.10'} - is-core-module@2.16.1: resolution: {integrity: sha512-UfoeMA6fIJ8wTYFEUjelnaGI67v6+N7qXJEvQuIGa99l4xsCruSYOVSQ0uPANn4dAzm8lkYPaKLrrijLq7x23w==} engines: {node: '>= 0.4'} @@ -3715,9 +3383,6 @@ packages: is-potential-custom-element-name@1.0.1: resolution: {integrity: sha512-bCYeRA2rVibKZd+s2625gGnGF/t7DSqDs4dP7CrLA1m7jKWz6pps0LpYLJN8Q64HtmPKJ1hrN3nzPNKFEKOUiQ==} - is-promise@4.0.0: - resolution: {integrity: sha512-hvpoI6korhJMnej285dSg6nu1+e6uxs7zG3BYAm5byqDsgJNWwxzM6z6iZiAgQR4TJ30JmBTOwqZUw3WlyH3AQ==} - is-unicode-supported@1.3.0: resolution: {integrity: sha512-43r2mRvz+8JRIKnWJ+3j8JtjRKZ6GmjzfaE/qiBJnikNnYv/6bagRJ1kUhNk8R5EX/GkobD+r+sfxCPJsiKBLQ==} engines: {node: '>=12'} @@ -3748,9 +3413,6 @@ packages: jju@1.4.0: resolution: {integrity: sha512-8wb9Yw966OSxApiCt0K3yNJL8pnNeIv+OEq2YMidz4FKP6nonSRoOXc80iXY4JaN2FC11B9qsNmDsm+ZOfMROA==} - jose@6.1.3: - resolution: {integrity: sha512-0TpaTfihd4QMNwrz/ob2Bp7X04yuxJkjRGi4aKmOqwhov54i6u79oCv7T+C7lo70MKH6BesI3vscD1yb/yzKXQ==} - joycon@3.1.1: resolution: {integrity: sha512-34wB/Y7MW7bzjKRjUKTa46I2Z7eV62Rkhva+KkopW7Qvv/OSWBqvkSY7vusOPrNuZcUG3tApvdVgNB8POj3SPw==} engines: {node: '>=10'} @@ -3789,12 +3451,6 @@ packages: json-schema-traverse@1.0.0: resolution: {integrity: sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug==} - json-schema-typed@8.0.2: - resolution: {integrity: sha512-fQhoXdcvc3V28x7C7BMs4P5+kNlgUURe2jmUT1T//oBRMDrqy1QPelJimwZGo7Hg9VPV3EQV5Bnq4hbFy2vetA==} - - json-schema@0.4.0: - resolution: {integrity: sha512-es94M3nTIfsEPisRafak+HDLfHXnKBhV3vU5eqPcS3flIWqcxJWgXHXiey3YrpaNsanY5ei1VoYEbOzijuq9BA==} - json-stable-stringify-without-jsonify@1.0.1: resolution: {integrity: sha512-Bdboy+l7tA3OGW6FjyFHWkP5LuByj1Tk33Ljyq0axyzdk9//JSi2u3fP1QSmd1KNwq6VOKYGlAu87CisVir6Pw==} @@ -3917,15 +3573,6 @@ packages: lines-and-columns@1.2.4: resolution: {integrity: sha512-7ylylesZQ/PV29jhEDl3Ufjo6ZX7gCqJr5F7PKrqc93v7fzSymt1BpwEU8nAUXs8qzzvqhbjhK5QZg6Mt/HkBg==} - linkedom@0.18.12: - resolution: {integrity: sha512-jalJsOwIKuQJSeTvsgzPe9iJzyfVaEJiEXl+25EkKevsULHvMJzpNqwvj1jOESWdmgKDiXObyjOYwlUqG7wo1Q==} - engines: {node: '>=16'} - peerDependencies: - canvas: '>= 2' - peerDependenciesMeta: - canvas: - optional: true - linkify-it@5.0.0: resolution: {integrity: sha512-5aHCbzQRADcdP+ATqnDuhhJ/MRIqDkZX5pyjFHRRysS8vZ5AbqGEoFIb6pYHPZ+L/OC2Lc+xT8uHVVR5CAK/wQ==} @@ -4052,10 +3699,6 @@ packages: mdurl@2.0.0: resolution: {integrity: sha512-Lf+9+2r+Tdp5wXDXC4PcIBjTDtq4UKjCPMQhKIuzpJNW0b96kVqSwW0bT7FhRSfmAiFYgP+SCRvdrDozfh0U5w==} - media-typer@1.1.0: - resolution: {integrity: sha512-aisnrDP4GNe06UcKFnV5bfMNPBUw4jsLGaWwWfnH3v02GnBuXX2MCVn5RbrWo0j3pczUilYblq7fQ7Nw2t5XKw==} - engines: {node: '>= 0.8'} - memoirist@0.4.0: resolution: {integrity: sha512-zxTgA0mSYELa66DimuNQDvyLq36AwDlTuVRbnQtB+VuTcKWm5Qc4z3WkSpgsFWHNhexqkIooqpv4hdcqrX5Nmg==} @@ -4063,10 +3706,6 @@ packages: resolution: {integrity: sha512-S3UwM3yj5mtUSEfP41UZmt/0SCoVYUcU1rkXv+BQ5Ig8ndL4sPoJNBUJERafdPb5jjHJGuMgytgKvKIf58XNBw==} engines: {node: '>= 0.10.0'} - merge-descriptors@2.0.0: - resolution: {integrity: sha512-Snk314V5ayFLhp3fkUREub6WtjBfPdCPY1Ln8/8munuLuiYhsABgBVWsozAG+MWMbVEvcdcpbi9R7ww22l9Q3g==} - engines: 
{node: '>=18'} - mermaid@11.12.3: resolution: {integrity: sha512-wN5ZSgJQIC+CHJut9xaKWsknLxaFBwCPwPkGTSUYrTiHORWvpT8RxGk849HPnpUAQ+/9BPRqYb80jTpearrHzQ==} @@ -4089,18 +3728,10 @@ packages: resolution: {integrity: sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==} engines: {node: '>= 0.6'} - mime-db@1.54.0: - resolution: {integrity: sha512-aU5EJuIN2WDemCcAp2vFBfp/m4EAhWJnUNSSw0ixs7/kXbd6Pg64EmwJkNdFhB8aWt1sH2CTXrLxo/iAGV3oPQ==} - engines: {node: '>= 0.6'} - mime-types@2.1.35: resolution: {integrity: sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==} engines: {node: '>= 0.6'} - mime-types@3.0.2: - resolution: {integrity: sha512-Lbgzdk0h4juoQ9fCKXW4by0UJqj+nOOrI9MJ1sSj4nI8aI2eo1qmvQEie4VD1glsS250n15LsWsYtCugiStS5A==} - engines: {node: '>=18'} - mimic-function@5.0.1: resolution: {integrity: sha512-VP79XUPxV2CigYP3jWwAUFSku2aKqBH7uTAapFWCBqutsbmDo96KY5o8uh6U+/YSIn5OxJnXp73beVkpqMIGhA==} engines: {node: '>=18'} @@ -4161,10 +3792,6 @@ packages: natural-compare@1.4.0: resolution: {integrity: sha512-OWND8ei3VtNC9h7V60qff3SVobHr996CTwgxubgyQYEpg290h9J0buyECNNJexkFm5sOajh5G116RYA1c8ZMSw==} - negotiator@1.0.0: - resolution: {integrity: sha512-8Ofs/AUQh8MaEcrlq5xOX0CQ9ypTF5dl78mjlMNfOK08fzpgTHQRQPBxcPlEtIw0yRpws+Zo/3r+5WRby7u3Gg==} - engines: {node: '>= 0.6'} - node-fetch-native@1.6.7: resolution: {integrity: sha512-g9yhqoedzIUm0nTnTqAQvueMPVOuIY16bqgAJJC8XOOubYFNwz6IER9qs0Gq2Xd0+CecCKFjtdDTMA4u4xG06Q==} @@ -4192,23 +3819,12 @@ packages: resolution: {integrity: sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg==} engines: {node: '>=0.10.0'} - object-inspect@1.13.4: - resolution: {integrity: sha512-W67iLl4J2EXEGTbfeHCffrjDfitvLANg0UlX3wFUUSTx92KXRFegMHUVgSqE+wvhAbi4WqjGg9czysTV2Epbew==} - engines: {node: '>= 0.4'} - obug@2.1.1: resolution: {integrity: sha512-uTqF9MuPraAQ+IsnPf366RG4cP9RtUi7MLO1N3KEc+wb0a6yKpeL0lmk2IB1jY5KHPAlTc6T/JRdC/YqxHNwkQ==} ohash@2.0.11: resolution: {integrity: sha512-RdR9FQrFwNBNXAr4GixM8YaRZRJ5PUWbKYbE5eOsrwAjJW0q2REGcf79oYPsLyskQCZG1PLN+S/K1V00joZAoQ==} - on-finished@2.4.1: - resolution: {integrity: sha512-oVlzkg3ENAhCk2zdv7IJwd/QUD4z2RxRwpkcGY8psCVcCYZNq4wYnVWALHM+brtuJjePWiYF/ClmuDr8Ch5+kg==} - engines: {node: '>= 0.8'} - - once@1.4.0: - resolution: {integrity: sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w==} - onetime@7.0.0: resolution: {integrity: sha512-VXJjc87FScF88uafS3JllDgvAm+c/Slfz06lorj2uAY34rlUu0Nt+v8wreiImcrgAjjIHp1rXpTDlLOGw29WwQ==} engines: {node: '>=18'} @@ -4259,10 +3875,6 @@ packages: parse5@8.0.0: resolution: {integrity: sha512-9m4m5GSgXjL4AjumKzq1Fgfp3Z8rsvjRNbnkVwfu2ImRqE5D0LnY2QfDen18FSY9C573YU5XxSapdHZTZ2WolA==} - parseurl@1.3.3: - resolution: {integrity: sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==} - engines: {node: '>= 0.8'} - path-browserify@1.0.1: resolution: {integrity: sha512-b7uo2UCUOYZcnF/3ID0lulOJi/bafxa1xPe7ZPsammBSpjSWQkjNxlt635YGS2MiR9GjvuXCtz2emr3jbsz98g==} @@ -4280,9 +3892,6 @@ packages: path-parse@1.0.7: resolution: {integrity: sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==} - path-to-regexp@8.3.0: - resolution: {integrity: sha512-7jdwVIRtsP8MYpdXSwOS0YdD0Du+qOoF/AEPIt88PcCFrZCzx41oxku1jD88hZBwbNUIEfpqvuhjFaMAqMTWnA==} - pathe@2.0.3: resolution: {integrity: sha512-WUjGcAqP1gQacoQe+OBJsFA7Ld4DyXuUIjZ5cc75cLHvJ7dtNsTugphxIADwspS+AraAUePCKrSVtPLFj/F88w==} @@ 
-4331,10 +3940,6 @@ packages: resolution: {integrity: sha512-TfySrs/5nm8fQJDcBDuUng3VOUKsd7S+zqvbOTiGXHfxX4wK31ard+hoNuvkicM/2YFzlpDgABOevKSsB4G/FA==} engines: {node: '>= 6'} - pkce-challenge@5.0.1: - resolution: {integrity: sha512-wQ0b/W4Fr01qtpHlqSqspcj3EhBvimsdh0KlHhH8HRZnMsEa0ea2fTULOXOS9ccQr3om+GcGRk4e+isrZWV8qQ==} - engines: {node: '>=16.20.0'} - pkg-types@1.3.1: resolution: {integrity: sha512-/Jm5M4RvtBFVkKWRu2BLUTNP8/M2a+UwuAX+ae4770q1qVGtfjG+WTCupoZixokjmHiry8uI+dlY8KXYV5HVVQ==} @@ -4397,10 +4002,6 @@ packages: property-information@7.1.0: resolution: {integrity: sha512-TwEZ+X+yCJmYfL7TPUOcvBZ4QfoT5YenQiJuX//0th53DE6w0xxLEtfK3iyryQFddXuvkIk51EEgrJQ0WJkOmQ==} - proxy-addr@2.0.7: - resolution: {integrity: sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg==} - engines: {node: '>= 0.10'} - proxy-from-env@1.1.0: resolution: {integrity: sha512-D+zkORCbA9f1tdWRK0RaCR3GPv50cMxcrz4X8k5LTSUD1Dkw47mKJEZQNunItRTkWwgtaUSo1RVFRIG9ZXiFYg==} @@ -4412,21 +4013,9 @@ packages: resolution: {integrity: sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg==} engines: {node: '>=6'} - qs@6.14.1: - resolution: {integrity: sha512-4EK3+xJl8Ts67nLYNwqw/dsFVnCf+qR7RgXSK9jEEm9unao3njwMDdmsdvoKBKHzxd7tCYz5e5M+SnMjdtXGQQ==} - engines: {node: '>=0.6'} - quansync@0.2.11: resolution: {integrity: sha512-AifT7QEbW9Nri4tAwR5M/uzpBuqfZf+zwaEM/QkzEjj7NBuFD2rBuy0K3dE+8wltbezDV7JMA0WfnCPYRSYbXA==} - range-parser@1.2.1: - resolution: {integrity: sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==} - engines: {node: '>= 0.6'} - - raw-body@3.0.2: - resolution: {integrity: sha512-K5zQjDllxWkf7Z5xJdV0/B0WTNqx6vxG70zJE4N0kBs4LovmEYWJzQGxC9bS9RAKu3bgM40lrd5zoLJ12MQ5BA==} - engines: {node: '>= 0.10'} - rc9@2.1.2: resolution: {integrity: sha512-btXCnMmRIBINM2LDZoEmOogIZU7Qe7zn4BpomSKZ/ykbLObuBdvG+mFq11DL6fjH1DRwHhrlgtYWG96bJiC7Cg==} @@ -4494,10 +4083,6 @@ packages: roughjs@4.6.6: resolution: {integrity: sha512-ZUz/69+SYpFN/g/lUlo2FXcIjRkSu3nDarreVdGGndHEBJ6cXPdKguS8JGxwj5HA5xIbVKSmLgr5b3AWxtRfvQ==} - router@2.2.0: - resolution: {integrity: sha512-nLTrUKm2UyiL7rlhapu/Zl45FwNgkZGaCpZbIHajDYgwlJCOzLSk+cIPAnsEqV955GjILJnKbdQC1nVPz+gAYQ==} - engines: {node: '>= 18'} - run-applescript@7.1.0: resolution: {integrity: sha512-DPe5pVFaAsinSaV6QjQ6gdiedWDcRCbUuiQfQa2wmWV7+xC9bGulGI8+TdRmoFkAPaBXk8CrAbnlY2ISniJ47Q==} engines: {node: '>=18'} @@ -4536,17 +4121,6 @@ packages: engines: {node: '>=10'} hasBin: true - send@1.2.1: - resolution: {integrity: sha512-1gnZf7DFcoIcajTjTwjwuDjzuz4PPcY2StKPlsGAQ1+YH20IRVrBaXSWmdjowTJ6u8Rc01PoYOGHXfP1mYcZNQ==} - engines: {node: '>= 18'} - - serve-static@2.2.1: - resolution: {integrity: sha512-xRXBn0pPqQTVQiC8wyQrKs2MOlX24zQ0POGaj0kultvoOCstBQM5yvOhAVSUwOMjQtTvsPWoNCHfPGwaaQJhTw==} - engines: {node: '>= 18'} - - setprototypeof@1.2.0: - resolution: {integrity: sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw==} - shebang-command@2.0.0: resolution: {integrity: sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==} engines: {node: '>=8'} @@ -4565,22 +4139,6 @@ packages: shiki@3.23.0: resolution: {integrity: sha512-55Dj73uq9ZXL5zyeRPzHQsK7Nbyt6Y10k5s7OjuFZGMhpp4r/rsLBH0o/0fstIzX1Lep9VxefWljK/SKCzygIA==} - side-channel-list@1.0.0: - resolution: {integrity: sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA==} - engines: {node: '>= 0.4'} - - 
side-channel-map@1.0.1: - resolution: {integrity: sha512-VCjCNfgMsby3tTdo02nbjtM/ewra6jPHmpThenkTYh8pG9ucZ/1P8So4u4FGBek/BjpOVsDCMoLA/iuBKIFXRA==} - engines: {node: '>= 0.4'} - - side-channel-weakmap@1.0.2: - resolution: {integrity: sha512-WPS/HvHQTYnHisLo9McqBHOJk2FkHO/tlpvldyrnem4aeQp4hai3gythswg6p01oSoTl58rcpiFAjF2br2Ak2A==} - engines: {node: '>= 0.4'} - - side-channel@1.1.0: - resolution: {integrity: sha512-ZX99e6tRweoUXqR+VBrslhda51Nh5MTQwou5tnUDgbtyM0dBgmhEDtWGP/xbKn6hqfPRHujUNwz5fy/wbbhnpw==} - engines: {node: '>= 0.4'} - siginfo@2.0.0: resolution: {integrity: sha512-ybx0WO1/8bSBLEWXZvEd7gMW3Sn3JFlW3TvX1nREbDLRNQNaeNN8WK0meBwPdAaOI7TtRRRJn/Es1zhrrCHu7g==} @@ -4624,10 +4182,6 @@ packages: stackback@0.0.2: resolution: {integrity: sha512-1XMJE5fQo1jGH6Y/7ebnwPOBEkIEnT4QF32d5R1+VXdXveM0IBMJt8zfaxX1P3QhVwrYe+576+jkANtSS2mBbw==} - statuses@2.0.2: - resolution: {integrity: sha512-DvEy55V3DB7uknRo+4iOGT5fP1slR8wQohVdknigZPMpMstaKJQWhwiYBACJE3Ul2pTnATihhBYnRhZQHGBiRw==} - engines: {node: '>= 0.8'} - std-env@3.10.0: resolution: {integrity: sha512-5GS12FdOZNliM5mAOxFRg7Ir0pWz8MdpYm6AY6VPkGpbA7ZzmbzNcBJQ0GPvvyWgcY7QAhCgf9Uy89I03faLkg==} @@ -4756,10 +4310,6 @@ packages: resolution: {integrity: sha512-1r6vQTTt1rUiJkI5vX7KG8PR342Ru/5Oh13kEQP2SMbRSZpOey9SrBe27IDxkoWulx8ShWu4K6C0BkctP8Z1bQ==} hasBin: true - toidentifier@1.0.1: - resolution: {integrity: sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA==} - engines: {node: '>=0.6'} - token-types@6.1.2: resolution: {integrity: sha512-dRXchy+C0IgK8WPC6xvCHFRIWYUbqqdEIKPaKo/AcTUNzwLTK6AH7RjdLWsEZcAN/TBdtfUw3PYEgPr5VPr6ww==} engines: {node: '>=14.16'} @@ -4829,9 +4379,6 @@ packages: engines: {node: '>=18.0.0'} hasBin: true - turndown@7.2.2: - resolution: {integrity: sha512-1F7db8BiExOKxjSMU2b7if62D/XOyQyZbPKq/nUwopfgnHlqXHqQ0lvfUTeUIr1lZJzOPFn43dODyMSIfvWRKQ==} - tw-animate-css@1.4.0: resolution: {integrity: sha512-7bziOlRqH0hJx80h/3mbicLW7o8qLsH5+RaLR2t+OHM3D0JlWGODQKQ4cxbK7WlvmUxpcj6Kgu6EKqjrGFe3QQ==} @@ -4843,10 +4390,6 @@ packages: resolution: {integrity: sha512-TeTSQ6H5YHvpqVwBRcnLDCBnDOHWYu7IvGbHT6N8AOymcr9PJGjc1GTtiWZTYg0NCgYwvnYWEkVChQAr9bjfwA==} engines: {node: '>=16'} - type-is@2.0.1: - resolution: {integrity: sha512-OZs6gsjF4vMp32qrCbiVSkrFmXtG/AZhY3t0iAMrMBiAZyV9oALtXO8hsrHbMXF9x6L3grlFuwW2oAz7cav+Gw==} - engines: {node: '>= 0.6'} - typescript-eslint@8.52.0: resolution: {integrity: sha512-atlQQJ2YkO4pfTVQmQ+wvYQwexPDOIgo+RaVcD7gHgzy/IQA+XTyuxNM9M9TVXvttkF7koBHmcwisKdOAf2EcA==} engines: {node: ^18.18.0 || ^20.9.0 || >=21.1.0} @@ -4870,9 +4413,6 @@ packages: ufo@1.6.2: resolution: {integrity: sha512-heMioaxBcG9+Znsda5Q8sQbWnLJSl98AFDXTO80wELWEzX3hordXsTdxrIfMQoO9IY1MEnoGoPjpoKpMj+Yx0Q==} - uhyphen@0.2.0: - resolution: {integrity: sha512-qz3o9CHXmJJPGBdqzab7qAYuW8kQGKNEuoHFYrBwV6hWIMcpAmxDLXojcHfFr9US1Pe6zUswEIJIbLI610fuqA==} - uint8array-extras@1.5.0: resolution: {integrity: sha512-rvKSBiC5zqCCiDZ9kAOszZcDvdAHwwIKJG33Ykj43OKcWsnmcBRL09YTU4nOeHZ8Y2a7l1MgTd08SBe9A8Qj6A==} engines: {node: '>=18'} @@ -4902,10 +4442,6 @@ packages: resolution: {integrity: sha512-gptHNQghINnc/vTGIk0SOFGFNXw7JVrlRUtConJRlvaw6DuX0wO5Jeko9sWrMBhh+PsYAZ7oXAiOnf/UKogyiw==} engines: {node: '>= 10.0.0'} - unpipe@1.0.0: - resolution: {integrity: sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ==} - engines: {node: '>= 0.8'} - unplugin-dts@1.0.0-beta.6: resolution: {integrity: sha512-+xbFv5aVFtLZFNBAKI4+kXmd2h+T42/AaP8Bsp0YP/je/uOTN94Ame2Xt3e9isZS+Z7/hrLCLbsVJh+saqFMfQ==} peerDependencies: 
@@ -4960,10 +4496,6 @@ packages: resolution: {integrity: sha512-0/A9rDy9P7cJ+8w1c9WD9V//9Wj15Ce2MPz8Ri6032usz+NfePxx5AcN3bN+r6ZL6jEo066/yNYB3tn4pQEx+A==} hasBin: true - vary@1.1.2: - resolution: {integrity: sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg==} - engines: {node: '>= 0.8'} - vee-validate@4.15.1: resolution: {integrity: sha512-DkFsiTwEKau8VIxyZBGdO6tOudD+QoUBPuHj3e6QFqmbfCRj1ArmYWue9lEp6jLSWBIw4XPlDLjFIZNLdRAMSg==} peerDependencies: @@ -5251,9 +4783,6 @@ packages: resolution: {integrity: sha512-r6lPcBGxZXlIcymEu7InxDMhdW0KDxpLgoFLcguasxCaJ/SOIZwINatK9KY/tf+ZrlywOKU0UDj3ATXUBfxJXA==} engines: {node: '>=8'} - wrappy@1.0.2: - resolution: {integrity: sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ==} - ws@8.19.0: resolution: {integrity: sha512-blAT2mjOEIi0ZzruJfIhb3nps74PRWTCz1IjglWEEpQl5XS/UNama6u2/rjFkDDouqr4L67ry+1aGIALViWjDg==} engines: {node: '>=10.0.0'} @@ -5304,11 +4833,6 @@ packages: resolution: {integrity: sha512-U/PBtDf35ff0D8X8D0jfdzHYEPFxAI7jJlxZXwCSez5M3190m+QobIfh+sWDWSHMCWWJN2AWamkegn6vr6YBTw==} engines: {node: '>=18'} - zod-to-json-schema@3.25.1: - resolution: {integrity: sha512-pM/SU9d3YAggzi6MtR4h7ruuQlqKtad8e9S0fmxcMi+ueAK5Korys/aWcV9LIIHTVbj01NdzxcnXSN+O74ZIVA==} - peerDependencies: - zod: ^3.25 || ^4 - zod@3.25.76: resolution: {integrity: sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ==} @@ -5326,73 +4850,6 @@ snapshots: '@acemir/cssom@0.9.31': optional: true - '@ai-sdk/anthropic@3.0.9(zod@4.3.6)': - dependencies: - '@ai-sdk/provider': 3.0.2 - '@ai-sdk/provider-utils': 4.0.4(zod@4.3.6) - zod: 4.3.6 - - '@ai-sdk/gateway@3.0.10(zod@4.3.6)': - dependencies: - '@ai-sdk/provider': 3.0.2 - '@ai-sdk/provider-utils': 4.0.4(zod@4.3.6) - '@vercel/oidc': 3.1.0 - zod: 4.3.6 - - '@ai-sdk/google@3.0.6(zod@4.3.6)': - dependencies: - '@ai-sdk/provider': 3.0.2 - '@ai-sdk/provider-utils': 4.0.4(zod@4.3.6) - zod: 4.3.6 - - '@ai-sdk/mcp@1.0.6(zod@4.3.6)': - dependencies: - '@ai-sdk/provider': 3.0.2 - '@ai-sdk/provider-utils': 4.0.5(zod@4.3.6) - pkce-challenge: 5.0.1 - zod: 4.3.6 - - '@ai-sdk/openai-compatible@2.0.33(zod@4.3.6)': - dependencies: - '@ai-sdk/provider': 3.0.8 - '@ai-sdk/provider-utils': 4.0.17(zod@4.3.6) - zod: 4.3.6 - - '@ai-sdk/openai@3.0.39(zod@4.3.6)': - dependencies: - '@ai-sdk/provider': 3.0.8 - '@ai-sdk/provider-utils': 4.0.17(zod@4.3.6) - zod: 4.3.6 - - '@ai-sdk/provider-utils@4.0.17(zod@4.3.6)': - dependencies: - '@ai-sdk/provider': 3.0.8 - '@standard-schema/spec': 1.1.0 - eventsource-parser: 3.0.6 - zod: 4.3.6 - - '@ai-sdk/provider-utils@4.0.4(zod@4.3.6)': - dependencies: - '@ai-sdk/provider': 3.0.2 - '@standard-schema/spec': 1.1.0 - eventsource-parser: 3.0.6 - zod: 4.3.6 - - '@ai-sdk/provider-utils@4.0.5(zod@4.3.6)': - dependencies: - '@ai-sdk/provider': 3.0.2 - '@standard-schema/spec': 1.1.0 - eventsource-parser: 3.0.6 - zod: 4.3.6 - - '@ai-sdk/provider@3.0.2': - dependencies: - json-schema: 0.4.0 - - '@ai-sdk/provider@3.0.8': - dependencies: - json-schema: 0.4.0 - '@algolia/abtesting@1.12.2': dependencies: '@algolia/client-common': 5.46.2 @@ -5726,9 +5183,6 @@ snapshots: '@braintree/sanitize-url@7.1.2': {} - '@cfworker/json-schema@4.1.1': - optional: true - '@chevrotain/cst-dts-gen@11.1.2': dependencies: '@chevrotain/gast': 11.1.2 @@ -5800,10 +5254,6 @@ snapshots: '@drizzle-team/brocli@0.10.2': {} - 
'@elysiajs/bearer@1.4.2(elysia@1.4.27(@sinclair/typebox@0.34.47)(@types/bun@1.3.10)(exact-mirror@0.2.6(@sinclair/typebox@0.34.47))(file-type@21.3.0)(openapi-types@12.1.3)(typescript@5.9.3))': - dependencies: - elysia: 1.4.27(@sinclair/typebox@0.34.47)(@types/bun@1.3.10)(exact-mirror@0.2.6(@sinclair/typebox@0.34.47))(file-type@21.3.0)(openapi-types@12.1.3)(typescript@5.9.3) - '@elysiajs/cors@1.4.1(elysia@1.4.27(@sinclair/typebox@0.34.47)(@types/bun@1.3.10)(exact-mirror@0.2.6(@sinclair/typebox@0.34.47))(file-type@21.3.0)(openapi-types@12.1.3)(typescript@5.9.3))': dependencies: elysia: 1.4.27(@sinclair/typebox@0.34.47)(@types/bun@1.3.10)(exact-mirror@0.2.6(@sinclair/typebox@0.34.47))(file-type@21.3.0)(openapi-types@12.1.3)(typescript@5.9.3) @@ -6248,10 +5698,6 @@ snapshots: dependencies: typescript: 5.9.3 - '@hono/node-server@1.19.9(hono@4.11.4)': - dependencies: - hono: 4.11.4 - '@humanfs/core@0.19.1': {} '@humanfs/node@0.16.7': @@ -6516,35 +5962,8 @@ snapshots: '@microsoft/tsdoc@0.16.0': {} - '@mixmark-io/domino@2.2.0': {} - - '@modelcontextprotocol/sdk@1.25.2(@cfworker/json-schema@4.1.1)(hono@4.11.4)(zod@4.3.6)': - dependencies: - '@hono/node-server': 1.19.9(hono@4.11.4) - ajv: 8.17.1 - ajv-formats: 3.0.1(ajv@8.17.1) - content-type: 1.0.5 - cors: 2.8.5 - cross-spawn: 7.0.6 - eventsource: 3.0.7 - eventsource-parser: 3.0.6 - express: 5.2.1 - express-rate-limit: 7.5.1(express@5.2.1) - jose: 6.1.3 - json-schema-typed: 8.0.2 - pkce-challenge: 5.0.1 - raw-body: 3.0.2 - zod: 4.3.6 - zod-to-json-schema: 3.25.1(zod@4.3.6) - optionalDependencies: - '@cfworker/json-schema': 4.1.1 - transitivePeerDependencies: - - hono - - supports-color - - '@mozilla/readability@0.6.0': {} - - '@opentelemetry/api@1.9.0': {} + '@opentelemetry/api@1.9.0': + optional: true '@pinia/colada@0.21.2(pinia@3.0.4(typescript@5.9.3)(vue@3.5.26(typescript@5.9.3)))(vue@3.5.26(typescript@5.9.3))': dependencies: @@ -7080,8 +6499,6 @@ snapshots: '@types/trusted-types@2.0.7': optional: true - '@types/turndown@5.0.6': {} - '@types/unist@3.0.3': {} '@types/web-bluetooth@0.0.21': {} @@ -7195,8 +6612,6 @@ snapshots: transitivePeerDependencies: - vue - '@vercel/oidc@3.1.0': {} - '@vitejs/plugin-vue@5.2.4(vite@5.4.21(@types/node@25.0.3)(lightningcss@1.30.2))(vue@3.5.26(typescript@5.9.3))': dependencies: vite: 5.4.21(@types/node@25.0.3)(lightningcss@1.30.2) @@ -7450,11 +6865,6 @@ snapshots: '@xterm/xterm@6.0.0': {} - accepts@2.0.0: - dependencies: - mime-types: 3.0.2 - negotiator: 1.0.0 - acorn-jsx@5.3.2(acorn@8.15.0): dependencies: acorn: 8.15.0 @@ -7464,14 +6874,6 @@ snapshots: agent-base@7.1.4: optional: true - ai@6.0.25(zod@4.3.6): - dependencies: - '@ai-sdk/gateway': 3.0.10(zod@4.3.6) - '@ai-sdk/provider': 3.0.2 - '@ai-sdk/provider-utils': 4.0.4(zod@4.3.6) - '@opentelemetry/api': 1.9.0 - zod: 4.3.6 - ajv-draft-04@1.0.0(ajv@8.13.0): optionalDependencies: ajv: 8.13.0 @@ -7480,10 +6882,6 @@ snapshots: optionalDependencies: ajv: 8.13.0 - ajv-formats@3.0.1(ajv@8.17.1): - optionalDependencies: - ajv: 8.17.1 - ajv@6.12.6: dependencies: fast-deep-equal: 3.1.3 @@ -7505,13 +6903,6 @@ snapshots: require-from-string: 2.0.2 uri-js: 4.4.1 - ajv@8.17.1: - dependencies: - fast-deep-equal: 3.1.3 - fast-uri: 3.1.0 - json-schema-traverse: 1.0.0 - require-from-string: 2.0.2 - algoliasearch@5.46.2: dependencies: '@algolia/abtesting': 1.12.2 @@ -7588,20 +6979,6 @@ snapshots: birpc@2.9.0: {} - body-parser@2.2.2: - dependencies: - bytes: 3.1.2 - content-type: 1.0.5 - debug: 4.4.3 - http-errors: 2.0.1 - iconv-lite: 0.7.2 - on-finished: 2.4.1 - qs: 6.14.1 - 
raw-body: 3.0.2 - type-is: 2.0.1 - transitivePeerDependencies: - - supports-color - boolbase@1.0.0: {} brace-expansion@1.1.12: @@ -7656,8 +7033,6 @@ snapshots: esbuild: 0.27.2 load-tsconfig: 0.2.5 - bytes@3.1.2: {} - c12@3.3.3: dependencies: chokidar: 5.0.0 @@ -7679,11 +7054,7 @@ snapshots: dependencies: es-errors: 1.3.0 function-bind: 1.1.2 - - call-bound@1.0.4: - dependencies: - call-bind-apply-helpers: 1.0.2 - get-intrinsic: 1.3.0 + optional: true callsites@3.1.0: {} @@ -7783,27 +7154,14 @@ snapshots: consola@3.4.2: {} - content-disposition@1.0.1: {} - - content-type@1.0.5: {} - convert-source-map@2.0.0: {} - cookie-signature@1.2.2: {} - - cookie@0.7.2: {} - cookie@1.1.1: {} copy-anything@4.0.5: dependencies: is-what: 5.5.0 - cors@2.8.5: - dependencies: - object-assign: 4.1.1 - vary: 1.1.2 - cose-base@1.0.3: dependencies: layout-base: 1.0.2 @@ -7818,26 +7176,14 @@ snapshots: shebang-command: 2.0.0 which: 2.0.2 - css-select@5.2.2: - dependencies: - boolbase: 1.0.0 - css-what: 6.2.2 - domhandler: 5.0.3 - domutils: 3.2.2 - nth-check: 2.1.1 - css-tree@3.1.0: dependencies: mdn-data: 2.12.2 source-map-js: 1.2.1 optional: true - css-what@6.2.2: {} - cssesc@3.0.0: {} - cssom@0.5.0: {} - cssstyle@5.3.7: dependencies: '@asamuzakjp/css-color': 4.1.2 @@ -8067,8 +7413,6 @@ snapshots: delayed-stream@1.0.0: optional: true - depd@2.0.0: {} - dequal@2.0.3: {} destr@2.0.5: {} @@ -8081,28 +7425,10 @@ snapshots: diff@8.0.2: {} - dom-serializer@2.0.0: - dependencies: - domelementtype: 2.3.0 - domhandler: 5.0.3 - entities: 4.5.0 - - domelementtype@2.3.0: {} - - domhandler@5.0.3: - dependencies: - domelementtype: 2.3.0 - dompurify@3.3.2: optionalDependencies: '@types/trusted-types': 2.0.7 - domutils@3.2.2: - dependencies: - dom-serializer: 2.0.0 - domelementtype: 2.3.0 - domhandler: 5.0.3 - dotenv@17.2.3: {} drizzle-kit@0.31.8: @@ -8119,14 +7445,13 @@ snapshots: call-bind-apply-helpers: 1.0.2 es-errors: 1.3.0 gopd: 1.2.0 + optional: true echarts@6.0.0: dependencies: tslib: 2.3.0 zrender: 6.0.0 - ee-first@1.1.1: {} - electron-to-chromium@1.5.267: {} elysia@1.4.27(@sinclair/typebox@0.34.47)(@types/bun@1.3.10)(exact-mirror@0.2.6(@sinclair/typebox@0.34.47))(file-type@21.3.0)(openapi-types@12.1.3)(typescript@5.9.3): @@ -8148,8 +7473,6 @@ snapshots: emoji-regex@8.0.0: {} - encodeurl@2.0.0: {} - enhanced-resolve@5.18.4: dependencies: graceful-fs: 4.2.11 @@ -8162,19 +7485,20 @@ snapshots: entities@7.0.0: {} - entities@7.0.1: {} - error-stack-parser-es@1.0.5: {} - es-define-property@1.0.1: {} + es-define-property@1.0.1: + optional: true - es-errors@1.3.0: {} + es-errors@1.3.0: + optional: true es-module-lexer@1.7.0: {} es-object-atoms@1.1.1: dependencies: es-errors: 1.3.0 + optional: true es-set-tostringtag@2.1.0: dependencies: @@ -8302,8 +7626,6 @@ snapshots: escalade@3.2.0: {} - escape-html@1.0.3: {} - escape-string-regexp@4.0.0: {} eslint-plugin-vue@10.6.2(@typescript-eslint/parser@8.52.0(eslint@9.39.2(jiti@2.6.1))(typescript@5.9.3))(eslint@9.39.2(jiti@2.6.1))(vue-eslint-parser@10.2.0(eslint@9.39.2(jiti@2.6.1))): @@ -8393,57 +7715,12 @@ snapshots: esutils@2.0.3: {} - etag@1.8.1: {} - - eventsource-parser@3.0.6: {} - - eventsource@3.0.7: - dependencies: - eventsource-parser: 3.0.6 - exact-mirror@0.2.6(@sinclair/typebox@0.34.47): optionalDependencies: '@sinclair/typebox': 0.34.47 expect-type@1.3.0: {} - express-rate-limit@7.5.1(express@5.2.1): - dependencies: - express: 5.2.1 - - express@5.2.1: - dependencies: - accepts: 2.0.0 - body-parser: 2.2.2 - content-disposition: 1.0.1 - content-type: 1.0.5 - cookie: 0.7.2 - 
cookie-signature: 1.2.2 - debug: 4.4.3 - depd: 2.0.0 - encodeurl: 2.0.0 - escape-html: 1.0.3 - etag: 1.8.1 - finalhandler: 2.1.1 - fresh: 2.0.0 - http-errors: 2.0.1 - merge-descriptors: 2.0.0 - mime-types: 3.0.2 - on-finished: 2.4.1 - once: 1.4.0 - parseurl: 1.3.3 - proxy-addr: 2.0.7 - qs: 6.14.1 - range-parser: 1.2.1 - router: 2.2.0 - send: 1.2.1 - serve-static: 2.2.1 - statuses: 2.0.2 - type-is: 2.0.1 - vary: 1.1.2 - transitivePeerDependencies: - - supports-color - exsolve@1.0.8: {} fast-decode-uri-component@1.0.1: {} @@ -8454,8 +7731,6 @@ snapshots: fast-levenshtein@2.0.6: {} - fast-uri@3.1.0: {} - fdir@6.5.0(picomatch@4.0.3): optionalDependencies: picomatch: 4.0.3 @@ -8473,17 +7748,6 @@ snapshots: transitivePeerDependencies: - supports-color - finalhandler@2.1.1: - dependencies: - debug: 4.4.3 - encodeurl: 2.0.0 - escape-html: 1.0.3 - on-finished: 2.4.1 - parseurl: 1.3.3 - statuses: 2.0.2 - transitivePeerDependencies: - - supports-color - find-up@5.0.0: dependencies: locate-path: 6.0.0 @@ -8518,10 +7782,6 @@ snapshots: mime-types: 2.1.35 optional: true - forwarded@0.2.0: {} - - fresh@2.0.0: {} - fs-extra@11.3.3: dependencies: graceful-fs: 4.2.11 @@ -8552,11 +7812,13 @@ snapshots: has-symbols: 1.1.0 hasown: 2.0.2 math-intrinsics: 1.1.0 + optional: true get-proto@1.0.1: dependencies: dunder-proto: 1.0.1 es-object-atoms: 1.1.1 + optional: true get-tsconfig@4.13.0: dependencies: @@ -8577,7 +7839,8 @@ snapshots: globals@14.0.0: {} - gopd@1.2.0: {} + gopd@1.2.0: + optional: true graceful-fs@4.2.11: {} @@ -8585,7 +7848,8 @@ snapshots: has-flag@4.0.0: {} - has-symbols@1.1.0: {} + has-symbols@1.1.0: + optional: true has-tostringtag@1.0.2: dependencies: @@ -8614,8 +7878,6 @@ snapshots: dependencies: '@types/hast': 3.0.4 - hono@4.11.4: {} - hookable@5.5.3: {} html-encoding-sniffer@6.0.0: @@ -8625,25 +7887,8 @@ snapshots: - '@noble/hashes' optional: true - html-escaper@3.0.3: {} - html-void-elements@3.0.0: {} - htmlparser2@10.1.0: - dependencies: - domelementtype: 2.3.0 - domhandler: 5.0.3 - domutils: 3.2.2 - entities: 7.0.1 - - http-errors@2.0.1: - dependencies: - depd: 2.0.0 - inherits: 2.0.4 - setprototypeof: 1.2.0 - statuses: 2.0.2 - toidentifier: 1.0.1 - http-proxy-agent@7.0.2: dependencies: agent-base: 7.1.4 @@ -8685,8 +7930,6 @@ snapshots: imurmurhash@0.1.4: {} - inherits@2.0.4: {} - inquirer@12.11.1(@types/node@22.19.5): dependencies: '@inquirer/ansi': 1.0.2 @@ -8703,8 +7946,6 @@ snapshots: internmap@2.0.3: {} - ipaddr.js@1.9.1: {} - is-core-module@2.16.1: dependencies: hasown: 2.0.2 @@ -8730,8 +7971,6 @@ snapshots: is-potential-custom-element-name@1.0.1: optional: true - is-promise@4.0.0: {} - is-unicode-supported@1.3.0: {} is-unicode-supported@2.1.0: {} @@ -8750,8 +7989,6 @@ snapshots: jju@1.4.0: {} - jose@6.1.3: {} - joycon@3.1.1: {} js-tokens@4.0.0: {} @@ -8799,10 +8036,6 @@ snapshots: json-schema-traverse@1.0.0: {} - json-schema-typed@8.0.2: {} - - json-schema@0.4.0: {} - json-stable-stringify-without-jsonify@1.0.1: {} json5@2.2.3: {} @@ -8897,14 +8130,6 @@ snapshots: lines-and-columns@1.2.4: {} - linkedom@0.18.12: - dependencies: - css-select: 5.2.2 - cssom: 0.5.0 - html-escaper: 3.0.3 - htmlparser2: 10.1.0 - uhyphen: 0.2.0 - linkify-it@5.0.0: dependencies: uc.micro: 2.1.0 @@ -8994,7 +8219,8 @@ snapshots: stream-monaco: 0.0.18(monaco-editor@0.52.2) vue-i18n: 11.2.8(vue@3.5.26(typescript@5.9.3)) - math-intrinsics@1.1.0: {} + math-intrinsics@1.1.0: + optional: true mdast-util-to-hast@13.2.1: dependencies: @@ -9013,14 +8239,10 @@ snapshots: mdurl@2.0.0: {} - media-typer@1.1.0: {} - 
memoirist@0.4.0: {} memorystream@0.3.1: {} - merge-descriptors@2.0.0: {} - mermaid@11.12.3: dependencies: '@braintree/sanitize-url': 7.1.2 @@ -9064,17 +8286,11 @@ snapshots: mime-db@1.52.0: optional: true - mime-db@1.54.0: {} - mime-types@2.1.35: dependencies: mime-db: 1.52.0 optional: true - mime-types@3.0.2: - dependencies: - mime-db: 1.54.0 - mimic-function@5.0.1: {} minimatch@10.0.3: @@ -9124,8 +8340,6 @@ snapshots: natural-compare@1.4.0: {} - negotiator@1.0.0: {} - node-fetch-native@1.6.7: {} node-releases@2.0.27: {} @@ -9155,20 +8369,10 @@ snapshots: object-assign@4.1.1: {} - object-inspect@1.13.4: {} - obug@2.1.1: {} ohash@2.0.11: {} - on-finished@2.4.1: - dependencies: - ee-first: 1.1.1 - - once@1.4.0: - dependencies: - wrappy: 1.0.2 - onetime@7.0.0: dependencies: mimic-function: 5.0.1 @@ -9245,8 +8449,6 @@ snapshots: entities: 6.0.1 optional: true - parseurl@1.3.3: {} - path-browserify@1.0.1: {} path-data-parser@0.1.0: {} @@ -9257,8 +8459,6 @@ snapshots: path-parse@1.0.7: {} - path-to-regexp@8.3.0: {} - pathe@2.0.3: {} perfect-debounce@1.0.0: {} @@ -9286,8 +8486,6 @@ snapshots: pirates@4.0.7: {} - pkce-challenge@5.0.1: {} - pkg-types@1.3.1: dependencies: confbox: 0.1.8 @@ -9343,11 +8541,6 @@ snapshots: property-information@7.1.0: {} - proxy-addr@2.0.7: - dependencies: - forwarded: 0.2.0 - ipaddr.js: 1.9.1 - proxy-from-env@1.1.0: optional: true @@ -9355,21 +8548,8 @@ snapshots: punycode@2.3.1: {} - qs@6.14.1: - dependencies: - side-channel: 1.1.0 - quansync@0.2.11: {} - range-parser@1.2.1: {} - - raw-body@3.0.2: - dependencies: - bytes: 3.1.2 - http-errors: 2.0.1 - iconv-lite: 0.7.2 - unpipe: 1.0.0 - rc9@2.1.2: dependencies: defu: 6.1.4 @@ -9469,16 +8649,6 @@ snapshots: points-on-curve: 0.2.0 points-on-path: 0.2.1 - router@2.2.0: - dependencies: - debug: 4.4.3 - depd: 2.0.0 - is-promise: 4.0.0 - parseurl: 1.3.3 - path-to-regexp: 8.3.0 - transitivePeerDependencies: - - supports-color - run-applescript@7.1.0: {} run-async@4.0.6: {} @@ -9506,33 +8676,6 @@ snapshots: semver@7.7.3: {} - send@1.2.1: - dependencies: - debug: 4.4.3 - encodeurl: 2.0.0 - escape-html: 1.0.3 - etag: 1.8.1 - fresh: 2.0.0 - http-errors: 2.0.1 - mime-types: 3.0.2 - ms: 2.1.3 - on-finished: 2.4.1 - range-parser: 1.2.1 - statuses: 2.0.2 - transitivePeerDependencies: - - supports-color - - serve-static@2.2.1: - dependencies: - encodeurl: 2.0.0 - escape-html: 1.0.3 - parseurl: 1.3.3 - send: 1.2.1 - transitivePeerDependencies: - - supports-color - - setprototypeof@1.2.0: {} - shebang-command@2.0.0: dependencies: shebang-regex: 3.0.0 @@ -9563,34 +8706,6 @@ snapshots: '@shikijs/vscode-textmate': 10.0.2 '@types/hast': 3.0.4 - side-channel-list@1.0.0: - dependencies: - es-errors: 1.3.0 - object-inspect: 1.13.4 - - side-channel-map@1.0.1: - dependencies: - call-bound: 1.0.4 - es-errors: 1.3.0 - get-intrinsic: 1.3.0 - object-inspect: 1.13.4 - - side-channel-weakmap@1.0.2: - dependencies: - call-bound: 1.0.4 - es-errors: 1.3.0 - get-intrinsic: 1.3.0 - object-inspect: 1.13.4 - side-channel-map: 1.0.1 - - side-channel@1.1.0: - dependencies: - es-errors: 1.3.0 - object-inspect: 1.13.4 - side-channel-list: 1.0.0 - side-channel-map: 1.0.1 - side-channel-weakmap: 1.0.2 - siginfo@2.0.0: {} signal-exit@4.1.0: {} @@ -9626,8 +8741,6 @@ snapshots: stackback@0.0.2: {} - statuses@2.0.2: {} - std-env@3.10.0: {} stdin-discarder@0.2.2: {} @@ -9761,8 +8874,6 @@ snapshots: tldts-core: 7.0.24 optional: true - toidentifier@1.0.1: {} - token-types@6.1.2: dependencies: '@borewit/text-codec': 0.2.1 @@ -9835,10 +8946,6 @@ snapshots: 
optionalDependencies: fsevents: 2.3.3 - turndown@7.2.2: - dependencies: - '@mixmark-io/domino': 2.2.0 - tw-animate-css@1.4.0: {} type-check@0.4.0: @@ -9847,12 +8954,6 @@ snapshots: type-fest@4.41.0: {} - type-is@2.0.1: - dependencies: - content-type: 1.0.5 - media-typer: 1.1.0 - mime-types: 3.0.2 - typescript-eslint@8.52.0(eslint@9.39.2(jiti@2.6.1))(typescript@5.9.3): dependencies: '@typescript-eslint/eslint-plugin': 8.52.0(@typescript-eslint/parser@8.52.0(eslint@9.39.2(jiti@2.6.1))(typescript@5.9.3))(eslint@9.39.2(jiti@2.6.1))(typescript@5.9.3) @@ -9872,8 +8973,6 @@ snapshots: ufo@1.6.2: {} - uhyphen@0.2.0: {} - uint8array-extras@1.5.0: {} undici-types@6.21.0: {} @@ -9905,8 +9004,6 @@ snapshots: universalify@2.0.1: {} - unpipe@1.0.0: {} - unplugin-dts@1.0.0-beta.6(@microsoft/api-extractor@7.55.2(@types/node@24.10.4))(@vue/language-core@3.2.2)(esbuild@0.27.2)(rollup@4.54.0)(typescript@5.9.3)(vite@7.3.0(@types/node@24.10.4)(jiti@2.6.1)(lightningcss@1.30.2)(tsx@4.21.0)(yaml@2.8.2)): dependencies: '@rollup/pluginutils': 5.3.0(rollup@4.54.0) @@ -9953,8 +9050,6 @@ snapshots: uuid@11.1.0: {} - vary@1.1.2: {} - vee-validate@4.15.1(vue@3.5.26(typescript@5.9.3)): dependencies: '@vue/devtools-api': 7.7.9 @@ -10266,8 +9361,6 @@ snapshots: string-width: 4.2.3 strip-ansi: 6.0.1 - wrappy@1.0.2: {} - ws@8.19.0: optional: true @@ -10298,10 +9391,6 @@ snapshots: yoctocolors-cjs@2.1.3: {} - zod-to-json-schema@3.25.1(zod@4.3.6): - dependencies: - zod: 4.3.6 - zod@3.25.76: {} zod@4.3.6: {} diff --git a/scripts/release.sh b/scripts/release.sh index 2546f650..50e93d4c 100755 --- a/scripts/release.sh +++ b/scripts/release.sh @@ -9,17 +9,8 @@ COMMIT_HASH="${COMMIT_HASH:-unknown}" BUILD_TIME="${BUILD_TIME:-$(date -u +"%Y-%m-%dT%H:%M:%SZ")}" OUTPUT_DIR="${OUTPUT_DIR:-$ROOT_DIR/dist}" PREPARE_ASSETS_ONLY="false" -UPX_COMPRESS_AGENT_BIN="${UPX_COMPRESS_AGENT_BIN:-false}" -UPX_ARGS="${UPX_ARGS:--3}" -UPX_ALLOW_DARWIN="${UPX_ALLOW_DARWIN:-false}" -AUTO_INSTALL_UPX="${AUTO_INSTALL_UPX:-}" -if [[ -z "$AUTO_INSTALL_UPX" ]]; then - AUTO_INSTALL_UPX=$([[ "${GITHUB_ACTIONS:-}" == "true" ]] && echo "true" || echo "false") -fi WEB_DIR="$ROOT_DIR/internal/embedded/web" -AGENT_DIR="$ROOT_DIR/internal/embedded/agent" -BUN_DIR="$ROOT_DIR/internal/embedded/bun" log() { echo "[release] $*" @@ -36,9 +27,6 @@ Options: --commit-hash Commit hash injected into memoh binary --output-dir Output directory for release artifacts --prepare-assets Only prepare embedded assets, do not build archive - -Compatibility options: - --bun-version Deprecated; ignored (kept for backward compatibility) EOF } @@ -53,10 +41,6 @@ parse_args() { TARGET_ARCH="$2" shift 2 ;; - --bun-version) - # Bun runtime archives are no longer embedded; keep arg for compatibility. 
- shift 2 - ;; --version) VERSION="$2" shift 2 @@ -91,23 +75,10 @@ write_keep_gitignore() { printf "*\n!.gitignore\n" > "$dir/.gitignore" } -resolve_agent_compile_target() { - case "${TARGET_OS}-${TARGET_ARCH}" in - linux-amd64) echo "bun-linux-x64|agent-bin" ;; - linux-arm64) echo "bun-linux-arm64|agent-bin" ;; - darwin-amd64) echo "bun-darwin-x64|agent-bin" ;; - darwin-arm64) echo "bun-darwin-arm64|agent-bin" ;; - windows-amd64) echo "bun-windows-x64|agent-bin.exe" ;; - *) echo "|" ;; - esac -} - prepare_embed_dirs() { - rm -rf "$WEB_DIR" "$AGENT_DIR" "$BUN_DIR" - mkdir -p "$WEB_DIR" "$AGENT_DIR" "$BUN_DIR" + rm -rf "$WEB_DIR" + mkdir -p "$WEB_DIR" write_keep_gitignore "$WEB_DIR" - write_keep_gitignore "$AGENT_DIR" - write_keep_gitignore "$BUN_DIR" } prepare_assets() { @@ -118,140 +89,7 @@ prepare_assets() { cp -R "$ROOT_DIR/apps/web/dist/." "$WEB_DIR/" gzip_embedded_web_assets "$WEB_DIR" - local target_key="${TARGET_OS}-${TARGET_ARCH}" - local resolved bun_compile_target agent_bin_name - resolved="$(resolve_agent_compile_target)" - bun_compile_target="${resolved%%|*}" - agent_bin_name="${resolved##*|}" - if [[ -z "$bun_compile_target" || -z "$agent_bin_name" ]]; then - echo "agent-bin not available for ${target_key}" > "$AGENT_DIR/UNAVAILABLE" - log "skipped agent-bin compile for unsupported target ${target_key}" - return 0 - fi - - log "building agent executable (${bun_compile_target})" - patch_jsdom_style_loader_for_compile - trap 'restore_jsdom_style_loader_patch' RETURN - ( - cd "$ROOT_DIR/apps/agent" - bun build src/index.ts --compile --target "$bun_compile_target" --outfile "$AGENT_DIR/$agent_bin_name" - ) - restore_jsdom_style_loader_patch - trap - RETURN - chmod +x "$AGENT_DIR/$agent_bin_name" || true - compress_agent_bin_if_enabled "$AGENT_DIR/$agent_bin_name" "$TARGET_OS" - - log "embedded assets prepared (${target_key})" -} - -JSDOM_STYLE_RULES_FILE="" -JSDOM_STYLE_RULES_BACKUP="" -JSDOM_XHR_IMPL_FILE="" -JSDOM_XHR_IMPL_BACKUP="" - -patch_jsdom_style_loader_for_compile() { - local css_path css_json - JSDOM_STYLE_RULES_FILE="$(node -e "try{process.stdout.write(require.resolve('jsdom/lib/jsdom/living/helpers/style-rules.js',{paths:['$ROOT_DIR/apps/agent']}))}catch{process.exit(1)}" 2>/dev/null || true)" - css_path="$(node -e "try{process.stdout.write(require.resolve('jsdom/lib/jsdom/browser/default-stylesheet.css',{paths:['$ROOT_DIR/apps/agent']}))}catch{process.exit(1)}" 2>/dev/null || true)" - JSDOM_XHR_IMPL_FILE="$(node -e "try{process.stdout.write(require.resolve('jsdom/lib/jsdom/living/xhr/XMLHttpRequest-impl.js',{paths:['$ROOT_DIR/apps/agent']}))}catch{process.exit(1)}" 2>/dev/null || true)" - - if [[ -z "$JSDOM_STYLE_RULES_FILE" || -z "$css_path" || -z "$JSDOM_XHR_IMPL_FILE" ]]; then - log "skip jsdom patch (jsdom sources not resolved)" - return 0 - fi - - JSDOM_STYLE_RULES_BACKUP="${JSDOM_STYLE_RULES_FILE}.memoh.bak" - JSDOM_XHR_IMPL_BACKUP="${JSDOM_XHR_IMPL_FILE}.memoh.bak" - cp "$JSDOM_STYLE_RULES_FILE" "$JSDOM_STYLE_RULES_BACKUP" - cp "$JSDOM_XHR_IMPL_FILE" "$JSDOM_XHR_IMPL_BACKUP" - css_json="$(node -e "const fs=require('fs');process.stdout.write(JSON.stringify(fs.readFileSync(process.argv[1],'utf8')))" "$css_path")" - - node - "$JSDOM_STYLE_RULES_FILE" "$css_json" <<'NODE' -const fs = require("fs"); -const file = process.argv[2]; -const css = process.argv[3]; -let src = fs.readFileSync(file, "utf8"); -const pattern = /const defaultStyleSheet = fs\.readFileSync\([\s\S]*?\);\n/; -if (!pattern.test(src)) { - console.error("[release] jsdom patch target not found"); - 
process.exit(1); -} -src = src.replace(pattern, `const defaultStyleSheet = ${css};\n`); -fs.writeFileSync(file, src, "utf8"); -NODE - - node - "$JSDOM_XHR_IMPL_FILE" <<'NODE' -const fs = require("fs"); -const file = process.argv[2]; -let src = fs.readFileSync(file, "utf8"); -const pattern = /const syncWorkerFile = require\.resolve \? require\.resolve\("\.\/xhr-sync-worker\.js"\) : null;/; -if (!pattern.test(src)) { - console.error("[release] jsdom xhr patch target not found"); - process.exit(1); -} -src = src.replace(pattern, 'const syncWorkerFile = `${__dirname}/xhr-sync-worker.js`;'); -fs.writeFileSync(file, src, "utf8"); -NODE - - log "patched jsdom style loader for compile-time embedding" -} - -restore_jsdom_style_loader_patch() { - if [[ -n "$JSDOM_STYLE_RULES_BACKUP" && -f "$JSDOM_STYLE_RULES_BACKUP" && -n "$JSDOM_STYLE_RULES_FILE" ]]; then - mv "$JSDOM_STYLE_RULES_BACKUP" "$JSDOM_STYLE_RULES_FILE" - fi - if [[ -n "$JSDOM_XHR_IMPL_BACKUP" && -f "$JSDOM_XHR_IMPL_BACKUP" && -n "$JSDOM_XHR_IMPL_FILE" ]]; then - mv "$JSDOM_XHR_IMPL_BACKUP" "$JSDOM_XHR_IMPL_FILE" - fi - log "restored jsdom compile-time patches" -} - -compress_agent_bin_if_enabled() { - local bin_path="$1" - local target_os="$2" - - if [[ "$UPX_COMPRESS_AGENT_BIN" != "true" ]]; then - return 0 - fi - ensure_upx_available - if [[ "$target_os" == "darwin" && "$UPX_ALLOW_DARWIN" != "true" ]]; then - log "skip upx on darwin (set UPX_ALLOW_DARWIN=true to force)" - return 0 - fi - - local before_bytes after_bytes - before_bytes="$(wc -c < "$bin_path" | tr -d ' ')" - read -r -a upx_flags <<< "$UPX_ARGS" - upx "${upx_flags[@]}" "$bin_path" - after_bytes="$(wc -c < "$bin_path" | tr -d ' ')" - log "upx compressed agent-bin: ${before_bytes} -> ${after_bytes} bytes" -} - -ensure_upx_available() { - if command -v upx >/dev/null 2>&1; then - return 0 - fi - if [[ "$AUTO_INSTALL_UPX" != "true" ]]; then - echo "[release] UPX_COMPRESS_AGENT_BIN=true but upx not found in PATH" >&2 - echo "[release] install upx or set AUTO_INSTALL_UPX=true" >&2 - exit 1 - fi - - log "upx not found; attempting auto-install" - if [[ "$OSTYPE" == linux* ]] && command -v apt-get >/dev/null 2>&1; then - if command -v sudo >/dev/null 2>&1; then - sudo apt-get update -y && sudo apt-get install -y upx-ucl - else - apt-get update -y && apt-get install -y upx-ucl - fi - elif [[ "$OSTYPE" == darwin* ]] && command -v brew >/dev/null 2>&1; then - brew install upx - fi - - if ! command -v upx >/dev/null 2>&1; then - echo "[release] failed to auto-install upx" >&2 - exit 1 - fi + log "embedded assets prepared" } gzip_embedded_web_assets() { diff --git a/skills-lock.json b/skills-lock.json index 3a8649a1..c400a560 100644 --- a/skills-lock.json +++ b/skills-lock.json @@ -4,7 +4,7 @@ "twilight-ai": { "source": "memohai/twilight-ai", "sourceType": "github", - "computedHash": "f52a544c699944def25f46ac924b9e49cbf6b951f768325a0df9dd3f3fb512ab" + "computedHash": "84e60a3bf2743a7e47bb85a3bdb989c63caacf10d72a6e63e77c777cfffc30d5" } } }