mirror of
https://github.com/memohai/Memoh.git
synced 2026-04-25 07:00:48 +09:00
43c4153938
* refactor: introduce DCP pipeline layer for unified context assembly
Introduce a Deterministic Context Pipeline (DCP) inspired by Cahciua,
providing event-driven context assembly for LLM conversations.
- Add `internal/pipeline/` package with Canonical Event types, Projection
(reduce), Rendering (XML RC), Pipeline manager, and EventStore persistence
- Change user message format from YAML front-matter to XML `<message>` tags
with self-contained attributes (sender, channel, conversation, type)
- Merge CLI/Web dual API into single `/local/` endpoint, remove CLI handler
- Add `bot_session_events` table for event persistence and cold-start replay
- Add `discuss` session type (reserved for future Cahciua-style mode)
- Wire pipeline into HandleInbound: adapt → persist → push on every message
- Lazy cold-start replay: load events from DB on first session access
* feat: implement discuss mode with reactive driver and probe gate
Add discuss session mode where the bot autonomously decides when to speak
in group chats via tool-gated output (send tool only, no direct text reply).
- Add discuss driver (per-session goroutine, RC watch, step loop via
agent.Generate, TR persistence, late-binding prompt with mention hints)
- Add system_discuss.md prompt template ("text = inner monologue, send = speak")
- Add context composition (MergeContext, ComposeContext, TrimContext) for
RC + assistant/tool message interleaving by timestamp
- Add probe gate: when discuss_probe_model_id is set, cheap model pre-filters
group messages; no tool calls = silence, tool calls = activate primary
- Add /new [chat|discuss] command: explicit mode selection, defaults to
discuss in groups, chat in DMs, chat-only for WebUI
- Add ResolveRunConfig on flow.Resolver for discuss driver to reuse
model/tools/system-prompt resolution without reimplementing
- Fix send tool for discuss mode: same-conversation sends now go through
SendDirect (channel adapter) instead of the local emitter shortcut
- Add target attribute to XML message format (reply_target for routing)
- Add discuss_probe_model_id to bots table settings
- Remove pipeline compaction (SetCompactCursor) — reuse existing compaction.Service
- Persist full SDK messages (including tool calls) in discuss mode
* refactor: unify DCP event layer, fix persistence and local channel
- Fix bot_session_events dedup index to include event_kind so that
message + edit events for the same external_message_id coexist.
- Change CreateSessionEvent from :one to :exec so ON CONFLICT DO NOTHING
does not produce spurious errors on duplicate delivery.
- Move ACL evaluation before event ingest; denied messages no longer
enter bot_session_events or the in-memory pipeline.
- Let chat mode consume RenderedContext from the DCP pipeline when
available, sharing the same event-driven context assembly as discuss.
- Collapse local WebSocket handler to route through HandleInbound
instead of directly calling StreamChatWS, eliminating the dual
business entry point.
- Extract buildBaseRunConfig shared builder so resolve() and
ResolveRunConfig() no longer duplicate model/credentials/skills setup.
- Add StoreRound to RunConfigResolver interface so discuss driver
persists assistant output with full metadata, usage, and memory
extraction (same quality as chat mode).
- Fix discuss driver context: use context.Background() instead of the
short-lived HTTP request context that was getting cancelled.
- Fix model ID passed to StoreRound: return database UUID from
ResolveRunConfig instead of SDK model name.
- Remove dead CLIAdapter/CLIType and update legacy web/cli references
in tests and comments.
* fix: stop idle discuss goroutines after 10min timeout
Discuss session goroutines were never cleaned up when a session became
inactive (e.g. after /new). Add a 10-minute idle timer that auto-exits
the goroutine and removes it from the sessions map when no new RC
arrives.
* refactor: pipeline details — event types, structured reply, display content
- Remove [User sent N attachments] placeholder text from buildInboundQuery;
attachment info is now expressed via pipeline <attachment> tags.
- Unify in-reply-to as structured ReplyRef (Sender/Preview fields) across
Telegram, Discord, Feishu, and Matrix adapters instead of prepending
[Reply to ...] text into the message body. Remove now-unused
buildTelegramQuotedText, buildDiscordQuotedText, buildMatrixQuotedText.
- Make AdaptInbound return CanonicalEvent interface and dispatch to
adaptMessage/adaptEdit/adaptService based on metadata["event_type"].
- Add event_id column to bot_history_messages (migration 0059) so user
messages can reference their canonical pipeline event.
- PersistEvent now returns the event UUID; HandleInbound passes it through
to both persistPassiveMessage and ChatRequest.EventID for storeRound.
- Add FillDisplayContent to message service: extracts plain text from
event_data for clean frontend display.
- Frontend extractMessageText prefers display_content when available,
falling back to legacy strip logic for old messages.
- Fix: always generate headerifiedQuery for storage even when usePipeline
is true, so user messages are persisted via storeRound in chat mode.
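The structured in-reply-to change can be sketched like this. The ReplyRef fields (Sender, Preview) are named in the commit; buildReplyRef and its truncation policy are illustrative, not the adapters' actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// ReplyRef carries the quoted message as structured data the pipeline can
// render uniformly, instead of "[Reply to alice: ...]" text prepended to
// the message body.
type ReplyRef struct {
	Sender  string `json:"sender,omitempty"`
	Preview string `json:"preview,omitempty"`
}

func buildReplyRef(sender, quotedText string, maxPreview int) *ReplyRef {
	r := []rune(strings.TrimSpace(quotedText))
	if len(r) > maxPreview {
		r = append(r[:maxPreview], '…') // truncate by runes, not bytes
	}
	preview := string(r)
	if sender == "" && preview == "" {
		return nil // nothing quoted: no reply reference at all
	}
	return &ReplyRef{Sender: sender, Preview: preview}
}

func main() {
	ref := buildReplyRef("alice", "could you paste the full stack trace?", 20)
	fmt.Printf("%+v\n", *ref)
}
```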
* fix: use json.Marshal for pipeline context content serialization
The manual string escaping in buildMessagesFromPipeline only handled
double quotes but not newlines, backslashes, and other JSON special
characters, producing invalid json.RawMessage values. The LLM then
received empty/malformed context and complained about having no history.
* fix: restore WebSocket handler to use StreamChatWS directly
The previous refactoring replaced the WS handler with HandleInbound +
RouteHub subscription, which broke streaming because RouteHub events
use a different format (channel.StreamEvent) than what the frontend
expects (flow.WSStreamEvent with text_delta, tool_call_start, etc.).
Restore the original direct StreamChatWS call path so WebUI streaming
works again. The WS handler now matches the pre-refactoring behavior
while all other changes (pipeline, ACL, event types, etc.) are kept.
* feat: store display_text directly in bot_history_messages
Instead of computing display content at API response time by querying
bot_session_events via event_id, store the raw user text in a dedicated
display_text column at write time. This works for all paths including
the WebSocket handler which does not go through the pipeline/event layer.
- Migration 0060: add display_text TEXT column
- PersistInput gains DisplayText; filled from trimmedText (passive) and
req.Query (storeRound)
- toMessageFields reads display_text into DisplayContent
- Remove FillDisplayContent runtime query and ListSessionEventsByEventID
- Frontend already prefers display_content when available (no change)
* fix: display_text should contain raw user text, not XML-wrapped query
req.Query gets overwritten to headerifiedQuery (with XML <message> tags)
before storeRound runs. Add RawQuery field to ChatRequest to preserve
the original user text, and use it for display_text in storeMessages.
* fix(web): show discuss sessions
* chore(feishu): change discuss output to stream card
* fix(channel): unify discuss/chat send path and card markdown delivery
* feat(discuss): switch to stream execution with RouteHub broadcasting
* refactor(pipeline): remove context trimming from ComposeContext
The pipeline path should not trim context by token budget — the
upstream IC/RC already bounds the event window. Remove TrimContext,
FindWorkingWindowCursor, EstimateTokens, FormatLastProcessedMs (all
unused or only used for trimming), the maxTokens parameter from
ComposeContext, and MaxContextTokens from DiscussSessionConfig.
---------
Co-authored-by: 晨苒 <16112591+chen-ran@users.noreply.github.com>
435 lines
15 KiB
Go
// Package channel provides a unified abstraction for multi-platform messaging channels.
// It defines types, interfaces, and a registry for channel adapters such as Telegram and Feishu.
package channel

import (
	"strings"
	"time"
)

// ChannelType identifies a messaging platform (e.g., "telegram", "feishu").
type ChannelType string

// String returns the channel type as a plain string.
func (c ChannelType) String() string {
	return string(c)
}

// Identity represents a sender's identity on a channel.
type Identity struct {
	SubjectID   string
	DisplayName string
	Attributes  map[string]string
}

// Attribute returns the trimmed value for the given key, or empty string if absent.
func (i Identity) Attribute(key string) string {
	if i.Attributes == nil {
		return ""
	}
	return strings.TrimSpace(i.Attributes[key])
}

// Conversation holds metadata about the chat or group context.
type Conversation struct {
	ID       string
	Type     string
	Name     string
	ThreadID string
	Metadata map[string]any
}

const (
	ConversationTypePrivate = "private"
	ConversationTypeGroup   = "group"
	ConversationTypeThread  = "thread"
)

// NormalizeConversationType normalizes conversation type values within the
// channel abstraction domain: private/group/thread.
func NormalizeConversationType(raw string) string {
	switch strings.ToLower(strings.TrimSpace(raw)) {
	case "", "p2p", "direct", ConversationTypePrivate:
		return ConversationTypePrivate
	case ConversationTypeThread:
		return ConversationTypeThread
	case ConversationTypeGroup:
		return ConversationTypeGroup
	default:
		return ConversationTypeGroup
	}
}

// IsPrivateConversationType reports whether the conversation is private.
func IsPrivateConversationType(raw string) bool {
	return NormalizeConversationType(raw) == ConversationTypePrivate
}

// InboundMessage is a message received from an external channel.
type InboundMessage struct {
	Channel      ChannelType
	Message      Message
	BotID        string
	ReplyTarget  string
	RouteKey     string
	Sender       Identity
	Conversation Conversation
	ReceivedAt   time.Time
	Source       string
	Metadata     map[string]any
}

// RoutingKey returns a stable identifier used for reply routing.
// Format: platform:bot_id:conversation_id[:sender_id].
func (m InboundMessage) RoutingKey() string {
	if strings.TrimSpace(m.RouteKey) != "" {
		return strings.TrimSpace(m.RouteKey)
	}
	senderID := strings.TrimSpace(m.Sender.SubjectID)
	if senderID == "" {
		senderID = strings.TrimSpace(m.Sender.DisplayName)
	}
	return GenerateRoutingKey(string(m.Channel), m.BotID, m.Conversation.ID, m.Conversation.Type, senderID)
}

// GenerateRoutingKey builds a route key from platform, bot, conversation, and sender info.
// For group chats, the sender ID is appended to provide per-user context.
func GenerateRoutingKey(platform, botID, conversationID, conversationType, senderID string) string {
	parts := []string{platform, botID, conversationID}
	if !IsPrivateConversationType(conversationType) {
		senderID = strings.TrimSpace(senderID)
		if senderID != "" {
			parts = append(parts, senderID)
		}
	}
	return strings.Join(parts, ":")
}

// OutboundMessage pairs a delivery target with the message content.
type OutboundMessage struct {
	Target  string  `json:"target"`
	Message Message `json:"message"`
}

// StreamEventType defines the kind of outbound stream event.
type StreamEventType string

const (
	StreamEventStatus              StreamEventType = "status"
	StreamEventDelta               StreamEventType = "delta"
	StreamEventFinal               StreamEventType = "final"
	StreamEventError               StreamEventType = "error"
	StreamEventToolCallStart       StreamEventType = "tool_call_start"
	StreamEventToolCallEnd         StreamEventType = "tool_call_end"
	StreamEventPhaseStart          StreamEventType = "phase_start"
	StreamEventPhaseEnd            StreamEventType = "phase_end"
	StreamEventAttachment          StreamEventType = "attachment"
	StreamEventAgentStart          StreamEventType = "agent_start"
	StreamEventAgentEnd            StreamEventType = "agent_end"
	StreamEventReaction            StreamEventType = "reaction"
	StreamEventSpeech              StreamEventType = "speech"
	StreamEventProcessingStarted   StreamEventType = "processing_started"
	StreamEventProcessingCompleted StreamEventType = "processing_completed"
	StreamEventProcessingFailed    StreamEventType = "processing_failed"
)

// StreamStatus indicates the lifecycle state of a streaming reply.
type StreamStatus string

const (
	StreamStatusStarted   StreamStatus = "started"
	StreamStatusCompleted StreamStatus = "completed"
	StreamStatusFailed    StreamStatus = "failed"
)

// StreamFinalizePayload carries the final reply message emitted by a stream.
type StreamFinalizePayload struct {
	Message Message `json:"message"`
}

// StreamToolCall carries tool invocation data for tool_call_start / tool_call_end events.
type StreamToolCall struct {
	Name   string `json:"name"`
	CallID string `json:"call_id,omitempty"`
	Input  any    `json:"input,omitempty"`
	Result any    `json:"result,omitempty"`
}

// StreamPhase labels a processing stage within a stream (e.g., reasoning, text).
type StreamPhase string

const (
	StreamPhaseReasoning StreamPhase = "reasoning"
	StreamPhaseText      StreamPhase = "text"
)

// StreamEvent represents a unified stream event routed through the channel layer.
type StreamEvent struct {
	Type        StreamEventType        `json:"type"`
	Status      StreamStatus           `json:"status,omitempty"`
	Delta       string                 `json:"delta,omitempty"`
	Final       *StreamFinalizePayload `json:"final,omitempty"`
	Error       string                 `json:"error,omitempty"`
	ToolCall    *StreamToolCall        `json:"tool_call,omitempty"`
	Phase       StreamPhase            `json:"phase,omitempty"`
	Attachments []Attachment           `json:"attachments,omitempty"`
	Reactions   []ReactRequest         `json:"reactions,omitempty"`
	Speeches    []SpeechRequest        `json:"speeches,omitempty"`
	Metadata    map[string]any         `json:"metadata,omitempty"`
}

// SpeechRequest carries text-to-speech synthesis text from a speech_delta stream event.
type SpeechRequest struct {
	Text string `json:"text"`
}

// StreamOptions configures how an outbound stream is initialized.
type StreamOptions struct {
	Reply           *ReplyRef      `json:"reply,omitempty"`
	SourceMessageID string         `json:"source_message_id,omitempty"`
	Metadata        map[string]any `json:"metadata,omitempty"`
}

// MessageFormat indicates how the message text should be rendered.
type MessageFormat string

const (
	MessageFormatPlain    MessageFormat = "plain"
	MessageFormatMarkdown MessageFormat = "markdown"
	MessageFormatRich     MessageFormat = "rich"
)

// MessagePartType identifies the kind of a rich-text message part.
type MessagePartType string

const (
	MessagePartText      MessagePartType = "text"
	MessagePartLink      MessagePartType = "link"
	MessagePartCodeBlock MessagePartType = "code_block"
	MessagePartMention   MessagePartType = "mention"
	MessagePartEmoji     MessagePartType = "emoji"
)

// MessageTextStyle describes inline formatting for a text part.
type MessageTextStyle string

const (
	MessageStyleBold          MessageTextStyle = "bold"
	MessageStyleItalic        MessageTextStyle = "italic"
	MessageStyleStrikethrough MessageTextStyle = "strikethrough"
	MessageStyleCode          MessageTextStyle = "code"
)

// MessagePart is a single element within a rich-text message.
type MessagePart struct {
	Type              MessagePartType    `json:"type"`
	Text              string             `json:"text,omitempty"`
	URL               string             `json:"url,omitempty"`
	Styles            []MessageTextStyle `json:"styles,omitempty"`
	Language          string             `json:"language,omitempty"`
	ChannelIdentityID string             `json:"channel_identity_id,omitempty"`
	Emoji             string             `json:"emoji,omitempty"`
	Metadata          map[string]any     `json:"metadata,omitempty"`
}

// AttachmentType classifies the kind of binary attachment.
type AttachmentType string

const (
	AttachmentImage AttachmentType = "image"
	AttachmentAudio AttachmentType = "audio"
	AttachmentVideo AttachmentType = "video"
	AttachmentVoice AttachmentType = "voice"
	AttachmentFile  AttachmentType = "file"
	AttachmentGIF   AttachmentType = "gif"
)

// Attachment represents a binary file attached to a message.
type Attachment struct {
	Type           AttachmentType `json:"type"`
	URL            string         `json:"url,omitempty"`
	PlatformKey    string         `json:"platform_key,omitempty"`
	SourcePlatform string         `json:"source_platform,omitempty"`
	ContentHash    string         `json:"content_hash,omitempty"`
	Base64         string         `json:"base64,omitempty"` // data URL for agent delivery
	Name           string         `json:"name,omitempty"`
	Size           int64          `json:"size,omitempty"`
	Mime           string         `json:"mime,omitempty"`
	DurationMs     int64          `json:"duration_ms,omitempty"`
	Width          int            `json:"width,omitempty"`
	Height         int            `json:"height,omitempty"`
	ThumbnailURL   string         `json:"thumbnail_url,omitempty"`
	Caption        string         `json:"caption,omitempty"`
	Metadata       map[string]any `json:"metadata,omitempty"`
}

// Reference returns the strongest available attachment reference.
// URL is preferred for cross-platform portability, then platform key.
func (a Attachment) Reference() string {
	if strings.TrimSpace(a.URL) != "" {
		return strings.TrimSpace(a.URL)
	}
	return strings.TrimSpace(a.PlatformKey)
}

// HasReference reports whether URL or platform key is available.
func (a Attachment) HasReference() bool {
	return a.Reference() != ""
}

// Action describes an interactive button or link in a message.
type Action struct {
	Type  string `json:"type"`
	Label string `json:"label,omitempty"`
	Value string `json:"value,omitempty"`
	URL   string `json:"url,omitempty"`
}

// ThreadRef references a conversation thread by ID.
type ThreadRef struct {
	ID string `json:"id"`
}

// ReplyRef points to a message being replied to.
type ReplyRef struct {
	Target    string `json:"target,omitempty"`
	MessageID string `json:"message_id,omitempty"`
	Sender    string `json:"sender,omitempty"`
	Preview   string `json:"preview,omitempty"`
}

// Message is the unified message structure used across all channels.
type Message struct {
	ID          string         `json:"id,omitempty"`
	Format      MessageFormat  `json:"format,omitempty"`
	Text        string         `json:"text,omitempty"`
	Parts       []MessagePart  `json:"parts,omitempty"`
	Attachments []Attachment   `json:"attachments,omitempty"`
	Actions     []Action       `json:"actions,omitempty"`
	Thread      *ThreadRef     `json:"thread,omitempty"`
	Reply       *ReplyRef      `json:"reply,omitempty"`
	Metadata    map[string]any `json:"metadata,omitempty"`
}

// IsEmpty reports whether the message carries no content.
func (m Message) IsEmpty() bool {
	return strings.TrimSpace(m.Text) == "" &&
		len(m.Parts) == 0 &&
		len(m.Attachments) == 0 &&
		len(m.Actions) == 0
}

// PlainText extracts the plain text representation of the message.
func (m Message) PlainText() string {
	if strings.TrimSpace(m.Text) != "" {
		return strings.TrimSpace(m.Text)
	}
	if len(m.Parts) == 0 {
		return ""
	}
	lines := make([]string, 0, len(m.Parts))
	for _, part := range m.Parts {
		switch part.Type {
		case MessagePartText, MessagePartLink, MessagePartCodeBlock, MessagePartMention, MessagePartEmoji:
			value := strings.TrimSpace(part.Text)
			if value == "" && part.Type == MessagePartLink {
				value = strings.TrimSpace(part.URL)
			}
			if value == "" && part.Type == MessagePartEmoji {
				value = strings.TrimSpace(part.Emoji)
			}
			if value == "" {
				continue
			}
			lines = append(lines, value)
		default:
			continue
		}
	}
	return strings.Join(lines, "\n")
}

// BindingCriteria specifies conditions for matching a user-channel binding.
type BindingCriteria struct {
	SubjectID  string
	Attributes map[string]string
}

// Attribute returns the trimmed value for the given key, or empty string if absent.
func (c BindingCriteria) Attribute(key string) string {
	if c.Attributes == nil {
		return ""
	}
	return strings.TrimSpace(c.Attributes[key])
}

// BindingCriteriaFromIdentity creates BindingCriteria from a channel Identity.
func BindingCriteriaFromIdentity(identity Identity) BindingCriteria {
	return BindingCriteria{
		SubjectID:  strings.TrimSpace(identity.SubjectID),
		Attributes: identity.Attributes,
	}
}

// ChannelConfig holds the configuration for a bot's channel integration.
// Disabled: true means the channel is stopped (not connected); false means enabled.
type ChannelConfig struct {
	ID               string         `json:"id"`
	BotID            string         `json:"bot_id"`
	ChannelType      ChannelType    `json:"channel_type"`
	Credentials      map[string]any `json:"credentials"`
	ExternalIdentity string         `json:"external_identity"`
	SelfIdentity     map[string]any `json:"self_identity"`
	Routing          map[string]any `json:"routing"`
	Disabled         bool           `json:"disabled"`
	VerifiedAt       time.Time      `json:"verified_at"`
	CreatedAt        time.Time      `json:"created_at"`
	UpdatedAt        time.Time      `json:"updated_at"`
}

// ChannelIdentityBinding represents a channel identity's binding to a specific channel type.
type ChannelIdentityBinding struct {
	ID                string         `json:"id"`
	ChannelType       ChannelType    `json:"channel_type"`
	ChannelIdentityID string         `json:"channel_identity_id"`
	Config            map[string]any `json:"config"`
	CreatedAt         time.Time      `json:"created_at"`
	UpdatedAt         time.Time      `json:"updated_at"`
}

// UpsertConfigRequest is the input for creating or updating a channel configuration.
// Disabled: true to stop the channel, false to enable it. Omitted is treated as false (enabled).
type UpsertConfigRequest struct {
	Credentials      map[string]any `json:"credentials"`
	ExternalIdentity string         `json:"external_identity,omitempty"`
	SelfIdentity     map[string]any `json:"self_identity,omitempty"`
	Routing          map[string]any `json:"routing,omitempty"`
	Disabled         *bool          `json:"disabled,omitempty"`
	VerifiedAt       *time.Time     `json:"verified_at,omitempty"`
}

// UpsertChannelIdentityConfigRequest is the input for creating or updating a channel-identity binding.
type UpsertChannelIdentityConfigRequest struct {
	Config map[string]any `json:"config"`
}

// UpdateChannelStatusRequest is the input for enabling/disabling a bot channel config.
type UpdateChannelStatusRequest struct {
	Disabled bool `json:"disabled"`
}

// SendRequest is the input for sending an outbound message through a channel.
type SendRequest struct {
	Target            string  `json:"target,omitempty"`
	ChannelIdentityID string  `json:"channel_identity_id,omitempty"`
	Message           Message `json:"message"`
}

// ReactRequest is the input for adding or removing an emoji reaction on a message.
type ReactRequest struct {
	Target    string `json:"target"`
	MessageID string `json:"message_id"`
	Emoji     string `json:"emoji"`
	Remove    bool   `json:"remove,omitempty"`
}