2 changes: 2 additions & 0 deletions lib/active_agent/providers/open_ai/chat_provider.rb
@@ -95,6 +95,8 @@ def api_response_normalize(api_response)
 def process_stream_chunk(api_response_event)
   instrument("stream_chunk.active_agent")
 
+  broadcast_stream_open
+
Comment on lines 95 to +99

Copilot AI Feb 12, 2026

Add a regression test that asserts the OpenAI (and ideally Ollama) provider streaming lifecycle emits :open and :close events when stream: true. Right now the change fixes a lifecycle bug, but there isn't provider-level test coverage to prevent broadcast_stream_open from being accidentally removed again (similar to the existing mock provider streaming test).

   # Called Multiple Times: [Chunk<T>, T]<Content, ToolsCall>
   case api_response_event.type
   when :chunk
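
The review comment above asks for a provider-level regression test covering the streaming lifecycle. A minimal sketch of what such a test could look like follows; the on_stream hook, the with_stubbed_stream helper, and the constructor arguments are illustrative assumptions rather than the gem's confirmed API (the class namespace is inferred from the file path), and should be adapted to mirror the existing mock provider streaming test.

require "test_helper"

# Regression-test sketch only: on_stream, with_stubbed_stream, and the
# constructor arguments below are assumptions for illustration, not
# ActiveAgent's confirmed interface.
class OpenAIChatProviderStreamingTest < ActiveSupport::TestCase
  test "streaming lifecycle emits :open before the first chunk and :close after the last" do
    events = []

    provider = ActiveAgent::Providers::OpenAI::ChatProvider.new(stream: true)

    # Assumed hook: collect lifecycle event types as they are broadcast.
    provider.on_stream { |event| events << event.type }

    # Assumed helper: replay canned chunks through process_stream_chunk
    # instead of calling the OpenAI API.
    with_stubbed_stream(provider, chunks: ["Hel", "lo"]) do
      provider.prompt("Say hello")
    end

    assert_equal :open,  events.first, "broadcast_stream_open should fire before any chunk"
    assert_equal :close, events.last,  "the stream should be closed after the final chunk"
    assert_includes events, :chunk
  end
end

Asserting on the first and last event types pins the lifecycle order, so removing broadcast_stream_open (or the matching close broadcast) again would fail this test instead of slipping through unnoticed.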