fix: adaptive LLM progress estimation and emit 85% on stream end #6

Merged
admin merged 1 commit from fix/adaptive-llm-progress into main 2026-03-14 13:43:27 +00:00
Owner

Hardcoded EXPECTED_CHARS (15k) caused progress to stall at ~20-25% for
short statements. Now expected size is derived from input text length.
Also emit an explicit 85% event when the LLM stream finishes, and
throttle SSE events to 300ms to reduce browser overhead.
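The change described above can be sketched roughly as follows. This is an illustrative Python sketch, not the actual implementation; the names `estimate_expected_chars` and `ProgressThrottler`, the 1.5x ratio, and the 500-char floor are all assumptions for demonstration:

```python
import time

def estimate_expected_chars(input_text: str,
                            ratio: float = 1.5,
                            floor: int = 500) -> int:
    """Derive expected LLM output size from the input length instead of
    a hardcoded 15k constant, so short statements no longer stall at
    ~20-25%. The ratio and floor here are hypothetical tuning values."""
    return max(floor, int(len(input_text) * ratio))


class ProgressThrottler:
    """Emit at most one progress event per `interval` seconds (300ms),
    to reduce browser overhead from rapid SSE updates."""

    def __init__(self, interval: float = 0.3):
        self.interval = interval
        self._last = 0.0

    def maybe_emit(self, received: int, expected: int, emit) -> None:
        now = time.monotonic()
        if now - self._last < self.interval:
            return  # throttled: too soon since the last event
        self._last = now
        # Cap streamed progress below 85%; the explicit 85% event is
        # reserved for stream end.
        emit(min(84, int(received / expected * 84)))

    def finish(self, emit) -> None:
        # Explicit 85% when the LLM stream ends, regardless of how
        # accurate the size estimate turned out to be.
        emit(85)
```

A consumer would call `maybe_emit` on each streamed chunk and `finish` once when the stream closes, guaranteeing the bar reaches 85% even if the estimate overshot.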
admin added 1 commit 2026-03-14 13:43:25 +00:00
admin merged commit 22be09c101 into main 2026-03-14 13:43:27 +00:00

Reference: admin/family_budget#6