fix: adaptive LLM progress estimation and emit 85% on stream end #6

Merged
admin merged 1 commit from fix/adaptive-llm-progress into main 2026-03-14 13:43:27 +00:00

1 Commits

Author SHA1 Message Date
vakabunga
78c4730196 fix: adaptive LLM progress estimation and emit 85% on stream end
Hardcoded EXPECTED_CHARS (15k) caused progress to stall at ~20-25% for
short statements. Now expected size is derived from input text length.
Also emit an explicit 85% event when the LLM stream finishes, and
throttle SSE events to 300ms to reduce browser overhead.
2026-03-14 16:41:12 +03:00
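The change described in the commit message can be sketched as follows. This is an illustrative reconstruction, not the actual patch: the names (`OUTPUT_RATIO`, `MIN_EXPECTED`, `ProgressEstimator`) and the specific ratio/floor values are assumptions; only the behaviors named in the commit (expected size derived from input length, an explicit 85% on stream end, 300ms event throttling) come from the source.

```python
import time

# Hypothetical sketch of the adaptive progress estimation from the commit:
# expected output size is derived from input text length instead of a
# hardcoded EXPECTED_CHARS constant. All identifiers are illustrative.

OUTPUT_RATIO = 1.2        # assumed: expected output chars per input char
MIN_EXPECTED = 500        # assumed floor so tiny inputs still progress smoothly
STREAM_END_PROGRESS = 85  # from the commit: explicit 85% when stream finishes

def expected_chars(input_text: str) -> int:
    """Derive the expected output size from the input length."""
    return max(MIN_EXPECTED, int(len(input_text) * OUTPUT_RATIO))

class ProgressEstimator:
    def __init__(self, input_text: str, throttle_ms: int = 300):
        self.expected = expected_chars(input_text)
        self.received = 0
        self.throttle = throttle_ms / 1000.0
        self.last_emit = 0.0  # monotonic timestamp of the last emitted event

    def on_chunk(self, chunk: str):
        """Account for a streamed chunk; return a progress percentage,
        or None when the event should be suppressed by the 300ms throttle."""
        self.received += len(chunk)
        now = time.monotonic()
        if now - self.last_emit < self.throttle:
            return None  # throttled: skip this SSE event to reduce browser load
        self.last_emit = now
        # Cap below 85 so the explicit stream-end event is always the peak.
        return min(84, int(100 * self.received / self.expected))

    def on_stream_end(self) -> int:
        """Emit the explicit 85% event when the LLM stream finishes."""
        return STREAM_END_PROGRESS
```

Deriving `expected` from the input avoids the stall the commit describes: with a hardcoded 15k expectation, a short answer of ~3k characters never reports more than ~20%, whereas here the denominator scales with the statement being processed.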