This PR introduces an upgraded Realtime streams backend and SDK that makes streams more reliable (and resumable), with limits increased or removed. We've also improved the visibility of streams via the run dashboard.
New limits
See the limits table below for more details:
Additionally, where previously only a single client stream could be sent to a Realtime stream, you can now send multiple client streams to a single Realtime stream.
Reliability improvements
When appending to a stream, the backend will now reliably resume appending from the last chunk index if the connection is lost. We've also improved the reliability of reading from a stream: failed reads automatically resume from the last chunk index after a lost connection.
This means that both sides of the stream are much more reliable and will not lose data even in the face of network issues or other disruptions.
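The resume behavior is handled for you by the backend, but the idea can be illustrated with a small standalone sketch (hypothetical names, not the SDK's actual implementation): the reader tracks the index of the last chunk it received and, after a dropped connection, re-requests the stream starting at that index.

```typescript
// Standalone sketch of resume-from-last-chunk-index reading.
// `fetchChunks` stands in for a network read that may fail mid-stream.
type Chunk = { index: number; data: string };

function makeFlakySource(chunks: Chunk[], failAfter: number) {
  let calls = 0;
  return async function fetchChunks(startIndex: number): Promise<Chunk[]> {
    calls++;
    const slice = chunks.filter((c) => c.index >= startIndex);
    // Simulate a lost connection partway through the first read.
    return calls === 1 ? slice.slice(0, failAfter) : slice;
  };
}

async function readWithResume(
  fetchChunks: (startIndex: number) => Promise<Chunk[]>,
  total: number
): Promise<Chunk[]> {
  const received: Chunk[] = [];
  let nextIndex = 0;
  while (received.length < total) {
    // Resume from the index after the last chunk we successfully received.
    const chunks = await fetchChunks(nextIndex);
    for (const chunk of chunks) {
      received.push(chunk);
      nextIndex = chunk.index + 1;
    }
  }
  return received;
}

const source = [0, 1, 2, 3, 4].map((i) => ({ index: i, data: `chunk-${i}` }));
readWithResume(makeFlakySource(source, 2), 5).then((all) => {
  console.log(all.map((c) => c.data).join(","));
  // → chunk-0,chunk-1,chunk-2,chunk-3,chunk-4
});
```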
SDK improvements
We've moved the streams logic into its own dedicated namespace in the SDK, instead of mixing it in with the other metadata methods:
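For example (a hypothetical sketch — the `streams` namespace is from this PR, but the exact method signatures shown are assumptions):

```typescript
import { streams, metadata } from "@trigger.dev/sdk";

// Before: stream helpers lived alongside the run metadata methods
// (e.g. metadata.stream(...)). Now streams have their own namespace:
await streams.append("my-stream", "a single chunk");
```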
You can now pipe to a stream using the `streams.pipe` method, which returns a result that can be used to wait until the stream is complete. When calling `streams.pipe` from inside a task, the stream is automatically associated with the current run. You can also optionally specify a target run ID to pipe to a stream on a different run — which means that, with a target run ID, you can pipe to a stream from outside of a task.
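A hypothetical sketch of both cases (the `streams.pipe` name is from this PR; the result's `waitUntilComplete` method, the option name `target`, and the helper `getSomeReadableStream` are assumptions for illustration):

```typescript
import { streams, task } from "@trigger.dev/sdk";

export const aiTask = task({
  id: "ai-task",
  run: async () => {
    const body = await getSomeReadableStream(); // hypothetical helper

    // Inside a task, the stream is associated with the current run.
    const result = await streams.pipe("ai", body);
    await result.waitUntilComplete(); // assumed shape of the result
  },
});

// Outside of a task, pass a target run ID explicitly (assumed option name):
// await streams.pipe("ai", someStream, { target: runId });
```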
We've also added a new `streams.read` method to read from a stream. You can also specify a timeout and a start index to read from.
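A hypothetical sketch of reading (the `streams.read` name is from this PR; the argument order and option names are assumptions):

```typescript
import { streams } from "@trigger.dev/sdk";

// Read chunks from a stream on a given run (option names are assumptions):
for await (const chunk of streams.read(runId, "ai", {
  timeoutInSeconds: 60, // give up if no new chunk arrives in time
  startIndex: 10, // begin reading from a specific chunk index
})) {
  console.log(chunk);
}
```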
Default stream
Runs now also have a "default" stream, which means you can optionally skip specifying a stream key.
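For instance (a hypothetical sketch — calling `streams.append` without a key writing to the default stream is an assumption about the keyless form):

```typescript
import { streams, task } from "@trigger.dev/sdk";

export const myTask = task({
  id: "my-task",
  run: async () => {
    // No stream key: chunks go to the run's "default" stream.
    await streams.append("hello");
    await streams.append("world");
  },
});
```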
streams.append/writer
You can append a single chunk to a stream using the `streams.append` method, or use the `streams.writer` method to write multiple chunks to a stream or merge one stream into another. Both of these methods accept a target run ID to append/write to a stream on a different run.
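A hypothetical sketch of both methods (the `streams.append` and `streams.writer` names are from this PR; the writer's `write`/`close` methods and the `target` option name are assumptions):

```typescript
import { streams } from "@trigger.dev/sdk";

// Append a single chunk:
await streams.append("logs", "one chunk");

// Or open a writer to send multiple chunks (method names assumed):
const writer = await streams.writer("logs");
await writer.write("first");
await writer.write("second");
await writer.close();

// Both accept a target run ID to write to a stream on a different run:
// await streams.append("logs", "a chunk", { target: runId });
```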
streams.define
You can now define a stream in one place, with its chunk type and stream ID, and use it in multiple places, DRYing up your code.
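A hypothetical sketch (the `streams.define` name is from this PR; the option shape and the methods on the defined stream are assumptions):

```typescript
import { streams } from "@trigger.dev/sdk";

// Define once, with the chunk type and the stream ID:
export const aiStream = streams.define<{ text: string }>({ id: "ai" });

// ...then reuse it anywhere (methods on the defined stream are assumed):
await aiStream.append({ text: "hello" });

for await (const chunk of aiStream.read(runId)) {
  console.log(chunk.text); // chunk is typed as { text: string }
}
```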
New useRealtimeStream hook
We've added a new `useRealtimeStream` hook to subscribe to a stream by its run ID and an optional stream key. Just like the other new functions, you can skip specifying the stream key when using `useRealtimeStream`. You can also pass in the stream instance directly.
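A hypothetical sketch of the hook in a React component (the `useRealtimeStream` name is from this PR; the package path, parameter order, and return shape are assumptions):

```typescript
"use client";
import { useRealtimeStream } from "@trigger.dev/react-hooks";

export function StreamViewer({ runId }: { runId: string }) {
  // Subscribe by run ID and optional stream key (return shape assumed):
  const { parts, error } = useRealtimeStream(runId, "ai");

  if (error) return <p>{error.message}</p>;
  return <pre>{parts?.join("")}</pre>;
}
```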
Dashboard improvements
We're now surfacing streams in the runs dashboard, allowing you to view stream data in real time:
(Demo video: CleanShot.2025-10-24.at.17.19.22.mp4)