
Commit

Fix typo
ahuang11 authored Feb 2, 2024
1 parent efa67b6 · commit a03bd77
Showing 1 changed file with 1 addition and 1 deletion.
docs/concepts/streaming.md: 1 addition & 1 deletion
@@ -8,7 +8,7 @@ This can enhance the user experience by already showing part of the response but

If you want to stream all the tokens generated quickly to your console output,
you can use the `settings.console_stream = True` setting.

-## `strem_to()` wrapper
+## `stream_to()` wrapper

For streaming with non runnable funcchains you can wrap the LLM generation call into the `stream_to()` context manager. This would look like this:

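The code example that follows this line is collapsed in the diff view. For context, here is a minimal sketch of how the `stream_to()` wrapper described above is typically used; the import path, the callback-based signature, and the example chain are assumptions for illustration, not part of this commit:

```python
# Sketch only: the import location and exact API are assumptions,
# not taken from this commit.
from funcchain import chain, settings
from funcchain.backend.streaming import stream_to  # assumed import path

# Alternative mentioned earlier on the same docs page: stream every
# generation straight to the console output.
# settings.console_stream = True

def generate_story(topic: str) -> str:
    """Write a short story about {topic}."""
    return chain()

# Wrap the generation call so each token is forwarded to `print` as it arrives.
with stream_to(print):
    generate_story("a robot learning to paint")
```

Here `stream_to(print)` acts as a context manager that passes each generated token to the given callback, which is what makes partial output visible while the response is still being produced.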

