[WIP] Livestream tutorials (#697)
* wip

* wip

* wip on livestreaming tutorial

* minor edits

* wip on docs

* Refactor flow condition of the durationInMs

* counts and startedAt

* Implement liveDurationInMs

* Add camera and microphone toggle buttons and improve designs

* Update docs for cookbook - watching livestream

---------

Co-authored-by: Thierry Schellenbach <[email protected]>
Co-authored-by: Tommaso Barbugli <[email protected]>
3 people authored Jul 31, 2023
1 parent 9c34fd9 commit 0455866
Showing 21 changed files with 731 additions and 204 deletions.
28 changes: 19 additions & 9 deletions README.md
@@ -104,30 +104,40 @@ Video roadmap and changelog is available [here](https://github.com/GetStream/pro

- [ ] Complete integration with the video demo flow
- [ ] Finish usability testing with design team on chat integration (Jaewoong)
- [X] Ringing: Finish it, make testing easy and write docs for common changes (Daniel)
- [ ] Bug: Screensharing on Firefox has some issues when rendering on android (Daniel)
- [ ] Livestream tutorial (depends on RTMP support) (Thierry)
- [ ] Call Analytics stateflow (Thierry)
- [ ] Pagination on query members & query channel endpoints (Daniel)
- [ ] Enable ice restarts for publisher and subscriber
- [ ] Test coverage
- [ ] Testing on more devices
- [ ] local version of audioLevel(s) for lower latency audio visualizations (Daniel)
- [ ] Android SDK development.md cleanup (Daniel)
- [ ] Logging is too verbose (rtc is very noisy), clean it up to focus on the essential for info and higher

### 0.4.0 milestone

- [ ] Upgrade to more recent versions of webrtc
- [ ] Tap to focus
- [ ] Picture of the video stream at highest resolution
- [ ] Review foreground service vs backend for some things like screensharing etc
- [ ] Audio & Video filters. Support (Daniel)
- [ ] H264 workaround on Samsung 23 (see https://github.com/livekit/client-sdk-android/blob/main/livekit-android-sdk/src/main/java/io/livekit/android/webrtc/SimulcastVideoEncoderFactoryWrapper.kt#L34 and https://github.com/react-native-webrtc/react-native-webrtc/issues/983#issuecomment-975624906)
- [ ] Dynascale 2.0

### 0.5.0 milestone

- [ ] Screensharing from mobile
- [ ] Camera controls

### Dynascale 2.0

- Currently we support selecting which of the 3 layers you want to send: f, h and q. In addition we should support:
- changing the resolution of the f track
- changing the codec that's used from VP8 to h264 or vice versa
- detecting when webrtc changes the resolution of the f track, and notifying the server about it (if needed)

## 💼 We are hiring!

10 changes: 6 additions & 4 deletions docusaurus/docs/Android/02-tutorials/01-video-calling.mdx
@@ -18,15 +18,17 @@ This tutorial teaches you how to build Zoom/Whatsapp style video calling for you
2. Select Phone & Tablet -> **Empty Activity**
3. Name your project **VideoCall**.

Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions.
We recommend using Android Studio Giraffe or newer.

### Step 2 - Install the SDK & Setup the client

**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle.kts`.
If you're new to Android, note that there are two `build.gradle` files; you want to open the one in the `app` folder.

```kotlin
dependencies {
// Stream Video Compose SDK
implementation("io.getstream:stream-video-android-compose:0.2.0")
7 changes: 3 additions & 4 deletions docusaurus/docs/Android/02-tutorials/02-audio-room.mdx
@@ -20,17 +20,16 @@ Time to get started building an audio-room for your app.

### Step 1 - Create a new project in Android Studio

Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions.
We recommend using Android Studio Giraffe or newer.

1. Create a new project
2. Select Phone & Tablet -> **Empty Activity**
3. Name your project **AudioRoom**.

### Step 2 - Install the SDK & Setup the client

**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle.kts`.
If you're new to Android, note that there are two `build.gradle` files; you want to open the one in the `app` folder.

```groovy
146 changes: 74 additions & 72 deletions docusaurus/docs/Android/02-tutorials/03-livestream.mdx
@@ -5,20 +5,13 @@ description: How to build a livestream experience using Stream's video SDKs

import { TokenSnippet } from '../../../shared/_tokenSnippet.jsx';


## Livestream Tutorial

In this tutorial we'll quickly build a low-latency in-app livestreaming experience.
The livestream is broadcast using Stream's edge network of servers around the world.
We'll cover the following topics:

* Ultra low latency streaming
* Multiple streams & co-hosts
* RTMP in and WebRTC input
* Exporting to HLS
* Reactions, custom events and chat
* Recording & Transcriptions
@@ -27,25 +20,27 @@ Let's get started, if you have any questions or feedback be sure to let us know

### Step 1 - Create a new project in Android Studio

Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions.
We recommend using Android Studio Giraffe or newer.

1. Create a new project
2. Select Phone & Template -> **empty activity**
3. Name your project **Livestream**.

### Step 2 - Install the SDK & Setup the client

**Add the video SDK** to your app's `build.gradle.kts` file found in `app/build.gradle.kts`.
If you're new to Android, note that there are two `build.gradle` files; you want to open the one in the `app` folder.

```kotlin
dependencies {
implementation "io.getstream:stream-video-android-compose:$stream_version"
implementation("io.getstream:stream-video-android-compose:0.2.0")

...
}
```

This tutorial uses the compose version of the video SDK. Stream also provides a core library without compose.

### Step 3 - Broadcast a livestream from your phone

@@ -110,7 +105,10 @@ Replace them now with the values shown below:

<TokenSnippet sampleApp='livestream' displayStyle='credentials' />

When you run the app now you'll see a text message saying: "TODO: render video".
Before we get around to rendering the video let's review the code above.

In the first step we set up the user:

```kotlin
val user = User(
@@ -119,15 +117,15 @@ val user = User(
)
```

If you don't have an authenticated user you can also use a guest or anonymous user.
For most apps it's convenient to match your own system of users to grant and remove permissions.

Next we create the client:

```kotlin
val client = StreamVideoBuilder(
context = applicationContext,
apiKey = "hd8szvscpxvd", // demo API key
apiKey = "mmhfdzb5evj2", // demo API key
geo = GEO.GlobalEdgeNetwork,
user = user,
token = userToken,
@@ -136,7 +134,9 @@ val client = StreamVideoBuilder(

You'll see the `userToken` variable. Your backend typically generates the user token on signup or login.
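To make this concrete, here's a rough server-side sketch of what that token generation could look like. It's an illustration only, assuming the `jjwt` library; Stream user tokens are JWTs signed with your API secret, which should live on your server and never ship inside the app.

```kotlin
import io.jsonwebtoken.Jwts
import io.jsonwebtoken.SignatureAlgorithm

// Hypothetical backend sketch: sign a JWT with your Stream API secret.
// The "user_id" claim identifies the user the token is issued for.
fun generateUserToken(apiSecret: String, userId: String): String =
    Jwts.builder()
        .claim("user_id", userId)
        .signWith(SignatureAlgorithm.HS256, apiSecret.toByteArray())
        .compact()
```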

The most important step to review is how we create the call.
Stream uses the same call object for livestreaming, audio rooms and video calling.
Have a look at the code snippet below:

```kotlin
val call = client.call("livestream", callId)
@@ -149,13 +149,20 @@ lifecycleScope.launch {
}
```

First, the call object is created by specifying the call type ("livestream") and the callId.
The "livestream" call type is just a set of defaults that typically work well for a livestream.
You can edit the features, permissions and settings in the dashboard.
The dashboard also allows you to create new call types as needed.

Lastly, `call.join(create = true)` creates the call object on our servers and joins it.
The moment you use `call.join()` the realtime transport for audio and video is started.

Note that you can also add members to a call and assign them different roles. (See the [call creation docs](../03-guides/02-joining-creating-calls.mdx))
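As a rough sketch (the parameter names here are assumptions and may differ per SDK version; the call creation docs linked above are authoritative):

```kotlin
// Hypothetical sketch: create the call with members ahead of joining.
lifecycleScope.launch {
    call.create(
        memberIds = listOf("thierry", "tommaso"),
        custom = mapOf("title" to "My first livestream")
    )
}
```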

### Step 4 - Rendering the video

In this step we're going to build a UI for showing your local video with a button to start the livestream.
This example uses Compose, but you could also use our XML VideoRenderer.

In `MainActivity.kt` replace the `VideoTheme` with the following code.

@@ -244,12 +251,15 @@ If you now run your app you should see an interface like this:

![Livestream](../assets/tutorial-livestream.png)

Stream uses a technology called SFU cascading to replicate your livestream over different SFUs around the world.
This makes it possible to reach a large audience in realtime.

Now let's press "go live" in the android app and click the link below to watch the video in your browser.

<TokenSnippet sampleApp='livestream' displayStyle='join' />

Let's take a moment to review the Compose code above. `Call.state` exposes all the stateflow objects you need.

The most important ones are:

```
call.state.participants

The [participant state docs](../03-guides/03-call-and-participant-state.mdx) show all the available data.
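In Compose you typically collect these StateFlows as state. A minimal sketch (assuming the standard `collectAsState` extension from the Compose runtime and the SDK's `Call` type; package names are assumptions):

```kotlin
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import io.getstream.video.android.core.Call

@Composable
fun LivestreamLabel(call: Call) {
    // Recomposes whenever the underlying StateFlows emit a new value.
    val backstage by call.state.backstage.collectAsState()
    val participants by call.state.participants.collectAsState()

    Text(text = if (backstage) "Backstage" else "Live: ${participants.size} participants")
}
```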

The livestream layout is built using standard Compose. The [VideoRenderer](../04-ui-components/02-video-renderer.mdx) component is provided by Stream.
VideoRenderer renders the video and a fallback. You can use it for rendering the local and remote video.
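For example, rendering your own (local) video could look roughly like this. A sketch only: the `video` parameter name is an assumption, so check the VideoRenderer docs linked above for the exact signature.

```kotlin
// Inside a @Composable scope: render the local participant's video track.
val me by call.state.me.collectAsState()
val video = me?.video?.collectAsState()?.value

VideoRenderer(
    call = call,
    video = video,
    modifier = Modifier.fillMaxSize()
)
```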

### Step 5 - (Optional) Publishing RTMP using OBS

The example above showed how to publish your phone's camera to the livestream.
Almost all livestream software and hardware supports RTMPS.
So let's see how to publish using RTMPS. Feel free to skip this step if you don't need to use RTMPS.

A. Log the URL & Stream Key

```kotlin
val rtmp = call.state.ingress.rtmp
Log.i("Tutorial", "RTMP url and streamingKey: $rtmp")
```

B. Open OBS and go to settings -> stream

- Select "custom" service
- Server: equal to the server URL from the log
- Stream key: equal to the stream key from the log

Press start streaming in OBS. The RTMP stream will now show up in your call just like a regular video participant.
Now that we've learned to publish using WebRTC or RTMP let's talk about watching the livestream.

### Step 6 - Viewing a livestream (WebRTC)

Watching a livestream is even easier than broadcasting.

Compared to the current code in `MainActivity.kt` you:

* Don't need to request permissions or enable the camera
* Don't render the local video, but instead render the remote video
* Typically include some small UI elements like viewer count, a button to mute, etc.

The [docs on building a UI for watching a livestream](../05-ui-cookbook/15-watching-livestream.mdx) explain this in more detail.
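As a minimal sketch of that difference (hedged: `isLocal` and the exact participant state APIs are assumptions based on the participant state docs):

```kotlin
// Inside a @Composable scope: render the host's video instead of your own.
val participants by call.state.participants.collectAsState()
val host = participants.firstOrNull { !it.isLocal }
val hostVideo = host?.video?.collectAsState()?.value

VideoRenderer(
    call = call,
    video = hostVideo,
    modifier = Modifier.fillMaxSize()
)
```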

### Step 7 - (Optional) Viewing a livestream with HLS

Another way to watch a livestream is using HLS. HLS tends to have a 10 to 20 second delay, while the above WebRTC approach is realtime.
The benefit that HLS offers is better buffering under poor network conditions.
So HLS can be a good option when:

* A 10-20 second delay is acceptable
* Your users want to watch the stream in poor network conditions

Let's show how to broadcast your call to HLS:

```kotlin
call.startBroadcast()
val hlsUrl = call.state.egress.value?.hls
Log.i("Tutorial", "HLS url = $hlsUrl")
```

You can view the HLS video feed using any HLS capable video player.
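For example, with Media3 ExoPlayer (a sketch; assumes the `androidx.media3:media3-exoplayer` and `media3-exoplayer-hls` dependencies are added to your project):

```kotlin
import android.content.Context
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.ExoPlayer

// Sketch: play the HLS feed produced by call.startBroadcast().
fun playHls(context: Context, hlsUrl: String): ExoPlayer {
    val player = ExoPlayer.Builder(context).build()
    player.setMediaItem(MediaItem.fromUri(hlsUrl))
    player.prepare() // buffers the stream
    player.play()    // starts playback once ready
    return player
}
```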

### Step 8 - Advanced Features

This tutorial covered broadcasting and watching a livestream.
It also went into more detail about HLS & RTMP-in.

There are several advanced features that can improve the livestreaming experience:

* **[Co-hosts](../03-guides/02-joining-creating-calls.mdx)** You can add members to your livestream with elevated permissions. So you can have co-hosts, moderators etc.
* **[Custom events](../03-guides/08-reactions-and-custom-events.mdx)** You can use custom events on the call to share any additional data. Think about showing the score for a game, or any other realtime use case.
* **[Reactions & Chat](../03-guides/08-reactions-and-custom-events.mdx)** Users can react to the livestream, and you can add chat. This makes for a more engaging experience.
* **[Notifications](../06-advanced/01-ringing.mdx)** You can notify users via push notifications when the livestream starts.
* **[Recording](../06-advanced/06-recording.mdx)** The call recording functionality allows you to record the call with various options and layouts.
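For instance, a custom event for a score update might look like this (a sketch; the exact `sendCustomEvent` signature is an assumption, see the custom events guide linked above):

```kotlin
// Hypothetical: broadcast a score update to everyone in the call.
lifecycleScope.launch {
    call.sendCustomEvent(
        mapOf("type" to "score_update", "home" to 1, "away" to 2)
    )
}
```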

### Recap

Expand All @@ -344,7 +346,7 @@ Our team is also happy to review your UI designs and offer recommendations on ho

To recap what we've learned:

* WebRTC is optimal for latency, HLS is slower but buffers better for users with poor connections
* You set up a call: `val call = client.call("livestream", callId)`
* The call type "livestream" controls which features are enabled and how permissions are set up
* The livestream by default enables "backstage" mode. This allows you and your co-hosts to set up your mic and camera before allowing people in
3 changes: 3 additions & 0 deletions docusaurus/docs/Android/05-ui-cookbook/01-overview.mdx
@@ -77,6 +77,9 @@ This cookbook aims to show you how to build your own UI elements for video calli
<CookbookCard title="Reactions">
<img src={require("../assets/cookbook/reactions.png").default} />
</CookbookCard>
<CookbookCard title="Watching a Livestream">
<img src={require("../assets/cookbook/reactions.png").default} />
</CookbookCard>
</CookbookCardGrid>
</div>

