diff --git a/README.md b/README.md index 9cfc87db90..ade4f99048 100644 --- a/README.md +++ b/README.md @@ -104,30 +104,40 @@ Video roadmap and changelog is available [here](https://github.com/GetStream/pro - [ ] Complete integration with the video demo flow - [ ] Finish usability testing with design team on chat integration (Jaewoong) -- [ ] Ringing: Finish it, make testing easy and write docs for common changes (Daniel) -- [ ] Enable ice restarts for publisher and subscriber -- [ ] Livestream tutorial (depends on RTMP support) (Thierry) +- [X] Ringing: Finish it, make testing easy and write docs for common changes (Daniel) - [ ] Bug: Screensharing on Firefox has some issues when rendering on android (Daniel) +- [ ] Pagination on query members & query call endpoints (Daniel) +- [ ] local version of audioLevel(s) for lower latency audio visualizations(Daniel) +- [ ] Android SDK development.md cleanup (Daniel) +- [ ] Livestream tutorial (depends on RTMP support) (Thierry) - [ ] Call Analytics stateflow (Thierry) -- [ ] Pagination on query members & query channel endpoints (Daniel) +- [ ] Enable ice restarts for publisher and subscriber - [ ] Test coverage - [ ] Testing on more devices -- [ ] local version of audioLevel(s) for lower latency audio visualizations(Daniel) -- [ ] Android SDK development.md cleanup (Daniel) - [ ] Logging is too verbose (rtc is very noisy), clean it up to focus on the essential for info and higher ### 0.4.0 milestone - [ ] Upgrade to more recent versions of webrtc -- [ ] Screensharing from mobile - [ ] Tap to focus -- [ ] Camera controls - [ ] Picture of the video stream at highest resolution - [ ] Review foreground service vs backend for some things like screensharing etc - [ ] Audio & Video filters. 
Support (Daniel) - [ ] H264 workaround on Samsung 23 (see https://github.com/livekit/client-sdk-android/blob/main/livekit-android-sdk/src/main/java/io/livekit/android/webrtc/SimulcastVideoEncoderFactoryWrapper.kt#L34 and - https://github.com/react-native-webrtc/react-native-webrtc/issues/983#issuecomment-975624906) -- [ ] Dynascale 2.0 (codecs, f resolution switches, resolution webrtc handling) +- [ ] Dynascale 2.0 + +### 0.5.0 milestone + +- [ ] Screensharing from mobile +- [ ] Camera controls + +### Dynascale 2.0 + +- currently we support selecting which of the 3 layers you want to send: f, h and q. in addition we should support: +- changing the resolution of the f track +- changing the codec that's used from VP8 to h264 or vice versa +- detecting when webrtc changes the resolution of the f track, and notifying the server about it (if needed) ## 💼 We are hiring! diff --git a/docusaurus/docs/Android/02-tutorials/01-video-calling.mdx b/docusaurus/docs/Android/02-tutorials/01-video-calling.mdx index 8e6e547f75..9039037a3d 100644 --- a/docusaurus/docs/Android/02-tutorials/01-video-calling.mdx +++ b/docusaurus/docs/Android/02-tutorials/01-video-calling.mdx @@ -18,15 +18,17 @@ This tutorial teaches you how to build Zoom/Whatsapp style video calling for you 2. Select Phone & Tablet -> **Empty Activity** 3. Name your project **VideoCall**. -Note that setup steps can vary slightly across Android Studio versions. -If you run into trouble, make sure to use the latest version of Android Studio ([Flamingo](https://developer.android.com/studio/releases#android-studio-flamingo-|-2022.2.1-patch-2-may-2023) or higher). +Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions. +We recommend using Android Studio Giraffe or newer. 
### Step 2 - Install the SDK & Setup the client -**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle`. +**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle.kts`. If you're new to Android, note that there are two `build.gradle.kts` files; you want to open the one in the app folder. -```groovy + + +```kotlin dependencies { // Stream Video Compose SDK implementation("io.getstream:stream-video-android-compose:0.2.0") diff --git a/docusaurus/docs/Android/02-tutorials/02-audio-room.mdx b/docusaurus/docs/Android/02-tutorials/02-audio-room.mdx index 7c19dc1e07..16c611de21 100644 --- a/docusaurus/docs/Android/02-tutorials/02-audio-room.mdx +++ b/docusaurus/docs/Android/02-tutorials/02-audio-room.mdx @@ -20,9 +20,8 @@ Time to get started building an audio-room for your app. ### Step 1 - Create a new project in Android Studio -This tutorial was written using [Android Studio Flamingo](https://developer.android.com/studio/releases#android-studio-flamingo-|-2022.2.1-patch-2-may-2023). -Setup steps can vary slightly across Android Studio versions. -If you run into trouble, make sure to use the latest version of Android Studio. +Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions. +We recommend using Android Studio Giraffe or newer. 1. Create a new project 2. Select Phone & Tablet -> **Empty Activity** @@ -30,7 +29,7 @@ If you run into trouble, make sure to use the latest version of Android Studio. ### Step 2 - Install the SDK & Setup the client -**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle`.
+**Add the Video Compose SDK** and [Jetpack Compose](https://developer.android.com/jetpack/compose) dependencies to your app's `build.gradle.kts` file found in `app/build.gradle.kts`. If you're new to Android, note that there are two `build.gradle.kts` files; you want to open the one in the app folder. ```groovy diff --git a/docusaurus/docs/Android/02-tutorials/03-livestream.mdx b/docusaurus/docs/Android/02-tutorials/03-livestream.mdx index 03132282ad..59447f9b0f 100644 --- a/docusaurus/docs/Android/02-tutorials/03-livestream.mdx +++ b/docusaurus/docs/Android/02-tutorials/03-livestream.mdx @@ -5,20 +5,13 @@ description: How to build a livestream experience using Stream's video SDKs import { TokenSnippet } from '../../../shared/_tokenSnippet.jsx'; -:::danger - -This tutorial is almost ready, but not quite finished yet -::: - -## Livestream Tutorial - In this tutorial we'll quickly build a low-latency in-app livestreaming experience. The livestream is broadcasted using Stream's edge network of servers around the world. We'll cover the following topics: * Ultra low latency streaming * Multiple streams & co-hosts -* RTMP in and Webrtc input +* RTMP in and WebRTC input * Exporting to HLS * Reactions, custom events and chat * Recording & Transcriptions @@ -27,8 +20,8 @@ Let's get started, if you have any questions or feedback be sure to let us know ### Step 1 - Create a new project in Android Studio -This tutorial was written using Android Studio Flamingo. -Setup steps can vary slightly across Android Studio versions so if you run into trouble be sure to use the latest version of Android Studio. +Note that this tutorial was written using Android Studio Giraffe. Setup steps can vary slightly across Android Studio versions. +We recommend using Android Studio Giraffe or newer. 1. Create a new project 2.
Select Phone & Template -> **empty activity** @@ -36,16 +29,18 @@ Setup steps can vary slightly across Android Studio versions so if you run into ### Step 2 - Install the SDK & Setup the client -**Add the video SDK** to your app's `build.gradle` file found in app/build.gradle. +**Add the video SDK** to your app's `build.gradle.kts` file found in `app/build.gradle.kts`. If you're new to Android, note that there are two `build.gradle.kts` files; you want to open the one in the app folder. -```groovy +```kotlin dependencies { - implementation "io.getstream:stream-video-android-compose:$stream_version" + implementation("io.getstream:stream-video-android-compose:0.2.0") + + ... } ``` -This tutorial uses the compose version of the SDK. Stream also provides a core library without compose. +This tutorial uses the compose version of the video SDK. Stream also provides a core library without compose. ### Step 3 - Broadcast a livestream from your phone @@ -110,7 +105,10 @@ Replace them now with the values shown below: -In the next step we setup the user: +When you run the app now you'll see a text message saying: "TODO: render video". +Before we get around to rendering the video, let's review the code above. + +In the first step we set up the user: ```kotlin val user = User( @@ -119,7 +117,7 @@ val user = User( ) ``` -If you don't have an authenticated user you can also use a guest or anonymous user. TODO DOCS +If you don't have an authenticated user you can also use a guest or anonymous user. For most apps it's convenient to match your own system of users to grant and remove permissions. Next we create the client: @@ -127,7 +125,7 @@ Next we create the client: ```kotlin val client = StreamVideoBuilder( context = applicationContext, - apiKey = "hd8szvscpxvd", // demo API key + apiKey = "mmhfdzb5evj2", // demo API key geo = GEO.GlobalEdgeNetwork, user = user, token = userToken, @@ -136,7 +134,9 @@ val client = StreamVideoBuilder( You'll see the `userToken` variable.
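The tutorial hard-codes `userToken` for convenience. As a rough, hypothetical sketch of what a backend might do when minting such a token (assuming Stream user tokens are standard HS256-signed JWTs carrying a `user_id` claim — verify the claim names against Stream's server-side token docs before relying on this):

```kotlin
import java.util.Base64
import javax.crypto.Mac
import javax.crypto.spec.SecretKeySpec

// Hypothetical server-side sketch: mint an HS256 JWT with a `user_id` claim.
// `apiSecret` and the claim name are assumptions for illustration only —
// in practice you'd use your server-side Stream SDK to generate tokens.
fun generateUserToken(apiSecret: String, userId: String): String {
    val enc = Base64.getUrlEncoder().withoutPadding()
    val header = enc.encodeToString("""{"alg":"HS256","typ":"JWT"}""".toByteArray())
    val payload = enc.encodeToString("""{"user_id":"$userId"}""".toByteArray())
    val mac = Mac.getInstance("HmacSHA256").apply {
        init(SecretKeySpec(apiSecret.toByteArray(), "HmacSHA256"))
    }
    val signature = enc.encodeToString(mac.doFinal("$header.$payload".toByteArray()))
    // A JWT is three dot-separated base64url segments
    return "$header.$payload.$signature"
}

fun main() {
    println(generateUserToken("replace-with-your-api-secret", "tutorial-user"))
}
```

The key point is that the secret never ships in the app: the client only ever receives the finished token.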
Your backend typically generates the user token on signup or login. -In the next step we create and join the call. The call object is used for video calls, audio rooms and livestreaming. +The most important step to review is how we create the call. +Stream uses the same call object for livestreaming, audio rooms and video calling. +Have a look at the code snippet below: ```kotlin val call = client.call("livestream", callId) @@ -149,13 +149,20 @@ lifecycleScope.launch { } ``` -`call.join(create = true)` is the simplest example. -It's also possible to configure settings on the call or add co-hosts. TODO: Docs +First, the call object is created by specifying the call type ("livestream") and the callId. +The "livestream" call type is just a set of defaults that typically work well for a livestream. +You can edit the features, permissions and settings in the dashboard. +The dashboard also allows you to create new call types as needed. + +Lastly, `call.join(create = true)` creates the call object on our servers and joins it. +The moment you use `call.join()` the realtime transport for audio and video is started. -### Step 4 - Render the video +Note that you can also add members to a call and assign them different roles. (See the [call creation docs](../03-guides/02-joining-creating-calls.mdx)) + +### Step 4 - Rendering the video In this step we're going to build a UI for showing your local video with a button to start the livestream. -This example uses compose, but you could also use our XML VideoRenderer. +This example uses Compose, but you could also use our XML VideoRenderer. In `MainActivity.kt` replace the `VideoTheme` with the following code. @@ -244,12 +251,15 @@ If you now run your app you should see an interface like this: ![Livestream](../assets/tutorial-livestream.png) -When you press **go live** your video will be transmitted. -Press go live in the android app and click the link below to watch it in your browser.
+Stream uses a technology called SFU cascading to replicate your livestream over different SFUs around the world. +This makes it possible to reach a large audience in realtime. + +Now let's press "go live" in the Android app and click the link below to watch the video in your browser. -Let's take a moment to review the compose code above `Call.state` exposes all the stateflow objects you need. +Let's take a moment to review the Compose code above. `Call.state` exposes all the stateflow objects you need. + The most important ones are: ``` @@ -259,8 +269,8 @@ call.state.participants The [participant state docs](../03-guides/03-call-and-participant-state.mdx) show all the available data. -The compose layout is vanilla compose other than **[VideoRenderer](../04-ui-components/02-video-renderer.mdx)**. -`VideoRenderer` renders the video and a fallback. You can use it for both rendering the local video and remote video. +The livestream layout is built using standard Compose. The [VideoRenderer](../04-ui-components/02-video-renderer.mdx) component is provided by Stream. +`VideoRenderer` renders the video and a fallback. You can use it for rendering the local and remote video. ### Step 4 - (Optional) Publishing RTMP using OBS @@ -268,73 +278,65 @@ The example above showed how to publish your phone's camera to the livestream. Almost all livestream software and hardware supports RTMPs. So let's see how to publish using RTMPs. Feel free to skip this step if you don't need to use RTMPs. -A. Console log the URL & Stream Key println(call.state.ingress.rtmp) - -B. Open OBS and go to settings -> stream -- Select "custom" service -- Server: equal to the server URL from the console log -- Stream key: equal to the stream key from the console log - -Press start streaming. The RTMP stream will now show up in your call. -Now that we've learned to publish using webrtc or RTMP let's talk about viewing the livestream. +A.
Log the URL & Stream Key -### Step 5 - Viewing a livestream (Webrtc) +```kotlin +val rtmp = call.state.ingress.rtmp +Log.i("Tutorial", "RTMP url and streamingKey: $rtmp") +``` -Watching a livestream is basically a simplified version of the code we wrote in `MainActivity.kt` +B. Open OBS and go to settings -> stream -* You don't need to request permissions or enable the camera -* +- Select "custom" service +- Server: equal to the server URL from the log +- Stream key: equal to the stream key from the log -To change the example above and just watch a livestream: +Press start streaming in OBS. The RTMP stream will now show up in your call just like a regular video participant. +Now that we've learned to publish using WebRTC or RTMP let's talk about watching the livestream. -```kotlin -// remove this -call.camera.enable() -call.microphone.enable() +### Step 5 - Viewing a livestream (WebRTC) -// and this -LaunchCallPermissions(call = call) -// on the UI side remove the button to go live +Watching a livestream is even easier than broadcasting. -// and update this to use the remote video -val me by call.state.me.collectAsState() -val video = me?.video?.collectAsState() -``` +Compared to the current code in `MainActivity.kt` you: -Here's the update MainActivity for viewing a call +* Don't need to request permissions or enable the camera +* Don't render the local video, but instead render the remote video +* Typically include some small UI elements like viewer count, a button to mute etc -```kotlin -``` +The [docs on building a UI for watching a livestream](../05-ui-cookbook/15-watching-livestream.mdx) explain this in more detail. ### Step 6 - (Optional) Viewing a livestream with HLS -Another way to view a livestream is using HLS. HLS tends to have a 10 to 20 seconds delay, while the above webrtc approach only has a 100-200ms delay typically. -The benefit that HLS has is that it buffers better under poor network conditions.
-So for apps where you expect your users to have poor network, and where a 10 second delay is ok, HLS can be a better option. +Another way to watch a livestream is using HLS. HLS tends to have a 10 to 20 second delay, while the above WebRTC approach is realtime. +The benefit that HLS offers is better buffering under poor network conditions. +So HLS can be a good option when: -Let's show how to broadcast your call to HLS. +* A 10-20 second delay is acceptable +* Your users want to watch the stream in poor network conditions -```kotlin -call.startBroadcast() -``` - -After starting the broadcast the HLS url can be found in the call state +Let's show how to broadcast your call to HLS: ```kotlin -call.state.egress.value?.hls +call.startBroadcast() +val hlsUrl = call.state.egress.value?.hls +Log.i("Tutorial", "HLS url = $hlsUrl") ``` -You can view the HLS video feed using any open source HLS capable video player. +You can view the HLS video feed using any HLS capable video player. ### 7 - Advanced Features +This tutorial covered broadcasting and watching a livestream. +It also went into more detail about HLS & RTMP-in. + There are several advanced features that can improve the livestreaming experience: -* ** Co-hosts ** You can add members to your livestream with elevated permissions. So you can have co-hosts, moderators etc. -* ** Custom events ** You can use custom events on the call to share any additional data. Think about showing the score for a game, or any other realtime use case. -* ** Reactions & Chat ** Users can react to the livestream, and you can add chat. This makes for a more engaging experience. -* ** Notifications ** You can notify users via push notifications when the livestream starts -* ** Recording ** The call recording functionality allows you to record the call with various options and layouts +* **[Co-hosts](../03-guides/02-joining-creating-calls.mdx)** You can add members to your livestream with elevated permissions.
So you can have co-hosts, moderators etc. +* **[Custom events](../03-guides/08-reactions-and-custom-events.mdx)** You can use custom events on the call to share any additional data. Think about showing the score for a game, or any other realtime use case. +* **[Reactions & Chat](../03-guides/08-reactions-and-custom-events.mdx)** Users can react to the livestream, and you can add chat. This makes for a more engaging experience. +* **[Notifications](../06-advanced/01-ringing.mdx)** You can notify users via push notifications when the livestream starts. +* **[Recording](../06-advanced/06-recording.mdx)** The call recording functionality allows you to record the call with various options and layouts. ### Recap @@ -344,7 +346,7 @@ Our team is also happy to review your UI designs and offer recommendations on ho To recap what we've learned: -* Webrtc is optimal for latency, HLS is slower but buffers better for users with poor connections +* WebRTC is optimal for latency, HLS is slower but buffers better for users with poor connections * You setup a call: (val call = client.call("livestream", callId)) * The call type "livestream" controls which features are enabled and how permissions are setup * The livestream by default enables "backstage" mode.
This allows you and your co-hosts to setup your mic and camera before allowing people in diff --git a/docusaurus/docs/Android/05-ui-cookbook/01-overview.mdx b/docusaurus/docs/Android/05-ui-cookbook/01-overview.mdx index 6cbb16143d..8cb2ed601e 100644 --- a/docusaurus/docs/Android/05-ui-cookbook/01-overview.mdx +++ b/docusaurus/docs/Android/05-ui-cookbook/01-overview.mdx @@ -77,6 +77,9 @@ This cookbook aims to show you how to build your own UI elements for video calling + + + diff --git a/docusaurus/docs/Android/05-ui-cookbook/15-watching-livestream.mdx b/docusaurus/docs/Android/05-ui-cookbook/15-watching-livestream.mdx new file mode 100644 index 0000000000..67e8af84a1 --- /dev/null +++ b/docusaurus/docs/Android/05-ui-cookbook/15-watching-livestream.mdx @@ -0,0 +1,360 @@ +--- +title: Watching a livestream +description: How to watch a livestream on Android with Kotlin +--- + +This cookbook tutorial walks you through how to render the UI for watching a livestream on Android. + +:::note +This cookbook tutorial skips how to join a livestream call. If you haven't gone through the [Livestream Tutorial](../02-tutorials/03-livestream.mdx) yet, we highly recommend reading it first. +::: + +When you build a livestreaming UI, there are a few things to keep in mind: + +* UI for when the video isn't loaded yet +* A message to show when the livestream didn't start yet +* What to show when the livestream stopped +* How to indicate when there are connection problems +* Muting the volume +* Number of participants +* Duration of the call + +By the end of this cookbook tutorial, you'll know how to build the result below: + +| On Backstage | On Live | +| --- | --- | +| ![LiveStream Backstage](../assets/cookbook/livestream-backstage.png) | ![LiveStream Live](../assets/cookbook/livestream-live.png) | + +### Rendering Livestreaming + +First things first: you need to render the livestream video, which is the most important part of the screen.
+ +You can simply render your livestreaming video like the sample below: + +```kotlin +val userToken = "REPLACE_WITH_TOKEN" +val userId = "REPLACE_WITH_USER_ID" +val callId = "REPLACE_WITH_CALL_ID" + +// step1 - create a user. +val user = User( + id = userId, // any string + name = "Tutorial", // name and image are used in the UI + role = "admin" +) + +// step2 - initialize StreamVideo. For a production app we recommend adding the client to your Application class or di module. +val client = StreamVideoBuilder( + context = applicationContext, + apiKey = "mmhfdzb5evj2", // demo API key + geo = GEO.GlobalEdgeNetwork, + user = user, + token = userToken, +).build() + +// step3 - create and join a call of type `livestream` with the id from `callId`. +val call = client.call("livestream", callId) +lifecycleScope.launch { + // join the call + val result = call.join(create = true) + result.onError { + Toast.makeText(applicationContext, "uh oh $it", Toast.LENGTH_SHORT).show() + } +} + +setContent { + // request the Android runtime permissions for the camera and microphone + LaunchCallPermissions(call = call) + + // step4 - apply VideoTheme + VideoTheme { + val me by call.state.me.collectAsState() + val video = me?.video?.collectAsState()?.value + + VideoRenderer( + modifier = Modifier + .fillMaxSize() + .clip(RoundedCornerShape(6.dp)), + call = call, + video = video, + videoFallbackContent = { + Text(text = "Video rendering failed") + } + ) + } +} +``` + +If you run the above example, you'll see the basic video streaming screen shown below: + +![Video Streaming](../assets/compose_single_video.png) + +### Implement Live Participants Label + +Now you need to build labels that indicate the participant count of your livestream and the streaming time.
+ +You can simply implement the live label like so: + +```kotlin +@Composable +fun LiveLabel( + modifier: Modifier, + liveCount: Int +) { + Row(modifier = modifier.clip(RoundedCornerShape(6.dp))) { + Text( + modifier = Modifier + .background(VideoTheme.colors.primaryAccent) + .padding(vertical = 3.dp, horizontal = 12.dp), + text = "Live", + color = Color.White + ) + + Row( + modifier = Modifier.background(Color(0xFF1C1E22)), + verticalAlignment = Alignment.CenterVertically + ) { + Icon( + modifier = Modifier + .padding(horizontal = 6.dp) + .size(22.dp), + imageVector = Icons.Default.Person, + tint = Color.White, + contentDescription = null + ) + + Text( + modifier = Modifier.padding(end = 12.dp, top = 3.dp, bottom = 3.dp), + text = liveCount.toString(), + color = Color.White + ) + } + } +} +``` + +If you build a preview for the `LiveLabel` Composable, you'll see the result below: + +![LiveLabel](../assets/cookbook/livestream-live-label.png) + +### Implement Live Time Label + +Next, you need to implement the live time label, which shows how long the livestream has been running.
+ +You can simply implement the live time label like so: + +```kotlin +@Composable +fun TimeLabel( + modifier: Modifier = Modifier, + sessionTime: Long +) { + val time by remember(sessionTime) { + val date = Date(sessionTime) + val format = SimpleDateFormat("mm:ss", Locale.US) + mutableStateOf(format.format(date)) + } + + Row( + modifier = modifier + .background(Color(0xFF1C1E22), RoundedCornerShape(6.dp)), + verticalAlignment = Alignment.CenterVertically + ) { + Icon( + modifier = Modifier + .size(28.dp) + .padding(start = 12.dp), + imageVector = Icons.Default.CheckCircle, + tint = VideoTheme.colors.infoAccent, + contentDescription = null + ) + + Text( + modifier = Modifier.padding(horizontal = 12.dp), + text = time, + color = Color.White + ) + } +} +``` + +If you build a preview for the `TimeLabel` Composable, you'll see the result below: + +![TimeLabel](../assets/cookbook/livestream-time-label.png) + +### Connect Implementations With Call State + +Now, let's connect those implementations with the call state and put them all together with `Scaffold`, which consists of `TopBar`, `BottomBar`, and `content`.
+ +```kotlin +VideoTheme { + val participantCount by call.state.participantCounts.collectAsState() + val connection by call.state.connection.collectAsState() + val backstage by call.state.backstage.collectAsState() + val me by call.state.me.collectAsState() + val video = me?.video?.collectAsState()?.value + val sessionTime by call.state.liveDurationInMs.collectAsState() + + Scaffold( + modifier = Modifier + .fillMaxSize() + .background(Color(0xFF272A30)) + .padding(6.dp), + contentColor = Color(0xFF272A30), + backgroundColor = Color(0xFF272A30), + topBar = { + if (connection == RealtimeConnection.Connected) { + Box( + modifier = Modifier + .fillMaxWidth() + .padding(6.dp) + ) { + if (!backstage) { + LiveLabel( + modifier = Modifier.align(Alignment.CenterStart), + liveCount = participantCount?.total ?: 0 + ) + } + + TimeLabel( + modifier = Modifier.align(Alignment.Center), + sessionTime = sessionTime ?: 0 + ) + } + } + } + ) { + VideoRenderer( + modifier = Modifier + .fillMaxSize() + .padding(it) + .clip(RoundedCornerShape(6.dp)), + call = call, + video = video, + videoFallbackContent = { + Text(text = "Video rendering failed") + } + ) + } +} +``` + +As you can see in the example above, several states are declared from the call state: + +- `participantCount`: A model that contains information about participant counts. +- `connection`: Indicates the connection state of a call. +- `backstage`: Whether the call is on the backstage or not. +- `me`: The local participant state, used here to access the local video track. +- `video`: A local video track. +- `sessionTime`: Indicates the time duration since your call went live. + +### Implement Live Button + +Now let's build a live button that allows you to start/stop broadcasting your call and controls your physical devices, such as the camera and microphone.
+ +You can implement the live button like so: + +```kotlin +@Composable +fun LiveButton( + modifier: Modifier, + call: Call, + isBackstage: Boolean, + onClick: () -> Unit +) { + Box(modifier = Modifier.fillMaxWidth()) { + Button( + modifier = modifier, + colors = if (isBackstage) { + ButtonDefaults.buttonColors( + backgroundColor = VideoTheme.colors.primaryAccent, + contentColor = VideoTheme.colors.primaryAccent + ) + } else { + ButtonDefaults.buttonColors( + backgroundColor = VideoTheme.colors.errorAccent, + contentColor = VideoTheme.colors.errorAccent + ) + }, + onClick = onClick + ) { + Icon( + modifier = Modifier.padding(vertical = 3.dp, horizontal = 6.dp), + imageVector = if (isBackstage) { + Icons.Default.PlayArrow + } else { + Icons.Default.Close + }, + tint = Color.White, + contentDescription = null + ) + + Text( + modifier = Modifier.padding(end = 6.dp), + text = if (isBackstage) "Go Live" else "Stop Broadcast", + fontWeight = FontWeight.Bold, + fontSize = 16.sp, + color = Color.White + ) + } + + val isCameraEnabled by call.camera.isEnabled.collectAsState() + val isMicrophoneEnabled by call.microphone.isEnabled.collectAsState() + + Row(modifier = Modifier.align(Alignment.CenterEnd)) { + ToggleCameraAction( + modifier = Modifier.size(45.dp), + isCameraEnabled = isCameraEnabled, + enabledColor = VideoTheme.colors.callActionIconEnabledBackground, + disabledColor = VideoTheme.colors.callActionIconEnabledBackground, + disabledIconTint = VideoTheme.colors.errorAccent, + shape = RoundedCornerShape(8.dp), + onCallAction = { callAction -> call.camera.setEnabled(callAction.isEnabled) } + ) + + ToggleMicrophoneAction( + modifier = Modifier + .padding(horizontal = 12.dp) + .size(45.dp), + isMicrophoneEnabled = isMicrophoneEnabled, + enabledColor = VideoTheme.colors.callActionIconEnabledBackground, + disabledColor = VideoTheme.colors.callActionIconEnabledBackground, + disabledIconTint = VideoTheme.colors.errorAccent, + shape = RoundedCornerShape(8.dp), + onCallAction = 
{ callAction -> call.microphone.setEnabled(callAction.isEnabled) } + ) + } + } +} +``` + +Next, let's place the new `LiveButton` Composable inside the `Scaffold`. + +### Complete The Live Screen + +Now everything is ready to be put together. You can complete the `Scaffold` with the new `LiveButton` Composable like so: + +```kotlin +Scaffold( + .., + bottomBar = { + LiveButton( + modifier = Modifier.padding(9.dp), + call = call, + isBackstage = backstage + ) { + lifecycleScope.launch { + if (backstage) call.goLive() else call.stopLive() + } + } + } + ) { + .. + } +``` + +After building your project, you'll see the final result below: + +![LiveStream Backstage](../assets/cookbook/livestream-backstage.png) + +You can broadcast your stream by clicking the **Go Live** button. \ No newline at end of file diff --git a/docusaurus/docs/Android/assets/cookbook/livestream-backstage.png b/docusaurus/docs/Android/assets/cookbook/livestream-backstage.png new file mode 100644 index 0000000000..71fa82b9bd Binary files /dev/null and b/docusaurus/docs/Android/assets/cookbook/livestream-backstage.png differ diff --git a/docusaurus/docs/Android/assets/cookbook/livestream-live-label.png b/docusaurus/docs/Android/assets/cookbook/livestream-live-label.png new file mode 100644 index 0000000000..650eb5a3a8 Binary files /dev/null and b/docusaurus/docs/Android/assets/cookbook/livestream-live-label.png differ diff --git a/docusaurus/docs/Android/assets/cookbook/livestream-live.png b/docusaurus/docs/Android/assets/cookbook/livestream-live.png new file mode 100644 index 0000000000..c30482749d Binary files /dev/null and b/docusaurus/docs/Android/assets/cookbook/livestream-live.png differ diff --git a/docusaurus/docs/Android/assets/cookbook/livestream-time-label.png b/docusaurus/docs/Android/assets/cookbook/livestream-time-label.png new file mode 100644 index 0000000000..2a26fcee6c Binary files /dev/null and b/docusaurus/docs/Android/assets/cookbook/livestream-time-label.png differ diff --git
a/generate-openapi.sh b/generate-openapi.sh index a83ac6c167..f3bc556d84 100755 --- a/generate-openapi.sh +++ b/generate-openapi.sh @@ -51,7 +51,8 @@ done for FILE in "$APIS_ROOT"/*.kt; do echo "Processing ${FILE}" - grep -iE "$API_REQUEST_REGEX" "$FILE" | while read -r line; do + grep -iE "${API_REQUEST_REGEX}" "$FILE" | while read -r line; do + # adds the /video prefix to the URI UPDATED_REQUEST=$(sed 's/("/("\/video\//g' <<<$line) ESCAPED_LINE=$(printf '%s\n' "$line" | sed -e 's/[\/&]/\\&/g') ESCAPED_REQUEST=$(printf '%s\n' "$UPDATED_REQUEST" | sed -e 's/[\/&]/\\&/g') diff --git a/stream-video-android-core/api/stream-video-android-core.api b/stream-video-android-core/api/stream-video-android-core.api index 80fc718173..f2ab628ea0 100644 --- a/stream-video-android-core/api/stream-video-android-core.api +++ b/stream-video-android-core/api/stream-video-android-core.api @@ -102,12 +102,15 @@ public final class io/getstream/video/android/core/CallState { public final fun getCreatedBy ()Lkotlinx/coroutines/flow/StateFlow; public final fun getCustom ()Lkotlinx/coroutines/flow/StateFlow; public final fun getDominantSpeaker ()Lkotlinx/coroutines/flow/StateFlow; + public final fun getDuration ()Lkotlinx/coroutines/flow/StateFlow; + public final fun getDurationInMs ()Lkotlinx/coroutines/flow/StateFlow; public final fun getEgress ()Lkotlinx/coroutines/flow/StateFlow; public final fun getEndedAt ()Lkotlinx/coroutines/flow/StateFlow; public final fun getEndedByUser ()Lkotlinx/coroutines/flow/StateFlow; public final fun getErrors ()Lkotlinx/coroutines/flow/StateFlow; public final fun getIngress ()Lkotlinx/coroutines/flow/StateFlow; public final fun getLive ()Lkotlinx/coroutines/flow/StateFlow; + public final fun getLiveDurationInMs ()Lkotlinx/coroutines/flow/StateFlow; public final fun getMe ()Lkotlinx/coroutines/flow/StateFlow; public final fun getMember (Ljava/lang/String;)Lio/getstream/video/android/core/MemberState; public final fun getMembers 
()Lkotlinx/coroutines/flow/StateFlow; @@ -129,6 +132,7 @@ public final class io/getstream/video/android/core/CallState { public final fun getSettings ()Lkotlinx/coroutines/flow/StateFlow; public final fun getSortedParticipants ()Lkotlinx/coroutines/flow/StateFlow; public final fun getSpeakingWhileMuted ()Lkotlinx/coroutines/flow/StateFlow; + public final fun getStartedAt ()Lkotlinx/coroutines/flow/StateFlow; public final fun getStartsAt ()Lkotlinx/coroutines/flow/StateFlow; public final fun getStats ()Lio/getstream/video/android/core/CallStats; public final fun getTeam ()Lkotlinx/coroutines/flow/StateFlow; @@ -656,6 +660,8 @@ public final class io/getstream/video/android/core/StreamVideo$DefaultImpls { } public final class io/getstream/video/android/core/StreamVideoBuilder { + public fun (Landroid/content/Context;Ljava/lang/String;)V + public fun (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;)V public fun (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;)V public fun (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;)V public fun (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;)V @@ -669,7 +675,6 @@ public final class io/getstream/video/android/core/StreamVideoBuilder { public fun (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;Ljava/util/List;Ljava/util/List;ZJZ)V public fun 
(Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;Ljava/util/List;Ljava/util/List;ZJZLjava/lang/String;)V public synthetic fun (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/core/GEO;Lio/getstream/video/android/model/User;Ljava/lang/String;Lkotlin/jvm/functions/Function2;Lio/getstream/video/android/core/logging/LoggingLevel;Lio/getstream/video/android/core/notifications/NotificationConfig;Lkotlin/jvm/functions/Function1;Ljava/util/List;Ljava/util/List;ZJZLjava/lang/String;ILkotlin/jvm/internal/DefaultConstructorMarker;)V - public fun (Landroid/content/Context;Ljava/lang/String;Lio/getstream/video/android/model/User;)V public final fun build ()Lio/getstream/video/android/core/StreamVideo; public final fun getScope ()Lkotlinx/coroutines/CoroutineScope; } @@ -1178,12 +1183,27 @@ public final class io/getstream/video/android/core/events/ICETrickleEvent : io/g } public final class io/getstream/video/android/core/events/JoinCallResponseEvent : io/getstream/video/android/core/events/SfuDataEvent { - public fun (Lstream/video/sfu/models/CallState;)V + public fun (Lstream/video/sfu/models/CallState;Lio/getstream/video/android/core/events/ParticipantCount;)V public final fun component1 ()Lstream/video/sfu/models/CallState; - public final fun copy (Lstream/video/sfu/models/CallState;)Lio/getstream/video/android/core/events/JoinCallResponseEvent; - public static synthetic fun copy$default (Lio/getstream/video/android/core/events/JoinCallResponseEvent;Lstream/video/sfu/models/CallState;ILjava/lang/Object;)Lio/getstream/video/android/core/events/JoinCallResponseEvent; + public final fun component2 ()Lio/getstream/video/android/core/events/ParticipantCount; + public final fun copy 
(Lstream/video/sfu/models/CallState;Lio/getstream/video/android/core/events/ParticipantCount;)Lio/getstream/video/android/core/events/JoinCallResponseEvent; + public static synthetic fun copy$default (Lio/getstream/video/android/core/events/JoinCallResponseEvent;Lstream/video/sfu/models/CallState;Lio/getstream/video/android/core/events/ParticipantCount;ILjava/lang/Object;)Lio/getstream/video/android/core/events/JoinCallResponseEvent; public fun equals (Ljava/lang/Object;)Z public final fun getCallState ()Lstream/video/sfu/models/CallState; + public final fun getParticipantCount ()Lio/getstream/video/android/core/events/ParticipantCount; + public fun hashCode ()I + public fun toString ()Ljava/lang/String; +} + +public final class io/getstream/video/android/core/events/ParticipantCount { + public fun (II)V + public final fun component1 ()I + public final fun component2 ()I + public final fun copy (II)Lio/getstream/video/android/core/events/ParticipantCount; + public static synthetic fun copy$default (Lio/getstream/video/android/core/events/ParticipantCount;IIILjava/lang/Object;)Lio/getstream/video/android/core/events/ParticipantCount; + public fun equals (Ljava/lang/Object;)Z + public final fun getAnonymous ()I + public final fun getTotal ()I public fun hashCode ()I public fun toString ()Ljava/lang/String; } @@ -1237,7 +1257,14 @@ public final class io/getstream/video/android/core/events/SFUConnectedEvent : io } public final class io/getstream/video/android/core/events/SFUHealthCheckEvent : io/getstream/video/android/core/events/SfuDataEvent { - public static final field INSTANCE Lio/getstream/video/android/core/events/SFUHealthCheckEvent; + public fun (Lio/getstream/video/android/core/events/ParticipantCount;)V + public final fun component1 ()Lio/getstream/video/android/core/events/ParticipantCount; + public final fun copy (Lio/getstream/video/android/core/events/ParticipantCount;)Lio/getstream/video/android/core/events/SFUHealthCheckEvent; + public static synthetic 
fun copy$default (Lio/getstream/video/android/core/events/SFUHealthCheckEvent;Lio/getstream/video/android/core/events/ParticipantCount;ILjava/lang/Object;)Lio/getstream/video/android/core/events/SFUHealthCheckEvent; + public fun equals (Ljava/lang/Object;)Z + public final fun getParticipantCount ()Lio/getstream/video/android/core/events/ParticipantCount; + public fun hashCode ()I + public fun toString ()Ljava/lang/String; } public abstract class io/getstream/video/android/core/events/SfuDataEvent : org/openapitools/client/models/VideoEvent { diff --git a/stream-video-android-core/src/androidTest/kotlin/io/getstream/video/android/core/LivestreamTest.kt b/stream-video-android-core/src/androidTest/kotlin/io/getstream/video/android/core/LivestreamTest.kt index 2adf2662ae..c67f4bc582 100644 --- a/stream-video-android-core/src/androidTest/kotlin/io/getstream/video/android/core/LivestreamTest.kt +++ b/stream-video-android-core/src/androidTest/kotlin/io/getstream/video/android/core/LivestreamTest.kt @@ -17,13 +17,9 @@ package io.getstream.video.android.core import com.google.common.truth.Truth.assertThat -import kotlinx.coroutines.delay -import kotlinx.coroutines.flow.flow import kotlinx.coroutines.test.runTest import org.junit.Ignore import org.junit.Test -import org.threeten.bp.OffsetDateTime -import org.threeten.bp.temporal.ChronoUnit class LivestreamTest : IntegrationTestBase() { /** @@ -89,60 +85,26 @@ class LivestreamTest : IntegrationTestBase() { println("hi123 event: $it") } call.join(create = true) + print("debugging: call cid is: " + call.cid) Thread.sleep(1000L) - // counts - val session = call.state.session.value - assertThat(session?.participants).isNotEmpty() - assertThat(session?.startedAt).isNotNull() - - assertThat(session?.participantsCountByRole).isNotEmpty() + // counts and startedAt + assertThat(call.state.participants.value).isNotEmpty() + assertThat(call.state.startedAt).isNotNull() + assertThat(call.state.participantCounts.value?.total).isEqualTo(1) 
+ assertThat(call.state.participantCounts.value?.anonymous).isEqualTo(0) } @Test - @Ignore fun timeRunning() = runTest { - val call = client.call("livestream", randomUUID()) + println("starting") + val call = client.call("livestream") assertSuccess(call.create()) val goLiveResponse = call.goLive() assertSuccess(goLiveResponse) - // call running time -// goLiveResponse.onSuccess { -// assertThat(it.call.session).isNotNull() -// } - - val start = call.state.session.value?.startedAt ?: OffsetDateTime.now() - assertThat(start).isNotNull() - - val test = flow { - emit(1) - emit(2) - } - - val timeSince = flow { - while (true) { - delay(1000) - val now = OffsetDateTime.now() - if (start == null) { - emit(null) - } else { - val difference = ChronoUnit.SECONDS.between(start?.toInstant(), now.toInstant()) - emit(difference) - } - if (call.state.session.value?.endedAt != null) { - break - } - } - } - - println("a") - timeSince.collect { - println("its $it") - } - - println("b") - - Thread.sleep(20000) + val duration = call.state.durationInMs.value + println("duration: $duration") + call.leave() } } diff --git a/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/CallState.kt b/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/CallState.kt index 23ef2bf1c4..5d166eaaa8 100644 --- a/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/CallState.kt +++ b/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/CallState.kt @@ -25,6 +25,7 @@ import io.getstream.video.android.core.events.DominantSpeakerChangedEvent import io.getstream.video.android.core.events.ErrorEvent import io.getstream.video.android.core.events.ICETrickleEvent import io.getstream.video.android.core.events.JoinCallResponseEvent +import io.getstream.video.android.core.events.ParticipantCount import io.getstream.video.android.core.events.ParticipantJoinedEvent import io.getstream.video.android.core.events.ParticipantLeftEvent 
import io.getstream.video.android.core.events.SFUHealthCheckEvent @@ -43,6 +44,7 @@ import io.getstream.video.android.model.User import kotlinx.coroutines.CoroutineScope import kotlinx.coroutines.Job import kotlinx.coroutines.channels.awaitClose +import kotlinx.coroutines.currentCoroutineContext import kotlinx.coroutines.delay import kotlinx.coroutines.flow.MutableStateFlow import kotlinx.coroutines.flow.SharingStarted @@ -50,7 +52,11 @@ import kotlinx.coroutines.flow.StateFlow import kotlinx.coroutines.flow.channelFlow import kotlinx.coroutines.flow.debounce import kotlinx.coroutines.flow.firstOrNull +import kotlinx.coroutines.flow.flow +import kotlinx.coroutines.flow.map import kotlinx.coroutines.flow.stateIn +import kotlinx.coroutines.flow.transform +import kotlinx.coroutines.isActive import kotlinx.coroutines.launch import kotlinx.coroutines.runBlocking import okhttp3.internal.toImmutableList @@ -97,12 +103,16 @@ import org.openapitools.client.models.UpdateCallResponse import org.openapitools.client.models.UpdatedCallPermissionsEvent import org.openapitools.client.models.VideoEvent import org.threeten.bp.Clock +import org.threeten.bp.Duration +import org.threeten.bp.Instant import org.threeten.bp.OffsetDateTime +import org.threeten.bp.ZoneOffset import stream.video.sfu.models.Participant -import stream.video.sfu.models.ParticipantCount import stream.video.sfu.models.TrackType import java.util.SortedMap import java.util.UUID +import kotlin.time.DurationUnit +import kotlin.time.toDuration public sealed interface RealtimeConnection { /** @@ -172,6 +182,9 @@ public class CallState( public val participants: StateFlow> = _participants.mapState { it.values.toList() } + private val _startedAt: MutableStateFlow = MutableStateFlow(null) + public val startedAt: StateFlow = _startedAt + private val _participantCounts: MutableStateFlow = MutableStateFlow(null) val participantCounts: StateFlow = _participantCounts @@ -181,7 +194,8 @@ public class CallState( } /** 
participants who are currently speaking */ - private val _activeSpeakers: MutableStateFlow> = MutableStateFlow(emptyList()) + private val _activeSpeakers: MutableStateFlow> = + MutableStateFlow(emptyList()) public val activeSpeakers: StateFlow> = _activeSpeakers /** participants other than yourself */ @@ -270,7 +284,8 @@ public class CallState( * * Debounced 100ms to avoid rapid changes */ - val sortedParticipants = sortedParticipantsFlow.debounce(100).stateIn(scope, SharingStarted.WhileSubscribed(10000L), emptyList()) + val sortedParticipants = sortedParticipantsFlow.debounce(100) + .stateIn(scope, SharingStarted.WhileSubscribed(10000L), emptyList()) /** Members contains the list of users who are permanently associated with this call. This includes users who are currently not active in the call * As an example if you invite "john", "bob" and "jane" to a call and only Jane joins. @@ -306,6 +321,26 @@ public class CallState( private val _settings: MutableStateFlow = MutableStateFlow(null) public val settings: StateFlow = _settings + private val _durationInMs = flow { + while (currentCoroutineContext().isActive) { + delay(1000) + val started = _session.value?.startedAt + val ended = _session.value?.endedAt ?: OffsetDateTime.now() + val difference = if (started == null) null else { + ended.toInstant().toEpochMilli() - started.toInstant().toEpochMilli() + } + emit(difference) + } + } + + /** how long the call has been running, null if the call didn't start yet */ + public val duration: StateFlow = + _durationInMs.transform { emit((it ?: 0L).toDuration(DurationUnit.MILLISECONDS)) }.stateIn(scope, SharingStarted.WhileSubscribed(10000L), null) + + /** how many milliseconds the call has been running, null if the call didn't start yet */ + public val durationInMs: StateFlow = + _durationInMs.stateIn(scope, SharingStarted.WhileSubscribed(10000L), null) + /** Check if you have permissions to do things like share your audio, video, screen etc */ public fun 
hasPermission(permission: String): StateFlow { // store this in a map so we don't have to create a new flow every time @@ -333,9 +368,22 @@ public class CallState( /** if we are in backstage mode or not */ val backstage: StateFlow = _backstage + /** the opposite of backstage, if we are live or not */ val live: StateFlow = _backstage.mapState { !it } + /** how many milliseconds the call has been running, null if the call didn't start yet */ + public val liveDurationInMs: StateFlow = + _durationInMs + .map { + if (live.value) { + it + } else { + null + } + } + .stateIn(scope, SharingStarted.WhileSubscribed(10000L), null) + private val _egress: MutableStateFlow = MutableStateFlow(null) val egress: StateFlow = _egress @@ -388,7 +436,8 @@ public class CallState( val apiKey = runBlocking { call.clientImpl.dataStore.apiKey.firstOrNull() } val streamKey = "$apiKey/$token" // TODO: use the address when the server is updated - val overwriteUrl = "rtmps://video-ingress-frankfurt-vi1.stream-io-video.com:443/${call.type}/${call.id}" + val overwriteUrl = + "rtmps://video-ingress-frankfurt-vi1.stream-io-video.com:443/${call.type}/${call.id}" Ingress(rtmp = RTMP(address = overwriteUrl ?: "", streamKey = streamKey)) } @@ -594,8 +643,8 @@ public class CallState( _errors.value = errors.value + event } - SFUHealthCheckEvent -> { - // we don't do anything with this + is SFUHealthCheckEvent -> { + call.state._participantCounts.value = event.participantCount } is ICETrickleEvent -> { @@ -688,7 +737,12 @@ public class CallState( is CallSessionParticipantJoinedEvent -> { _session.value?.let { callSessionResponse -> val newList = callSessionResponse.participants.toMutableList() - val participant = CallParticipantResponse(user = event.participant.user, joinedAt = event.createdAt, role = "user", userSessionId = event.participant.userSessionId) + val participant = CallParticipantResponse( + user = event.participant.user, + joinedAt = event.createdAt, + role = "user", + userSessionId = 
event.participant.userSessionId + ) val index = newList.indexOfFirst { user.id == event.participant.user.id } if (index == -1) { newList.add(participant) @@ -706,7 +760,8 @@ public class CallState( private fun updateRingingState() { // this is only true when we are in the session (we have accepted/joined the call) - val userIsParticipant = _session.value?.participants?.find { it.user.id == client.userId } != null + val userIsParticipant = + _session.value?.participants?.find { it.user.id == client.userId } != null val outgoingMembersCount = _members.value.filter { it.value.user.id != client.userId }.size val rejectedBy = _rejectedBy.value val acceptedBy = _acceptedBy.value @@ -764,9 +819,10 @@ public class CallState( private fun updateFromJoinResponse(event: JoinCallResponseEvent) { // update the participant count - val count = event.callState.participant_count - _participantCounts.value = count + _participantCounts.value = event.participantCount + val instant = Instant.ofEpochSecond(event.callState.started_at?.epochSecond!!, 0) + _startedAt.value = OffsetDateTime.ofInstant(instant, ZoneOffset.UTC) // creates the participants val participantStates = event.callState.participants.map { getOrCreateParticipant(it) diff --git a/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/StreamVideoBuilder.kt b/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/StreamVideoBuilder.kt index 8042ee9643..e81f0b8fbf 100644 --- a/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/StreamVideoBuilder.kt +++ b/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/StreamVideoBuilder.kt @@ -74,7 +74,9 @@ public class StreamVideoBuilder @JvmOverloads constructor( context: Context, private val apiKey: ApiKey, private val geo: GEO = GEO.GlobalEdgeNetwork, - private var user: User, + private var user: User = User( + type = UserType.Anonymous + ), private val token: UserToken = "", private val tokenProvider: 
(suspend (error: Throwable?) -> String)? = null, private val loggingLevel: LoggingLevel = LoggingLevel(), diff --git a/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/call/signal/socket/RTCEventMapper.kt b/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/call/signal/socket/RTCEventMapper.kt index a869413672..a069f83d43 100644 --- a/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/call/signal/socket/RTCEventMapper.kt +++ b/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/call/signal/socket/RTCEventMapper.kt @@ -25,6 +25,7 @@ import io.getstream.video.android.core.events.DominantSpeakerChangedEvent import io.getstream.video.android.core.events.ErrorEvent import io.getstream.video.android.core.events.ICETrickleEvent import io.getstream.video.android.core.events.JoinCallResponseEvent +import io.getstream.video.android.core.events.ParticipantCount import io.getstream.video.android.core.events.ParticipantJoinedEvent import io.getstream.video.android.core.events.ParticipantLeftEvent import io.getstream.video.android.core.events.PublisherAnswerEvent @@ -89,9 +90,19 @@ public object RTCEventMapper { event.dominant_speaker_changed.session_id ) - event.health_check_response != null -> SFUHealthCheckEvent - event.join_response != null -> with(event.join_response) { - JoinCallResponseEvent(call_state!!) 
+ event.health_check_response != null -> SFUHealthCheckEvent( + ParticipantCount( + event.health_check_response.participant_count?.total ?: 0, + event.health_check_response.participant_count?.anonymous ?: 0 + ) + ) + + event.join_response != null -> { + val counts = ParticipantCount( + event.join_response.call_state?.participant_count?.total ?: 0, + event.join_response.call_state?.participant_count?.anonymous ?: 0 + ) + JoinCallResponseEvent(event.join_response.call_state!!, counts) } event.ice_trickle != null -> with(event.ice_trickle) { diff --git a/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/events/SfuDataEvent.kt b/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/events/SfuDataEvent.kt index 764e62283a..cb6f3c2f90 100644 --- a/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/events/SfuDataEvent.kt +++ b/stream-video-android-core/src/main/kotlin/io/getstream/video/android/core/events/SfuDataEvent.kt @@ -95,10 +95,18 @@ public data class DominantSpeakerChangedEvent( val sessionId: String, ) : SfuDataEvent() -public object SFUHealthCheckEvent : SfuDataEvent() +public data class ParticipantCount( + val total: Int, + val anonymous: Int, +) + +public data class SFUHealthCheckEvent( + val participantCount: ParticipantCount +) : SfuDataEvent() public data class JoinCallResponseEvent( - val callState: CallState + val callState: CallState, + val participantCount: ParticipantCount ) : SfuDataEvent() public data class UnknownEvent(val event: Any?) 
: SfuDataEvent() diff --git a/tutorials/tutorial-livestream/.idea/workspace.xml b/tutorials/tutorial-livestream/.idea/workspace.xml new file mode 100644 index 0000000000..a54b5d626b --- /dev/null +++ b/tutorials/tutorial-livestream/.idea/workspace.xml @@ -0,0 +1,27 @@ + + + + + + + + + + \ No newline at end of file diff --git a/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/MainActivity.kt b/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/MainActivity.kt index bf69fa4f97..4ec5fc8adf 100644 --- a/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/MainActivity.kt +++ b/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/MainActivity.kt @@ -94,7 +94,7 @@ class MainActivity : ComponentActivity() { val backstage by call.state.backstage.collectAsState() val me by call.state.me.collectAsState() val video = me?.video?.collectAsState()?.value - val session by call.state.session.collectAsState() + val sessionTime by call.state.liveDurationInMs.collectAsState() Scaffold( modifier = Modifier @@ -119,13 +119,17 @@ class MainActivity : ComponentActivity() { TimeLabel( modifier = Modifier.align(Alignment.Center), - sessionTime = 10000 + sessionTime = sessionTime ?: 0 ) } } }, bottomBar = { - LiveButton(modifier = Modifier.padding(9.dp), isBackstage = backstage) { + LiveButton( + modifier = Modifier.padding(9.dp), + call = call, + isBackstage = backstage + ) { lifecycleScope.launch { if (backstage) call.goLive() else call.stopLive() } diff --git a/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/ui/LiveButton.kt b/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/ui/LiveButton.kt index 3726a9c32f..236399f1ec 100644 --- 
a/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/ui/LiveButton.kt +++ b/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/ui/LiveButton.kt @@ -16,7 +16,12 @@ package io.getstream.video.android.tutorial.livestream.ui +import androidx.compose.foundation.layout.Box +import androidx.compose.foundation.layout.Row +import androidx.compose.foundation.layout.fillMaxWidth import androidx.compose.foundation.layout.padding +import androidx.compose.foundation.layout.size +import androidx.compose.foundation.shape.RoundedCornerShape import androidx.compose.material.Button import androidx.compose.material.ButtonDefaults import androidx.compose.material.Icon @@ -25,56 +30,87 @@ import androidx.compose.material.icons.Icons import androidx.compose.material.icons.filled.Close import androidx.compose.material.icons.filled.PlayArrow import androidx.compose.runtime.Composable +import androidx.compose.runtime.collectAsState +import androidx.compose.runtime.getValue +import androidx.compose.ui.Alignment import androidx.compose.ui.Modifier import androidx.compose.ui.graphics.Color import androidx.compose.ui.text.font.FontWeight -import androidx.compose.ui.tooling.preview.Preview import androidx.compose.ui.unit.dp import androidx.compose.ui.unit.sp import io.getstream.video.android.compose.theme.VideoTheme +import io.getstream.video.android.compose.ui.components.call.controls.actions.ToggleCameraAction +import io.getstream.video.android.compose.ui.components.call.controls.actions.ToggleMicrophoneAction +import io.getstream.video.android.core.Call @Composable fun LiveButton( modifier: Modifier, + call: Call, isBackstage: Boolean, onClick: () -> Unit ) { - Button( - modifier = modifier, - colors = ButtonDefaults.buttonColors( - backgroundColor = Color(0xFF1C1E22), - contentColor = Color(0xFF1C1E22) - ), - onClick = onClick - ) { - Icon( - modifier = Modifier.padding(vertical = 3.dp, horizontal = 6.dp), - 
imageVector = if (isBackstage) { - Icons.Default.PlayArrow + Box(modifier = Modifier.fillMaxWidth()) { + Button( + modifier = modifier, + colors = if (isBackstage) { + ButtonDefaults.buttonColors( + backgroundColor = VideoTheme.colors.primaryAccent, + contentColor = VideoTheme.colors.primaryAccent + ) } else { - Icons.Default.Close + ButtonDefaults.buttonColors( + backgroundColor = VideoTheme.colors.errorAccent, + contentColor = VideoTheme.colors.errorAccent + ) }, - tint = Color.White, - contentDescription = null - ) + onClick = onClick + ) { + Icon( + modifier = Modifier.padding(vertical = 3.dp, horizontal = 6.dp), + imageVector = if (isBackstage) { + Icons.Default.PlayArrow + } else { + Icons.Default.Close + }, + tint = Color.White, + contentDescription = null + ) - Text( - modifier = Modifier.padding(end = 6.dp), - text = if (isBackstage) "Go Live" else "Stop Broadcast", - fontWeight = FontWeight.Bold, - fontSize = 16.sp, - color = Color.White - ) - } -} + Text( + modifier = Modifier.padding(end = 6.dp), + text = if (isBackstage) "Go Live" else "Stop Broadcast", + fontWeight = FontWeight.Bold, + fontSize = 16.sp, + color = Color.White + ) + } -@Preview -@Composable -private fun LiveButtonPreview() { - VideoTheme { - LiveButton( - modifier = Modifier, - isBackstage = true, - ) {} + val isCameraEnabled by call.camera.isEnabled.collectAsState() + val isMicrophoneEnabled by call.microphone.isEnabled.collectAsState() + + Row(modifier = Modifier.align(Alignment.CenterEnd)) { + ToggleCameraAction( + modifier = Modifier.size(45.dp), + isCameraEnabled = isCameraEnabled, + enabledColor = VideoTheme.colors.callActionIconEnabledBackground, + disabledColor = VideoTheme.colors.callActionIconEnabledBackground, + disabledIconTint = VideoTheme.colors.errorAccent, + shape = RoundedCornerShape(8.dp), + onCallAction = { callAction -> call.camera.setEnabled(callAction.isEnabled) } + ) + + ToggleMicrophoneAction( + modifier = Modifier + .padding(horizontal = 12.dp) + .size(45.dp), + 
isMicrophoneEnabled = isMicrophoneEnabled, + enabledColor = VideoTheme.colors.callActionIconEnabledBackground, + disabledColor = VideoTheme.colors.callActionIconEnabledBackground, + disabledIconTint = VideoTheme.colors.errorAccent, + shape = RoundedCornerShape(8.dp), + onCallAction = { callAction -> call.microphone.setEnabled(callAction.isEnabled) } + ) + } } } diff --git a/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/ui/LiveLabel.kt b/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/ui/LiveLabel.kt index daed89953f..538aefa08a 100644 --- a/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/ui/LiveLabel.kt +++ b/tutorials/tutorial-livestream/src/main/kotlin/io/getstream/video/android/tutorial/livestream/ui/LiveLabel.kt @@ -19,9 +19,14 @@ package io.getstream.video.android.tutorial.livestream.ui import androidx.compose.foundation.background import androidx.compose.foundation.layout.Row import androidx.compose.foundation.layout.padding +import androidx.compose.foundation.layout.size import androidx.compose.foundation.shape.RoundedCornerShape +import androidx.compose.material.Icon import androidx.compose.material.Text +import androidx.compose.material.icons.Icons +import androidx.compose.material.icons.filled.Person import androidx.compose.runtime.Composable +import androidx.compose.ui.Alignment import androidx.compose.ui.Modifier import androidx.compose.ui.draw.clip import androidx.compose.ui.graphics.Color @@ -43,13 +48,25 @@ fun LiveLabel( color = Color.White ) - Text( - modifier = Modifier - .background(Color(0xFF1C1E22)) - .padding(vertical = 3.dp, horizontal = 12.dp), - text = liveCount.toString(), - color = Color.White - ) + Row( + modifier = Modifier.background(Color(0xFF1C1E22)), + verticalAlignment = Alignment.CenterVertically + ) { + Icon( + modifier = Modifier + .padding(horizontal = 6.dp) + .size(22.dp), + imageVector = 
Icons.Default.Person, + tint = Color.White, + contentDescription = null + ) + + Text( + modifier = Modifier.padding(end = 12.dp, top = 3.dp, bottom = 3.dp), + text = liveCount.toString(), + color = Color.White + ) + } } }
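
As a usage sketch of the duration state this diff introduces: `call.state.liveDurationInMs` (added in `CallState.kt` above) emits elapsed milliseconds roughly once per second while the call is live, and `null` otherwise, so a label only needs to collect it and format the value. `Call` and `liveDurationInMs` are taken from the changes above; the `formatMs` helper below is hypothetical, not part of the SDK.

```kotlin
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.collectAsState
import androidx.compose.runtime.getValue
import io.getstream.video.android.core.Call

// Renders the new liveDurationInMs StateFlow as an mm:ss label.
// Shows 00:00 while the call is still in backstage (flow emits null).
@Composable
fun SessionTimeLabel(call: Call) {
    val sessionTime by call.state.liveDurationInMs.collectAsState()
    Text(text = formatMs(sessionTime ?: 0L))
}

// Hypothetical helper: converts elapsed milliseconds to an mm:ss string.
fun formatMs(ms: Long): String {
    val totalSeconds = ms / 1000
    return "%02d:%02d".format(totalSeconds / 60, totalSeconds % 60)
}
```

This mirrors what the tutorial change in `MainActivity.kt` does when it replaces the hard-coded `sessionTime = 10000` with `sessionTime = sessionTime ?: 0`.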