diff --git a/index.html b/index.html
index b609a68..25d33ce 100644
--- a/index.html
+++ b/index.html
@@ -1876,6 +1876,85 @@
The {{VideoFrameMetadata}} dictionary exposes the blur state as a property, which allows apps to know the state for every frame. This can be helpful for scenarios where the app wishes to help protect user privacy by never sending an un-blurred frame off of the user's system (a sketch of this appears after the example below).
partial dictionary VideoFrameMetadata {
  BackgroundBlur backgroundBlur;
};
backgroundBlur of type {{BackgroundBlur}}
  The state of the background blur effect for the current frame, if [=map/exist|present=]. Absence might indicate that the frame is not from a camera, or that the user agent does not support reporting the blur state.
dictionary BackgroundBlur : MediaEffectInfo {};
dictionary MediaEffectInfo {
  required boolean enabled;
};
enabled of type {{boolean}}
  True if the effect is enabled, false otherwise. This isn't a strong guarantee, as user agents likely can't detect all possible video manipulation software.
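For illustration only, the nesting of these dictionaries means a frame's metadata carries the blur state in roughly the following shape (assumed example values, not normative output):

// Illustrative shape of videoFrame.metadata() when the user agent reports
// blur state for a camera frame; "enabled" is inherited from MediaEffectInfo.
const exampleMetadata = {
  backgroundBlur: { enabled: true }
};

// When the frame is not from a camera, or reporting is unsupported,
// the backgroundBlur member is simply absent.
const exampleMetadataWithoutBlur = {};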
The following example shows how an application can observe the per-frame blur state by piping the camera track through a MediaStreamTrackProcessor and a VideoTrackGenerator:

const stream = await navigator.mediaDevices.getUserMedia({ video: true });
const videoTrack = stream.getVideoTracks()[0];
const blurIndicator = document.getElementById("blurIndicator");
const trackProcessor = new MediaStreamTrackProcessor({ track: videoTrack });
const trackGenerator = new VideoTrackGenerator();

const transformer = new TransformStream({
  async transform(videoFrame, controller) {
    const metadata = videoFrame.metadata();
    if ("backgroundBlur" in metadata) {
      console.log("backgroundBlur.enabled:", metadata.backgroundBlur.enabled);
      // Show the indicator only while the background is actually blurred.
      blurIndicator.hidden = !metadata.backgroundBlur.enabled;
    } else {
      console.log("backgroundBlur unsupported");
    }

    // Pass the frame through unchanged.
    controller.enqueue(videoFrame);
  },
});

trackProcessor.readable
  .pipeThrough(transformer)
  .pipeTo(trackGenerator.writable);

// Expose the processed frames as a regular MediaStream.
const processedStream = new MediaStream();
processedStream.addTrack(trackGenerator.track);
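As a sketch of the privacy scenario mentioned at the start of this section, an application could refuse to forward any frame that the user agent does not report as blurred. The transform name and the decision to simply drop such frames are assumptions for illustration, not part of this specification:

// Only forward frames whose blur state is reported and enabled; close and
// drop everything else so an un-blurred frame never leaves the user's system.
const privacyGate = new TransformStream({
  transform(videoFrame, controller) {
    const blur = videoFrame.metadata().backgroundBlur;
    if (blur && blur.enabled) {
      controller.enqueue(videoFrame);
    } else {
      videoFrame.close();
    }
  },
});

// For example:
// trackProcessor.readable.pipeThrough(privacyGate).pipeTo(trackGenerator.writable);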