Does anybody know whether there is a way to upload frame buffers from a Broadcast Upload Extension to the host app, or should I send them directly to the back end? My goal is to intercept frame buffers from ReplayKit, send them to my application, and broadcast the video through my application using WebRTC. I will appreciate any help. Thanks in advance.
- Could you please share the solution that you found? It could be a description of the algorithm we should use or, better, the source code. Thank you in advance. P.S. I am interested in two parts: how did you transfer data from the extension to the main app, and how do you encode samples to pass them to WebRTC? – Bws Sluk, May 31, 2018 at 10:10
- Have you found a solution yet? – Jaswant Singh, Jun 6, 2022 at 13:25
3 Answers
Only the Broadcast Upload Extension and the Broadcast UI Extension are loaded when the broadcast starts, and as far as I know there is no programmatic way of launching your host app and streaming any data to it in the background.
But you can implement the whole logic in the Broadcast Upload Extension. Your RPBroadcastSampleHandler implementation is fed with video CMSampleBuffers. All post-processing and upload logic is up to the implementation, so you can unpack and process the frames and then upload them to your server in any suitable way. If you need any configuration or authorization details, you can simply set them in the Broadcast UI Extension (or even in your host app) and then store them in shared storage, as sketched below.
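For example, a minimal sketch of passing such details through an App Group container; the group identifier, keys, and values here are placeholders, not part of the original answer:

import Foundation

// Both the host app target and the extension target must have the same
// App Group capability enabled; "group.com.example.broadcast" is a placeholder.
let sharedDefaults = UserDefaults(suiteName: "group.com.example.broadcast")

// Host app (or Broadcast UI Extension): store what the upload extension will need.
sharedDefaults?.set("wss://example.com/signaling", forKey: "signalingURL")
sharedDefaults?.set("auth-token-value", forKey: "authToken")

// Broadcast Upload Extension: read the values back when the broadcast starts.
let signalingURL = sharedDefaults?.string(forKey: "signalingURL")
let authToken = sharedDefaults?.string(forKey: "authToken")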
There is not much information about it on the internet or in the Apple documentation. But you still can:
- Watch the WWDC 2016 video Go Live with ReplayKit
- Read the RPBroadcastSampleHandler documentation
- Read this quite useful blog post (in Chinese): http://blog.lessfun.com/blog/2016/09/21/ios-10-replaykit-live-and-broadcast-extension/
- Play around with the stub implementation of the upload extension (simply create the target in Xcode)
I tried exactly the same thing with a ReplayKit and WebRTC combination. The fundamental issue with WebRTC on iOS is that it can't handle a video stream once the app goes to the background. So you can stream your app's screen to WebRTC while your video chat app is in the foreground, but to stream another app's screen, the moment your app goes to the background you can no longer handle the video stream, only voice over WebRTC.
You'd better upload it to the server from the upload extension; I've already wasted too much time trying to connect the upload extension to the host app. There is absolutely no control over the upload extension.
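If you want to experiment anyway, a comment below suggests forcing VP8 software encoding. A minimal sketch of a VP8-only encoder factory, assuming the standard libwebrtc iOS API (the class name is mine), could look like this:

import WebRTC

final class VP8OnlyEncoderFactory: NSObject, RTCVideoEncoderFactory {
    // Always hand back the software VP8 encoder, whatever codec was negotiated.
    func createEncoder(_ info: RTCVideoCodecInfo) -> RTCVideoEncoder? {
        return RTCVideoEncoderVP8.vp8Encoder()
    }

    // Advertise only VP8, so the hardware H.264 encoder is never negotiated.
    func supportedCodecs() -> [RTCVideoCodecInfo] {
        return [RTCVideoCodecInfo(name: "VP8")]
    }
}

// Pass it in when building the peer connection factory:
// let factory = RTCPeerConnectionFactory(encoderFactory: VP8OnlyEncoderFactory(),
//                                        decoderFactory: RTCDefaultVideoDecoderFactory())

I can't promise this works around the background limitation, though.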
- Seconded. Just wasted two weeks on the same approach. It worked great until the app entered the background. I feel like it could work if you force WebRTC to use VP8 software encoding, but I can't get the setting to stick. – nevyn, Aug 1, 2018 at 14:21
- Did you try to implement a broadcast extension for that, or did you just stream from the app itself? – Oct 24, 2018 at 13:55
- Meanwhile, it is possible to stream via WebRTC even in the background. Use the ReplayKit 2 functionality for capturing the entire screen. – decades, May 1, 2019 at 21:07
- I would like to know if you tried to implement a broadcast extension or just streamed from your app itself? A broadcast extension seems like it should work even in the background. – Dec 11, 2019 at 19:41
I have some code for you. I've already implemented it in my project and discussed it on google-groups: https://groups.google.com/d/msg/discuss-webrtc/jAHCnB12khE/zJEu1vyUAgAJ
I will copy the code here for the next generations:
First of all, I created an additional class in the broadcast extension to manage the WebRTC-related code and called it PeerManager.
Set up the video track with the local stream. Be careful: you should do this before generating the local offer.
private func setupVideoStreaming() {
    localStream = webRTCPeer.peerConnectionFactory.mediaStream(withStreamId: "\(personID)_screen_sharing")
    videoSource = webRTCPeer.peerConnectionFactory.videoSource()
    videoCapturer = RTCVideoCapturer(delegate: videoSource)
    videoSource.adaptOutputFormat(toWidth: 441, height: 736, fps: 15)
    let videoTrack = webRTCPeer.peerConnectionFactory.videoTrack(with: videoSource, trackId: "screen_share_track_id")
    videoTrack.isEnabled = true
    localStream.addVideoTrack(videoTrack)
    // Detach any previously added streams before attaching the screen-sharing stream.
    for localStream in webRTCPeer.localPeerConnection.peerConnection.localStreams {
        webRTCPeer.localPeerConnection.peerConnection.remove(localStream)
    }
    webRTCPeer.localPeerConnection.peerConnection.add(localStream)
}
I get a callback from the system that provides a CMSampleBuffer; I convert it to an RTCVideoFrame and push it to the videoSource (emulating a video capturer):
override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case RPSampleBufferType.video:
        // Handle video sample buffer
        guard peerManager != nil, let imageBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            break
        }
        let pixelFormat = CVPixelBufferGetPixelFormatType(imageBuffer) // kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
        // Convert the presentation timestamp to nanoseconds, as RTCVideoFrame expects.
        let timeStampNs: Int64 = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1_000_000_000)
        let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: imageBuffer)
        let rtcVideoFrame = RTCVideoFrame(buffer: rtcPixelBuffer, rotation: ._0, timeStampNs: timeStampNs)
        peerManager.push(videoFrame: rtcVideoFrame)
    case RPSampleBufferType.audioApp:
        break
    case RPSampleBufferType.audioMic:
        break
    }
}
Here is the code from PeerManager that implements the push function used above. Nothing strange here: we emulate the capturer behavior using the delegate.
func push(videoFrame: RTCVideoFrame) {
    guard isConnected, videoCapturer != nil, isProcessed else {
        return
    }
    videoSource.capturer(videoCapturer, didCapture: videoFrame)
}
Now you are ready to generate the local offer, send it, and transfer whatever data you want. Check your local offer: if you did everything right, you should see a=sendonly in the offer.
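A rough sketch of that last step, assuming peerConnection is your already configured RTCPeerConnection (the names here are illustrative):

let constraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
peerConnection.offer(for: constraints) { sdp, error in
    guard let sdp = sdp, error == nil else { return }
    // With only a local send track attached, the SDP should contain "a=sendonly".
    print(sdp.sdp.contains("a=sendonly") ? "offer is send-only" : "check your tracks")
    peerConnection.setLocalDescription(sdp) { _ in
        // Deliver sdp.sdp to the remote peer over your signaling channel.
    }
}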
P.S. As VladimirTechMan suggested, you may also check the sample code of the broadcast extension in the AppRTCMobile demo app. I've found the link for you; it is an Objective-C example: https://webrtc.googlesource.com/src/+/358f2e076051d28b012529d3ae6a080838d27209 You will be interested in the ARDBroadcastSampleHandler.m/.h and ARDExternalSampleCapturer.m/.h files. And don't forget that you can build it yourself following the instructions at https://webrtc.org/native-code/ios/
- I was trying to get the project running, but I cannot build the project from the sample code, and I used your snippets, but that is not working either. Should I add anything else? – dmyma, Sep 7, 2018 at 0:23
- @dmyma You should set up the code that you use for a usual WebRTC call in your broadcast extension, but remove the device-capturer part and replace it with the function func push(videoFrame: RTCVideoFrame), because in the broadcast extension you don't have a capturer device. – Bws Sluk, Oct 14, 2018 at 15:01
- @MaheshShahane Can you please answer my question stackoverflow.com/questions/61190672/… I have been struggling with this for weeks now. – Apr 27, 2020 at 4:51
- Did you import the WebRTC framework into the extension, or did you somehow create the extension within the framework? – May 5, 2020 at 14:56
- Hey @BwsSluk, could you share your source on GitHub? I am facing a similar issue. Trying to import the WebRTC framework into the broadcast extension just doesn't seem to be working; I'm not sure if I am doing something wrong. – May 5, 2020 at 16:27