How to Screen Share from iOS Devices Using Amazon Chime SDK

The Amazon Chime SDKs for iOS and Android allow application developers to integrate real-time audio, video, and screen share functionality into their iOS and Android applications. The screen sharing functionality uses the same content sharing concept introduced in the Amazon Chime SDK for JavaScript (except that audio sharing is not supported). With screen sharing, telehealth application users can access and share any relevant notes, chat conversations, or documents with their healthcare professional in real time on their mobile device. Similarly, educational users are no longer limited by their choice of device while presenting in live virtual classrooms. Screen share is supported in the Amazon Chime SDKs for iOS and Android as a second video stream from mobile devices, in addition to the camera video stream.

In this blog post, we discuss adding code to an existing iOS application to enable screen share with other participants in a meeting, powered by the Amazon Chime SDK. If you have questions about adding this capability to Android applications, take a look at our content share guide in the Amazon Chime SDK for Android GitHub repository.


There are two options for screen sharing on iOS:

  • Only while the application is in the foreground: “In application only”
  • Until screen share is stopped: “Device”

Let’s take a high-level look at how each option is used.

In application only screen sharing

This option (iOS 11.0+) captures and shares the view when the customer application is in the foreground. Screen sharing is paused when the customer application is put into the background. Developers can use InAppScreenCaptureSource in the new Content Share API to capture the screen from their application and send the screen video stream to other participants.

This diagram shows how the application sends *In Application Only* screen capture video streams to Amazon Chime Media Services through the existing MeetingSession instance.

Device Level Screen Sharing

Device level screen sharing (iOS 12.0+) continues to send screen capture from other applications even when the customer application is in the background. Developers need to create a Broadcast Upload Extension and add it to the same App Group as their main application. The App Group allows the extension to access a shared UserDefaults suite, through which the main application passes the necessary meeting data. The extension uses this shared data to construct its own MeetingSession for the existing meeting and sends the screen video stream through the Content Share API, as shown in the sample code. This diagram shows how the Broadcast Upload Extension creates another MeetingSession instance to send *Device Level* screen capture.

Note: Deploying this demo and receiving traffic from the demo created in this post will incur AWS charges.


Prerequisites

  • You have read Building a Meeting Application on iOS using the Amazon Chime SDK, understand the basic architecture of the Amazon Chime SDK, and have deployed a serverless/browser demo meeting application.
  • You have read Custom Video Sources, Processors, and Sinks and have a basic understanding of APIs such as VideoSource.
  • You have a basic to intermediate understanding of iOS development and tools.
  • You have installed Xcode version 11.0 or later.
  • You have a physical iOS device. Screen Share is not supported by the iOS Simulator.
    • iOS 11.0 or later is required for In Application Only Screen Sharing
    • iOS 12.0 or later is required for Device Level Screen Sharing

In Application Only Screen Sharing

1. Create an InAppScreenCaptureSource and a CaptureSourceObserver

Add a CaptureSourceObserver that will be notified when the RPScreenRecorder starts, stops, or fails.

let inAppScreenCaptureSource = InAppScreenCaptureSource(logger: logger)
inAppScreenCaptureSource.addCaptureSourceObserver(observer: observer)

2. Start Screen Share

Add the following sample code. This starts capturing the screen and sends it to Amazon Chime Media Services.

func methodToStartScreenShare() {
    inAppScreenCaptureSource.start()
}

// CaptureSourceObserver
func captureDidStart() {
    logger.info(msg: "InAppScreenCaptureSource did start")
    let contentShareSource = ContentShareSource()
    contentShareSource.videoSource = inAppScreenCaptureSource
    meetingSession.audioVideo.startContentShare(source: contentShareSource)
}

The user will see a system permission dialog when InAppScreenCaptureSource.start() is called for the first time. captureDidStart() is called after the user taps “Record Screen” and RPScreenRecorder is started. captureDidFail(error:) is called with CaptureSourceError.systemFailure if the user taps “Don’t Allow”.

3. Stop Screen Share

Add the following sample code to stop capturing the screen and stop the content share connection.

func methodToStopScreenShare() {
    inAppScreenCaptureSource.stop()
}

// CaptureSourceObserver
func captureDidStop() {
    logger.info(msg: "InAppScreenCaptureSource did stop")
    meetingSession.audioVideo.stopContentShare()
}

func captureDidFail(error: CaptureSourceError) {
    logger.error(msg: "InAppScreenCaptureSource did fail: \(error.description)")
    meetingSession.audioVideo.stopContentShare()
}

Device Level Screen Sharing

1. Create Broadcast Upload Extension target

  1. With your application project open in Xcode, select File → New → Target.
  2. In the pop-up, select Broadcast Upload Extension, and fill in the necessary fields on the next screen.
    1. It is not necessary to include a UI Extension.
    2. Embed the extension in the customer application when prompted.
  3. Select the newly created broadcast upload extension target in the project.
  4. Open the General tab and add AmazonChimeSDK in the Frameworks and Libraries section.

Note: These screenshots and section names refer to Xcode 12.0. Other versions of Xcode might present these options differently.

2. Add App Groups Capability

To add the App Groups capability to both your main application target and the newly created broadcast upload extension target:

  1. Select Customer Application target in the project.
  2. Open the Signing & Capabilities tab.
  3. Click + Capability and select App Groups in the pop-up.
  4. Create an App Group with identifier: group.<application bundle id>.
  5. Repeat the same steps and select the same App Group for the broadcast upload extension target.
  6. Verify that .entitlements files are generated for your main application target and the broadcast upload extension target by Xcode. Ensure that the App Groups values are the same in both .entitlements files.
  7. Regenerate the provisioning profile for your application in the Apple Developer portal.
  8. Create a provisioning profile for the broadcast upload extension target.

Important: This step is crucial for the broadcast upload extension to access app group user defaults. Validate this step again if you cannot share screen during testing in debug or after the application is signed for distribution.
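For reference, the App Groups entry in each .entitlements file should look like the following (the group identifier shown here is a placeholder; use your own group.&lt;application bundle id&gt; value, and make sure it matches in both files):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Same array in the main application and the broadcast extension -->
    <key>com.apple.security.application-groups</key>
    <array>
        <string>group.com.example.myapp</string>
    </array>
</dict>
</plist>
```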

3. Add RPSystemBroadcastPickerView

Add RPSystemBroadcastPickerView to the application’s view hierarchy. This is the device level screen sharing entry point.

// In the view controller
let pickerViewDiameter: CGFloat = 35
let pickerView = RPSystemBroadcastPickerView(frame: CGRect(x: 0,
                                                           y: 0,
                                                           width: pickerViewDiameter,
                                                           height: pickerViewDiameter))
pickerView.preferredExtension = <Your Broadcast Extension Bundle Identifier>

// Microphone audio is passed through the main application instead of
// the broadcast extension.
pickerView.showsMicrophoneButton = false

// Set up view constraints as necessary.


RPSystemBroadcastPickerView contains a button that brings up the system broadcast picker when tapped. This button functions the same as the Screen Recording button in the iOS Control Center.
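RPSystemBroadcastPickerView does not expose a public API to present the picker directly. If you want to trigger it from your own UI, a common workaround is to forward a tap to the picker’s internal button. Note that this is a hypothetical helper that relies on the view’s private subview hierarchy, which Apple does not guarantee and which may change in future iOS releases:

```swift
import ReplayKit
import UIKit

// Hypothetical helper: programmatically "tap" the picker's internal button.
// This depends on RPSystemBroadcastPickerView's private view hierarchy.
func presentBroadcastPicker(from pickerView: RPSystemBroadcastPickerView) {
    for case let button as UIButton in pickerView.subviews {
        button.sendActions(for: .touchUpInside)
    }
}
```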

4. Write MeetingSession data from Customer Application

Reference the following sample code to write necessary MeetingSessionConfiguration data into App Group User Defaults from the application after MeetingSession is created.

let meetingSessionConfig = meetingSession.configuration
let userDefaultsKeyMeetingId = "demoMeetingId"
let userDefaultsKeyCredentials = "demoMeetingCredentials"
let userDefaultsKeyUrls = "demoMeetingUrls"

if let appGroupUserDefaults = UserDefaults(suiteName: <Your App Group Identifier>) {
    appGroupUserDefaults.set(meetingSessionConfig.meetingId, forKey: userDefaultsKeyMeetingId)
    let encoder = JSONEncoder()
    if let credentials = try? encoder.encode(meetingSessionConfig.credentials) {
        appGroupUserDefaults.set(credentials, forKey: userDefaultsKeyCredentials)
    }
    if let urls = try? encoder.encode(meetingSessionConfig.urls) {
        appGroupUserDefaults.set(urls, forKey: userDefaultsKeyUrls)
    }
}
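This pattern works because the configuration types are Codable. The encode/store/decode round trip can be sketched with a plain Codable struct; the struct, key, and values below are illustrative stand-ins, not SDK types:

```swift
import Foundation

// Illustrative stand-in for a Codable SDK type such as MeetingSessionCredentials.
struct DemoCredentials: Codable, Equatable {
    let attendeeId: String
    let joinToken: String
}

let original = DemoCredentials(attendeeId: "attendee-1", joinToken: "token-abc")
let defaults = UserDefaults.standard // stands in for the App Group suite

// Write: encode to Data and store under a key both processes know.
if let data = try? JSONEncoder().encode(original) {
    defaults.set(data, forKey: "demoCredentials")
}

// Read: load Data and decode back into the struct.
if let stored = defaults.data(forKey: "demoCredentials"),
   let decoded = try? JSONDecoder().decode(DemoCredentials.self, from: stored) {
    assert(decoded == original)
}
```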

5. Read MeetingSession data from the Broadcast Upload Extension

Now that the data has been written to the App Group User Defaults by the customer application, reference the following sample code to recreate the MeetingSessionConfiguration in SampleHandler.swift in the Broadcast Upload Extension.

let userDefaultsKeyMeetingId = "demoMeetingId"
let userDefaultsKeyCredentials = "demoMeetingCredentials"
let userDefaultsKeyUrls = "demoMeetingUrls"

class SampleHandler: RPBroadcastSampleHandler {
    func recreateMeetingSessionConfig() -> MeetingSessionConfiguration? {
        guard let appGroupUserDefaults = UserDefaults(suiteName: <Your App Group Identifier>) else {
            logger.error(msg: "App Group User Defaults not found")
            return nil
        }
        let decoder = JSONDecoder()
        if let meetingId = appGroupUserDefaults.demoMeetingId,
           let credentialsData = appGroupUserDefaults.demoMeetingCredentials,
           let urlsData = appGroupUserDefaults.demoMeetingUrls,
           let credentials = try? decoder.decode(MeetingSessionCredentials.self, from: credentialsData),
           let urls = try? decoder.decode(MeetingSessionURLs.self, from: urlsData) {

            // Use the same URLRewriter as the customer application.
            return MeetingSessionConfiguration(meetingId: meetingId,
                                               credentials: credentials,
                                               urls: urls,
                                               urlRewriter: URLRewriterUtils.defaultUrlRewriter)
        }
        return nil
    }
}

extension UserDefaults {
    @objc dynamic var demoMeetingId: String? {
        return string(forKey: userDefaultsKeyMeetingId)
    }
    @objc dynamic var demoMeetingCredentials: Data? {
        return object(forKey: userDefaultsKeyCredentials) as? Data
    }
    @objc dynamic var demoMeetingUrls: Data? {
        return object(forKey: userDefaultsKeyUrls) as? Data
    }
}

6. Start Screen Share

Add the following code to SampleHandler. This sends the device level screen capture to Amazon Chime Media Services through the Content Share APIs.

var currentMeetingSession: MeetingSession?
lazy var replayKitSource: ReplayKitSource = { return ReplayKitSource(logger: logger) }()
lazy var contentShareSource: ContentShareSource = {
    let source = ContentShareSource()
    source.videoSource = replayKitSource
    return source
}()

override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
    guard let config = recreateMeetingSessionConfig() else {
        logger.error(msg: "Unable to recreate MeetingSessionConfiguration from Broadcast Extension")
        finishBroadcastWithError(NSError(domain: "AmazonChimeSDKDemoBroadcast", code: 0))
        return
    }
    currentMeetingSession = DefaultMeetingSession(configuration: config, logger: logger)
    currentMeetingSession?.audioVideo.startContentShare(source: contentShareSource)
}

override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    replayKitSource.processSampleBuffer(sampleBuffer: sampleBuffer, type: sampleBufferType)
}

The ReplayKit framework sends screen capture data as CMSampleBuffer. ReplayKitSource converts it to VideoFrame and sends it into Amazon Chime SDK for iOS. It also handles device rotation during the conversion.
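If you need to inspect or filter buffers before forwarding them, the relay can be sketched as below. This is optional, assuming ReplayKitSource already ignores buffer types it does not handle; dropping the audio buffer types explicitly simply makes the extension’s behavior visible in your own code:

```swift
import ReplayKit

override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case .video:
        // Forward screen frames to the SDK for conversion and sending.
        replayKitSource.processSampleBuffer(sampleBuffer: sampleBuffer, type: sampleBufferType)
    case .audioApp, .audioMic:
        // Audio from the extension is not sent; microphone audio goes
        // through the main application instead.
        break
    @unknown default:
        break
    }
}
```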

7. Stop Screen Share

Update the following method in SampleHandler to stop the content share peer connection when the user stops the broadcast:

override func broadcastFinished() {
    currentMeetingSession?.audioVideo.stopContentShare()
}

Reference the following sample code to stop broadcast programmatically:

// In the main application
func endMeeting() {
    guard let appGroupUserDefaults = UserDefaults(suiteName: <Your App Group Identifier>) else {
        logger.error(msg: "App Group User Defaults not found")
        return
    }
    appGroupUserDefaults.removeObject(forKey: userDefaultsKeyMeetingId)
    appGroupUserDefaults.removeObject(forKey: userDefaultsKeyCredentials)
    appGroupUserDefaults.removeObject(forKey: userDefaultsKeyUrls)
}

// In the broadcast upload extension
class SampleHandler: RPBroadcastSampleHandler {
    var userDefaultsObserver: NSKeyValueObservation?

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // If the meetingId is changed from the demo app, observe the meetingId
        // and stop the broadcast.
        userDefaultsObserver = appGroupUserDefaults?.observe(\.demoMeetingId,
                                                             options: [.new, .old]) { [weak self] (_, _) in
            guard let strongSelf = self else { return }
            strongSelf.finishBroadcastWithError(NSError(domain: "<Your App Domain>", code: errorCode))
        }
    }

    override func broadcastFinished() {
        userDefaultsObserver?.invalidate()
    }
}

The user can stop the broadcast by any of the following:

  • Tap the RPSystemBroadcastPickerView added in Customer Application.
  • Tap the status bar.
  • Tap the Screen Recorder button in the Control Center.

Stopping the broadcast programmatically may be needed when:

  • The MeetingSession is ended in the main application. Otherwise, the screen capture will continue to be sent to the meeting session.
  • The main application is terminated by the system; handle this in applicationWillTerminate(_:).
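For the termination case, clearing the shared meetingId is enough: the extension’s key-value observer on demoMeetingId fires and ends the broadcast. A sketch (the suite name is a placeholder for your App Group identifier):

```swift
import UIKit

// In the main application's AppDelegate. Removing the shared meetingId
// triggers the broadcast extension's UserDefaults observer, which then
// calls finishBroadcastWithError to end the broadcast.
func applicationWillTerminate(_ application: UIApplication) {
    guard let appGroupUserDefaults = UserDefaults(suiteName: "<Your App Group Identifier>") else {
        return
    }
    appGroupUserDefaults.removeObject(forKey: userDefaultsKeyMeetingId)
}
```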


Testing

Now users can share their view of the main application or bring up the system broadcast picker to share device level screen capture. Users can view the screen share from another device by joining the same meeting session, just as when viewing screen share from the Amazon Chime SDK for JavaScript. The attendeeId of a screen share is the same as the original attendee’s, but with a suffix of #content. videoTileDidAdd(tileState:) on VideoTileObserver is called with tileState.isContent set to true for the screen share video stream.

Note: You will need a physical iOS device to test screen sharing.

// Add VideoTileObserver instance to receive video streams.
meetingSession.audioVideo.addVideoTileObserver(observer: observer)

// Your VideoTileObserver implementation
// Your VideoTileObserver implementation
func videoTileDidAdd(tileState: VideoTileState) {
    if tileState.isContent {
        meetingSession.audioVideo.bindVideoView(videoView: videoView,
                                                tileId: tileState.tileId)
    }
}
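The companion callback can unbind the view when the content share stops. A sketch, assuming your observer keeps a reference to the meeting session:

```swift
// Unbind the video view when the screen share tile is removed.
func videoTileDidRemove(tileState: VideoTileState) {
    if tileState.isContent {
        meetingSession.audioVideo.unbindVideoView(tileId: tileState.tileId)
    }
}
```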


Cleanup

If you no longer want to keep the demo active in your AWS account and want to avoid incurring AWS charges, remove the demo resources by deleting the two AWS CloudFormation stacks created in the prerequisites. You can find them in your AWS CloudFormation console.


Conclusion

In this blog post, you learned how to add screen sharing functionality to your iOS application while using the Amazon Chime SDK to send audio and video streams. The Amazon Chime SDK for iOS provides simple APIs to enable In Application Only Screen Sharing, and developers can add a Broadcast Upload Extension to their application to enable Device Level Screen Sharing. With both options, developers can provide their end users a richer audio/video conferencing experience. If you have questions about screen share or other functionality of the Amazon Chime SDK for iOS, take a look at our iOS demo application in the official GitHub repository.