Building a Meeting Application on Android using the Amazon Chime SDK
We introduced the Amazon Chime SDKs for iOS and Android to supplement the Amazon Chime SDK for JavaScript. These provide application developers native choices when integrating audio, video, and screen share viewing sessions into their own applications. In addition to providing methods to access the Amazon Chime SDK media services and local audio and video devices, the Amazon Chime SDK for Android supports real-time signaling for audio quality, active speaker events, enhanced echo cancellation, hardware-accelerated encoding, decoding, and rendering, and adaptive bandwidth. The Amazon Chime SDK for Android consists of a native media binary shared with the Amazon Chime SDK for iOS and a Kotlin wrapper for easy integration.
Using the Amazon Chime SDK, you can add a full unified communication experience to your mobile application on any supported Android device, for example to enable distance learning or telehealth solutions, or simple one-to-one video interactions for customers on the go.
This post demonstrates how to integrate the Amazon Chime SDK into your Android application project. First, we walk through configuring an Android project. We then describe how to call the APIs provided by the Amazon Chime SDK for Android to build a basic meeting experience along with sample code.
Integration walk-through
Prerequisites
- You have read Building a meeting application using the Amazon Chime SDK to understand the basic architecture of the Amazon Chime SDK, and have deployed a serverless/browser demo meeting application.
- You have a basic to intermediate understanding of Kotlin and Android development.
- You have installed Android Studio and have an Android application project.
Note: A physical Android device is recommended for a better testing experience.
Key steps outline
Here is an outline of the key steps involved in integrating the Amazon Chime SDK into your Android application.
- Configure your application
- Create a meeting session
- Access AudioVideoFacade
- Handle real-time events
- Render a video tile
- Report metrics
- Test
- Cleanup
Configure your application
To declare the Amazon Chime SDK as a dependency, you must complete the following steps.
- Download amazon-chime-sdk-media.tar.gz and amazon-chime-sdk.tar.gz.
- Unzip the files and copy amazon-chime-sdk-media.aar and amazon-chime-sdk.aar into your application’s library directory.
- Open your project’s build.gradle and add the following under repositories in allprojects (a consolidated Gradle sketch follows below).
- Add the following under the dependencies section.
- Use Java 8 features by adding the following under the android section.

The MODIFY_AUDIO_SETTINGS, RECORD_AUDIO, and CAMERA permissions are already added to the manifest by the Amazon Chime SDK. Your activity should also request the appropriate permissions.

private val PERMISSION_REQUEST_CODE = 1
private val PERMISSIONS = arrayOf(
    Manifest.permission.MODIFY_AUDIO_SETTINGS,
    Manifest.permission.RECORD_AUDIO,
    Manifest.permission.CAMERA)

// Request the permissions from an Activity (requestPermissions needs an
// Activity, not an application context).
ActivityCompat.requestPermissions(this, PERMISSIONS, PERMISSION_REQUEST_CODE)
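Here is a sketch of those three build.gradle additions, assuming the .aar files were copied into your module’s libs directory:

allprojects {
    repositories {
        flatDir {
            dirs 'libs'
        }
    }
}

dependencies {
    implementation(name: 'amazon-chime-sdk', ext: 'aar')
    implementation(name: 'amazon-chime-sdk-media', ext: 'aar')
}

android {
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}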
You are now ready to integrate with the Amazon Chime SDK for Android. Next, we walk you through the key APIs needed to build a basic audio, video, and screen share viewing experience. Refer to the API documentation in our GitHub repository for additional details.
Create a meeting session
To start a meeting, you must complete the following steps to create a meeting session.
- Make a POST request to meetingUrl to create a meeting and an attendee. The meetingUrl is the URL of the serverless demo meeting application you deployed (see the Prerequisites section). Don’t forget to escape the inputs appropriately, as shown in the following code (a sketch of issuing the request itself follows this list).

val attendeeName = java.net.URLEncoder.encode(attendee, "utf-8")
val region = java.net.URLEncoder.encode("us-east-1", "utf-8")
val title = java.net.URLEncoder.encode(meetingId, "utf-8")
val url = "${meetingUrl}join?title=$title&name=$attendeeName&region=$region"
- Use the response from the previous request to construct a MeetingSessionConfiguration. You can convert the JSON response to the pre-defined CreateMeetingResponse and CreateAttendeeResponse types in a number of ways. In the following example we use Gson.

// Data structure that maps to the HTTP response.
data class JoinMeetingResponse(
    @SerializedName("JoinInfo") val joinInfo: MeetingInfo)

data class MeetingInfo(
    @SerializedName("Meeting") val meetingResponse: MeetingResponse,
    @SerializedName("Attendee") val attendeeResponse: AttendeeResponse)

data class MeetingResponse(
    @SerializedName("Meeting") val meeting: Meeting)

data class AttendeeResponse(
    @SerializedName("Attendee") val attendee: Attendee)

// Deserialize the response to an object.
val joinMeetingResponse = Gson().fromJson(
    response.toString(),
    JoinMeetingResponse::class.java
)

// Construct the configuration using the meeting response.
val configuration = MeetingSessionConfiguration(
    CreateMeetingResponse(joinMeetingResponse.joinInfo.meetingResponse.meeting),
    CreateAttendeeResponse(joinMeetingResponse.joinInfo.attendeeResponse.attendee)
)

// Create a default meeting session.
val meetingSession = DefaultMeetingSession(configuration, ConsoleLogger(), applicationContext)
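Issuing the join request itself can be done in a number of ways; the following is a minimal sketch using HttpURLConnection, assuming it runs off the main thread (for example, in a coroutine on Dispatchers.IO):

import java.net.HttpURLConnection
import java.net.URL

// POST to the demo's join endpoint; the parameters travel in the query
// string, so the body stays empty. Returns the JSON response, or null on error.
fun joinMeeting(url: String): String? {
    val connection = URL(url).openConnection() as HttpURLConnection
    return try {
        connection.requestMethod = "POST"
        if (connection.responseCode in 200..299) {
            connection.inputStream.bufferedReader().use { it.readText() }
        } else {
            null
        }
    } finally {
        connection.disconnect()
    }
}

The returned string is the response that the Gson example above deserializes.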
Access AudioVideoFacade
Now that we have the meeting session, we can access the AudioVideoFacade
instance and use it to control the audio and video experience.
val audioVideo = meetingSession.audioVideo
// Start audio and video clients.
audioVideo.start()
Your application now starts sending and receiving audio streams. You can turn local audio on and off by calling the mute and unmute methods on the facade.
// Mute local audio input.
audioVideo.realtimeLocalMute()
// Unmute local audio input.
audioVideo.realtimeLocalUnmute()
The video does not start automatically. Call the following methods to start sending local video and receiving remote video.
// Start receiving remote video.
audioVideo.startRemoteVideo()
// Start sending local video.
audioVideo.startLocalVideo()
// Switch camera for local video between front and back.
audioVideo.switchCamera()
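When the user leaves the meeting, stop the clients again to release the microphone and camera:

// Stop audio and video clients when leaving the meeting.
audioVideo.stop()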
Handle real-time events
We want to handle various real-time events during the meeting to update the UI accordingly. Events are triggered when attendees join or leave the meeting, audio is muted or unmuted, or video is enabled or disabled. The Amazon Chime SDK for Android provides several observer interfaces, including AudioVideoObserver, RealtimeObserver, and VideoTileObserver, which you can implement in your application to handle those events. Let’s look at samples for each interface.
1. AudioVideoObserver — status of audio and video client
AudioVideoObserver
is used to monitor the status of audio or video sessions. The following code demonstrates some examples of the callbacks from AudioVideoObserver
.
override fun onAudioSessionStarted(reconnecting: Boolean) =
logger.info(TAG, "Audio successfully started. reconnecting: $reconnecting")
override fun onAudioSessionStopped(sessionStatus: MeetingSessionStatus) =
logger.info(TAG, "Audio stopped for reason: ${sessionStatus.statusCode}")
override fun onVideoSessionStarted() =
logger.info(TAG, "Video successfully started.")
override fun onVideoSessionStopped(sessionStatus: MeetingSessionStatus) =
logger.info(TAG, "Video stopped for reason: ${sessionStatus.statusCode}")
// Register the observer.
audioVideo.addAudioVideoObserver(this)
2. RealtimeObserver — attendee status
This observer is useful for maintaining a list of attendees along with their audio volume and signal strength. Note that it only reports changes since the last notification.
// Notifies when attendees join the meeting.
override fun onAttendeesJoined(attendeeInfo: Array<AttendeeInfo>) {
    attendeeInfo.forEach {
        logger.debug(TAG, "Attendee joined. attendeeId: ${it.attendeeId} externalUserId: ${it.externalUserId}")
    }
}
// Notifies when volume levels change.
override fun onVolumeChanged(volumeUpdates: Array<VolumeUpdate>) {
    volumeUpdates.forEach { (attendeeInfo, volumeLevel) ->
        logger.info(TAG, "AttendeeId: ${attendeeInfo.attendeeId} externalUserId: ${attendeeInfo.externalUserId} volumeLevel: $volumeLevel")
    }
}
// Register the observer.
audioVideo.addRealtimeObserver(this)
Once you have a list of attendees, you can derive the attendee name from the externalUserId with the following code.
val attendeeName = attendeeInfo.externalUserId.split('#')[1]
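The demo encodes the attendee name after a '#' separator. If an externalUserId might not follow that convention, a defensive variant (a sketch) falls back to the raw value instead of throwing:

val attendeeName = attendeeInfo.externalUserId.split('#').getOrNull(1)
    ?: attendeeInfo.externalUserId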
3. VideoTileObserver — video tile track
Some applications only need audio, so video events are handled by a separate observer. If you’re not going to render video, you can skip this step and the render video tile step below.
By implementing onVideoTileAdded
and onVideoTileRemoved
, you can track the currently active video tiles. A video tile can come from either a camera or a screen share.
override fun onVideoTileAdded(tileState: VideoTileState) {
    logger.info(
        TAG,
        "Video tile added, tileId: ${tileState.tileId}, attendeeId: ${tileState.attendeeId}, isContent: ${tileState.isContent}")
    showVideoTile(tileState)
}
override fun onVideoTileRemoved(tileState: VideoTileState) {
    logger.info(
        TAG,
        "Video tile removed, tileId: ${tileState.tileId}, attendeeId: ${tileState.attendeeId}")
    // Unbind the video tile to release its resources.
    audioVideo.unbindVideoView(tileState.tileId)
}
// The tile could be remote or local video. See the following step for how to render it.
fun showVideoTile(tileState: VideoTileState) {
    ...
}
// Register the observer.
audioVideo.addVideoTileObserver(this)
Render video tile
To render a video tile (both local and remote), you must define a VideoRenderView
in the layout resource file where you want to display the video tile.
<com.amazon.chime.sdk.media.mediacontroller.video.DefaultVideoRenderView
android:id="@+id/video_surface"
android:layout_width="match_parent"
android:layout_height="match_parent" />
To display a newly available video tile, bind the view to the tileId.
audioVideo.bindVideoView(view.video_surface, videoTileState.tileId)
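Putting the pieces together, the showVideoTile function referenced by the observer example might look like the following sketch, assuming your activity inflates the layout above:

private fun showVideoTile(tileState: VideoTileState) {
    // Bind the render view declared in the layout to the newly added tile.
    val renderView = findViewById<DefaultVideoRenderView>(R.id.video_surface)
    audioVideo.bindVideoView(renderView, tileState.tileId)
}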
Report metrics
The Amazon Chime SDK for Android emits metrics for you to monitor the meeting experience. Similar to the real-time observers, you must implement the MetricsObserver interface and register it. Check ObservableMetric for the available metrics.
override fun onMetricsReceived(metrics: Map<ObservableMetric, Any>) {
logger.debug(TAG, "Media metrics received: $metrics")
}
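As with the other observers, remember to register it on the facade:

// Register the observer.
audioVideo.addMetricsObserver(this)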
Test
After building and running your Android application, you can verify the end-to-end behavior. Test it by joining the same meeting from your Android device and a browser (using the demo application you set up in the prerequisites).
Amazon Chime SDK Android Demo Application
Cleanup
If you no longer want to keep the demo active in your AWS account and want to avoid incurring AWS charges, delete the two AWS CloudFormation stacks created in the prerequisites. You can find them in the AWS CloudFormation console.
Conclusion
Thanks for following and coding along. This blog should make it easier for you to integrate the Amazon Chime SDK into your Android application. You can download the complete demo application here; it includes additional sample code, such as:
- An activity to create the meeting
- An activity to choose the audio input device before entering the meeting
- An activity to show the current attendees and video
- Examples of implementing all observers