Tag: AWS Device Farm


Announcing: Direct Device Access in AWS Device Farm

The defining characteristics of a useful mobile testing service in the cloud are instant access to a large number of real devices, the ability to run tests in parallel, excellent reporting, the ability to configure device state, and minimal changes to your existing test setup.

Working with our most successful mobile customers, we noticed that they use a small set of dedicated physical devices or emulators during the app development and test code authoring phase. Thereafter, when they are closer to the release, they access a larger fleet of devices to get extensive market coverage. In addition, their continuous integration (CI) tools need access to a set of devices to run their regression tests on every check-in and on a nightly or daily basis.


Custom Artifacts on AWS Device Farm

A good test report is essential to proper analysis of mobile app testing. It is not only a summary of test results, but also includes artifacts such as screenshots, videos, crash reports, logs, and app-generated files. A well-formed test report provides useful insights, leads to faster resolution of issues, and helps teams focus on the right priorities. With so many great testing frameworks available, each with its own strengths and weaknesses, the ability to configure what a report contains is important.

AWS Device Farm provides comprehensive test reporting that includes all of the standard capabilities: screenshots, videos, and crash reports. We also know that developers need access to custom reports and artifacts that are generated by the test framework. Today, we are providing access to any custom artifact (log file, test framework generated file, or application generated file) when accessing the test results within AWS Device Farm.

Testing Mobile Apps: A Primer

Why should you test your mobile app? A recent study showed that almost a quarter of users only use a mobile app once, and a shocking 95% abandon an app within the first month. Some of the reasons that users abandon an app relate to content and engagement. The biggest non-content reasons for abandonment are application crashes and security concerns. No one can prevent all application crashes; the mobile ecosystem is too broad and unpredictable to provide a 100% guarantee. However, testing ensures that your mobile app has been stressed on as many devices as possible, which enables you to identify and fix bugs early.

Types of mobile testing

There are seven types of scenarios that you should consider when testing a client-server application such as a mobile app:

  • Unit testing
  • UI testing
  • Fuzz testing
  • Performance testing
  • End-to-end testing
  • Pre-production testing
  • Canary (post-production) testing

Tests are organized into test suites – sets of tests that together test the entire functionality of your app.  Let’s look at each of these types.

Unit testing

Unit testing tests individual parts of your code for correctness, usually using an automated test suite. A good unit test assures you that the functionality you expect from the unit (your code, usually a single method) is correct. A good unit test is:

  • Repeatable – you can run it several times and it produces the same result.
  • Fast – you don’t want it to interrupt the flow of your work.
  • Readable – you can understand what the test is doing.
  • Independent – you can run a single test from a suite.
  • Comprehensive – you have enough tests to test all the main cases and corner cases of inputs and cover all code paths.

A good unit test suite augments the developer documentation for your app. This helps new developers come up to speed by describing the functionality of specific methods.  When coupled with good code coverage, a unit test acts as a safeguard against regressions.  Unit tests are important for anything that does not produce a UI.  For example, consider a Notes app that has a set of notes stored in a SQLite database on a mobile device. For this app, there is a class that accesses the notes and a set of classes that implement the UI.  The class that accesses the notes has unit tests.

Each platform has its own unit testing framework: JUnit-based frameworks on Android and XCTest on iOS.

Each of these frameworks supports all of the features you need to start testing, including support for async functionality (a common pattern in mobile apps). Each testing framework has its own syntax, but in essence they all operate the same way: you specify some inputs, call the method under test, and then verify that the output is what you expect it to be.
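As a concrete illustration of that arrange-act-assert flow, here is a minimal JUnit sketch of the kind of non-UI logic the Notes app above might contain. The validator helper is hypothetical and is shown inline so the example is self-contained.

import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class NoteValidatorTest {

    // Hypothetical helper from the Notes app: a note title must not be blank.
    static boolean isValidTitle(String title) {
        return title != null && !title.trim().isEmpty();
    }

    @Test
    public void acceptsNonEmptyTitle() {
        // Main case: a normal title is accepted.
        assertTrue(isValidTitle("Buy milk"));
    }

    @Test
    public void rejectsBlankTitle() {
        // Corner case: whitespace-only input.
        assertFalse(isValidTitle("   "));
    }

    @Test
    public void rejectsNullTitle() {
        // Corner case: missing input.
        assertFalse(isValidTitle(null));
    }
}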

UI testing

UI testing takes a flow that the user might follow and ensures that it produces the right output. It can be done on real devices or on emulators and simulators. Given the same state (including backing stores and starting point), a UI test should always produce the same output. A UI test is similar in spirit to a unit test: the input is the user's taps and the output is the resulting screen. UI testing is associated more closely with the device than with the platform language. Each platform has a preferred UI testing framework (Espresso on Android, XCTest UI on iOS), and each framework has a test recorder, which enables you to record a UI flow on one device and then replay the test on many other devices. The test recorder helps you get productive quickly.
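Whether recorded or written by hand, a UI test simply drives the app through a flow and asserts on what ends up on screen. The following is a minimal, hedged Espresso sketch for the hypothetical Notes app described earlier; the activity name and view IDs are assumptions, and the androidx test libraries are used (the android.support.test equivalents applied at the time this was written).

import static androidx.test.espresso.Espresso.closeSoftKeyboard;
import static androidx.test.espresso.Espresso.onView;
import static androidx.test.espresso.action.ViewActions.click;
import static androidx.test.espresso.action.ViewActions.typeText;
import static androidx.test.espresso.assertion.ViewAssertions.matches;
import static androidx.test.espresso.matcher.ViewMatchers.isDisplayed;
import static androidx.test.espresso.matcher.ViewMatchers.withId;
import static androidx.test.espresso.matcher.ViewMatchers.withText;

import androidx.test.ext.junit.rules.ActivityScenarioRule;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(AndroidJUnit4.class)
public class AddNoteUiTest {

    // Hypothetical launcher activity of the Notes app.
    @Rule
    public ActivityScenarioRule<MainActivity> activityRule =
            new ActivityScenarioRule<>(MainActivity.class);

    @Test
    public void addingANoteShowsItInTheList() {
        // Input: the taps and text a user would enter (view IDs are assumed).
        onView(withId(R.id.add_note_button)).perform(click());
        onView(withId(R.id.note_title)).perform(typeText("Buy milk"));
        closeSoftKeyboard();
        onView(withId(R.id.save_button)).perform(click());

        // Output: the note appears in the list on screen.
        onView(withText("Buy milk")).check(matches(isDisplayed()));
    }
}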

In addition, there are cross-platform UI test frameworks to consider, primarily Appium and Calabash. With Appium, you can choose a language, which is useful if you are writing Ionic or React Native apps because you can use JavaScript as the testing language. Calabash requires that the tests are written in Ruby. Neither of these frameworks provides a recorder to aid in writing tests. Both frameworks are open source projects.

Finally, AWS Device Farm has a feature called Explorer for Android apps. Explorer crawls your UI, finds controls, and interacts with them. It works with login screens, allowing you to validate authenticated sessions as well.

For Ionic and React Native apps, UI tests run against the compiled app, so your UI tests do not include Ionic or React Native code. Instead, you are testing how the app behaves on a device.

Fuzz testing

Unit tests and UI tests ensure that the expected output happens when the expected input is used. However, apps are used by humans, and humans don't always do the expected thing. For that case, you can introduce a random stream of events into your app and see what happens. This is known as monkey testing or fuzz testing. It's a stress test that simulates, for example, what happens when the user randomly presses the screen. This is similar to a UI test, except that you don't need to write any tests because the random events are generated for you.
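AWS Device Farm's built-in fuzz test generates these random events for you. Purely to illustrate the idea, here is a hedged sketch of home-grown fuzzing with Android's UI Automator; the event count is arbitrary, and the fixed seed keeps the run reproducible.

import androidx.test.ext.junit.runners.AndroidJUnit4;
import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.uiautomator.UiDevice;

import java.util.Random;

import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(AndroidJUnit4.class)
public class RandomTapFuzzTest {

    @Test
    public void randomTaps() {
        UiDevice device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation());

        // A fixed seed makes the "random" session repeatable for debugging.
        Random random = new Random(42L);

        // Tap 500 random points on the screen and let crash reporting catch failures.
        for (int i = 0; i < 500; i++) {
            int x = random.nextInt(device.getDisplayWidth());
            int y = random.nextInt(device.getDisplayHeight());
            device.click(x, y);
        }
    }
}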

It is also useful to record a fuzz test run so that it can be replayed later to reproduce any issue found, to verify that the issue was fixed, and to scale testing across more devices. Fuzz testing becomes far more powerful when you take testing to the cloud with AWS Device Farm, which lets you test on many more devices than you may have access to locally.

Performance testing

Gather performance metrics while running the UI and fuzz tests to ensure that your application does not consume excessive resources on the device. Such performance metrics include:

  • Battery drain and energy usage.
  • Appropriate usage of the GPS and other features that drain battery.
  • Network bandwidth usage.
  • Memory usage.

This data can be gathered during development using a profiling tool such as Android Monitor (built into Android Studio) or Instruments (built into Xcode). During this process, you can also use network shaping, which simulates lower bandwidth network connections (for example, 3G connections) or spotty Wi-Fi connections. This enables you to experience adverse network conditions as your users would and ensure an appropriate user experience. If you are developing a mobile game, you can also measure the FPS (frames per second) for your game when the device is stressed (low memory or restricted network capabilities).

For performance testing during the test phase (and beyond), use application performance monitoring (APM) tools such as New Relic Mobile or Splunk MINT.

Integration (end-to-end) testing

After you test your mobile app in isolation, you test it with a mobile backend. This is generally known as integration testing or end-to-end testing. Take the same UI and fuzz tests that we already described, and then recompile your app to run against a live cloud backend. Many organizations maintain a separate test environment for this purpose.

If you are upgrading the mobile app (rather than releasing a new app) and the upgrade involves a change to the backend resources – the database schema or API responses – then you should also test the upgrade scenarios during integration testing. Users do not upgrade their mobile apps immediately, so multiple versions of your app will use the same cloud backend resources (even if only the database is affected).

If you are using a third-party cloud service (for example, a Weather API), then make sure you check their rules for throttling.  Test failures can result if the third-party cloud service detects you are making too many API calls.

Pre-launch or pre-submission testing

You are just about to launch your app to the public, and you've done all the appropriate testing on the latest devices and OS versions. There is one final test before you submit your app to the app store: run the UI and fuzz tests on as large a pool of device and OS combinations as you possibly can.

An enterprise usually has the luxury of dictating a support matrix of devices. If you are producing a mobile app for the general public, the situation is a little more complex. OpenSignal estimates that there are over 24,000 distinct Android devices in the world, over 18,000 of which were used in the last year. Statistics from Google indicate that API level 19 (which is a few years old at this point) has only reached 73% of consumers. The iOS device types are a little more limited – there are only a handful of models, and iOS devices tend to be kept up to date. That still leaves numerous device/OS combinations.

One option is to maintain a device farm of your own, buying one of each device. However, that is expensive to build and maintain. A better option is to rent the devices you need for the pre-production run and let someone else worry about maintaining the latest device and OS versions. AWS Device Farm helps with this problem as well, running your UI and fuzz tests across a wide variety of devices and charging only for what you use.

Canary (post-production) testing

After your app is in production, use a canary test to ensure that the mobile app and the backend are running harmoniously. A canary test is a set of UI tests that run using the same mobile app that you distributed to your users and using the same (production) services on the backend.

You can generate a (different, smaller) test suite for an individual (canary) user and run this test suite on AWS Device Farm on a schedule to implement canary testing.
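One way to put that schedule into practice is to trigger the canary run programmatically, for example from a cron job or CI server. The sketch below is a hedged example assuming the AWS SDK for Java (v1) Device Farm client; every ARN shown is a placeholder for your own project, app upload, canary test package, and device pool.

import com.amazonaws.regions.Regions;
import com.amazonaws.services.devicefarm.AWSDeviceFarm;
import com.amazonaws.services.devicefarm.AWSDeviceFarmClientBuilder;
import com.amazonaws.services.devicefarm.model.ScheduleRunRequest;
import com.amazonaws.services.devicefarm.model.ScheduleRunResult;
import com.amazonaws.services.devicefarm.model.ScheduleRunTest;

public class CanaryRun {

    public static void main(String[] args) {
        // The Device Farm API endpoint lives in us-west-2.
        AWSDeviceFarm client = AWSDeviceFarmClientBuilder.standard()
                .withRegion(Regions.US_WEST_2)
                .build();

        // All ARNs below are placeholders for resources you create beforehand.
        ScheduleRunRequest request = new ScheduleRunRequest()
                .withName("nightly-canary")
                .withProjectArn("arn:aws:devicefarm:us-west-2:123456789012:project:EXAMPLE")
                .withAppArn("arn:aws:devicefarm:us-west-2:123456789012:upload:APP-EXAMPLE")
                .withDevicePoolArn("arn:aws:devicefarm:us-west-2:123456789012:devicepool:EXAMPLE")
                .withTest(new ScheduleRunTest()
                        .withType("APPIUM_JAVA_TESTNG")
                        .withTestPackageArn("arn:aws:devicefarm:us-west-2:123456789012:upload:TESTS-EXAMPLE"));

        ScheduleRunResult result = client.scheduleRun(request);
        System.out.println("Scheduled canary run: " + result.getRun().getArn());
    }
}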

Best practices in testing

You don’t need to run all tests all the time.  The following are some best practices:

  • Architect your application so that the mobile application and backend can be tested independently.  Compile your app with “stub” methods that simulate the cloud services, or mock the cloud services with frameworks like Mockito (Android) or Cuckoo (Swift); a minimal mocking sketch follows this list.
  • All changes to the code base of your mobile app or backend should include appropriate unit tests or UI tests to check the new functionality.
  • Run unit tests with every build and UI tests on an emulator or simulator before checking in your code change.
  • Run a complete set of UI and fuzz tests on a pre-defined set of the most popular devices for your users on a regular basis. It is ideal to do this as part of a continuous integration pipeline (for example, using Jenkins). At minimum, you should run these tests on a nightly basis.  Monitor for application crashes and test failures.
  • Run your UI tests on as many devices as possible at intervals throughout the development process. At minimum, run a full set of tests on as many devices as possible before release.  Any test failures or application crashes should generate bugs to be fixed by the engineers.
  • Include enough information in your bugs to reproduce the failure. Include the test case, device and OS combination, whether the test was against the cloud services or the stubs, and video or screen captures of the test failure.
  • Analyze the test results.  A single test failure is useful.  Knowing that the same error or crash occurred on a set of devices with a common characteristic is more useful to the developers who need to diagnose the problem.
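As referenced in the first bullet, the sketch below shows the mocking idea with Mockito. The NotesBackend interface and NotePresenter class are hypothetical stand-ins for a cloud service and the code under test.

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class NotePresenterTest {

    // Hypothetical interface standing in for the cloud backend.
    interface NotesBackend {
        String fetchNoteTitle(String noteId);
    }

    // Hypothetical class under test: formats a title fetched from the backend.
    static class NotePresenter {
        private final NotesBackend backend;

        NotePresenter(NotesBackend backend) {
            this.backend = backend;
        }

        String displayTitle(String noteId) {
            return backend.fetchNoteTitle(noteId).toUpperCase();
        }
    }

    @Test
    public void formatsTitleFromMockedBackend() {
        // The real service is never called; the mock supplies a canned response.
        NotesBackend backend = mock(NotesBackend.class);
        when(backend.fetchNoteTitle("42")).thenReturn("Buy milk");

        NotePresenter presenter = new NotePresenter(backend);

        assertEquals("BUY MILK", presenter.displayTitle("42"));
        verify(backend).fetchNoteTitle("42");
    }
}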

AWS Device Farm is a mobile app testing service that lets you test and interact with your Android, iOS, and web apps on many devices at once. It enables you to capture video, screenshots, logs, and performance data to pinpoint and fix issues before shipping your app.  You can use it to automate a large portion of the suggested mobile app testing capabilities described in this article.

With these best practices, you are on your way to producing a quality product that can be enjoyed by the maximum number of users.

Automated Device Testing with AWS Device Farm and Jenkins

A strong indicator of an app's quality is its ability to run consistently and predictably on devices of many different makes, models, and manufacturers. Incremental development models are at the core of every mobile app development team, and a seemingly benign feature can be disruptive to the overall user experience and functioning of the app if it is not thoroughly tested. How do you ensure that every code check-in triggers tests for your mobile app against numerous devices and provides detailed analysis of the results?

Continuous integration (CI) systems have been around for a long time and are now at the core of all DevOps workflows. They integrate build, test, and distribution into one workflow. Many CI systems exist, including cloud solutions like AWS CodePipeline and on-premises solutions like Jenkins. Jenkins is an open source CI solution that is easy to install and portable across all major platforms. It has more than 1,000 plugins that integrate with build, test, and distribution components, which makes it a popular choice among teams looking for a CI tool for a new project.

AWS Device Farm is an app testing service. You can use it to test and interact with your Android, iOS, and web apps on physical phones and tablets that are hosted by AWS. You can use Device Farm for automated testing with a variety of available testing frameworks. You can also use it for manual testing through a remote session.

Today, AWS Device Farm announced the availability of a Jenkins plugin for AWS Device Farm.  The Jenkins plugin enables you to integrate device testing into your CI workflow.

Prepare your AWS account so that Jenkins can access Device Farm resources:

  1. Sign in to the IAM console.
  2. Create or modify an IAM policy so that your Jenkins instance has access to AWS Device Farm.
  3. Create a user for your Jenkins master (with programmatic access) and assign the user to the IAM policy you just created.
  4. Note the access key ID and secret access key. You need them later.

Add the Jenkins plugin to your Jenkins master:

  1. Sign in to your Jenkins web UI.
  2. Click Manage Jenkins > Manage Plugins on the left side of the screen.
  3. Click the Available tab.
  4. Type aws-device-farm in the search box.
  5. Click Install without restart.
  6. After the installation is complete, click Manage Jenkins > Configure System.
  7. Scroll to the AWS Device Farm header.
  8. Paste the access key ID and secret access key into the appropriate boxes.
  9. Click Save.

You can also use the Jenkins CLI to manage plugins.  Now that the Jenkins plugin for AWS Device Farm is available on your Jenkins master, you can use it in a workflow:

  1. Select the project for which you want to set up the plugin.
  2. Add a post-build step: Run Tests on AWS Device Farm.
  3. Configure any options you want for the run. At minimum, select the UI tests that you want to run and the devices you want to run those tests on.
  4. Save the changes for the project.

AWS Device Farm supports major testing frameworks such as Espresso, XCTest UI, Appium, and Calabash.

After a build and test flow is complete, you can see the results of the tests:

Integrating AWS Device Farm into your automated build pipeline enables you to test your mobile applications on more devices, producing a better quality mobile app.  As always, we want to hear from you.  If you have suggestions or issues, let us know on the AWS Forums.

Test User Interfaces in iOS Apps with XCTest UI and AWS Device Farm

by Asha Chakrabarty

With AWS Device Farm, you can quickly start testing your Android, iOS, and FireOS apps on real devices in the AWS Cloud. Choose to start an interactive session with a device or run automated tests on many devices at once. AWS Device Farm will provide the results of your tests including pass/fail status, logs, performance metrics, screenshots, and videos.

Introduction to XCTest UI

As of Xcode 7, UI testing capabilities (XCUI) are integrated into the Xcode IDE. This functionality allows you to find and interact with UI elements in your app and confirm their properties and states. A few new features make it possible for you to programmatically test and exercise the UI of an iOS app:

  • New Xcode target type for UI tests:

To set up a UI test in Xcode, you create an Xcode target with the iOS UI Testing Bundle template. This target type fulfills the special requirements of UI tests, including launching a proxy for the application in a separate process and providing accessibility permissions.

  • UI testing APIs include three key classes:

    • XCUIElementQuery: The mechanism for locating UI elements. Every XCUIElement is backed by a query, and that query must resolve to a unique element when the element is used.
    • XCUIElement: A proxy object for a UI element in your app, represented as a type (for example, a cell or a button).
    • XCUIApplication: A proxy for your application that serves as the starting point for finding UI elements.
  • UI recording:

UI recording allows you to record interactions with your app’s user interface. Xcode transforms these interactions into source code that you can include in your existing tests or use to create new tests.

AWS Device Farm now allows you to run the UI Testing feature incorporated in Xcode 7 on real devices in the AWS Cloud. In this post, we will walk you through how to create an XCTest UI test, package it for testing on AWS Device Farm, schedule a run, and view test results from real devices in the cloud.

Prerequisites

  • You’ll find the sample iOS app used in this post on AWS Labs on GitHub.
  • UI Testing was introduced in Xcode 7 and iOS 9, so be sure to update accordingly.
  • iOS devices must be enabled for development and connected to a host running Xcode.
  • It is assumed that you have created the .ipa file for the sample iOS app before you schedule a run in AWS Device Farm.

Step 1: Create a UI Test for the AWS Sample iOS App

After you have downloaded and opened the sample app in Xcode, build the project. After the build is successful, you will create a new target for the UI tests using the iOS UI Testing Bundle template.

Your project navigator should look like the following:

With UI testing, you can record interactions within your app and Xcode will write the code required to re-enact those interactions in your test. You will still need to use XCTAssert to add your test assertions. You can record interactions with your UI by pressing the record button (the small red dot at the bottom left corner of the editor pane).

Copy the following UI test code to your AWSDeviceFarmiOSReferenceAppUITests.m implementation file.

#import <XCTest/XCTest.h>

@interface AWSDeviceFarmiOSReferenceAppUITests : XCTestCase

@end

@implementation AWSDeviceFarmiOSReferenceAppUITests

- (void)setUp {
    
    [super setUp];
    self.continueAfterFailure = NO;
    [[[XCUIApplication alloc] init] launch];
    
}


- (void)tearDown {
    
    [super tearDown];
}


- (void)testNativeInput {
    
    XCUIApplication *app = [[XCUIApplication alloc] init];
    XCUIElementQuery *tabBarsQuery = app.tabBars;
    [tabBarsQuery.buttons[@"Native"] tap];
    
    XCUIElementQuery *collectionViewsQuery = app.collectionViews;
    [collectionViewsQuery.staticTexts[@"Table of elements"] tap];
    [app.navigationBars[@"ElementsTableView"].buttons[@"Menu"] tap];
    [collectionViewsQuery.staticTexts[@"Scrolling View"] tap];
    [app.navigationBars[@"Scrolling View"].buttons[@"Menu"] tap];
    [tabBarsQuery.buttons[@"Home"] tap];
    
}


- (void)testNestedView {
    
    
    XCUIApplication *app = [[XCUIApplication alloc] init];
    XCUIElementQuery *tabBarsQuery = app.tabBars;
    [tabBarsQuery.buttons[@"More"] tap];
    [app.staticTexts[@"Nested"] tap];
    
    XCUIElement *moreNavigationBar = app.navigationBars[@"More"];
    XCUIElement *nextButton = moreNavigationBar.buttons[@"Next"];
    [nextButton tap];
    [nextButton tap];
    [nextButton tap];
    
    XCUIElement *backButton = [[[moreNavigationBar childrenMatchingType:XCUIElementTypeButton] matchingIdentifier:@"Back"] elementBoundByIndex:0];
    [backButton tap];
    [backButton tap];
    [backButton tap];
    [moreNavigationBar.buttons[@"More"] tap];
    [tabBarsQuery.buttons[@"Home"] tap];
    
}


- (void)testAlertControl {
    
    
    XCUIApplication *app = [[XCUIApplication alloc] init];
    XCUIElementQuery *tabBarsQuery = app.tabBars;
    [tabBarsQuery.buttons[@"More"] tap];
    [app.staticTexts[@"Alerts"] tap];
    [app.buttons[@"Modal"] tap];
    [app.buttons[@"OK"] tap];
    [app.buttons[@"Alert"] tap];
    [app.alerts[@"Alert"].collectionViews.buttons[@"OK"] tap];
    [app.navigationBars[@"More"].buttons[@"More"] tap];
    [tabBarsQuery.buttons[@"Native"] tap];
    [app.collectionViews.staticTexts[@"Image Gallery"] tap];
        
}

@end

Before packaging your test for AWS Device Farm, be sure to build the project with Xcode. Use the Product > Build For > Running option and select an iOS device as the destination. Keep in mind that the Build Active Architecture Only setting should be the same for your app target and your UI test target.

Step 2: Package Your Test for AWS Device Farm

When packaging your test for upload to AWS Device Farm, make sure your XCTest UI test runner bundle is contained in a correctly formatted .ipa file. For more information, see the AWS Device Farm documentation. You can also see an example of creating the .ipa file on AWS Labs on GitHub.

Make sure that you package your test in the Payload folder under debug-iphoneos as shown here. In the following screenshot, we renamed the resulting zip file of the Payload folder to UITest.ipa for easier file management.

Step 3: Schedule a Run in AWS Device Farm

Sign in to the AWS Device Farm console and create a run under a new or existing project. Upload the .ipa file of the sample app.

In the next step, you will choose XCTest UI as the test type and upload the .ipa file you created in step 2.

Select the devices on which you’d like to test. If you like, you can create a new device pool for your test run to reuse in subsequent runs.

Finally, review and start the test run.

Step 4: View XCTest UI Test Results

When your test run is complete, you will see the test results summary.

Choose a device to examine its test suites. Here we are reviewing the test suite results for an Apple iPhone 5s and Apple iPhone 6 device.

The results for each device will contain screenshots, video recordings, performance data, and log files that can be downloaded for further review.

Conclusion

We are always happy to hear your feedback. Feel free to leave your feedback, including questions, in the comments or on our developer forum.

Happy testing!

Test iOS apps on AWS Device Farm using Appium – Part 3: Upload your iOS Application and TestNG tests to AWS Device Farm

by Asha Chakrabarty

With AWS Device Farm, you can quickly get started testing your Android, iOS, and FireOS apps on real devices in the AWS cloud. Simply upload your app, choose your test framework and the devices you want to test your app on, and start your test run. AWS Device Farm will provide the results of your tests including pass / fail status, logs, performance characteristics, and screenshots.

Previously in Part 1 and Part 2 of this series, you set up your environment, created and validated your TestNG tests, and packaged your TestNG tests for upload to AWS Device Farm. In Part 3, you will upload the sample iOS app (the unsigned .ipa file you downloaded as part of the prerequisites section in Part 1) and the TestNG package you created in Part 2.

Upload the sample iOS app and TestNG suite to AWS Device Farm

We now have the sample iOS app (ToDoList.ipa) you downloaded and the TestNG package (zip-with-dependencies.zip) that contains our test code and test dependencies. To begin testing iOS apps on AWS Device Farm, you will need an .ipa (iOS app archive) file, which you will upload to AWS Device Farm. Make sure that when you create your own iOS apps, the .ipa file is built for an iOS device and not for a simulator. You do not need to add any UDIDs to your provisioning profile because Device Farm re-signs your app automatically (see the FAQ for more information).

Follow the Getting Started steps to begin testing with AWS Device Farm. You will upload the iOS app (.ipa file), configure your test type (choose Appium with TestNG), and upload the zip-with-dependencies file that was created as part of the Maven build in Part 2 of this series.

Log into your AWS Account and navigate to the AWS Device Farm console under the Mobile Services category.

Create a new project if you do not have an existing one. A project is a logical workspace for your test runs.

Once you have created a project, click on it and create a run. A run represents a test of a specific build of your app, with a specific set of tests, to be run on a specific set of devices. Each test run will produce a report containing information about the results of the run. 

Browse to the location of the ipa file you downloaded in Part 1 and upload.

Once the app has successfully uploaded, proceed to configure the test type. For this example, we will select Appium Java TestNG. Note that the Appium Java TestNG framework is currently in preview mode. 

Upload the zip-with-dependencies zip file that was created in Part 2 of this series. This can be found in the “target” folder of the Maven project in the workspace location where you have stored your Eclipse projects.

Select the devices that are compatible with the app for testing. For this example, we will use a curated list of devices called Top Devices that AWS Device Farm has pre-selected for us. You have the option of selecting other devices by creating a new device pool.

The next step prompts you to specify device states to override any device settings. This allows you to provide additional data and/or install additional apps for AWS Device Farm to use during the run; specify whether Wi-Fi, Bluetooth, GPS, or NFC will be enabled during the run; preset the device latitude and longitude for the run; and preset the device locale for the run.

Choose "Review and start run." On the Review and start run page, choose "Confirm and start run" once you have confirmed all details. When your tests have completed, you will be able to drill down into each device to view the results.

If we click on the first device (Apple iPhone 5c), we can see that our TestNG test suites passed successfully and how much time it took for each to complete.

Based on the TestNG test suite we created in Part 1, you can see that each method in the "com.aws.devicefarm.example.appiumiostest.SampleAppiumTestNGTest" package has been executed. Further, if you click on the package name, you will see the four test cases that were created using the Appium Inspector. For each test case, you can view associated screenshots and download log files.

When you click on the "Screenshots" tab, any screenshots that were included in "test01" will be seen.

When you go to the "Files" tab, you will see a list of log files that have been generated for the "test01" test. You can click on each log file and download it for review.

Summary

By utilizing AWS Device Farm, you can test your iOS apps across multiple devices in parallel and get results in minutes to quickly understand your app’s behavior on popular devices. Now that you have an understanding of the steps involved in testing your iOS apps with Appium on AWS Device Farm, I’d highly encourage you to begin testing your own iOS apps. No modifications are required to your app or test code – just package your tests, upload, and start testing!

If you have questions or comments, please reach out to us in the comments section below or on our developer forum. We are happy to receive your feedback on AWS Device Farm and how to make testing iOS apps more seamless for you.

Happy Testing!

Test iOS apps on AWS Device Farm using Appium – Part 2: Packaging your TestNG test suite for AWS Device Farm

by Asha Chakrabarty

With AWS Device Farm, you can quickly get started testing your Android, iOS, and FireOS apps on real devices in the AWS cloud. Simply upload your app, choose your test framework and the devices you want to test your app on, and start your test run. AWS Device Farm will provide the results of your tests including pass / fail status, logs, performance characteristics, and screenshots.

Previously in Part 1, we focused on the prerequisites and creation of the TestNG test suite. In this article, Part 2 of the three-part series, we will walk through how to package your test suite for uploading to AWS Device Farm. Before you upload your test suite to AWS Device Farm, it should be packaged correctly and contain all of the required dependencies; both the tests and their dependencies must be packaged as a zip file. In Part 3, we will upload the package from this post and run the tests on AWS Device Farm.

Package your TestNG Test Suite

Create a new Maven project in Eclipse using the quickstart archetype (maven-archetype-quickstart). It is recommended to match the Group Id / Artifact Id with the root package as shown in the sample pom.xml. For this sample, our package name is com.aws.devicefarm.example.appiumiostest.

Copy the TestNG class you created in Part 1 into your Maven project under src/test/java.

Save the following XML assembly definition to src/main/assembly/zip.xml. When configured in the pom.xml, this assembly instructs Maven to build a .zip file containing everything in the root of your build output directory plus the dependency-jars directory.

	<assembly
	    xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0"
	    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	    xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0 http://maven.apache.org/xsd/assembly-1.1.0.xsd">
	  <id>zip</id>
	  <formats>
	    <format>zip</format>
	  </formats>
	  <includeBaseDirectory>false</includeBaseDirectory>
	  <fileSets>
	    <fileSet>
	      <directory>${project.build.directory}</directory>
	      <outputDirectory>./</outputDirectory>
	      <includes>
	        <include>*.jar</include>
	      </includes>
	    </fileSet>
	    <fileSet>
	      <directory>${project.build.directory}</directory>
	      <outputDirectory>./</outputDirectory>
	      <includes>
	        <include>/dependency-jars/</include>
	      </includes>
	    </fileSet>
	  </fileSets>
	</assembly>

Replace your pom.xml with the following (NOTE: Change the groupId, artifactId, and name elements to match that of your Maven project).

	<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	  <modelVersion>4.0.0</modelVersion>

	  <groupId>com.aws.devicefarm.example</groupId>
	  <artifactId>appiumiostest</artifactId>
	  <version>0.0.1-SNAPSHOT</version>
	  <packaging>jar</packaging>
	  <name>appiumiostest</name>
	  <url>http://maven.apache.org</url>

	  <properties>
	    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
	  </properties>

	  <dependencies>
	    <dependency>
	      <groupId>org.testng</groupId>
	      <artifactId>testng</artifactId>
	      <version>6.8.8</version>
	      <scope>test</scope>
	    </dependency>
	    <dependency>
	      <groupId>org.seleniumhq.selenium</groupId>
	      <artifactId>selenium-java</artifactId>
	      <version>2.42.2</version>
	    </dependency>
	    <dependency>
	      <groupId>io.appium</groupId>
	      <artifactId>java-client</artifactId>
	      <version>1.4.0</version>
	    </dependency>
	  </dependencies>

	  <build>
	    <plugins>
	      <plugin>
	        <groupId>org.apache.maven.plugins</groupId>
	        <artifactId>maven-jar-plugin</artifactId>
	        <version>2.6</version>
	        <executions>
	          <execution>
	            <goals>
	              <goal>test-jar</goal>
	            </goals>
	          </execution>
	        </executions>
	      </plugin>
	      <plugin>
	        <groupId>org.apache.maven.plugins</groupId>
	        <artifactId>maven-dependency-plugin</artifactId>
	        <version>2.10</version>
	        <executions>
	          <execution>
	            <id>copy-dependencies</id>
	            <phase>package</phase>
	            <goals>
	              <goal>copy-dependencies</goal>
	            </goals>
	            <configuration>
	              <outputDirectory>${project.build.directory}/dependency-jars/</outputDirectory>
	            </configuration>
	          </execution>
	        </executions>
	      </plugin>
	      <plugin>
	        <artifactId>maven-assembly-plugin</artifactId>
	        <version>2.5.4</version>
	        <executions>
	          <execution>
	            <phase>package</phase>
	            <goals>
	              <goal>single</goal>
	            </goals>
	            <configuration>
	              <finalName>zip-with-dependencies</finalName>
	              <appendAssemblyId>false</appendAssemblyId>
	              <descriptors>
	                <descriptor>src/main/assembly/zip.xml</descriptor>
	              </descriptors>
	            </configuration>
	          </execution>
	        </executions>
	      </plugin>
	    </plugins>
	  </build>
	</project>

Build, package, and verify your Maven project using the following command.

mvn clean package -DskipTests=true

The Maven project structure should look similar to the following. Note the zip-with-dependencies file created under the “target” folder; it contains our test code with all dependencies:

Summary

Now that we have packaged our tests and have a sample iOS app to test, in Part 3 we will upload both to AWS Device Farm to test across multiple devices and receive test results within minutes. Feel free to post any suggestions, comments or questions in the comments section below or in our developer forum. We look forward to hearing your feedback.

Test iOS apps on AWS Device Farm using Appium – Part 1: Prerequisites, Environment Set Up, and Test Creation

by Asha Chakrabarty

With AWS Device Farm, you can quickly get started testing your Android, iOS, and FireOS apps on real devices in the AWS cloud. Simply upload your app, choose your test framework and the devices you want to test your app on, and start your test run. AWS Device Farm will provide the results of your tests including pass / fail status, logs, performance metrics, and screenshots.

The objective of this three-part series is to walk through step-by-step how to get started testing an iOS app with AWS Device Farm. The test automation framework we will be using is Appium and we will write our test cases using the TestNG framework. We will not be building an iOS app as part of this walkthrough. However, to follow along with this post, a sample iOS app is available for download here.

At the end of this series, you will have the knowledge required to test an iOS app with AWS Device Farm using Appium and TestNG.

Prerequisites

For this example, the tools and versions listed below were used. It is assumed you have these tools installed on your machine beforehand and that you have an AWS Account.

  • Java 8
  • Eclipse IDE (Mars was used for this tutorial)
  • m2e Maven Integration for Eclipse
  • Xcode 7.0. We will use a sample, unsigned iOS app called “ToDoList.ipa” that you can download here.
  • Appium 1.4.8. If you have not worked with Appium before, please review the Appium documentation before moving ahead. We will be using the Appium GUI for this tutorial; please install the latest version of the Appium GUI.
  • Apache Maven 3.3.3 (or the latest version of Maven)
  • AWS account and IAM user creation. Follow the steps under the Setting Up section of the AWS Device Farm documentation.

A preamble on Appium

Appium is a test automation framework that executes your test suite against both iOS and Android apps without requiring you to modify your app or create separate test suites for each platform. Appium provides this capability by building on Selenium WebDriver, which follows a client-server protocol: the client running your tests sends commands to the Appium server over HTTP, and the server executes them and returns a response to the client via WebDriver.

Appium executes your tests as described below:

  1. Tests for an iOS or Android app are written in a WebDriver-supported programming language. Appium provides client-side libraries for many languages. 
  2. An Appium server is launched after installation and configuration on the test machine.
  3. The tests are initiated on a client machine.
  4. Appium will begin receiving commands from the client (the machine running your tests) and execute those commands (your test emulating user actions with your app) on an emulator or real device.
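Most of that plumbing stays out of your test code. For a local run, the client side amounts to creating a WebDriver session against the Appium server; the sketch below is illustrative only, and the capability values and app path are assumptions for the ToDoList sample.

import java.net.URL;

import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

public class LocalAppiumSession {

    public static void main(String[] args) throws Exception {
        // Capabilities for a local iOS session; the values here are placeholders.
        DesiredCapabilities capabilities = new DesiredCapabilities();
        capabilities.setCapability("platformName", "iOS");
        capabilities.setCapability("platformVersion", "9.0");
        capabilities.setCapability("deviceName", "iPhone 6");
        capabilities.setCapability("app", "/path/to/ToDoList.app");

        // The Appium server listens on port 4723 by default.
        RemoteWebDriver driver = new RemoteWebDriver(
                new URL("http://localhost:4723/wd/hub"), capabilities);
        try {
            System.out.println("Session started: " + driver.getSessionId());
            // Commands such as findElement(...) would go here.
        } finally {
            driver.quit();
        }
    }
}

Note that in the TestNG suite later in this post, the capabilities are left empty; when the tests run on AWS Device Farm, the service supplies the device and app for you.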

Now that we have walked through the prerequisites and an overview of Appium, let’s get started building out our test cases for the sample iOS app you have downloaded.

Create a Java project in Eclipse

Launch Eclipse and create a new Java project to begin writing your test cases by selecting File -> New -> Java Project.

Create two source folders: src/main/java and src/test/java. Within each source folder, create a package in which you will eventually create your TestNG test class. Your project structure should look similar to the following.

Configure the build path for each source folder by right-clicking the source folder and selecting ‘Configure Build Path.’ Ensure you have selected the Java Build Path option from the left-hand menu and that you are on the Source tab. Check the ‘Allow output folders for source folders’ option.

  1. For src/main/java -> the output folder is target/classes.
  2. For src/test/java -> the output folder is target/test-classes.

The build paths for your source folders should look similar to the folders in the image below.

Add external libraries for Selenium WebDriver, TestNG, and Appium Java Client

Selenium

Download the Selenium Client & WebDriver Language Bindings for the language of your choice. For this tutorial, we will be using Java.  Downloads are available at: http://docs.seleniumhq.org/download/.

  1. Once you have saved and unzipped the Selenium dependencies to your machine, right-click the Java project created in Step 1 and go to Properties.  
  2. Select Java Build Path in the left-hand menu and ensure you are on the Libraries tab. Select ‘Add External JARS’ and navigate to the location of your Selenium jar files. 
  3. Select the 2 Selenium jar files and the jar files in the folder ‘libs’ and click ‘Apply.’

TestNG

To install TestNG in Eclipse, go to Help -> Eclipse Marketplace -> Search for TestNG and Install.

Once you have installed TestNG in Eclipse, proceed below:

  1. Right click on the project created in Step 1 and go to Properties. 
  2. Select Java Build Path in the left-hand menu and ensure you are on the Libraries tab. Select the ‘Add Library’ button on the right.
  3. Select TestNG -> Next -> Finish.
  4. Select ‘Apply’ and then click OK.

Appium client for Java

Download the Appium client library for Java and add the jar file to the project you created in Step 1. Follow steps as performed when adding the Selenium jar files.

Once all referenced libraries have been added to your Java project, your project structure should look similar to the image below.

Create the TestNG test suite

Create a Java class called "SampleAppiumTestNGTest" under src/test/java for your TestNG test code, and paste in the following sample TestNG code:

package com.aws.devicefarm.example.appiumiostest;

import org.testng.annotations.Test;
import java.io.File;
import java.net.URL;
import org.openqa.selenium.By;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;

public class SampleAppiumTestNGTest {

    private static RemoteWebDriver driver = null;

    public boolean takeScreenshot(final String name) {
        String screenshotDirectory = System.getProperty("appium.screenshots.dir", System.getProperty("java.io.tmpdir", ""));
        File screenshot = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
        return screenshot.renameTo(new File(screenshotDirectory, String.format("%s.png", name)));
    }

    @BeforeMethod
    public void setUp() throws Exception {
        DesiredCapabilities capabilities = new DesiredCapabilities();
        URL url = new URL("http://localhost:4723/wd/hub");
        driver = new RemoteWebDriver(url, capabilities);
    }

    @Test
    public void test01() throws InterruptedException {
        driver.findElement(By.name("Add")).click();
        driver.findElement(By.xpath("//UIAApplication[1]/UIAWindow[1]/UIATextField[1]")).sendKeys("Complete Taxes");
        driver.findElement(By.name("Save")).click();
        driver.findElement(By.xpath("//UIAApplication[1]/UIAWindow[1]/UIATableView[1]/UIATableCell[4]")).click();
        takeScreenshot("screenshot1");
    }

    @Test
    public void test02() throws InterruptedException {
        driver.findElement(By.name("Add")).click();
        driver.findElement(By.xpath("//UIAApplication[1]/UIAWindow[1]/UIATextField[1]")).sendKeys("Book tickets for vacation");
        driver.findElement(By.name("Save")).click();
        takeScreenshot("screenshot2");
    }

    @Test
    public void test03() throws InterruptedException {
        driver.findElement(By.name("Add")).click();
        driver.findElement(By.xpath("//UIAApplication[1]/UIAWindow[1]/UIANavigationBar[1]/UIAButton[1]")).click();
    }

    @Test
    public void test04() throws InterruptedException {
        driver.findElement(By.xpath("//UIAApplication[1]/UIAWindow[1]/UIATableView[1]/UIATableCell[1]")).click();
        driver.findElement(By.xpath("//UIAApplication[1]/UIAWindow[1]/UIATableView[1]/UIATableCell[1]")).click();
        driver.findElement(By.xpath("//UIAApplication[1]/UIAWindow[1]/UIATableView[1]/UIATableCell[1]")).click();
    }

    @AfterMethod
    public static void tearDownClass() {
        if (driver != null) {
            driver.quit();
        }
    }
}

To capture the user interface elements and interactions in the TestNG test suite, we used the Appium Inspector. To do this yourself, launch the Appium GUI and open the iOS Settings.

  1. For the App Path, navigate to the location of your .app file. 
  2. Ensure you check the Force Device option and choose a device.
  3. Ensure a Platform Version is selected.
  4. Click the Launch button.
  5. Click the icon with the magnifying glass to launch the Appium Inspector.
  6. Begin recording your tests.

NOTE:  Appium versions 1.4.11 and later support Xcode 7 with iOS 9 simulators. To use the Appium Inspector for capturing elements in this case, you will need to point the Appium client GUI to the latest Appium version. To do this:

  1. Clone the Appium repository (https://github.com/appium/appium.git) to your local machine.
  2. Once the clone is complete, run reset.sh.
  3. Open the Appium client (GUI).
  4. Select “Developer settings” and enable it.
  5. Set Use External NodeJS Binary to your local node bin folder path.
  6. Set Use External Appium Package to your local Appium clone folder path.
  7. Launch the Appium server.

Summary

You should now have your environment set up, a sample iOS app that you downloaded, and an Eclipse project with your TestNG tests.  With this you can run your test suite locally with Appium and validate your tests. Join us in Part 2 where we will walk through how to prepare your TestNG test suite for uploading to AWS Device Farm. Finally, in Part 3 we will run the tests we’ve created in the AWS Cloud with AWS Device Farm. We are always interested in your feedback, comments, and questions so please reach out to us in the comments section below or in our developer forum.

 

Set Up Continuous Testing with Appium, AWS CodeCommit, Jenkins, and AWS Device Farm

by Ilya Epshteyn

AWS Device Farm is an app testing service that enables you to test your native, hybrid, and web apps on real Android and iOS devices hosted in the AWS Cloud. A test report containing high-level results, low-level logs, pixel-to-pixel screenshots, and performance data is updated as tests are completed.

The purpose of this walkthrough is to show how AWS Device Farm can be integrated into a continuous testing process to expose application risks with each build, enabling the development team to prevent problems from progressing to the next stage of the software development lifecycle. In this walkthrough, we will be leveraging the Appium automation framework for Android (Java based), AWS CodeCommit as our source control service, a Jenkins server for continuous builds, and AWS Device Farm to test against real mobile devices. 

Prerequisites

Below are the prerequisites for getting started:

Configure your Appium Maven project

In this walkthrough, we will be configuring a Java-based Appium Maven project. If you don’t have an existing Appium Maven project in Eclipse, you can use the following sample project available on GitHub. Simply download the project zip, extract it to a local directory, and import it into Eclipse (File –> Import –> Maven –> Existing Maven Project).

Before going to the next step, make sure you can get a clean Maven build within Eclipse. To create a Maven build, simply right click on pom.xml and select Run As –> Maven build.

In the Edit Configuration dialog, specify “clean install” as Goals and click Run.

A zip-with-dependencies file will be generated in the target sub-folder if the build is successful:

Set up AWS CodeCommit and configure EGit

Create an AWS CodeCommit Repository

First create an AWS CodeCommit repository using the AWS CodeCommit Console or AWS CLI. Note the SSH URL of your repository as this will be needed in a following step:

Configure SSH Authentication for your CodeCommit repository

We will be using SSH credentials for accessing CodeCommit from EGit. Please use the following walkthroughs for setting up SSH credentials on Windows or OS X environments (you may skip Step 2 and Step 4, but make sure to test your SSH credentials in Step 3 before proceeding further).

Create a local clone of your AWS CodeCommit repository

The next step is to create a local clone of your AWS CodeCommit repository to be used with your Appium project. In Eclipse, open the Git Repositories view (Window –>  Show View –>  Other –> Git –> Git Repositories) and click on Clone a Git repository link. In the Clone Git Repository dialog box, select Clone URI option and click Next:

Paste the AWS CodeCommit SSH URL as the URI. The Host and Repository path fields will be auto-populated. Select ssh as the Protocol and click Next.

A warning message will appear on the next screen stating that the Source Git Repository is empty. Click Next to continue. On the next dialog, configure the local storage for the repository and click Finish.

Now right click on your Appium Maven project and select Team –> Share Project.

In the Repository dropdown, select the repository you have just cloned and press Finish.

Right click again on your Appium Maven project and select Team –> Commit.  Add a commit message, select all files and click Commit.

Right click one more time on your Maven project and select Team –> Push branch ‘master’ to push the branch to your AWS CodeCommit repository. Click Next and Finish on the following two dialogs. A Push Results dialog will confirm a successful push to the CodeCommit repository.

You may also log in to the AWS CodeCommit console to confirm that the repository has been updated:

Configure Jenkins

Install Jenkins server

If you don’t already have a Jenkins server, you can install one very quickly and easily on an Amazon Linux AMI using the following instructions:

  1. If setting up a new Jenkins server, create an Amazon EC2 service role and attach the managed policy AWSCodeCommitPowerUser to it. Jenkins will use the permissions granted by the IAM role to access the CodeCommit repositories.
  2. Launch an Amazon Linux AMI and login to your instance.
  3. Update your repositories: sudo yum update
  4. Get Jenkins repository: sudo wget -O /etc/yum.repos.d/jenkins.repo http://pkg.jenkins-ci.org/redhat-stable/jenkins.repo
  5. Get Jenkins repository key: sudo rpm --import http://pkg.jenkins-ci.org/redhat-stable/jenkins-ci.org.key
  6. Install Jenkins server: sudo yum install jenkins
  7. Start Jenkins server: sudo service jenkins start
  8. Make sure the Jenkins server starts up automatically: sudo chkconfig jenkins on
  9. Open your browser and navigate to http://<Server-IP>:8080. You should see the Jenkins dashboard.
  10. Secure Jenkins server by selecting “Manage Jenkins” on the Jenkins dashboard and click on “Configure Global Security”.  Check “Enable Security” and select the desired Access Control and Authorization. 

Install Jenkins plugins

  1. Login to Jenkins
  2. Navigate to Manage Jenkins, then Manage Plugins, choose the following plugins: aws-device-farm, Git plugin, and Copy To Slave Plugin, and then click Save.
  3. Navigate to the Jenkins Dashboard, select Manage Jenkins, and then Configure System. Scroll down to the Maven section and click Add Maven. Enter any name for the Maven installation and click Save to automatically configure the Maven installation.
  4. Restart Jenkins

Configure the AWS Device Farm Jenkins plugin

Once your Jenkins server is up and running, install the AWS Device Farm Jenkins plugin following these steps. You will then create an IAM user and policy granting all Device Farm permissions, which the plugin will use to submit runs to the AWS Device Farm service.

Configure access to your CodeCommit repository

  • SSH to the Jenkins server and install Git: sudo yum install git
  • If you are using Amazon Linux AMI for your Jenkins server the AWS CLI is already pre-installed. If you are not using Amazon Linux AMI, install AWS CLI.
  • Configure the AWS CLI for the jenkins user. 
   cd ~jenkins
   sudo -u jenkins aws configure

Specify the AWS Access Key ID and AWS Secret Access Key for a user that has permissions to the CodeCommit repository you created earlier. If you are using an EC2 service role, leave the AWS Access Key ID and AWS Secret Access Key blank. Jenkins will use the permissions granted by the IAM role to access the CodeCommit repositories.

   AWS Access Key ID [None]: paste the Access Key ID, and then press Enter
   AWS Secret Access Key [None]: paste the Secret Access Key, and then press Enter
   Default region name [None]: type us-east-1, and then press Enter
   Default output format [None]: type json, and then press Enter
  • Next, configure Git on the Jenkins server to use IAM credentials and an HTTP path to access the repositories hosted by AWS CodeCommit.
   sudo -u jenkins git config --global credential.helper '!aws codecommit credential-helper $@'
   sudo -u jenkins git config --global credential.useHttpPath true
   sudo -u jenkins git config --global user.email "me@mycompany.com"
   sudo -u jenkins git config --global user.name "MyJenkinsServer"
  • Restart Jenkins server

Configure your Jenkins project with CodeCommit and Device Farm

From the Jenkins home page, select New Item. Type a project name and select a Maven project and click OK:

In the Source Code Management section, select Git and paste the HTTP endpoint of your CodeCommit repository. 

You can get the HTTP endpoint of your CodeCommit repository in the AWS CodeCommit console:

For the Build Trigger, select Poll SCM and specify a schedule, for instance H/05 * * * *

In the Build Environment section, select Copy files into the job’s workspace before building and specify the name of the Android application file such as “app-debug.apk." This file needs to be uploaded to $JENKINS_HOME/userContent directory on the Jenkins server before the build is triggered. Alternatively, you can include the application file within your Appium Maven project within Eclipse, depending on your development and testing workflow.

In the Post Steps section, select Run Tests on AWS Device Farm under the Add post-build action dropdown:

In the Run Tests on AWS Device Farm section, specify the following parameters:

  • Project: select the name of the Device Farm Project.
  • Device Pool: select a default Top Devices pool or another device pool that you have created previously.
  • Application: specify the name of the Android application file such as “app-debug.apk” (either included in your Appium Maven Project or copied to the workspace as specified in the Build Environment section above)
  • Choose test to run: select Appium Java TestNG and specify target/zip-with-dependencies.zip which is the Maven build output file containing all of the tests.

Click Save to save your Jenkins project configuration.

Trigger an automatic test run on Device Farm

Make an update to your Appium Maven Project in Eclipse, commit the changes and push the branch to AWS CodeCommit repository. After a few minutes you should see a Jenkins build in progress:

You can also navigate to Console Output to confirm a successful Maven build and a scheduled Device Farm run:

Return to the build status page and click on the Full Report link to automatically login to the AWS Device Farm console to view detailed statistics for the run and artifacts including detailed logs and screenshots.

Conclusion

In this tutorial we demonstrated how AWS Device Farm can be integrated into your Continuous Testing strategy. First, we showed how an Appium Maven project can be pushed to AWS CodeCommit repository. Then, by leveraging Jenkins, we configured a project that automatically triggers a Maven build and submits Appium tests to Device Farm for testing against hundreds of real mobile devices. If you have questions, please let us know in the comments section or in our developer forum.

Get started with the AWS Device Farm CLI and Calabash Part 2: Retrieving Reports and Artifacts

by Ilya Epshteyn

In Part 1 of the walkthrough, we went through the process of creating an AWS Device Farm run for Calabash-Android test scripts.  In this second and final part of the series, we will go through the CLI commands for obtaining the status of a run and retrieving all of the artifacts associated with it, including log files and screenshots.

Device Farm Run Components

Before we begin, let’s review the hierarchical structure of a Device Farm run:

  • RUN – A run in Device Farm represents a specific build of your app, with a specific set of tests, to be run on a specific set of devices. A run produces a report that contains information about the test execution. A run contains one or more jobs.
  • JOB – A job is a request to test a single app against a single device. A job contains one or more suites.
  • SUITE – A suite is a hierarchical organization of tests in a test package. A suite contains one or more tests.
  • TEST – A test is an individual test within a suite. This is the most granular entity in Device Farm’s data hierarchy.

 

Retrieving Reports and Artifacts

Using Device Farm CLI, we can obtain the status and result counters for each level of the run hierarchy including run, job, suite, and test. As you’ll see, the data available for each level of the hierarchy is very similar, with the scope of the data becoming more granular as you navigate deeper in the hierarchy.

Step 1: Retrieve the Status and Result Counters for a Device Farm Run

Use the get-run CLI command to retrieve the status and result counters of the run. 

[ec2-user]$  aws devicefarm get-run --arn arn:aws:devicefarm:us-west-2:705582597265:run:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917

Output:

{
    "run": {
        "status": "COMPLETED",
        "name": "firstCLIRun",
        "created": 1443331615.166,
        "totalJobs": 101,
        "completedJobs": 101,
        "platform": "ANDROID_APP",
        "result": "ERRORED",
        "billingMethod": "METERED",
        "type": "CALABASH",
        "arn": "arn:aws:devicefarm:us-west-2:705582597265:run:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917",
        "counters": {
            "skipped": 37,
            "warned": 16,
            "failed": 46,
            "stopped": 0,
            "passed": 514,
            "errored": 66,
            "total": 679
        }
    }
}

The above Run has a total of 101 jobs corresponding to 101 unique devices, all of which have completed.  This Run also has 679 individual tests with 46 failures and 66 errors.
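
If you only want the roll-up counters rather than the full response, the AWS CLI --query option can filter the output. For example, using the same run ARN:

    [ec2-user]$  aws devicefarm get-run --arn arn:aws:devicefarm:us-west-2:705582597265:run:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917 --query 'run.counters'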


Step 2: Retrieve the Status and Result Counters for All Jobs within a Run

Use the list-jobs CLI command to list all jobs within a run along with job-level status and result counters.

[ec2-user]$  aws devicefarm list-jobs --arn arn:aws:devicefarm:us-west-2:705582597265:run:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917

Output (partial):

{
    "jobs": [
        {
            "status": "COMPLETED",
            "name": "ASUS Nexus 7 – 2nd Gen (WiFi)",
            "created": 1443331616.586,
            "result": "PASSED",
            "device": {
                "formFactor": "TABLET",
                "name": "ASUS Nexus 7 – 2nd Gen (WiFi)",
                "resolution": {
                    "width": 1200,
                    "height": 1920
                },
                "image": "NA",
                "platform": "ANDROID",
                "heapSize": 0,
                "memory": 34359738368,
                "model": "Nexus 7 – 2nd Gen",
                "os": "4.3.1",
                "cpu": {
                    "frequency": "MHz",
                    "architecture": "ARMv7",
                    "clock": 1512.0
                },
                "arn": "arn:aws:devicefarm:us-west-2::device:208FE7EE973042EA97DEC2EEF31CD10A",
                "manufacturer": "ASUS"
            },
            "message": "Successful test lifecycle of Setup Test",
        "arn": "arn:aws:devicefarm:us-west-2:705582597265:job:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066",
            "counters": {
                "skipped": 0,
                "warned": 0,
                "failed": 0,
                "stopped": 0,
                "passed": 7,
                "errored": 0,
                "total": 7
            }
        },
        {
            "status": "COMPLETED",
            "name": "Amazon Kindle Fire HDX 7 (WiFi)",
            "created": 1443331615.56,
            "result": "FAILED",
            "device": {
                "formFactor": "TABLET",
                "name": "Amazon Kindle Fire HDX 7 (WiFi)",
                "resolution": {
                    "width": 1920,
                    "height": 1200
                },
                "image": "NA",
                "platform": "ANDROID",
                "heapSize": 0,
                "memory": 17179869184,
                "model": "Kindle Fire HDX 7",
                "os": "4.4.3",
                "cpu": {
                    "frequency": "MHz",
                    "architecture": "ARMv7",
                    "clock": 2150.0
                },
                "arn": "arn:aws:devicefarm:us-west-2::device:BC44B6802F134918BDAB6FB4F38C37CC",
                "manufacturer": "Amazon"
            },
        "message": "I see the text "In Theaters" failed: Timeout waiting for elements: * {text CONTAINS[c] ‘In Theaters’} (Calabash::Android::WaitHelpers::WaitError)",
        "arn": "arn:aws:devicefarm:us-west-2:705582597265:job:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00018",
            "counters": {
                "skipped": 0,
                "warned": 0,
                "failed": 5,
                "stopped": 0,
                "passed": 2,
                "errored": 0,
                "total": 7
            }
        }

Note the hierarchical nature of the ARN.  The ARN type is “job” and the job ID is appended to the Run ARN: arn:aws:devicefarm:us-west-2:705582597265:job:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066
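
Since the job ID is appended to the run’s identifiers, you can also collect all of the job ARNs for the calls that follow in one shot using the AWS CLI --query option; a minimal sketch:

    [ec2-user]$  aws devicefarm list-jobs --arn arn:aws:devicefarm:us-west-2:705582597265:run:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917 --query 'jobs[].arn' --output text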


Step 3: Retrieve the Status and Result Counters for All Suites within a Job

Use the list-suites CLI command to list all Suites within a job along with suite-level status and result counters.

[ec2-user]$  aws devicefarm list-suites --arn arn:aws:devicefarm:us-west-2:705582597265:job:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066

Output (partial):

    "suites": [
        {
            "status": "COMPLETED",
            "name": "Setup Suite",
            "created": 1443331656.183,
            "result": "PASSED",
            "message": "Successful test lifecycle of Setup Test",
            "arn": "arn:aws:devicefarm:us-west-2:705582597265:suite:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066/00000",
            "counters": {
                "skipped": 0,
                "warned": 0,
                "failed": 0,
                "stopped": 0,
                "passed": 1,
                "errored": 0,
                "total": 1
            }
        },
        {
            "status": "COMPLETED",
            "name": "STARmeter",
            "created": 1443331656.221,
            "result": "PASSED",
            "message": "Given I see the text "In Theaters" passed",
            "arn": "arn:aws:devicefarm:us-west-2:705582597265:suite:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066/00001",
            "counters": {
                "skipped": 0,
                "warned": 0,
                "failed": 0,
                "stopped": 0,
                "passed": 1,
                "errored": 0,
                "total": 1
            }
        }

The returned ARN type is “suite” and the suite ID is appended to the end, for example: arn:aws:devicefarm:us-west-2:705582597265:suite:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066/00006
 

Step 4: Retrieve the Status and Result Counters for All Tests within a Suite

Use the list-tests CLI command to list all tests within a suite along with test-level status and result counters.

[ec2-user]$  aws devicefarm list-tests --arn arn:aws:devicefarm:us-west-2:705582597265:suite:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066/00006

Output:

{
    "tests": [
        {
            "status": "COMPLETED",
            "name": "Teardown Test",
            "created": 1443331656.424,
            "result": "PASSED",
            "message": "Successful test lifecycle of Teardown Test",
        "arn": "arn:aws:devicefarm:us-west-2:705582597265:test:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066/00006/00000",
            "counters": {
                "skipped": 0,
                "warned": 0,
                "failed": 0,
                "stopped": 0,
                "passed": 1,
                "errored": 0,
                "total": 1
            }
        }
    ]
}

The returned ARN type is “test” and the test ID is appended to the end of the ARN: arn:aws:devicefarm:us-west-2:705582597265:test:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066/00006/00000
 

Step 5: Retrieve the Artifacts for a Run, Job, Suite, or Test

Using the list-artifacts CLI command, you can retrieve all of the artifacts associated with any of the levels of the run by specifying the corresponding ARN.  For example:

  • List artifacts for the run by specifying the run ARN:
    [ec2-user]$  aws devicefarm list-artifacts --arn arn:aws:devicefarm:us-west-2:705582597265:run:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917 --type FILE
  • List artifacts for the job by specifying the job ARN:
    [ec2-user]$  aws devicefarm list-artifacts --arn arn:aws:devicefarm:us-west-2:705582597265:job:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066 --type FILE
  • List artifacts for the suite by specifying the suite ARN:
    [ec2-user]$  aws devicefarm list-artifacts --arn arn:aws:devicefarm:us-west-2:705582597265:suite:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066/00000 --type FILE
  • List artifacts for the individual test by specifying the test ARN:
    [ec2-user]$  aws devicefarm list-artifacts --arn arn:aws:devicefarm:us-west-2:705582597265:test:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066/00006/00000 --type FILE

Note that there are three types of artifacts available: FILE, LOG and SCREENSHOT.

The output of the list-artifacts CLI command will include a pre-signed S3 URL that you can use to securely download the artifact:

{
    "artifacts": [
        {
            "url": "https://prod-us-west-2-results.s3-us-west-2.amazonaws.com/705582597265/2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066/00000/00000/3d811922-64d2-4d0c-8195-ffb715300836.logcat?AWSAccessKeyId=AKIAI3BDGZ5S7TQL2OZA&Expires=1446165402&Signature=D02sc66m01XXYtSJV0hVGK4l9aA%3D",
            "extension": "logcat",
            "name": "Logcat",
            "arn": "arn:aws:devicefarm:us-west-2:705582597265:artifact:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066/00000/00000/00000"
        }
    ]
}
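
Because each artifact entry includes a pre-signed URL, you can download the file directly. As a minimal sketch, the following grabs the URL of the first LOG artifact for the job used above and saves it locally (the output file name is illustrative):

    [ec2-user]$  url=$(aws devicefarm list-artifacts --arn arn:aws:devicefarm:us-west-2:705582597265:job:2c21a412-bb7b-4657-a28c-d7d78b3888f7/16b3ce7d-63de-4371-bbc7-fdcb20b3f917/00066 --type LOG --query 'artifacts[0].url' --output text)
    [ec2-user]$  curl -s -o logcat.txt "$url"

Note that the pre-signed URLs expire (see the Expires parameter in the URL above), so download artifacts soon after listing them.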

 

Conclusion

In this two-part series, we demonstrated how to use the AWS CLI to submit a Calabash-Android test script to Device Farm for testing against real mobile devices, and how to use the CLI to retrieve the status of a test run along with its reports and artifacts. If you have questions, please let us know in the comments section or in our developer forum.