AWS Spatial Computing Blog

Build a Spatial Object Tracking Pipeline with Create ML using Amazon EC2 Mac Instances

Introduction

Apple’s Create ML enables developers to train powerful machine learning models without deep machine learning expertise, simplifying the path from concept to production. A new feature of Apple’s Create ML tool enables object tracking models to be trained directly from the command line, unlocking faster and more scalable workflows. This is especially valuable for teams building immersive experiences on Apple Vision Pro who need to reliably detect and track real-world objects as part of their spatial computing applications. By leveraging Amazon EC2 Mac instances, you can distribute multiple training jobs across Macs in the cloud, significantly reducing pipeline completion time and keeping your local machine free for other tasks.

Learn how to build an end-to-end spatial object tracking pipeline using Create ML and Amazon EC2 Mac instances. This post walks through creating a machine learning workflow that can detect and track objects in 3D space, leveraging the power of Apple’s Create ML framework and AWS cloud infrastructure.

Overview of Object Tracking in visionOS

Figure 1: Apple’s Create ML workflow for Object Tracking

Here is how you can use Apple’s Create ML to implement object tracking in your visionOS app as illustrated by the diagram above:

  1. 3D Model Creation: The process begins with a USDZ (Universal Scene Description ZIP) file, which is a compressed 3D model format optimized for Apple’s spatial computing applications. This file represents the physical object you want to track, including its geometry, textures, and distinctive features.
  2. Model Training: Using Create ML, a machine learning model is trained on the USDZ file. This training process occurs on a Mac and teaches the model to recognize the object from various angles and in different lighting conditions.
  3. Reference Object Generation: The trained model produces a .referenceobject file, a specialized format for visionOS object tracking. This file contains the unique characteristics needed to identify and track the corresponding real-world object.
  4. Integration: The .referenceobject file is then integrated into your visionOS app, enabling it to recognize and track the physical object in real-time.
  5. Real-World Tracking: When the app runs on an Apple Vision Pro device, it uses the .referenceobject file to detect, track, and understand the position and orientation of the physical object in the user’s environment.

Prerequisites

To create an Amazon EC2 Mac Amazon Machine Image (AMI) for Create ML, you will need the following on your development Mac: the AWS Command Line Interface (AWS CLI) configured with credentials for your account, Homebrew to install the xcodeinstall utility, and an Apple Developer account to download Xcode.

Obtaining your 3D Model

Figure 2: Example USDZ (Universal Scene Description ZIP) 3D model for Training

To create a 3D model for object tracking in visionOS, you need a USDZ file that accurately represents your real-world object. You can obtain this file through two primary methods: using computer-aided design (CAD) software or leveraging the Object Capture feature in the Reality Composer app on iOS/iPadOS. The CAD method allows for precise modeling and application of physically based rendering (PBR) materials, making it ideal for complex or reflective objects. Alternatively, Object Capture lets you use an iPhone or iPad to photograph an object from multiple angles, automatically generating a USDZ file. Whichever method you choose, it’s crucial that the resulting 3D model is as photorealistic and accurately scaled as possible, essentially serving as a digital twin of the physical object. The quality and precision of this model directly impact the effectiveness of object tracking in your visionOS app. To automate your 3D design workflow, you can convert your models to USDZ using an open-source AWS architecture and AWS CloudFormation template called Visual Asset Management System (VAMS). VAMS can inject logic into your asset pipeline and track changes and version files. Find out more in the VAMS GitHub repository.

Figure 3: Example Object Capture in Reality Composer for iOS

Configuring Amazon EC2 Mac Instances to use Create ML

To use the Create ML command line utility on Amazon EC2 Mac, you will need to create an Amazon Machine Image (AMI) with macOS 15.4 or later and the Xcode 26 Beta.

1. Create an IAM role for the AMI builder instance, with permissions to create, describe, and tag AMIs, plus access to the Secrets Manager secrets used by xcodeinstall (do not forget to replace 000000000000 with your AWS account ID):

# Create the IAM role
aws iam create-role --role-name XcodeAMIBuilder --assume-role-policy-document '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "ec2.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}'
# Attach inline policy to the role
aws iam put-role-policy --role-name XcodeAMIBuilder --policy-name XcodeAMIBuilderPolicy --policy-document '{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:CreateImage",
        "ec2:DescribeImages",
        "ec2:CreateTags"
      ],
      "Resource": "*"
    },
    {
      "Sid": "xcodeinstall",
      "Effect": "Allow",
      "Action": [
        "secretsmanager:CreateSecret",
        "secretsmanager:GetSecretValue",
        "secretsmanager:PutSecretValue"
      ],
      "Resource": "arn:aws:secretsmanager:*:000000000000:secret:xcodeinstall-*"
    }
  ]
}'
Shell

2. Create the instance profile and add the role to it:

aws iam create-instance-profile --instance-profile-name XcodeAMIBuilder
aws iam add-role-to-instance-profile --instance-profile-name XcodeAMIBuilder --role-name XcodeAMIBuilder
Shell

3. Configure your desired AWS Region, instance type, and AMI ID (you can use the Amazon EC2 console to find the latest macOS AMI for the instance type you are using). This is the base AMI we will modify to include Xcode.

export EC2Region="ap-southeast-2" # Replace with your region
export EC2InstanceType="mac2-m2.metal" # Replace with your own
export EC2AMIID=ami-095ec9f091b23b49b # Replace with your own AMI
Shell
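If you prefer to stay in the terminal, a `describe-images` query can surface the newest AWS-published macOS AMI for Apple silicon. The sketch below builds and prints the command rather than running it; the `amzn-ec2-macos-15*` name filter is an assumption based on AWS's AMI naming scheme and may need adjusting for your target macOS version.

```shell
# Sketch: build a query for the newest AWS-published Apple silicon macOS AMI.
# The name pattern is an assumption about AWS's AMI naming; verify it in the
# console for your target macOS version before relying on the result.
EC2Region="${EC2Region:-ap-southeast-2}"
find_ami_cmd=(aws ec2 describe-images
  --region "$EC2Region"
  --owners amazon
  --filters "Name=name,Values=amzn-ec2-macos-15*" "Name=architecture,Values=arm64_mac"
  --query 'sort_by(Images,&CreationDate)[-1].[ImageId,Name]'
  --output text)
echo "Will run: ${find_ami_cmd[*]}"
# Run it once you have confirmed the filters:
# "${find_ami_cmd[@]}"
```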

4. Authenticate with your Apple Developer credentials and store them in AWS Secrets Manager:

brew tap sebsto/macos
brew install xcodeinstall
xcodeinstall storesecrets -s $EC2Region
xcodeinstall authenticate -s $EC2Region
Shell

5. Create the user data script. This script downloads and installs the Xcode 26 beta, then creates the AMI:

cat <<- EOS > ./EC2UserData.sh
#!/bin/bash

# Add Homebrew and the AWS CLI to PATH on both x86 and Apple silicon Macs.
export PATH="\$PATH:/usr/local/bin:/opt/homebrew/bin"

MDToken=\$(curl -X PUT "http://169.254.169.254/latest/api/token" -s -H "X-aws-ec2-metadata-token-ttl-seconds: 21600")
currentInstanceID=\$(curl -H "X-aws-ec2-metadata-token: \$MDToken" -s http://169.254.169.254/latest/meta-data/instance-id)
currentRegion=\$(curl -H "X-aws-ec2-metadata-token: \$MDToken" -s http://169.254.169.254/latest/meta-data/placement/region)

su ec2-user -c 'brew tap sebsto/macos'
su ec2-user -c 'brew install xcodeinstall'
su ec2-user -c "xcodeinstall download --name 'Xcode 26 beta.xip' -s \$currentRegion"
su ec2-user -c 'xcodeinstall install --name "Xcode 26 beta.xip"'
su ec2-user -c 'sudo xcode-select -s /Applications/Xcode-beta.app/Contents/Developer && sudo xcodebuild -license accept'

# Create AMI
echo -e "Creating AMI from the instance..."
AMI_ID=\$(aws ec2 create-image \
  --instance-id \$currentInstanceID \
  --name "Xcode-AMI" \
  --description "Auto-created" \
  --no-reboot \
  --query 'ImageId' \
  --output text \
  --region \$currentRegion)

# Each image-available wait times out after roughly 10 minutes, and macOS AMI
# creation can take much longer, so retry the wait several times.
for attempt in 1 2 3 4 5 6; do
  aws ec2 wait image-available --image-ids \$AMI_ID --region \$currentRegion && break
done

aws ec2 create-tags --region \$currentRegion --resources \$currentInstanceID --tags "Key=Name,Value=XCodeAMIBuilder-✅"

exit 0;
 
EOS
Shell

6. Allocate a Dedicated Host with the same instance type you chose in Step 3 (EC2InstanceType) and make sure to note down the Host ID
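The allocation itself can be scripted. The sketch below builds and prints an `allocate-hosts` call whose output would be the Host ID for Step 7; the availability zone is a placeholder, so pick one with Mac Dedicated Host capacity before running the printed command.

```shell
# Sketch: build an allocate-hosts call for the instance type from Step 3.
# The availability zone is a placeholder; confirm Mac capacity in your region.
EC2Region="${EC2Region:-ap-southeast-2}"
EC2InstanceType="${EC2InstanceType:-mac2-m2.metal}"
EC2AZ="${EC2AZ:-${EC2Region}a}"

allocate_host_cmd=(aws ec2 allocate-hosts
  --region "$EC2Region"
  --availability-zone "$EC2AZ"
  --instance-type "$EC2InstanceType"
  --quantity 1
  --query 'HostIds[0]' --output text)

echo "Will run: ${allocate_host_cmd[*]}"
# Capture the result for Step 7 once you have set the AZ:
# export EC2HostID=$("${allocate_host_cmd[@]}")
```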

7. Launch the instance on the dedicated host. Replace $EC2HostID with the ID from the previous step

export EC2PlacementString="{\"Affinity\":\"host\", \"HostId\":\"$EC2HostID\", \"Tenancy\":\"host\"}"
  
export EC2VolumeInfo="{\"DeviceName\":\"/dev/sda1\", \"Ebs\":{\"Encrypted\":false, \"DeleteOnTermination\":true, \"Iops\":6000, \"VolumeSize\":200, \"VolumeType\":\"gp3\", \"Throughput\":250}}"  

export EC2TagSpecs="{\"ResourceType\":\"instance\", \"Tags\":[{\"Key\":\"Name\", \"Value\":\"XCodeAMIBuilder\"}]}"
  
export EC2UserData=$(cat ./EC2UserData.sh)

aws ec2 run-instances --region $EC2Region \
   --image-id "$EC2AMIID" \
   --instance-type "$EC2InstanceType" \
   --user-data "$EC2UserData" \
   --block-device-mappings "$EC2VolumeInfo" \
   --tag-specifications "$EC2TagSpecs" \
   --placement "$EC2PlacementString" \
   --iam-instance-profile 'Name'="XcodeAMIBuilder" \
   --associate-public-ip-address
Shell

Wait for the instance to initialize. Once the user data script finishes, a green checkmark (✅) appears next to the Amazon EC2 Mac instance in the Amazon EC2 console, and an AMI named “Xcode-AMI” will be available in your account. You can use this AMI in your pipelines to run the Create ML command line utility on Amazon EC2 Mac instances:

xcrun createml objecttracker -s globe.usdz -o globe.referenceobject
Shell
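Since training is a single CLI invocation, one EC2 Mac instance can work through a queue of models. The helper below is a hypothetical sketch (not part of the pipeline above) that trains one tracker per USDZ file in a folder; `CREATEML` is overridable (for example, `CREATEML=echo`) so you can dry-run the loop on a machine without Xcode installed.

```shell
# Hypothetical helper: train one object tracker per USDZ file in a folder.
# CREATEML defaults to the real tool but can be overridden to dry-run.
CREATEML="${CREATEML:-xcrun createml}"

train_all() {
  local indir="$1" outdir="$2"
  mkdir -p "$outdir"
  for model in "$indir"/*.usdz; do
    [ -e "$model" ] || return 0          # no models queued
    local name
    name=$(basename "$model" .usdz)
    echo "Training tracker for $name"
    $CREATEML objecttracker -s "$model" -o "$outdir/$name.referenceobject"
  done
}

train_all models output
```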

Architecting a Pipeline for Apple’s Create ML Training using Amazon EC2 Mac Instances

The following diagram shows how this spatial object tracking workflow could be implemented using AWS services:

Figure 4: Example Spatial Object Tracking Workflow

  1. An admin deploys the solution using the AWS Cloud Development Kit (CDK)
  2. A user can access the solution through a web application hosted in Amazon Simple Storage Service (S3) and globally distributed with Amazon CloudFront.
  3. Amazon Cognito is used to authenticate the user at sign-in
  4. An authenticated user can initiate the workflow by uploading the USDZ file to the web application. The user can also use the AWS Management Console or the AWS CLI to upload the USDZ file, along with a JSON job specification, directly to Amazon S3 using AWS IAM credentials.
  5. The upload to Amazon S3 triggers an AWS Lambda function that logs the job details in Amazon DynamoDB and triggers an AWS Step Functions workflow
  6. AWS Step Functions orchestrates the workflow using AWS Lambda to create an Amazon EC2 Mac Dedicated Host and Instance and start the Create ML job.
  7. Once training completes, the Amazon EC2 Mac instance uploads the .referenceobject file to Amazon S3
  8. Once the job is complete, the user receives a completion notification from Amazon Simple Notification Service (SNS) with an Amazon S3 presigned URL to the reference object file.
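The presigned URL in step 8 amounts to a single CLI call. The sketch below builds and prints the command; the bucket and key names are placeholders used for illustration only.

```shell
# Sketch of step 8: generate a time-limited download link for the finished
# reference object. Bucket and key are illustrative placeholders.
BUCKET="${BUCKET:-my-tracking-output-bucket}"
KEY="${KEY:-jobs/globe/globe.referenceobject}"
presign_cmd=(aws s3 presign "s3://$BUCKET/$KEY" --expires-in 3600)
echo "Will run: ${presign_cmd[*]}"
# Run it against your own bucket:
# URL=$("${presign_cmd[@]}")
```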

Using the Reference Object File in Your App

You can use the reference object file in Reality Composer Pro and add it to your visionOS app.

1. Open your visionOS project in Xcode

Figure 5: visionOS Project on Xcode

2. Open the default scene in Reality Composer Pro and delete the default sphere

Figure 6: Default Scene in Reality Composer

3. Create an empty transform entity and add an anchoring component to it

Figure 7: Creating an Empty Transform Entity and adding an Anchor Component

4. Import the reference object created with the pipeline

Figure 8: Importing Reference Object

5. You should see a semi-transparent view of the initial USDZ model you used, and you can place virtual content on top of your object. For more details, see Using a reference object with Reality Composer Pro.

Figure 9: Semi Transparent View of Model

See more details on how to use the reference object in RealityKit and how to use the reference object in ARKit.

Clean Up

It is a good practice to delete any unused resources to avoid ongoing charges. To clean up resources from this walkthrough:

  1. Delete the IAM role and Instance profile. Please refer to Delete roles or instance profiles documentation for detailed instructions
  2. Delete Amazon EC2 Mac instances and Dedicated Hosts (Dedicated Hosts have a 24-hour minimum allocation period to align with the Apple macOS EULA). Please refer to the documentation on how to Stop or terminate your Amazon EC2 Mac instance for more details.
  3. Deregister the AMI. Please refer to documentation on how to Deregister an AMI.
  4. Delete secrets from AWS Secrets Manager. Please reference the documentation on how to Delete an AWS Secrets Manager secret for more detailed instructions.

Conclusion

This walkthrough demonstrates an object tracking pipeline using AWS services and Create ML, providing a foundation for spatial computing applications on Vision Pro. However, this is just the beginning. The future integration possibilities with Amazon Bedrock and Amazon Q could revolutionize field service applications. Imagine technicians using the Apple Vision Pro to not only track objects, but also receive real-time AI-powered assistance through Amazon Q, helping them identify parts, access repair manuals, and follow step-by-step maintenance procedures. By combining object tracking with generative AI through Amazon Bedrock, applications could provide contextual information, predictive maintenance insights, and interactive 3D guidance overlaid on tracked objects. We look forward to seeing how you’ll leverage this pipeline and extend it with AWS’s AI and machine learning services to create innovative spatial computing solutions.

Special thanks to Eric Cornwell for inspiration on the pipeline architecture and Dave Sieder on his contributions to Amazon EC2 Mac Instance configuration.