The Internet of Things on AWS – Official Blog

Announcing AWS IoT FleetWise vision system data (Preview)

Today, we are excited to announce that AWS IoT FleetWise now supports vehicle vision system data collection, which enables customers to collect metadata, object list and detection data, and images or videos from cameras, lidars, radars, and other vision sub-systems. This new feature, now available in Preview, builds upon existing AWS IoT FleetWise capabilities that enable customers to extract more value and context from their data to build vehicles that are more connected and convenient.

Modern vehicles are equipped with multiple vision systems. Examples include a surround-view array of cameras and radars that enables advanced driver assistance system (ADAS) use cases, and driver and cabin monitoring systems that help keep the driver attentive in semi-autonomous driving use cases. Most of these systems perform some level of computation on the vehicle, often using sophisticated algorithms for sensor fusion and AI/ML for inference.

Vision systems generate massive amounts of data in structured (numbers, text) and unstructured (images, video) formats. This volume and variety make it difficult to synchronize data from multiple vehicle sensor modalities around a given event of interest in a way that minimizes interference with the operation of the vehicle. For example, to analyze the accuracy of road conditions detected by a vehicle camera, a data scientist may want to view telemetry data (e.g., speed and brake pressure), structured object lists and metadata, and unstructured image/video data. Keeping all of those data points organized and associated with the same event is a heavy lift. It typically requires additional software and compute power to collect only the data points of interest, add metadata, and keep the data synchronized, all while minimizing interference with the operation of the vehicle.

Vision system data from AWS IoT FleetWise lets automotive companies easily collect and organize data from vehicle vision systems that include cameras, radars, and lidars. It keeps both structured and unstructured vision system data, metadata, and telemetry data synchronized in the cloud, making it easier for customers to assemble a full picture view of events and gain insights. Here are a few scenarios:

  • To understand what happened during a hard-braking event, a customer wants to collect data before and after the event occurs. The data collected may include inference results (e.g., an obstacle was detected), timestamps and camera settings (metadata), and what occurred around the vehicle (e.g., images, videos, and lidar/radar maps with bounding boxes and detection overlays).
  • A customer is interested in anomalous events on roadways like accidents, wildfires, and obstacles that impede traffic. The customer begins by collecting telemetry and object list data at scale across a large number of vehicles, then zooms in on a set of vehicles that are signaling anomalous events (e.g., speed is 0 on a large highway) and collects vision system data from those vehicles.

When collecting vision system data using AWS IoT FleetWise, customers can take advantage of the service’s advanced features and interfaces they already use to collect telemetry data, for example, specifying events in their data collection campaign to optimize bandwidth and data size. Customers can get started on AWS by defining and modeling a vehicle’s vision system, alongside its attributes and telemetry sensors. The customer’s Edge Agent deployed in the vehicle collects data from CAN-based vehicle sensors (e.g., battery temperature), as well as from vehicle sub-systems that include vision system sensors. Customers can use the same event- or time-based data collection campaign to collect data signals concurrently from both standard sensors and vision systems. In the cloud, customers see a unified view of their defined vehicle attributes and other metadata, telemetry data, and structured vision system data, with links to view unstructured vision system data in Amazon Simple Storage Service (Amazon S3). The data stays synchronized using vehicle, campaign, and event identifiers. Customers can then use services like AWS Glue to integrate data for downstream analytics.

Continental AG is developing driver convenience features

Continental AG develops pioneering technologies and services for autonomous mobility. “Continental has collaborated closely with AWS on developing technologies that accelerate automotive software development in the cloud. With vision system data from AWS IoT FleetWise, we will be able to easily collect camera and motion-planning data to improve automated parking assistance and enable fleet-wide monitoring and reporting.”

Yann Baudouin, Head of Data Solutions – Engineering Platform and Ecosystem, Continental AG

HL Mando is developing capabilities that enhance driver safety and personalization

HL Mando is a tier 1 supplier of parts and software to the automotive industry. “At Mando, we are committed to innovating technology that makes vehicles easier to drive and operate. Our solutions rely on the ability to collect vehicle telemetry data as well as vehicle camera data in an efficient way. We are looking forward to using the data we collect through AWS IoT FleetWise to improve vehicle software capabilities that can enhance driver safety and driver personalization.” 

Seong-Hyeon Cho, Vice Chairman/CEO, HL Mando

ThunderSoft is developing automotive and fleet solutions

ThunderSoft provides intelligent operating systems and technologies to automotive companies and enterprises. “As ThunderSoft works to help advance the next generation of connected vehicle technology across the globe, we look forward to continuing our collaboration with AWS. With the arrival of vision system data from AWS IoT FleetWise, we’ll be able to help our customers with innovative solutions for advanced driver assistance systems (ADAS) and fleet management.”

Pengcheng Zou, CTO, ThunderSoft

Solution Overview

Let’s take an ADAS use case to walk through the process of collecting vision system data. Imagine that an ADAS engineer is deploying a collision avoidance system in production vehicles. One way this system helps vehicles avoid collisions is by automatically applying brakes in certain scenarios (e.g., an impending rear-end collision with another vehicle).

While the software used in this system has already gone through rigorous testing, the engineer wants to continuously improve the software for both current-gen and future-gen vehicles. In this case, the engineer wants to see all scenarios where a collision was detected. To understand what happened during the event, the engineer will look at vision system data comprising images, along with telemetry data, from before and after the collision was detected. Once the data is in the S3 bucket, the engineer may want to visualize, analyze, and label it.

Prerequisites

Before you get started, you will need:

  • An AWS account with console, CLI and programmatic access in supported Regions.
  • Permission to create and access AWS IoT FleetWise and Amazon S3 resources.
  • To follow the instructions in our AWS IoT FleetWise vision system demo guide, up to and including “Playback ROS 2 data.”
  • (Optional) A ROS 2 environment that supports the “Galactic” version of ROS 2. During the Preview period for vision system data, the AWS IoT FleetWise Reference Edge Agent supports ROS 2 middleware to collect vision system signals.
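Before starting the walkthrough, it can be useful to confirm that the AWS CLI is pointed at the account and Region you intend to use. The commands below are an optional sanity check using standard AWS CLI calls; nothing in them is specific to this demo.

# Show the identity your CLI credentials resolve to
aws sts get-caller-identity

# Show the default Region your CLI is configured for
aws configure get region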

Walkthrough

Step 1: Model your vehicle

  • Create a signal catalog by creating the file ros2-nodes.json. Feel free to change the name and description within this file to your liking.
{
 "name": "fw-vision-system-catalog",
    "description": "vision-system-catalog",
    "nodes": [
      {
        "branch": {
          "fullyQualifiedName": "Types"
        }
      },
      {
        "struct": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_CompressedImage"
        }
      },
      {
        "struct": {
          "fullyQualifiedName": "Types.std_msgs_Header"
        }
      },
      {
        "struct": {
          "fullyQualifiedName": "Types.builtin_interfaces_Time"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.builtin_interfaces_Time.sec",
          "dataType": "INT32",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.builtin_interfaces_Time.nanosec",
          "dataType": "UINT32",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.std_msgs_Header.stamp",
          "dataType": "STRUCT",
          "structFullyQualifiedName": "Types.builtin_interfaces_Time"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.std_msgs_Header.frame_id",
          "dataType": "STRING",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_CompressedImage.header",
          "dataType": "STRUCT",
          "structFullyQualifiedName": "Types.std_msgs_Header"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_CompressedImage.format",
          "dataType": "STRING",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_CompressedImage.data",
          "dataType": "UINT8_ARRAY",
          "dataEncoding": "BINARY"
        }
      },
      {
        "branch": {
          "fullyQualifiedName": "Vehicle",
          "description": "Vehicle"
        }
      },
      {
        "branch": {
          "fullyQualifiedName": "Vehicle.Cameras",
          "description": "Vehicle.Cameras"
        }
      },
      {
        "branch": {
          "fullyQualifiedName": "Vehicle.Cameras.Front",
          "description": "Vehicle.Cameras.Front"
        }
      },
      {
        "sensor": {
          "fullyQualifiedName": "Vehicle.Cameras.Front.Image",
          "dataType": "STRUCT",
          "structFullyQualifiedName": "Types.sensor_msgs_msg_CompressedImage"
        }
      },
      {
        "struct": {
          "fullyQualifiedName": "Types.std_msgs_msg_Float32"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.std_msgs_msg_Float32.data",
          "dataType": "FLOAT",
          "dataEncoding": "TYPED"
        }
      },
      {
        "sensor": {
          "fullyQualifiedName": "Vehicle.Speed",
          "dataType": "STRUCT",
          "structFullyQualifiedName": "Types.std_msgs_msg_Float32"
        }
      },
      {
        "branch": {
          "fullyQualifiedName": "Vehicle.Airbag",
          "description": "Vehicle.Airbag"
        }
      },
      {
        "sensor": {
          "fullyQualifiedName": "Vehicle.Airbag.CollisionIntensity",
          "dataType": "STRUCT",
          "structFullyQualifiedName": "Types.std_msgs_msg_Float32"
        }
      },
      {
        "struct": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_Imu"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_Imu.header",
          "dataType": "STRUCT",
          "structFullyQualifiedName": "Types.std_msgs_Header"
        }
      },
      {
        "struct": {
          "fullyQualifiedName": "Types.geometry_msgs_Quaternion"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.geometry_msgs_Quaternion.x",
          "dataType": "DOUBLE",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.geometry_msgs_Quaternion.y",
          "dataType": "DOUBLE",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.geometry_msgs_Quaternion.z",
          "dataType": "DOUBLE",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.geometry_msgs_Quaternion.w",
          "dataType": "DOUBLE",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_Imu.orientation",
          "dataType": "STRUCT",
          "structFullyQualifiedName": "Types.geometry_msgs_Quaternion"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_Imu.orientation_covariance",
          "dataType": "DOUBLE_ARRAY",
          "dataEncoding": "TYPED"
        }
      },
      {
        "struct": {
          "fullyQualifiedName": "Types.geometry_msgs_Vector3"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.geometry_msgs_Vector3.x",
          "dataType": "DOUBLE",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.geometry_msgs_Vector3.y",
          "dataType": "DOUBLE",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.geometry_msgs_Vector3.z",
          "dataType": "DOUBLE",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_Imu.angular_velocity",
          "dataType": "STRUCT",
          "structFullyQualifiedName": "Types.geometry_msgs_Vector3"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_Imu.angular_velocity_covariance",
          "dataType": "DOUBLE_ARRAY",
          "dataEncoding": "TYPED"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_Imu.linear_acceleration",
          "dataType": "STRUCT",
          "structFullyQualifiedName": "Types.geometry_msgs_Vector3"
        }
      },
      {
        "property": {
          "fullyQualifiedName": "Types.sensor_msgs_msg_Imu.linear_acceleration_covariance",
          "dataType": "DOUBLE_ARRAY",
          "dataEncoding": "TYPED"
        }
      },
      {
        "sensor": {
          "fullyQualifiedName": "Vehicle.Acceleration",
          "dataType": "STRUCT",
          "structFullyQualifiedName": "Types.sensor_msgs_msg_Imu"
        }
      }
    ]
}
aws iotfleetwise create-signal-catalog --cli-input-json file://ros2-nodes.json
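If you want to confirm that the signal catalog was created before moving on, you can describe it by name. This is an optional check; the name below assumes you kept the default from ros2-nodes.json.

aws iotfleetwise get-signal-catalog --name fw-vision-system-catalog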
  • AWS IoT FleetWise can collect both vision system and CAN bus data at the same time. You can also update the signal catalog by adding CAN signals from any VSS-based JSON nodes file. Make sure the “name” field in the file matches the signal catalog you created (an illustrative example follows the command below):
aws iotfleetwise update-signal-catalog --cli-input-json file://<can-nodes>.json
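As an illustration of what such a file could contain, the sketch below adds a single CAN-backed sensor to the existing catalog through the nodesToAdd field of update-signal-catalog. The branch and sensor names (Vehicle.Powertrain and Vehicle.Powertrain.BatteryTemperature) are hypothetical placeholders; substitute the nodes defined in your own VSS JSON.

{
    "name": "fw-vision-system-catalog",
    "nodesToAdd": [
        {
            "branch": {
                "fullyQualifiedName": "Vehicle.Powertrain",
                "description": "Hypothetical branch for CAN-backed signals"
            }
        },
        {
            "sensor": {
                "fullyQualifiedName": "Vehicle.Powertrain.BatteryTemperature",
                "dataType": "DOUBLE",
                "unit": "Celsius",
                "description": "Hypothetical example of a CAN-backed sensor"
            }
        }
    ]
}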
  • Create a model manifest named vehicle-model.json. Your model manifest should include the following signals (fully qualified names listed below):
    • Vehicle.Cameras.Front.Image
    • Vehicle.Speed
    • Vehicle.Acceleration
    • Vehicle.Airbag.CollisionIntensity
{
    "name": "fw-vision-system-model",
    "signalCatalogArn": "<signal-catalog-ARN>",
    "description": "Vehicle model to demonstrate FleetWise vision system data",
    "nodes": ["Vehicle.Cameras.Front.Image", "Vehicle.Speed", "Vehicle.Airbag.CollisionIntensity", "Vehicle.Acceleration"]
}
aws iotfleetwise create-model-manifest --cli-input-json file://vehicle-model.json
  • Update your model manifest by setting its status to ACTIVE:
aws iotfleetwise update-model-manifest --name fw-vision-system-model --status ACTIVE
  • Create a decoder manifest file: decoder-manifest.json. Adjust the JSON to reflect the appropriate model manifest ARN. If you’re also using CAN signals, refer to the AWS IoT FleetWise documentation for an example decoder manifest with both vision system and CAN signals. After creating the decoder manifest, you will also need to update its status to ACTIVE:
{
    "name": "fw-vision-system-decoder-manifest",
    "modelManifestArn": "<your model manifest arn>",
    "description": "decoder manifest to demonstrate vision system data",
    "networkInterfaces":[
  {
    "interfaceId": "10",
    "type": "VEHICLE_MIDDLEWARE",
    "vehicleMiddleware": {
      "name": "ros2",
      "protocolName": "ROS_2"
    }
  }
],

"signalDecoders": [
  {
    "fullyQualifiedName": "Vehicle.Cameras.Front.Image",
    "type": "MESSAGE_SIGNAL",
    "interfaceId": "10",
    "messageSignal": {
      "topicName": "/carla/ego_vehicle/rgb_front/image_compressed:sensor_msgs/msg/CompressedImage",
      "structuredMessage": {
        "structuredMessageDefinition": [
          {
            "fieldName": "header",
            "dataType": {
              "structuredMessageDefinition": [
                {
                  "fieldName": "stamp",
                  "dataType": {
                    "structuredMessageDefinition": [
                      {
                        "fieldName": "sec",
                        "dataType": {
                          "primitiveMessageDefinition": {
                            "ros2PrimitiveMessageDefinition": {
                              "primitiveType": "INT32"
                            }
                          }
                        }
                      },
                      {
                        "fieldName": "nanosec",
                        "dataType": {
                          "primitiveMessageDefinition": {
                            "ros2PrimitiveMessageDefinition": {
                              "primitiveType": "UINT32"
                            }
                          }
                        }
                      }
                    ]
                  }
                },
                {
                  "fieldName": "frame_id",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "STRING"
                      }
                    }
                  }
                }
              ]
            }
          },
          {
            "fieldName": "format",
            "dataType": {
              "primitiveMessageDefinition": {
                "ros2PrimitiveMessageDefinition": {
                  "primitiveType": "STRING"
                }
              }
            }
          },
          {
            "fieldName": "data",
            "dataType": {
              "structuredMessageListDefinition": {
                "name": "listType",
                "memberType": {
                  "primitiveMessageDefinition": {
                    "ros2PrimitiveMessageDefinition": {
                      "primitiveType": "UINT8"
                    }
                  }
                },
                "capacity": 0,
                "listType": "DYNAMIC_UNBOUNDED_CAPACITY"
              }
            }
          }
        ]
      }
    }
  },
  {
    "fullyQualifiedName": "Vehicle.Speed",
    "type": "MESSAGE_SIGNAL",
    "interfaceId": "10",
    "messageSignal": {
      "topicName": "/carla/ego_vehicle/speedometer:std_msgs/msg/Float32",
      "structuredMessage": {
        "structuredMessageDefinition": [
          {
            "fieldName": "data",
            "dataType": {
              "primitiveMessageDefinition": {
                "ros2PrimitiveMessageDefinition": {
                  "primitiveType": "FLOAT32"
                }
              }
            }
          }
        ]
      }
    }
  },
  {
    "fullyQualifiedName": "Vehicle.Airbag.CollisionIntensity",
    "type": "MESSAGE_SIGNAL",
    "interfaceId": "10",
    "messageSignal": {
      "topicName": "/carla/ego_vehicle/collision_intensity:std_msgs/msg/Float32",
      "structuredMessage": {
        "structuredMessageDefinition": [
          {
            "fieldName": "data",
            "dataType": {
              "primitiveMessageDefinition": {
                "ros2PrimitiveMessageDefinition": {
                  "primitiveType": "FLOAT32"
                }
              }
            }
          }
        ]
      }
    }
  },
  {
    "fullyQualifiedName": "Vehicle.Acceleration",
    "type": "MESSAGE_SIGNAL",
    "interfaceId": "10",
    "messageSignal": {
      "topicName": "/carla/ego_vehicle/imu:sensor_msgs/msg/Imu",
      "structuredMessage": {
        "structuredMessageDefinition": [
          {
            "fieldName": "header",
            "dataType": {
              "structuredMessageDefinition": [
                {
                  "fieldName": "stamp",
                  "dataType": {
                    "structuredMessageDefinition": [
                      {
                        "fieldName": "sec",
                        "dataType": {
                          "primitiveMessageDefinition": {
                            "ros2PrimitiveMessageDefinition": {
                              "primitiveType": "INT32"
                            }
                          }
                        }
                      },
                      {
                        "fieldName": "nanosec",
                        "dataType": {
                          "primitiveMessageDefinition": {
                            "ros2PrimitiveMessageDefinition": {
                              "primitiveType": "UINT32"
                            }
                          }
                        }
                      }
                    ]
                  }
                },
                {
                  "fieldName": "frame_id",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "STRING"
                      }
                    }
                  }
                }
              ]
            }
          },
          {
            "fieldName": "orientation",
            "dataType": {
              "structuredMessageDefinition": [
                {
                  "fieldName": "x",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "FLOAT64"
                      }
                    }
                  }
                },
                {
                  "fieldName": "y",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "FLOAT64"
                      }
                    }
                  }
                },
                {
                  "fieldName": "z",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "FLOAT64"
                      }
                    }
                  }
                },
                {
                  "fieldName": "w",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "FLOAT64"
                      }
                    }
                  }
                }
              ]
            }
          },
          {
            "fieldName": "orientation_covariance",
            "dataType": {
              "structuredMessageListDefinition": {
                "name": "listType",
                "memberType": {
                  "primitiveMessageDefinition": {
                    "ros2PrimitiveMessageDefinition": {
                      "primitiveType": "FLOAT64"
                    }
                  }
                },
                "capacity": 9,
                "listType": "FIXED_CAPACITY"
              }
            }
          },
          {
            "fieldName": "angular_velocity",
            "dataType": {
              "structuredMessageDefinition": [
                {
                  "fieldName": "x",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "FLOAT64"
                      }
                    }
                  }
                },
                {
                  "fieldName": "y",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "FLOAT64"
                      }
                    }
                  }
                },
                {
                  "fieldName": "z",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "FLOAT64"
                      }
                    }
                  }
                }
              ]
            }
          },
          {
            "fieldName": "angular_velocity_covariance",
            "dataType": {
              "structuredMessageListDefinition": {
                "name": "listType",
                "memberType": {
                  "primitiveMessageDefinition": {
                    "ros2PrimitiveMessageDefinition": {
                      "primitiveType": "FLOAT64"
                    }
                  }
                },
                "capacity": 9,
                "listType": "FIXED_CAPACITY"
              }
            }
          },
          {
            "fieldName": "linear_acceleration",
            "dataType": {
              "structuredMessageDefinition": [
                {
                  "fieldName": "x",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "FLOAT64"
                      }
                    }
                  }
                },
                {
                  "fieldName": "y",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "FLOAT64"
                      }
                    }
                  }
                },
                {
                  "fieldName": "z",
                  "dataType": {
                    "primitiveMessageDefinition": {
                      "ros2PrimitiveMessageDefinition": {
                        "primitiveType": "FLOAT64"
                      }
                    }
                  }
                }
              ]
            }
          },
          {
            "fieldName": "linear_acceleration_covariance",
            "dataType": {
              "structuredMessageListDefinition": {
                "name": "listType",
                "memberType": {
                  "primitiveMessageDefinition": {
                    "ros2PrimitiveMessageDefinition": {
                      "primitiveType": "FLOAT64"
                    }
                  }
                },
                "capacity": 9,
                "listType": "FIXED_CAPACITY"
              }
            }
          }
        ]
      }
    }
  }
]
}
aws iotfleetwise create-decoder-manifest --cli-input-json file://decoder-manifest.json

aws iotfleetwise update-decoder-manifest --name fw-vision-system-decoder-manifest --status ACTIVE

Step 2: Create a vehicle

  • Create a vehicle using the above model manifest and decoder manifest. Make sure you use the same name as the provisioned AWS IoT Thing that you created in your prerequisite steps.
aws iotfleetwise create-vehicle --vehicle-name FW-VSD-ROS2-<provisioned-identifier>-vehicle --model-manifest-arn <Your model manifest ARN> --decoder-manifest-arn <Your decoder manifest ARN>

Step 3: Create campaigns

  • Set up the access policy that enables AWS IoT FleetWise to write to your S3 bucket by following the instructions in the AWS IoT FleetWise documentation (see “Bucket policy for all campaigns”). An illustrative sketch of such a policy is shown below.
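The exact policy statement is defined in the documentation referenced above. The JSON below is only an approximation that shows its general shape (a service principal for AWS IoT FleetWise, bucket- and object-level resources, and a source-account condition); the specific actions, condition keys, and placeholder values are assumptions, so use the documented policy verbatim in your own setup.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "IllustrativeOnlySeeFleetWiseDocs",
            "Effect": "Allow",
            "Principal": {
                "Service": "iotfleetwise.amazonaws.com"
            },
            "Action": [
                "s3:ListBucket",
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::<your-bucket-name>",
                "arn:aws:s3:::<your-bucket-name>/*"
            ],
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": "<your-account-id>"
                }
            }
        }
    ]
}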
  • Create an event-based campaign that collects data based on a detected collision event, including 5 seconds of pretrigger and 5 seconds of posttrigger data.
{
    "name": "fw-vision-system-collectCollision",
    "description": "Collect 10 seconds of data from a subset of signals if vehicle detected a collision - 5 pretrigger seconds, 5 posttrigger seconds",
    "signalCatalogArn": "<your signal catalog>",
    "targetArn": "<your target>",
        "signalsToCollect": [
        {
            "name": "Vehicle.Cameras.Front.Image",
            "maxSampleCount": 1000,
            "minimumSamplingIntervalMs": 10
        },
        {
            "name": "Vehicle.Speed",
            "maxSampleCount": 1000,
            "minimumSamplingIntervalMs": 10
        },
        {
            "name": "Vehicle.Acceleration",
            "maxSampleCount": 1000,
            "minimumSamplingIntervalMs": 10
        },
        {
            "name": "Vehicle.Airbag.CollisionIntensity",
            "maxSampleCount": 1000,
            "minimumSamplingIntervalMs": 10
        }
    ],
    "postTriggerCollectionDuration": 5000,
    "collectionScheme": {
        "conditionBasedCollectionScheme": {
            "conditionLanguageVersion": 1,
            "expression": "$variable.`Vehicle.Airbag.CollisionIntensity` > 1",
            "minimumTriggerIntervalMs": 10000,
            "triggerMode": "ALWAYS"
        }
    },
    "dataDestinationConfigs": [
        {
            "s3Config": {
                "bucketArn": "<your S3 bucket>",
                "dataFormat": "PARQUET",
                "storageCompressionFormat": "NONE",
                "prefix": "collisionData"
            }
        }
    ]
}
aws iotfleetwise create-campaign --cli-input-json file://campaign.json
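If you later want to trigger on a different signal, only the collection scheme’s expression needs to change. For example, a campaign that fires when the reported speed exceeds a threshold could use a collectionScheme like the sketch below; the threshold of 27.0 is an arbitrary example, and its meaning depends on the units in which your speed signal is published.

"collectionScheme": {
    "conditionBasedCollectionScheme": {
        "conditionLanguageVersion": 1,
        "expression": "$variable.`Vehicle.Speed` > 27.0",
        "minimumTriggerIntervalMs": 10000,
        "triggerMode": "ALWAYS"
    }
}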
  • Create another campaign to collect 10 seconds of data as a timed event.
{
    "name": "fw-vision-system-collectTimed",
    "description": "Collect 10 seconds of data from a subset of signals",
    "signalCatalogArn": "<Your signal catalog ARN>",
    "targetArn": "<Your vehicle ARN>",
        "signalsToCollect": [
        {
            "name": "Vehicle.Cameras.Front.Image",
            "maxSampleCount": 500,
            "minimumSamplingIntervalMs": 10
        },
        {
            "name": "Vehicle.Speed",
            "maxSampleCount": 500,
            "minimumSamplingIntervalMs": 10
        },
        {
            "name": "Vehicle.Acceleration",
            "maxSampleCount": 500,
            "minimumSamplingIntervalMs": 10
        },
        {
            "name": "Vehicle.Airbag.CollisionIntensity",
            "maxSampleCount": 500,
            "minimumSamplingIntervalMs": 10
        }
    ],
    "postTriggerCollectionDuration": 5000,
    "collectionScheme": {
        "timeBasedCollectionScheme": {
            "periodMs": 10000
        }
    },
    "dataDestinationConfigs": [
        {
            "s3Config": {
                "bucketArn": "<Your S3 bucket>",
                "dataFormat": "PARQUET",
                "storageCompressionFormat": "NONE",
                "prefix": "timeData"
            }
        }
    ]
}
aws iotfleetwise create-campaign --cli-input-json file://campaign-timed.json
  • Make sure to approve all your campaigns!
aws iotfleetwise update-campaign --name fw-vision-system-collectCollision --action APPROVE

aws iotfleetwise update-campaign --name fw-vision-system-collectTimed --action APPROVE
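Once the campaigns are approved, you can optionally confirm that they have reached the vehicle. GetVehicleStatus reports the per-campaign status as seen from the vehicle, and GetCampaign shows the overall campaign status; the exact status values you observe will depend on connectivity and the state of the Edge Agent.

aws iotfleetwise get-vehicle-status --vehicle-name FW-VSD-ROS2-<provisioned-identifier>-vehicle

aws iotfleetwise get-campaign --name fw-vision-system-collectCollision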

Step 4: View your data in Amazon S3 

AWS IoT FleetWise takes up to 15 minutes to load your data into Amazon S3. You will see three sets of files in your S3 bucket:

  • Raw data (Ion) files that contain the binary blobs of data that AWS IoT FleetWise decodes; these files can be used to deep dive into errors.
  • Unstructured data files that contain the binaries for the images/videos collected.
  • Processed data (i.e., structured data) files that contain decoded metadata, object lists, and telemetry data, with links to the corresponding unstructured data files.
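A quick way to see objects arriving is to list the campaign prefixes configured above. The folder layout beneath each prefix is managed by the service, so treat the exact paths as output to inspect rather than something to hard-code.

aws s3 ls s3://<your-bucket-name>/collisionData/ --recursive --human-readable

aws s3 ls s3://<your-bucket-name>/timeData/ --recursive --human-readable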

To do more, you can:

  • Utilize campaign ID, event ID, and vehicle ID to ‘join’ your data using AWS Glue.
  • Catalog your data using an AWS Glue Crawler to make it searchable.

  • Explore your data using ad hoc queries in Amazon Athena to identify scenes of interest (see the sketch below).
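As one illustration of such a query, the sketch below assumes that an AWS Glue crawler has already cataloged the processed data into a database named fleetwise_demo with a table named collision_data; those names, and the campaignName column used in the filter, are hypothetical and will depend on how the crawler catalogs your Parquet files. Once the query completes, you can retrieve the output with aws athena get-query-results.

# Database, table, and column names below are hypothetical; adjust them to match your Glue Data Catalog
aws athena start-query-execution \
    --query-string "SELECT * FROM collision_data WHERE campaignName = 'fw-vision-system-collectCollision' LIMIT 100" \
    --query-execution-context Database=fleetwise_demo \
    --result-configuration OutputLocation=s3://<your-bucket-name>/athena-results/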

Data from scenes of interest can then be passed to downstream tools for visualization, labeling, and re-simulation to develop the next version of models and vehicle software. For example, third-party software such as Foxglove Studio can be used to visualize what happened before and after the collision using the images stored in Amazon S3; Amazon Rekognition can be used to automatically discover and label additional objects present at the time of the collision; and Amazon SageMaker Ground Truth can be used for annotation and human-in-the-loop workflows to improve the accuracy and relevance of the collision avoidance software. In a future blog, we plan to explore options for this part of the workflow.

Conclusion 

In this post, we showcased how AWS IoT FleetWise vision system data enables you to easily collect and organize data from advanced vehicle sensor systems to assemble a holistic view of events and gain insights. The new feature expands the scope of data-driven use cases for automotive customers. We then used a sample ADAS development use case to walk through how condition-based campaigns can help improve an ADAS system, and how to access the collected data in Amazon S3.

To learn more, visit the AWS IoT FleetWise site. We look forward to your feedback and questions.

About the Authors


Akshay Tandon
is a Principal Product Manager at Amazon Web Services with the AWS IoT FleetWise team. He is passionate about everything automotive and product. He enjoys listening to customers and envisioning innovative products and services that help fulfill their needs. At Amazon, Akshay has led product initiatives in the AI/ML space with Alexa and the fleet management space with Amazon Transportation Services. He has more than 10 years of product management experience.


Matt Pollock
is a Senior Solution Architect at Amazon Web Services currently working with automotive OEMs and suppliers. Based in Austin, Texas, he has worked with customers at the interface of digital and physical systems across a diverse range of industries since 2005. When not building scalable solutions to challenging technical problems, he enjoys telling terrible jokes to his daughter.