In the News: Universal Translator – Harnessing Sensor Data to Build Better Automotive Software
This article originally appeared on Automotive News.
“If you go into an entry-class car, you probably end up with 20 to 50 sensors that use semiconductors,” Frank Findeis, vice president for automotive sensors at German semiconductor maker Infineon, told Automotive News in August 2020. “If you go to a high-end car, you easily end up with 100.”
“If you start looking into the car where the sensors are hidden it’s amazing, really, the number of applications is going through the roof,” Findeis added.
A recent study by consulting firm McKinsey estimated that the overall sensor market will grow by 8 percent a year through 2030, making it by far the fastest-growing segment in automotive componentry. That is despite the firm’s projection that the market for internal-combustion-engine sensors will decline overall; it’s body, chassis, and most of all driving-assistance sensors that are booming.
The upside is easy to see. The vast amounts of information those sensors produce present the opportunity to build safer, better, and more functional vehicles, as well as providing more amenities to drivers. Advanced driver-assistance systems (ADAS), mobility use cases, and autonomous vehicles are the most obvious long-term trends enabled by sensor data.
The challenges, however, scale with the opportunities.
There are already industry coding standards and guidelines, but there’s no universal way of easily accessing sensor data or standardized interfaces for using it. Vehicle and sensor data is often in proprietary formats, and software applications can be specific to a single vehicle or brand, with little opportunity to spread development costs or create multi-use products.
Then, of course, there’s the security factor, which dominates the automotive software landscape.
“Security concerns are a longstanding hurdle,” says Brett Francis, a principal product solutions architect at Amazon Web Services (AWS). “Vehicle safety cannot be compromised, but not being able to access certain types of data is a real point of friction in trying to create the products automakers see ahead of them.”
The disparate systems and standards between automakers also make it difficult for third-party innovators to develop automotive applications quickly and cheaply. Those differences are a high bar to clear even for internal OEM teams, Francis says. “Vehicles have thousands of different parts from different suppliers and thousands of different software components inside those parts, many of which are proprietary and specific per model.”
In looking to overcome these difficulties, “customers kept asking us how they could take the agility they have with our cloud computing solutions and apply it to in-vehicle software development,” Francis says, “because that’s where the real challenges are in this new environment.”
“After about the fourth or fifth customer approached us,” Francis says, “we got to wondering: What if there was a way of creating a multilayered environment where the safety systems in the vehicle remain safe, but access to the data from those systems could be made available to developers in another layer of software systems?
“That other layer could be where the agility innovation takes place, another way of unlocking some of the challenges there,” he adds. That was in the middle of 2019.
In the run-up to AWS’ 2019 re:Invent conference (an annual summit that showcases technologies related to cloud computing), the company teamed with BlackBerry to use that company’s QNX software platform to connect a car, a Karma Revero, with AWS’ IoT infrastructure. The idea previewed monitoring systems that both companies hope will make cars more of a presence in consumers’ connected-technology lives.
In the discussions surrounding that project, it turned out BlackBerry was hearing many of the same requests from its customers for a way to reduce the complexity, cost, and time commitments of automotive software development.
The two companies, BlackBerry the top vehicle software security provider and AWS the top automotive cloud platform, partnered at the end of 2019 to create exactly the multilayered environment Francis’ fellow developers had envisioned.
The jointly developed system is called BlackBerry IVY. It uses BlackBerry’s QNX data abstraction and AWS’ edge and cloud computing services, and operates within a vehicle’s embedded systems.
The ultimate goal? An API (application programming interface) that allows developers to create contextually aware in-car experiences without needing to worry about special automotive programming languages, hardware variations, proprietary sensor data formats, or accessing safety critical systems.
Developers can also build their applications with live in-car data and achieve better economies of scale by deploying it across multiple vehicle brands, makes, and models. “What we’re bringing is the consistency of the interaction model so that a developer in one environment will find it easier and more familiar to develop in another,” Francis says.
The setup also allows developers to harness data in ways they may not have been able to before, combining real-time information from different sensors and using machine learning to glean new insights.
The synthetic sensor
IVY essentially piggybacks onto a vehicle’s existing systems. “Automakers have fine-grained control over the data developers can access,” explains Michael Garcia, lead solutions architect on the IVY team.
Sensors that may be part of vehicle control and safety functions funnel distinct sets of information into IVY. That data can then be used by machine learning models, either in the vehicle or in the cloud, to help build applications that inform the vehicle experience. By combining data from multiple sensors, “synthetic sensors” can generate new uses and new insights, analyzing sets of data beyond their original purpose.
“Imagine you have two sensors. One is a weight sensor and the other is an inward facing camera,” says Garcia. “With those two sensors, you can then get the raw information from the hardware and make it available to a piece of code. That code can take the data and answer the question ‘Is there an occupant in the car?’”
With machine learning, Garcia says, it might be possible to determine the age of the occupant or other characteristics from the camera data, locally in the vehicle, but at a base level the job of the code is being an occupant detector. “That’s what we call a synthetic sensor. It takes previously raw data and processes it into a locally valuable insight.”
“So now that I have an occupant detector running in the vehicle in IVY I can have applications elsewhere in the car or communicating with systems outside of the car,” he adds.
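The occupant-detector idea Garcia describes can be sketched in a few lines. This is an illustrative sketch only: the field names, threshold, and functions below are invented for this article and are not the BlackBerry IVY SDK.

```python
from dataclasses import dataclass

# Hypothetical raw inputs a synthetic sensor might fuse; the names and
# threshold are assumptions made for illustration.
@dataclass
class RawReadings:
    seat_weight_kg: float      # from a seat weight sensor
    camera_sees_person: bool   # from an inward-facing camera's vision model

def occupant_detected(readings: RawReadings, min_weight_kg: float = 20.0) -> bool:
    """Fuse both signals: either one alone can misfire (heavy luggage,
    a camera false positive), so require both before reporting an occupant."""
    return readings.seat_weight_kg >= min_weight_kg and readings.camera_sees_person

# A person in the seat triggers the detector; heavy luggage alone does not.
print(occupant_detected(RawReadings(seat_weight_kg=65.0, camera_sees_person=True)))   # True
print(occupant_detected(RawReadings(seat_weight_kg=22.0, camera_sees_person=False)))  # False
```

The point of the pattern is that application code downstream only ever sees the higher-level insight (“occupant present”), never the raw hardware feeds.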
“You could, for example, have a teenager detector. The application could report up to the cloud and send an alert to a parent when there are teenagers in the car,” he adds. “In California, teenagers are only supposed to drive by themselves or with a parent or guardian.” Such an application could be deployed to multiple automakers’ vehicles after a period of vetting, much like an iPhone app.
This is a hypothetical example, Garcia stresses, but it’s the kind of functionality that’s only possible when you’re able to easily harness the data in real time in the vehicle itself. That’s something that has historically been hard to do without a common platform like IVY.
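Garcia’s teenager-detector scenario follows the same pattern one layer up: an application consumes a synthetic sensor’s insight and, when the condition fires, publishes an alert toward a cloud endpoint. A minimal sketch, assuming a generic publish callback; the topic name and payload shape here are invented for illustration:

```python
import json
from typing import Callable

def alert_on_teen_driver(teen_driving_alone: bool,
                         vehicle_id: str,
                         publish: Callable[[str, str], None]) -> bool:
    """Publish a parent alert when the (hypothetical) teen-driver synthetic
    sensor fires. Returns True if an alert was sent."""
    if not teen_driving_alone:
        return False
    payload = json.dumps({"vehicle": vehicle_id, "event": "teen_driving_alone"})
    publish("alerts/teen-driver", payload)
    return True

# In a real deployment, publish() would hand the message to whatever cloud
# endpoint the automaker chooses; a list stands in here to show the flow.
sent = []
alert_on_teen_driver(True, "VIN-0001", lambda topic, msg: sent.append((topic, msg)))
```

Keeping the transport behind a callback mirrors the article’s point that IVY is agnostic about which cloud platform the automaker routes data to.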
BlackBerry’s technology keeps the data secure, while AWS’ cloud services make it easier to monetize incoming data and to integrate vehicle data with other back-end systems within Tier 1 suppliers and OEMs. That’s useful for insights that don’t need to be generated within the vehicle, such as training machine learning models, which can then be deployed back into vehicles.
The system is also useful for deploying applications and insights to a vehicle. “Most automakers have a cloud vehicle platform that they’re hosting somewhere and exercise very tight control over what goes into the vehicle,” Garcia says. Because it runs parallel to proprietary systems, IVY is agnostic about how it interacts with those platforms. “Automakers can send data to any endpoint of their choosing.”
The deployment tools are relatively easy to use, too. Automakers can update their software from the platform’s Cloud Console (a web interface for managing IVY) to continuously improve functionality.
BlackBerry/QNX and AWS began working with some customers in early 2020, and many of the first wave of projects are still coming online, Francis says, but the possibilities the technology unlocks are very broad.
Applications created via IVY could provide automakers with feedback on how and when safety features such as adaptive cruise control, traction control, or lane-keeping assist are used, allowing them to make targeted investments to improve vehicle performance. Drivers of electric vehicles could choose to share their car’s battery information with third-party charging networks to reserve charging stations according to the driver’s current location and travel plans.
There are implications for dealership service and vehicle manufacturing as well. “For dealers, a car might now be able to offer its most recent batch of operational history when it arrives on the service lane.” Similarly, Francis adds, manufacturers could use IVY applications to perform digital pre-delivery inspections or test systems within a vehicle for any defects before it ever leaves the factory.
IVY, Francis and Garcia say, will make it easier for automakers to collaborate with a wider pool of developers to accelerate creation of new applications and may reduce development cycle times.
“There’s actually two levels of acceleration,” says Garcia, “The first one is that developers will be able to have access to sensor data from the vehicle sensors quite rapidly because they’ll have tooling to do so and an SDK (software development kit).”
“The second level of acceleration is from the automakers’ perspective,” he adds. “If you’ve developed a synthetic sensor based on sensor data, you can now reuse it on other vehicle platforms or from model year to model year. For people creating the applications, consuming those insights, you don’t have to redo the development for every new vehicle.” That could save time and code complexity, Garcia adds.
Francis says that the first OEM projects are coming online soon, but early validation of the capabilities has already opened up broad-reaching applications. While it’s still early days, customer excitement points to accelerating development times and a broad scope of applications being created.
To learn more about BlackBerry IVY, click here.