AWS Architecture Monthly

Customer Conversations

What are the main barriers for businesses adopting IoT?

The Internet of Things (IoT) has expanded. The term now applies to many household and industrial devices that communicate with each other to streamline processes. The main challenges of the IoT industry currently relate to security, cost of implementation, connectivity, complexity, and regulatory standards.

Security. As more devices are connected, the risk of malware increases. Suppliers rush new connected devices to market, prioritizing functionality over security. The good news is that proven technologies like end-to-end encryption and token-based authentication, which are well suited for IoT applications, help address the problem.
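As an illustration of the token-based authentication mentioned above, the sketch below issues and verifies a short-lived, HMAC-signed device token. The device ID, secret handling, and token format are simplified assumptions for the sketch, not a production design:

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical per-device secret, provisioned securely at manufacture time.
DEVICE_SECRET = b"per-device-secret"

def issue_token(device_id: str, ttl_seconds: int = 300) -> str:
    """Issue a short-lived token: base64 claims plus an HMAC-SHA256 signature."""
    claims = {"device_id": device_id, "exp": int(time.time()) + ttl_seconds}
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode())
    signature = hmac.new(DEVICE_SECRET, payload, hashlib.sha256).hexdigest()
    return f"{payload.decode()}.{signature}"

def verify_token(token: str) -> bool:
    """Reject tokens with a bad signature or an expired 'exp' claim."""
    payload, _, signature = token.rpartition(".")
    expected = hmac.new(DEVICE_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return False
    claims = json.loads(base64.urlsafe_b64decode(payload))
    return claims["exp"] > time.time()
```

Because the token expires on its own, a stolen token has a limited window of use, which is one reason the pattern suits constrained devices.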

Cost. Sensors and actuators in IoT projects are not usually expensive, but monitoring complex environments can involve thousands of devices. Low-Power Wide-Area Networks (LPWAN) can bring down the cost of devices and connectivity. However, some scenarios require higher-bandwidth connectivity, such as cellular, Wi-Fi, or wired internet, which can increase the cost of a project.

Connectivity. The future of IoT must rely on decentralizing IoT networks by moving functionality to the edge, such as using fog computing models. The variety of platforms makes it difficult to find a foundational layer of connectivity. However, protocols such as MQTT, CoAP, XMPP, and OPC-UA are a good sign of standards on the rise in this industry.
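To make the standards point concrete: MQTT, for example, organizes data into slash-delimited topic hierarchies, with `+` matching one level and `#` matching all remaining levels in a subscription filter. The matcher below is a minimal illustration of those wildcard rules from the MQTT specification, not a client implementation:

```python
def topic_matches(topic_filter: str, topic: str) -> bool:
    """Check whether an MQTT topic matches a subscription filter.
    '+' matches exactly one level; '#' matches any remaining levels."""
    f_parts = topic_filter.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":          # multi-level wildcard: match everything below
            return True
        if i >= len(t_parts): # filter is longer than the topic
            return False
        if f != "+" and f != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)
```

A gateway subscribing to `plant1/+/temperature` would receive temperature readings from every device on that site, which is the kind of loose coupling that makes these protocols a workable foundational layer.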

Complexity. Understanding the benefits of concepts like predictive maintenance is straightforward, but engineering a way to accomplish that objective is not. That’s because an IoT system often consists of a wide array of components, ranging from security to analytics. It is difficult to establish clear workflows for product development because many companies lack experience implementing IoT technologies.

Regulatory standards. The legal issues involved in IoT projects include information flows across borders, conflicts between surveillance devices and customer privacy, data retention policies, and security breaches. The pace of IoT development frequently outstrips the regulatory environment, creating a perception of unfair or deceptive practices toward consumers.

Which use cases are more approachable as businesses lean further into edge technologies?

Our experience with customers outlines some common use cases in industrial IoT projects, such as affordable near real-time device monitoring with proactive alerts and predictive maintenance. There is also a demand for a powerful central IoT repository for storing asset data, performing metrics calculations, and extracting insights from sensor data.

With these scenarios in mind, we developed Neuron, our SaaS solution implemented as a collection of serverless microservices. Neuron relies on AWS IoT Core and AWS IoT SiteWise. Neuron Edge, running on AWS IoT Greengrass, is the component performing intensive edge computing tasks. These tasks include data cleansing, data aggregation, delta upload to the AWS IoT SiteWise repository, alarm evaluation with AWS IoT Events, field protocol implementation, and many others. This component is the cornerstone of Neuron's low-cost pricing model. It reduces data trips to the cloud and implements advanced functionality at the edge, such as sensor and actuator management, or real-time alarms.
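The delta-upload idea can be sketched as a simple deadband filter: a reading is forwarded to the cloud only when it drifts from the last uploaded value by more than a tolerance. This is an illustrative sketch of the general technique, not Neuron's actual implementation:

```python
class DeadbandFilter:
    """Forward a reading only when it moves beyond a fixed deadband
    from the last uploaded value, cutting data trips to the cloud."""

    def __init__(self, deadband: float):
        self.deadband = deadband   # tolerance in the sensor's own units
        self.last_sent = None      # no baseline until the first upload

    def should_upload(self, value: float) -> bool:
        if self.last_sent is None or abs(value - self.last_sent) > self.deadband:
            self.last_sent = value  # new baseline for future comparisons
            return True
        return False
```

For a slowly drifting signal such as temperature, a filter like this can suppress the vast majority of samples while still capturing every meaningful change, which is where the cost savings come from.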

What are the general architecture pattern trends for IoT at the edge?

Edge computing is more decentralized and distributed than traditional cloud computing because of IoT’s inherent mobility requirements. That is the source of a new set of reference architectures that take a layered approach to decentralizing edge computing.

These architectures usually have three distinct layers: device, edge, and cloud. Let’s focus on the edge layer, which is responsible for:

  • Receiving, processing, and forwarding data from the device layer
  • Providing local services such as edge security
  • Cleansing, preprocessing, and analyzing data at the edge
  • Optimizing IoT processes
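A minimal sketch of the cleansing and preprocessing responsibility, assuming numeric sensor readings with a known valid range (the thresholds and window size are illustrative assumptions):

```python
from statistics import mean

def cleanse_and_aggregate(readings, low, high, window=5):
    """Drop missing or out-of-range samples, then reduce the stream to
    fixed-size window averages before forwarding it up the stack."""
    valid = [r for r in readings if r is not None and low <= r <= high]
    return [mean(valid[i:i + window]) for i in range(0, len(valid), window)]
```

Discarding a stuck-sensor spike like `999` at the edge, before aggregation, is exactly the kind of work that keeps bad data out of the cloud repository.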

The edge layer can in turn be divided into three sublayers, or levels of responsibility:

Low: Contains edge controllers that collect data from the devices, perform preliminary data thresholding, and implement control flow down to the devices. It supports a wide array of communication protocols and interfaces. Operational instructions and data-driven decisions from the upper levels are transmitted to the devices through this level.
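The control flow back down to a device can be as simple as a hysteresis rule, which keeps an actuator from chattering on and off around a single threshold. The setpoints below are illustrative assumptions, not values from any particular deployment:

```python
def control_signal(temp: float, cooling_on: bool,
                   high: float = 80.0, low: float = 70.0) -> bool:
    """Decide whether cooling should run. Hysteresis: switch on above
    'high', off below 'low', and hold the current state in between."""
    if temp > high:
        return True
    if temp < low:
        return False
    return cooling_on  # inside the band: keep the current state
```

An edge controller can evaluate a rule like this locally in microseconds, which is why preliminary thresholding belongs at the lowest level rather than in the cloud.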

Medium: Contains edge gateways and is responsible for exchanging data with the low and high levels. It has more storage and computing resources compared to the low level. Data and intelligence derived from the data processing can be cached locally to support future processing.

High: Contains powerful edge servers responsible for performing more complex and critical data processing. It makes decisions based on the data collected from the medium level. It processes bulk data by using more complex machine learning algorithms and analyzes data from different equipment to achieve process optimization, usually with longer latency.

The real value of these architectural patterns lies in the swift development of actionable decisions. This can be achieved by uncovering intelligence and insights from data generated at the edge.

Do you see different trends in IoT at the edge versus IoT in the cloud?

As the volume of data to be processed explodes, alternatives are needed to the traditional IoT model of sending all data to the cloud for processing. Edge computing brings data processing as close to an IoT device as possible. That can yield latency, performance, cost, and security advantages for companies. Rather than sending data to be processed on cloud servers, the computation takes place on the device or in the local network itself, reducing the time and resources spent moving data.

One current trend is combining edge processing power and data storage to enable analytics and AI at the edge. These processes require very fast response times or involve processing large real-time datasets that are impractical to send to the cloud. Beyond performance and latency advantages, this can also be the most economical architectural choice. As much of this data may be ephemeral, a round trip to the cloud may not actually create any value.

A significant benefit of this new model is that it allows companies to have the best of both worlds. You can sense, capture, and analyze massive amounts of data at the point of origin. At the same time, you can obtain global visibility, management, and deeper analysis in the cloud. Organizations using a hybrid cloud strategy and edge computing in tandem will gain greater flexibility and consistency.

What’s your outlook for IoT, and what role will the edge technologies play in future development efforts?

I am a strong advocate of IoT as a key enabler of a new era of distributed intelligence, with an increasing number of connected devices. Putting compute power closer to data sources at the edge is the most effective way to manage the volume of data now being generated. Edge computing will also play a pivotal role in enabling businesses to realize the advantages of emerging technology, like 5G and AI.

IoT will bring digital and physical technologies together. This has great potential to transform industrial processes due to the amounts of raw data generated by machines. Some examples are condition-based monitoring and predictive maintenance.

As the telco industry prepares for 5G, it is also important to consider the fundamental role that the edge will play in delivering those services. The high-bandwidth and low-latency capabilities of 5G make the shorter distance between the device and the edge even more efficient. This allows enterprises to capitalize on massive amounts of data. The volume and complexity of 5G services will surge at the edge. Pushing computing closer to data acquisition is the only way to achieve the ultralow latency outcomes enabled by 5G.


Jaime González
Chief Technology Officer, Pentasoft
Jaime González is the Chief Technology Officer at Pentasoft. He began his career at IBERIA airlines (IAG Group), developing software modules for airport ground operations. In 2005, he co-founded Pentasoft, a software development company specializing in serverless SaaS solutions on AWS. He is passionate about serverless architectures and software design patterns based on microservices.
 
Jaime has a Bachelor’s Degree in Computer Science from the Universidad Politécnica de Madrid, with further studies in networking and data science.