Four Major Trends in IoT Analytics in 2023
From enhanced digital twin tech to the rise of computer vision, here are some key trends expected in the new year
Over the years, analytics has become an integral part of IoT. Industrial organizations such as manufacturers, transportation and energy companies, and governments worldwide continue to embrace these technologies to enhance operational efficiency and achieve significant cost savings.
Advanced analytics like artificial intelligence (AI), streaming analytics and machine learning (ML), when combined with IoT technologies and sensors, can help power smart factories, grid infrastructure and even cities. But what will 2023 bring in this important area? IoT World Today spoke to Jason Mann, vice president at analytics company SAS, about the rise of this technology and the trends predicted to emerge.
The Rise of Analytics in IoT
According to Mann, four major trends will emerge in IoT analytics over the next year: the rise of low-code and no-code automated machine learning (AutoML), enhanced digital twin technologies, industrial adoption of computer vision (CV), and a blurring of the lines between edge and cloud. These trends don’t mark a departure from previous years, but rather a continuation of market trajectories following the pandemic.
Specifically, SAS predicts that in 2023 there will be greater availability of industrialized AI through low-code and no-code AutoML, with these models offered through self-service marketplaces and, potentially, enhanced with packaged services for customization and deployment.
Mann said we’ll also see more purpose-built digital-twin applications in 2023, specialized for defined use cases in the energy, infrastructure optimization and industrial manufacturing sectors. Organizations are also expected to increasingly adopt CV and other AI technologies, with adoption expanding beyond niche use cases run by IT staff and data scientists into a broader range of industries. According to Mann, CV initiatives will focus on “yield improvement, operational efficiency and safety.”
Finally, with cloud hyperscalers like Microsoft Azure, Amazon Web Services and Google Cloud Platform starting to roll out core cloud services on the edge, edge computing will become an extension of cloud computing. Workloads will be distributed intelligently across hybrid environments. This will mean quicker adoption of IoT analytics at the edge in 2023 to enhance decision making at the source.
Low-Code, No-Code
“We’ll continue to see the expanded adoption of IoT initiatives across industries,” Mann said. “There’s been momentum in this area for quite some time. If you look back three or four years, there was a real focus on the idea of a proof of concept (POC), but now our customers are transitioning from these POCs to something more sustainable and long-term.
“Within the last year, we’ve seen organizations wanting to test IoT and analytics projects and prove that they can continue to generate value. This isn’t necessarily a switch we’re going to see from last year to next, but a move from narrow POCs to wider adoption as customers start to see significant return on their projects.”
It now seems hard to imagine a time when analytics was not an integral part of every IoT use case; however, there has been a gradual shift over several years as the surrounding systems became easier to understand and more widely deployed. The rise of low-code, no-code analytics has been a primary driver of that growing accessibility.
“The big goal with low-code, no-code analytics is for anyone to be able to transform data into insights,” said Mann. “Low-code, no-code environments are opening up adoption to companies that don’t have substantial data scientist skill sets, and manufacturing is one of the industries that has really adopted IoT and analytics. Analytics and data are no longer the realm of just white-collar workers and blue-collar workers; they’re starting to be used by people all across the supply chain.”
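To make that idea concrete, the sketch below shows roughly what a low-code or no-code AutoML tool automates behind its interface: trying several candidate models on the same data and keeping the best performer. It uses scikit-learn as a stand-in, and the dataset and model list are purely illustrative placeholders rather than any vendor's actual implementation.

```python
# Minimal sketch of what a no-code AutoML workflow automates behind the scenes:
# try several candidate models on the same sensor dataset and keep the best one.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Stand-in for labelled sensor readings (e.g. "normal" vs. "faulty" machine state).
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Score each candidate with cross-validation and pick the best performer,
# which is essentially what an AutoML tool does before packaging a model.
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(f"Best model: {best} (accuracy {scores[best]:.3f})")
```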
Digital Twins
The proliferation of sensors also means it’s becoming increasingly simple to represent systems in a digital environment, leading to the next predicted trend of enhanced digital twin technologies.
“Once you’re able to accurately replicate a real-world system in a digital world, you can start to play with variables with the goal of optimizing the physical elements without impacting day-to-day operations,” said Mann. “Now you can start to create a digital twin of your infrastructure and move those levers to anticipate issues with any part of the supply chain, and you can put measures in place to address them before they happen.”
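As a rough illustration of “moving those levers” digitally, the toy sketch below models a single machine’s temperature with a simple first-order heating and cooling rule, then tests different cooling settings without touching real equipment. Every parameter and threshold here is invented for illustration; a real digital twin would be calibrated against live sensor data.

```python
# Toy "digital twin" of a single machine's temperature, assuming a simple
# first-order heating/cooling model. All numbers are illustrative.

def simulate_temperature(load, cooling_rate, ambient=22.0, steps=120):
    """Step the thermal model forward and return the peak temperature reached."""
    temp = ambient
    peak = temp
    for _ in range(steps):
        heating = 4.0 * load                       # heat added by the workload
        cooling = cooling_rate * (temp - ambient)  # heat removed by cooling
        temp += heating - cooling
        peak = max(peak, temp)
    return peak

# "Move the levers" digitally: test cooling settings before touching the real line.
for cooling_rate in (0.05, 0.10, 0.20):
    peak = simulate_temperature(load=1.0, cooling_rate=cooling_rate)
    flag = "OVERHEAT RISK" if peak > 75 else "ok"
    print(f"cooling_rate={cooling_rate:.2f} -> peak {peak:.1f}°C ({flag})")
```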
“The use of analytics expanded into IoT seven or eight years ago,” said Mann. “It was really more about the expansion of the ecosystem rather than a complete switch. Most analytic procedures used to involve gaining access to massive amounts of data, moving it through a network, and getting it into a consistent environment. Then there was the process of creating algorithms that looked at that data and generated insight which was distributed for consumption. Seven or eight years ago, changes in sensor technology remade the landscape. Cheaper and more powerful sensors became widespread, and their deployment helped take decision-making to the point of the origin of data – at the edge, at the sensor, with streaming data in real time, using powerful analytics.”
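A minimal sketch of what that streaming, edge-resident decision-making can look like is below: each sensor reading is scored against a small rolling window as it arrives, so anomalies are flagged at the source rather than after the raw data has been shipped to a central environment. The readings, window size and threshold are illustrative.

```python
# Score each reading as it arrives instead of shipping raw data to a central
# environment first. Thresholds and the simulated stream are illustrative.
from collections import deque
import random
import statistics

window = deque(maxlen=50)  # small rolling window that fits on an edge device

def looks_anomalous(value, threshold=3.0):
    """Return True if the reading deviates strongly from the recent window."""
    window.append(value)
    if len(window) < 10:
        return False  # not enough history yet
    mean = statistics.fmean(window)
    stdev = statistics.pstdev(window) or 1e-9
    return abs(value - mean) / stdev > threshold

random.seed(1)
stream = [random.gauss(50, 2) for _ in range(200)] + [80.0]  # spike at the end
for i, reading in enumerate(stream):
    if looks_anomalous(reading):
        print(f"reading {i}: {reading:.1f} flagged at the edge")
```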
Industrial Adoption of Computer Vision
“A lot of people think about CV as object detection,” said Mann. “But this is an area where we’re seeing a lot of growth and it has a broad range of applications. You can use it to identify an area that requires monitoring and set up alerts to warn operators that something has occurred and, over time, identify problem areas that they can correct through training.”
A huge benefit of this tech is, of course, predictive maintenance, allowing operators to identify and address particularly accident- or problem-prone areas, though Mann stresses that this is only the tip of the iceberg in terms of use cases.
“We’re often seeing broader applications than just predictive maintenance,” Mann said. “Oftentimes it’s real-time operational defect detection. The big benefit of CV is it’s usually not a displacement technology. You don’t need to deploy vast numbers of sensors or make changes to the system or equipment; it can be as simple as deploying cameras. It’s a low-impact measure that can vastly improve the quality of predictive maintenance or safety, and I think that’s why it’s starting to gain good adoption.”
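As a simplified illustration of camera-only defect detection, the sketch below thresholds a synthetic grayscale frame and counts connected dark regions as candidate defects using OpenCV (4.x API assumed). A production system would rely on a trained model rather than plain thresholding; the frame and threshold values here are made up.

```python
# Camera-plus-analysis sketch: no new sensors on the line, just image processing.
import numpy as np
import cv2

# Synthetic grayscale frame of a "part": uniform surface with two dark scratches.
frame = np.full((200, 200), 180, dtype=np.uint8)
frame[50:55, 30:120] = 40    # simulated defect 1
frame[140:160, 90:95] = 40   # simulated defect 2

# Flag pixels that deviate strongly from the expected surface brightness,
# then group them into connected regions (candidate defects).
_, mask = cv2.threshold(frame, 120, 255, cv2.THRESH_BINARY_INV)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

defects = [c for c in contours if cv2.contourArea(c) > 20]  # ignore tiny specks
print(f"{len(defects)} candidate defect(s) found")  # could trigger an operator alert
```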
Blurring the Line Between Edge and Cloud
“There used to be a clear line between on-premise or in-cloud computing, and edge computing,” Mann said. “The edge was the domain of networking companies providing distributed devices that lived outside the cloud. Within the last 12 to 18 months, there has been accelerated movement toward edge computing on cloud infrastructure as organizations move edge analytics, and the resulting decision making, closer and closer to the source of the data.”
This movement of analytics from the cloud toward the edge has driven the emergence of hybrid environments.
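One common pattern in such hybrid environments is to make decisions at the edge and forward only alerts and summaries to the cloud rather than raw telemetry. The sketch below illustrates that split; send_to_cloud is a hypothetical placeholder standing in for whatever ingestion service (MQTT broker, HTTPS endpoint) an organization actually uses.

```python
# Illustrative split of work between edge and cloud: the edge device scores raw
# readings locally and forwards only flagged events plus a periodic rollup.
import json
import random

def send_to_cloud(event: dict) -> None:
    # Hypothetical placeholder: a real deployment would publish to the cloud
    # provider's IoT ingestion service instead of printing.
    print("forwarding to cloud:", json.dumps(event))

def run_edge_loop(readings, limit=60.0):
    summary = {"count": 0, "max": float("-inf")}
    for value in readings:
        summary["count"] += 1
        summary["max"] = max(summary["max"], value)
        if value > limit:  # decision made at the edge, at the source of the data
            send_to_cloud({"type": "alert", "value": round(value, 1)})
    send_to_cloud({"type": "summary", **summary})  # rollup instead of raw data

random.seed(0)
run_edge_loop([random.gauss(50, 4) for _ in range(100)])
```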
“The goal isn’t to be a displacement technology, but a value-added capability,” Mann added. “And all of our methods for presentation or consumption are based on that premise.”
“I think a consistent line across all of the projects that we’re seeing is the benefit of scoping the problem, being able to target it to a specific outcome,” said Mann. “That’s where we’re seeing companies excel in the use of analytic techniques, not just machine learning or IoT. But I think that’s a direction for all to achieve the greatest success in the smallest amount of time.”