Making Sense of the Edge Computing Hardware Landscape
Eight analysts, vendors and early edge adopters give their advice on edge computing hardware approaches.
May 29, 2019
By Matt Hamblen
Edge computing has been around for several years, yet it remains a relatively new and complex concept for many IT shops, even as the related idea of distributed computing has gone in and out of fashion over recent decades. The technology poses a number of thorny questions, many of them related to immense scale.
Implementing edge computing well can ultimately be a monumental undertaking, for mainstream enterprises and industrial IoT alike.
The edge computing hardware landscape alone is diverse, covering thousands of products from hundreds of vendors. Even so, some engineering advice from experts has begun to emerge, although standard hardware approaches remain elusive. The ideas described here were drawn from interviews with, and the writings of, eight analysts, vendors and early edge adopters.
How to Define the Edge
At the start, one question comes to mind: What, exactly, is the edge? One of the simplest definitions comes from IDC: “An intelligent edge provides a distributed compute and data persistence and network aggregation layer and it serves as the intermediary analytics of collected data.”
Some vendors argue that analytics at the edge sometimes requires much more than intermediary decision-making power. For example, an edge computing architecture for an oil rig far at sea with limited connectivity to a distant central data center might need to rival the compute power of an average data center. Compute systems on the rig would need the horsepower to monitor pressure, temperature and other environmental factors to be able to make split-second, automatic decisions to protect equipment and personnel and to keep the rig productive. In that example, edge computing hardware might be roughly equivalent to a ruggedized version of the processing, security, storage and networking capabilities of an entire data center.
At the other extreme, billions of smartphones and the 5G base stations now arriving are themselves usually considered part of the edge. At the very least, the coming proliferation of these devices is an important consideration for any long-term edge computing strategy.
The staggering list of available hardware components in the Internet of Things at the edge also includes a seemingly endless array of industrial sensors, connectivity hardware, mini-computing hardware such as Raspberry Pi, gateways, microchips, collaborative robots, self-driving vehicles, drones and unmanned aerial vehicles.
Often, major vendors such as Dell and HPE embed various components, such as sensors from third parties, into their servers or appliances. Every IT shop will need to know precisely which components will be part of its IoT infrastructure, and which connectivity protocols those components use, before it can even begin to gauge the value edge computing offers in better decisions, greater productivity and system agility.
IoT Analytics lists more than 300 vendors in the Industrial IoT category, of which 175 are directly related to hardware. The largest hardware category includes 44 connectivity hardware vendors, including some of the largest names in the tech field: Dell, Cisco, Huawei and Siemens. The firm counted 28 sensor vendors (such as ABB and Festo) and 23 microchip vendors (such as Nvidia and ARM).
There are so many categories of edge components, and so many individual devices, that most IT shops don’t even know what is connected or how many devices they have. In a poll conducted last year, ZK Research found that 61% of 841 IT professionals in North America said they had poor or low awareness of which IoT devices are connected.
“At least with a good edge strategy you’ll close the gap” on the number of connected devices, said Zeus Kerravala, an analyst at ZK Research.
Most edge deployments are “highly custom in nature because of the lack of standard approaches,” said IDC’s Ashish Nadkarni in a report on best practices for planning edge infrastructure.
Kerravala advised IT pros to start by planning for the four pillars of infrastructure: storage, security, processing and networking. “You want some kind of converged platform [with all four pillars], since you can’t buy all these pieces separately,” he said.
“I wouldn’t go fully white box, because that puts a lot of onus on the IT department,” Kerravala added. “I’d look for a turnkey platform with a lot of software flexibility on top. Try to do as much as you can in software.”
On the other hand, Nadkarni advised engineers picking edge hardware to “stay away from custom hardware.” Instead, IT managers should customize industry-standard hardware, perhaps choosing Raspberry Pi to leverage off-the-shelf Linux, Windows or another OS. As the IT world converges with the OT (operational technology) world, industry-standard hardware can be further tailored in software.
Any hardware approach can involve months of research, lab-testing and field-testing. “There’s definitely a learning curve with edge computing,” Kerravala added. “It’s definitely a different model. It’s still early days of having IT own IoT. I still get a lot of clients just asking what edge is.”
Arpit Joshipura, general manager at the Linux Foundation, added, “If you thought cloud was hard, edge is a thousand times harder. That’s because edge devices are in the thousands and the scale at which you solve the problem is different.”
Edge Case Studies Emerging
Even amid such difficulty, some edge computing case studies are beginning to emerge to show its potential benefits. They offer hints at ways to assemble infrastructure.
In 2017, Japanese industrial electronics company Daihen Corp. began deploying environmental sensors to monitor dust, moisture and temperature changes during product assembly at a facility in Osaka. Coupled with RFID technology to track each product, the company relied on edge-intelligence software from startup FogHorn Systems to match each product under assembly with environmental conditions detected in various phases and locations of assembly.
The result: a reduction of 1,800 hours a year in manual data entry, plus more accurate measurements to satisfy regulators. The expansion of the system to other factories throughout Japan should be completed this year, bringing savings of 5,000 man-hours per year.
FogHorn’s Lightning ML software relies on machine learning to detect anomalies. Data is managed at the edge, so only the data of greatest value is evaluated and forwarded, sparing back-end software from great volumes of raw data.
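FogHorn has not published its internal design, but the pattern it describes, evaluating data locally and forwarding only the valuable portion, is easy to sketch. The snippet below is a hypothetical Python illustration of that edge-filtering idea; the back-end URL, threshold and helper functions are invented for this example and are not FogHorn's API.

```python
# Illustrative sketch only, not FogHorn's API: evaluate readings at the edge
# and forward only anomalous (high-value) data to back-end software.
import json
import statistics
import urllib.request

BACKEND_URL = "https://backend.example.com/ingest"  # hypothetical endpoint

def is_anomalous(value, history, z_threshold=3.0):
    """Flag a reading that deviates sharply from recent history."""
    if len(history) < 30:
        return False  # not enough local context yet
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1e-9  # avoid divide-by-zero
    return abs(value - mean) / stdev > z_threshold

def process_reading(value, history):
    """Keep raw data local; send only anomalies upstream."""
    if is_anomalous(value, history):
        payload = json.dumps({"value": value}).encode("utf-8")
        req = urllib.request.Request(
            BACKEND_URL, data=payload,
            headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req)  # forward only the high-value event
    history.append(value)
    del history[:-500]  # cap local memory on a constrained edge device
```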
FogHorn’s solution is hardware agnostic, which means it supports everything from Raspberry Pi to various gateways to industrial PCs. As a result, “there’s no rip and replace” of existing hardware, added Keith Higgins, vice president of marketing at FogHorn.
FogHorn is also providing software for an unidentified oil refinery outside the U.S. that uses edge computing to analyze the flare, or fire, coming from a chimney and raise alarms if too much smoke or other anomalies indicate a problem with a compressor or other equipment. The approach relies on a ruggedized camera that streams video to a processor trained to recognize unusual characteristics in the flare.
In another example, Schindler Elevator is relying on FogHorn software running on Raspberry Pi devices on top of elevators to connect to existing motion sensors for predictive insights for maintenance.
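Schindler’s deployment runs FogHorn’s commercial software, but the underlying idea, a Raspberry Pi watching a motion sensor for early signs of trouble, can be illustrated with a brief hypothetical sketch. The simulated read_vibration() function and the alert threshold below are assumptions, not details of the actual system.

```python
# Hypothetical sketch of edge-side predictive maintenance on a Raspberry Pi.
# read_vibration() simulates a sensor read; the rolling-RMS threshold is an
# assumption, not Schindler's or FogHorn's actual logic.
import math
import random
import time
from collections import deque

WINDOW = deque(maxlen=200)   # most recent vibration samples
RMS_ALERT_LEVEL = 0.8        # assumed level worth flagging for inspection

def read_vibration():
    """Stand-in for an accelerometer read over I2C/GPIO (simulated here)."""
    return random.gauss(0.0, 0.3)

def monitor():
    while True:
        WINDOW.append(read_vibration())
        rms = math.sqrt(sum(x * x for x in WINDOW) / len(WINDOW))
        if rms > RMS_ALERT_LEVEL:
            print("elevated vibration: flag this elevator for maintenance")
        time.sleep(0.05)  # roughly 20 samples per second

if __name__ == "__main__":
    monitor()
```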
At Dell, the edge hardware approach is to embed third-party sensors into its own systems, with much of the early experimentation running on Raspberry Pi.
“A lot of people are experimenting in edge computing, even with ‘Pi in the sky’ to connect to the cloud,” said Jason Shepherd, Dell chief technology officer for IoT and edge computing.
He advised IT shops to pick “credible hardware for the environment, with long-term support.” “Credible” could mean rugged enough to handle environmental demands such as tropical weather or high altitude and low pressure, and sized for application and sensor growth over the next few years. Dell tries to address a highly fragmented market by developing edge hardware and software as a small set of SKUs with more features than a customer initially needs, leaving the option to expand later on, Shepherd explained.
By comparison, HPE’s edge hardware philosophy favors building a ruggedized data center close to where data is created at the edge.
“There are edge systems with capabilities similar to a laptop or desktop, but much more is needed, such as a full data center that’s ruggedized for compute, storage and networking capability,” said Tripp Partain, chief technology officer for edge and related fields at HPE. “Raspberry Pi is limited in processing at the edge…As soon as they start putting capability at the edge, people realize they need more compute power.”
With the equivalent of a full data center embedded with OT into a dedicated edge hardware platform, “you have greatly reduced complexity and near real-time decisions,” Partain added. HPE has relied on its edge computing platform to analyze high definition video in its own manufacturing of servers, improving quality and cutting down the time needed for quality assurance tasks.
HPE’s machine learning edge system is also being used by storage vendor Seagate for analyzing data from electron microscopes used to scan silicon wafers for defects.
As Dell and HPE demonstrate, there are various hardware approaches for edge computing.
MachNation analyst Josh Taubenheim suggested organizations ask three questions to help decide when to adopt edge computing. The answers can also help dictate the resiliency and suitability of hardware:
Is the edge process or device being monitored mission critical?
Would the IoT solution suffer from loss of connectivity to the cloud?
Are there regulatory requirements that mandate data be stored locally?
A “yes” answer to any of the above would dictate creating an edge deployment. The answers could also help determine the power, size and complexity of the hardware.
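Taubenheim’s checklist amounts to a simple decision rule. The sketch below is just one hypothetical way to encode it; the field names are invented for illustration and this is not a MachNation tool.

```python
# Hypothetical encoding of the three-question checklist above; the field
# names are invented for illustration, not a MachNation framework.
from dataclasses import dataclass

@dataclass
class EdgeAssessment:
    mission_critical: bool       # is the monitored process or device mission critical?
    hurt_by_cloud_outage: bool   # would losing cloud connectivity break the solution?
    local_data_mandated: bool    # do regulations require data to stay on site?

    def recommend_edge(self) -> bool:
        # A "yes" to any question points toward an edge deployment.
        return any((self.mission_critical,
                    self.hurt_by_cloud_outage,
                    self.local_data_mandated))

# Example: an elevator monitor that must keep working through a cloud outage.
print(EdgeAssessment(True, True, False).recommend_edge())  # True
```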