Sensor Fusion Technology Brings Data-Rich Environments to Fruition
Sensor fusion – the confluence of environmental IoT data from multiple, disparate sources – is gaining momentum in various industries. Here’s what the future looks like.
February 5, 2020
By Scott Robinson
As we collect IoT data from a variety of sources and bring it together, the whole becomes greater than the sum of its parts: a rich, data-driven picture of the environment around us.
Sensor fusion technology, as it’s known, brings together data from multiple sensors of different types to present a rich portrait of an environment. Sensors can measure a wide range of conditions, from light to motion and beyond – and the intelligent system can be practically anything: vehicles, robots, buildings and drones all come to mind as sensor fusion candidates.
A core technology in sensor fusion is lidar – a light-based alternative to radar. A lidar sensor fires laser pulses at a target (up to 900,000 per second) and gathers the reflections to map the surrounding environment.
Since such sensors can be added to systems that include GPS, they not only produce maps but also offer real-time navigation, enabling robotic applications. Autonomous vehicles are obvious – but just the tip of the iceberg.
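To make this concrete, here is a minimal sketch, in Python, of how a single lidar return might be combined with a GPS-derived vehicle pose to place an obstacle on a global map. The flat two-dimensional model and all names here are illustrative assumptions, not a real lidar or GPS API:

```python
import math

def lidar_return_to_map(vehicle_x_m, vehicle_y_m, heading_rad,
                        beam_angle_rad, range_m):
    """Place one lidar return on a global 2-D map.

    vehicle_x_m, vehicle_y_m, heading_rad: pose from GPS/IMU (assumed given).
    beam_angle_rad: beam direction relative to the vehicle's heading.
    range_m: distance computed from the laser pulse's time of flight.
    """
    theta = heading_rad + beam_angle_rad          # beam angle in the map frame
    obstacle_x = vehicle_x_m + range_m * math.cos(theta)
    obstacle_y = vehicle_y_m + range_m * math.sin(theta)
    return obstacle_x, obstacle_y

# Vehicle at (100 m, 50 m) heading along +x; a return 20 m out, 30 degrees left.
print(lidar_return_to_map(100.0, 50.0, 0.0, math.radians(30), 20.0))
```

Real systems repeat this for hundreds of thousands of returns per second, in three dimensions – which is what turns raw reflections into a navigable map.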
Lidar has been used to study ancient ruins beneath jungle foliage, examine air pollution in cities, improve fertilizer distribution on farmland, track customer movements in retail stores and map the surfaces of other celestial bodies in the solar system.
The Sensor Fusion Technology Market
There’s more to sensor fusion than lidar. GPS is a sensor fusion system component, along with an array of other devices including accelerometers, cameras (infrared and imaging), microphones, Bluetooth beacons and pressure meters. These sensors handle data differently – each has its own data format, transfer rate and (in some cases) encryption.
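Fusion software therefore typically normalizes these heterogeneous inputs into one internal record before combining them. A minimal sketch, with hypothetical field names and a made-up native format:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    """One normalized sensor record; all fields are illustrative."""
    sensor_id: str
    modality: str       # e.g. "lidar", "camera", "accelerometer"
    timestamp_s: float  # seconds on a common clock
    value: object       # payload, decoded from the sensor's native format

def from_accelerometer(raw: dict) -> Reading:
    # Hypothetical native format: {"id": ..., "t_ms": ..., "xyz": [...]}
    return Reading(sensor_id=raw["id"],
                   modality="accelerometer",
                   timestamp_s=raw["t_ms"] / 1000.0,
                   value=tuple(raw["xyz"]))

print(from_accelerometer({"id": "acc-7", "t_ms": 1580886000123,
                          "xyz": [0.02, -0.01, 9.81]}))
```

One such adapter per device type lets the rest of the pipeline work with a single message shape, regardless of each sensor’s format or transfer rate.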
The U.S. sensor fusion market is already a $2.25 billion industry and will pass $9 billion by 2025, according to MarketWatch, growing at an 18.73% CAGR, according to BusinessWire. Companies such as NXP Semiconductors, Kionix, STMicroelectronics, Bosch Sensortec, Invensense, Hillcrest Labs, Renesas Electronics and others occupy top niches in this rapidly growing arena.
Fused sensors enable hundreds of applications, from self-driving vehicles (land, sea and air) and surgical robots to environmental management and wide-area security systems. The software underlying them integrates data from dozens, even hundreds, of devices, assembling an often real-time map of complex environments that is more accurate than anything a single sensory mode could provide.
Sensor Fusion Technology: Making Good Things Better
Sensor fusion arrays have already taken many applications and automation systems to new levels, for several reasons. When more than one sensory modality is applied to an environment, one catches what another misses – the signal-to-noise ratio increases, improving the system’s performance in that environment.
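The statistical intuition can be shown with two noisy measurements of the same quantity: weighting each by the inverse of its noise variance produces a fused estimate whose variance is lower than either input’s. A sketch with illustrative numbers:

```python
def fuse(m1, var1, m2, var2):
    """Inverse-variance weighted average of two independent measurements."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * m1 + w2 * m2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)  # always below min(var1, var2)
    return fused, fused_var

# A camera estimates the range at 10.2 m (variance 0.5);
# radar says 9.9 m (variance 0.2). Fused: ~9.99 m with variance ~0.14.
print(fuse(10.2, 0.5, 9.9, 0.2))
```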
Sensor fusion arrays also have high fault tolerance. When a multi-sensory system scans an environment (say, a secure location), a failure in one modality – the cameras, for instance – won’t crash the system, because the motion sensors and other modalities remain active. Finally, multiple sensory modalities validate one another: the various sensors cross-check each other’s readings.
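That cross-checking can be as simple as flagging any sensor whose reading strays from the consensus of the others. A toy sketch with a made-up tolerance value:

```python
from statistics import median

def flag_outliers(readings: dict, tolerance: float) -> list:
    """Return IDs of sensors that deviate from the median of all readings
    by more than `tolerance` -- a crude mutual-validation check."""
    consensus = median(readings.values())
    return [sensor_id for sensor_id, value in readings.items()
            if abs(value - consensus) > tolerance]

# Three motion sensors agree; the camera-based estimate has drifted.
print(flag_outliers({"pir-1": 0.00, "pir-2": 0.10, "radar-1": 0.05,
                     "cam-1": 3.20}, tolerance=1.0))  # -> ['cam-1']
```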
These capabilities give sensor fusion systems superior performance, robustness and efficiency. In transportation, where the technology is most prominent, this means combining visual sensors, radar and lidar for object detection and data integration, producing uniform coordinates for objects in dynamic surroundings from media that stream data at different rates. The result is faster, more accurate object detection.
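Producing those uniform coordinates means resampling each stream onto a common timeline, since radar, lidar and cameras report at different rates. One common approach is linear interpolation at a shared timestamp; a sketch with invented sample data:

```python
def interpolate(stream, t):
    """Linearly interpolate a sorted (timestamp, value) stream at time t.
    Assumes t lies within the stream's span."""
    for (t0, v0), (t1, v1) in zip(stream, stream[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return v0 + alpha * (v1 - v0)
    raise ValueError("t outside stream range")

# Radar at 20 Hz, lidar at 10 Hz: sample both at the camera frame's timestamp.
radar = [(0.00, 12.0), (0.05, 11.8), (0.10, 11.6)]   # (seconds, range in m)
lidar = [(0.00, 12.1), (0.10, 11.7)]
t_camera_frame = 0.07
print(interpolate(radar, t_camera_frame), interpolate(lidar, t_camera_frame))
```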
Hospitals are sensor fusion havens, boasting the highest concentration of IoT devices per square foot of any public environment. Sensor fusion allows this IoT ocean to be organized for far greater efficiency, optimizing both environmental and human resources while improving outcomes.
A smart city may be the richest sensory environment of all, fusing cameras, lidar, Bluetooth beacons, motion sensors, RFID and dozens of other devices to optimize public spaces, monitor pollution, smooth traffic flows and improve public safety.
Challenges with Sensor Fusion Technology
More data is generally better, whatever the domain, but data that differs in kind increases the complexity of the underlying software and of the system overall. Sensor fusion presents issues with inconsistent, spurious and out-of-sequence data – all of which must be resolved in real time, which can degrade performance and slow a system’s response.
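Out-of-sequence arrivals, for instance, are often handled with a short reordering buffer: hold each message briefly, release messages in timestamp order, and discard anything that arrives after its slot has passed. A sketch assuming a fixed worst-case lateness:

```python
import heapq

class ReorderBuffer:
    """Reorder slightly out-of-sequence messages by timestamp.
    `max_delay_s` is the assumed worst-case lateness."""

    def __init__(self, max_delay_s: float):
        self.max_delay_s = max_delay_s
        self.heap = []                       # min-heap keyed on timestamp
        self.last_emitted = float("-inf")

    def push(self, timestamp_s: float, payload):
        if timestamp_s <= self.last_emitted:
            return                           # too late to integrate; drop
        heapq.heappush(self.heap, (timestamp_s, payload))

    def pop_ready(self, now_s: float):
        """Emit messages old enough that nothing earlier can still arrive."""
        out = []
        while self.heap and self.heap[0][0] <= now_s - self.max_delay_s:
            timestamp_s, payload = heapq.heappop(self.heap)
            self.last_emitted = timestamp_s
            out.append((timestamp_s, payload))
        return out

buf = ReorderBuffer(max_delay_s=0.1)
buf.push(0.30, "lidar frame")
buf.push(0.25, "radar frame")                # late, but within the window
print(buf.pop_ready(now_s=0.45))             # emitted in timestamp order
```

The delay window is a direct trade-off: a larger `max_delay_s` tolerates later arrivals but adds latency to every response.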
In transportation, for example, lidar data must be fused with luminance data from an imaging sensor, and the geometric alignment between the two is difficult; misalignment can compromise the effective collision avoidance that is the entire point of the system.
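In practice, alignment means projecting each three-dimensional lidar point into the camera image using calibrated extrinsic (rotation and translation) and intrinsic (focal length, principal point) parameters. A minimal pinhole-camera sketch with made-up calibration values:

```python
import numpy as np

def project_lidar_point(p_lidar, R, t, fx, fy, cx, cy):
    """Project a 3-D lidar point to pixel coordinates.
    R, t: extrinsic rotation/translation from the lidar to the camera frame.
    fx, fy, cx, cy: pinhole intrinsics. All values here are illustrative."""
    p_cam = R @ p_lidar + t         # lidar frame -> camera frame
    x, y, z = p_cam
    if z <= 0:
        return None                 # point is behind the camera
    u = fx * x / z + cx             # perspective projection
    v = fy * y / z + cy
    return u, v

R = np.eye(3)                       # assume the frames share an orientation
t = np.array([0.0, -0.2, 0.0])      # camera mounted 20 cm above the lidar
point = np.array([1.5, 0.0, 10.0])  # 10 m ahead, 1.5 m to the right
print(project_lidar_point(point, R, t, fx=800, fy=800, cx=640, cy=360))
```

Small errors in R or t shift every projected point, which is exactly why calibration drift is so dangerous in a collision avoidance system.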
And then there’s the issue of privacy: an IoT-rich environment is of enormous benefit to many parties, not least the consumer, but that environment is constantly gathering data, passively and without consent or notification, from everyone within it. The data may not be openly intrusive – in many cases nothing more than the fact that a consumer walked through a door or down an aisle – but it nonetheless raises questions of data privacy whose governance will take industry, government and consumers some time to negotiate.
Sensor fusion has come a long way, but it still has a long way to go. Fortunately, it has provided many problems of interest for a growing body of research.
Just Around the Corner
The holy grail of transportation is autonomous vehicles. Recent research innovations that may soon become reality include a sensor fusion system presented in January by Eyeris Technologies in Las Vegas, combining radar, imaging sensors and thermal sensors for an automotive safety enhancement that exploits AI to detect the presence of children in the vehicle.
In December, Congress announced plans to instruct the U.S. Air Force to proceed with development of the Advanced Battle Management System, which will combine sensor fusion with AI for enhanced command and control systems. Lockheed Martin’s F-35 has already demonstrated sophisticated sensor fusion integration that greatly boosts the data available to a pilot.
In the smart city realm, citizens could become real-time “emotional sensors,” a concept presented last year at the International Conference on Smart Infrastructure and Construction. Citizens would passively feed behavioral data that would create context. In the event of a building fire, for example, the spread of fear and retreat would be instantly sensed and measured, creating a more effective emergency response. Likewise, emotional mapping of park dwellers would lead to improvements in recreational environments.
Finally, there’s the IoT device we all carry everywhere: the smart phone. Virtually all cell phones deployed today include a camera and a microphone, and many can read fingerprints and faces. Many also carry accelerometers, magnetometers and proximity sensors, and the potential for broader sensor inclusion – some form of lidar, for example – is enormous, positioning smart phones as sophisticated diagnostic instruments, configurable health care and engineering tools, and components of augmented reality systems.
Even before this sensory explosion in smart phones gets underway, their potential as activity recognition systems is becoming established. Your phone will soon be able to sense and recognize – and potentially record – many of your actions. That takes the study of human behavior to a whole new level, and the technology is already with us.
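A crude illustration of the idea: classify a window of accelerometer magnitudes by their variance. The thresholds below are pure placeholders, not tuned values, and a production classifier would use far richer features:

```python
from statistics import pvariance

def classify_activity(accel_magnitudes_g):
    """Guess an activity from one window of accelerometer magnitudes (in g).
    Thresholds are illustrative placeholders."""
    variance = pvariance(accel_magnitudes_g)
    if variance < 0.01:
        return "still"
    elif variance < 0.5:
        return "walking"
    return "running"

print(classify_activity([1.00, 1.01, 0.99, 1.00]))        # -> "still"
print(classify_activity([1.0, 1.4, 0.7, 1.3, 0.8, 1.2]))  # -> "walking"
```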
For better or worse, it’s a brave new world, and sensor fusion is what makes it work.