Friday, March 6, 2020

Sensor Fusion Technology Brings Data-Rich Environments to Fruition

Sensor fusion – the confluence of environmental IoT data from multiple, disparate sources – is gaining momentum in various industries. Here’s what the future looks like.

As we collect IoT device data from a variety of sources and bring it together, the sum of the parts forms a picture of the environment around us.

Sensor fusion technology, as it’s known, brings together data from multiple sensors of different types to present a rich portrait of an environment. Sensors can measure all sorts of conditions, from light to motion and beyond, and the intelligent system can be practically anything: vehicles, robots, buildings and drones all come to mind as sensor fusion candidates.

One core technology in sensor fusion is lidar, a light-based counterpart to radar. A lidar sensor fires laser pulses at a target (up to 900,000 times per second) and gathers the reflections to map its surrounding environment.
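To make that concrete, here is a minimal sketch, not tied to any particular lidar vendor or SDK, of how the raw range-and-angle returns from a single 2D lidar sweep could be converted into Cartesian points for a map. The function and parameter names are illustrative assumptions.

```python
import math
from typing import List, Tuple

def scan_to_points(ranges_m: List[float], angle_min_rad: float,
                   angle_increment_rad: float) -> List[Tuple[float, float]]:
    """Convert one 2D lidar sweep (one range per beam) into x/y points.

    ranges_m: measured distance for each beam, in meters
    angle_min_rad / angle_increment_rad: angular start and step of the sweep
    """
    points = []
    for i, r in enumerate(ranges_m):
        if not math.isfinite(r) or r <= 0.0:
            continue  # skip beams that produced no usable return
        theta = angle_min_rad + i * angle_increment_rad
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Example: a 4-beam sweep spanning part of the sensor's field of view
print(scan_to_points([1.0, 1.2, float("inf"), 0.8],
                     angle_min_rad=-math.pi / 4,
                     angle_increment_rad=math.pi / 6))
```

Each beam becomes a point in the sensor's own coordinate frame; a full mapping pipeline would then transform those points into a global frame using the vehicle's position.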

Because such sensors can be paired with systems that include GPS, they not only produce maps but also support real-time navigation, enabling robotic applications. Autonomous vehicles are the obvious example, but they are just the tip of the iceberg.

Lidar has been used to study ancient ruins beneath jungle foliage, examine air pollution in cities, improve fertilizer distribution on farmland, track customer movements in retail stores, and map the surfaces of other celestial bodies in the solar system.


The Sensor Fusion Technology Market

There’s more to sensor fusion than lidar. GPS is a sensor fusion system component, along with an array of other devices including accelerometers, cameras (including infrared and imaging cameras), microphones, Bluetooth beacons and pressure meters. These sensors handle data differently: each has its own data format, transfer rate and, in some cases, encryption.
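As an illustration of what that heterogeneity means in practice, here is a minimal sketch, with invented packet formats and field names, of how readings from two very different devices might be normalized into one common timestamped record before fusion. Real systems would typically rely on drivers or middleware for this step.

```python
from dataclasses import dataclass
import json
import struct
import time

@dataclass
class Reading:
    """Common record every sensor adapter produces before fusion."""
    sensor_id: str
    timestamp: float   # seconds since epoch
    kind: str          # e.g. "acceleration", "pressure"
    value: tuple       # measurement converted to SI units

def from_accelerometer(packet: bytes) -> Reading:
    # Hypothetical binary format: three little-endian floats (m/s^2)
    ax, ay, az = struct.unpack("<fff", packet)
    return Reading("imu-0", time.time(), "acceleration", (ax, ay, az))

def from_pressure_json(payload: str) -> Reading:
    # Hypothetical JSON format: {"hPa": 1013.2}, converted to pascals
    doc = json.loads(payload)
    return Reading("baro-0", time.time(), "pressure", (doc["hPa"] * 100.0,))

readings = [
    from_accelerometer(struct.pack("<fff", 0.1, 0.0, 9.8)),
    from_pressure_json('{"hPa": 1013.2}'),
]
for r in readings:
    print(r)
```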

The U.S. sensor fusion market is already a $2.25 billion industry and is expected to pass $9 billion by 2025, according to MarketWatch, growing at a CAGR of 18.73%, according to BusinessWire. Companies such as NXP Semiconductors, Kionix, STMicroelectronics, Bosch Sensortec, InvenSense, Hillcrest Labs, Renesas Electronics and others occupy top niches in this rapidly growing arena.

Fused sensors combine to support hundreds of applications, from self-driving vehicles (land, sea and air) and surgical robots to environmental management and wide-area security systems. The software underlying them integrates data from dozens, even hundreds, of devices, assembling an often real-time map of complex environments that is more accurate than what any individual sensory mode provides.


Sensor Fusion Technology: Making Good Things Better

Sensor fusion arrays have already taken many applications and automation systems to new levels, for several reasons. When more than one sensory modality is applied to an environment, one will catch things another misses; the effective signal-to-noise ratio increases, improving the system’s performance in that environment.
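One simple way to see that signal-to-noise benefit is inverse-variance weighting, a standard textbook estimator rather than anything specific to the products named here: two noisy measurements of the same quantity are combined so that the fused estimate has lower variance than either input. A minimal sketch:

```python
def fuse(estimate_a: float, var_a: float,
         estimate_b: float, var_b: float) -> tuple:
    """Inverse-variance weighted average of two independent measurements.

    Returns (fused_estimate, fused_variance); the fused variance is always
    smaller than either input variance, which is the SNR improvement.
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Example: range to an obstacle from radar (noisier) and lidar (cleaner)
print(fuse(10.4, 0.25, 10.1, 0.04))  # fused value sits near the lidar reading
```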

Sensor fusion arrays also have high fault tolerance. When a multi-sensory system scans an environment (say, a secure location), a failure in one modality won’t crash the system, because the other modalities remain active. Finally, multiple sensory modalities validate the overall system’s performance: the various sensors cross-check one another’s readings.
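A minimal way to picture that fault tolerance, again as an illustrative sketch rather than any particular product’s logic: the fusion step simply skips modalities whose readings are stale or missing, so a failed channel degrades accuracy instead of halting the system. The staleness threshold below is an assumed value.

```python
import time
from typing import Dict, Optional, Tuple

MAX_AGE_S = 0.5  # readings older than this are treated as failed (assumed threshold)

def fuse_healthy(latest: Dict[str, Tuple[float, float]]) -> Optional[float]:
    """latest maps modality name -> (timestamp, distance_m).

    Averages only the modalities that reported recently."""
    now = time.time()
    fresh = [dist for ts, dist in latest.values() if now - ts < MAX_AGE_S]
    if not fresh:
        return None  # every modality failed; the caller must handle this case
    return sum(fresh) / len(fresh)

latest = {
    "lidar": (time.time(), 10.1),
    "radar": (time.time() - 2.0, 9.7),   # stale reading -> ignored
    "camera": (time.time(), 10.3),
}
print(fuse_healthy(latest))  # uses lidar + camera only
```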

These capabilities give sensor fusion systems superior performance, robustness and efficiency. In transportation, where the technology is most prominent, that means combining visual sensors, radar and lidar for object detection and data integration, producing uniform coordinates for objects in dynamic surroundings from sources that stream data at different rates. The result is faster, more accurate object detection.
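To show what “stream data at different rates” means in code, here is a sketch with made-up sample data that interpolates a slower radar stream onto a faster camera’s frame timestamps, so both can be compared in a common time base before their coordinates are fused.

```python
import bisect
from typing import List, Tuple

def interpolate(stream: List[Tuple[float, float]], t: float) -> float:
    """Linearly interpolate a time-sorted (timestamp, value) stream at time t."""
    times = [ts for ts, _ in stream]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return stream[0][1]
    if i == len(stream):
        return stream[-1][1]
    (t0, v0), (t1, v1) = stream[i - 1], stream[i]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# Radar reports range at ~10 Hz, camera frames arrive at ~30 Hz
radar = [(0.0, 12.0), (0.1, 11.8), (0.2, 11.5)]
camera_frame_times = [0.033, 0.066, 0.100, 0.133]
aligned = [(t, interpolate(radar, t)) for t in camera_frame_times]
print(aligned)  # radar range estimated at each camera frame time
```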
