Sensing the World: How TDK's Multi-Sensor Fusion Improves Perception in Industrial and Service Robots

Service robots have evolved from simple automated machines to intelligent adaptive systems that can navigate unpredictable environments and interact with humans.


06 May, 2025. 6 minute read

Image: Robots like Agility Robotics’ Digit combine multiple types of sensors, including IMUs, LiDAR, and cameras, with AI algorithms to autonomously climb stairs or avoid obstacles. Source: Agility Robotics.


This article is Part 2 of a four-part series. Read Part 1: Mastering Multitasking: Exploring the Distributed Processing Capabilities of TDK’s Advanced Robotics.

Introduction

Service robots have evolved from simple automated machines to intelligent adaptive systems that can navigate unpredictable environments and interact with humans. This transformation is the result of advancements in sensor technology and data processing. With those developments, multi-sensor fusion, which combines various sensors like IMUs, LiDAR, and cameras, plays a critical part in enabling real-time perception and decision-making.

The key challenges in sensor fusion include synchronizing data from multiple sensors operating at different frequencies and reducing noise from environmental interference, such as lighting variations for cameras or vibrations for IMUs. Another critical challenge is managing the computational complexity of processing large volumes of sensor data in real-time, which is essential for accurate and reliable robotic operation in dynamic environments.1
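To make the synchronization challenge concrete, the sketch below resamples a fast sensor stream at the timestamp of a slower one using linear interpolation, one common alignment technique. The sensor names, rates, and values are illustrative assumptions, not TDK specifications.

```python
# Sketch: aligning a high-rate stream (e.g., a 200 Hz IMU) to a
# lower-rate stream (e.g., a 30 Hz camera) by interpolating on
# timestamps. Rates and readings here are made up for illustration.

def interpolate(t, samples):
    """Linearly interpolate a list of (timestamp, value) pairs at time t."""
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return v0 + alpha * (v1 - v0)
    raise ValueError("timestamp outside the sampled interval")

# IMU yaw-rate samples arriving every 5 ms...
imu = [(0.000, 0.10), (0.005, 0.12), (0.010, 0.16), (0.015, 0.14)]

# ...resampled at a camera frame timestamp that falls between them.
yaw_rate_at_frame = interpolate(0.0075, imu)
print(round(yaw_rate_at_frame, 3))  # midway between 0.12 and 0.16
```

In a real system the same idea is applied per sensor against a shared clock, so every fusion step sees measurements that refer to the same instant.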

TDK, a leader in electronic components and systems, is at the forefront of robotics with its innovative multi-sensor fusion technology. By combining data from various sensors, TDK’s solutions ensure precise data synchronization, effective noise reduction, and efficient real-time processing. This enables service robots to achieve enhanced perception capabilities, allowing them to operate more effectively in complex, human-centric environments.

This article explores how TDK’s sensor fusion enables real-time decision-making, stability, and precise movement for service robots. We’ll look at the physical aspects of TDK’s multi-sensor platform with ROS drivers and how it is shaping the future of robotics.

The Need for Advanced Perception in Service Robots

Service robots usually operate in dynamic environments, where they face unique challenges: moving humans and objects present unpredictable obstacles, and they must also handle human interaction and varying terrain. These challenges require real-time decision-making, stability, and precision to navigate the world and manipulate objects, which goes beyond the capabilities of single-sensor systems.2

Single-sensor perception has several limitations:

  • Limited field of view

  • Susceptibility to environmental interference

  • Inability to capture complex 3D environments accurately (for instance, a camera fails in low light)

Multi-sensor fusion addresses these limitations by combining data from multiple sensors, each compensating for the others’ weaknesses. For example, IMUs provide fast motion data but can drift over time, whereas LiDAR and vision systems provide precise spatial awareness but may struggle in low-light or featureless environments. A combined approach allows robots to generate a more comprehensive model of their environment so that they can make better decisions and interact more naturally with humans and objects.3
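The compensating behavior described above can be sketched with a complementary filter, a simple fusion technique that blends a fast-but-drifting rate sensor with a slow-but-absolute reference. The gain and bias values here are illustrative assumptions, not TDK parameters.

```python
# Sketch of a complementary filter: a gyroscope integrates quickly but
# drifts, while an absolute reference (e.g., a magnetometer heading) is
# noisy but drift-free. Blending the two keeps the fast response while
# bounding the drift.

def complementary_filter(heading, gyro_rate, mag_heading, dt, gain=0.02):
    # Predict from the fast (but drifting) gyro...
    predicted = heading + gyro_rate * dt
    # ...then pull gently toward the slow (but absolute) magnetometer.
    return (1.0 - gain) * predicted + gain * mag_heading

heading = 0.0
for _ in range(100):  # 1 second of data at 100 Hz
    # The gyro reports a spurious +0.5 deg/s bias while the robot is
    # actually stationary; the magnetometer keeps reporting 0 degrees.
    heading = complementary_filter(heading, gyro_rate=0.5,
                                   mag_heading=0.0, dt=0.01)

# Pure gyro integration would have drifted to 0.5 degrees; the fused
# estimate stays bounded well below that.
print(round(heading, 2))
```

The gain sets the trade-off: a higher value trusts the absolute reference more and suppresses drift faster, at the cost of letting magnetometer noise through.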

TDK’s sensor fusion technology is particularly important for service robots that need to navigate crowded areas, interact with humans, and perform delicate tasks. By developing a more robust and reliable system, TDK is helping to create robots that can operate safely and efficiently in unpredictable real-world scenarios.

TDK’s Multi-Sensor Fusion Technology

Overview of Sensor Technologies

TDK’s multi-sensor fusion technology incorporates a variety of advanced sensors that each contribute unique data to create a comprehensive model of the environment. Here are a few examples of TDK’s sensors:

  • Inertial Measurement Units (IMUs)  — These provide motion tracking and orientation estimation, allowing robots to understand their position and movement in 3D space.
    • The ICM-42688-P 6-axis IMU combines a 3-axis accelerometer and a 3-axis gyroscope.

    • Industrial-grade IIM-46234 and IIM-46230 6-axis IMU modules gather precise measurements in harsh environments with vibration and wide temperature variations.4

  • Ultrasonic Sensors — These sensors use sound waves to measure distance, enabling robots to detect obstacles regardless of color or lighting conditions.

    • The CH101 and CH201 ultrasonic Time-of-Flight (ToF) sensors provide reliable, real-time obstacle detection in diverse environments.

  • Magnetometers — These sensors measure magnetic fields to assist with orientation and navigation, particularly useful for detecting heading direction.

    •  TDK’s magnetometers enhance orientation accuracy by providing additional heading data.

These and other TDK sensors work in concert with TDK’s fusion algorithms to provide enhanced capabilities to robotics. TDK’s comprehensive suite of sensors and fusion methods allows robots to navigate complex environments with human-like perception and adaptability.
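As a concrete example of what one of these sensors measures, an ultrasonic time-of-flight reading converts to distance with basic acoustics. The sketch below assumes the textbook speed of sound in air, not any CH101/CH201-specific calibration.

```python
# Sketch of the time-of-flight principle behind ultrasonic rangefinders:
# distance is half the echo's round-trip time multiplied by the speed
# of sound. The constant assumes dry air at roughly 20 degrees C.

SPEED_OF_SOUND_M_S = 343.0

def tof_distance_m(round_trip_s):
    """Range to an obstacle from a pulse's out-and-back travel time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# An echo returning after about 5.83 ms corresponds to roughly 1 meter.
print(round(tof_distance_m(0.00583), 2))
```

Because the measurement depends on sound rather than light, it behaves the same in darkness or glare, which is exactly why it complements cameras in a fused system.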

Integration Framework

TDK’s integration framework effectively brings these sensors together:

  • Fusion Algorithms — TDK uses sophisticated algorithms, primarily Extended Kalman Filters (EKF) and Particle Filters, to combine sensor data and enhance accuracy and reliability. These algorithms process vast amounts of data to create a cohesive understanding of the world.

  • Kalman Filtering and Deep Learning Models — These advanced techniques process sensor data in real time, allowing robots to make split-second decisions. The Kalman filter in particular estimates the robot’s state from noisy sensor data.

  • Latency Reduction Techniques — TDK has developed methods to minimize processing delays to ensure that robots can respond immediately to environmental changes. For example, TDK’s sensor fusion systems use parallel processing to perform multiple computations simultaneously. They also implement edge computing, moving data processing closer to the sensors themselves. This is important for robots that need to move fast in unpredictable scenarios, from autonomous delivery bots to industrial automation.5
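The Kalman filtering idea behind these fusion algorithms can be sketched in one dimension: the filter tracks both an estimate and its uncertainty, and each noisy measurement nudges the estimate in proportion to how much the filter trusts it. The noise parameters below are illustrative assumptions, not TDK values.

```python
# Minimal one-dimensional Kalman filter sketch. Real robotic fusion
# stacks (EKFs) apply the same predict/update cycle to multi-
# dimensional states such as position, velocity, and orientation.

def kalman_step(x, p, z, process_var, meas_var):
    """One predict-plus-update cycle for a scalar state.

    x, p -- current estimate and its variance
    z    -- new noisy measurement
    """
    # Predict: uncertainty grows with process noise.
    p = p + process_var
    # Update: the Kalman gain balances prediction vs. measurement trust.
    k = p / (p + meas_var)
    x = x + k * (z - x)
    p = (1.0 - k) * p
    return x, p

x, p = 0.0, 1.0                        # initial guess, high uncertainty
for z in [1.2, 0.9, 1.1, 1.0, 1.05]:   # noisy readings of a value near 1.0
    x, p = kalman_step(x, p, z, process_var=0.01, meas_var=0.1)

print(round(x, 2))  # estimate settles near 1.0 as uncertainty shrinks
```

Note how the first measurement moves the estimate a lot (the initial guess is untrusted) while later ones move it less, which is the smoothing behavior that makes Kalman filters robust to sensor noise.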

By integrating these technologies into a unified system, TDK enables service robots to achieve human-like perception capabilities. For example, robots like Agility Robotics’ Digit combine multiple types of sensors, including IMUs, LiDAR, and cameras, with AI algorithms to autonomously climb stairs or avoid obstacles.6

Compact and Lightweight Sensors for Human-Sized Robots

Service robots must be both compact and lightweight to operate efficiently in human-centric environments. TDK’s sensors are designed with these requirements in mind:

Technical Specifications:

  • Weight — Typically under 50 grams

  • Size — Miniaturized dimensions suitable for integration into slim robotic designs

    • Ultrasonic Time-of-Flight sensors ICU-10201 and ICU-20201 have dimensions of 3.5 mm x 3.5 mm x 1.26 mm.

    • The ICM-42688-P 6-axis IMU is compact enough to fit on the RoboKit1 development board.

  • Power Consumption — Optimized for energy efficiency without compromising performance.

    • The ICM-42670-P IMU is ultra-low power, with a low-noise 6-axis current consumption of 0.55 mA.

These specifications allow human-sized service robots like Digit to maneuver freely while maintaining long operating times. For example, Digit’s compact sensors enable it to climb stairs or carry loads up to 16 kg without compromising stability, a feat that is supported by TDK’s expertise in miniaturization derived from smartphone technology.7

By leveraging their experience with smartphone technology, TDK has developed sensors that combine high performance and compact design—key factors that enable the creation of human-sized service robots that can integrate smoothly into daily life.8

Real-World Applications of TDK’s Sensor Fusion

TDK’s sensor fusion technology enables real-time path planning and collision prevention. For example:

  • Warehouse Robots — TDK’s sensor technology enables warehouse robots to autonomously move goods through crowded warehouses using sensor data processed with AI algorithms.9

  • Healthcare Robots — Service robots equipped with TDK’s sensors can navigate hospital corridors safely while avoiding patients and medical equipment.

Human-Robot Interaction and Collaboration

Sensor fusion enhances human-robot interaction by enabling gesture recognition and voice interpretation:

  • Customer Service Robots — Equipped with TDK’s perception technology, these robots can interpret gestures and respond verbally, creating seamless human-robot interactions. TDK’s advanced technologies in this area include:
    • Voice Recognition and Natural Language Processing (NLP) — TDK's MEMS microphones, combined with advanced speech recognition algorithms, enable robots to understand and respond to voice commands.

    • Multi-modal Integration — TDK's sensor fusion algorithms combine data from various sensors, including cameras, microphones, and IMUs, to create a comprehensive understanding of human communication, including both verbal and non-verbal cues.

ROS Integration

TDK’s sensors are compatible with ROS 1 and ROS 2 frameworks, simplifying the development of intelligent robotic systems:

  • Autonomous Delivery Robots — Using ROS-compatible sensor fusion frameworks, these robots navigate urban environments efficiently while avoiding pedestrians.

Conclusion

TDK’s multi-sensor fusion technology is revolutionizing service robotics by giving robots advanced perception capabilities and the ability to interact naturally with humans. By combining lightweight sensors with cutting-edge algorithms, TDK enables service robots like Digit to perform tasks efficiently while adapting to complex scenarios.

As robotics continues to evolve, TDK remains committed to leading innovation through investments such as TDK Ventures’ support for Agility Robotics. These advancements are not just futuristic visions—they are shaping a future where intelligent robots integrate seamlessly into society, enhancing industries and everyday life alike.

Learn more about TDK’s innovative advancements at https://www.tdk.com/en/index.html.


Reference:

1. “Human-Centric Robots Will Dramatically Transform the Forefronts of Labor” (3 Jul. 2023). Accessed from https://www.tdk.com/en/featured_stories/entry_056-human-centric-robots.html 

2. Ibid.

3. “Sensor Fusion in Robotics” (6 Jan. 2025). Meegle. Accessed from https://www.meegle.com/en_us/topics/robotics/sensor-fusion-in-robotics 

4. TDK IMU Product Page. Accessed from https://product.tdk.com/en/products/sensor/mortion-inertial/imu/index.html 

5. “Human-Centric Robots Will Dramatically Transform the Forefronts of Labor”.

6. “The Future of Robotics: Digit By Agility Robotics”.

7. “Human-Centric Robots Will Dramatically Transform the Forefronts of Labor”.

8. Ibid.

9. John Koetsier. “Meet the Only Humanoid Autonomous Robot Actually Working In Warehouses Today” (14 Nov. 2024). Forbes. Accessed from https://www.forbes.com/sites/johnkoetsier/2024/11/14/meet-the-only-humanoid-autonomous-robot-actually-working-in-warehouses-today/