Cutting-edge vision chip brings human eye-like perception to machines

The demo platform for autonomous driving perception. Credit: Tsinghua University.

With rapid advances in artificial intelligence (AI), technologies such as autonomous driving and intelligent robotics are becoming increasingly common.

These systems rely heavily on visual perception to gather information, yet achieving accurate, reliable perception in complex and unpredictable environments remains challenging.

In real-world scenarios, intelligent systems must process large amounts of data and cope with extreme events: sudden hazards, drastic lighting changes at tunnel entrances and exits, and bright flashes during nighttime driving.

Traditional visual sensing chips often struggle in these situations, suffering distortion, outright failure, or high latency, all of which can compromise the stability and safety of the systems that depend on them.

To tackle these challenges, the Center for Brain Inspired Computing Research (CBICR) at Tsinghua University has developed a new vision sensing technology.

Their innovative approach uses a primitive-based representation and two complementary visual pathways, inspired by how the human visual system works.

The research was published in Nature in an article titled “A vision chip with complementary pathways for open-world sensing.”

The new approach breaks visual information down into basic elements, called primitives. By recombining these primitives, the system mimics human vision through two complementary pathways: one geared toward precise, detailed scene cognition and one toward fast response to change, which together provide complete visual perception.
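To make the idea concrete, here is a minimal, hypothetical sketch in Python of what a primitive-based, two-pathway decomposition could look like. The function names, the difference-based primitives, and the threshold are illustrative assumptions, not the chip's actual circuitry or the team's published algorithms.

```python
import numpy as np

def cognition_pathway(frames, stride=10):
    # Dense, precise intensity readout sampled at a lower rate:
    # analogous to a pathway tuned for accurate scene cognition.
    return frames[::stride]

def action_pathway(frames, threshold=0.05):
    # Sparse spatiotemporal "primitives" at the full frame rate:
    # analogous to a pathway tuned for fast response to change.
    td = np.diff(frames, axis=0)               # temporal difference per pixel
    gy, gx = np.gradient(frames, axis=(1, 2))  # spatial gradients per frame
    sd = np.abs(gx) + np.abs(gy)               # spatial-difference primitive
    td_sparse = np.where(np.abs(td) > threshold, td, 0.0)  # keep large changes only
    return td_sparse, sd

# Stand-in sensor stream: 100 frames of 64x64 intensities in [0, 1].
frames = np.random.rand(100, 64, 64).astype(np.float32)
dense = cognition_pathway(frames)   # slower but precise
td, sd = action_pathway(frames)     # fast but sparse
print(dense.shape, td.shape, sd.shape)
```

The design intuition is that the dense pathway is precise but slow and bandwidth-heavy, while the sparse difference pathway is fast and cheap to transmit; the two compensate for each other's blind spots.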

Using this new method, CBICR developed the world’s first brain-inspired complementary vision chip, named Tianmouc.

This chip can capture visual information at an impressive speed of 10,000 frames per second, with 10-bit precision and a high dynamic range of 130 dB. It also reduces bandwidth usage by 90% and consumes very little power.
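To put those figures in perspective, the back-of-the-envelope arithmetic below unpacks the dynamic-range and bandwidth claims. The 320 x 320 pixel count used here is a placeholder assumption for illustration, not the chip's actual resolution.

```python
# 130 dB of dynamic range corresponds to a brightest-to-darkest
# intensity ratio of 10 ** (130 / 20), roughly 3.2 million to one.
ratio = 10 ** (130 / 20)
print(f"130 dB ~ {ratio:,.0f} : 1 intensity ratio")  # 3,162,278 : 1

# Rough readout bandwidth for a hypothetical 320x320-pixel array
# at 10,000 frames per second with 10 bits per pixel:
dense_bps = 10_000 * 320 * 320 * 10
sparse_bps = dense_bps * (1 - 0.90)  # the reported ~90% bandwidth saving
print(f"dense readout:       {dense_bps / 1e9:.2f} Gbit/s")
print(f"after 90% reduction: {sparse_bps / 1e9:.2f} Gbit/s")
```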

This allows it to overcome the limitations of traditional visual sensing chips and handle extreme scenarios effectively, ensuring the stability and safety of the system.

The research team has also developed high-performance software and algorithms to work with the Tianmouc chip. They tested these on a vehicle-mounted perception platform in various challenging environments. The system showed excellent real-time perception with low latency, demonstrating its potential for applications in intelligent unmanned systems.

The development of the Tianmouc chip is a significant breakthrough in visual sensing technology. It not only supports the advancement of intelligent systems but also opens new possibilities for applications like autonomous driving and smart robots.

Together with CBICR’s other brain-inspired technologies, such as the Tianjic computing chips and brain-inspired robotics, the Tianmouc chip strengthens the ecosystem of brain-inspired intelligence, bringing the field a step closer to artificial general intelligence: machines that can think and learn like humans.

In summary, the Tianmouc chip represents a major step toward machines that see and understand the world more as humans do, paving the way for safer and more efficient intelligent systems.