Artificial perception pioneer AEye (great name) recently announced it has been awarded foundational patents, with numerous claims, for its solid-state, MEMS-based agile LiDAR and embedded AI technology that are core to AEye’s iDAR™ perception system.
Of all the automotive lidar start-ups, AEye’s approach to solving the artificial perception problem may hold the most promise, at least on paper.
AEye’s iDAR (Intelligent Detection and Ranging) perception system mimics how a human’s visual cortex focuses on and evaluates potential driving hazards. Using embedded AI within a distributed architecture, iDAR critically and dynamically assesses general surroundings, while applying differentiated focus to track targets and objects of interest. As a scalable, integrated system, iDAR delivers more accurate, longer range, and more intelligent information faster. AEye’s first iDAR-based product, the AE100 artificial perception system, will be available this summer to OEMs and Tier 1s launching autonomous vehicle initiatives.
That is different.
“AEye’s groundbreaking iDAR system is the first to use intelligent data capture to enable rapid perception and path planning,” said Elliot Garbus, former Vice President of Transportation Solutions at Intel. “Most LiDAR systems function at only 10Hz, while the human visual cortex processes at 27Hz. Autonomous vehicles need perception systems that work at least as fast as humans. iDAR is the first and only perception system to consistently deliver performance of at least 30-50Hz. Better quality information, faster. This is a game changer for the autonomous vehicle market.”
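To put those update rates in context, here is a rough back-of-the-envelope calculation (my own illustration, not from AEye or Intel): at highway speed, the distance a vehicle travels between consecutive sensor frames shrinks dramatically as the frame rate rises. The speed and rates below are assumed values for the sketch.

```python
# Illustrative arithmetic: distance a vehicle covers between consecutive
# sensor frames at different update rates. All numbers are assumptions
# for the sketch, not figures from AEye.

def meters_per_frame(speed_mps: float, rate_hz: float) -> float:
    """Distance traveled (meters) between two consecutive frames."""
    return speed_mps / rate_hz

speed = 30.0  # ~108 km/h, an assumed highway speed
for rate in (10.0, 27.0, 50.0):  # legacy lidar, visual cortex, iDAR-class rates
    print(f"{rate:>4.0f} Hz -> {meters_per_frame(speed, rate):.2f} m between frames")
# At 10 Hz the car travels 3 m blind between frames; at 50 Hz, only 0.6 m.
```

Whether or not the specific Hz figures in the quote hold up, the underlying point stands: a few tens of hertz of extra update rate translates directly into meters of reaction distance at speed.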
I have recently seen a stated requirement of two to three times the speed of human recognition before people will feel safe adopting the new technology. The human brain is a beautiful thing.