Bots at DARPA Challenge Have LiDAR Eyes

In this Gizmodo report from the DARPA Robotics Challenge trials currently taking place in Homestead, Florida, the robots use LiDAR sensors as their “eyes” to accomplish the eight tasks that make up the challenge. The vehicle-driving task is the hardest of the eight, but MIT made it look almost easy. That’s all the more remarkable considering that today’s run was the first time the team had a chance to actually drive a vehicle with their robot.

Can you imagine where we will be in the next 5 to 10 years? I believe this decade will be one of the most incredible in our history of working with technology. As John Walker, one of the founders of Autodesk, pointed out in The Autodesk File, his inspiring chronicle of the company’s start-up days, “technology” comes from the word for tool. Too many people let the technology control them, rather than the other way around.

Thanks to Bill Gutelius at Active Imaging Systems for the heads up.


One Response to Bots at DARPA Challenge Have LiDAR Eyes

  1. The most common LiDAR used during the recent DARPA challenge was a Hokuyo UTM-30LX-EW, which can be found at http://www.autonomoustuff.com/hokuyo-utm-30lx-ew.html. Please feel free to contact AutonomouStuff if you have any questions about this sensor.
