The mission, should you choose to accept it:
Send Treggon "Iceplant" Owens out to Denver to attempt a live demo of a LIDAR. Said LIDAR would be hooked up to an enormous Octacopter. Never before had the LIDAR sensor and the copter shaken hands and made friendly. Against all Ghostbusters logic, Iceplant crossed his lasers and headed out from San Diego to Denver to put on a show. All specs in this article come strictly from Iceplant hanging around with the Velodyne guys for three days and talking to the many people interested in seeing the Velodyne sensor on a lower-cost platform. Facts are quoted "as Iceplant understands them."
The Octacopter had been tested the week before, dead-lifting 15, then 20 pounds of steel while tethered to the ground. Iceplant, Steve "Kloner" Blizzard, and Grayson "Deadbeat" Omans had to do this so that at the trade show we could assure Velodyne we could lift their sensor. That was another whole pile of work involving much vibration isolation, a changeover from Hoverfly to WooKong, and about a hundred knuckle-bashing, metric-sized Allen screws.
After test flying, we stuffed two 1600-class Pelican cases and a Burton snowboarding bag full of a Cinestar 1000, paid Southwest's new $75.00 oversized-bag fees, and off to Denver Iceplant went alone to meet Wolfgang "Phud" Juchmann. The concept was to stream Ethernet data off the Velodyne sensor: 700,000 points per second of XYZ laser information, shot down from the Octacopter to a big screen in real time. The sensor was mounted to the vibration isolation stage of the Cinestar 1000 heavy-lifter rig.
Iceplant was given a carbon plate, 4 screws, and a mechanical spec sheet to work from, so he was a little concerned about doing final integration on site the morning the trade show started.
The RC gods smiled down on this ambitious project, and the concept worked out amazingly. You could see the future through the eyes of this multi-rotor. Day, night, whenever, lasers shot out from the machine. In about 2 nanoseconds, the electronics on board the sensor detect a reflected photon and calculate that point's position. The sensor takes 32 of these lasers and spins them continuously through a full 360 degrees, 8 to 20 times a second. During parts of the video, the different-colored lines you see on the screen each represent an individual laser, so you see 32 circles swirling around you, each going out as far as it can in every direction.
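To make the geometry concrete, here is a minimal sketch of how one laser return becomes an XYZ point: each of the 32 beams sits at a fixed vertical angle, the head's rotation gives the horizontal angle, and the time-of-flight gives the range. The evenly spaced elevation angles and the math below are my own illustration, not Velodyne's calibration tables or firmware.

```python
import math

# Hypothetical fixed elevation angles for a 32-laser sensor like the
# HDL-32E, which fans its beams across a vertical field of view of
# roughly -30.67 to +10.67 degrees. Real units use per-laser
# calibration data; even spacing is an assumption for illustration.
NUM_LASERS = 32
ELEVATIONS_DEG = [-30.67 + i * (41.34 / (NUM_LASERS - 1))
                  for i in range(NUM_LASERS)]

def return_to_xyz(distance_m, azimuth_deg, laser_id):
    """Convert one laser return (range + angles) to an XYZ point.

    distance_m  : range measured from the photon's time of flight
    azimuth_deg : rotation angle of the spinning head (0-360)
    laser_id    : which of the 32 beams fired (0-31)
    """
    el = math.radians(ELEVATIONS_DEG[laser_id])
    az = math.radians(azimuth_deg)
    horiz = distance_m * math.cos(el)   # projection onto the ground plane
    return (horiz * math.sin(az),       # x
            horiz * math.cos(az),       # y
            distance_m * math.sin(el))  # z

# Sanity arithmetic: at a 10 Hz spin, 700,000 points/second works out
# to about 70,000 points per revolution, or roughly 2,187 per laser.
```

Stacking these points over many revolutions, while the copter moves, is what builds up the swirling point cloud on the big screen.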
The laser also picks up information on the reflectivity of the surface it bounces off. When the video is mostly blue, that is reflectivity mode. The red bits are sign material, the kind of retroreflection you would see on a STOP sign. It can see a skid mark on pavement, for instance, or the lines painted on the road, because it sees the difference in reflectivity, and the display shows exactly that. The laser can see out to between 70 and 100 meters. You can spin yourself around inside this "point cloud," as everyone walking around the trade show called it.
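The blue-to-red coloring could be sketched as a simple intensity-to-color ramp. This is an illustration of the idea, assuming an 8-bit reflectivity scale; the thresholds and palette here are my guesses, not Velodyne's actual viewer settings.

```python
def intensity_to_color(intensity):
    """Map a reflectivity reading (assumed 0-255) to an RGB color.

    Low values are diffuse surfaces like pavement; very high values
    are retroreflective material like sign sheeting or lane paint.
    The cutoff of 200 and the colors are illustrative assumptions.
    """
    if intensity > 200:             # retroreflector: paint it red
        return (255, 0, 0)
    t = intensity / 200.0           # diffuse range: dark blue -> pale blue
    return (int(t * 120), int(t * 120), 255)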
It is all quite incredible. It is the same way the Google automated car cruises about and keeps itself from hitting obstacles in San Francisco all day long. Dozens of these cars drive around every single day looking through this set of eyes.
It is one of the ways all machines will see in the future. What is incredible is the precision: the sensor knows everything happening around it to within roughly 2 cm. Combined with an Inertial Measurement Unit (IMU), this technology is transformative to UAVs, the hobby, and the WORLD in general! After three days looking through the eyes of the Velodyne HDL-32E sensor, literally having eyes in the back of your head, your brain starts to transform and get used to the feeling. I thought the FliteTest community would enjoy the mind expansion as well, so I dropped a little homage by way of the wallpaper logo.
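Why does the IMU matter? Because the sensor is bolted to a copter that rolls, pitches, and yaws, every point it measures is in the sensor's own tilted frame. The IMU's attitude estimate lets you rotate each point back into a level world frame. Here is a minimal sketch of that rotation, assuming simple Euler angles and ignoring the lever-arm offsets and timestamp alignment a real LIDAR+IMU integration has to handle.

```python
import math

def body_to_world(point, roll, pitch, yaw):
    """Rotate a sensor-frame XYZ point into the world frame using
    IMU attitude angles (radians). Illustrative sketch: applies
    roll, then pitch, then yaw as elementary axis rotations."""
    x, y, z = point
    # roll about the x axis
    y, z = (y * math.cos(roll) - z * math.sin(roll),
            y * math.sin(roll) + z * math.cos(roll))
    # pitch about the y axis
    x, z = (x * math.cos(pitch) + z * math.sin(pitch),
            -x * math.sin(pitch) + z * math.cos(pitch))
    # yaw about the z axis
    x, y = (x * math.cos(yaw) - y * math.sin(yaw),
            x * math.sin(yaw) + y * math.cos(yaw))
    return (x, y, z)
```

Do this for every point, at the copter's attitude at the instant each laser fired, and the swirling circles settle into a stable, mapped scene.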