I just became aware of this entrepreneurial effort by a young Oklahoma State student to help you get your UAS flying legally.
Shaye Andelin became interested in UAS when her father purchased one for fun. She told her father that a fellow student had bought the same system and, while operating it somewhere in Texas, had been fined by the FAA. Her father explained the likely reasons her fellow student was fined and noted that there is a process for operating legally. In fact, he was going through this process himself, and he asked if she would like to help.
She learned a lot while assisting her father in writing a Sec. 333 request for exemption to operate a UAS. She also saw how convoluted the process was, recognized an opportunity, and is now offering a filing service for others interested in operating legally. Because the Sec. 333 exemption filing is only part of the requirement, she also offers assistance with completing the registration requirements.
Built in collaboration with renowned 3D artist Josh Harker and Artists Lend Support (ALS) to benefit ALS research, the piece features a montage of facial scans, including those of ALS founder and survivor Brian Fender and his supporters. All of the scans were taken with Fuel3D’s SCANIFY handheld, point-and-shoot 3D scanner, built to let consumers and professionals capture an object’s shape and color in high resolution in under a second. The sculpture, created to represent the impact supporters have on the lives of people with ALS and the pursuit of a cure, was recently sold at auction, raising $2,500 for the ALS Therapy Development Institute, a nonprofit biotechnology organization developing effective treatments for ALS.
This research is being conducted in Australia to automatically extract forest structure. Point clouds hold a vast amount of information on vegetation, habitat characteristics, fuel loads, timber volumes, and other forest characteristics. However, it is challenging to turn this data into usable information.
This research aimed to develop a technique to automatically classify point clouds into different forest components: near-surface vegetation, mid-storey shrubs, tree stems, and tree canopy. After classification, useful information on these different vegetation strata can be extracted, making it easier to interpret vegetation information hidden in the point clouds.
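As a toy illustration of this kind of stratification, here is a minimal height-threshold sketch in Python. The threshold values and the assumption that point heights have already been normalized to height above ground are mine, not the researchers’; their automatic classification is considerably more sophisticated than a simple height cut.

```python
def classify_stratum(height_above_ground):
    """Assign a lidar point to a vegetation stratum by its normalized
    height above ground (meters).

    The cutoffs (0.5 m, 5 m, 15 m) are illustrative placeholders, not
    values from the Australian study.
    """
    if height_above_ground < 0.5:
        return "near-surface"
    if height_above_ground < 5.0:
        return "mid-storey"
    if height_above_ground < 15.0:
        return "stem"
    return "canopy"


# Classify a handful of hypothetical point heights:
for h in (0.2, 2.0, 10.0, 20.0):
    print(h, classify_stratum(h))
```

A real classifier would also use point geometry and neighborhood structure, since stems and canopy overlap in height.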
Operation IceBridge is an annual NASA mission, which involves sending two flights all the way along the North American side of the Arctic Ocean — basically, Greenland to Alaska and back again. The planes use a variety of sensors to record the thickness, height and characteristics of the ice — a job that can’t quite be done by satellite, yet.
By timing the round-trip journey of the light pulses and accounting for the location and altitude of the plane, scientists can determine the height of the ice surface. Flying the same path every year provides detailed information about how the surface height of the ice has changed. This information in turn contributes to calculations of sea ice thickness in these areas, which scientists have shown to be thinning.
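The height calculation described above can be sketched in a few lines: the pulse travels down and back, so the one-way range is c·t/2, and subtracting that range from the aircraft’s altitude gives the surface elevation. This is a simplified illustration of the principle, assuming a known altitude and ignoring atmospheric delay and beam geometry; the function and its inputs are hypothetical, not NASA’s processing code.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def surface_height(aircraft_altitude_m, round_trip_time_s):
    """Estimate the height of the ice surface from a lidar pulse's
    round-trip travel time and the aircraft's altitude (both assumed
    known). Range is halved because the pulse travels down and back.
    """
    one_way_range_m = C * round_trip_time_s / 2.0
    return aircraft_altitude_m - one_way_range_m


# From 500 m altitude, a pulse returning after 900 m of total travel
# implies a 450 m one-way range, so a surface 50 m below datum zero:
print(surface_height(500.0, 900.0 / C))
```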
This is without a doubt one of the most impressive demonstrations of the use of lidar to create a visual understanding of 3D that I have seen. I think it is fair to say that the entertainment industry is one of the most advanced users of lidar technology.
AUVSI recently met with the U.S. Small Business Administration (SBA), which has invited AUVSI’s members to participate in its Small Business Aviation Safety Roundtable on the FAA’s proposed small unmanned aircraft systems rule on Thursday, April 9, 2015, from 2:00 p.m. until 4:00 p.m. The meeting will be held at the SBA, Eisenhower (Concourse Level) Conference Room, 409 3rd Street SW, Washington, DC 20416.
This special SBA event will focus on the impact of the rule on small business. Representatives from the Federal Aviation Administration and the Department of Transportation will be on hand to discuss the proposed rule and answer questions.
Please RSVP to Bruce Lundegren by email at email@example.com if you would like to attend the roundtable. A call-in option is also available upon request.
In early September 2013, the rain started in Colorado. It didn’t relent for an incredible five days, dropping roughly a year’s worth of water. Washed-out roads dominated the news images, but there were also more than 1,100 landslides in the rugged Colorado Front Range terrain. It was unlike anything seen in 150 years of recorded history there.
Part of the area had been mapped two years prior by airborne LiDAR. Changes in surface elevation since the 2010 LiDAR mapping showed 120 new grooves formed by landslides, some of which broke apart and flowed hundreds of meters down gullies and into swift-moving rivers. In most cases, these grooves were about one-half meter deep, leaving only bare bedrock behind. In total, about 21,000 cubic meters of sediment came down in the landslides within this 100 square kilometer area.
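The change-detection arithmetic behind those numbers is DEM differencing: subtract the pre-event surface from the post-event surface and sum the elevation losses, cell by cell. Here is a minimal sketch with hypothetical grid inputs, omitting the co-registration and noise filtering a real analysis requires:

```python
def eroded_volume(dem_before, dem_after, cell_area_m2):
    """Approximate eroded sediment volume (cubic meters) from two
    co-registered DEM grids of equal shape.

    Grids are lists of rows of elevations in meters. Only cells that
    lost elevation are summed -- the landslide scars; cells that gained
    material (deposition) are ignored in this sketch.
    """
    volume = 0.0
    for row_before, row_after in zip(dem_before, dem_after):
        for z0, z1 in zip(row_before, row_after):
            if z1 < z0:                      # elevation loss here
                volume += (z0 - z1) * cell_area_m2
    return volume


# Toy 2x2 grids: one cell loses 0.5 m, one gains 1.0 m (ignored).
before = [[2.0, 2.0], [2.0, 2.0]]
after = [[1.5, 2.0], [2.0, 3.0]]
print(eroded_volume(before, after, 1.0))  # cell area of 1 square meter
```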
This could be a very important development in bringing 3D to the masses. Electrical engineer Ali Hajimiri at Caltech and his team have developed a chip that can record height, width, and depth information from each pixel.
Each “pixel” on the new sensor can individually analyze the phase, frequency, and intensity of the reflected waves, producing a single piece of 3D data. According to the researchers, the chip can produce scans that are within microns of the original.
The chip produced by the lab currently has only 16 pixels — not enough to scan any real object without shifting the sensor slightly between exposures. But the researchers say the design could easily be scaled up to hundreds of thousands of pixels, providing a new, low-cost way for smartphones, driverless cars, and a whole host of other products to capture precise 3D image data.
Students at Arizona State University recently ran a successful experiment with a large weather balloon to collect panoramic video and thermal imaging data.
The balloon and camera made it high enough to see the black sky curling around our blue planet, a staggering 94,687 feet. The flight lasted approximately three hours. When the balloon burst, the payload took about 45 minutes to come back to Earth, landing about 5.7 miles from the launch site.
Experienced “do-it-yourselfer” Jonathon Coco at Forte and Tablada was telling me that he has been experimenting with the use of tethered balloons to get a camera up in the sky and map the area in question. The good news is there are no FAA restrictions on this.