As reported in USA Today, the Obama administration unveiled initiatives this past week that include 3-dimensional mapping to better identify flood risks, landslide hazards and coastal erosion.
“It gives us a 3-D picture. That’s what makes lidar a game-changing technology,” says Vicki Lukas, chief of the USGS topographic data services team. She says USGS has contracted with private firms to develop lidar since the 1990s, but now aims to collect consistent, higher-quality data from all 50 states. For Alaska, she says, the agency will use a different technology, because cloud cover and remote terrain have limited lidar collection there.
The U.S. Geological Survey is launching a $13 million 3-D Elevation Program (3DEP) to develop advanced mapping that it says could, among other things, make it quicker to update flood maps and easier to find ideal sites for wind turbines and solar panels.
NASA announced a new combined ship/aircraft field campaign, launching today, that will use a prototype lidar to measure microscopic phytoplankton in the ocean down to 160 feet below the surface. The lidar will be flown on NASA Langley Research Center’s B-200, along with other instruments, as the plane overflies the track of an NSF research vessel carrying a suite of complementary instruments.
This is all in service of ultimately improving our ability to measure phytoplankton from space — the only way to get a true global picture of this key marine resource.
There is encouraging news from Trimble on the UAS front. The Mesa County, Colorado Department of Public Works has obtained a Certificate of Authorization (COA) from the FAA to fly a UAS for the purpose of mapping and surveying. Turns out the Mesa County Sheriff’s Office, which manages the county’s unmanned aircraft system (UAS) operations, has been flying systems since 2008.
This COA is an authorization from the Federal Aviation Administration (FAA) that allows the operation of an unmanned aircraft in a designated area, but not for commercial use.
Velodyne showed what it is calling the “LiDAR Puck (VLP-16)” for the first time yesterday at the AUVSI Automated Vehicles Symposium in San Francisco. The prototype sensor, with 16 laser/detector pairs rotating through 360 degrees, was showing real-time 3D measurements in VeloView. It is half the size and about two-thirds the weight of the HDL-32E, and it will be attractively priced, according to Wolfgang Juchmann.
The unit is between 3 and 4 inches in diameter, slightly wider than the HDL-32E, and a little over 2 inches tall. According to Ray Mandli, it seems to be pretty functional for an early prototype. Thanks for the tip, Ray.
Velodyne continues to be the innovation leader in this space.
The PBS Time Scanners series moves to Petra this week – incredible scenery. Consult your local PBS station for the time in your area.
Accurately tracking a satellite requires the ability to measure time to the nearest picosecond. That’s 10 to the minus 12 seconds, and it translates into sub-centimeter positional accuracy. Eventech, a Latvian company, has 50% of the world market for this application, but it turns out that is still not enough to build a profitable business, so the company has been looking for other applications.
One application they have identified as needing this kind of accurate time measurement is LiDAR. Turns out the R&D is well under way; the Eventech LiDAR application has already been licensed as Spatial Initiatives, which is now working on a proof-of-concept and a prototype to be ready for market entrance.
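The link between picosecond timing and sub-centimeter accuracy is simple time-of-flight arithmetic, which can be sketched as follows (a back-of-the-envelope check, not anything from Eventech):

```python
# How timing resolution maps to range resolution in a round-trip
# time-of-flight measurement (laser ranging or LiDAR).
C = 299_792_458.0  # speed of light, m/s

def range_resolution(timing_resolution_s: float) -> float:
    """One-way range resolution for a round-trip measurement.

    The pulse travels to the target and back, so the distance
    resolution is c * dt / 2.
    """
    return C * timing_resolution_s / 2.0

# 1 picosecond (10^-12 s) of timing resolution:
dr_mm = range_resolution(1e-12) * 1000
print(f"{dr_mm:.3f} mm")  # ~0.150 mm, comfortably sub-centimeter
```

So even with a few tens of picoseconds of jitter in the electronics, the range error stays in the millimeter regime.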
As reported in Alaska Native News, NASA will launch ICESat-2 in 2017 with a mission of measuring the elevation of the Earth’s surface, including areas covered by snow and ice, in both winter and summer. The number and patterns of photons that come back depend on the type of ice they bounce off – whether it’s smooth or rough, watery or snow-covered.
To calibrate the system, they are flying a similar LiDAR sensor on an airborne test bed called the Multiple Altimeter Beam Experimental Lidar, or MABEL. MABEL collects data in the same way that ICESat-2’s instrument will – with lasers and photon-detectors – so the data from the Alaskan campaign will allow researchers to develop the algorithms needed to analyze the information from ICESat-2.
In a recent article in Laser Focus World, Senior Editor John Wallace reports on the use of a UAS equipped with a LiDAR to create a better type of subject lighting called “rim lighting.” He notes that, “Researchers at the Massachusetts Institute of Technology (MIT; Cambridge, MA) and Cornell University (Ithaca, NY) have created a photographer’s and moviemaker’s lighting system that includes a small quadricopter drone carrying a white-light source and a lidar system.”
In rim lighting, only the edge of the photographer’s subject is strongly lit; however, this type of lighting is normally very difficult to achieve and maintain, especially when the subject is moving. The MIT/Cornell system incorporates the moviemaker’s video camera, an image-processing unit to determine the instantaneous proportion of rim lighting in the image, and the drone itself, which moves around (using its lidar for positioning) to maintain optimum rim lighting.
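The system described above is essentially a feedback loop: measure the rim-lighting proportion in each frame, then move the light to hold it at a target. A minimal sketch of that idea, assuming a simple proportional controller (the function names, target value, and gain here are illustrative assumptions, not the MIT/Cornell implementation):

```python
# Hypothetical sketch of a rim-lighting feedback loop: the camera
# yields a per-frame "rim fraction" (how much of the subject's edge
# is strongly lit), and the light's angle around the subject is
# nudged to drive that fraction toward a target.

def rim_fraction(edge_pixels_lit: int, edge_pixels_total: int) -> float:
    """Proportion of the subject's silhouette edge that is strongly lit."""
    return edge_pixels_lit / edge_pixels_total

def update_light_angle(angle_deg: float, measured: float,
                       target: float = 0.3, gain: float = 20.0) -> float:
    """Proportional correction: swing the light around the subject
    until the measured rim fraction matches the target."""
    return angle_deg + gain * (target - measured)

# Too much of the edge is lit (0.5 vs. a 0.3 target), so the light
# moves further behind the subject:
angle = update_light_angle(90.0, rim_fraction(500, 1000))
print(angle)  # 86.0
```

In the real system the drone’s lidar supplies the positioning, so the controller commands a position relative to the moving subject rather than a bare angle.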
Sungevity is a solar energy firm that is trying to entice potential customers with a website that provides an instant quote for adding solar panels to your home. Simply type in your address, if you live in the San Francisco Bay Area, and Sungevity will instantly supply you with a pre-determined panel design and cost. LiDAR was used to derive the 3-D building models.
“It really shifts from being about hardware to being about software,” said Sungevity CEO Andrew Birch. “We are hyper-focused on customer experience, and we think we have the software platform to deliver it.”
This article in Laser Focus World describes the progress that Sigma Space Corporation (Lanham, MD) is making with the ongoing development of a single-photon-sensitive airborne LiDAR. The advantage of this approach is that it allows large areas to be mapped quickly while still permitting the user to tailor the measurement point density.
By passing a low-power, high-repetition-rate 532 nm green laser beam (140 mW at 20 kHz) through a diffractive optical element (DOE), they are able to generate a 10 × 10 array of low-power (1 mW) beamlets, enabling surface measurement rates of up to 2 megapixels per second (Mp/s).
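The quoted figures hang together: 100 beamlets fired at the laser’s 20 kHz repetition rate gives exactly the 2 Mp/s measurement rate. A quick sanity check of the arithmetic (my numbers, derived only from the figures in the article):

```python
# Sanity check of the Sigma Space throughput figures: a 10 x 10
# beamlet array produced by the DOE, fired at 20 kHz.
beamlets = 10 * 10              # DOE splits the beam into 100 beamlets
pulse_rate_hz = 20_000          # 532 nm laser repetition rate
power_per_beamlet_mw = 140 / beamlets  # ~1.4 mW, quoted as ~1 mW each

measurements_per_s = beamlets * pulse_rate_hz
print(measurements_per_s)       # 2000000 -> the quoted 2 Mp/s
```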
It seems like a promising approach.