The U.S. Institute of Building Documentation (USIBD) is pleased to announce the online publication of a new resource: a Referral Listing of Individuals who have become certified in the use of USIBD’s LOA Specification V.2.0. These individuals have completed training and passed testing to become certified in the LOA V.2.0.
Trainings are held online monthly via CD-BIM, on the last Friday of the month, and testing is available there as well. Those who pass the test are certified in the use of the LOA V.2.0 and are eligible to be included in this referral listing.
The Level of Accuracy (LOA) Specification is USIBD’s most frequently downloaded document. It is designed to help service providers and clients define accuracy requirements on a project-by-project basis. The LOA is now being integrated into popular industry software packages, making it easier to validate that achieved accuracies comply with those specified using the LOA. Version 2, released last October, adds a heritage overlay to the specification.
The LOA V.2.0 is available for download at no charge from USIBD’s eStore.
While the world of autonomy is focused on cars and trucks, Near Earth Autonomy (great company name) is working on a plan to automate personal aircraft. Before you dismiss this out of hand, you might want to know that Airbus is working with Near Earth to provide the vertical take-off and landing (VTOL) aircraft to support this “Jetsons-like” vision of personal transportation.
So where does lidar come in? It turns out that the take-off and in-flight phases are relatively straightforward to automate. It’s the landing that is tricky. Enter lidar, with its ability to survey the landing area and determine whether it would be safe.
Near Earth Autonomy is led by a team of Carnegie Mellon alumni who have the robotics base well covered. It might be worth noting that the U.S. military has had its share of problems with VTOL aircraft, but it would seem that this vision is going to be part of the future of transportation.
The Sonoma County Vegetation Mapping and LiDAR Program, or more succinctly, the Veg Map, “is a 5-year program to map Sonoma County’s topography, physical and biotic features and diverse plant communities and habitats,” according to its website.
According to a statement from the Sonoma County Agricultural Preservation and Open Space District, the Veg Map provides a fine-scale representation of all the natural vegetation in Sonoma County. It represents 83 different vegetation communities and land cover types, with vegetation cover identified for areas 400 square feet to one acre in size.
While funding came from many sources, the primary source was a $1 million grant from NASA, in coordination with an international carbon-monitoring program at the University of Maryland.
New Zealand-based Loadscan Ltd. has been automating the volume measurement business for the past 20 years. Its latest device uses laser scanners to accurately measure the volume of materials being moved on construction sites or in mines.
The Loadscan Load Volume Scanner (LVS) system is based on laser scanning technology combined with software that creates 3D model images of trucks to measure the exact volume of the material loaded in a truck or trailer bin.
The LVS-3TMM is a fully self-contained mobile truck measurement unit that can be driven to a site and be fully set up and operational in as little as 45 minutes. It offers a range of power options, plus the ability to transmit truck measurement load reports via Wi-Fi, cellular modem or network cable. Loadscan invented the mobile truck measurement unit and was the first to launch it into the global marketplace, making it possible to measure and monitor loads in temporary locations that previously went unmeasured. Vehicles are not required to stop in order to be measured.
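Loadscan does not publish its algorithm, but the general idea behind scan-based load measurement can be sketched as a height-map difference: subtract an empty-bed reference scan from the loaded scan and sum the material heights over the grid. The function name, grid resolution, and numbers below are illustrative assumptions, not details from Loadscan.

```python
import numpy as np

# Assumed grid resolution: 10 cm x 10 cm cells (0.01 m^2 each).
# This is an illustrative choice, not a Loadscan specification.
CELL_AREA_M2 = 0.01

def load_volume(empty_bed: np.ndarray, loaded: np.ndarray) -> float:
    """Estimate material volume (m^3) as the per-cell height difference
    between the loaded scan and the empty-bed reference scan."""
    heights = np.clip(loaded - empty_bed, 0.0, None)  # discard negative noise
    return float(heights.sum() * CELL_AREA_M2)

# Toy example: a uniform 1 m layer of material over a 10 x 20 cell patch
# (2 m^2 of bed area) should yield 2 cubic meters.
empty = np.zeros((10, 20))
full = empty + 1.0
print(load_volume(empty, full))  # 2.0
```

Differencing against a per-truck reference scan is what lets such a system report net load volume without weighing the vehicle or requiring it to stop.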
Check out Loadscan’s website for additional information.
Israel-based Oryx Vision has landed a second round of financing, worth $50 million, for its solid-state flash lidar technology targeting the autonomous vehicle market.
The company claims its sensor has a signal-to-noise ratio one million times better than that of mechanical scanning systems. The technology is not blinded by the sun and produces range and velocity data for each point in its field of view.
The key element of the design is a microscopic antenna that can receive ultra-fast light waves. Tens of thousands of these antennas can be assembled into a single silicon-based sensor, allowing them to be mass produced.
Keep an eye on this company.
Blue Marble Geographics recently recorded an eight-part video series that explores the extensive LiDAR processing functionality of Global Mapper and the accompanying LiDAR Module.
Beginning with an introduction to the structure and characteristics of LiDAR data, each 30-40 minute video covers a specific aspect of the software’s functionality from LiDAR editing and filtering to feature extraction and terrain creation.
The videos are available on Blue Marble’s YouTube channel, or the individual video files can be downloaded for offline viewing. Blue Marble also provides a free two-week trial of Global Mapper and the LiDAR Module, allowing viewers to evaluate the software’s capabilities using their own data.
Archaeologists have discovered a perfectly circular Danish ring fortress in Borgring, Denmark, that dates back to AD 975-980.
The fortress is believed to have been built during the reign of Harald Bluetooth – the King of Denmark who is often credited with the first unification of the country.
The Borgring fortress is the first to be discovered in Denmark since 1953, and experts believe that there are many more to be identified around the country.
Researchers from Aarhus University discovered the fort using LiDAR technology, which revealed the tell-tale geometric outline of a ring fortress.
They then worked with experts from the University of York to use geophysics and radiocarbon dating of excavated timbers from a gateway to confirm the remarkable early medieval find.
Mike Tully, CEO of ASI, provides an interesting look at a number of technical issues concerning the use of what he terms a “flying camera” in this article from a recent newsletter. Here’s an excerpt:
“Drones operate in an inherently unstable atmosphere under constant, random motion where lighting conditions can vary continuously and dramatically. Resolving power is a measure of how much detail is discernable in photography (Figure 1). The ultimate resolving power is a product of not only the quality of the camera and lens but also the performance of the entire “camera system”. These and other factors affect the ultimate detail visible in drone photography.
Clarity of detail is often needed to accurately measure or map visible features. Clarity (high resolving power) also has a direct impact on the ultimate positional accuracy of the orthophotography generated from the aerial photography. As budding new drone operators or consumers of these professional services we need to understand these fundamentals of remote sensing and mapping.”
Mike explains that resolving power and resolution are not the same thing and that there is no direct relationship between the two.
Definitely worth the time to read.
What is required of a lidar sensor to safely support autonomous vehicle operation at highway speed? At 70 miles per hour, spotting an object at, say, 60 meters out provides roughly two seconds to react, yet at that speed it can take 100 meters to slow to a stop. A useful range closer to 200 meters is a better target for making autonomous cars truly safe.
There is no shortage of companies that are developing both mechanical and solid state sensors to support the needs of the rapidly approaching autonomous vehicle market. This article provides an excellent summary of the current state of the automotive lidar sensor ecosystem.
There is still a great deal of R&D needed to develop lidar sensors that can be mass produced to support the autonomous vehicle market.
SPAR 3D 2018 is looking for 3D technology innovators eager to share their expertise with an audience hungry for the latest information, applications and future possibilities related to 3D technologies.
Recent 3D product innovations and industry growth are changing the way people work across multiple industries. From sensing with drones, mobile rigs or hand-held devices to data processing, to AR/VR and 3D printed deliverables, everything 3D is here at the only industry-agnostic, platform-neutral 3D event in the market.
SPAR 3D will be co-locating with AEC Next Technology Expo & Conference in 2018. The SPAR 3D team will automatically consider your SPAR 3D presentation submission for the AEC Next Conference Program as well. However, separate presentation submissions for AEC Next are also welcome.
Don’t miss your opportunity to present at the leading global event for 3D technology. Submit your abstract by Friday, September 8th.