OGC 3D Fusion Summit

  1. The OGC is sponsoring a one day summit on 3D at MIT on June 23, 2009.
  2. A number of the major software players are on the agenda, as well as practicing professionals.
  3. Yours truly will present the case for including laser scanning in the discussion.


The Open Geospatial Consortium is sponsoring what it is calling its 3D Fusion Summit at MIT in Cambridge, MA on June 23, 2009. This is being billed as a first-of-its-kind event in North America, bringing together many of the major technology players in 3D and related topics. The major applications being addressed include urban planning, homeland security, urban warfare and personal navigation. Not quite sure how the last one fits with the first three.

The background on this is that the OGC has formed the 3DIM (3D Information Management) Working Group, chaired by Tim Case, whose background is in visualization. Since the group is part of the OGC, its focus is on developing standards that will facilitate the exchange of data among the various applications and software environments. The OGC has been doing the heavy lifting in this area for the GIS profession for the past 15 years.

The day-long agenda includes software vendors, academics and practitioners, as well as yours truly promoting the need for 3D laser scanning to be included in the discussion. It appears to be an important group that is attempting to develop a unifying 3D strategy. The fusion of scanned, image, other remotely sensed, CAD and GIS data is where the opportunity to solve real-world customer problems is going to be found.

Hope to see some of you there.

This entry was posted in Conferences, Education, Orgs, remote sensing, The Industry.

2 Responses to OGC 3D Fusion Summit

  1. randy george says:

    Interesting to run across your post today since I just blogged a recent experiment for viewing 3D LiDAR using current browser technology.
    http://www.cadmaps.com/gisblog/?p=72

    I’ve been wondering about how to do something similar for hyperspectral cubes but haven’t gone too far down that track.

    As far as “personal navigation,” I think it wouldn’t be too much of a stretch to imagine “seeing eye” LiDAR. Now that point clouds can be obtained in real time it’s at least imaginable, and definitely personal — an interesting concept anyway. Turning a real-time point cloud into audio relating your body envelope to surrounding features could be a boon to the blind, though hopefully not advertisers.
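    The point-cloud-to-audio idea above can be sketched in a few lines. Here is a minimal, hypothetical illustration (not from any real product): take the nearest point in a cloud of (x, y, z) coordinates in metres around the wearer and map it to a tone frequency, so that closer obstacles produce a higher pitch. The distance bounds and frequency range are arbitrary assumptions chosen for the example.

    ```python
    import math

    def proximity_to_pitch(points, min_d=0.3, max_d=5.0,
                           low_hz=220.0, high_hz=880.0):
        """Map the nearest point in a 3D point cloud (metres from the
        wearer at the origin) to a tone frequency in Hz.
        Closer obstacle -> higher pitch."""
        nearest = min(math.dist((0.0, 0.0, 0.0), p) for p in points)
        # Clamp to the usable range, then interpolate linearly.
        d = max(min_d, min(max_d, nearest))
        t = (max_d - d) / (max_d - min_d)  # 1.0 when closest, 0.0 when far
        return low_hz + t * (high_hz - low_hz)

    # A wall 1 m ahead plus clutter further away.
    cloud = [(1.0, 0.0, 0.0), (3.0, 1.0, 0.5), (4.5, -2.0, 0.0)]
    print(round(proximity_to_pitch(cloud), 1))
    ```

    A real system would of course need to run this over each new scan, split the field of view into sectors for stereo cues, and drive an actual audio synthesizer, but the core mapping is just this kind of distance-to-frequency function.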

    Thanks for the heads up.

    randy
