Mixed Reality Photogrammetry Archives

Reality Capture. Leverage a combination of photogrammetry software, photographs, and laser scans to create 3D models. As part of an integrated design team, you'll contribute to accuracy, safety, and outcomes throughout the lifecycle of our projects. Donning Microsoft HoloLens 2 goggles, visitors then embark on a mixed-reality tour of the Krishna sculpture's 1,500-year history, ultimately entering the cave temple where the sculpture originally stood.

Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLII-2/W13, 805–810, 2019
https://doi.org/10.5194/isprs-archives-XLII-2-W13-805-2019
  1. Drap et al.'s VENUS project and Pietroszek's mixed reality exhibition used photogrammetry to survey marine areas of Pianosa island. It is a step toward letting archaeologists investigate untouched and unreachable areas of the deep ocean (Drap et al.), and a great way to digitally archive and preserve underwater sites.
  2. Feb 26, 2020: Behind the scenes with The Weather Channel's mixed reality broadcasting. One year ago today, The Weather Channel debuted its first immersive mixed reality (IMR) experience: an 8-minute TV broadcast that aimed to transport viewers to the scene of a typical tornado.
  3. RealityCapture (commercial): creates virtual reality scenes, textured 3D meshes, orthographic projections, geo-referenced maps and much more from images and/or laser scans, completely automatically. Agisoft Metashape (commercial): a stand-alone software product that performs photogrammetric processing of digital images and generates 3D spatial data.
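
Both tools recover 3D geometry by intersecting rays from multiple calibrated photographs. As a rough, self-contained illustration of that core photogrammetric step (not the actual algorithm or API of RealityCapture or Metashape; all names and numbers are invented for the example), here is a minimal two-view triangulation in Python:

```python
# Minimal sketch of the core photogrammetric step: triangulating a 3D point
# from its pixel observations in two calibrated images (linear DLT).
# Illustration only; not the pipeline of RealityCapture or Agisoft Metashape.
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Recover a 3D point from two 3x4 projection matrices and the
    corresponding (x, y) pixel coordinates in each image."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Homogeneous least-squares solution: last right singular vector of A
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Toy setup: two cameras with the same intrinsics, 1 m apart, viewing a point 5 m away.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])              # camera at the origin
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])  # translated camera
X_true = np.array([0.0, 0.0, 5.0, 1.0])
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate_point(P1, P2, x1, x2))  # approximately [0. 0. 5.]
```

Production photogrammetry wraps feature detection and matching, camera self-calibration, bundle adjustment, dense matching, and meshing around this basic geometric operation.
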
© Author(s) 2019. This work is distributed under
the Creative Commons Attribution 4.0 License.

Mixed Reality Photogrammetry Archives Pictures


05 Jun 2019

K. Khoshelham, H. Tran, and D. Acharya
  • Dept. of Infrastructure Engineering, University of Melbourne, Parkville 3010, Australia

Keywords: Indoor Mapping, Depth Camera, SLAM, Point Cloud, 3D Mesh Model, Accuracy, Mixed Reality

Abstract. Existing indoor mapping systems have limitations in terms of time efficiency and flexibility in complex environments. While backpack and handheld systems are more flexible and can be used for mapping multi-storey buildings, in some application scenarios, e.g. emergency response, a light-weight indoor mapping eyewear or head-mounted system has practical advantages. In this paper, we investigate the spatial mapping capability of Microsoft Hololens mixed reality eyewear for 3D mapping of large indoor environments. We provide a geometric evaluation of 3D mesh data captured by the Hololens in terms of local precision, coverage, and global correctness in comparison with terrestrial laser scanner data and a reference 3D model. The results indicate the high efficiency and flexibility of Hololens for rapid mapping of relatively large indoor environments with high completeness and centimetre level accuracy.
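
One way to picture the geometric evaluation described above is as a nearest-neighbour distance analysis between the HoloLens mesh vertices and a reference point cloud from the terrestrial laser scanner. The sketch below is an illustrative approximation only, not the authors' actual procedure; the metric definitions, variable names, and the 5 cm threshold are assumptions made for the example.

```python
# Rough sketch: compare a HoloLens mesh (as vertices) against a reference scan
# using nearest-neighbour distances. Not the evaluation method of the paper;
# the metrics and the 0.05 m threshold are assumptions for illustration.
import numpy as np
from scipy.spatial import cKDTree

def evaluate(hololens_pts, reference_pts, threshold=0.05):
    """Return (accuracy_rms, coverage) for two N x 3 point arrays in metres."""
    # Accuracy: RMS distance from each HoloLens vertex to the nearest reference point.
    d_acc, _ = cKDTree(reference_pts).query(hololens_pts)
    accuracy_rms = float(np.sqrt(np.mean(d_acc ** 2)))
    # Coverage: fraction of reference points with a HoloLens vertex within the threshold.
    d_cov, _ = cKDTree(hololens_pts).query(reference_pts)
    coverage = float(np.mean(d_cov <= threshold))
    return accuracy_rms, coverage

# Synthetic stand-in data: 80% of the reference is "mapped" with ~1 cm noise.
rng = np.random.default_rng(0)
reference = rng.uniform(0.0, 10.0, size=(5000, 3))
hololens = reference[:4000] + rng.normal(scale=0.01, size=(4000, 3))
print(evaluate(hololens, reference))  # roughly 1.7 cm RMS accuracy, 0.8 coverage
```
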


Mixed Reality Photogrammetry Archives Free

How to cite. Khoshelham, K., Tran, H., and Acharya, D.: INDOOR MAPPING EYEWEAR: GEOMETRIC EVALUATION OF SPATIAL MAPPING CAPABILITY OF HOLOLENS, Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XLII-2/W13, 805–810, https://doi.org/10.5194/isprs-archives-XLII-2-W13-805-2019, 2019.