Monthly Archives: January 2015

What would a laser scanner look like at 20 billion frames per second?

To help answer the perennial question ‘What does a laser path look like in slow motion?’, a team of researchers at Heriot-Watt University (published Open Access in Nature Communications in August 2014) undertook an experiment that used a 2D silicon CMOS array of single-photon avalanche diodes (SPADs) to construct, in essence, a 1,024-pixel (32 x 32) camera capable of recording the path of laser pulses through a given area. While the article acknowledges that light has been captured in flight as early as 1978, the challenge addressed by the team was to simplify data acquisition and reduce acquisition times “by achieving full imaging capability and low-light sensitivity while maintaining temporal resolution in the picosecond regime” (Gariepy et al, 2014: 2). To produce an image (or rather a video) from the experiment, the raw sensor data was put through a sequence of processing steps categorised into three stages – noise removal, temporal deconvolution and re-interpolation – as illustrated in the graphic below:

Creating an image from the 32 x 32 array of sensors. Image from article (linked).
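For a feel of how those three stages fit together, here is a minimal Python sketch applied to a 32 x 32 x T stack of photon-count histograms. To be clear, this is an illustrative toy – the function, its parameters and the regularisation constant are all invented here – and not the authors’ published processing chain:

```python
import numpy as np

def reconstruct_frames(counts, irf, upsample=10):
    """Toy reconstruction of light-in-flight frames from SPAD data.

    counts : (32, 32, T) array of per-pixel photon-count histograms.
    irf    : (T,) temporal instrument response function of the sensor.

    A sketch of the three stages named in the paper (noise removal,
    temporal deconvolution, re-interpolation); not the authors' code.
    """
    nx, ny, T = counts.shape
    x = counts.astype(float)

    # 1. Noise removal: subtract each pixel's background level
    #    (estimated here as its median count over time), clip negatives.
    x = np.clip(x - np.median(x, axis=2, keepdims=True), 0.0, None)

    # 2. Temporal deconvolution: divide out the sensor's temporal
    #    response in the frequency domain. The small constant is a
    #    Wiener-style regulariser that stops noise being amplified
    #    where the IRF spectrum is weak.
    H = np.fft.rfft(irf, n=T)
    X = np.fft.rfft(x, axis=2)
    x = np.fft.irfft(X * np.conj(H) / (np.abs(H) ** 2 + 1e-3), n=T, axis=2)

    # 3. Re-interpolation: upsample along the time axis so the pulse
    #    appears to move smoothly between the native time bins.
    t = np.arange(T)
    t_fine = np.linspace(0, T - 1, T * upsample)
    out = np.empty((nx, ny, t_fine.size))
    for i in range(nx):
        for j in range(ny):
            out[i, j] = np.interp(t_fine, t, x[i, j])
    return out  # one 2D 'frame' per upsampled time bin
```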

The video produced by the team (GIF excerpt below) overlays the 500 picosecond pulse of laser light on a DSLR photograph of the experimental setup. Remarkably, the scattering that makes the beam visible comes solely from interaction with ambient gas molecules (Gariepy et al, 2014: 4), rather than the ‘denser’ medium traditionally required to highlight laser light (e.g. fog, or participating media such as airborne dust).

GIF of laser path. Image created from video (linked)
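As a quick sense of scale (back-of-envelope arithmetic, not a figure from the paper), a 500 picosecond pulse only ever occupies about 15cm of its flight path at any instant – which is why it registers as a short travelling streak rather than a continuous beam:

```python
# Spatial extent of a 500 ps pulse: length = c * duration
c = 299_792_458        # speed of light, m/s
duration = 500e-12     # pulse duration, s
print(f"{c * duration * 100:.0f} cm")  # -> 15 cm
```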

This laser path in flight is the missing step in the following video, produced as part of the Scottish Ten project: we see the laser spot of the Leica C10 scanner reflecting from the surface of a bridge at the Eastern Qing Tombs, China. If we applied the research team’s methodology to the scanner, we might see the same phenomenon repeated at incremental spatial positions as it records the environment around the scanner – perhaps the ultimate LiDAR demo?
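The C10, like most pulsed scanners, derives each range measurement by timing that round trip. A minimal sketch of the time-of-flight calculation (the 200ns echo below is a hypothetical value, not a C10 specification):

```python
C = 299_792_458  # speed of light, m/s

def tof_range(round_trip_s: float) -> float:
    """Target distance from a pulse's round-trip time: d = c * t / 2."""
    return C * round_trip_s / 2.0

# An echo arriving 200 ns after emission corresponds to a surface
# roughly 30 m from the scanner:
print(f"{tof_range(200e-9):.2f} m")  # -> 29.98 m
```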

Microsoft HoloLens mixes the real and the virtual

The HoloLens headset. Image source: Engadget (linked).

Alongside the Windows 10 announcement at its recent live event, Microsoft also presented the HoloLens. This is a ‘mixed reality’ headset that overlays ‘holographic’ imagery onto your day-to-day vision, allowing you to interact virtually – make Skype calls, build 3D objects in the HoloStudio software, play HoloBuilder (essentially Minecraft), and so on – untethered and without markers or external tracking. Its specifications are unclear at this point, described variously as ‘sharp’ and having ‘HD lenses’. In the presentation it was explained that a traditional CPU/GPU combination was not enough, and that the answer was to invent an ‘HPU’ (‘Holographic Processing Unit’) to handle the input from the various sensors that detect sound and gestures and ‘spatially map the world around us’ in real time.

It requires little imagination to visualise how units like this could integrate with archaeological excavation, training, virtual access/reconstruction and so on, much as the technology has already been employed by NASA. To quote Dave Lavery, program executive for the Mars Science Laboratory mission at NASA Headquarters in Washington: “OnSight gives our rover scientists the ability to walk around and explore Mars right from their offices [...] It fundamentally changes our perception of Mars, and how we understand the Mars environment surrounding the rover.”

Exploring Mars – Microsoft worked with NASA’s Jet Propulsion Laboratory team and the Curiosity Mars rover to explore how engineers, geologists, etc could use the technology. Image source: Frame from video (linked).

We have all long been aware of the development of consumer VR headsets (e.g. the Oculus Rift), which can immerse us in entirely virtual 3D environments, and of Google’s Glass (prototype production has now ended). HoloLens is an interesting move from Microsoft, and somewhat curiously there is no reference to ‘augmented reality’ (see Microsoft’s FAQ) – an omission it has been suggested may be for marketing reasons. In terms of availability, we are told the HoloLens will arrive within the Windows 10 timeframe. Going forward, it will be a question of which applications best suit VR/AR – especially for heritage (and conservation), where virtual access to the present or the past in 2D and 3D form is a central requirement.

Faro’s new handheld scanner (‘Freestyle 3D’)

Faro Freestyle 3D handheld scanner. Image source http://www.faro.com/

Faro surprised us at the start of this year with the release of a new handheld scanner, the Freestyle 3D. It is being marketed as a lightweight, mobile tool that enables rapid capture of accurate, full-colour pointcloud data on its own, or in tandem with Faro’s terrestrial laser scanners such as the Focus3D X 330, where direct integration with Faro’s Scene package gives it a workflow advantage. The hardware specification puts point accuracy at better than 1.5mm under certain conditions, measured at distances between 0.5m and 3m from the subject.

It has already made a splash in the scanning community, with comparisons and early tasters of the handheld scanner being used in real-world scenarios. The device is marketed at the AEC (architecture, engineering and construction) industries, so it will be interesting to see whether and how it is employed in the digital documentation of cultural heritage, which frequently puts us in situations with irregular, hard-to-reach built and natural environments.