Inside the HoloLens

Augmented reality has been around for a long time. The idea of mixing computer-generated content with the world around us is compelling. It's also hard to deliver, especially in immersive 3D. Headsets have been either simplistic, like Google Glass, or complex and expensive, like the helmet designed for F-35 pilots. But now things have changed, and I've spent some time in tomorrow's world.

Microsoft's announcement of Windows Holographic and its HoloLens headset was a surprise "one more thing" at its Windows 10 event in Redmond earlier this year. After the big Windows 10 keynote, the press were taken down into the basement of Building 92 for hands-on time with HoloLens prototypes, with four demos showcasing some of the software shown in the keynote.

One thing to note: the demos that Microsoft walked me through were tightly scripted, and used prototype hardware that bore little resemblance to the sleek units shown off in the Windows 10 keynote. Instead, imagine me wearing a bicycle helmet that's had most of its frame removed, with two large lenses in front of my eyes, and a sensor array over my forehead. Hanging from my neck was a small computer, and a long umbilical cord linked me to another PC that was being controlled by an engineer.

The HoloLens' lenses give it its name, and they're the key to Microsoft's AR approach. They appear to work much like the heads-up displays in military jets: holographic optical elements acting as a complex set of lenses, taking a compressed image and expanding it out in front of your eyes. From experience with HUD technologies, I suspect that Microsoft is using laser projection to get the brightness levels HoloLens delivers, but full technical details of the display haven't been released.

Sensors are a key part of the HoloLens headset, with one set tracking head position and another using cameras to capture what a user is looking at. Two of the demos implied that the camera array includes at least one Kinect-style depth camera along with standard video cameras. There's also at least one microphone for voice commands, and what appears to be a set of bone-conduction headphones.

I've worked in research labs, both academic and commercial, and while what I was wearing was obviously a prototype, the custom hardware Microsoft was using was clearly in the final stages of development: the point when you manufacture a run of trial hardware, forty or fifty devices, so you can get the software right. The experience wasn't perfect; we couldn't get the headset used for the Skype demo to focus, and most of the displays had problems with my varifocal glasses. It also wasn't as clear as the images Microsoft has shared, though that may well be down to it being prototype hardware.

Putting those issues to one side, wearing HoloLens really was a very different computing experience. If you've seen Google Glass or even Skully's AR-enhanced motorcycle helmet, you've really only experienced 2D augmented reality. What Microsoft is doing is more akin to the complex heads-up displays in the upcoming generation of military jets, with a separate image delivered to each eye for a true 3D experience.

Walking on Mars was (to steal a phrase from Apple) "magical". Talking to a NASA scientist's Oscar-like golden avatar as he pointed out rocks and explained geological features gave me a whole new perspective on the work JPL is doing with the Mars rovers. With a 3D model of Curiosity sat in my peripheral vision, I could see just how big the SUV-sized robot was, as I bent down to examine a rock.

The OnSight software Microsoft had developed with JPL was an interesting example of just what HoloLens can do when linked to other computers. I could look at the screen of a PC on a desk running Curiosity's current navigation software; the HoloLens software was aware of the PC's location and left the display unobscured as I looked at it. But unlike with a normal PC, I could move the mouse off the screen and start clicking on the 3D Martian landscape, setting waypoints for the rover, points that were dropped straight into the navigation tool's map of Mars.

Microsoft was keen to show off possible scenarios for its augmented reality tools, and a Windows Holographic version of Skype demonstrated how HoloLens might be used for remote working, or for hands-on support. I was presented with a set of tools, and a disassembled light switch. Using the HoloLens' Skype software I talked to an engineer who could use the HoloLens' cameras to see what I was seeing.

At the other end of my connection was a Windows tablet, running a variant of the familiar Skype client. As I used the tools to test the wiring and hook up the light switch, the engineer I was talking to was able to use their copy of Skype to draw on my field of view, guiding me through the intricacies of American wiring, letting me know which wire was which and how it should be connected to the switch.

Games were at the heart of the next demo, a Minecraft variant the team called HoloBuilder. The furniture in a simulated living room was covered in block constructions, with a castle hidden under a table. Using voice commands I could switch between tools, digging a moat with a finger click before blowing an army of zombies into lava. Looking through tables into pools of glowing lava was an unusual experience, with the lava below the floor I was standing on. Blowing holes in walls with Minecraft explosives sent bats flying at me and revealed a cavern landscape with pouring lava.

HoloLens' main interaction tools are a finger click (folding down your index finger) and voice. You use voice to change tools, and a click to interact with the 3D model. I used clicks to flag objects on Mars, to pin Skype windows in free space, and to play with Minecraft sheep.

The keynote showed off Windows Holographic's HoloStudio app, and the final demo went into more detail on how it could be used to build models for 3D printing. While this wasn't hands-on time, we were able to watch a 3D model being built and edited. HoloStudio used what appeared to be depth sensors to bring models into the real world, placing them on furniture and letting the user scale them for printing. A Microsoft representative suggested that it would make an ideal tool for previewing 3D models before printing, with a user able to place a model in the physical world and walk around it to see how it fitted in.

Back in the early 1990s I worked on an augmented reality project at a telecoms research lab. It got nowhere, as we just couldn't deliver the computing power we needed. Twenty years later, Microsoft is finally delivering on the AR vision in HoloLens and Windows Holographic. Being able to use gaze, gestures, and voice to interact with 3D images overlaid on the real world is a very different way of interacting with computers; but it's also one that's surprisingly logical.

Microsoft's HoloLens works, and (at least in prototype form) works well. Now we need to see how apps are built, and what the final production hardware is like. But after a few minutes inside Windows Holographic's world, it's a place I want to explore in depth.