Announced on stream. Yes, this is really a thing. NASA had a hand in it, too.
I'm not exactly clued up on this kind of technology, but it works as a mixture of several systems working in conjunction with one another: the headset displays holograms, as explained in depth below, while a Kinect-like sensor tracks where your hands are relative to the holograms so you can interact with them.
There are no wires and you don't need a phone or PC; the headset has its own CPU, GPU, and "HPU" (holographic processing unit). It'll also work with all Windows 10 builds, and Windows 10 is a free upgrade for any Windows 7, 8, or 8.1 users.
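To make the interaction model above concrete, here's a purely illustrative sketch of one frame of a "sensor sees hands, hands poke holograms" loop. None of these class or method names come from any real HoloLens API; they're hypothetical stand-ins for the pipeline described:

```python
# Hypothetical sketch only - not HoloLens code. It models one frame:
# the depth sensor reports hand poses, and an "air tap" gesture close
# enough to a hologram toggles that hologram's selected state.

from dataclasses import dataclass, field


@dataclass
class HandPose:
    x: float
    y: float
    z: float
    gesture: str  # e.g. "air_tap", "open", "pinch" (assumed labels)


@dataclass
class Hologram:
    name: str
    position: tuple
    selected: bool = field(default=False)


def update(hands, scene):
    """One frame: any air tap within 10 cm of a hologram toggles it."""
    for hand in hands:
        if hand.gesture != "air_tap":
            continue
        for holo in scene:
            hx, hy, hz = holo.position
            dist = ((hand.x - hx) ** 2 + (hand.y - hy) ** 2
                    + (hand.z - hz) ** 2) ** 0.5
            if dist < 0.1:
                holo.selected = not holo.selected
    return scene


scene = [Hologram("switch_panel", (0.0, 0.0, 0.5))]
scene = update([HandPose(0.02, 0.0, 0.5, "air_tap")], scene)
print(scene[0].selected)  # True: the tap landed on the hologram
```

The real device presumably runs something like this many times per second on the HPU, fusing depth data with head tracking, but the shape of the loop is the same idea.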
Here's what we see, as observers:
And here's what the user sees:
It's still in very early stages, but it seems to at least work, and work well! Wired's hands-on article can be found below; I've cut some bits out. Here's an explanation of what exactly is happening:
Wired said:
Project HoloLens’ key achievement—realistic holograms—works by tricking your brain into seeing light as matter. “Ultimately, you know, you perceive the world because of light,” Kipman explains. “If I could magically turn the debugger on, we’d see photons bouncing throughout this world. Eventually they hit the back of your eyes, and through that, you reason about what the world is. You essentially hallucinate the world, or you see what your mind wants you to see.”
To create Project HoloLens’ images, light particles bounce around millions of times in the so-called light engine of the device. Then the photons enter the goggles’ two lenses, where they ricochet between layers of blue, green and red glass before they reach the back of your eye. “When you get the light to be at the exact angle,” Kipman tells me, “that’s where all the magic comes in.”
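The bit about separate blue, green, and red glass layers comes down to additive color: light of different primaries combines by summing. A tiny illustrative sketch (my own, not anything from the article) of combining per-channel contributions:

```python
# Illustrative only: light from separate red, green and blue layers
# mixes additively, which we can model as a per-channel sum clamped
# to the displayable 0-255 range.

def mix(*layers):
    """Additively combine RGB layers, each an (r, g, b) tuple."""
    return tuple(min(255, sum(layer[c] for layer in layers)) for c in range(3))


red_layer = (200, 0, 0)
green_layer = (0, 200, 0)
blue_layer = (0, 0, 200)

print(mix(red_layer, green_layer, blue_layer))  # (200, 200, 200): near-white
```

The hard part Kipman describes is optical, getting each layer's photons to exit at exactly the right angle, but the color you perceive is just this kind of sum.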
And some hands-on impressions from a Wired editor:
Kipman hands me a HoloLens prototype and tells me to install the switch. After I put on the headset, an electrician pops up on a screen that floats directly in front of me. With a quick hand gesture I’m able to anchor the screen just to the left of the wires. The electrician is able to see exactly what I’m seeing. He draws a holographic circle around the voltage tester on the sideboard and instructs me to use it to check whether the wires are live. Once we establish that they aren’t, he walks me through the process of installing the switch, coaching me by sketching holographic arrows and diagrams on the wall in front of me. Five minutes later, I flip a switch, and the living room light turns on.
Another scenario lands me on a virtual Mars-scape. Kipman developed it in close collaboration with NASA rocket scientist Jeff Norris, who spent much of the first half of 2014 flying back and forth between Seattle and his Southern California home to help develop the scenario. With a quick upward gesture, I toggle from computer screens that monitor the Curiosity rover’s progress across the planet’s surface to the virtual experience of being on the planet. The ground is a parched, dusty sandstone, and so realistic that as I take a step, my legs begin to quiver. They don’t trust what my eyes are showing them. Behind me, the rover towers seven feet tall, its metal arm reaching out from its body like a tentacle. The sun shines brightly over the rover, creating short black shadows on the ground beneath its legs.
Norris joins me virtually, appearing as a three-dimensional human-shaped golden orb in the Mars-scape. (In reality, he’s in the room next door.) A dotted line extends from his eyes toward what he is looking at. “Check that out,” he says, and I squat down to see a rock shard up close. With an upward right-hand gesture, I bring up a series of controls. I choose the middle of three options, which drops a flag there, theoretically a signal to the rover to collect sediment.
After exploring Mars, I don’t want to remove the headset, which has provided a glimpse of a combination of computing tools that make the unimaginable feel real.
Wired, via NeoGAF
do want. This is the kind of future tech I always love to see.