In a smallish, dimly lit room on the second floor of Intel’s booth on the CES show floor (yes, some of these booths have multiple stories), Intel’s Project Alloy HMD sat on a round coffee table in front of me. The space was dressed up to look like a living room. When I donned the Project Alloy HMD, I could see a facsimile of the room and its furniture–the results of the system’s scan.
Because Project Alloy is an untethered device–all of the PC components required to run VR experiences on the headset are on board–I could walk freely around the room. Then, they fired up the demo.
Slowly, as I looked around the room, the real-world facade faded–no, that’s not right–it transformed. The coffee table morphed into a round, flat platform with a glowing orb in the middle of it. The couch became a pile of metal boxes. The walls dropped off to reveal that I was standing on a platform of some kind way up high somewhere, and when I looked up, I saw blue sky instead of a black ceiling.
Alloy had scanned the room (the demonstrators set it up before I arrived), and when it ran the demo, it painted a virtual world on top of the real one. This is part object detection, part game design, and part art. When it scanned the room, Alloy learned where every piece of furniture was located, how far away the walls and ceiling were, and more. Then, in the game, Alloy was able to layer artwork germane to the title on top of those real objects. (Although the underlying technology is different, that’s essentially how HoloLens can combine Mars Rover and satellite imagery to provide a 3D view of the surface of Mars.)
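Conceptually, that merged-reality step is a mapping from scanned physical objects to game assets, plus a safety check whose correctness depends entirely on the scan being accurate. Here is a minimal, hypothetical Python sketch of the idea; the object labels, asset names, and geometry are illustrative assumptions, not Intel’s actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class ScannedObject:
    label: str     # class from the scan's object detection, e.g. "table"
    x: float       # center position in the room, meters
    y: float
    radius: float  # rough footprint radius, meters

# Hypothetical art mapping: each detected object class gets a
# title-specific game asset layered on top of it.
ART_MAP = {
    "table": "round_platform_with_orb",
    "couch": "pile_of_metal_boxes",
    "wall":  "open_sky_drop_off",
}

def build_merged_scene(scan):
    """Pair every scanned real object with the artwork drawn over it."""
    return [(obj, ART_MAP.get(obj.label, "generic_prop")) for obj in scan]

def is_path_safe(scan, x, y, margin=0.2):
    """A player position is 'safe' if it clears every scanned object's
    footprint by `margin` meters. Note the trust problem: if the scan
    underestimates an object's size, this check passes in the virtual
    world while your knuckles still hit the physical table."""
    return all(
        ((x - o.x) ** 2 + (y - o.y) ** 2) ** 0.5 > o.radius + margin
        for o in scan
    )
```

The safety check is the crux: the virtual obstacles are only as trustworthy as the scan and the tracking behind them, which is exactly where the demo fell down later.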
Thus, I got what Intel calls “merged reality.” I was completely swept into a virtual world, but Alloy used the physical world to create it.
The rest of the demo was a bit less impressive. Once the world around me was fully converted, I was handed a (physical) controller that had a white ball on top for tracking and a trigger for shooting. The headset tracked the controller and rendered it in-game as a laser gun. Except…it didn’t fully track it. The virtual gun was positioned up near my shoulder, even though I was holding it much lower, with my hand down at my side. No matter where I moved my hand, the gun stayed parked on my shoulder. I could, however, aim in any direction with the gun, and those movements were tracked. (It was 3DoF, not 6DoF.)
I encountered several waves of flying drones, which came at me from all sides, in turn. I had to use the laser gun to zap them out of the air (which I did with aplomb, thank you very much). I had to walk around the room to pick up ammo boxes to reload my weapon.
The demo was designed to make me move around in the virtual and physical space. Doing so while in VR makes me nervous, to say the least. Ostensibly, all I had to do to avoid smacking into physical objects was avoid smacking into the virtual ones. That’s the promise of merged reality, but I didn’t trust that Alloy would keep me safe.
I was right; it didn’t. Remember that round coffee table? I noticed that the virtual version looked a little smaller than the physical version. Sure enough, when my hand was still a few inches away from the virtual table, “crack” went my knuckles onto the physical one. The HMD’s tracking was off a bit (sensor drift, perhaps), and its estimate of the table’s size was also not quite right.
That brings up something we probably don’t discuss enough when we talk about freedom of movement in VR: In order to have an immersive experience, you want an untethered HMD, but you also need to trust that you won’t bump into things. If you don’t have absolute faith that the images your HMD is serving up are tracked correctly onto physical objects, and you try and physically move within a virtual space, you’re functionally blind.
(Have you been there in a VR demo? Shuffling your feet, hands outstretched, apologizing to anyone you might poke as you try to reach some virtual object without crashing into the physical wall you’re pretty sure is right in front of you? I have. It seems like I have that experience every single time I do an untethered demo, which makes for much worse immersion than a good tethered experience. That is to say: untethered VR is going to have to offer nearly perfect tracking, or it’s going to be a disaster. But I digress.)
With all the drones lasered to bits, the demo ended.
A Proof Of Concept, Not A Prototype
My time with the Project Alloy HMD left much to be desired. The controller tracking was so locked to one spot that it was comedic, and the aim wasn’t especially accurate, either. (Note that hand and body tracking were not enabled in this demo, although the headset is capable of both.) The headset’s own tracking was also off, to the point that a more trusting soul would have left the facsimile living room with a bruised shin from the coffee table. I also had to remove my glasses to fit my face into the HMD; Intel said it plans to address this issue. (On a positive note, though, the demo ran at 90 FPS, with a resolution of 1080p per eye.)
And that’s okay, for now. Intel was showing a proof of concept here, more than anything. This version of Alloy isn’t even really a prototype; representatives were keen to point out to me that the final version of Alloy will have all new internal hardware. Therefore, it’s important that we parse out the success of the proof of concept demo from the performance of the hardware.
Can Alloy do this merged reality thing? Yes, it can. And it’s compelling. It merges (oh hey, that’s why they call it that) the virtual and physical worlds in a way that no other technology is, at the moment. That, in and of itself, is notable and innovative.
Project Alloy 2 Will Be Better In Every Way
The fact that the demo had problems is not an issue at this point because the shipping version will have entirely different hardware.
The current Alloy has a pair of older RealSense cameras for passthrough and tracking; the new version will have a single, superior RealSense 400-series camera. The sensor processing is done with an Atom chip; the new version will have a Movidius Myriad 2 VPU. (Movidius, you may recall, is now the property of Intel.) The application processor is a Skylake chip, but the new version will have a Kaby Lake CPU. And whereas right now Alloy relies on integrated graphics, the new version will have a discrete GPU.
Expect 100-degree FoV, less than 20ms of latency, and 90 minutes of battery life from the final product, too.
The price point is TBD, but we were told that sub-$1,000 is the goal. This, though, is a rather wide-open target. Intel expects its hardware partners to define a range of specifications and features, which will, in turn, affect the price. Presumably, a fully tricked-out Project Alloy HMD with a set of high-quality input controllers will be the version that ends up costing about a grand.
Intel representatives further noted that the Alloy design is modular, meaning that although the core functionality is locked down, there are design aspects that OEMs can easily alter. The battery is one example: some OEMs will leave it mounted to the back of the headset, while others may offer it as a hip-mounted unit with a power cable running up to the headset. The latter would reduce the weight of the HMD (which is not terrible as-is) and possibly let companies sell the device with a higher-capacity battery.
Project Alloy 2 (or Project Alloy 1.0, if we consider the current iteration to be in beta) is scheduled to arrive in full production by the end of the year. Intel, at least, is calling this future device “Alloy 2,” and a dev kit for the ISV community will be around by mid-2017.
Cryptically, Intel representatives told us that it would launch Project Alloy with “one lead partner,” meaning that some company has an exclusive on the first Alloy HMD. Determining which company would be little more than wild speculation at this point, but, well, here’s some wild speculation: It could be Microsoft, with which Intel is closely partnered on XR. That would make sense from a relationship standpoint, but then again, Microsoft seems to love its own homegrown HoloLens, which is a fundamentally different device; Microsoft’s play here would more likely be on the operating system side. It’s also possible Intel has ordained one of the usual PC OEM suspects to be the standard-bearer. If that’s the case, Lenovo could be it: of all the PC makers, Lenovo was the only one to show us even a mockup of an HMD at CES, which could indicate that the company is more serious about the whole of Wintel’s XR efforts.
In any case, other OEMs will get a piece of the pie by the end of the year. We suspect that this “lead partner” will launch–or at least announce–a device by mid-late 2017, with all others landing by the end of the year.