At the recent Game Developers Conference and Mobile World Congress events, Valve demoed HTC's Vive VR system with an experience set in the Portal universe. The headset is paired with two controllers, one for each hand, which sound like a cross between Valve's Steam Controller and the Razer Hydra.
When HTC briefed journalists about the technology, it brought a few example experiences for its prototype. CNET described three: a small demo where you could paint with the controllers in a virtual space, an aquarium where you stand on a sunken pirate ship and watch a gigantic blue whale float overhead, and the Portal-based demo embedded above. I also found “The Gallery” demo online, but I am not sure where, if anywhere, it was presented.
Beyond the VR angle, the Source 2 engine, which powers the Portal experience, looks good. The environments and devices are intricate and full of detail, although it is admittedly easier to maintain performance when you are dealing with tight corridors or isolated rooms. The lighting also seems spot-on, although it is hard to tell whether it is dynamic or precomputed.
The HTC Vive developer kit is coming soon, ahead of a consumer launch in the autumn.
So, I know it’s late, but I finally got to try the OC-RIFT, and, well, I didn’t like it. It’s got me thinking: is VR actually possible? Various companies are marching forward with VR IS THE FUTURE (the ’90s all over again), yet we haven’t actually proven that the brain can handle it. I actually have no idea, but it seems… uncertain.
The concept, when you spell it out, is actually kinda… something. You are trying to simulate reality across a limited number of senses, immersive to the point where the brain believes what it’s seeing is real, yet you must also stay aware that you are in a game and can exit at any time…
I could go on with this stoned rant, but my point is this: I feel that VR is something that should be treated as a scientific question at this point, with actual scientific experiments to learn about augmented/enhanced/virtual reality and how it affects the human mind and body — learning about the ear and the eye, testing lower mammals’ ability to live in virtual spaces, studying the effects and strain on the eyes, the spine, and the mind; learning and experimenting, determining the possibilities and the limits. INSTEAD we have a march to monetize. VR was imagined as a finished product on the assumption that it would work, and that’s exactly what happened last time. I’m not saying it can’t work, and I’m not saying it can; I’m saying we should answer that question first.
And also, this is GREAT bubble hash!
“tested lower mammels” just a sick way of thinking, plain and simple.
“testing lower mammals”
I know what you’re thinking but unfortunately SJWs are classified as human beings.
Ugh. Watching that video cross-eyed to get the 3D effect was a pain. It brought back memories of those hidden 3D posters of yore. “Can you see it? No, I can’t! You? No, I… wait! Wait!! Yes, I can see it!”
Looks great. Kinda reminds me of the engine Sega uses for Alien: Isolation.
I’d love to see product manuals come like this, even if they are shown on a regular display. Imagine the technical manual of your laptop displayed as a user-navigated VR exploded view: the motherboard and its various components viewable in a virtual 3D space, with any port, and any device plugged into that port, highlighted when selected and graphically represented in 3D, alongside the relevant information, including which bus traces are active for the highlighted port and the device(s) plugged into it.
A real-time 3D representation of the actual device, with queries done via the mouse or another input device. Users would have a 3D schematic view, with troubleshooting queries and visual cues indicating exactly which motherboard ports were populated and which were empty, and which port, plug, or controller chip was servicing which device, with bandwidth and other statistics overlaid heads-up style. All the peripheral device makers could provide plug-and-play visual manuals with 3D-represented components for their respective PCI or port-connected (internal or external) devices, giving the user additional 3D schematics and a 3D virtual interface for configuration, troubleshooting, and so on.
Imagine being able to zoom in on an internal SSD, visually go inside an actual VR representation of the drive, and point and click on the visual representation of the SSD’s controller chip, with heads-up projected data and graphics showing the amount of error correction going on and other information like total GBs/MBs written and read, or re-provisioned data blocks.
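Incidentally, most of the drive statistics imagined here are already exposed through S.M.A.R.T. — the VR part would just be a new front end on that data. Below is a minimal sketch of the data-extraction half, assuming a report shaped like the JSON that `smartctl --json` emits for an ATA drive; the sample values and model name are made up for illustration.

```python
import json

# Hypothetical sample report, modeled on `smartctl -a --json /dev/sda` output
# for an ATA SSD. All values here are invented for illustration.
SAMPLE = """
{
  "model_name": "Example SSD 500GB",
  "ata_smart_attributes": {
    "table": [
      {"id": 5,   "name": "Reallocated_Sector_Ct", "raw": {"value": 0}},
      {"id": 241, "name": "Total_LBAs_Written",    "raw": {"value": 123456789}}
    ]
  }
}
"""

def smart_summary(report_json: str) -> dict:
    """Pull out the stats the VR overlay would display for the drive."""
    report = json.loads(report_json)
    table = report["ata_smart_attributes"]["table"]
    attrs = {row["name"]: row["raw"]["value"] for row in table}
    # An ATA LBA is conventionally 512 bytes.
    written_gb = attrs.get("Total_LBAs_Written", 0) * 512 / 1e9
    return {
        "model": report["model_name"],
        "reallocated_sectors": attrs.get("Reallocated_Sector_Ct"),
        "gb_written": round(written_gb, 1),
    }

print(smart_summary(SAMPLE))
# → {'model': 'Example SSD 500GB', 'reallocated_sectors': 0, 'gb_written': 63.2}
```

In a real build you would feed in live `smartctl` output instead of a canned string, and the returned dictionary is exactly the kind of payload a heads-up overlay could render next to the controller chip.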
That would be one hell of a GUI: a real-time 3D VR GUI and manual for your actual system hardware, with point-and-click or VR grab with virtual hands, where every device’s components give the user their relevant operation, troubleshooting, and configuration options at the virtual touch of a virtual finger or other pointing interface, using VR goggles or the LCD display.