
Microsoft shows off cloud-based physics rendering and virtual reality via Internet Explorer

Mark Hachman | April 4, 2014
Company takes a page out of OnLive's playbook.


Virtual reality through the Oculus Rift headset is novel enough. But VR through Internet Explorer, enabled with just a few lines of code? Microsoft just demonstrated it at its Build conference, drawing applause from a room packed with developers. 

On this, the second day of its annual developers conference, Microsoft is focusing on enhancements to its Visual Studio software development toolkit to assist in the creation, publication, debugging, and porting of Windows apps. That will be immeasurably helped, Microsoft executives said, by the company's shift to "universal apps" that share common code between the Windows, Windows Phone, and Xbox platforms.

Microsoft's Scott Guthrie, on stage at the Microsoft Build conference.

But the cloud serves another purpose beyond hosting apps: raw computing power. Microsoft's Azure service already allows developers to spin up virtual machines and SQL databases, and to host websites in the cloud. But in Thursday's demonstration, executives also showed off what could become a capability in future games: blowing stuff up using a cloud-based physics engine instead of relying on a local machine's hardware resources.

Microsoft gave no indication that either the Oculus Rift demo they were about to present or what it called Cloud Assist will ever see the light of day as a full-fledged product. But both point to the power of what Microsoft has running behind the scenes.

Microsoft hosts tens of thousands of servers—stored within modular containers with their own cooling and power—in locations throughout the United States and abroad. These servers do everything from hosting enterprise websites and services, to conducting matchmaking services for the Xbox, to powering cloud services such as Outlook.com.

On Thursday, Microsoft corporate vice president Steve Guggenheimer showed off a virtual scene rendered with WebGL, a JavaScript graphics API, on a smartphone, and from there on a PC. His next trick was much more impressive: By dropping 13 lines of code into the scene using a JavaScript 3D framework called Babylon.js—using the "go big button," according to his partner, technical fellow John Shewchuk—he enabled the scene to be virtually toured through the Oculus Rift headset.

This scene was rendered in WebGL, just before it was rendered for the Oculus Rift.

"One of the keys to making this work is that we're able to handle this loop at 200Hz," Guggenheimer said. "Any of the other browsers out there don't have that capability. But it's also the speed of the PC ecosystem": the USB peripherals and the development tools that let developers create these experiences with remarkably little effort, Guggenheimer said.
