Tuesday, June 30, 2009

Crysis on a cell phone

The following article and video show that even the most demanding games of today can be played on any web-enabled device:


A portable game platform with dedicated game controls such as the PSP connected via WiFi to a 3G phone seems more comfortable and mobile than the setup in the video. In my opinion this is just a proof of concept and I think that the real crowd pleaser will be the photorealistic virtual world with LightStaged characters.

Tuesday, June 16, 2009

Two new OTOY videos

Two new videos showing the server side rendering in action and giving some specifics about the lag:


The first video shows a guy playing Left 4 Dead and Crysis on his HD TV, hooked up to his laptop, which is connected to the OTOY server through broadband access. He can switch instantly between both games while playing, very impressive. According to the TechCrunch article, EA is partnering with OTOY.

The second video shows GTA4 being played in a web browser while running in the cloud. According to the tester, there is some lag, but it's very playable. Personally, I think GTA4 is not the best game to show off server side rendering because it runs on a terribly unoptimized, buggy and laggy engine. There's no way you can tell if the lag is due to the crappy engine or to the connection. Unfortunately, there's no info about the geographical distance between the player and the cloud. What it does show is that a GTA game with LightStage quality characters and environments could definitely be possible and playable when rendered in the cloud. In fact, I asked Jules this very question yesterday and he confirmed to me that it was indeed possible.

Update: many people cannot believe that OTOY can render so many instances on a single GPU. I checked my notes, and as Jules explained it to me, he can run 10 instances of a high-end game (like Crysis) and up to 100 instances of a low-end game per GPU. The GPU has a lot of idle, unused resources in between the rendering of frames for the same instance. OTOY efficiently uses this idle time to render extra instances. The games shown in the videos (Crysis, Left 4 Dead, GTA IV) are of course traditionally rendered. When using voxel ray tracing, OTOY scales even better.
OTOY can switch between rasterizing and voxel raycasting, because it uses a point cloud as input. Depending on the complexity, one is faster than the other. The scorpion demo for example (the Bug Snuff movie), was first rendered as voxels, but rasterizing it was faster. The Ruby demo from last year was completely raytraced (the voxel rendering is not limited to raycasting, but uses shadow rays and reflection rays as well, so it could be considered as true raytracing).
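The idle-time argument above boils down to simple arithmetic: if a frame only takes a fraction of the frame interval to render, the remainder can be spent on other instances. Here is a back-of-the-envelope sketch with purely illustrative per-frame render times (not figures from Jules), chosen so the numbers line up with the 10 and 100 instances he mentioned:

```python
# Estimate how many game instances fit on one GPU when the idle time
# between frames of a single instance is reused for other instances.
# The render times below are hypothetical, for illustration only.

def instances_per_gpu(render_ms_per_frame, target_fps=30):
    """How many instances fit in one frame interval at the target fps."""
    frame_interval_ms = 1000.0 / target_fps
    return int(frame_interval_ms // render_ms_per_frame)

# A demanding game needing ~3.3 ms of GPU time per frame at 30 fps:
print(instances_per_gpu(3.3))    # -> 10
# A lightweight game needing ~0.333 ms of GPU time per frame:
print(instances_per_gpu(0.333))  # -> 100
```

In reality scheduling overhead, video encoding and memory pressure would eat into this, but it shows why a GPU that renders one game comfortably can host many cheaper instances.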

A quantum leap of faith

Believe it or not, yesterday I was on the phone with Jules Urbach, the man behind OTOY and LightStage (I guess writing a blog does pay off ;-). He had offered me the opportunity to talk a bit about OTOY, LightStage, the Fusion Render Cloud and where things are heading. It was my first interview ever and I was just shooting one question after another. Too bad I was bloody nervous (I haven’t been that nervous since my last oral exam). Due to my nervousness and my limited understanding of graphics programming, I didn’t absorb everything he said, but I think I’ve got the bottom line. He answered a lot of my OTOY-related technical questions (unfortunately OTOY isn’t open source, so obviously he couldn’t answer every single one) and offered me some insight into the cloud computing idea. What follows is my own interpretation of the information that Jules gave me.

Just a couple of weeks ago, I was still wondering what the technical specifications of the next generation of consoles would be like. But after yesterday… frankly I don’t give a damn anymore. The promise of OTOY and server side rendering is even bigger than I initially thought. In fact it’s huge, and that’s probably an understatement. In one interview, Jules said that it “is comparable to other major evolutions of film: sound, color, cinemascope, 70mm, THX, stereoscopic 3D, IMAX, and the like.” I think it’s even bigger than that, and it has the potential to shake up and “transform” the entire video game industry.

Server side rendering opens up possibilities for game developers that are really hard to wrap your head around. Every game developer has learned to work inside the limitations of the hardware (memory, polygon and texture budgets, limited number of lights, number of dynamic objects, scene size and so on). These budgets double in size only every 12 to 18 months. Now imagine that artists and level designers could make use of unlimited computational resources and no longer have to worry about technical budgets. They could make the scene as big as they want, with extreme detail (procedurally generated at the finest level) and with as much lighting information and as many texture layers as they desire. That’s exactly what server side rendering combined with OTOY’s voxel ray tracing might offer. It requires a shift in the minds of game developers and game publishers that could be considered a quantum leap of faith. The only limitation is their imagination (besides time and money, of course), and anything you see in offline rendered CG could be possible in real time. Jules is also working on tools to facilitate the creation of 3D environments and to keep development budgets reasonable. One of those tools is a portable LightStage, which is (as far as I understood) a cut-down version of the normal LightStage that can be mounted onto a moving car to capture whole streets and cities and convert them into a 3D point cloud. It’s much better than LIDAR, because it captures lighting and texture information as well. Extremely cool if it works.

Because the server keeps the whole game scene in memory and because of the way that the voxel ray tracing works, OTOY and the render cloud can scale very easily to tens of thousands of users. Depending on the resolution, he can run 10 to 100 instances of a game scene on one GPU. And you can interconnect an unlimited number of GPU’s.
The best thing about the server side rendering idea is that everyone wins: IHVs, ISVs, game publishers and, most importantly, the gamers themselves (for a number of reasons which I talked about in one of my previous posts).
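To get a feel for the scaling claim, you can multiply it out: the per-GPU instance count (which, per the update above, depends on how demanding the game is and on the streamed resolution) times the number of GPUs in the cluster. The resolution-to-capacity table here is entirely hypothetical, just to make the arithmetic concrete:

```python
# Rough capacity sketch for a render cloud: concurrent players equals
# instances-per-GPU times GPU count. The per-resolution figures below
# are hypothetical, loosely based on the 10-to-100 range quoted above.

CAPACITY_PER_GPU = {"1080p": 10, "720p": 25, "480p": 100}

def cloud_capacity(num_gpus, resolution):
    """Concurrent game instances a cluster of num_gpus could serve."""
    return num_gpus * CAPACITY_PER_GPU[resolution]

print(cloud_capacity(1000, "1080p"))  # -> 10000
print(cloud_capacity(1000, "480p"))   # -> 100000
```

The point is just that capacity grows linearly with GPU count, which is why "tens of thousands of users" stops sounding outlandish.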

In conclusion, I guess every PC gamer has dreamt at some point about a monster PC with terabytes of RAM and thousands of GPUs working together, adding up to a million unified shaders. Until recently, no one in their right mind would build such a monster, because economically it makes no sense to spend a huge load of cash on the development of a game that would make full use of such enormous horsepower and could only be played by one person at a time. But with the rapid spread of broadband internet access, suddenly a whole lot of people are able to play on that monster PC and it becomes economically viable to make such an extremely high quality game. I think OTOY will be the first to achieve this goal. Following the increasing trend of office applications being run in the cloud, server side rendering is going to be the next step in the evolution of the video game industry and it will make “client-side hardware” look like an outdated concept. Jules told me he thinks that in the future, the best looking games will be rendered server side and that there’s no way that expensive local hardware (on the client side) will be able to compete. I for one can’t wait to see what OTOY will bring in the near future.

Saturday, June 6, 2009

Has it been 1 year already?

The Ruby city demo, the very reason why I started this blog, was first shown to the world on June 4, 2008. Check the date on this youtube video: http://www.youtube.com/watch?v=zq1KTtU8loM

One full year, I cannot believe it. AMD has released every Ruby demo to the public well within a year after the introduction of the hardware. It began with Ruby: Double Cross on the Radeon X800, then Ruby: Dangerous Curves on the X850, followed by Ruby: The Assassin on the X1800, and finally Ruby: Whiteout on the HD 2900. So it made perfect sense that the Voxelized Ruby would come out within a few months of the unveiling of the Radeon 4870. Even Dave Baumann said on the Beyond3D forum that the demo would be released for the public to play with.

So what went wrong? Did ATI decide to hold back the demo or was it Jules Urbach? I think the initial plan was to release the demo at a certain point, but the voxel technology was not finished and took longer to develop than expected. To enjoy the demo at a reasonable framerate, it had to be run on two 4870 cards or on the dual-GPU 4870X2, so only the very high end of the consumer market would be able to run it. The previous Ruby demos were made by RhinoFX, and this was the first time that OTOY made a Ruby demo. Either way, if AMD is making another Ruby demo (with or without OTOY, but I prefer with), it has to look better than the last one, and they had better release it within a reasonable amount of time.

Something else that crossed my mind: OTOY is now being used to create a virtual world community (Liveplace/Cityspace) and I think OTOY's technology would be a perfect match for PlayStation Home. Virtual world games are much more tolerant to lag than fast-paced shooters or racers, and I think that even a lag of 500 milliseconds would be doable. Imagine you're playing a game on your PS3. Once you're done playing, you automatically end up in Home, being rendered in the cloud. Sony has trademarked PS Cloud (http://www.edge-online.com/news/sony-trademarks-ps-cloud), and I wouldn't be surprised if Sony moved the rendering for PS Home from the client to the server side sooner or later.

Friday, June 5, 2009

A possible faster-than-light solution for the latency problem?

Interesting article: http://spectrum.ieee.org/computing/networks/game-developers-see-promise-in-cloud-computing-but-some-are-skeptical/0

I have totally embraced the cloud computing idea. I hope OTOY and OnLive can pull it off and create a paradigm shift from client to cloud rendering. The main problem seems to be lag. Apart from the extra lag introduced by encoding/decoding the video stream at the server/client side respectively, which should not be greater than a couple of milliseconds, there is lag due to the time that the input/video signal needs to travel the distance between client and server, which can amount to several tens to hundreds of milliseconds. This is due to the fact that information cannot travel faster than the speed of light (via photons or electromagnetic waves). Quantum physicists have discovered ways to do quantum computing and quantum encryption at 10,000 times the speed of light, but they all agreed that it was not possible to send information faster than lightspeed, because they could not control the contents of quantum entangled photons. But very recently, Graeme Smith, a researcher at IBM, has proposed a way to "transmit large amounts of quantum information", described in the following paper, published in February 2009:



If his theory holds true, IBM or someone else could build a computer peripheral based on quantum technology (sort of like an Apple AirPort) that can communicate large amounts of data instantaneously! Distance between client and cloud would no longer be a problem and transmission lag would be non-existent! It would make playing server side rendered games an almost lag-free experience and the ultimate alternative to costly, power-hungry consoles.
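For reference, the classical lower bound on transmission lag that the post describes is easy to compute: light in optical fiber travels at roughly c divided by the fiber's refractive index (about 1.47), so distance alone sets a hard floor under the round-trip time, before any routing or encoding overhead is added:

```python
# Lower bound on round-trip signal lag between player and data center,
# assuming a straight fiber run (real routes are longer and add
# switching delay, so actual lag is higher than this floor).

C_VACUUM_KM_S = 299_792.0  # speed of light in vacuum, km/s
FIBER_INDEX = 1.47         # typical refractive index of optical fiber

def round_trip_ms(distance_km):
    """Minimum round-trip time in milliseconds over fiber."""
    one_way_s = distance_km * FIBER_INDEX / C_VACUUM_KM_S
    return 2.0 * one_way_s * 1000.0

for km in (50, 500, 5000):
    print(f"{km:>5} km: {round_trip_ms(km):6.2f} ms round trip")
```

Even a 5000 km run costs on the order of 50 ms before anything else happens, which is exactly why data-center proximity matters so much for cloud rendering today.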