Monday, August 16, 2010

Virtual texturing on iPhone

Sorry for the lack of updates; I've been on vacation in Illinois for the last three weeks, visiting my in-laws, and am only now starting to recover from jet lag. Before I left, Matthew Baranowski (start blogging already!) and I submitted a proposal to write an article for a book (I'm not sure if I'm allowed to say anything more about it, so I won't, at least not yet), and while I was on vacation it got accepted! Yesterday (Sunday) was the deadline for the first draft, and I only got back last Wednesday.
If I hadn't taken my laptop with me and worked on the article during my vacation, I would've had only a couple of days to write it! Phew!

Today we submitted the draft and we should be getting comments on the article eventually.

Anyway.

While I was on vacation, John Carmack showed a technical demo on the iPhone that used virtual texturing.
It's pretty cool, but I'm actually not surprised that it works. In fact, it's something I've wanted to try myself, but haven't had the time to do yet.

Why does it work so well on the iPhone? Well, from what I've heard, they've baked lighting into most surfaces in Rage, leaving only the specular channel to store and render.

IO latency and throughput are less of an issue on the iPhone, because flash memory serves as the 'disk', and because pages don't need to be as high resolution as on consoles.

So if you have pixel shaders, which the iPhone does, it's really not that hard.
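To give an idea of how little per-pixel work is actually involved, here's a rough sketch of the indirection a virtual-texturing pixel shader performs, written as plain C rather than an actual shader, and with made-up page and cache sizes (these are illustrative numbers, not Rage's): look up the virtual page in a page table, then remap the UV into the physical page cache.

```c
/* Illustrative sizes only -- not Rage's actual numbers. */
#define VT_PAGES    64   /* virtual texture is 64x64 pages   */
#define CACHE_PAGES  8   /* physical page cache is 8x8 pages */

typedef struct { int px, py; } PageEntry;  /* physical slot of a virtual page */

/* Page table: on the GPU this would be a small lookup texture
 * that the pixel shader samples first. */
static PageEntry page_table[VT_PAGES][VT_PAGES];

/* The per-fragment indirection: map a virtual UV in [0,1) to a UV
 * inside the physical page cache, via the page table. */
static void virtual_to_physical(float u, float v, float *pu, float *pv)
{
    int vx = (int)(u * VT_PAGES);     /* which virtual page   */
    int vy = (int)(v * VT_PAGES);
    float fu = u * VT_PAGES - vx;     /* fraction inside page */
    float fv = v * VT_PAGES - vy;
    PageEntry e = page_table[vy][vx];
    *pu = (e.px + fu) / CACHE_PAGES;  /* UV in the page cache */
    *pv = (e.py + fv) / CACHE_PAGES;
}
```

A second texture fetch at the resulting UV then reads the actual texels, so the extra per-pixel cost is one dependent lookup plus this bit of arithmetic, which is why fairly modest shader hardware is enough.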

The only thing that is more problematic on the iPhone is storage space, which is alleviated somewhat by the lower-resolution pages. Texture decompression would be more of a problem on the iPhone, however, so they're probably not doing that.

I'm still wondering whether id Software is subdividing its geometry at the highest-resolution page level, though, in which case you wouldn't even need a pixel shader.