Monday, June 16, 2014

Oculus rift + mouse control experiment II

Added Mecanim animations to the model to animate the legs (not exactly great animations right now, but good enough to show that it's possible). Also scaled the head to near zero to make sure you don't see pieces of the inside of the head when you rotate the view using the Oculus Rift (thanks for the suggestion, Merlijn Van Holder!).
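For reference, a minimal sketch of how that head-hiding trick can be done in Unity, assuming a Mecanim humanoid rig; the component name and scale value are just illustrative.

```csharp
using UnityEngine;

// Shrinks the humanoid head bone to (near) zero so a camera placed inside
// the head no longer sees the inside of the mesh.
[RequireComponent(typeof(Animator))]
public class HideHead : MonoBehaviour
{
    Transform head;

    void Start()
    {
        // Assumes the Animator uses a humanoid avatar; otherwise this returns null.
        head = GetComponent<Animator>().GetBoneTransform(HumanBodyBones.Head);
    }

    void LateUpdate()
    {
        // Done after animation so nothing overwrites the scale later in the frame.
        if (head != null)
            head.localScale = Vector3.one * 0.0001f;
    }
}
```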


Update: I noticed that the Oculus Rift neck model has a serious flaw: it rotates the eyes around the neck as if your eyes were in the back of your head instead of the front. So when you, for instance, rotate your head to look at your right shoulder, your eyes end up way back over your left shoulder. I'm assuming this will no longer be an issue when the DK2 arrives ...

Wednesday, June 11, 2014

Gunship X coming to PS Vita this year!


Oculus rift + mouse control experiment

Just a simple experiment on how to control a gun inside a game-like setting using a mouse together with the Oculus Rift.


I obviously used Unity here, together with some plugins from the Unity Asset Store.
The camera is placed inside the model's head; I just used a model from some Unity Mecanim example.

The head is moved using IK. Ideally the head would not be rendered at all, but Mecanim is a complete black box and only exposes a limited amount of its internals.
So as far as I can see, it's impossible to skip rendering the head without actually creating a model without a head.
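For the curious, something along these lines is roughly how the head can be driven with Mecanim's built-in look-at IK. This is only a sketch: it assumes the Animator controller layer has "IK Pass" enabled, and the aim target is a placeholder for whatever the head should track, presumably a point along the Rift's view direction.

```csharp
using UnityEngine;

// Drives the head towards an aim target using Mecanim's look-at IK.
// IK calls only take effect inside OnAnimatorIK, and the Animator
// controller layer needs "IK Pass" enabled.
[RequireComponent(typeof(Animator))]
public class HeadLookAt : MonoBehaviour
{
    public Transform aimTarget;   // world position the head should look towards
    Animator animator;

    void Start() { animator = GetComponent<Animator>(); }

    void OnAnimatorIK(int layerIndex)
    {
        if (aimTarget == null) return;
        // Weights: overall, body, head, eyes, clamp (made-up values).
        animator.SetLookAtWeight(1f, 0.2f, 0.8f, 0f, 0.5f);
        animator.SetLookAtPosition(aimTarget.position);
    }
}
```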

I should be able to animate the legs using Mecanim, but I haven't tried that yet.
You would then be able to see yourself walk :)

The hand is also moved using IK. Unfortunately, Mecanim doesn't allow me to modify the position of the elbow, so it ends up moving inside the character.

It's surprisingly easy to aim this way ... as long as you have that 'laser pointer' that is :)
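Here's a rough sketch of that mouse-aimed hand IK plus 'laser pointer', again assuming an Animator layer with "IK Pass" enabled; the distances, weights, and component names are all made up for illustration, and the LineRenderer is assumed to already have two positions configured.

```csharp
using UnityEngine;

// Casts a ray from the camera through the mouse position, pulls the right hand
// to a point along that ray with Mecanim IK, and stretches a LineRenderer along
// the aim direction as a simple laser pointer.
[RequireComponent(typeof(Animator))]
public class MouseAimHand : MonoBehaviour
{
    public Camera aimCamera;
    public LineRenderer laser;
    public float handDistance = 0.6f;   // how far in front of the camera the hand sits
    public float laserLength = 100f;

    Animator animator;

    void Start() { animator = GetComponent<Animator>(); }

    void OnAnimatorIK(int layerIndex)
    {
        Ray aimRay = aimCamera.ScreenPointToRay(Input.mousePosition);

        // Place and orient the right hand a fixed distance along the aim ray.
        Vector3 handPos = aimRay.origin + aimRay.direction * handDistance;
        animator.SetIKPositionWeight(AvatarIKGoal.RightHand, 1f);
        animator.SetIKPosition(AvatarIKGoal.RightHand, handPos);
        animator.SetIKRotationWeight(AvatarIKGoal.RightHand, 1f);
        animator.SetIKRotation(AvatarIKGoal.RightHand, Quaternion.LookRotation(aimRay.direction));

        // Draw the laser from the hand out along the aim direction.
        if (laser != null)
        {
            laser.SetPosition(0, handPos);
            laser.SetPosition(1, aimRay.origin + aimRay.direction * laserLength);
        }
    }
}
```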

Monday, May 26, 2014

Gunship X



My new game, Gunship X has been released on the iOS app store!

The game is basically a spiritual successor to our previous game, Zombie Gunship. Since we no longer had the rights to Zombie Gunship, this game doesn't have any zombies in it.

Instead we have Starship Troopers-inspired bugs, and in this game they shoot back!

It's a very intense action game that will get your adrenaline pumping :) The learning curve might be a bit steep; that's one of the things we're addressing in the next update, which will also include a lot of new content.


https://itunes.apple.com/us/app/gunship-x/id763997659?mt=8

Monday, February 24, 2014

Voxelization meshes

Just a random thought:
If the future is going to use voxelization to perform real-time indirect lighting / glossy reflections etc. (using techniques such as voxel cone tracing), and ignoring for a moment that this might not be ready for prime time just yet, wouldn't it make sense to have lower-resolution 'voxelization meshes', just like we have different meshes for collision detection? Since the voxelized world representation is an approximation anyway, it might save a lot of cycles (at the cost of memory) ..

On the other hand, static meshes could probably be pre-voxelized, and most of the dynamic meshes would be skinned meshes .. which would require you to skin the same character twice with two different meshes? It might still be worth it if the voxelization mesh is simple enough ..
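To make the idea concrete, here's a sketch of what such an asset could look like in Unity. All names are hypothetical; it's just the familiar render-mesh / collision-mesh pattern extended with one more proxy mesh (plus a slot for pre-voxelized static geometry).

```csharp
using UnityEngine;

// A renderer that, besides its detailed render mesh and its simplified collision
// mesh, also carries a dedicated low-poly mesh that is the only thing fed to the
// voxelization pre-pass of a voxel cone tracing pipeline.
public class VoxelizationProxy : MonoBehaviour
{
    public Mesh renderMesh;        // full-detail mesh, drawn as usual
    public Mesh collisionMesh;     // simplified mesh for physics
    public Mesh voxelizationMesh;  // even simpler mesh, only used to build the voxel representation

    // Static geometry could skip runtime voxelization entirely and store a pre-voxelized volume.
    public Texture3D prevoxelizedVolume;

    public Mesh MeshForVoxelization()
    {
        // Fall back to the collision or render mesh when no dedicated proxy exists.
        if (voxelizationMesh != null) return voxelizationMesh;
        return collisionMesh != null ? collisionMesh : renderMesh;
    }
}
```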

Monday, January 27, 2014

VR head movement momentum

So about 2 weeks ago I went to Steam Dev Days and had a blast! Valve's presentations were certainly eye-opening on some issues. Can't wait until they put the videos online so I can see the presentations I wasn't able to attend; I'm sure they'll be great too.

Anyway, in one of the presentations (I think it was "Wild West of VR") the guys behind AaaaaAAaaaAAAaaAAAAaAAAAA talked about the Oculus Rift version of their game and how they discovered an interesting trick where they would slowly tilt the world, which caused players to automatically compensate by moving their heads in the opposite direction. This made the players feel like they were falling downwards. It was also interesting that they only had to do a 45-degree rotation to give the impression of a 90-degree turn, so that players wouldn't get neck pain while playing the game. Players would swear they were looking straight down even though they weren't.

In other VR presentations, people talked about sideways and backwards movements making players feel sick, etc. (which is sort of common knowledge already, I guess).

After these presentations I was suddenly wondering ... maybe we've got this all wrong .. what if it's not the movements themselves that make people feel sick (or at least not completely)?
What if the problem is that we're not simulating momentum for the head when doing these movements?
I mean, if you move sideways in real life there's no way your head will remain perfectly still; it'll bob slightly to the left or right (depending on which direction you're moving). Maybe our brain is expecting this, and when these natural movements are missing, people get sick?

Unfortunately this is very hard for me to test as I seem to be completely immune to motion sickness :-/
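If someone who does get motion sick wants to try it, one possible experiment would be something like the sketch below: a spring-damped offset on the camera's neck pivot that lags behind the player rig's acceleration and then recovers. All the numbers and names are speculative, purely to illustrate the idea.

```csharp
using UnityEngine;

// Sits on a "neck pivot" transform parented to the player rig: when the rig
// accelerates, the pivot is pushed slightly in the opposite direction and a
// damped spring pulls it back, instead of the head moving perfectly rigidly.
public class HeadMomentum : MonoBehaviour
{
    public Transform playerRig;       // the thing that actually moves through the world
    public float stiffness = 60f;     // spring pulling the head back to its rest position
    public float damping = 8f;
    public float lagAmount = 0.002f;  // how strongly acceleration displaces the head (speculative)

    Vector3 lastRigPosition;
    Vector3 lastRigVelocity;
    Vector3 offset;                   // local offset of the head relative to the rig
    Vector3 offsetVelocity;

    void Start() { lastRigPosition = playerRig.position; }

    void LateUpdate()
    {
        float dt = Mathf.Max(Time.deltaTime, 1e-5f);

        // Estimate rig velocity and acceleration from its motion this frame.
        Vector3 rigVelocity = (playerRig.position - lastRigPosition) / dt;
        Vector3 rigAcceleration = (rigVelocity - lastRigVelocity) / dt;
        lastRigPosition = playerRig.position;
        lastRigVelocity = rigVelocity;

        // The head "resists" acceleration, then a damped spring pulls it back to rest.
        Vector3 accel = -rigAcceleration * lagAmount
                        - offset * stiffness
                        - offsetVelocity * damping;
        offsetVelocity += accel * dt;
        offset += offsetVelocity * dt;

        transform.localPosition = offset;
    }
}
```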

Friday, December 13, 2013

Screen orientated basis vectors

Just a random idea: what if light were cached in texture space using screen-orientated basis vectors?
Basically just like the Half-Life 2 "Basis for Radiosity Normal Mapping" vectors, only with all vectors orientated towards the screen to have more accuracy there.

The texture-space cache would eventually have to be updated when the camera moves too much relative to the cached pixels, but since we're not storing a single direction we can interpolate (and slightly extrapolate) between the vectors to increase the time the specular reflections remain valid while the camera moves. The further away pixels are, the longer they can remain cached. For stereo 3D rendering, both cameras can use the same data.
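As a sketch of what "orientated towards the screen" could mean: take the usual Half-Life 2 radiosity basis, but build it in a frame whose third axis points from the shaded point towards the camera. This is CPU-side C# purely for illustration (the real thing would live in a shader), and the frame construction is just one possible choice.

```csharp
using UnityEngine;

public static class ScreenOrientedBasis
{
    // The classic Half-Life 2 radiosity basis, normally defined in tangent space.
    static readonly Vector3[] Hl2Basis =
    {
        new Vector3( Mathf.Sqrt(2f / 3f),  0f,                   1f / Mathf.Sqrt(3f)),
        new Vector3(-1f / Mathf.Sqrt(6f),  1f / Mathf.Sqrt(2f),  1f / Mathf.Sqrt(3f)),
        new Vector3(-1f / Mathf.Sqrt(6f), -1f / Mathf.Sqrt(2f),  1f / Mathf.Sqrt(3f)),
    };

    // Builds the same basis, but in a frame whose third axis points from the
    // shaded point towards the camera, so directional accuracy is concentrated
    // around the view direction instead of the surface normal.
    public static void Build(Vector3 surfacePos, Vector3 cameraPos, Vector3[] result)
    {
        Vector3 toCamera = (cameraPos - surfacePos).normalized;

        // Any stable pair of axes perpendicular to toCamera will do here.
        Vector3 tangent = Vector3.Cross(Vector3.up, toCamera);
        if (tangent.sqrMagnitude < 1e-6f) tangent = Vector3.Cross(Vector3.right, toCamera);
        tangent.Normalize();
        Vector3 bitangent = Vector3.Cross(toCamera, tangent);

        for (int i = 0; i < 3; i++)
        {
            Vector3 b = Hl2Basis[i];
            result[i] = b.x * tangent + b.y * bitangent + b.z * toCamera;
        }
    }
}
```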

Tuesday, December 3, 2013

Oculus Rift & virtual texture space lighting

I was reading a thread on the Oculus Rift forums where some people were experimenting with super-sampling when rendering the Oculus Tuscany scene in Unity, which really improved the graphics quality. So some people were rendering the scene at 3-4x 1080p ... ouch.

So this made me think of that old idea of mine where you would render the lighting of a scene into the page buffer texture of a virtual texture setup ..
For the Oculus Rift, VR and stereo 3D in general, this would have the following benefits:

  • Lighting is (partially) decoupled from the camera (you don't need to do all the lighting twice)
  • Lighting can be (partially) cached; diffuse stays valid as long as nothing moves between it and the light
  • Lighting can be updated at a different rate than the frame rate if/when necessary ..
  • Diffuse and specular lighting could, perhaps, be updated at different rates (see the sketch below)? (this would mess up physically based materials though, unless lighting normalization is done during the final pass)
  • Specular could have a different update frequency depending on distance
  • Specular could be calculated for both cameras at once; materials would only have to be read once, and specularity could perhaps be compressed as 1 color + 2 intensities?
  • Lighting resolution can be decoupled from the actual resolution of the final rendering of the scene (which is fine because we generally want smooth shadows anyway)
  • Specular will obviously look better in texture space because of the usual normal-map aliasing issues
  • The final rendering pass, which is done twice, would be relatively simple and cheap

Note that you don't need unique texturing for this to work, just unique texture addressing.
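A tiny sketch of what the update-rate decoupling from the list above might look like on the CPU side. IVirtualTexturePages is a hypothetical interface standing in for whatever owns the resident page buffer and re-lights pages on the GPU; the intervals are arbitrary.

```csharp
using UnityEngine;

// Hypothetical owner of the resident virtual-texture pages; in reality the
// re-lighting would be GPU passes over the currently resident pages.
public interface IVirtualTexturePages
{
    void RelightDiffuse();                                  // valid for both eyes
    void RelightSpecular(Camera leftEye, Camera rightEye);  // shared specular pass
}

// Updates diffuse and specular page lighting at different rates, independent
// of the per-eye render rate.
public class TextureSpaceLightingScheduler
{
    public int diffuseInterval = 4;   // re-light diffuse every 4th frame
    public int specularInterval = 1;  // specular tracks the camera more closely
    int frame;

    public void Tick(IVirtualTexturePages pages, Camera leftEye, Camera rightEye)
    {
        if (frame % diffuseInterval == 0)
            pages.RelightDiffuse();
        if (frame % specularInterval == 0)
            pages.RelightSpecular(leftEye, rightEye);
        frame++;
        // The actual per-eye render passes just sample the page buffer, so they stay cheap.
    }
}
```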