Thursday, August 7, 2014

Game meta post

Over the years I've created a large list of game ideas, some of which are just random game mechanics, others settings, story elements, or more fully developed stories.

I've noticed, when talking to friends and random people I've met over the years, that one of these ideas consistently silences the people around me: people staring at me, not realizing their jaw has dropped.

It's an idea I'd love to develop into a full game, but unfortunately it's not one that would work well with a tiny team. It would simply be too large a project for fewer than 10 people, even in its most simplified form. It's also very story-heavy, so it would definitely require help from someone who has more experience fleshing out stories & dialogue.

Unfortunately I can't describe the idea in such a public forum; putting it out there would make it harder to actually turn it into a game.

So I'm not sure what to do with this. How do you form a large enough team (with the right people) to make something like this? How do you fund that team? How have other people done this?

Monday, June 16, 2014

Oculus Rift + mouse control experiment II

Added Mecanim animations to the model to animate the legs (not exactly great animations right now, but good enough to show that it's possible). Also scaled the head to near zero to make sure that you don't see pieces of the inside of the head when you rotate the view using the Oculus Rift. (Thanks for the suggestion, Merlijn Van Holder!)

Update: I noticed that the Oculus Rift neck model has a serious flaw: it rotates the eyes around the neck as if your eyes were in the back of your head instead of the front. So when you, for instance, rotate your head to look at your right shoulder, your eyes end up way back over your left shoulder. I'm assuming this will no longer be an issue when the DK2 arrives ...
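A tiny numeric sketch of why the pivot offset matters (the function and offset values below are my own illustration, not Oculus SDK code): rotating an eye offset that sits in front of the neck pivot versus behind it gives mirrored results.

```python
import math

def yaw(v, degrees):
    """Rotate a 3D vector (x, y, z) about the vertical y axis."""
    a = math.radians(degrees)
    x, y, z = v
    return (x * math.cos(a) + z * math.sin(a), y, -x * math.sin(a) + z * math.cos(a))

# Eyes sit a little in FRONT of the neck pivot (+z = forward, +x = right).
eye_offset_front = (0.0, 0.0, 0.1)
# The flawed neck model effectively places them BEHIND the pivot.
eye_offset_back = (0.0, 0.0, -0.1)

# Turn the head 90 degrees to the right (toward the right shoulder).
print(yaw(eye_offset_front, 90))  # eyes swing to the right side, as expected
print(yaw(eye_offset_back, 90))   # eyes swing to the left side: the bug
```

With the sign flipped, every head rotation moves the eyes to the opposite side, which matches the "eyes over the wrong shoulder" symptom above.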

Wednesday, June 11, 2014

Gunship X coming to PS Vita this year!

Oculus Rift + mouse control experiment

Just a simple experiment on how to control a gun inside a game-like setting using a mouse together with the Oculus Rift.

I obviously used Unity here, together with some plugins from the Unity Asset Store.
The camera is placed inside the model's head; I just used a model from one of the Unity Mecanim examples.

The head is moved using IK. Ideally the head would not be rendered at all, but Mecanim is a complete black box and only exposes a limited amount of its internals.
As far as I can see, it's impossible to avoid rendering the head without actually creating a model that has no head.

I should be able to animate the legs using Mecanim, but I haven't tried that yet.
You would then be able to see yourself walk :)

The hand is also moved using IK. Unfortunately, Mecanim doesn't allow me to modify the position of the elbow, so it sometimes ends up inside the character.

It's surprisingly easy to aim this way ... as long as you have that 'laser pointer' that is :)

Monday, May 26, 2014

Gunship X

My new game, Gunship X, has been released on the iOS App Store!

The game is basically a spiritual successor to our previous game, Zombie Gunship. Since we no longer have the rights to Zombie Gunship, this game doesn't have any zombies in it.

Instead we have Starship Troopers-inspired bugs, and in this game they shoot back!

It's a very intense action game that will get your adrenaline pumping :) The learning curve might be a bit steep; that's one of the things we're addressing in the next update, which will also include a lot of new content.

Monday, February 24, 2014

Voxelization meshes

Just a random thought: if the future of real-time indirect lighting / glossy reflections is voxelization (using techniques such as voxel cone tracing), and ignoring for a moment that this might not be ready for prime time just yet, wouldn't it make sense to have lower-resolution 'voxelization meshes', just like we have separate meshes for collision detection? The voxelized world representation is an approximation anyway, so it might save a lot of cycles (at the cost of memory) ..

On the other hand, the static meshes could probably be pre-voxelized, and the most dynamic meshes would be skinned meshes .. which would require you to skin the same character twice with two different meshes? It might still be worth it if the voxelization representation is simple enough ..
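As a rough toy illustration of the trade-off (not an actual voxelization pipeline; the function, sphere stand-in, and resolutions are all mine), here's how much a coarser grid shrinks the representation. The post's actual suggestion is a simplified proxy mesh fed into the voxelizer, but the cycles-versus-fidelity trade is similar:

```python
import math

def voxelize(points, resolution):
    """Bin surface points into a sparse voxel grid covering [-1, 1]^3."""
    cell = 2.0 / resolution
    return {tuple(min(resolution - 1, int((c + 1.0) / cell)) for c in p)
            for p in points}

# Sample points on a unit sphere as a stand-in for a mesh surface.
points = [(math.sin(t) * math.cos(p), math.sin(t) * math.sin(p), math.cos(t))
          for t in [i * math.pi / 64 for i in range(65)]
          for p in [j * 2 * math.pi / 128 for j in range(128)]]

full = voxelize(points, 64)    # render-mesh resolution
coarse = voxelize(points, 16)  # a dedicated low-res 'voxelization mesh'
print(len(full), len(coarse))  # far fewer voxels to store and cone-trace against
```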

Monday, January 27, 2014

VR head movement momentum

So about two weeks ago I went to Steam Dev Days and had a blast! Valve's presentations were certainly eye-opening on some issues. I can't wait until they put the videos online so I can see the presentations I wasn't able to attend; I'm sure they'll be great too.

Anyway, in one of the presentations (I think it was "The Wild West of VR") the guys behind AaaaaAAaaaAAAaaAAAAaAAAAA talked about the Oculus Rift version of their game and how they discovered an interesting trick: they would slowly tilt the world, which caused the player to automatically compensate by moving their head in the opposite direction. This made players feel like they were falling downwards. Also interesting: they only had to do a 45-degree rotation to give the impression of a 90-degree turn, so that players wouldn't get neck pain while playing the game. Players would swear that they were looking straight down even though they weren't.

In other VR presentations they talked about sideways and backwards movements making players feel sick, etc. (which is sort of common knowledge already, I guess).

After these presentations I suddenly wondered ... maybe we've got this all wrong. What if it's not the movements themselves that make people feel sick (or at least not completely)?
What if the problem is that we're not simulating momentum for the head when doing these movements?
I mean, if you move sideways in real life, there's no way your head remains perfectly still; it bobs slightly to the left or right (depending on the direction you're moving). Maybe our brain expects this, and when these natural movements are missing, people get sick?
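A minimal sketch of what "simulating head momentum" could mean, assuming a simple spring-damper model (all constants and names below are my own guesses, not anything from the presentations): the virtual head briefly lags opposite to the body's acceleration, then settles back.

```python
def simulate_head_lag(accels, dt=1/90, stiffness=80.0, damping=12.0):
    """Spring-damper head offset: the head is 'left behind' by body
    acceleration, then pulled back toward its rest position."""
    offset, vel = 0.0, 0.0
    out = []
    for a in accels:
        # Semi-implicit Euler: body acceleration pushes the head the other way,
        # the spring and damper pull it back.
        vel += (-stiffness * offset - damping * vel - a) * dt
        offset += vel * dt
        out.append(offset)
    return out

# A sudden sideways acceleration to the right (+), then coasting ...
trace = simulate_head_lag([5.0] * 30 + [0.0] * 60)
# ... the head first swings the opposite way, then settles back toward zero.
print(min(trace), trace[-1])
```

In a real integration this offset would be added to the HMD camera position per frame; whether it actually reduces sickness is exactly the open question of the post.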

Unfortunately this is very hard for me to test as I seem to be completely immune to motion sickness :-/

Friday, December 13, 2013

Screen-oriented basis vectors

Just a random idea: what if light were cached in texture space using screen-oriented basis vectors?
Basically just like the Half-Life 2 "Basis for Radiosity Normal Mapping" vectors, only with all vectors oriented towards the screen to have more accuracy there.

The texture-space cache would have to be updated eventually, when the camera moves too much relative to the cached pixels. But since we store more than a single direction, we can interpolate (and slightly extrapolate) between the vectors to increase the time the specular reflections remain valid between motions. The further away pixels are, the longer they can remain cached. For stereo 3D rendering, both cameras can use the same data.
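For reference, the Half-Life 2 basis is three orthonormal directions leaning out of the tangent plane. A sketch of re-orienting them toward the screen might look like this (the helper function and its camera-vector parameters are my own illustration of the idea, not an existing API):

```python
import math

# The three Half-Life 2 "Radiosity Normal Mapping" basis directions,
# normally expressed in tangent space (z = surface normal).
HL2_BASIS = [
    (-1 / math.sqrt(6),  1 / math.sqrt(2), 1 / math.sqrt(3)),
    (-1 / math.sqrt(6), -1 / math.sqrt(2), 1 / math.sqrt(3)),
    ( math.sqrt(2 / 3),  0.0,              1 / math.sqrt(3)),
]

def screen_oriented_basis(cam_right, cam_up, cam_forward):
    """Re-orient the HL2 basis so its z axis points back at the camera
    instead of along the surface normal."""
    to_cam = tuple(-c for c in cam_forward)
    return [tuple(bx * r + by * u + bz * t
                  for r, u, t in zip(cam_right, cam_up, to_cam))
            for bx, by, bz in HL2_BASIS]

# Camera looking down -z, with +x right and +y up.
basis = screen_oriented_basis((1, 0, 0), (0, 1, 0), (0, 0, -1))
print(basis)  # three orthonormal directions, all leaning toward the camera
```

Each cached texel would store one radiance value per basis direction; because the three directions bracket the view direction, a new view vector can be reconstructed by blending them, which is what lets the cache survive moderate camera motion.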