GL extension for GPU skinning

Discussion about everything. New games, 3d math, development tips...

GL extension for GPU skinning

Postby REDDemon » Mon Oct 10, 2011 5:49 am

Hey, does anyone know which GL extension is used for GPU-based character animation? Or does it have to be done with shaders?
OpenGL is not hard. What you have to do is explained in the specifications. What is hard is dealing with poor OpenGL implementations.
REDDemon
 
Posts: 837
Joined: Tue Aug 31, 2010 8:06 pm
Location: Genova (Italy)

Re: GL extension for GPU skinning

Postby hybrid » Mon Oct 10, 2011 7:25 am

There's also the matrix palette extension, or something like that. It's a prominent extension for OpenGL ES 1.x, which lacks shader support, so there it is the only way to do GPU-based animation. It should be available in vanilla OpenGL as well (as GL_ARB_matrix_palette); otherwise shaders are required.
hybrid
Admin
 
Posts: 14143
Joined: Wed Apr 19, 2006 9:20 pm
Location: Oldenburg(Oldb), Germany

Re: GL extension for GPU skinning

Postby REDDemon » Tue Oct 11, 2011 5:07 am

hybrid wrote:There's also the matrix palette extension, or something like that. It's a prominent extension for OpenGL ES 1.x, which lacks shader support, so there it is the only way to do GPU-based animation. It should be available in vanilla OpenGL as well (as GL_ARB_matrix_palette); otherwise shaders are required.


Are you serious? Are you saying the Khronos Group never thought about skinned models in all the time since OpenGL was born? LOL
That's pretty funny.
OpenGL is not hard. What you have to do is explained in the specifications. What is hard is dealing with poor OpenGL implementations.
REDDemon
 
Posts: 837
Joined: Tue Aug 31, 2010 8:06 pm
Location: Genova (Italy)

Re: GL extension for GPU skinning

Postby hybrid » Tue Oct 11, 2011 12:43 pm

Hmm? Khronos has only been responsible for OpenGL for a few years now, so they probably didn't think about it from the beginning; you're right so far. However, I already told you about two ways to support hardware skinning: either upload several matrices and render each vertex under a weighted choice of those transformations, or execute the transformation in a vertex shader using whatever data is made available to it. The matrix palette has been an ARB extension since 2000, though I don't know if it was ever promoted to core. Vertex shaders have been core for a very long time now. So what else do you want to know? AFAICS I gave you all the information you asked for.
hybrid
Admin
 
Posts: 14143
Joined: Wed Apr 19, 2006 9:20 pm
Location: Oldenburg(Oldb), Germany

