Can't get a simple fragment program to do... anything at all, really

Sage
Posts: 1,199
Joined: 2004.10
Post: #1
My game uses GL_NV_depth_clamp to let me use zfail shadows without needing an infinite view frustum. Works *great* on NVIDIA cards.

Many moons ago, I asked what approaches I could take to emulate the depth clamp behavior on ATI cards, and arekkusu kindly pointed me to a fragment program from frustum.org which emulates depth clamping.

Here's the fragment program.

Code:
!!ARBfp1.0

MOV_SAT result.depth, fragment.position.z;

END

Pretty simple, and it makes sense. The trouble is, I can't get it to work.

So, I feel like I basically understand how to use it, but I figured I'd use frustum.org's Shader class for parsing and loading, since it's public domain anyway (and since, when I reach 1.0, I'll release my game with source and proper attributions anyhow).

Here are the Shader class files:

Shader.h
Shader.cpp

So, anyhow, in his sample code he brackets the drawing of the shadow volume as such:

Code:
... //various setup

if(have_nv3x) glEnable(GL_DEPTH_CLAMP_NV);
else {
    depth_clamp_fp->enable();
    depth_clamp_fp->bind();
}

... //draw extruded shadow volumes

if(have_nv3x) glDisable(GL_DEPTH_CLAMP_NV);
else depth_clamp_fp->disable();

... // cleanup

I've basically done the same thing in my code. But nothing happens: shadows don't draw *at all*.

Here's my code, which is basically similar. This is just one path; I've also got a two-pass version for when two-sided stencils aren't available, and subtractive shadowing for relatively slow machines (like the PB I'm writing this game on Cry).

Code:
void StencilShadow::singlePassModulative( vec4 lightPos )
{
    if ( _depthClampAvailable )
    {
        glEnable( GL_DEPTH_CLAMP_NV );
    }
    else if ( _depthClampFP )
    {
        // I know the shader is valid
        _depthClampFP->enable();
        _depthClampFP->bind();
    }

                      
    glDisable( GL_LIGHTING );
    glDisable( GL_CULL_FACE );
    glDepthFunc( GL_LESS );
    
    glDepthMask( GL_FALSE );
    glColorMask( GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE );    
    
    glEnable( GL_STENCIL_TEST );
    glEnable( GL_STENCIL_TEST_TWO_SIDE_EXT );
    
    glActiveStencilFaceEXT( GL_BACK );
    glStencilOp( GL_KEEP, GL_INCR_WRAP_EXT, GL_KEEP );
    glStencilMask( 0xFFFFFFFF );
    glStencilFunc( GL_ALWAYS, 0, 0xFFFFFFFF );
    
    glActiveStencilFaceEXT( GL_FRONT );
    glStencilOp( GL_KEEP, GL_DECR_WRAP_EXT, GL_KEEP );
    glStencilMask( 0xFFFFFFFF );
    glStencilFunc( GL_ALWAYS, 0, 0xFFFFFFFF );

    Entity::Vec &all = Entity::shadowCasters();
    Entity::Vec::iterator it( all.begin() ), end( all.end() );
    
    for (; it != end; ++it )
    {
        (*it)->generateShadowVolume( lightPos );
        (*it)->drawShadowVolume();
    }

    // mirror the setup order: if NV depth clamp was enabled, disable it;
    // otherwise disable the fallback fragment program
    if ( _depthClampAvailable )
    {
        glDisable( GL_DEPTH_CLAMP_NV );
    }
    else if ( _depthClampFP )
    {
        _depthClampFP->disable();
    }

    glStencilFunc( GL_EQUAL, 0, 0xFFFFFFFF );
    glStencilOp( GL_KEEP, GL_KEEP, GL_KEEP );

    glDepthMask( GL_TRUE );
    glEnable( GL_CULL_FACE );
    glColorMask( GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE );
    
    /*
        We're ready to draw lit scene. Let caller do it.
        and then call modulativeZFailFinish
    */

}

Now, what I've done is set up a test situation where I tell my code that GL_NV_depth_clamp isn't available and force it to fall back on depth_clamp.fp, and my video card (a GeForce FX 5200 Go) *ought* to be able to run the fragment program.

Also, I've checked extensively for GL errors: OpenGL Profiler says I'm clean, and my own code, which is peppered with calls to glGetError, says I'm clean.

So, the question is, why's nothing happening? Is there something I'm missing?
Puzzler183
Unregistered
 
Post: #2
Part of your problem may be that your fragment program only outputs depth, not color. I'm not sure if that's it, but I don't remember DEPTH being a valid output binding for PS_1_0 in FX Composer (I don't use the low-level shader languages). Have you tried the same thing in a high-level language in FX Composer or RenderMonkey?
Sage
Posts: 1,199
Joined: 2004.10
Post: #3
I don't have a PC!

Eh... anyway. You sound like you know a lot more about this than me. I've read the chapters in the OpenGL SuperBible, but I've got no real experience.

For what it's worth, the shader seemed to work for the demo code I downloaded.
Member
Posts: 184
Joined: 2004.07
Post: #4
FX Composer is quite a bit different from ARB_fragment_program; the output bindings are not analogous. The fragment program you have looks fine, though you should make sure no errors are generated when it's read in (which it looks like you do).

Here's why the fragment program you have is valid, from the spec:

Code:
(7) If a fragment program does not write a color value, what should
    be the final color of the fragment?

      RESOLVED: The final fragment color is undefined.  Note that it may
      be perfectly reasonable to have a program that computes depth
      values but not colors.  Fragment colors are often irrelevant if
      color writes are disabled (via ColorMask).

I would actually mess around and try some different fragment programs to make sure your code works. For example, you could draw every pixel red and check your color buffer to make sure it's doing the right thing.
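Something like this minimal program should do it (a generic test snippet I'm sketching here, not from your code, so treat it accordingly):

Code:
!!ARBfp1.0
MOV result.color, {1.0, 0.0, 0.0, 1.0};
END

If that paints your geometry solid red, the loading/binding path is fine and the problem lies in how the depth-only program interacts with the rest of your pipeline.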
Puzzler183
Unregistered
 
Post: #5
Yeah, sorry. I don't know much about the OpenGL or DirectX specific stuff or low level stuff. I just write a lot of shaders in Cg using RenderMonkey and FX Composer...

Anyway, phydeaux has the right idea: write a few simple programs that just output a single color, or a single depth, then view the result or read back the depth buffer and check it. Sorry I can't be more helpful :-/.
Sage
Posts: 1,199
Joined: 2004.10
Post: #6
I appreciate the replies, thanks.

Now, I've got to get on my knees and grovel a little. But first, let me explain, I know nothing of shaders. I've only in the last year or so taught myself OpenGL by buying books and reading articles. The fact that I didn't have a machine which would support shaders until recently kept me from even considering them for my learning. I've found there's a fair bit I could do in the fixed function pipeline... until now.

So, I've got a dumb question. Regarding fragment shaders, is it necessary to have some boilerplate vertex shader bound for the fragment shader to get the required info? Or, does the fixed vertex pipeline provide enough that I can simply bind the fragment shader and expect it to work?
Puzzler183
Unregistered
 
Post: #7
I've always used them together, mostly because I do some lighting computations in both of them. I'm pretty sure you can use vertex shaders by themselves, but I don't know about pixel shaders.

I also recently got a computer capable of shaders and have thus been really going at it Grin.
Sage
Posts: 1,232
Joined: 2002.10
Post: #8
First off-- don't mix up "shader" and "program". The terminology might be different in DirectX, but in OpenGL they are two different things:

"vertex program" means ARB vertex program; you write the assembly, you are responsible for getting it to work within the hardware limits.
"vertex shader" means something that gets compiled via GLSL. It may or may not be representable in ARB vertex program.

That said, yes you should be able to use a fragment program with no vertex program bound, and vice versa. A vertex program is really just manipulating data in the transform stage-- you could do everything there yourself in immediate mode if you wanted to (not true for the fragment stage.) Fragment programs have their variables passed in either in program parameters or leftover texture coordinates, both of which you can do directly without a vertex program.
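For example (my sketch, not from TomorrowPlusX's code), a fragment program can read a value set from the C side via glProgramLocalParameter4fARB, with no vertex program bound at all:

Code:
!!ARBfp1.0
# program.local[0] is filled in by the application
PARAM tint = program.local[0];
MOV result.color, tint;
END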
Puzzler183
Unregistered
 
Post: #9
Erm, OK... like I said, I only work with high-level stuff, where the terms are used interchangeably: vertex shader/program and fragment/pixel shader/program.
Sage
Posts: 1,199
Joined: 2004.10
Post: #10
OK, I'm back. I've spent a couple of nights reading and re-reading the shader chapters in the OpenGL SuperBible. I think, conceptually, I've got a handle on it.

Now, that said, I've pulled the shadow casting code out of my game and made a simple GLUT test app, which I'm running through OpenGL Profiler to get a handle on WTF is happening. What I've verified is that with the fragment program running (instead of NV_depth_clamp) I *am* getting values written into the depth buffer (if I turn on depth writes). So I *know* the program is running and outputting to result.depth.
Sage
Posts: 1,199
Joined: 2004.10
Post: #11
You know, I think I might have figured it out. Stencil shadows work by testing whether a fragment passes or fails the depth test; perhaps, since I'm not outputting result.color, the fragment program is *doing* nothing...
Sage
Posts: 1,199
Joined: 2004.10
Post: #12
Woohoo! I was right!

Now, I've got to find a machine with an ATI card to test this on...
Sage
Posts: 1,199
Joined: 2004.10
Post: #13
Well, I was too quick. It draws shadows now, but it doesn't actually solve the problem that GL_NV_depth_clamp solves. Dammit!

Can anybody suggest an alternative?
Sage
Posts: 1,199
Joined: 2004.10
Post: #14
Turns out to be an odd situation. It *sort of* works on ATI hardware -- and not at all on NVIDIA. It works well enough that if I disabled shadows for some objects in the scenegraph I could get away with partial functionality.

Would anybody be willing to try out a demo app and tell me how it works on your hardware?

http://home.earthlink.net/~zakariya/files/Shadows.zip

It's an Xcode 2.0 project, so you'll need Tiger to build it. If it acts funny, or you get weirdness, post a screenshot. The depth clamping doesn't quite work on ATI, of course.
Sage
Posts: 1,199
Joined: 2004.10
Post: #15
Never mind, people. I'm an idiot -- I forgot to semicolon-terminate the color write line in the shader.
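For posterity, the corrected program presumably ends up looking something like this (assuming a simple pass-through of the interpolated color, which is my guess at the intended write):

Code:
!!ARBfp1.0
# pass through the interpolated primary color
MOV result.color, fragment.color;
# clamp fragment depth into [0,1] to emulate depth clamping
MOV_SAT result.depth, fragment.position.z;
END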