Applying fog... retroactively

Sage
Posts: 1,199
Joined: 2004.10
Post: #1
Having implemented reasonably robust shadows in my game, I've discovered a side effect of doing shadows the easy way ( drawing a transparent quad over the viewport against the stencil buffer ): shadows in the distance are NOT affected by fog calculations.

In other words, objects in the distance which are partly obscured by fog still cast fully dark shadows, since the transparent quad I draw over the scene is drawn in front of everything and therefore isn't affected by the fog equations. As you can imagine, this looks a little funny.

What I'd like is to be able to have the shadows in the distance look fogged out just like the geometry in the distance.

So, here's what I'm thinking: since fog is applied by the GL pipeline on a per-fragment basis, using depth to calculate how much fog should be applied, shouldn't it be possible to fog the quad based on the values *already* in the depth buffer, instead of the depth values of the quad's own fragments?

Is this something which will have to be implemented as a fragment program? Or is there some glEnable call which will result in the same thing?

Also, I'm aware that *correct* stencil shadowing would involve drawing the scene twice -- once with only ambient light, and a second time against the stencil buffer. But the geometry throughput required by that approach would be prohibitive, so I'm not considering it.
Member
Posts: 79
Joined: 2002.08
Post: #2
Yeah, I came across this issue too. If you come up with a working solution I'd like to hear it. Reading data back from a buffer is probably just as slow as drawing the scene twice.

As far as I know, the only real way is to render the scene twice when you need fog. Fog in general has turned out to be a pain in many ways, but dropping fog altogether isn't a solution either.

KenD

CodeBlender Software - http://www.codeblender.com
Sage
Posts: 1,199
Joined: 2004.10
Post: #3
Pardon my ignorance of fragment programming, but...

What if I had a fragment program which, for every shadow-quad fragment that passes the stencil test ( i.e., every shadow fragment that makes it to the color buffer ), multiplies that fragment's alpha by ( 1 - the depth-buffer value ) at that position? It seems to me this would fade the shadow to transparent as it approaches the far plane, which would have a *similar* effect to fogging the shadow.

Do fragment programs have direct access to the depth buffer? I would assume they do, since I've seen some hairy stuff which modulates the values in the depth buffer based on bump maps.

Does this sound reasonable?
Sage
Posts: 1,199
Joined: 2004.10
Post: #4
With a little googling I've discovered the read-only GLSL 2.0 variable gl_FBDepth which holds the z-value of the depth buffer for the fragment being processed. With some caveats as to performance...

Now, that said, is this available in fragment programs today, on OS X? We don't have GL 2.0, nor do we have GLSL.
Moderator
Posts: 1,560
Joined: 2003.10
Post: #5
What if you rendered each shadow separately, clearing the stencil buffer between them, and when you drew the quad over the screen to darken the shadowed area, you drew it at a depth equal to the distance from the viewer to the shadow's center? I don't know how well that would do performance-wise, but it might be worth a shot.

- Alex Diener
Sage
Posts: 1,232
Joined: 2002.10
Post: #6
arbfp1.0 can't read from the framebuffer, only from textures. You could copy the depth component of the framebuffer to a depth texture and access that. If you're very tricky, you might be able to map a pbuffer to the framebuffer and treat the pbuffer as a texture in your render context -- but I'm not sure that will work.
Sage
Posts: 1,199
Joined: 2004.10
Post: #7
Looks like I'm SOL. I've never worked with pbuffers, and given that, I can't imagine I'd pull off something "tricky". Oh well.

When... or if... GLSL is supported on OS X I'll revisit this.
Member
Posts: 47
Joined: 2004.07
Post: #8
Depending on how heavy your fragment-program/multipass use is when you render the scene, you might be able to stuff depth into the framebuffer's alpha channel when you render the main geometry. Then you can blend the shadow with GL_ONE_MINUS_DST_ALPHA, GL_DST_ALPHA to get the fogged result.

I've played around with this a lot in order to do some nice pseudo-3D volume fog, and if you can live with the alpha requirement, it's very simple to get running.

Nicholas Francis
http://www.otee.dk
Sage
Posts: 1,199
Joined: 2004.10
Post: #9
Please -- tell me how to do this! I'd really appreciate it, as I've put a lot of work into this game so far and it's bothering me to have the shadows be so ugly.

Would this be a fragment shader run for everything drawn in the scene? And, if I'm packing depth into alpha, how do I make certain my ( Cocoa ) OpenGL context ignores the alpha when actually displaying on screen?
Member
Posts: 47
Joined: 2004.07
Post: #10
It can be done in different ways -- off the top of my head:
* If you have a spare texture unit, you can bind a fog alpha texture ( a 1D gradient ) and use texgen to fade the alpha with distance.
* If you're using vertex programs & combiners (ARB), you can calculate fog alpha per vertex and just read the primary alpha in the last combiner stage.
* As a separate pass: just disable color writes and use one of the above methods. If you use alpha blending ( textured ), you need the fragment alpha, so you have to run an extra pass.

How to actually do it depends on what else you do. Quite often you can squeeze it in somewhere. How is your rendering set up?

Nicholas Francis
http://www.otee.dk
Sage
Posts: 1,199
Joined: 2004.10
Post: #11
I'll freely confess to being a noob with regards to the OpenGL pipeline, so I can't really say I understand fully what you're describing here.

Regarding your question about how my rendering is set up -- do you mean how my context is configured, or the sequential process by which rendering is done?

Here's my main display mechanism:

Code:
void LegionGame::display( float deltaT )
{    
    //lock mutex so physics doesn't fight
    _locker.lock();
    
    GameController::display( deltaT );

    switch ( _gameState )
    {
        case MainMenu:
        case Loading:
        {        
            glClearColor( 0, 0, 0, 1 );
            glClearStencil( 0 );
            glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT );

            _hud->display( deltaT );
            break;
        }                
        
        case Playing:
        case Paused:
        {
            static float timeCounter = 0.0f;
            timeCounter += deltaT;

            if ( timeCounter >= 4.0 )
            {
                if ( _fpsWidget && _fpsWidget->isVisible() )
                {
                    static char msg[255];
                    sprintf( msg, "FPS: %.2f SPS: %.2f", currentFPS(), currentSPS() );
                    _fpsWidget->setString( msg );
                }
                
                timeCounter = 0;
            }

            _camera->set();
            
            //clears color and stencil buffers and sets up lighting
            Visualization *graphics = Visualization::instance();
            graphics->begin();

            //first pass, draws terrain, static geometry, etc
            _stage->display( deltaT );
            _player->display( deltaT );

            //draws shadow quad, etc
            graphics->end();

            //draws effects requiring transparency, such as particle systems
            _stage->displaySecondPass( deltaT );
            _player->displaySecondPass( deltaT );

            //draws game hud overlay            
            _hud->display( deltaT );

            break;
        }        
    }
    
    /*
        As of right now, I can't see menu actions being used while *playing*
        the game. But everywhere else makes sense.
    */
    if ( _gameState != Playing )
    {
        processMenuActions();    
    }

    _locker.unlock();
}

Drawing only the stage and player -- no shadows or second passes -- is very fast: I get a solid 60fps. So I have room for trickery there; I could probably even pull off drawing the stage and player twice to get proper shadowing.
Sage
Posts: 1,199
Joined: 2004.10
Post: #12
So, I had an idea of sorts.

What if I rendered the scene with all texturing and lighting turned off -- i.e., everything white on white -- into a texture, and somehow ( a fragment program? ) swizzled each fragment's depth into its alpha, i.e., alpha = 1 - depth?

I'd render into a texture ( via a pbuffer, I guess? I don't know anything about pbuffers except that they seem to be used a lot for render-to-texture ) and then, when I draw the shadow quad, bind this texture to the quad so it gets blended automatically in proportion to distance.

Since the alpha component can be low-precision, I could make the texture fairly low-resolution -- say 1/2 or 1/4 the screen resolution -- which would save a bit of fillrate.

Does this sound like a reasonable approach?

If so, could anybody give me some pointers regarding pbuffers ( if they're applicable? )
Sage
Posts: 1,232
Joined: 2002.10
Post: #13
Apple has samples showing pbuffer usage under AGL if you want to learn about them.

But NicholasFrancis's suggestions are very good. You'll have to explain your rendering method for stage/player in more detail to see what's applicable.

For example, if you currently draw shadow polys as a second pass on top of terrain geometry, and those polys only affect the stencil buffer, then you could change that pass to also render alpha values via a 1D fog texture, as he suggested. You'd set up texgen to pick the fog index based on Z, scaling your fog near/far planes to [0, 1] tex coords, and set the color mask to write only alpha. The end result of the pass is identical to what you have now ( RGB + stencil ), except you've added an alpha mask in the framebuffer's alpha channel with depth values for all shadow polys. Then, in your single shadow-quad pass, you can modulate by destination alpha to fade out.
Sage
Posts: 1,199
Joined: 2004.10
Post: #14
arekkusu Wrote:But NicholasFrancis's suggestions are very good. You'll have to explain in more detail your rendering method for stage/player to see what's applicable. For example, if you currently draw shadow polys as a second pass on top of terrain geometry, and those polys are currently only affecting the stencil buffer,

Yup -- this is how it works right now.

arekkusu Wrote:then you could change that to also render alpha values via a 1D fog texture as he suggested.

Does this mean I make my own 1D texture and generate a ramping alpha value, from say 0 to 255? Since we're locking out the color buffer ( except for alpha ), does it matter what's in the R/G/B parts of the texture?

arekkusu Wrote:You'd want to set up texgen to pick the fog index based on Z, scaling your fog near/far planes to [0, 1] tex coords. Set up the color mask to only write alpha values, so the end result of the pass is identical to what you have now (RGB+stencil) except you have added an alpha mask in the framebuffer's alpha channel with depth values for all shadow polys. Then in your single shadow quad pass, you can modulate by destination alpha to fade out.

So, what am I binding the texgenned 1D fog coord to -- the shadow-volume extrusions? That makes sense.

Also, when I actually draw the shadow quad: to modulate by destination alpha, is it just
Code:
glBlendFunc( GL_SRC_ALPHA, GL_DST_ALPHA )

Will this multiply against the alpha in the framebuffer?
Sage
Posts: 1,232
Joined: 2002.10
Post: #15
TomorrowPlusX Wrote:Does this mean I make my own 1D texture and generate a ramping value for the alpha component, from say 0 to 255?
Yep, a 256-byte texture. You aren't limited to a linear ramp, of course -- you can emulate quadratic or exponential fog falloff in the texture.

Quote:Since we're locking out the color buffer ( except for alpha ) does it matter what's in the R/G/B parts of the texture?
You can actually make an ALPHA texture. You don't have to use RGBA.

Quote:So, what am I binding the texgenned 1d fog coord to? Am I binding it to the shadow volume extrusions? That makes sense.
Yes, turn on texturing while drawing the shadow polys. You'll have to fiddle with the texgen R plane to get it to produce the right coords.

Quote:glBlendFunc( GL_SRC_ALPHA, GL_DST_ALPHA )
Hmm, I think you'll probably want (GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA). Don't forget that you must request an alpha channel when building your pixel format.