Stencil buffers in an FBO... trouble

Sage
Posts: 1,199
Joined: 2004.10
Post: #1
Subject says it all -- when I enable stencil buffers in my FBO ( since I do need the stencil buffer ) -- I get an unspecified error which you'll see below.

First, here's the code to generate the stencil render buffer:

Code:
    if( flags & StencilBuffer )
    {
        glGenRenderbuffersEXT( 1, &_stencilTextureID );
        glBindRenderbufferEXT( GL_RENDERBUFFER_EXT, _stencilTextureID );
        glRenderbufferStorageEXT( GL_RENDERBUFFER_EXT, GL_STENCIL_INDEX8_EXT, _width, _height );
        glFramebufferRenderbufferEXT( GL_FRAMEBUFFER_EXT, GL_STENCIL_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, _stencilTextureID );

        // Dies here...
        checkFramebufferStatus();
    }

It fails in the call to checkFramebufferStatus, which I adapted from the specification text file on SGI's site:

Code:
void checkFramebufferStatus(void)
{
    using namespace PANSICore;

    GLenum status = glCheckFramebufferStatusEXT( GL_FRAMEBUFFER_EXT );

    switch( status )
    {
        case GL_FRAMEBUFFER_COMPLETE_EXT:
            break;

        case GL_FRAMEBUFFER_UNSUPPORTED_EXT:
            Logger::log( LogEntry::Critical, "checkFramebufferStatus",
                         "GL_FRAMEBUFFER_UNSUPPORTED_EXT; try different formats" );
            break;

        default:
            // this is the one that gets triggered!
            Logger::log( LogEntry::Critical, "checkFramebufferStatus",
                         "General FBO error; will fail on all hardware" );
            break;
    }
}

So, my question is: am I picking an incorrect stencil format, or are stencil attachments in FBOs not supported yet?

Thanks,
Sage
Posts: 1,232
Joined: 2002.10
Post: #2
Not supported yet.

The hardware likes to intermingle 8-bit stencil with 24-bit depth, which makes things complicated for FBO, where you can change, remove, texture from, or read the attachments arbitrarily.

EXT_packed_depth_stencil solves this by defining a new packed format, but it isn't available on OS X yet.
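For reference, when EXT_packed_depth_stencil does arrive, the idea is that a single packed renderbuffer backs both attachment points. A minimal sketch, using the token names from the extension spec and the _width/_height members from the code above:

```c
// Sketch only: requires EXT_packed_depth_stencil, which isn't
// exposed on OS X yet. One packed renderbuffer serves as both the
// depth and the stencil attachment.
GLuint depthStencilID;
glGenRenderbuffersEXT( 1, &depthStencilID );
glBindRenderbufferEXT( GL_RENDERBUFFER_EXT, depthStencilID );
glRenderbufferStorageEXT( GL_RENDERBUFFER_EXT, GL_DEPTH24_STENCIL8_EXT,
                          _width, _height );
glFramebufferRenderbufferEXT( GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                              GL_RENDERBUFFER_EXT, depthStencilID );
glFramebufferRenderbufferEXT( GL_FRAMEBUFFER_EXT, GL_STENCIL_ATTACHMENT_EXT,
                              GL_RENDERBUFFER_EXT, depthStencilID );
```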
Sage
Posts: 1,199
Joined: 2004.10
Post: #3
Looks like I'm in a bit of a pickle, then! I had plans to do, among other things, a sort of fullscreen underwater distortion effect ( like in Quake ) by rendering the whole scene to a texture, then rendering that texture overlaid on a grid and perturbing the texture coordinates. Worked great in my demo, which doesn't require stencils.

But, unfortunately, my game engine *does* require stencil for shadow casting.

Oh well. Thanks for the quick reply, though!

( thinking about it, I could probably just use glCopyTexSubImage2D to copy the backbuffer into a texture for this, but still, a bit of a PITA; FBO is so clean. )
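For what it's worth, the copy-from-backbuffer route would use glCopyTexSubImage2D ( which reads from the framebuffer ) rather than glTexSubImage2D ( which reads from client memory ). A rough sketch, where texID, width, and height are assumed to describe an already-allocated texture:

```c
// Sketch: copy the current back buffer into an existing texture
// after rendering the scene, before swapping buffers.
glBindTexture( GL_TEXTURE_RECTANGLE_EXT, texID );
glCopyTexSubImage2D( GL_TEXTURE_RECTANGLE_EXT, 0,
                     0, 0,            // destination offset in the texture
                     0, 0,            // lower-left corner of the read region
                     width, height ); // size of the region to copy
```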
DoG
Moderator
Posts: 869
Joined: 2003.01
Post: #4
Can't you render the game into a different buffer with stencil, then just grab the color texture from it for rendering the distorted grid?
Sage
Posts: 1,199
Joined: 2004.10
Post: #5
?

I mean, if FBO won't do stencil, the only "buffer" I've got is the normal GL context, right? Or... wait... do pbuffers support stencil?

EDIT: Looks like pbuffers may very well support stencil. I'll have to look deeper, but since a pbuffer is a full-bore GL context, it seems likely to be A-OK.
Sage
Posts: 1,232
Joined: 2002.10
Post: #6
If you need stencil you will have to use one of the other RTT methods (an offscreen NSWindow is my method of choice). But yes, you have to pay the cost of maintaining a second (third, fourth...) GL context.
Sage
Posts: 1,199
Joined: 2004.10
Post: #7
I was out of town for the weekend, but this morning I began a new implementation where the rendering is initially captured by a pbuffer, and subsequent effect passes are performed by FBOs.

Really this isn't a big deal, but I *do* want to be able to apply fullscreen effects, like the warbly water and greyscale.

I'll let people know if I can make it work, maybe posting a simple project to demonstrate. This is turning out to be harder than I initially expected ;)
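A rough sketch of the capture-then-postprocess flow described above, assuming CGL. The context/object names ( _pbufferContext, _mainContext, _pbuffer, _pbufferTextureID, _effectFBO ) and the draw helpers are hypothetical stand-ins, not code from the engine:

```c
// 1) Render the scene ( with depth + stencil ) into the pbuffer context.
CGLSetCurrentContext( _pbufferContext );
drawScene();  // hypothetical

// 2) Back in the main context, bind the pbuffer's contents as a texture.
CGLSetCurrentContext( _mainContext );
glBindTexture( GL_TEXTURE_RECTANGLE_EXT, _pbufferTextureID );
CGLTexImagePBuffer( _mainContext, _pbuffer, GL_FRONT );

// 3) Run the color-only effect passes through an FBO, sampling that texture.
glBindFramebufferEXT( GL_FRAMEBUFFER_EXT, _effectFBO );
drawFullscreenEffectPass();  // hypothetical
glBindFramebufferEXT( GL_FRAMEBUFFER_EXT, 0 );
```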
Sage
Posts: 1,199
Joined: 2004.10
Post: #8
So, long story short, my powerbook was out of commission for a couple weeks, thanks, first, to a blown logic board, and thanks, second, to the fact that when the board blew it nuked my HD. Two trips for repairs later ( new mobo, new HD ), I'm back in action ( and thanking heaven that I bought an external HD which I mirror my drive to every night ).

Anyway, things still aren't happy in pbuffer/FBO land. I've modified my effect stack system to capture all rendering first into a pbuffer ( with depth and stencil ), and then subsequent passes are rendered via FBOs with only a color buffer. This works great in my GLUT demo -- which you can download here ( source, but PPC only thanks to some libs ): http://zakariya.net/shamyl/etc/RenderTexture.zip

But when I drop the classes into my game, it fails miserably, saying that the pbuffer's receiving an "invalid share context". Looking into the docs, I see that that means ( maybe? ) that the pixel formats are incompatible.

So, first, here's my ( windowed ) pixel format for the actual opengl view:

Code:
NSOpenGLPixelFormatAttribute WindowedAttributes[] =
    {
        NSOpenGLPFADoubleBuffer,
        NSOpenGLPFAAccelerated,
        NSOpenGLPFADepthSize, 32,
        NSOpenGLPFAStencilSize, 8,
        NSOpenGLPFAPixelBuffer,
        NSOpenGLPFASingleRenderer,
        NSOpenGLPFAScreenMask, (NSOpenGLPixelFormatAttribute) CGDisplayIDToOpenGLDisplayMask(kCGDirectMainDisplay),
        NSOpenGLPFANoRecovery,
        0
    };

Second, here's the pixel format ( plus other configuration ) I'm using in my pbuffer:

Code:
    CGLContextObj sharedContext = CGLGetCurrentContext();
    
    const bool fullscreenContext = Application::instance()->isFullscreen();
    const int colorDepth = 8;
    const int bitsPerPixel = ( colorDepth * 4);
    const bool hasZBuffer = (flags & DepthBuffer );
    const bool hasStencilBuffer = (flags & StencilBuffer );

    int i = 0;
    CGLPixelFormatAttribute pixelFormatAttributes[32];

    pixelFormatAttributes[i++] = kCGLPFAAccelerated;
    pixelFormatAttributes[i++] = kCGLPFASingleRenderer;
    pixelFormatAttributes[i++] = fullscreenContext ? kCGLPFAFullScreen : kCGLPFAWindow;
    pixelFormatAttributes[i++] = kCGLPFAPBuffer;

    pixelFormatAttributes[i++] = kCGLPFAColorSize;
    pixelFormatAttributes[i++] = (CGLPixelFormatAttribute)(bitsPerPixel);

    if ( hasZBuffer )
    {
        pixelFormatAttributes[i++] = kCGLPFADepthSize;
        pixelFormatAttributes[i++] = (CGLPixelFormatAttribute)(32);
    }

    if ( hasStencilBuffer )
    {
        pixelFormatAttributes[i++] = kCGLPFAStencilSize;
        pixelFormatAttributes[i++] = (CGLPixelFormatAttribute)(8);
    }

    pixelFormatAttributes[i++] = (CGLPixelFormatAttribute) 0;

    long numPixelFormats = 0;
    CGLPixelFormatObj pixelFormat = NULL;
    CGLError error = CGLChoosePixelFormat( pixelFormatAttributes, &pixelFormat, &numPixelFormats );
    checkCGLError(error);

    error = CGLCreateContext( pixelFormat, sharedContext, &_pbufferContext);
    checkCGLError( error );

    CGLDestroyPixelFormat( pixelFormat );

    error = CGLCreatePBuffer( _width, _height,
                              GL_TEXTURE_RECTANGLE_EXT,
                              GL_RGBA,
                              0,
                              &_pbuffer);

    checkCGLError( error );

Now, I know I'm using CGL for the pbuffer and NSGL ( for lack of a better name ) for the actual window, but NSGL ought to just be a thin layer over CGL, right? For what it's worth, I took a stab at using the Cocoa pbuffer implementation, but instead of failing and dumping "invalid share context" it just quietly failed, without error. Here's the code for that:

Code:
    const bool hasZBuffer = (flags & DepthBuffer );
    const bool hasStencilBuffer = (flags & StencilBuffer );

    NSOpenGLPixelFormatAttribute attrs[] =
    {
        NSOpenGLPFAAccelerated,
        NSOpenGLPFADepthSize, (NSOpenGLPixelFormatAttribute) (hasZBuffer ? 32 : 0),
        NSOpenGLPFAStencilSize, (NSOpenGLPixelFormatAttribute) (hasStencilBuffer ? 8 : 0),
        NSOpenGLPFAPixelBuffer,
        NSOpenGLPFASingleRenderer,
        NSOpenGLPFANoRecovery,
        (NSOpenGLPixelFormatAttribute) 0 //terminator
    };
    
    NSOpenGLPixelFormat* format = [[[NSOpenGLPixelFormat alloc] initWithAttributes: attrs] autorelease];

    _data->pbufferContext = [[NSOpenGLContext alloc] initWithFormat: format
                                                       shareContext: [NSOpenGLContext currentContext]];

    _data->pbuffer = [[NSOpenGLPixelBuffer alloc] initWithTextureTarget: GL_TEXTURE_RECTANGLE_EXT
                                                  textureInternalFormat: GL_RGBA
                                                  textureMaxMipMapLevel: 0
                                                             pixelsWide: _width
                                                             pixelsHigh: _height ];

Can anybody help? Is there some magic flag in the pixel format which I'm missing?

Secondly, and I think this may just be the issue -- I share the context between my fullscreen and windowed displays. Is there a limit to how many times a context may be shared? Basically, my *first* context -- the windowed one -- is passed on to the fullscreen context for sharing. Does this mean it can't be passed to the pbuffer?
Luminary
Posts: 5,143
Joined: 2002.04
Post: #9
You can share as many times as you like.

For pixel formats to be compatible, they must be identical except for the Window/FullScreen/PBuffer attribute, and the buffer sizes (though I'd be careful to keep the color buffers the same depth on all).
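That rule can be stated as a tiny predicate: two zero-terminated attribute lists are share-compatible if they match token-for-token apart from the surface-type attribute ( ignoring the buffer-size caveat for brevity ). A standalone sketch with stand-in constants -- the real values live in CGLTypes.h:

```c
#include <assert.h>

// Stand-in tokens, NOT the real kCGLPFA values.
enum { SURFACE_WINDOW = 1000, SURFACE_FULLSCREEN = 1001, SURFACE_PBUFFER = 1002 };

static int isSurfaceToken( int a )
{
    return a == SURFACE_WINDOW || a == SURFACE_FULLSCREEN || a == SURFACE_PBUFFER;
}

// Returns 1 if the two zero-terminated attribute lists differ at most
// in their surface-type token, 0 otherwise. Both lists must terminate
// at the same position.
int attrsCompatible( const int *a, const int *b )
{
    while( *a && *b )
    {
        if( !( *a == *b || ( isSurfaceToken( *a ) && isSurfaceToken( *b ) ) ) )
            return 0;
        ++a; ++b;
    }
    return *a == *b;
}
```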
kberg
Unregistered
 
Post: #10
I've got interchangeable FBOs and pbuffers working pretty well with AGL, which just wraps CGL contexts, all shared between each other. So, short answer: yup, you should be able to get it all to work.

I'm using identical pixel formats for windows / screen / pbuffers, with the exception of the pbuffer and fullscreen attributes, as OSC mentioned.
Sage
Posts: 1,199
Joined: 2004.10
Post: #11
Good to know it's valid -- in principle -- so my next question is whether anybody sees anything wrong with the pixel formats I'm using, above.

Thanks,
Sage
Posts: 1,199
Joined: 2004.10
Post: #12
Well, I gave it the good old-fashioned "turn things off until it works, then turn them back on until it fails" school of debugging, and found that it's the NSOpenGLPFANoRecovery flag that was preventing it from working.
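The thread doesn't say which attribute list the flag was dropped from; assuming it was the Cocoa pbuffer format from post #8, the working list would read:

```c
// Post #8's Cocoa pbuffer attributes, minus NSOpenGLPFANoRecovery.
NSOpenGLPixelFormatAttribute attrs[] =
{
    NSOpenGLPFAAccelerated,
    NSOpenGLPFADepthSize,   (NSOpenGLPixelFormatAttribute) (hasZBuffer ? 32 : 0),
    NSOpenGLPFAStencilSize, (NSOpenGLPixelFormatAttribute) (hasStencilBuffer ? 8 : 0),
    NSOpenGLPFAPixelBuffer,
    NSOpenGLPFASingleRenderer,
    (NSOpenGLPixelFormatAttribute) 0 // terminator
};
```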

So, that said, it no longer *noisily* fails, but now silently doesn't work. I'm going to have to beg and plead and in general humiliate myself before the godlike and unforgiving altar of OpenGL Profiler to see if I'm actually capturing the rendering but failing to display it, or if the rendering's just going into the bit bucket.

This is killing me! It works great in my GLUT app...
Sage
Posts: 1,199
Joined: 2004.10
Post: #13
Well, I prostrated myself before OpenGL Profiler to see what's going on. First I tried to determine whether I actually *can* view the generated texture from a pbuffer ( using my GLUT testbed as a control ), and it seems that I can.

So, I tried it in my game, and I get a nice 800x600 blank texture. So it looks like it's silently failing -- i.e., no GL errors, no complaints, and no rendering.

Why would a PBuffer silently *not* capture a rendering? I have to assume something's not mixing right with regards to pixel formats.

Any ideas? Please!
kberg
Unregistered
 
Post: #14
Are you explicitly setting your bound FBO to 0 before doing anything pbuffer-related?
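For anyone following along, what kberg is getting at is roughly this ordering ( a sketch; _pbufferContext as in the earlier CGL code ):

```c
// Unbind any FBO first, so draws in this context target the
// window-system framebuffer again...
glBindFramebufferEXT( GL_FRAMEBUFFER_EXT, 0 );

// ...then switch over to the pbuffer context and render there.
CGLSetCurrentContext( _pbufferContext );
```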
Sage
Posts: 1,199
Joined: 2004.10
Post: #15
Yeah, it fails even when I don't use FBO at all.