OpenGL weirdness

DoG
Moderator
Posts: 869
Joined: 2003.01
Post: #1
Ok, so for some reason I can't take proper OpenGL screenshots in windowed mode, and I also get refresh artifacts when dragging other windows over mine.

Here's a bad picture:
http://dog4.dyndns.org:8080/wwwalt/planet3.png

The thing is only half drawn. This is an AGL, double-buffered context. Not only is half the mesh missing, but so are the interface elements drawn on top.

Next is what it should look like (it's a bit warped because fullscreen is 16:10, but otherwise as it should be):
http://dog4.dyndns.org:8080/wwwalt/planetlit2.png

Same AGL shared context, just fullscreen instead of windowed; perfect screenshot.

And another good one:
http://dog4.dyndns.org:8080/wwwalt/planet2.png

Has anybody found a workaround or possible solution to this?

Code for the whole shebang is available from the site in my sig; if anybody wants to take a closer look, I'd be thankful for any pointers.
Moderator
Posts: 1,560
Joined: 2003.10
Post: #2
More of a workaround than a solution, but you could always implement your own screenshot function using glReadPixels and QuickTime or NSImage or something to save to a PNG or TIFF.
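
For example, the read-back half is only a few lines. Here's a rough sketch in plain C (the function name is made up, and it assumes a current GL context and known window dimensions; saving the buffer out to PNG/TIFF is the second step):

Code:
/* Rough sketch: read the current framebuffer back into client memory.
   Assumes a current OpenGL context; rows come back bottom-up. */
#include <OpenGL/gl.h>
#include <stdlib.h>

unsigned char *GrabFramebuffer(int width, int height)
{
    unsigned char *pixels = malloc((size_t)width * height * 4);
    if (!pixels)
        return NULL;

    glPixelStorei(GL_PACK_ALIGNMENT, 1);   /* tightly packed rows */
    glReadBuffer(GL_BACK);                 /* read the back buffer before swapping */
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    return pixels;
}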
Oldtimer
Posts: 832
Joined: 2002.09
Post: #3
I can confirm that I have the same problem with an NSGL context, most notably when I drag windows over it. It's great when you take a screenshot and suddenly one of the characters onscreen vanishes. Very odd.

Since I can screenshot partial frames (not top-to-bottom partial, but rendering-order partial), it seems as if the context is somehow not double-buffered at all, but I can't see how that would happen, knowing the Window Server...
Sage
Posts: 1,482
Joined: 2002.09
Post: #4
Are you guys talking about taking a screenshot from inside your program's code, or with the system screenshot utility? I use the screenshot functionality built into my game library, and I've never seen that happen.

I suppose it's possible that using the screenshot utility is forcing a premature flush or something.

Writing a screenshot function with libPNG is pretty easy. Mine is only about 40 lines long. I could post code if you are interested.
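
Roughly, the libPNG half looks like this (just a sketch, not my actual code; it assumes tightly packed RGBA pixels straight out of glReadPixels, so the rows get flipped while writing):

Code:
/* Sketch: write bottom-up RGBA pixels (as from glReadPixels) to a PNG file
   with libpng. Error handling kept to a minimum. */
#include <png.h>
#include <setjmp.h>
#include <stdio.h>

int WritePNG(const char *path, const unsigned char *pixels, int width, int height)
{
    FILE *fp = fopen(path, "wb");
    if (!fp) return 0;

    png_structp png = png_create_write_struct(PNG_LIBPNG_VER_STRING, NULL, NULL, NULL);
    if (!png) { fclose(fp); return 0; }

    png_infop info = png_create_info_struct(png);
    if (!info || setjmp(png_jmpbuf(png))) {
        png_destroy_write_struct(&png, &info);
        fclose(fp);
        return 0;
    }

    png_init_io(png, fp);
    png_set_IHDR(png, info, width, height, 8, PNG_COLOR_TYPE_RGBA,
                 PNG_INTERLACE_NONE, PNG_COMPRESSION_TYPE_DEFAULT,
                 PNG_FILTER_TYPE_DEFAULT);
    png_write_info(png, info);

    /* OpenGL returns rows bottom-up; PNG wants top-down, so flip. */
    for (int y = 0; y < height; y++)
        png_write_row(png, (png_bytep)(pixels + (size_t)(height - 1 - y) * width * 4));

    png_write_end(png, NULL);
    png_destroy_write_struct(&png, &info);
    fclose(fp);
    return 1;
}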

Scott Lembcke - Howling Moon Software
Author of Chipmunk Physics - A fast and simple rigid body physics library in C.
Moderator
Posts: 1,140
Joined: 2005.07
Post: #5
IIRC, the NSOpenGLContext isn't double buffered by default, so it will often have only a partial flush. I've found that the probability of getting a usable screenshot is proportional to the framerate.
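
For reference, with AGL (which is what DoG is using) you also have to ask for double buffering explicitly in the pixel format attributes. A minimal sketch, where everything beyond AGL_DOUBLEBUFFER is purely illustrative:

Code:
/* Sketch: request an explicitly double-buffered AGL pixel format.
   Only AGL_DOUBLEBUFFER matters here; depth size etc. are illustrative. */
#include <AGL/agl.h>

AGLContext CreateDoubleBufferedContext(AGLContext share)
{
    GLint attribs[] = {
        AGL_RGBA,
        AGL_DOUBLEBUFFER,
        AGL_DEPTH_SIZE, 24,
        AGL_NONE
    };

    AGLPixelFormat fmt = aglChoosePixelFormat(NULL, 0, attribs);
    if (!fmt)
        return NULL;

    AGLContext ctx = aglCreateContext(fmt, share);
    aglDestroyPixelFormat(fmt);
    return ctx;
}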
Luminary
Posts: 5,143
Joined: 2002.04
Post: #6
I seem to recall a thread about this on the mac-opengl mailing list in the long-distant past... but I agree, the problem seems to be worse under recent 10.4 releases than it ever has been before...
DoG
Moderator
Posts: 869
Joined: 2003.01
Post: #7
All I could find was this thread:
http://lists.apple.com/archives/mac-open...00046.html

Not very helpful; it doesn't really explain the issue, though. This issue is just bloody ********. I'll try experimenting with single/triple buffering when I get the chance, maybe it'll do something...
Luminary
Posts: 5,143
Joined: 2002.04
Post: #8
Henry's message refers to the original thread, which I dug up... not that it's substantially more helpful: http://lists.apple.com/archives/mac-open...00102.html
Member
Posts: 87
Joined: 2006.08
Post: #9
Those messages explain the issue entirely adequately. The window server reads your back buffer because a windowed view doesn't have a front buffer (even in 'double buffered' mode). If each GL surface had a dedicated front buffer, it would consume much more VRAM, which is already a scarce resource. It was apparently decided that removing these temporary glitches wasn't worth the significant cost.

In practice, your best bet for avoiding this kind of thing is to write extremely efficient GL code. More specifically, you want to cram as much of your scene into a single command buffer as possible. Do this by avoiding redundant state-change commands and, even more importantly, by uploading vertex data via VBOs (see the sketch below). Begin/End and non-VBO vertex arrays often upload that data through the command buffer, consuming quite a bit of space. Once the command buffer is full, an implicit flush is done to acquire more space. Whenever a flush occurs (implicit or not), the window server gets its chance to grab the contents of your back buffer.
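
For example, uploading the geometry once into a VBO keeps the vertex data itself out of the per-frame command stream. A rough sketch (function names and the 3-floats-per-vertex layout are just illustrative; older SDKs may need the ARB-suffixed entry points):

Code:
/* Sketch: upload vertex data once into a VBO, then draw from it each frame
   so the vertices don't travel through the command buffer repeatedly. */
#include <OpenGL/gl.h>

GLuint CreateStaticVBO(const GLfloat *vertices, GLsizei vertexCount)
{
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, vertexCount * 3 * sizeof(GLfloat),
                 vertices, GL_STATIC_DRAW);
    return vbo;
}

void DrawFromVBO(GLuint vbo, GLsizei vertexCount)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (const GLvoid *)0);  /* offset into the VBO */
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}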
DoG
Moderator
Posts: 869
Joined: 2003.01
Post: #10
I was under the impression that when I create a double-buffered windowed OpenGL context, I have two framebuffers offscreen, of which the front buffer is then composited to the screen.

Even if the front buffer is the screen, I don't bloody see why the window server can't just wait for the swap buffers call to update the screen. I'd rather have the screen show nothing for a flicker than a half-rendered scene.

As for the suggestion to write efficient GL code: it's a good idea, but it's not always possible to keep something running at over 60 fps using only VBOs.