2D Rendering with Alpha

Member
Posts: 22
Joined: 2005.02
Post: #1
Hi,

I've just taken my first steps with 2D rendering in OpenGL. Everything works: showing textures with and without masks. But as soon as it comes to alpha-masked textures, I run into a strange problem: the alpha values don't do what I expected.
The following example shows it:

Picture:
http://www.mcsebi.com/images/test/picture.jpg

8-bit alpha mask:
http://www.mcsebi.com/images/test/mask.jpg

Rendered onto a black screen with 3 colored rectangles:
http://www.mcsebi.com/images/test/result.jpg

??? What's that?

The alpha mask does seem to have some effect, but the result is not what I expected. Any ideas what could be wrong here? Only masks filled entirely with black seem to work correctly.

Sebastian
Moderator
Posts: 1,560
Joined: 2003.10
Post: #2
What does your call to glTexImage2D or gluBuild2DMipmaps look like?

- Alex Diener
Member
Posts: 320
Joined: 2003.06
Post: #3
I had similar problems when creating PNGs without premultiplying the alpha. Perhaps you need to multiply the RGB values of the source image by the alpha?
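For reference, premultiplying could look something like this. A minimal C sketch, not the actual code from this thread; the RGBA byte layout, the rounding, and the `premultiply_alpha` name are all assumptions:

```c
#include <stdint.h>

/* Multiply each pixel's R, G, B by its alpha, with rounding.
   Assumes 4 bytes per pixel in R,G,B,A order; alpha is left unchanged. */
void premultiply_alpha(uint8_t *pixels, int pixel_count)
{
    for (int i = 0; i < pixel_count; i++) {
        uint8_t *p = pixels + i * 4;
        unsigned a = p[3];
        p[0] = (uint8_t)((p[0] * a + 127) / 255);  /* R */
        p[1] = (uint8_t)((p[1] * a + 127) / 255);  /* G */
        p[2] = (uint8_t)((p[2] * a + 127) / 255);  /* B */
    }
}
```

With premultiplied data you would typically blend with `glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA)` instead of `glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)`.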

Chopper, iSight Screensavers, DuckDuckDuck: http://majicjungle.com
Sage
Posts: 1,482
Joined: 2002.09
Post: #4
Are you sure you're sending the right parameters to glTexImage2D()? I'd double check that.
Moderator
Posts: 771
Joined: 2003.04
Post: #5
Are you sure your alpha mask is 8-bit? If so, are you loading it as 8-bit?
Member
Posts: 22
Joined: 2005.02
Post: #6
I'll check that. Thanks for your input. It seems I'm not loading the alpha channel correctly. I'll post here as soon as I've found the problem.
Member
Posts: 22
Joined: 2005.02
Post: #7
OK, the problem is not in my OpenGL code but in my image-loading code. If I load the alpha channel into an 8-bit buffer I get this behavior; if I load it into a 32-bit buffer everything looks fine. It seems the QuickTime graphics importer does some dithering when drawing the imported picture into my 8-bit offscreen buffer. Is there any way to prevent this?
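Since the 32-bit path works, one workaround would be to let QuickTime draw into a 32-bit offscreen and copy the alpha channel out yourself instead of asking the importer for an 8-bit destination. A minimal sketch, assuming big-endian A,R,G,B byte order per pixel; the `extract_channel` name is made up for illustration:

```c
#include <stdint.h>
#include <stddef.h>

/* Copy one channel of a 32-bit ARGB buffer into a tightly packed
   8-bit buffer. channel: 0 = A, 1 = R, 2 = G, 3 = B. */
void extract_channel(const uint8_t *argb, uint8_t *out,
                     size_t pixel_count, int channel)
{
    for (size_t i = 0; i < pixel_count; i++)
        out[i] = argb[i * 4 + channel];
}
```

No dithering can happen this way, because each 8-bit value is copied verbatim rather than being redrawn into a shallower pixel format.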
Moderator
Posts: 1,560
Joined: 2003.10
Post: #8
Couldn't you just use a PNG with an alpha channel and load the whole thing at once? Splitting it up into two separate images seems like more headache than it's worth.

- Alex Diener
Member
Posts: 22
Joined: 2005.02
Post: #9
I don't know if that's possible. If I load a file with the QuickTime graphics importer and draw it into a GWorld, how do I tell the GWorld that it has an additional alpha channel? Or can I skip the GWorld and send the image directly to OpenGL? If so, how do you do that?
Sage
Posts: 1,232
Joined: 2002.10
Post: #10
QTNewGWorldFromPtr(&gw, k32ARGBPixelFormat, &rect, NULL, NULL, 0, pix, 4 * rect.right);

Notice the ARGB format?
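If you then upload that buffer with plain `GL_RGBA`/`GL_UNSIGNED_BYTE`, the channels will be shifted, since OpenGL expects R first in memory. One option is to reorder the bytes before uploading. A minimal sketch, assuming A,R,G,B byte order per pixel as produced above; `argb_to_rgba` is a made-up name:

```c
#include <stdint.h>
#include <stddef.h>

/* Reorder 32-bit ARGB pixels to RGBA in place, so the buffer can be
   passed to glTexImage2D(..., GL_RGBA, GL_UNSIGNED_BYTE, ...). */
void argb_to_rgba(uint8_t *pixels, size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; i++) {
        uint8_t *p = pixels + i * 4;
        uint8_t a = p[0];
        p[0] = p[1];  /* R */
        p[1] = p[2];  /* G */
        p[2] = p[3];  /* B */
        p[3] = a;     /* A */
    }
}
```

Alternatively, OpenGL can accept the ARGB data directly with the format/type pair `GL_BGRA` / `GL_UNSIGNED_INT_8_8_8_8_REV`, avoiding the copy, though whether that path is fast depends on the platform.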
Member
Posts: 22
Joined: 2005.02
Post: #11
Oh thanks, didn't know that function.

Another question has come up, one which may expose me as an OpenGL newbie ;):

I called

glViewport(0, 0, window_width, window_height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, window_width, window_height, 0, -1.0, 1.0);

to make my GL context behave like a 2D world. Everything looks OK, but every quad with negative coordinates now gets clipped away, which makes objects disappear too soon when they move off the left or top edge of the window. What did I do wrong?

-Sebastian
Sage
Posts: 1,232
Joined: 2002.10
Post: #12
Start a new thread for this problem. And specify how you are drawing the quads; raster operations are clipped differently than polygon operations.