GL, 16bit, 32bit,

Member
Posts: 110
Joined: 2002.04
Post: #1
1. How do I set the GL context to be 16 bit?

2. How do I set the GL context to be 32 bit?

3. How do I set textures to be 16 bit or 32 bit?


Ok, does the context buffer simply match the current screen depth?

- Mac Lead ZeniMax Online Studios
- Owner Plaid World Studios
- Resume: http://www.chrisdillman.com
Mars_999
Unregistered
 
Post: #2
AFAIK, however you set up your RC for OpenGL determines your bit depth. I use SDL, so I tell SDL to run in 32-bit mode and OpenGL is good to go. Not sure what API you're using? Cocoa or Carbon? If it's either of those two I can't help you. =)
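For reference, that setup is roughly this with the SDL 1.2 API (untested sketch; the 640x480 window size is just an example):

#include "SDL.h"
#include <stdio.h>

// Rough SDL 1.2 setup: ask for a double-buffered, 32-bit OpenGL window.
SDL_Init(SDL_INIT_VIDEO);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_Surface *screen = SDL_SetVideoMode(640, 480, 32, SDL_OPENGL);
if (screen == NULL) {
    fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
}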
Luminary
Posts: 5,143
Joined: 2002.04
Post: #3
When you request a pixel format, you can pass an option for color depth, choosing either 16 or 32-bit. I'm pretty sure you'll match the screen depth anyway, though, regardless of what you ask for...

You can upload textures in 16-bit using GL_RGB5_A1 as your internal format (with GL_UNSIGNED_SHORT_1_5_5_5_REV as the pixel type). I don't know if this is useful or not, though, since it seems likely that GL will downsample your textures to 16-bit for you if you've got a 16-bit context.
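A rough sketch of that kind of upload, if I have the tokens right (width, height, and pixels stand in for your own image data):

#include <OpenGL/gl.h>

// Upload image data as a 16-bit (5/5/5/1) texture.
glTexImage2D(GL_TEXTURE_2D, 0,
             GL_RGB5_A1,                     // ask GL for 16-bit storage
             width, height, 0,
             GL_BGRA,                        // layout of the source pixels
             GL_UNSIGNED_SHORT_1_5_5_5_REV,  // 16 bits per source pixel
             pixels);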
Member
Posts: 110
Joined: 2002.04
Post: #4
Quote:Originally posted by Mars_999
AFAIK, however you set up your RC for OpenGL determines your bit depth. I use SDL, so I tell SDL to run in 32-bit mode and OpenGL is good to go. Not sure what API you're using? Cocoa or Carbon? If it's either of those two I can't help you. =)

I'm using AGL...
This is Apple's GL setup library.
See the AGL PDF that comes with the Apple GL SDK.

I think this would be used only with Carbon, since Cocoa has its own classes for setting up GL.

- Mac Lead ZeniMax Online Studios
- Owner Plaid World Studios
- Resume: http://www.chrisdillman.com
Member
Posts: 110
Joined: 2002.04
Post: #5
Quote:Originally posted by OneSadCookie
When you request a pixel format, you can pass an option for color depth, choosing either 16 or 32-bit. I'm pretty sure you'll match the screen depth anyway, though, regardless of what you ask for...

You can upload textures in 16-bit using GL_RGB5_A1 as your internal format (with GL_UNSIGNED_SHORT_1_5_5_5_REV as the pixel type). I don't know if this is useful or not, though, since it seems likely that GL will downsample your textures to 16-bit for you if you've got a 16-bit context.

Pixel format...
I tried this sort of thing...

// OpenGL compliant HWA renderer: RGBA color, hardware accelerated,
// double buffered, 16-bit depth buffer, 16 bits per pixel
GLint attrib[] = { AGL_RGBA, AGL_ACCELERATED, AGL_DOUBLEBUFFER,
                   AGL_DEPTH_SIZE, 16, AGL_PIXEL_SIZE, 16, AGL_NONE };

Is this what you mean...

I can't tell if GL is giving 16 bit or not.

Maybe I can ask it what it gave me?

- Mac Lead ZeniMax Online Studios
- Owner Plaid World Studios
- Resume: http://www.chrisdillman.com
Member
Posts: 145
Joined: 2002.06
Post: #6
Quote:Originally posted by ChrisD
I can't tell if GL is giving 16 bit or not.

Maybe I can ask it what it gave me?

AFAIK, OpenGL gives you whatever the current screen depth is. If you want a 16bpp environment, you have to switch the screen to 16bpp. I believe Apple's SetupGL sample code does this.
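If I remember the CGDirectDisplay calls right, the switch itself looks roughly like this (error checking omitted; SetupGL wraps the same idea more carefully):

#include <ApplicationServices/ApplicationServices.h>

// Find and switch to a 16bpp mode at the display's current size.
boolean_t exactMatch;
CFDictionaryRef mode = CGDisplayBestModeForParameters(
    kCGDirectMainDisplay, 16,
    CGDisplayPixelsWide(kCGDirectMainDisplay),
    CGDisplayPixelsHigh(kCGDirectMainDisplay),
    &exactMatch);
CGDisplayCapture(kCGDirectMainDisplay);
CGDisplaySwitchToMode(kCGDirectMainDisplay, mode);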

"He who breaks a thing to find out what it is, has left the path of wisdom."
- Gandalf the Gray-Hat

Bring Alistair Cooke's America to DVD!
Luminary
Posts: 5,143
Joined: 2002.04
Post: #7
You can indeed query the pixel format for its exact spec. Just look in agl.h, all that stuff's there and obvious.

I think maybe you want COLOR_SIZE rather than PIXEL_SIZE? Not quite sure...
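Something like this, assuming pix is the AGLPixelFormat that aglChoosePixelFormat gave you back (AGL_BUFFER_SIZE is the other attribute worth trying):

#include <AGL/agl.h>
#include <stdio.h>

// Ask the pixel format what it actually gave us.
GLint colorBits = 0, depthBits = 0;
aglDescribePixelFormat(pix, AGL_PIXEL_SIZE, &colorBits);
aglDescribePixelFormat(pix, AGL_DEPTH_SIZE, &depthBits);
printf("color: %d bits, depth: %d bits\n", (int)colorBits, (int)depthBits);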
Possibly Related Threads...
Binding 16Bit textures? | Author: ChrisD | Replies: 15 | Views: 5,923 | Last Post: May 1, 2003 07:58 PM by Mars_999