Desaturating an OpenGL world

Member
Posts: 351
Joined: 2002.04
Post: #1
The artist I'm working with on my next game has come up with a really interesting visual effect, and I'm wondering how it might be achieved in OpenGL.

How do you think I would go about taking all the colour out of an OpenGL scene, i.e. turning it black and white (grayscale)?

Any thoughts would be appreciated.
Luminary
Posts: 5,143
Joined: 2002.04
Post: #2
Render it to a pbuffer, then render the pbuffer to a window with a simple shader program or color matrix.

Requires at least GeForce 3 or Radeon 9000-class hardware, and will be easiest on GeForce FX / Radeon 9500-class hardware. The color matrix will work (slowly) in software everywhere, probably, but it might not be any more efficient than doing it yourself.
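To make the color matrix route concrete, here is a minimal sketch assuming the ARB_imaging subset is available. The 4x4 matrix uses the usual 0.30/0.59/0.11 luminance weights; `apply_color_matrix` is a hypothetical CPU-side helper of my own (not a GL call) showing the transform the GL applies to each pixel during pixel transfers such as glDrawPixels or glCopyTexImage2D. Note the color matrix does not affect ordinary geometry rendering.

```c
/* Grayscale color matrix, column-major as OpenGL expects.
   With the ARB_imaging subset you would load it via:
       glMatrixMode(GL_COLOR);
       glLoadMatrixf(gray);
       glMatrixMode(GL_MODELVIEW);
   It takes effect during pixel transfers, not geometry rendering. */
static const float gray[16] = {
    0.30f, 0.30f, 0.30f, 0.0f,  /* column 0: weights applied to R */
    0.59f, 0.59f, 0.59f, 0.0f,  /* column 1: weights applied to G */
    0.11f, 0.11f, 0.11f, 0.0f,  /* column 2: weights applied to B */
    0.0f,  0.0f,  0.0f,  1.0f   /* column 3: alpha passes through  */
};

/* CPU-side reference of what the color matrix does to one RGBA
   pixel: out = M * in. */
void apply_color_matrix(const float m[16], const float in[4], float out[4]) {
    for (int row = 0; row < 4; row++)
        out[row] = m[row]     * in[0] + m[4 + row]  * in[1]
                 + m[8 + row] * in[2] + m[12 + row] * in[3];
}
```

Pure red (1, 0, 0, 1) comes out as (0.30, 0.30, 0.30, 1), which is the grayscale value you'd expect for red under these weights.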
Sage
Posts: 1,403
Joined: 2005.07
Post: #3
If you can't do it any other way, then instead of calling glColor3f or 4f, call a custom function which works something like this:
Code:
void custom_glColor3f(float r, float g, float b) {
#ifdef COLOR
    glColor3f(r, g, b);
#else
    /* luminance weights; better than sqrt(r^2 + g^2 + b^2) */
    float s = 0.30f * r + 0.59f * g + 0.11f * b;
    glColor3f(s, s, s);
#endif
}
and do the same for all your materials and textures.

Sir, e^iπ + 1 = 0, hence God exists; reply!
Member
Posts: 715
Joined: 2003.04
Post: #4
Use grayscale or desaturated images for your textures. You can run a Photoshop batch to handle that. If you are using materials only, your materials can just use a very desaturated color scheme.

Shouldn't be too hard, and surely shouldn't need a special graphics card.
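If you go the pre-processed route, the batch step can also be done in code at asset-build time. A rough sketch, where `desaturate_rgb8` is a hypothetical helper of my own that applies integer luminance weights to a tightly packed RGB8 buffer in place:

```c
/* Desaturate a tightly packed RGB8 image in place, using the
   0.30/0.59/0.11 luminance weights in integer form (rounded). */
void desaturate_rgb8(unsigned char *pixels, int pixel_count) {
    for (int i = 0; i < pixel_count; i++) {
        unsigned char *p = pixels + 3 * i;
        int lum = (30 * p[0] + 59 * p[1] + 11 * p[2] + 50) / 100;
        p[0] = p[1] = p[2] = (unsigned char)lum;
    }
}
```

Running this over pure red (255, 0, 0) gives (77, 77, 77), matching the 0.30 weight. It won't help if you need to switch between grayscale and colour at run time, though, unless you keep both sets of textures.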
Moderator
Posts: 1,140
Joined: 2005.07
Post: #5
It depends on whether you want just a part of the game grayscale, or the entire game. If it's only certain situations, you can create a pixel shader that takes the color sum after the texture operations and reassigns the color. Of course, this would require a GeForce 3 (or 5200; I'm not sure about fragment shaders) or ATI 9550 or higher. If all your textures are grayscale and mapped onto colored polygons, you could get away with a vertex shader, and it would only require a GeForce 3 or Radeon 8500/9000 or higher. Or you could create separate textures etc. and have no hardware requirement, though that uses a lot more memory.
Sage
Posts: 1,199
Joined: 2004.10
Post: #6
I took a stab at the pbuffer + fragment program approach a month or so ago. I can provide code if you're interested, but I think there's something wrong with my implementation because the performance was *awful*. It also may have to do with my having an FX 5200 Go, which is the worst graphics card, ever. Note, however, that a *proper* grayscale transform is not as simple as averaging the color components.

Here's the shader I wrote:

Code:
!!ARBfp1.0

# grayscale.fp
#
# convert RGB to grayscale

ATTRIB iPrC = fragment.color.primary; # input primary color
OUTPUT oPrC = result.color;           # output color

TEMP fragColor;
TEX fragColor, fragment.texcoord[0], texture[0], RECT;

# do proper grayscale conversion
TEMP fragColorGrey;
DP3 fragColorGrey, fragColor, {0.3, 0.59, 0.11};

MOV oPrC.rgb, fragColorGrey;
MOV oPrC.a, iPrC.a;

END

It works, but on my machine (12" PB) it's pretty slow. Perhaps somebody here can tell me why? My test app runs at full speed using the pbuffer for display, so long as no shader is bound. But when a shader is bound I go from ~100 fps to about 10...

Why this is the case is beyond me. Either something's wrong with my fragment program, or I'm binding the gl fragment functions incorrectly, or something to do with GLUT, or it's just that my 5200 is crapulent.
Luminary
Posts: 5,143
Joined: 2002.04
Post: #7
I don't see what it could be other than the 5200, though I've done this on the Radeon 9000 without noticing anything like that much of a slowdown...
Sage
Posts: 1,482
Joined: 2002.09
Post: #8
If you only need to do it once for a static image, it might just be easiest to use glCopyTexImage2D to copy the framebuffer into a luminance texture. That will do all the heavy lifting to convert to B/W for you in one line of code.
Luminary
Posts: 5,143
Joined: 2002.04
Post: #9
But it won't do the conversion properly, IIRC.
Member
Posts: 351
Joined: 2002.04
Post: #10
Thanks for all the suggestions. I'd prefer to do it without a shader program if possible, because my card isn't very new and the idea is to treat my setup as the minimum system requirements.

I might give Skorche's approach a try, just to see how it looks. It's just a visual effect and so doesn't need to be really clean and accurate.

TomorrowPlusX - do you have an executable I could grab just to see if this method even works on my card?

BTW - just changing my textures to grayscale in Photoshop wouldn't be any good because the idea is to change from grayscale to colour in game.
Luminary
Posts: 5,143
Joined: 2002.04
Post: #11
What is your setup? That might help us suggest something that'll work for you.
Member
Posts: 351
Joined: 2002.04
Post: #12
Yeah sorry about being so vague. I'll post my specs when I get home tonight.

I know I know, I must be a pretty hopeless developer when I don't even know what graphics card I have without looking it up!
Sage
Posts: 1,199
Joined: 2004.10
Post: #13
I'll post my demo this afternoon. It's pretty hacky, but it demonstrates PBuffers and simple shader usage. I'm a shader newbie, so I'm just hacking around...

But I assure you, something's rotten in my app. I've played GooBall on my PowerBook (with the 5200) and the performance at high graphics quality was excellent.
Member
Posts: 269
Joined: 2005.04
Post: #14
There's nothing in that shader that's particularly slow in itself. What *is* slow is the 5200, which has wretched fragment program performance. You want to operate on as few fragments as possible. One way of doing this is to draw normally, then copy the finished image to a half-sized or quarter-sized texture, run the fragment program on that, and then draw the texture back to the screen. Obviously that reduces the quality of the resulting image, but that may be acceptable depending on the effect.
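To make the quality trade-off concrete, here's a CPU reference of my own (not GL code, and only a rough model): a linear-filtered half-resolution pass behaves approximately like a 2x2 box filter over the full-resolution image, and `downsample_2x2` below applies exactly that to a single-channel float buffer. Fragment count drops by 4x at half resolution in each dimension.

```c
/* Reference 2x2 box-filter downsample of a single-channel float
   image; w and h are assumed even. This approximates the detail
   lost when the fragment program runs at half resolution. */
void downsample_2x2(const float *src, int w, int h, float *dst) {
    for (int y = 0; y < h / 2; y++) {
        for (int x = 0; x < w / 2; x++) {
            float sum = src[(2 * y)     * w + 2 * x] + src[(2 * y)     * w + 2 * x + 1]
                      + src[(2 * y + 1) * w + 2 * x] + src[(2 * y + 1) * w + 2 * x + 1];
            dst[y * (w / 2) + x] = 0.25f * sum;
        }
    }
}
```

Each output texel is the average of a 2x2 block, so hard edges in the scene soften; whether that's acceptable depends on the look you're after.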
Sage
Posts: 1,403
Joined: 2005.07
Post: #15
What about changing the palette of the screen?
I remember old games like Apeiron doing that when you got mushrooms. Can you still do that?

Sir, e^iπ + 1 = 0, hence God exists; reply!