3D Light/Lens Flares on iPhone

Moderator
Posts: 3,572
Joined: 2003.06
Post: #1
I have a bunch of light flares in my scene. Right now I'm using some cheesy faking to determine whether they're occluded. Basically, the lights rotate around the objects (think: alien flying saucers), and I fade each light out as it goes behind its object by simply measuring how far its z is from the object's z. This doesn't take the object's actual geometry into account, but it looks surprisingly convincing in the current situation and it's pretty darn quick. So the lights are what I call "self-occluded": only the object a light is attached to affects whether or not it's visible. Unfortunately, if some other object were to fly over, the light flares wouldn't be occluded by it, which obviously would not look right. This isn't a problem yet because nothing flies between the lights and the camera, but I'm thinking about adding an element to the scene which might change that. I suppose I could add more cheesy occlusion by testing against the other object's collision primitive, but it sure would be nice to come up with a better (more accurate) generalized solution.
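The cheesy self-occlusion fade described above can be sketched as a tiny distance-based alpha function. This is a hypothetical reconstruction (function and parameter names are made up), assuming bigger z means farther from the camera:

```c
/* Hypothetical sketch of the "self-occluded" fade described above:
 * fade a flare out as its z falls behind the object's z, ignoring the
 * object's real geometry. fadeRange controls how quickly the flare
 * disappears once it passes behind. */
static float flareAlpha(float lightZ, float objectZ, float fadeRange)
{
    float behind = objectZ - lightZ;      /* > 0 when light is behind   */
    if (behind <= 0.0f) return 1.0f;      /* in front: fully visible    */
    if (behind >= fadeRange) return 0.0f; /* far behind: invisible      */
    return 1.0f - behind / fadeRange;     /* linear fade in between     */
}
```

Multiply the flare's sprite color by this alpha each frame; the linear ramp is what makes the trick look smooth despite ignoring geometry.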

Even if we could read the depth buffer in ES (which isn't allowed), a whole bunch (maybe 30 or more?) of glReadPixels calls would surely be dog slow.

So what I'm wondering is: Has anyone else here worked out how to figure out occlusion for the purpose of light/lens flares on iPhone?

I guess one idea would be to render the scene in white with black fog into a separate FBO, to generate my own depth map. Even if I only rendered the occluders and the light points into it, which could theoretically be relatively quick, surely reading that data back would be slow. I'm already drawing a whole bunch in my scene, so this seems like it might be too expensive to even bother trying. Could I be wrong about that? Should I make the effort to try it? Anybody got a performance tip on that before I attempt it?

I'd also have to implement gluProject to make that work. I haven't looked at its source yet; I assume it's pretty speedy?
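gluProject itself is just two matrix-vector multiplies, a perspective divide, and a viewport mapping, so it is cheap. A minimal work-alike, assuming column-major 4x4 matrices as OpenGL stores them (`m[col*4 + row]`; the function name is made up):

```c
/* Minimal gluProject work-alike: object coords -> window coords.
 * Returns 0 if the point's clip-space w is zero. */
static int myProject(float objX, float objY, float objZ,
                     const float model[16], const float proj[16],
                     const int viewport[4],
                     float *winX, float *winY, float *winZ)
{
    float in[4] = { objX, objY, objZ, 1.0f };
    float eye[4], clip[4];
    int r, c;

    for (r = 0; r < 4; r++) {              /* eye = modelview * in */
        eye[r] = 0.0f;
        for (c = 0; c < 4; c++) eye[r] += model[c*4 + r] * in[c];
    }
    for (r = 0; r < 4; r++) {              /* clip = projection * eye */
        clip[r] = 0.0f;
        for (c = 0; c < 4; c++) clip[r] += proj[c*4 + r] * eye[c];
    }
    if (clip[3] == 0.0f) return 0;
    clip[0] /= clip[3]; clip[1] /= clip[3]; clip[2] /= clip[3];
    /* NDC [-1,1] -> window coordinates */
    *winX = viewport[0] + (clip[0] * 0.5f + 0.5f) * viewport[2];
    *winY = viewport[1] + (clip[1] * 0.5f + 0.5f) * viewport[3];
    *winZ = clip[2] * 0.5f + 0.5f;
    return 1;
}
```

Roughly 32 multiply-adds per light, so projecting even dozens of flares per frame is negligible next to the readback cost.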

Any other speedy ideas?

I suspect I may just have to settle for whatever cheesy faked occlusion I can come up with, but I figured I'd ask around here first before I give up. Wink
Luminary
Posts: 5,143
Joined: 2002.04
Post: #2
On the desktop, you'd use occlusion queries ( http://www.opengl.org/registry/specs/ARB..._query.txt ). It looks like they aren't supported on the iPhone though.
Moderator
Posts: 3,572
Joined: 2003.06
Post: #3
Yeah, rats, that would have been perfect...
Moderator
Posts: 3,572
Joined: 2003.06
Post: #4
... oh yeah, that's right. I nixed using it on the desktop too, because the GMA 950 doesn't support it either. If I dropped support for the 950, it'd probably be just for that Rasp

How fast is an occlusion query compared to doing a gluProject/glReadPixels? Could one realistically make many occlusion queries per frame without much trouble? That's assuming they're all batched at one point during the frame (the end); since it's a round-trip operation to the hardware, we're still probably talking about an inevitable pipeline stall.
Sage
Posts: 1,482
Joined: 2002.09
Post: #5
UT2k4 seemed to do the occlusion checks only every few frames. Lights would fade in and out as they were occluded, and sometimes it was a decent fraction of a second before they started fading. Depending on the screen size, it might be faster to read the entire screen back instead of doing multiple reads.

I've never looked into doing reads with OpenGL ES on iPhone. Is it just that you can't read back the depth buffer, or that you can't do reads at all? If you can do reads, then maybe you could use a color key and check whether that color gets written to the framebuffer?
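The color-key idea boils down to one comparison per flare once the frame has been read back. A sketch, assuming the frame was read with glReadPixels in GL_RGBA/GL_UNSIGNED_BYTE (the combination ES guarantees for color reads; the function name is made up):

```c
/* Sketch of the color-key check: draw the flare's source point in a
 * known key color, draw the occluders on top, read the color buffer
 * back, then see whether the key color survived at the flare's window
 * position. pixels is width*height*4 bytes, bottom-up as GL returns
 * it; (x, y) is the flare's window coordinate. */
static int flareVisible(const unsigned char *pixels, int width,
                        int x, int y,
                        unsigned char keyR, unsigned char keyG,
                        unsigned char keyB)
{
    const unsigned char *p = pixels + (y * width + x) * 4;
    return p[0] == keyR && p[1] == keyG && p[2] == keyB;
}
```

Testing a single pixel per flare is the same one-sample strategy the UT2k4 observation above suggests, so it inherits the same blinkiness risk at geometry edges.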

Scott Lembcke - Howling Moon Software
Author of Chipmunk Physics - A fast and simple rigid body physics library in C.
Moderator
Posts: 3,572
Joined: 2003.06
Post: #6
It's because ES specifically does not allow reading the depth buffer, apparently because of the way the tile-based renderer works. You can read color, though, but I've heard it's terribly slow, so I haven't yet bothered trying that approach just to see for myself.

Yes, I was thinking about a color key approach too. Black fog over white objects would accomplish the same thing, though, with the advantage that I wouldn't have to change colors.

I don't know how UT2k4 does it, but I remember it looked great. Yeah, it was a delayed action. They apparently tested the occlusion of a single pixel (the center of the light), and if it was occluded they seemed to set a little fade-out timer on the flare, and vice-versa when it became visible again. I was fascinated with the light flares in the demo and studied them for a couple hours Rasp
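That delayed fade can be sketched as a bit of per-flare state: the occlusion test only sets a target, and the drawn intensity chases it over a fixed fade time, so a one-frame blip from a noisy test doesn't make the flare blink. A minimal sketch (names hypothetical, not UT2k4's actual implementation):

```c
/* Sketch of a delayed fade: the drawn intensity ramps toward the last
 * occlusion test result over fadeTime seconds instead of snapping. */
typedef struct {
    float intensity;   /* what we actually draw with, 0..1 */
    int   visible;     /* last occlusion test result       */
} Flare;

static void flareUpdate(Flare *f, int testedVisible,
                        float dt, float fadeTime)
{
    float step = dt / fadeTime;
    f->visible = testedVisible;
    if (f->visible) {
        f->intensity += step;
        if (f->intensity > 1.0f) f->intensity = 1.0f;
    } else {
        f->intensity -= step;
        if (f->intensity < 0.0f) f->intensity = 0.0f;
    }
}
```

Because the ramp integrates over several frames, the occlusion test itself can run every few frames (as UT2k4 appeared to do) without the result looking steppy.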

I was thinking about the screen-size angle too, wondering if it'd be possible to do a low-rez render of the scene into an FBO for faster read-back. The occlusion test wouldn't have to be absolutely perfect.
Moderator
Posts: 3,572
Joined: 2003.06
Post: #7
Thinking some more... a color key rendering of the entire scene could be used for other things as well, like hit testing. For hit testing a finger on an iPhone screen, a low-rez rendering would be fine, but hit testing a first-person-shooter-style shot would need higher accuracy.

Still, for light occlusion, I'm guessing I might be able to roll with something as small as 25% of the screen in each dimension, so 80 x 120. If that were RGB, and my math isn't whacky, that should only be 9,600 pixels and about 28 KB. Then I'd do a timer thing for the flares, like I think UT2k4 was doing. I'd pretty much have to, since the aliasing of that low-rez rendered FBO might make for some rather blinky flares if they were skirting along a jagged, aliased edge of an occluding object.
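The buffer arithmetic, plus the mapping from a full-resolution window coordinate down into the small buffer, works out like this (constants assume the 320x480 iPhone screen of the era):

```c
/* Low-rez occlusion buffer sizing: 320x480 screen scaled down by 4 in
 * each dimension, read back as tightly packed RGB. */
enum { SCREEN_W = 320, SCREEN_H = 480, SCALE = 4 };
enum { BUF_W = SCREEN_W / SCALE, BUF_H = SCREEN_H / SCALE };

static int bufferBytesRGB(void)
{
    return BUF_W * BUF_H * 3;   /* 80 * 120 * 3 = 28800 bytes (~28 KB) */
}

/* Byte offset of the R component for a full-res window coordinate. */
static int bufferIndexRGB(int screenX, int screenY)
{
    int x = screenX / SCALE;
    int y = screenY / SCALE;
    return (y * BUF_W + x) * 3;
}
```

Note that 80 x 120 is a quarter of the resolution in each dimension, so only a sixteenth of the pixels; that is where most of the readback savings would come from.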

I might give this a try soon. Hopefully somebody knows better and can tell me not to bother wasting time on it Rasp
Moderator
Posts: 522
Joined: 2002.04
Post: #8
In my games I raycast against the physics geometry towards the camera to test for occlusion. It's fast for me.
Moderator
Posts: 3,572
Joined: 2003.06
Post: #9
aarku Wrote: In my games I raycast against the physics geometry towards the camera to test for occlusion. It's fast for me.

Aha! I've been waiting for someone to pop in here with a raycasting suggestion. RaspRaspRasp

So, I don't know how to raycast at all. Any tips you could give me on directions to take to pick up the concept for this would be very much appreciated. Smile

BTW, I haven't had time to test my cheesy "depth render" idea on iPhone. Still using smoke and mirrors for the time being, but I still plan on testing the idea when I get a chance...
Sage
Posts: 1,482
Joined: 2002.09
Post: #10
aarku Wrote: In my games I raycast against the physics geometry towards the camera to test for occlusion. It's fast for me.

Yes, but I'm guessing you use Unity to do it for you, so it's "free" for all intents and purposes. Wink

Implementing raycasting efficiently yourself is not exactly going to be simple.

Scott Lembcke - Howling Moon Software
Author of Chipmunk Physics - A fast and simple rigid body physics library in C.