Water Reflection

Marjock
Unregistered
 
Post: #1
This seems like something that would have come up before, but the two pages of search results did not cover what I wanted to know.

So, I am attempting to create a basic water reflection, such as might be found on the surface of a pond. I have successfully set up a PBuffer, and can render my scene to it, and then create a texture from this, which can be applied to on-screen meshes.

The problem comes in with texture coordinates. If you apply the reflection texture the way you would any other texture, OpenGL distorts it to create perspective. However, real water reflections are not distorted by perspective in this way.

So what I have tried to devise is an equation that will calculate new texture coordinates based on the angle I am from the water, the idea being that the new coordinates would counteract OpenGL's perspective distortion.

In short, I've failed. And so I thought I'd come and make a post, and see if anybody is able to explain the kind of techniques that are generally employed in situations such as this.

I'll post my latest code below; it's probably almost impossible to follow, so feel free to question bits of it.
However, at the same time, I wouldn't be surprised if it proves easier to scrap this code and try something new, so feel free to suggest new methods, not just ways of fixing up my current one. :-)

Code:
-(void)calculateWaterTexCoords
{
    //The number of faces for a model is equal to the number of *floats* of vertex data, tex coord data or normal data,
    //so the numFaces variable can be used as it is below.
    
    float *tempTexCoordArray = (float *)malloc(sizeof(float) * [waterModel numFaces]);
    unsigned int i;
    float cameraAngle;
    float distance = 0.0f;
    float furthestAway = 0.0f;
    int index = 0;
    myVector directionVector;
    myVector vertexVector;
    
    //Set up a vector for the direction in which the camera is pointing. (xPosition is the camera's x position.)
    directionVector.x = -[self xPosition];
    directionVector.y = 0.0f;
    directionVector.z = -[self zPosition];
    
    [vectorOutlet normalise:&directionVector];
    
    //Find which vertex of the water model is furthest away (in the direction we're looking) from the camera.
    for(i = 0; i < [waterModel numFaces]; i += 3)
    {
        //Project the vertex onto the horizontal plane so it can be compared with the camera's direction (which has y = 0).
        vertexVector.x = [waterModel vertexArray][i];
        vertexVector.y = 0.0f;
        vertexVector.z = [waterModel vertexArray][i+2];
        [vectorOutlet normalise:&vertexVector];
        
        distance = [vectorOutlet dotProduct:directionVector
                                       with:vertexVector];
        
        if(distance > furthestAway)
        {
            furthestAway = distance;
            //This index now points to the vertex, the normal and the tex coord (depending on which array it is used in) of the furthest-away vertex.
            index = i;
        }
    }
    
    //Get the angle of the camera above the water surface, converted to degrees.
    cameraAngle = atan([self yPosition] / sqrt([self xPosition]*[self xPosition] + [self zPosition]*[self zPosition]));
    cameraAngle = (cameraAngle / GL_PI) * 180.0f;
    
    //Guard against dividing by zero below.
    if(cameraAngle == 0.0f)
        cameraAngle = 1.0f;
    
    //Go through every texture coordinate and apply a transformation that counteracts OpenGL's perspective distortion.
    //The general idea: viewed straight on (i.e. 90 degrees from the surface), a reflection might cover one quarter of the water surface.
    //Viewed from 45 degrees, we see half as many pixels on-screen, but we want to see the same number of texture pixels (OpenGL tries to halve them along with everything else).
    //We also want the furthest-away portion of the water to keep the same texture coordinates (because that is where the reflection originates from).
    for(i = 0; i < [waterModel numFaces]; i += 3)
    {
        // i points to the x float, and i+1 points to the y float.
        tempTexCoordArray[i] = ([waterModel texCoordArray][i] / ((90.0f/cameraAngle)*directionVector.x)) + ([waterModel texCoordArray][index] - ([waterModel texCoordArray][index] / ((90.0f/cameraAngle)*directionVector.x)));
        tempTexCoordArray[i+1] = ([waterModel texCoordArray][i+1] / ((90.0f/cameraAngle)*directionVector.z)) + ([waterModel texCoordArray][index+1] - ([waterModel texCoordArray][index+1] / ((90.0f/cameraAngle)*directionVector.z)));
    }
    
    //Set the altered tex coords to be the ones used to draw the water.
    [waterModel setTexCoordArray:tempTexCoordArray];
}

Thanks in advance,
Mark Thomson.
Moderator
Posts: 1,140
Joined: 2005.07
Post: #2
This is how I did the textures for my water stuff: I drew everything to a rectangular texture the same size as the viewport, and set it up so it wouldn't need any distortion or flipping if drawn straight from the camera's point of view. (aka: if I'm drawing the reflection, everything is already upside-down.) Then, once you have the texture set up, you simply make the texture coordinates the projected texture position (at least the x and y portion). You have 2 options for this: you can either have a vertex shader set up the texture coordinates as the final projected vertex position, or you can do it dynamically in software, by re-projecting the vertices to compute new texture coordinates each time the camera changes.
Marjock
Unregistered
 
Post: #3
Hmm, I don't quite understand what you mean by the "projected texture position", or the projected vertex position.

-Mark
Moderator
Posts: 1,140
Joined: 2005.07
Post: #4
For projected texture position, I mistyped and meant projected vertex position. What I mean by the projected vertex position is the vertex's position multiplied by the modelview and projection matrices. Essentially, that projects the vertex to the screen's coordinates, so the x and y coordinates are the same as the x and y coordinates for the screen and the z is the depth value. Essentially, all that work is already done in the texture (except the z value is lost) assuming it's a rectangular texture, and you're simply projecting the vertices to your water to then grab the respective portion from the texture.

Edit: actually, to clarify, after you multiply by the matrices and divide through by the resulting w, the x and y coordinates will range from -1 to 1. To get them usable, you'll need to add 1 and divide by 2 for both the x and y coordinates, and then multiply by the width and height of the viewport respectively. You can also use gluProject to make your life easier if you're doing it in software rather than in a shader. Also, if you were to do more advanced effects using pixel shaders, you can merely use the fragment's position.
Marjock
Unregistered
 
Post: #5
Well, first off, my texture is a square, but it has the same edges as it would if it were a rectangle, so I should just be able to resize things accordingly.

I think using gluProject looks like my best bet. If I do that, what steps does it eliminate? Seems like it would multiply by my viewport for me, but not do the whole (+1)/2 step.
When using it, I seem to get heaps and heaps of tiny tiny versions of my texture over the surface of the water (So tiny they look almost like noise until you zoom in close, and can then make out vague maybe-shapes).

Thanks for your help.

-Mark
Sage
Posts: 1,403
Joined: 2005.07
Post: #6
can you just render the water as a stencil and then draw a fullscreen quad with the pbuffer texture on it?

Sir, e^iπ + 1 = 0, hence God exists; reply!
Moderator
Posts: 1,140
Joined: 2005.07
Post: #7
Marjock Wrote:Well, first off, my texture is a square, but it has the same edges as it would if it were a rectangle, so I should just be able to resize things accordingly.

I think using gluProject looks like my best bet. If I do that, what steps does it eliminate? Seems like it would multiply by my viewport for me, but not do the whole (+1)/2 step.
When using it, I seem to get heaps and heaps of tiny tiny versions of my texture over the surface of the water (So tiny they look almost like noise until you zoom in close, and can then make out vague maybe-shapes).

Thanks for your help.

-Mark
Yes, in that case it already does the job of multiplying by the viewport. When you're rendering your texture, if you use a rectangular texture (with GL_TEXTURE_RECTANGLE_ARB instead of GL_TEXTURE_2D), you can keep the viewport the exact same size when you render the reflection. Unknown's suggestion works, too, but it won't work if you wanted to add distortions or other effects to the rendered water by distorting the texture coordinates, adding other textures on top, etc.
Sage
Posts: 1,403
Joined: 2005.07
Post: #8
If you were going for wavy, rippling water you would need to render the 6 sides of a cube for an environment map, or something similar.

Sir, e^iπ + 1 = 0, hence God exists; reply!
Moderator
Posts: 1,140
Joined: 2005.07
Post: #9
No, you wouldn't. The simple reflection will do, assuming you aren't planning on doing monstrous near-vertical waves complete with reflection. On a per-vertex scale, you can't do full rippling water, but you could do a larger wave pattern for choppy water.
Marjock
Unregistered
 
Post: #10
Hmm, simply changing GL_TEXTURE_2D to GL_TEXTURE_RECTANGLE_ARB didn't seem to work, are there further complications I'm unaware of?

Also, after projecting the vertices of my water, I get values ranging between -2,000 and 9,000, which seems just a little bit wrong. :-p

Perhaps I am misusing gluProject?

Code:
GLdouble modelMatrix[16], projMatrix[16];
GLint viewport[4];
GLdouble screenX, screenY, screenZ;

glGetDoublev(GL_MODELVIEW_MATRIX, modelMatrix);
glGetDoublev(GL_PROJECTION_MATRIX, projMatrix);
glGetIntegerv(GL_VIEWPORT, viewport);

if(gluProject((GLdouble)[waterModel vertexArray][i],
              (GLdouble)[waterModel vertexArray][i+1],
              (GLdouble)[waterModel vertexArray][i+2],
              modelMatrix,
              projMatrix,
              viewport,
              &screenX,
              &screenY,
              &screenZ))
{
    NSLog(@"%f, %f, %f", screenX, screenY, screenZ);
    [waterModel texCoordArray][i] = screenX;
    [waterModel texCoordArray][i+1] = screenY;
    [waterModel texCoordArray][i+2] = screenZ;
}
else
{
    NSLog(@"Error at gluProject for index: %d", i);
}

Thanks again,
Mark
Moderator
Posts: 1,140
Joined: 2005.07
Post: #11
You appear to be using gluProject correctly. However, the texture coordinates should be in pairs, not in triplets (so all you need is the x and y coordinates). It's possible that you're storing them in threes while OpenGL is reading them in twos, which would mess up the texture coordinates.
Marjock
Unregistered
 
Post: #12
Well, I have to pass and receive three values from gluProject, and then I use glDrawElements to draw my sprites, so it pulls the information straight from my texCoord array (where the Z component is 0.0).

Perhaps I'm misunderstanding what you mean, but I don't think there's any way to use gluProject with only X and Y coordinates.

-Mark
Moderator
Posts: 1,140
Joined: 2005.07
Post: #13
You do use gluProject with all 3 coordinates. However, you only need to put the resulting x and y coordinates in the texture array. The resulting z coordinate is useless, since the texture is 2D, and in fact is detrimental since it only expects 2 texture coordinates for each vertex, not 3.
Marjock
Unregistered
 
Post: #14
Changing the code so that the third texCoord value is set to 0.0f doesn't fix the problem, sadly. :-/

I can't understand why the X and Y coordinates would have such bizarre values after being projected (between -2,000 and 9,000, as I said).

-Mark
Moderator
Posts: 1,140
Joined: 2005.07
Post: #15
No, you don't set it to 0. You don't set it at all. As I said (twice, now), OpenGL is only looking for 2 values packed together, so when you provide 3, it doesn't process the values correctly. Instead, the array should have the number of vertices * 2 elements, and you should only set i and i + 1 each time.

It's fine that you get values of -2000 to 9000: it just means that you have portions that are off screen. (the values that are on screen would be in the range of 0 - width for the x coordinate, and 0 - height for the y coordinate)