problem with IMG_Load

Member
Posts: 185
Joined: 2005.02
Post: #1
Ok, I was trying to play around with textures in a simple program, but IMG_Load() is being a pain in the *$$. Here is the function I'm using to load images and create textures (taken from the SDL documentation wiki and modified slightly):
Code:
int loadTexture (void)
{
    SDL_Surface *surface = NULL;
    int mode;
    surface = IMG_Load("logo.bmp");
    
    if (!surface)
    {
        return 0;
    }
    if (surface->format->BytesPerPixel == 3)
    { // RGB 24bit
        mode = GL_RGB;
    }
    else if (surface->format->BytesPerPixel == 4)
    { // RGBA 32bit
        mode = GL_RGBA;
    }
    else
    {
        printf("%d\n", surface->format->BytesPerPixel);
        SDL_FreeSurface(surface);
        return 1;
    }
    
    glGenTextures(1, &textureid);
    
    glBindTexture(GL_TEXTURE_2D, textureid);
    glTexImage2D(GL_TEXTURE_2D, 0, mode, surface->w, surface->h, 0, mode, GL_UNSIGNED_BYTE, surface->pixels);
    glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);

    SDL_FreeSurface(surface);

    return 2;
}

I then have main() print the return value. My output is always a white square, plus two 1s in the run log window. That tells me IMG_Load successfully loaded the image file, but believes the image's bytes-per-pixel is 1.
What the he77 is going on?
Sage
Posts: 1,482
Joined: 2002.09
Post: #2
That's entirely possible. What kind of images are you loading?

I still use SDL_image to load my images because of its simplicity. But despite the fact that that code is on the SDL wiki, it's not very safe (as you've found). It only works when the image happens to load in a ready-to-use RGB or RGBA format; it doesn't even guarantee that the data isn't BGRA or ARGB, etc.
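
The unsafe case can be made explicit. Here's a minimal, dependency-free sketch of mapping a 32-bit surface's channel masks to a GL format, returning 0 when the layout isn't directly usable. The function name and local enum are my own, not SDL's or GL's (the values 0x1908 and 0x80E1 mirror the standard GL_RGBA and GL_BGRA constants), and the byte-order comments assume a little-endian machine:

```c
#include <stdint.h>

/* Hypothetical helper: decide which GL pixel format matches a 32-bit
   surface's channel masks. The enum values mirror the standard GL
   constants so this sketch compiles without GL headers. */
enum { FMT_GL_RGBA = 0x1908, FMT_GL_BGRA = 0x80E1 };

int gl_format_for_masks(uint32_t rmask, uint32_t bmask)
{
    if (rmask == 0x000000FF && bmask == 0x00FF0000)
        return FMT_GL_RGBA; /* memory order R,G,B,A on little-endian */
    if (rmask == 0x00FF0000 && bmask == 0x000000FF)
        return FMT_GL_BGRA; /* memory order B,G,R,A on little-endian */
    return 0;               /* anything else: convert the surface first */
}
```

A loader that gets 0 back should convert the surface rather than guess.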

You should do something like this:
Code:
    SDL_PixelFormat RGBAFormat;
    RGBAFormat.palette = 0; RGBAFormat.colorkey = 0; RGBAFormat.alpha = 0;
    RGBAFormat.BitsPerPixel = 32; RGBAFormat.BytesPerPixel = 4;
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    RGBAFormat.Rmask = 0xFF000000; RGBAFormat.Rshift =  0; RGBAFormat.Rloss = 0;
    RGBAFormat.Gmask = 0x00FF0000; RGBAFormat.Gshift =  8; RGBAFormat.Gloss = 0;
    RGBAFormat.Bmask = 0x0000FF00; RGBAFormat.Bshift = 16; RGBAFormat.Bloss = 0;
    RGBAFormat.Amask = 0x000000FF; RGBAFormat.Ashift = 24; RGBAFormat.Aloss = 0;
#else
    RGBAFormat.Rmask = 0x000000FF; RGBAFormat.Rshift = 24; RGBAFormat.Rloss = 0;
    RGBAFormat.Gmask = 0x0000FF00; RGBAFormat.Gshift = 16; RGBAFormat.Gloss = 0;
    RGBAFormat.Bmask = 0x00FF0000; RGBAFormat.Bshift =  8; RGBAFormat.Bloss = 0;
    RGBAFormat.Amask = 0xFF000000; RGBAFormat.Ashift =  0; RGBAFormat.Aloss = 0;
#endif

    SDL_Surface *orig = IMG_Load(filename);
    if(orig == NULL){
        printf("SDL_image error: \"%s\" couldn't be loaded\n", filename);
        //handle the error somehow
    }

    SDL_Surface *conv = SDL_ConvertSurface(orig, &RGBAFormat, SDL_SWSURFACE);
    SDL_FreeSurface(orig);

Defining an RGB format the same way would be a (mostly) trivial copy-and-paste job.

conv now holds an SDL_Surface in an OpenGL-friendly RGBA format.

The problem is that SDL_image doesn't let you pick the destination format; it picks one close to the original image's. If that's not what you wanted, you have to convert afterward. Even for large images the conversion seems to be pretty fast, so I wouldn't worry too much about a speed hit.
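
The two #if branches in the snippet above encode the same in-memory byte order (R,G,B,A) seen from opposite ends of a 32-bit word. A tiny sanity check of that mask arithmetic (the helper is mine, purely illustrative):

```c
#include <stdint.h>

/* Mask selecting the component stored at byte offset b (0 = first byte
   in memory) of a 32-bit pixel. On little-endian hosts the first byte
   in memory is the low byte of the word; on big-endian hosts it is the
   high byte, which is why the two #if branches swap their masks. */
uint32_t component_mask(int byte_offset, int big_endian)
{
    int shift = big_endian ? 8 * (3 - byte_offset) : 8 * byte_offset;
    return (uint32_t)0xFF << shift;
}
```

For red in byte 0 this yields 0xFF000000 on big-endian and 0x000000FF on little-endian, matching the two branches above.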

Scott Lembcke - Howling Moon Software
Author of Chipmunk Physics - A fast and simple rigid body physics library in C.
Member
Posts: 185
Joined: 2005.02
Post: #3
Ok, thanks. That fixed that problem, and (as always) a new problem has been exposed: now the program crashes. The debugger says glGenTextures is to blame. Relevant code:
Code:
int textureid[1];
...
glGenTextures(1, textureid);
What's up with that?
Member
Posts: 320
Joined: 2003.06
Post: #4
You are probably calling glGenTextures before you have created a valid OpenGL context.

Chopper, iSight Screensavers, DuckDuckDuck: http://majicjungle.com
Member
Posts: 185
Joined: 2005.02
Post: #5
reubert Wrote:You are probably calling glGenTextures before you have created a valid OpenGL context.
Yep, I realized that just before I read your reply. Smile
Luminary
Posts: 5,143
Joined: 2002.04
Post: #6
ferum Wrote:
Code:
int textureid[1];
...
glGenTextures(1, textureid);
What's up with that?

Not your problem, but the compiler should be screaming about that code: glGenTextures() takes a GLuint*, not an int*, and on Mac OS X a GLuint is actually an unsigned long, so the types don't match...

Code:
GLuint textureid[1];
glGenTextures(1, textureid);
Member
Posts: 185
Joined: 2005.02
Post: #7
Grr... I'm still getting a white square.
Could someone point out what I'm missing? This is really frustrating.
Relevant code:
Code:
GLuint textureid;

int loadTexture (char * filename)
{
    SDL_PixelFormat RGBAFormat;
    RGBAFormat.palette = 0; RGBAFormat.colorkey = 0; RGBAFormat.alpha = 0;
    RGBAFormat.BitsPerPixel = 32; RGBAFormat.BytesPerPixel = 4;

#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    RGBAFormat.Rmask = 0xFF000000; RGBAFormat.Rshift =  0; RGBAFormat.Rloss = 0;
    RGBAFormat.Gmask = 0x00FF0000; RGBAFormat.Gshift =  8; RGBAFormat.Gloss = 0;
    RGBAFormat.Bmask = 0x0000FF00; RGBAFormat.Bshift = 16; RGBAFormat.Bloss = 0;
    RGBAFormat.Amask = 0x000000FF; RGBAFormat.Ashift = 24; RGBAFormat.Aloss = 0;
#else
    RGBAFormat.Rmask = 0x000000FF; RGBAFormat.Rshift = 24; RGBAFormat.Rloss = 0;
    RGBAFormat.Gmask = 0x0000FF00; RGBAFormat.Gshift = 16; RGBAFormat.Gloss = 0;
    RGBAFormat.Bmask = 0x00FF0000; RGBAFormat.Bshift =  8; RGBAFormat.Bloss = 0;
    RGBAFormat.Amask = 0xFF000000; RGBAFormat.Ashift =  0; RGBAFormat.Aloss = 0;
#endif
    
    SDL_Surface * orig = NULL;
    orig = IMG_Load(filename);
    
    if(orig == NULL){
        printf("SDL_image error: \"%s\" couldn't be loaded\n", filename);
        return 0;
    }
    
    SDL_Surface * conv = SDL_ConvertSurface(orig, &RGBAFormat, SDL_SWSURFACE);
    SDL_FreeSurface(orig);
    
    glGenTextures(1, &textureid);
    
    printf("%d\n", textureid);
    
    glBindTexture(GL_TEXTURE_2D, textureid);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, conv->w, conv->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, conv->pixels);
    glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
    
    SDL_FreeSurface(conv);

    return 2;
}
    
void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    
    glBindTexture(GL_TEXTURE_2D, textureid);

    glBegin(GL_QUADS);

        // bottom left
        glTexCoord2f(0, 0);
        glVertex2f(0, 0);

        // bottom right
        glTexCoord2f(1, 0);
        glVertex2f(100, 0);

        // top right
        glTexCoord2f(1, 1);
        glVertex2f(100, 100);

        // top left
        glTexCoord2f(0, 1);
        glVertex2f(0, 100);

    glEnd();    
        
    SDL_GL_SwapBuffers();
    return;
}
And yes, I have enabled GL_TEXTURE_2D (in my OpenGL initialization function, which I didn't include above).
Sage
Posts: 1,482
Joined: 2002.09
Post: #8
Wow, that took me way too long to figure out.

Hint: Try loading an image with non-white corners...

Edit: Hmmm. I was kind of assuming (somewhat stupidly) that you were using the default coordinates as well. Otherwise it works just fine for me.

Scott Lembcke - Howling Moon Software
Author of Chipmunk Physics - A fast and simple rigid body physics library in C.
Moderator
Posts: 691
Joined: 2002.04
Post: #9
Try passing 'conv' to SDL_LockSurface() before you pass it to glTexImage2D(); if you do not, you are not guaranteed that the 'pixels' pointer is valid...

Mark Bishop
Member
Posts: 185
Joined: 2005.02
Post: #10
Ok, I've gotten it to work now. The program just didn't like the image file I was trying to use. I have some questions, though:
1) Why are my textures being displayed with the y-axis inverted (upside down)?
2) What version of OpenGL does a PPC iBook G4 running Tiger have? Is there some utility I can run to check? I ask because my program is rejecting non-power-of-two textures, which it shouldn't if my machine has OpenGL 2.0 or 2.1.

EDIT: Ok, using the OpenGL Driver Monitor I think I've determined that I have OpenGL 2.0, but my graphics card doesn't support non-power-of-two textures. Grr.
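
If the hardware lacks non-power-of-two support, the usual workarounds are the GL_TEXTURE_RECTANGLE_EXT target or padding the image up to the next power-of-two size. A sketch of the rounding helper (my own, not from any library):

```c
/* Smallest power of two >= n (returns 1 for n == 0). */
unsigned next_pow2(unsigned n)
{
    unsigned p = 1;
    while (p < n)
        p <<= 1;
    return p;
}
```

For example, a 100x100 image would be padded into a 128x128 texture, with the quad's texture coordinates scaled by 100/128 instead of running to 1.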
Member
Posts: 87
Joined: 2006.08
Post: #11
Check again; no iBook supports OpenGL 2.0 today.
Member
Posts: 72
Joined: 2006.10
Post: #12
Quote:1) Why are my textures being displayed with the y-axis inverted (upside down)?

Because OpenGL coordinates are always referenced from the bottom left. The first byte of your image, since it's a BMP, represents the top-left pixel, hence the vertical inversion.

It may sound silly and annoying, but the ARB cares more about mathematical coherence than programmer convenience, and for that I tip my hat to them. (Not sarcastically; I honestly think they're on the right track here.)
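
If you'd rather fix the data than the texture coordinates, you can reverse the rows of the pixel buffer before uploading it. A minimal sketch, assuming a tightly packed buffer (this is not an SDL call, and real SDL surfaces may have row padding via their pitch field):

```c
#include <stdlib.h>
#include <string.h>

/* Reverse the row order of a tightly packed image in place, so the
   top row becomes the bottom row. */
void flip_rows(unsigned char *pixels, int width, int height, int bytes_per_pixel)
{
    int pitch = width * bytes_per_pixel;
    unsigned char *tmp = malloc(pitch);
    if (!tmp)
        return;
    for (int y = 0; y < height / 2; y++) {
        unsigned char *top = pixels + (size_t)y * pitch;
        unsigned char *bottom = pixels + (size_t)(height - 1 - y) * pitch;
        memcpy(tmp, top, pitch);
        memcpy(top, bottom, pitch);
        memcpy(bottom, tmp, pitch);
    }
    free(tmp);
}
```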
Sage
Posts: 1,403
Joined: 2005.07
Post: #13
No, actually this is wrong: the reason your texture is upside down is that your texture coordinates are upside down.

Sir, e^iπ + 1 = 0, hence God exists; reply!
Member
Posts: 72
Joined: 2006.10
Post: #14
Well, yeah, that's the obvious conclusion, but I was pointing out why most people who start texturing end up with inverted y coordinates when they thought their mapping was fine.
Sage
Posts: 1,403
Joined: 2005.07
Post: #15
OpenGL coordinates are referenced from the bottom left only if you set up your projection and modelview matrices that way.

Sir, e^iπ + 1 = 0, hence God exists; reply!