glBindTexture with ALPHA broken on NVIDIA cards!?

Post: #1
Dear all!

I have a rather simple algorithm that renders an object to a texture. Now I want to use several of these textures to fake some kind of "motion blur".
I do the following:

glClearColor(1.0f, 1.0f, 1.0f, 0.f); //ALPHA 0 !!
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, g_Viewport, g_Viewport, 0);

Later on I render this texture via:

glColor4f(1.f,1.f,1.f,alpha); //alphafade
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);

So far so good. It works fine on ATI cards, but not at all on NVIDIA (latest driver, no question about that). Any ideas? Do I perhaps have to use some "strange" NV_ extension to make this work?
Please help!

Thanks & many greetings

PS: I am using SDL; possibly my setup is broken somehow?
PPS: I create the texture in memory with, mainly:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 512, 0, type, GL_UNSIGNED_INT, pToMyTexture); // internal format "4" is the legacy alias for GL_RGBA
Post: #2
Did you remember to request destination alpha in your OpenGL setup code? Enable blending? Choose an appropriate blendfunc?
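A minimal sketch of that setup, assuming the SDL 1.2 API (SDL_SetVideoMode; the window size and the rest of the flags are placeholders, not taken from the original code). The key point is that SDL_GL_ALPHA_SIZE must be requested before the surface is created, otherwise the framebuffer may be allocated with zero alpha bitplanes; glCopyTexImage2D with GL_RGBA then reads back alpha = 1.0 everywhere on such a framebuffer, and whether you get alpha planes by default is driver-dependent, which would explain the ATI/NVIDIA difference:

```c
#include <SDL/SDL.h>
#include <SDL/SDL_opengl.h>

/* Sketch only: SDL 1.2-style video init with destination alpha. */
int init_video(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return -1;

    /* Request alpha bitplanes in the framebuffer *before*
     * SDL_SetVideoMode, or glCopyTexImage2D has no destination
     * alpha to read back. */
    SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    if (SDL_SetVideoMode(512, 512, 32, SDL_OPENGL) == NULL)
        return -1;

    /* Blending must be on for glColor4f's alpha to fade the
     * textured quad at all. */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    return 0;
}
```

You can sanity-check what you actually got with glGetIntegerv(GL_ALPHA_BITS, &bits) after context creation; if it reports 0, the copied texture's alpha channel will be constant 1.0.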