glBindTexture with ALPHA sux on NVIDIA Cards !?

hgore69
Unregistered
 
Post: #1
Dear all!

I have a rather simple algorithm that renders an object to a texture. Now I want to use several of these textures to fake some kind of "motion blur".
I do the following:

glClearColor(1.0f, 1.0f, 1.0f, 0.f);  // clear to white with ALPHA = 0 !!
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
pMyObject->Render();
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, g_Viewport, g_Viewport, 0);  // grab the framebuffer (including alpha) into the texture

Later on I render this texture via:

glColor4f(1.f, 1.f, 1.f, alpha);  // alpha fade
glBindTexture(GL_TEXTURE_2D, myTextureGLINT);
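(For completeness: the texture then gets drawn on a screen-aligned quad, roughly like the sketch below - the exact quad/texcoord setup in my real code may differ:)

glEnable(GL_TEXTURE_2D);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);  // default env mode, so the glColor4f alpha scales the texture
glBegin(GL_QUADS);
    glTexCoord2f(0.f, 0.f); glVertex2f(-1.f, -1.f);
    glTexCoord2f(1.f, 0.f); glVertex2f( 1.f, -1.f);
    glTexCoord2f(1.f, 1.f); glVertex2f( 1.f,  1.f);
    glTexCoord2f(0.f, 1.f); glVertex2f(-1.f,  1.f);
glEnd();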

So far so good: this works fine on ATI cards, but not at all on NVIDIA (latest drivers, so that's not the issue). Any ideas on this?
Do I perhaps have to use some "strange" NV_ extension to make this work?
Please help!

Thanks & many greetings
h.

PS: I am using SDL; maybe my setup is somehow at fault?
PPS: I create the texture in memory mainly with:
glTexImage2D(GL_TEXTURE_2D, 0, 4, 512, 512, 0, type, GL_UNSIGNED_INT, pToMyTexture);
Luminary
Posts: 5,143
Joined: 2002.04
Post: #2
Did you remember to request destination alpha in your OpenGL setup code? Enable blending? Choose an appropriate blend func?
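Something like this, as a minimal sketch assuming SDL 1.2 (width/height/bpp are whatever your app already uses):

// request destination alpha BEFORE creating the GL context
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);
SDL_SetVideoMode(width, height, 32, SDL_OPENGL);

// and when drawing the faded copies
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  // blend by the alpha you copied into the texture

Without alpha bits in the framebuffer, glCopyTexImage2D reads back alpha = 1 everywhere, which would explain the ATI-vs-NVIDIA difference if one driver happens to give you destination alpha by default and the other doesn't.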

Possibly Related Threads...
Thread | Author | Replies | Views | Last Post
General Questions About Old Or Limited Graphics Cards | Oddity007 | 5 | 3,914 | Mar 3, 2009 01:39 AM by arekkusu
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, ...) throws GL errors | ravuya | 1 | 5,811 | Nov 2, 2006 12:06 AM by ravuya
Pbuffer problems on Intel NVIDIA GeForce 7300 GT | NYGhost | 5 | 4,215 | Oct 26, 2006 09:39 AM by NYGhost
nvidia cg | habahaba | 5 | 3,610 | Aug 2, 2006 01:36 PM by OneSadCookie
Heads up: glDrawbuffer(GL_FRONT) issue w/OS X 10.4.3 and NVidia cards | zKing | 4 | 3,603 | Jan 11, 2006 01:00 PM by arekkusu