glDeleteTextures()?

Mars_999
Unregistered
 
Post: #1
If you don't allocate memory dynamically with new or malloc, do you still use glDeleteTextures()? I am not using new for my texture variables, just global or local ones. Thanks
Mark Levin
Member
Posts: 177
Joined: 2002.08
Post: #2
glDeleteTextures has nothing to do with new or malloc, it only affects GL's texture object storage.
Mars_999
Unregistered
 
Post: #3
Quote:Originally posted by Mark Levin
glDeleteTextures has nothing to do with new or malloc, it only affects GL's texture object storage.


I know you use delete on memory allocated with new, but I was asking whether you still call glDeleteTextures() on all texture objects no matter what the situation is. Whether the memory was dynamically allocated or not, do you still use glDeleteTextures()? Thanks
Luminary
Posts: 5,143
Joined: 2002.04
Post: #4
glDeleteTextures has nothing to do with new or malloc, it only affects GL's texture object storage.
Feanor
Unregistered
 
Post: #5
Does this help?
Quote:C SPECIFICATION
void glDeleteTextures( GLsizei n, const GLuint *textures )

PARAMETERS
n        Specifies the number of textures to be deleted.
textures Specifies an array of textures to be deleted.

DESCRIPTION
glDeleteTextures deletes n textures named by the elements of the array textures. After a texture is deleted, it has no contents or dimensionality, and its name is free for reuse (for example by glGenTextures). If a texture that is currently bound is deleted, the binding reverts to 0 (the default texture).

Distinguish between a texture and some image file you loaded and have made into a texture. Textures are a logical entity, not the same as the image file you may or may not be using as a source. A texture gets created, meaning you then have a means to refer to it, and it gets bound, meaning there is an area of memory associated with it which can be read or written. You do not allocate textures yourself; you ask OpenGL to generate them. You do not have to worry in detail about what happens to memory when you generate and delete them.
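Here is a minimal sketch of that lifecycle (assuming a current GL context, with the actual image upload elided):

Code:
GLuint tex = 0;
glGenTextures(1, &tex);             /* ask GL for an unused texture name */
glBindTexture(GL_TEXTURE_2D, tex);  /* create the texture object and make it current */
/* glTexImage2D(...) here would hand GL the pixel data; GL keeps its own copy */
glDeleteTextures(1, &tex);          /* GL frees its storage; the name is free for reuse */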

The answer to your second question is technically "Yes" -- insofar as the two subjects are unrelated. Mark and Keith are probably concerned that you have got some erroneous idea in your head, and they do not want to mislead you.

Whether an image source for a texture is created dynamically or statically has no impact on your usage of glDeleteTextures(). Likewise, using glDeleteTextures() will have no impact on the image *source* for your textures. I hope Mark, Keith or somebody else will confirm or correct me.

I think maybe the question should have been, "What does glDeleteTextures() do?", hence my quoting the man page.
Mars_999
Unregistered
 
Post: #6
Argh, I UNDERSTAND THE WHOLE new & malloc() process! I know glDeleteTextures() isn't going to delete or free() memory I allocated with new or malloc(). What I was asking is: is it OK to use glDeleteTextures() with memory that was allocated with new or malloc()? I know that it works with a variable like this
Code:
unsigned int textures[10] = {0};
glDeleteTextures(10, textures);

but I was asking if this is OK too?

Code:
unsigned int *textures = new unsigned int[10];
glDeleteTextures(10, textures);

So glDeleteTextures() clears out all the textures in memory, and in theory I can then reload new images, let's say for a new level, using that same array? Use textures[] again instead of making a new memory location?
Luminary
Posts: 5,143
Joined: 2002.04
Post: #7
Yes, of course that's fine. How could glDeleteTextures know the difference?
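For instance (a sketch, assuming the names in textures[] originally came from glGenTextures and a context is current):

Code:
glDeleteTextures(10, textures);  /* GL throws away its storage for those names */
glGenTextures(10, textures);     /* get 10 fresh names into the same array */
/* ...bind each one and glTexImage2D() the new level's images as before... */

Just remember the array itself is still yours -- glDeleteTextures won't delete[] what you got from new[].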
Hog
Member
Posts: 151
Joined: 2002.09
Post: #8
You should certainly stick to GLuint[10] instead of unsigned int[10].
GLuint seems to be defined as
typedef unsigned long GLuint; in gl.h (at least in version 1.2.1), and
to the compiler, unsigned int* and unsigned long* are two different things.
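For example, if GLuint really is unsigned long there, the earlier snippet won't even compile:

Code:
unsigned int textures[10] = {0};
glDeleteTextures(10, textures);  /* error: unsigned int* won't convert to const GLuint* (unsigned long*) */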
Mars_999
Unregistered
 
Post: #9
Quote:Originally posted by c_dev
You should certainly stick to GLuint[10] instead of unsigned int[10].
GLuint seems to be defined as
typedef unsigned long GLuint; in gl.h (at least in version 1.2.1), and
to the compiler, unsigned int* and unsigned long* are two different things.


I thought I looked through the OpenGL headers and saw it was an unsigned int? Maybe that was in an older version of OpenGL? I will change the types. Thanks
Luminary
Posts: 5,143
Joined: 2002.04
Post: #10
The point is that you don't know what type GLuint actually is. It happens to be a typedef of unsigned long on Mac OS X. It could equally well be a typedef of unsigned int, or unsigned long long. You should always use GLuint since you can't be sure.
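In other words, a portable version of the earlier snippet is just:

Code:
GLuint textures[10] = {0};       /* whatever integer type gl.h happens to use */
glGenTextures(10, textures);
/* ...upload images, draw... */
glDeleteTextures(10, textures);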