GLSL

Member
Posts: 312
Joined: 2006.10
Post: #1
Hi, I know OpenGL 2.0 added official support for GLSL, and that most function names just dropped the ARB suffix. My question is: is there any difference between using the ARB extensions for GLSL and using the OpenGL 2.0 entry points? Does using the ARB extensions allow for support on older machines?

Also, pointers to any GLSL learning material would be appreciated Smile

Thanks
Moderator
Posts: 1,140
Joined: 2005.07
Post: #2
Keep using the ARB suffix versions: they still work on newer machines, and older machines that have OpenGL < 2.0 but support GLSL through extensions will be able to use them.
Luminary
Posts: 5,143
Joined: 2002.04
Post: #3
Yes, there are differences -- enough that you should make sure only to call the 2.0 functions if 2.0 is supported, and only to call the ARB functions if the extensions are supported, not to treat them as equivalent.

There are no feature-level differences that make one more powerful than the other; however, the ARB extensions may be supported on hardware that cannot otherwise support OpenGL 2.0 (GeForce FX and GMA 950 fit this category, and the Radeon 9500-X1900 series should too, but doesn't on Mac OS X -- that's a different rant).

The Orange Book may be worth having. OpenGL.org has a GLSL quick reference PDF which is essential to keep on hand. There are plenty of tutorials for GLSL and for Cg on the web, and the two languages are similar enough that there shouldn't be any major problems porting between them.
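
To make the distinction concrete, here is a rough sketch of the two parallel paths in plain C (error checking omitted; the build_program_* helper names are just for illustration). Note the ARB path uses GLhandleARB handles rather than GLuint, which is one reason not to mix the two.

Code:
#include <OpenGL/gl.h>      /* <GL/gl.h> plus glext.h on other platforms */
#include <OpenGL/glext.h>

/* OpenGL 2.0 path -- handles are plain GLuint */
GLuint build_program_gl20(const char *src)
{
    GLuint shader, program;

    shader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(shader, 1, &src, NULL);
    glCompileShader(shader);

    program = glCreateProgram();
    glAttachShader(program, shader);
    glLinkProgram(program);
    return program;
}

/* ARB extension path -- note the different handle type */
GLhandleARB build_program_arb(const char *src)
{
    GLhandleARB shader, program;

    shader = glCreateShaderObjectARB(GL_VERTEX_SHADER_ARB);
    glShaderSourceARB(shader, 1, &src, NULL);
    glCompileShaderARB(shader);

    program = glCreateProgramObjectARB();
    glAttachObjectARB(program, shader);
    glLinkProgramARB(program);
    return program;
}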
Member
Posts: 312
Joined: 2006.10
Post: #4
Thanks. So I guess it would be as simple as checking for OpenGL 2.0, or, failing that, checking for GL_ARB_shader_objects and GL_ARB_shading_language_100, and then making the appropriate calls.
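
Something along these lines, roughly -- a sketch only, assuming a current GL context; the have_gl2 helper name is just for illustration:

Code:
#include <stdlib.h>
#include <OpenGL/gl.h>

/* Parsing the leading "major.minor" out of the GL_VERSION string is
   enough for a 2.0-or-not decision. */
int have_gl2(void)
{
    const char *version = (const char *)glGetString(GL_VERSION);
    return version != NULL && atof(version) >= 2.0;
}

If that returns false, check for the ARB extensions and use the ARB entry points instead.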

As for Cg compared to GLSL, do you (or anyone else) have a preference (all biased views are welcome Wink)?
Sage
Posts: 1,232
Joined: 2002.10
Post: #5
For the extensions, you actually need to check for
ARB_shader_objects (required)
ARB_shading_language_100 (required)
ARB_vertex_shader (required if you want to use vertex shaders)
ARB_fragment_shader (required if you want to use fragment shaders)

I only mention this because it is possible to use vertex shaders without fragment shaders, and vice versa. ARB_vertex_shader is exported on all renderers in Mac OS X (it is emulated in software on older hardware) while ARB_fragment_shader is only exported on renderers that export ARB_fragment_program.
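
A minimal check might look something like this (sketch only; the helper names are illustrative, and strstr() is good enough for these four names, though a tokenized search is safer in general):

Code:
#include <string.h>
#include <OpenGL/gl.h>

/* Needs a current GL context. */
static int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

void check_glsl_support(int *vertex_ok, int *fragment_ok)
{
    int base = has_extension("GL_ARB_shader_objects") &&
               has_extension("GL_ARB_shading_language_100");

    *vertex_ok   = base && has_extension("GL_ARB_vertex_shader");
    *fragment_ok = base && has_extension("GL_ARB_fragment_shader");
}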
Luminary
Posts: 5,143
Joined: 2002.04
Post: #6
You should use GLSL rather than Cg. Cg is very NVidia-centric, and may not give you access to the full power of recent hardware.
Member
Posts: 312
Joined: 2006.10
Post: #7
Thanks for the info guys Smile
Member
Posts: 312
Joined: 2006.10
Post: #8
Using GLSLEditorSampler, it seems all vertex shaders are being done in software. Is that typical for an Intel GMA 950?
Moderator
Posts: 522
Joined: 2002.04
Post: #9
Yep. GMA 950s only do fragment programs in hardware. Vertex programs are done by the CPU. (As is transform & lighting... which is why you get vertex-count bound on these stupid things so much.)
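
(If you want to confirm that programmatically, CGL should expose the kCGLCPGPUVertexProcessing / kCGLCPGPUFragmentProcessing context parameters for exactly this. A sketch, assuming those parameters are available in your SDK; report_gpu_processing is just an illustrative name.)

Code:
#include <stdio.h>
#include <OpenGL/OpenGL.h>   /* CGL */

void report_gpu_processing(void)
{
    GLint vertexOnGPU = 0, fragmentOnGPU = 0;
    CGLContextObj ctx = CGLGetCurrentContext();

    /* Ask whether each stage of the current context runs on the GPU. */
    CGLGetParameter(ctx, kCGLCPGPUVertexProcessing,   &vertexOnGPU);
    CGLGetParameter(ctx, kCGLCPGPUFragmentProcessing, &fragmentOnGPU);

    printf("vertex on GPU: %d, fragment on GPU: %d\n",
           (int)vertexOnGPU, (int)fragmentOnGPU);
}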

-Jon
Member
Posts: 81
Joined: 2007.05
Post: #10
I made the Cg mistake.

It boiled down to this: on the Mac, with Cg, you're going to be limited to the arbvp1 and arbfp1 profiles. The old assembly-language stuff. The lowest standard. Fixed loops, etc. Yep, you can use the glstate variables, but you can't use them in the other profiles. Ugh, now you're passing state through the Cg API. OK, you run on an NVidia card and you can have a better profile.

But the rub is, if you run on an ATI card you're *always* going to get the arbfp1 and arbvp1 profiles. It's not like ATI wants to help. Thus, you will not enjoy the better hardware features on the ATI card.
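
(For what it's worth, this is roughly what profile selection looks like through the cgGL runtime -- a sketch, assuming the standard cgGLGetLatestProfile / cgGLSetOptimalOptions calls; pick_profiles is just an illustrative name. This is where the ceiling shows up: on an ATI Mac the "latest" profile never gets past arbvp1/arbfp1.)

Code:
#include <Cg/cg.h>
#include <Cg/cgGL.h>

/* Ask the Cg GL runtime for the "best" profiles the current renderer
   supports.  On an ATI card under Mac OS X these come back as
   CG_PROFILE_ARBVP1 / CG_PROFILE_ARBFP1. */
void pick_profiles(CGprofile *vp, CGprofile *fp)
{
    *vp = cgGLGetLatestProfile(CG_GL_VERTEX);
    *fp = cgGLGetLatestProfile(CG_GL_FRAGMENT);

    cgGLSetOptimalOptions(*vp);
    cgGLSetOptimalOptions(*fp);
}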

Yet if you go the GLSL route, you enjoy a better deal on both cards.

There is another Cg profile called glsl. It converts the Cg source to GLSL, but if you look at the results it's Greek, with type casts all over the place.

At the end of the day, one concludes that GLSL is the "standard".

Cg on a GMA system? Is NVidia really going to debug that ball of yarn, or is Intel going to care? I see the different vendors pointing fingers at each other ... he he ... Wacko
Moderator
Posts: 522
Joined: 2002.04
Post: #11
macnib Wrote:Cg on a GMA system? Is NVidia really going to debug that ball of yarn, or is Intel going to care? I see the different vendors pointing fingers at each other ... he he ... Wacko
Whether Cg works and whether it's a GMA system are unrelated. As long as ARB fp and vp work on the GMA 950 (in hardware or software, whatever), it'll work.

Personally, I don't care about high-end GPU features that you can't get at with ARB fp, because the market for a game that requires those features is too specialized for the games I want to make.

Here's a rebuttal about Cg/GLSL on the Unity forum:
http://forum.unity3d.com/viewtopic.php?t=5595

Cg still seems like the best choice on Mac OS X.

-Jon
Member
Posts: 312
Joined: 2006.10
Post: #12
Now I'm torn between the two. I'm mostly concerned with support. Which runs best on the most cards (Mac and Windows)?
Moderator
Posts: 1,140
Joined: 2005.07
Post: #13
Cg was mainly filling the role of GLSL before GLSL was introduced. As such, GLSL will grow while Cg probably won't.
Luminary
Posts: 5,143
Joined: 2002.04
Post: #14
If you believe the Unity folks' arguments, then by all means try Cg. Just don't say you weren't warned Wink

Personally, I'd be using GLSL, but then I'm not shy about turning off shaders on PowerPC Macs on 10.4 if that's what's required.
Member
Posts: 312
Joined: 2006.10
Post: #15
OSC, you're probably right. It only makes sense to use GLSL. I already know the basics of the language, and I've got a nice wrapper around the GL calls for using GLSL shaders. I think I'll stick with GLSL, then try to port any of my shaders to ShaderLab/Cg if I want to use them in Unity.

Thanks for all your input, guys!