(unsigned) char, bitsets when using - + / * - Printable Version
+- iDevGames Forums (http://www.idevgames.com/forums)
+-- Forum: Development Zone (/forum-3.html)
+--- Forum: Programming Languages & Scripting (/forum-8.html)
+--- Thread: (unsigned) char, bitsets when using - + / * (/thread-3169.html)
(unsigned) char, bitsets when using - + / * - wyrmmage - Jul 19, 2007 08:09 PM
after reading an old book about C, I decided to do some optimizing to my graphics engine. Beforehand, I had not known that a char could hold an int value, and I had never heard of bitsets. Many of the variables I was using only needed a small range of numbers... some only needed three possible values, so I used a 2-bit bitset, and some needed a slightly larger range, but not a huge one, so I used a char. After googling around for a while, I found this article: http://www.eventhelix.com/RealtimeMantra/Basics/OptimizingCAndCPPCode.htm which basically says that the C/C++ (I'm compiling in C++) compiler automatically recasts char variables to a signed int when doing mathematical operations like addition, subtraction, etc., and that therefore, if speed rather than memory is your main concern, you should stick with ints instead of chars. Is this true? I'm not outputting the value as a char, only as an int, so should I go ahead and use ints instead of chars? Does the same apply to bitsets? If I use an unsigned char and then perform math with it, will it be converted to a signed char, or will it keep its unsigned status? What if I use an unsigned int instead of an unsigned char?
(unsigned) char, bitsets when using - + / * - unknown - Jul 19, 2007 09:01 PM
Yes, use ints. Also, profile your code, then optimize the bits which are slow.
(unsigned) char, bitsets when using - + / * - wyrmmage - Jul 19, 2007 09:27 PM
alright; thanks for the quick reply. I know about general optimizing rules, but I figure the code requires less work overall if I do things the faster way the first time XD
(unsigned) char, bitsets when using - + / * - akb825 - Jul 19, 2007 10:01 PM
On modern computers, everything is done in 32 bits (or 64 bits) regardless of whether the type is 32 bits or smaller. This means that if you have a char, the CPU loads the 8 bits and stores them in a 32-bit register. When you do any operations on it, it's using all 32 bits. When it comes time to store that value back into memory, it simply throws away all but 8 bits of the value. In the end, it does the same amount of work as an int. The only difference is that it uses less memory, but due to alignment constraints (such as ints needing to be aligned on 4-byte boundaries), even that may not matter. In the end, it's not worth it. If anything, it may even make things worse.
(unsigned) char, bitsets when using - + / * - OneSadCookie - Jul 19, 2007 10:53 PM
Always profile before optimizing. Very often, horrifically inefficient pieces of code (like bits written in Ruby) have absolutely zero effect on the performance. There's no point in "trying to do things the fast way first", that's just a guarantee of wasting your time. Do things the easy way first, then if (and only if) performance is a problem, profile, then optimize the bits that are actually causing you problems.
(unsigned) char, bitsets when using - + / * - wyrmmage - Jul 21, 2007 02:18 PM
thanks for all of the help...just one last question then
If you have code like:
what happens? Does the unsigned int get converted into a signed int for the operation?
(unsigned) char, bitsets when using - + / * - OneSadCookie - Jul 21, 2007 03:41 PM
C99 Spec Wrote: 6.3.1.3 Signed and unsigned integers
(whatever that really means, the answer "in practice" is that you get 1. I think how it gets there is that -1 gets converted to unsigned, unsigned addition gets done (which overflows), and the overflow is discarded)
(unsigned) char, bitsets when using - + / * - wyrmmage - Jul 21, 2007 04:01 PM
very interesting stuff
Well, that answers all of my questions... thanks for all the help guys