byteswapping using SDL?

Mars_999
Unregistered
Post: #1
I have looked at the byte-swapping header in SDL and am now confused. On a Mac an int is 4 bytes, right? So that's 32 bits. A float is also 4 bytes, right? And if it were 8 bytes, that would be 64 bits? SDL has 16-, 32-, and 64-bit byte-swapping macros, so it shouldn't matter whether I send a 4-byte int or a 4-byte float to the function, right? Also, is anyone here using SDL's byte-swapping functions? I see that you need to use some SDL-defined file abstraction instead of your standard C FILE struct? Thanks
wadesworld
Member
Posts: 116
Joined: 2002.04
Post: #2
Yes, a 4-byte swapping macro can handle both a float and an int, with one caveat: a float's bytes have to be swapped through its bit pattern. If you pass a float directly to an integer macro, C converts its value to an integer rather than reinterpreting its bytes.
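For example, here's a minimal sketch of that trick (SwapFloat32 is just a made-up helper name; SDL_Swap32 is SDL's real 32-bit swap macro from SDL_endian.h, and the same memcpy approach works with the Endian.h macros mentioned below):

Code:
#include <string.h>
#include "SDL_endian.h"

/* Swap a float's bytes by reinterpreting its bit pattern as a
   32-bit integer first. Passing the float straight to an integer
   macro would convert its value, not its bytes. */
static float SwapFloat32(float f)
{
    Uint32 bits;
    memcpy(&bits, &f, sizeof(bits)); /* reinterpret float as Uint32 */
    bits = SDL_Swap32(bits);         /* unconditional byte swap */
    memcpy(&f, &bits, sizeof(f));    /* reinterpret Uint32 as float */
    return f;
}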

Of course you don't have to use SDL for this. You can just use the macros defined in Endian.h.

Wade
Mars_999
Unregistered
Post: #3
Thanks for the reply. When I do a search for Endian.h I come up with a few different files in the frameworks directory? Which one should I be including? And I can't seem to figure out which function I should be using for all the different types. Also, when should I be making the function call? Right after the file input and before you use the variable? Thanks
wadesworld
Member
Posts: 116
Joined: 2002.04
Post: #4
On OS X, you should do:
#include <CarbonCore/Endian.h>

On OS 9, you can just do #include <Endian.h>


As for using it, from the instructions in the header:

Code:
/*
    This file provides Endian Flipping routines for dealing with converting data
    between Big-Endian and Little-Endian machines.  These routines are useful
    when writing code to compile for both Big and Little Endian machines and  
    which must handle other endian number formats, such as reading or writing
    to a file or network packet.
    
    These routines are named as follows:
    
        Endian<U><W>_<S>to<D>

    where
        <U> is whether the integer is signed ('S') or unsigned ('U')
        <W> is integer bit width: 16, 32, or 64
        <S> is the source endian format: 'B' for big, 'L' for little, or 'N' for native
        <D> is the destination endian format: 'B' for big, 'L' for little, or 'N' for native
    
    For example, to convert a Big Endian 32-bit unsigned integer to the current native format use:
        
        long i = EndianU32_BtoN(data);
*/

So, say you wanted to convert a little-endian integer to the Mac's big-endian format. You actually have a choice: you could use

EndianS32_LtoB(value) // swap a signed 32-bit from little to big

or

EndianS32_LtoN(value) // swap a signed 32-bit from little to native

You might think LtoN would be the better choice, since it should keep working if you moved your code to another platform. However, the header doesn't really implement its macros for other systems, so you'd have to do some work to get it building elsewhere anyway.

In short, on a big-endian Mac the two are equivalent: you can use either macro and it won't make any difference.
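To answer the "when do I call it" question, here's a minimal sketch of the typical pattern: read the raw little-endian bytes from the file, then convert right after reading and before you use the value (the file name "level.dat" and the score variable are made up for illustration):

Code:
#include <stdio.h>
#include <CarbonCore/Endian.h>

int main(void)
{
    FILE *fp = fopen("level.dat", "rb"); /* hypothetical little-endian data file */
    SInt32 score;

    if (fp != NULL && fread(&score, sizeof(score), 1, fp) == 1) {
        /* Convert immediately after reading, before the value is used. */
        score = EndianS32_LtoN(score);
        printf("score = %ld\n", (long)score);
    }
    if (fp != NULL)
        fclose(fp);
    return 0;
}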

Wade
Mars_999
Unregistered
Post: #5
Hey, thanks wadesworld, but when I include that header I get a "no such file or directory" error? I have looked through my frameworks dir and see no CarbonCore framework? Is this something I have to download separately from PB?
Feanor
Unregistered
 
Post: #6
Erm, if someone is using SDL, is that not usually because they want to write cross-platform code? And wouldn't using Carbon invalidate that compatibility?
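For what it's worth, SDL's own SDL_endian.h already handles this portably: the SDL_SwapLE*/SDL_SwapBE* macros compile to a no-op on a matching host and to a real swap otherwise. A minimal sketch (the constant is made up):

Code:
#include <stdio.h>
#include "SDL_endian.h"

int main(void)
{
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    printf("big-endian host\n");    /* e.g. a PowerPC Mac */
#else
    printf("little-endian host\n");
#endif
    /* "This value is little-endian on disk; give me native order."
       No-op on little-endian hosts, a byte swap on big-endian ones. */
    printf("0x%08x\n", (unsigned int)SDL_SwapLE32(0x12345678));
    return 0;
}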
Mars_999
Unregistered
 
Post: #7
Quote:Originally posted by Feanor
Erm, if someone is using SDL, is that not usually because they want to write cross-platform code? And wouldn't using Carbon invalidate that compatibility?

Not really. SDL has a nice set of libraries all in one package: music, sound, input, OpenGL support, etc. So why not use it? It beats having to hunt around for various libraries. But yeah, its biggest advantage is being cross-platform, which is why I should have coded the game I wrote back on the PC in SDL and not the Win32 API!!!!! Argh, I am going to go kick my own ass now for being so stupid!!!