Drawing RGB pixel arrays - Printable Version
+- iDevGames Forums (http://www.idevgames.com/forums)
+-- Forum: Development Zone (/forum-3.html)
+--- Forum: Graphics & Audio Programming (/forum-9.html)
+--- Thread: Drawing RGB pixel arrays (/thread-3120.html)
Drawing RGB pixel arrays - andy82 - Aug 6, 2007 07:37 AM
I'm new to programming on the Macintosh. I'm trying to port a game to Mac OS X. The game is written in plain C and already compiles on a variety of different platforms. Because of the OS-abstracted nature of the game, I do not need to use many Carbon API functions. In fact, the only drawing function I need would be a function that allows me to draw RGB pixels to a window.
My game stores all graphics data in simple RGB arrays with 32-bits per pixel. The alpha channel is unused, so each pixel is of the format 0x00RRGGBB. Now all I would need to get the graphics going under Mac OS X would be a function to draw these pixels to my window.
I read a little bit through the docs on ADC and found the function CGBitmapContextCreate(), which lets me create a bitmap context backed by an RGB array. However, as far as I can see, I cannot draw this bitmap directly to a window. Instead, in order to draw the bitmap, I first have to convert it into an image using CGBitmapContextCreateImage(). When I finally have the image, I can draw it into the window using CGContextDrawImage().
And here is what worries me: the documentation for CGBitmapContextCreateImage() says that the pixel data is copied (lazily, as soon as the original pixel data changes). But I don't want that, because the pixel data in the RGB array changes with every frame, of course. So what should I do? The only thing that comes to mind is:
1) Attach all RGB arrays to bitmap contexts using CGBitmapContextCreate().
2) Draw next frame into RGB array attached to a bitmap context
3) Convert this bitmap context into an image
4) Draw the image to the window
5) Dispose image
6) Loop, i.e. jump back to 2)
This would be a solution, but I don't know how expensive CGBitmapContextCreateImage() is. In the approach presented above I would be calling CG...CreateImage() for every frame that is rendered, which is probably a huge overhead. But I do not see another way to do it, because it appears that I cannot modify the pixel data of an image directly. Thus, I seem to be forced to use both a bitmap context and an image.
OK, I hope I've made my case clear, and I would like to hear some thoughts on this from experienced Mac OS X programmers.
Drawing RGB pixel arrays - ThemsAllTook - Aug 6, 2007 08:10 AM
Since you're worried about performance, OpenGL is probably a better route than CoreGraphics. After initially creating a texture for each RGB array, you could call glTexSubImage2D to update them every frame. Depending on how many things you're drawing and how often you're updating them, though, the CoreGraphics approach may well be fast enough - no real way to know other than to try it and see.
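One wrinkle with the glTexSubImage2D route is pixel layout: the game's 0x00RRGGBB words have to reach GL in a format it understands. A minimal conversion helper might look like this (the function name and the GL_RGBA/GL_UNSIGNED_BYTE upload format are my assumptions, not anything from this thread):

```c
#include <stdint.h>
#include <stddef.h>

/* Convert 0x00RRGGBB pixels (host-order uint32_t) into a tightly packed
 * RGBA byte buffer suitable for a
 * glTexSubImage2D(..., GL_RGBA, GL_UNSIGNED_BYTE, dst) upload.
 * The unused alpha channel is forced to opaque. */
static void xrgb_to_rgba(const uint32_t *src, uint8_t *dst, size_t count)
{
    for (size_t i = 0; i < count; i++) {
        uint32_t p = src[i];
        dst[i * 4 + 0] = (uint8_t)(p >> 16); /* R */
        dst[i * 4 + 1] = (uint8_t)(p >> 8);  /* G */
        dst[i * 4 + 2] = (uint8_t)p;         /* B */
        dst[i * 4 + 3] = 0xFF;               /* A (unused, opaque) */
    }
}
```

Each frame you would run the finished RGB array through this and hand the result to glTexSubImage2D. On hardware and drivers that support it, uploading as GL_BGRA with GL_UNSIGNED_INT_8_8_8_8_REV can skip the conversion pass entirely.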
Drawing RGB pixel arrays - AnotherJake - Aug 6, 2007 11:28 AM
This topic comes up a lot from people porting games to the Mac or learning from older books. I have thought many times about coding a simple framework to address it more easily, but time seems short these days. Maybe one of these afternoons I can whip something up for it. Until then, to the best of my knowledge, here's the deal:
Drawing/blitting directly to video memory or a window just doesn't really work on modern OS X. Even if you do get stuff to work the old-fashioned way, that path is littered with trouble at every corner. As ThemsAllTook already said, OpenGL is the best way to do it (at least that we know of around here). That doesn't mean using much of OpenGL or doing 3D graphics or anything like that, but rather using it as your rasterizer. Just as ThemsAllTook suggested, in OpenGL you would make a quad the size of the window or screen and a texture to use as your pixel buffer, and simply update that texture every frame. You're essentially tricking OpenGL into being your high-performance blitter. The bonus you get by doing this is automatic DMA acceleration when updating the texture (er, pixel buffer, frame buffer, whatever). It seems like an awkward solution, but it works pretty well from what I've seen myself and heard from others.
And oh yeah, don't let yourself get suckered into thinking glDrawPixels would be better either, because it's slow. Texture updating is fast.