iPhone native graphics format

Posts: 2
Joined: 2009.02
Post: #1
My app has just a few too many moving, animating elements, so I am having trouble getting the graphics to draw at the right speed. My initial implementation rendered everything to bitmaps, then put those bitmaps into a CALayer to be displayed. It ran very slowly (30 sec per tick).

The next implementation was to do everything with CALayers. I had about 300 CALayers on the screen, and things went quite a bit faster (3 sec per tick), but obviously still too slow.

The third implementation was a hybrid approach: most of the static imagery (backgrounds, walls, other unchanging features) was drawn to a bitmap, while the animatable pieces use CALayers. This got us to about 0.5 sec per tick, which is fairly close to where we need to be. Looking at the instrumentation, I find that the phone redraws almost the whole screen whenever one CALayer changes, even though that CALayer occupies one little corner of the screen.

So now I'm back to looking at compositing everything myself in data buffers, then converting to bitmaps to draw on the screen. Looking at various message boards, it sounds like the underlying hardware either uses 32-bit graphics in a couple of formats, or 16-bit graphics in BGRx5551 format.

I'm interested in using the 16-bit format, and I really would like to use the native format of the phone, so that the phone doesn't have to convert the image every time the screen changes, which would be a big waste of effort. However, it is very difficult finding out how to get my data in that native format.

I'm storing the image into an unsigned short array, with bits of the data arranged as follows:

Bit 15                                                    Bit 0
(weight 32768)                                        (weight 1)
| Blue (bits 11-15) | Green (bits 6-10) | Red (bits 1-5) | unused (bit 0) |

(I hope the picture makes sense. Each short is 16 bits long. What I call 'bit 0' has binary weight 1 and is the unused bit. Bits 1 through 5, with binary weights 2 through 32, represent the red intensity, with bit 5 being the most significant. Bits 6 through 10 are green, and bits 11 through 15 are blue.)

My understanding is that this is the BGRx-5551 format. After I compute this image, I store it into a bitmap context like this:

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef bitmap = CGBitmapContextCreate((void *) buffer, 320, 480, 5, 640, colorSpace,
     kCGBitmapByteOrder16Little | kCGImageAlphaNoneSkipFirst);
Later I want to display this, so I make a CGImageRef and a CALayer, like this:

CGImageRef image = CGBitmapContextCreateImage(bitmap);
CALayer *layer = [CALayer layer];
[layer retain];
//... set the layer's bounds, origin, etc.
layer.contents = (id)image;
[... addSublayer:layer];
When I want to change the image, to avoid the Copy On Write, I do this:

layer.contents = nil;
CGImageRelease(image); // release the old image so it doesn't leak
//... change some data in my buffers
image = CGBitmapContextCreateImage(bitmap);
layer.contents = (id)image;

The colors are messed up on the display. After poking around with the data, I find that the underlying representation is what I would call xRGB1555:

Bit 15                                                    Bit 0
(weight 32768)                                        (weight 1)
| unused (bit 15) | Red (bits 10-14) | Green (bits 5-9) | Blue (bits 0-4) |

It's fine to use this representation, if this is the underlying model used by the hardware. I'm trying to avoid having the iPhone convert the image from the format I'm using to the hardware model that it wants. So I guess there are a couple of questions:

1) What is the correct bit order I should be using?

2) How do I tell the system I've built the image in that bit order?

3) How do I tell the color space system not to 'fix' the colors if they happen to be outside the gamut?

Everything else in my app is done; I just need to get these graphics fixed. Help is very much appreciated.
Posts: 3,591
Joined: 2003.06
Post: #2
Sorry I don't have time to parse carefully through your entire post, but is there a particular reason why you can't use OpenGL instead of CALayers? You'd likely see a huge performance improvement.
Posts: 9
Joined: 2011.06
Post: #3

Your question about the 16-bit format is a complex one; the best approach would be to look at some working code that already implements the approach you are interested in. Have a look at the source code for my graphics library for iOS, named AVAnimator.


It seems like graphics performance is critical to your application, so I would suggest considering this library as a back end: prepare and render your images, writing them to the optimized .mvid format on disk. When the render stage is complete, you can then blit the composed frames to the screen one at a time, and performance will be very good. On newer devices it is possible to get 45 to 60 FPS with a full-screen blit using this approach.

You could also just take a look at the source code and then develop your own code to do the same thing, but I think you will find just using the existing library and optimized ARM asm blit code is a big win.