Thoughts on supporting @2x resolution ...

Moderator
Posts: 3,577
Joined: 2003.06
Post: #31
I cannot remember exactly why I did it that way. You're right; in that chunk of code there is certainly no reason not to combine them.

I seem to recall I was debugging touch coordinates, which do not transform automatically, so I was being explicit about the conversion and probably forgot to combine the redundant code afterwards.
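For anyone following along, here's a minimal sketch of the kind of explicit conversion I mean. The helper is hypothetical, and I'm assuming the scale comes from the view's contentScaleFactor (1.0 on older devices, 2.0 on retina):

Code:
/* Convert a touch location in points to framebuffer pixels.
 * 'scale' is assumed to be the view's contentScaleFactor. */
static void pointsToPixels(float xPoints, float yPoints, float scale,
                           float *xPixels, float *yPixels)
{
    *xPixels = xPoints * scale;
    *yPixels = yPoints * scale;
}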
Member
Posts: 129
Joined: 2009.03
Post: #32
Just looking in the iPhone Application Programming Guide:

Quote:An important factor when determining whether to support high-resolution content is performance. The quadrupling of pixels that occurs when you change the scale factor of your layer from 1.0 to 2.0 puts additional pressure on the fragment processor. If your application performs many per-fragment calculations, the increase in pixels may reduce your application’s frame rate.

So, assuming that 4x the number of source pixels per destination pixel is going to have an impact on performance, using a combination of @2x and @1x textures could be the way to go, at least for sprite-based games. It would be interesting to know exactly how much impact it has; I may need to do some tests at some point.
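Something like this is what I have in mind for picking between the two sets at load time. It's only a sketch; loadTexture() and the file names are made up for illustration, not a real API:

Code:
#include <OpenGLES/ES1/gl.h>

GLuint loadTexture(const char *path); /* hypothetical loader */

/* Pick the @2x asset on retina screens, the @1x asset elsewhere.
 * screenScale would come from the screen/view scale factor. */
GLuint loadSprites(float screenScale)
{
    const char *path = (screenScale >= 2.0f) ? "sprites@2x.png"
                                             : "sprites.png";
    return loadTexture(path);
}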

I'm a bit stuck (again) on the business of loading PNGs from disk, decompressing them, and copying the data into OpenGL texture buffers.

I'm using CGImageCreateWithPNGDataProvider to load in my image, and CGContextDrawImage to extract all the bits into a buffer.

After that, I optionally process the buffer (converting down to ARGB4444, etc).

Next, I use glTexImage2D to get the buffer into OpenGL.
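For reference, here's roughly what that pipeline boils down to, assuming RGBA8888 output and ES 1.1 headers; take it as a sketch rather than my exact code, with error handling and the optional ARGB4444 step left out:

Code:
#include <stdlib.h>
#include <CoreGraphics/CoreGraphics.h>
#include <OpenGLES/ES1/gl.h>

/* Decode a PNG with CoreGraphics, draw it into a known-format
 * RGBA buffer, then hand the raw bytes to OpenGL. */
GLuint textureFromPNG(const char *path)
{
    CGDataProviderRef provider = CGDataProviderCreateWithFilename(path);
    CGImageRef image = CGImageCreateWithPNGDataProvider(
        provider, NULL, false, kCGRenderingIntentDefault);
    size_t w = CGImageGetWidth(image), h = CGImageGetHeight(image);

    /* Draw into a buffer whose format we control, regardless of
     * how the PNG is stored on disk. */
    void *pixels = calloc(w * h, 4);
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels, w, h, 8, w * 4, space,
                                             kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, w, h), image);

    /* Upload the raw bytes; OpenGL only ever sees a pixel buffer. */
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)w, (GLsizei)h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    CGContextRelease(ctx);
    CGColorSpaceRelease(space);
    CGImageRelease(image);
    CGDataProviderRelease(provider);
    free(pixels);
    return tex;
}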

At what stage do I set the scale of the texture? Surely, if it's just a buffer of raw data that gets copied into OpenGL, I need to set the scale of the OpenGL texture somewhere? Is that correct?
Apprentice
Posts: 14
Joined: 2010.01
Post: #33
(Nov 19, 2010 03:13 AM)Jamie W Wrote:  At what stage do I set the scale of the texture? Surely, if it's just a buffer of raw data that gets copied into OpenGL, I need to set the scale of the OpenGL texture somewhere? Is that correct?
If you're using 0.0-1.0 ranged floats for texture coordinates, then assuming everything is in the same relative location on the 2x image as on the 1x image, you shouldn't need to change anything, i.e. 0.25f of 1024 pixels is the same relative point as 0.25f of 2048 pixels.

I'm not sure what happens if you use integers for texture coordinates. I assume you would need to add a uniform to the shader that holds the width/height of the texture, so the coordinates can be converted to 0.0-1.0 floats in the shader. If you're using ES 1.1, I'm even less sure. :)
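For what it's worth, the conversion itself is just a division; a quick sketch with made-up names:

Code:
/* Convert an integer texel coordinate to the 0.0-1.0 range OpenGL
 * expects. On a @2x texture both the texel coordinate and the
 * texture size double together, so the resulting float is unchanged. */
static float texelToFloat(int texel, int textureSize)
{
    return (float)texel / (float)textureSize;
}

/* e.g. texelToFloat(256, 1024) == texelToFloat(512, 2048) == 0.25f */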

Hoping I understood your question.

Rowan.
Member
Posts: 129
Joined: 2009.03
Post: #34
I think you understood fine, Rowan, and yeah, what you're saying makes perfect sense. I have an integer width and height, and I reference texture coords by integers, which I later scale/convert to floats. That's what counts: the float position on the texture.

*phew*

Makes sense now, thanks. :)
Sage
Posts: 1,482
Joined: 2002.09
Post: #35
@Jamie W
I think you misunderstood Apple's note. The number of source pixels doesn't matter nearly as much as the number of destination pixels. The GPU has to render each pixel of each triangle separately, so with 4x as many destination pixels it has to do 4x the amount of fragment work (480x320 = 153,600 pixels at 1x versus 960x640 = 614,400 at 2x).

I did read somewhere that the GPUs in the iPhone 4 and iPad have 3x the fill rate of the 3GS, so you don't have to be too worried.

Scott Lembcke - Howling Moon Software
Author of Chipmunk Physics - A fast and simple rigid body physics library in C.
