Texture in OpenGL ES 2 looks pixelated

Member
Posts: 25
Joined: 2011.08
Post: #1
I'm trying to translate a view-based app to OpenGL, and my images look pixelated:

Screenshot

I suspect it's because I need to add retina support or something. I tried adding [self setContentScaleFactor:2] in my GLView, but it doesn't work; it just makes the screen purple.
I've also set the min and mag filters to linear.

I'm sure it's just something stupid that I'm missing... The texture's dimensions are powers of two...

Can someone help? Thanks!!
Moderator
Posts: 1,560
Joined: 2003.10
Post: #2
(Aug 26, 2011 06:01 AM)vunterslaush Wrote:  I tried adding [self setContentScaleFactor:2] in my GLView, but it doesn't work; it just makes the screen purple.

Strange, that's pretty much what I do and it works... Maybe you're setting it in the wrong place (if there is a wrong place)? I do it first thing in -initWithFrame:. Here's my code in case it's any help: http://sacredsoftware.net/svn/misc/StemL...EAGLView.m
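
For reference, a minimal sketch of the idea (assuming a CAEAGLLayer-backed UIView subclass, and using the screen's scale rather than a hardcoded 2 so it also behaves on non-retina devices):

Code:
- (id)initWithFrame:(CGRect)frame
{
    if ((self = [super initWithFrame:frame])) {
        // First thing: match the view's scale factor to the screen's, so
        // the CAEAGLLayer's drawable gets the full pixel resolution
        [self setContentScaleFactor:[UIScreen mainScreen].scale];
        // ...then create the framebuffer/renderbuffers at this size
    }
    return self;
}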
Quote this message in a reply
Member
Posts: 25
Joined: 2011.08
Post: #3
Something is weird: my mainScreen bounds are 320 × 460. Is it supposed to be like that? It runs now, but only in the top-left quarter of the screen.
Is changing the content scale factor the only thing I'm supposed to do?

BTW, when I'm running the simulator: when my device is the normal iPhone, it shows a picture of an iPhone 4, so I thought it would simulate retina support, but it doesn't. And when I choose iPhone (Retina), it shows an iPad, but it's actually simulating a retina iPhone :| Not a big deal, but I'm wondering if it's like that for everyone or just me.

And back on topic, I'm talking about an actual device (iPhone 4)...
Member
Posts: 25
Joined: 2011.08
Post: #4
OK, fixed it: I changed glViewport and multiplied the width and height by 2.
Does that make any sense, or is my programming way off?
(BTW, it still doesn't look as good as UIImageView...)
Moderator
Posts: 1,560
Joined: 2003.10
Post: #5
(Aug 26, 2011 09:29 AM)vunterslaush Wrote:  OK, fixed it: I changed glViewport and multiplied the width and height by 2.
Does that make any sense, or is my programming way off?

Multiplying by 2 will be correct for the specific case you're writing for at the moment, but it isn't particularly future-proof, and it would mean you'd have to use different code depending on whether you're running on a retina display or not. That's fine if you're only writing for iPhone 4, but if you want your code to run on an iPad or an older iPhone (or a newer one with potentially a different resolution), you may want a more general approach. I do it like this:

Code:
    GLint backingWidth, backingHeight;
    
    // Ask GL for the actual pixel dimensions of the currently bound
    // renderbuffer instead of assuming a particular scale factor.
    // (In OpenGL ES 2 this is glGetRenderbufferParameteriv; the OES
    // suffix is only needed under ES 1.1.)
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &backingWidth);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &backingHeight);
    glViewport(0, 0, backingWidth, backingHeight);

(Aug 26, 2011 09:29 AM)vunterslaush Wrote:  (BTW, it still doesn't look as good as UIImageView...)

Another comparison screenshot, maybe? The UIImageView in your previous one looks like simple linear magnification to me, so I'd expect GL to do just as good a job of it...

(Aug 26, 2011 09:10 AM)vunterslaush Wrote:  BTW, when I'm running the simulator: when my device is the normal iPhone, it shows a picture of an iPhone 4, so I thought it would simulate retina support, but it doesn't. And when I choose iPhone (Retina), it shows an iPad, but it's actually simulating a retina iPhone :| Not a big deal, but I'm wondering if it's like that for everyone or just me.

Yeah, the simulator's window doesn't look like an iPhone in retina mode for some reason. It should be the correct size for a retina display, though, whereas the iPad simulator window is somewhat smaller with a wider aspect ratio.
Member
Posts: 25
Joined: 2011.08
Post: #6
(Aug 26, 2011 10:32 AM)ThemsAllTook Wrote:  Multiplying by 2 will be correct for the specific case you're writing for at the moment, but it isn't particularly future-proof, and it would mean you'd have to use different code depending on whether you're running on a retina display or not. That's fine if you're only writing for iPhone 4, but if you want your code to run on an iPad or an older iPhone (or a newer one with potentially a different resolution), you may want a more general approach. I do it like this: ...
Yeah, I looked at your code, thank you very much! Actually, I multiplied by [UIScreen mainScreen].scale; or should I use what you did?
I guess your way is more compatible, because my solution keeps the same width/height ratio... Well, I don't plan on developing for the iPad because I don't have one, but I'll keep that in mind. :)
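
Something like this (a sketch of the scale-based approach; assumes self is the GL view and its bounds are in points):

Code:
    CGFloat scale = [UIScreen mainScreen].scale;  // 1.0 on older devices, 2.0 on retina
    CGSize size = self.bounds.size;               // in points, e.g. 320 x 480
    // Convert points to pixels for the viewport
    glViewport(0, 0, (GLsizei)(size.width * scale), (GLsizei)(size.height * scale));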

(Aug 26, 2011 10:32 AM)ThemsAllTook Wrote:  Another comparison screenshot, maybe? The UIImageView in your previous one looks like simple linear magnification to me, so I'd expect GL to do just as good a job of it...
Of course:
Screenshot
BTW, on second thought, I enlarged the GL texture a bit so it would be more similar to the image view version, and it does look pretty much identical, except for that black outline sort of thing. How can I fix that?


BTW, ThemsAllTook, thanks a lot for everything!!! You are helping me big time, I owe you! :)
Moderator
Posts: 1,560
Joined: 2003.10
Post: #7
(Aug 26, 2011 10:59 AM)vunterslaush Wrote:  Screenshot
BTW, on second thought, I enlarged the GL texture a bit so it would be more similar to the image view version, and it does look pretty much identical, except for that black outline sort of thing. How can I fix that?

Aha, you've hit another classic problem. Short version: Your texture appears to be using premultiplied alpha; use glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) and the black halo should disappear.

Longer version: A bitmap image with premultiplied alpha has its R, G, and B components scaled by the A component, and has to be composited differently in order to be drawn without semi-transparent portions looking darker than they should. For example, 50% transparent red would be 0x7F00007F premultiplied, rather than 0xFF00007F unpremultiplied. This has some advantages, though; when pixels are interpolated due to image scaling, transparent color values mingle with opaque ones in a way that can cause black or white halos (depending on a few things) around edges in a nonpremultiplied image. Premultiplied images (when composited correctly) categorically avoid this problem, because the math works out correctly when transparent pixels blend with opaque ones.
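
To make the compositing difference concrete, here are the two blend setups side by side (standard GL calls; the comments show the resulting math):

Code:
// Straight (non-premultiplied) alpha: GL scales source RGB by alpha at blend time:
//   dst.rgb = src.rgb * src.a + dst.rgb * (1 - src.a)
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

// Premultiplied alpha: src.rgb already contains the * src.a factor,
// so the source term is used as-is:
//   dst.rgb = src.rgb + dst.rgb * (1 - src.a)
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);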

There's more information about this scattered around the forum. Here's another thread about it: http://www.idevgames.com/forums/thread-966.html

As for why your pixels are premultiplied, I presume you're loading the texture with UIImage or some other Apple API? Apple's image loading functions will typically premultiply your data for you, though in some cases you can disable it. If you want to load the image without premultiplying, you can (maybe? haven't done it in a while) change the options you use to load the image, or use libpng directly.

(Aug 26, 2011 10:59 AM)vunterslaush Wrote:  BTW, ThemsAllTook, thanks a lot for everything!!! You are helping me big time, I owe you! :)

It's my pleasure! I always enjoy helping someone who puts the proper effort into understanding things and formulating their questions as well as you have.
Member
Posts: 25
Joined: 2011.08
Post: #8
(Aug 26, 2011 11:51 AM)ThemsAllTook Wrote:  Aha, you've hit another classic problem. Short version: Your texture appears to be using premultiplied alpha; use glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) and the black halo should disappear.

Thanks! That solved it. :)

(Aug 26, 2011 11:51 AM)ThemsAllTook Wrote:  As for why your pixels are premultiplied, I presume you're loading the texture with UIImage or some other Apple API? Apple's image loading functions will typically premultiply your data for you, though in some cases you can disable it. If you want to load the image without premultiplying, you can (maybe? haven't done it in a while) change the options you use to load the image, or use libpng directly.

Yeah, I'm using UIImage and CGBitmapContext. (Everything is copied and pasted from here.)
Should I bother with it? Why do I care whether my images are premultiplied if they're shown correctly? What are the benefits?

One last question, which is probably more related to my previous thread:
If I manually sort my objects/sprites by depth and then call glDrawWhatever in that order, can I ditch the depth test and the depth buffer?
Moderator
Posts: 1,560
Joined: 2003.10
Post: #9
(Aug 26, 2011 12:15 PM)vunterslaush Wrote:  Should I bother with it? Why do I care whether my images are premultiplied if they're shown correctly? What are the benefits?

In the vast majority of cases, it doesn't matter; for loading images you want to display with OpenGL, premultiplied alpha is almost always what you want. The only situations I can think of where you'd really need nonpremultiplied alpha are if you're encoding data into a bitmap format for something other than display (say, a terrain map of some sort) and need all four channels to do so, or if you're editing the image and you need to preserve colors that might be destroyed by 0 alpha. Premultiplication is technically a lossy operation, so it's good to be aware of what it is and does, but for simply displaying the image, the lossiness isn't relevant.
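
A quick illustration of that lossiness with 8-bit channels (the specific values here are mine, just for illustration):

Code:
unsigned char r = 200, a = 3;                 // straight-alpha pixel
unsigned char premult = r * a / 255;          // 600 / 255 = 2, rounding down
unsigned char recovered = premult * 255 / a;  // 510 / 3 = 170, not the original 200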

(Aug 26, 2011 12:15 PM)vunterslaush Wrote:  One last question, which is probably more related to my previous thread:
If I manually sort my objects/sprites by depth and then call glDrawWhatever in that order, can I ditch the depth test and the depth buffer?

Yes indeed! Without the depth buffer/test, GL will simply draw newer stuff on top.
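
For example, a minimal sketch of that draw order (the Sprite type and drawSprite function are hypothetical stand-ins for your own sprite code):

Code:
#include <stdlib.h>

typedef struct { float depth; /* plus position, texture, etc. */ } Sprite;

void drawSprite(const Sprite *sprite);  // hypothetical: your per-sprite GL draw call

static int compareDepth(const void *a, const void *b)
{
    float da = ((const Sprite *) a)->depth, db = ((const Sprite *) b)->depth;
    return (db > da) - (db < da);  // descending: farthest first (assumes larger depth = farther)
}

static void drawScene(Sprite *sprites, size_t count)
{
    glDisable(GL_DEPTH_TEST);  // no depth buffer needed
    qsort(sprites, count, sizeof(Sprite), compareDepth);
    for (size_t i = 0; i < count; i++) {
        drawSprite(&sprites[i]);  // back to front; each draw lands on top of the last
    }
}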
Member
Posts: 25
Joined: 2011.08
Post: #10
New questions:
I've started a new project from Apple's OpenGL template; does it automatically support both retina and non-retina displays?
And another question: I edited the shaders to support textures, and I'm playing around with them to check that everything is OK, and of course it isn't. I have two quads with the same texture (a grayscale shaded ball). One sets all its vertex colors to green; the other ball is red, but all the alpha values of its corner vertices are 0, so I expected the red ball not to show at all. However, this is what I get:
Screenshot

I'm using glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) and glEnable(GL_BLEND). These are my shaders:

Vertex shader:
Code:
attribute vec4 position;
attribute vec4 color;

varying vec4 colorVarying;
attribute vec2 texCoordIn;
varying vec2 texCoordOut;


void main()
{
    gl_Position = position;
    gl_Position.x *= 0.33/0.5; // This fixes the x/y ratio; it looked stretched because I'm using landscape. Temporary until I think of a better solution
    texCoordOut = texCoordIn;
    colorVarying = color;
}

Fragment shader:
Code:
varying highp vec4 colorVarying;
varying highp vec2 texCoordOut;

uniform sampler2D texture;

void main()
{
    gl_FragColor = colorVarying * texture2D(texture, texCoordOut);
}

I think that's what's relevant, but if more info is needed I'll add it.

Thanks!
Luminary
Posts: 5,143
Joined: 2002.04
Post: #11
ONE, ONE_MINUS_SRC_ALPHA is a premultiplied blend, but your vertex colors are not premultiplied.

Code:
colorVarying = vec4(color.rgb * color.a, color.a);
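
Applied to the vertex shader quoted above, that replaces the colorVarying = color; line:

Code:
void main()
{
    gl_Position = position;
    gl_Position.x *= 0.33/0.5;
    texCoordOut = texCoordIn;
    // Premultiply so the vertex color matches the
    // GL_ONE, GL_ONE_MINUS_SRC_ALPHA premultiplied blend mode
    colorVarying = vec4(color.rgb * color.a, color.a);
}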
Member
Posts: 25
Joined: 2011.08
Post: #12
(Aug 30, 2011 09:22 AM)OneSadCookie Wrote:  ONE, ONE_MINUS_SRC_ALPHA is a premultiplied blend, but your vertex colors are not premultiplied.

Code:
colorVarying = vec4(color.rgb * color.a, color.a);

I added that to the vertex shader and it worked! Thanks!
Just so I understand: is that OK? When I'm loading the textures, I'm premultiplying the alpha, so is the line you wrote the standard thing to do to make everything work? Or am I doing something wrong that I should change?
Luminary
Posts: 5,143
Joined: 2002.04
Post: #13
Actually submitting premultiplied vertex colors (and thereby saving a couple of instructions in the vertex shader) would be better... but it's not terrible.

Either way, you need to decide on premultiplied or not, and do everything in that mode.
Member
Posts: 25
Joined: 2011.08
Post: #14
(Aug 30, 2011 09:46 AM)OneSadCookie Wrote:  Actually submitting premultiplied vertex colors (and thereby saving a couple of instructions in the vertex shader) would be better... but it's not terrible.

Either way, you need to decide on premultiplied or not, and do everything in that mode.

How do I submit premultiplied vertex colors? I thought I was premultiplying my textures when I load them...

Here's my texture loading code:
Code:
- (GLuint)setupTexture:(NSString *)fileName
{
    CGImageRef spriteImage = [UIImage imageNamed:fileName].CGImage;
    if (!spriteImage) {
        NSLog(@"Failed to load image %@", fileName);
        exit(1);
    }
    size_t width = CGImageGetWidth(spriteImage);
    size_t height = CGImageGetHeight(spriteImage);
    
    // Round the dimensions up to the next powers of two
    size_t wpot = 1, hpot = 1;
    while (wpot < width) wpot *= 2;
    while (hpot < height) hpot *= 2;
    
    // kCGImageAlphaPremultipliedLast is where the premultiplication
    // happens, hence the GL_ONE, GL_ONE_MINUS_SRC_ALPHA blend mode
    GLubyte *spriteData = (GLubyte *) calloc(wpot * hpot * 4, sizeof(GLubyte));
    CGContextRef spriteContext = CGBitmapContextCreate(spriteData, wpot, hpot, 8, wpot * 4, CGImageGetColorSpace(spriteImage), kCGImageAlphaPremultipliedLast);
    
    // Scale the image to fill the whole power-of-two canvas, so full
    // 0..1 texture coordinates map onto it without a one-pixel offset
    CGContextDrawImage(spriteContext, CGRectMake(0, 0, wpot, hpot), spriteImage);
    CGContextRelease(spriteContext);
    
    GLuint texName;
    glGenTextures(1, &texName);
    glBindTexture(GL_TEXTURE_2D, texName);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, wpot, hpot, 0, GL_RGBA, GL_UNSIGNED_BYTE, spriteData);
    free(spriteData);
    return texName;
}
Moderator
Posts: 1,560
Joined: 2003.10
Post: #15
(Aug 30, 2011 11:25 AM)vunterslaush Wrote:  How do I submit premultiplied vertex colors?

I think he means the color values you submit should already be premultiplied, i.e. (r*a, g*a, b*a, a) instead of (r, g, b, a), while you're using GL_ONE, GL_ONE_MINUS_SRC_ALPHA. In ES 2 that means premultiplying the values in your color vertex attribute before they reach the shader.
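
A minimal sketch of that on the CPU side, assuming an interleaved vertex array with float RGBA colors (the Vertex struct and field names are hypothetical):

Code:
typedef struct {
    float x, y;        // position
    float r, g, b, a;  // straight (non-premultiplied) color, as authored
    float s, t;        // texture coordinates
} Vertex;

// Premultiply once, before handing the array to glVertexAttribPointer
// or glBufferData, so the vertex shader can pass the color straight through
static void premultiplyVertexColors(Vertex *vertices, unsigned int count)
{
    for (unsigned int i = 0; i < count; i++) {
        vertices[i].r *= vertices[i].a;
        vertices[i].g *= vertices[i].a;
        vertices[i].b *= vertices[i].a;
    }
}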