How do you specify a texture image with internal format RGBA2?

This is how I would think it should be done. Creating a red texture:

        int width = 64;
        int height = 64;
        int size = width * height;
        byte red = (byte) 0b11_00_00_11; // 2-bit per channel
        ByteBuffer buffer = MemoryUtil.memAlloc(size);
        for (int p = 0; p < size; p++) buffer.put(red);
        texture = glGenTextures();
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA2, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, buffer.flip());

and in the fragment shader:

// The RGBA format and type UNSIGNED_BYTE should normalize the color values to [0, 1]
out_color = texture(texture, uv);

But the output is clearly wrong: the window shows different colors on every program execution.

I provide 1 byte per pixel. Is this wrong? I thought that was the point.

> Solution:

> I provide 1 byte per pixel. Is this wrong?

Yes. The type GL_UNSIGNED_BYTE together with the format GL_RGBA means 1 byte per color channel, i.e. 4 bytes per pixel. The internal format (GL_RGBA2) only tells OpenGL how to store the texture on the GPU; the format and type arguments describe the layout of the source data you provide. You can use GL_UNSIGNED_SHORT_4_4_4_4 to reduce the source image to 16 bits per pixel, but there is nothing like GL_UNSIGNED_BYTE_2_2_2_2. This means you cannot upload a 2-2-2-2 packed image directly into a texture with the internal format GL_RGBA2.
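A minimal sketch of the corrected buffer allocation, using plain `java.nio` instead of LWJGL's `MemoryUtil` so the packing logic stands on its own (the class and method names here are just for illustration):

```java
import java.nio.ByteBuffer;

public class RgbaBuffer {
    // Builds a solid red image for GL_RGBA / GL_UNSIGNED_BYTE:
    // 1 byte per channel, so 4 bytes per pixel, not 1.
    public static ByteBuffer redRgba(int width, int height) {
        int size = width * height;
        ByteBuffer buffer = ByteBuffer.allocateDirect(size * 4);
        for (int p = 0; p < size; p++) {
            buffer.put((byte) 0xFF); // R
            buffer.put((byte) 0x00); // G
            buffer.put((byte) 0x00); // B
            buffer.put((byte) 0xFF); // A
        }
        buffer.flip();
        return buffer;
    }

    public static void main(String[] args) {
        // 64 x 64 pixels x 4 bytes = 16384 bytes
        System.out.println(redRgba(64, 64).remaining()); // prints 16384
    }
}
```

This buffer can then be handed to `glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA2, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, buffer)`; the driver converts the 8-bit channels down to 2 bits per channel when storing the texture.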

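If a smaller source image matters, GL_UNSIGNED_SHORT_4_4_4_4 packs all four channels into one 16-bit value, with the first component (R) in the highest 4 bits. A sketch of the packing, again with illustrative helper names and plain `java.nio`; note the native byte order, which is what OpenGL expects when reading shorts from client memory:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class Rgba4444Packer {
    // Pack four 8-bit channels into one GL_UNSIGNED_SHORT_4_4_4_4 value:
    // R in bits 12-15, G in 8-11, B in 4-7, A in 0-3.
    public static short pack4444(int r, int g, int b, int a) {
        return (short) (((r >> 4) << 12) | ((g >> 4) << 8) | ((b >> 4) << 4) | (a >> 4));
    }

    public static ByteBuffer redRgba4444(int width, int height) {
        int size = width * height;
        ByteBuffer buffer = ByteBuffer.allocateDirect(size * 2) // 2 bytes per pixel
                .order(ByteOrder.nativeOrder());                 // GL reads native-order shorts
        short red = pack4444(255, 0, 0, 255); // 0xF00F
        for (int p = 0; p < size; p++) buffer.putShort(red);
        buffer.flip();
        return buffer;
    }

    public static void main(String[] args) {
        System.out.printf("0x%04X%n", redRgba4444(1, 1).getShort(0) & 0xFFFF); // prints 0xF00F
    }
}
```

The upload then uses `glTexImage2D(..., GL_RGBA2, width, height, 0, GL_RGBA, GL_UNSIGNED_SHORT_4_4_4_4, buffer)`: still not 2-2-2-2, but half the size of the 8-bit-per-channel version.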