show 8bit images on 16bit screen? - Programmers Heaven


show 8bit images on 16bit screen?

PAG Posts: 168 Member
Help me, I need to know how to display 8-bit images in a 16-bit screen mode...


  • Mutilate Posts: 22 Member
    : Help me, I need to know how to display 8-bit images in a 16-bit screen mode...

    I'll try to help from what I know, but I can't give you a complete solution.
    Going from 8-bit to 16-bit can be a problem because the two formats mean completely different things:
    16-bit means you have one 16-bit integer per pixel, where the 5 most significant bits are red (0-31), the next 6 are green (0-63), and the last 5 are blue (0-31).
    Bit/Color: F/R, E/R, D/R, C/R, B/R, A/G, 9/G, 8/G, 7/G, 6/G, 5/G, 4/B, 3/B, 2/B, 1/B, 0/B, or RRRRRGGGGGGBBBBB
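    The layout above can be sketched in C; `pack_rgb565` is a made-up helper name, not from any particular API:

```c
#include <stdint.h>

/* Pack 5-bit red, 6-bit green, 5-bit blue into one 16-bit pixel
 * (layout RRRRRGGGGGGBBBBB, i.e. red in bits F-B, green in A-5,
 * blue in 4-0, as described above). */
uint16_t pack_rgb565(uint8_t r5, uint8_t g6, uint8_t b5)
{
    return (uint16_t)(((r5 & 0x1F) << 11) | ((g6 & 0x3F) << 5) | (b5 & 0x1F));
}
```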

    But 8-bit means you have one char (0-255) per pixel, which is an index into a palette (Color0, Color1, etc.).
    The palette's location and size vary; it often uses 12 bits per entry (4 bits for red, 4 for green and 4 for blue), and the palette colors are usually chosen per image, so that its 256 colors approximate the original image as well as possible. All this information should be in the image's header, or the image may use the standard Windows palette.
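    The indexing idea can be sketched like this, assuming the palette has already been converted to 16-bit entries (the function name is hypothetical):

```c
#include <stdint.h>
#include <stddef.h>

/* Each source byte is an index into a 256-entry palette that has
 * already been converted to 16-bit pixels, so the whole blit is
 * just one table lookup per pixel. */
void blit_8bit_to_16bit(const uint8_t *src, uint16_t *dst,
                        size_t npixels, const uint16_t palette[256])
{
    for (size_t i = 0; i < npixels; i++)
        dst[i] = palette[src[i]];
}
```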

    What you have to do is find where the image's palette is and what format it has, and then convert each palette entry into the 16-bit format:
    Case 12 bit:
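    The post breaks off here, but one common way to handle the 12-bit case is to expand each 4-bit channel by shifting left and replicating the top bits, so 0xF still maps to full intensity. A sketch, assuming each 12-bit palette entry is packed as 0xRGB (the helper name is made up):

```c
#include <stdint.h>

/* Expand one 12-bit palette entry (assumed packed as 0xRGB, 4 bits
 * per channel) into a 16-bit RGB565 pixel. Bit replication maps
 * 0x0 -> 0 and 0xF -> full intensity in the wider channel. */
uint16_t pal12_to_rgb565(uint16_t entry)
{
    uint8_t r4 = (entry >> 8) & 0xF;
    uint8_t g4 = (entry >> 4) & 0xF;
    uint8_t b4 = entry & 0xF;
    uint8_t r5 = (uint8_t)((r4 << 1) | (r4 >> 3)); /* 4 -> 5 bits */
    uint8_t g6 = (uint8_t)((g4 << 2) | (g4 >> 2)); /* 4 -> 6 bits */
    uint8_t b5 = (uint8_t)((b4 << 1) | (b4 >> 3)); /* 4 -> 5 bits */
    return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
}
```

    Running this once per palette entry gives the 256-entry 16-bit lookup table; after that, drawing the image is a plain per-pixel table lookup.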
