Hi there - quick problem that is doing my nut in.
I've loaded a TGA file into an array of short ints - no problem there. TGAs apparently store their 16-bit pixels least significant byte first, then most significant byte, so I run through the array swapping the bytes. The next bit, which seems to be cocking up, is shuffling the bits around into a PSP texture, which is then drawn using the code from the Blit sample.
I've expanded the code to make it clearer:
    value      = bitmap[loop];
    blue_mask  = (value & 0x001F) << 10;  /* TGA blue (bits 0-4)  -> bits 10-14 */
    red_mask   = (value & 0x7C00) >> 10;  /* TGA red (bits 10-14) -> bits 0-4   */
    green_mask = (value & 0x03E0);        /* green (bits 5-9) stays put         */
    result     = red_mask | blue_mask | green_mask;
I'm not bothered by the transparency bit at this point.
I believe 16-bit TGA pixels are ARRRRRGGGGGBBBBB (after the bytes have been swapped) and the PSP format is ABBBBBGGGGGRRRRR, although there is some confusion on this point, as the Sony defines call it 5551.
So... why do shades of blue and red work perfectly with the above code, but anything containing green screws up? Not by much, but clearly something is amiss. Any further info would be appreciated - e.g. do some bits of PSP code use 5551 and others 1555? I'm storing the short pixels in an array, copying them into the 512 by 512 texture Blit uses, and letting its code blit them to the back buffer, presumably on a poly.
Cheers,
Robin