Flickering / missing single-pixel lines/dots?

Discuss the development of new homebrew software, tools and libraries.

Moderators: cheriff, TyRaNiD

skeezixcodejedi
Posts: 29
Joined: Tue Aug 30, 2005 10:37 am

Flickering / missing single-pixel lines/dots?

Post by skeezixcodejedi »

The emu is running along well so I expect my blitting code is working as expected (it's essentially derived from the blit sample in the SDK). For a couple of days I've been trying to bring over some UI code I've been using on a couple of other platforms, but it always renders as a total mess.. the code is known good, and even when reduced to the very basics it still futzes up.

Ex:

Draw a filled rectangle most of the screen size; works every time.

Draw a 1 pixel tall line across the screen, just inside the box, and another just above the box. The one in the box and the one out of the box will both be missing some parts.. but different parts. Moving the line up or down the screen makes it miss different parts... though things drawn near the bottom right seem to miss more than things in the top left. It's not a simple unsigned char where it should be unsigned int issue.. this code works fine on half a dozen platforms, and the kicker.. it works fine for thick lines on the PSP, more or less.

ie: Thicken the lines up to 10 pixels and they show up pretty well, though still with a little distortion near the bottom right. Reduce them to 1 or 3 pixels, and piles of parts go missing. Needless to say, rendering fonts and such is a total mess, with most of the pixels just not showing.

I'm relatively new to OpenGL-like systems, so I'm betting it's something like the GPU deciding to average out the texture and dropping the pixels.. but it'd make more sense to me if it dropped the whole line, and not just a few bits of it.

Ex:

I'll draw a line and it'll show up like this...

--------------------------- ------------------- ----- ---- - - -

What's very strange is that I can redraw it over and over, and the missing parts might change here or there every half second or so.. though in general they stay constant.

ie:

while ( running ) {
    render-line-and-swap-buffers
}

For debugging I've tossed out the Bresenham line draw and reduced it to a ridiculously stupid line draw.. to no avail:

void render_line_v ( display16_t *disp, UInt16 c,
                     UInt32 x1, UInt32 y1, UInt32 h )
{
    UInt32 x = disp -> pitch;
    x *= ( (UInt32) y1 );
    x += ( (UInt32) x1 );
    //UInt16 *cursor = disp -> vram + ( disp -> pitch * y1 ) + x1;
    UInt16 *cursor = disp -> vram + x;
    while ( h ) {
        *cursor = c;
        cursor += disp -> pitch;
        h--;
    }
    return;
}

I've got others, but they _all_ have the same effect and the line draw code is known good.

The 'blit' function I'm testing with is here (I've got numerous variations, with pre-set vertex lists and such, but since this is damaging my mind I'm keeping this one, since it's dumb and close to the blit sample):

void cj_psp_2d_blit ( unsigned short int *pixels ) {
    unsigned int j;

    // begin logging GPU commands to our command buffer
    sceGuStart ( GU_DIRECT, gpu_list );

    // set up the source buffer as a 512x512 texture, but only copy 480x272
    sceGuTexMode ( GU_PSM_4444, 0, 0, 0 );
    sceGuTexImage ( 0, 512, 512, 512, pixels );
    sceGuTexFunc ( GU_TFX_REPLACE, GU_TCC_RGB );
    sceGuTexFilter ( GU_NEAREST, GU_NEAREST );
    sceGuTexScale ( 1.0f/512.0f, 1.0f/512.0f ); // scale UVs to 0..1
    sceGuTexOffset ( 0.0f, 0.0f );
    sceGuAmbientColor ( 0xffffffff );

    // do a striped blit (takes the page-cache into account)
    unsigned char stripe = 0;
    for ( j = 0; j < 480; j = j + SLICE_SIZE ) {
        vertex_t *vertices = g_vertices + ( 2 * stripe );
        //vertex_t *vertices = (vertex_t*)sceGuGetMemory(2 * sizeof(vertex_t));

        vertices[0].u = j;              vertices[0].v = 0;
        vertices[0].color = 0;
        vertices[0].x = j;              vertices[0].y = 0;   vertices[0].z = 0;
        vertices[1].u = j + SLICE_SIZE; vertices[1].v = 272;
        vertices[1].color = 0;
        vertices[1].x = j + SLICE_SIZE; vertices[1].y = 272; vertices[1].z = 0;

        sceGuDrawArray ( GU_SPRITES,
                         GU_TEXTURE_16BIT | GU_COLOR_4444 | GU_VERTEX_16BIT |
                         GU_TRANSFORM_2D, 2, 0, vertices );

        stripe++;
    } // for

    // wrap up the rendering pipe
    sceGuFinish();
    sceGuSync(0,0);

    // sceDisplayWaitVblankStart();
    sceGuSwapBuffers();

    return;
}

Am I doing anything obviously wrong in the blit routine that would cause the GPU to drop out single pixels or thin lines?

jeff
--
Have you played Atari today?
ector
Posts: 195
Joined: Thu May 12, 2005 10:22 pm

Post by ector »

Yup, you don't flush your vertex data out of the CPU cache, so the gfx chip can't see all of it in memory properly.
http://www.dtek.chalmers.se/~tronic/PSPTexTool.zip Free texture converter for PSP with source. More to come.
holger
Posts: 204
Joined: Thu Aug 18, 2005 10:57 am

Post by holger »

If you are writing to VRAM you either need to use uncached addresses or flush the CPU DCache before kicking off GE commands again; otherwise it's not guaranteed that all writes have arrived in VRAM when you swap buffers.
skeezixcodejedi
Posts: 29
Joined: Tue Aug 30, 2005 10:37 am

Post by skeezixcodejedi »

Oh baby.. some success! (after a couple of days this is a big thing :)

Where is the proper place to do this? I had dropped a couple of calls in there before but it didn't help.. so the trick is to drop them right after each call to sceGuDrawArray()?

Which one.. the writeback one, or the writeback-and-invalidate one?

Thanks guys!

jeff
--
Have you played Atari today?
holger
Posts: 204
Joined: Thu Aug 18, 2005 10:57 am

Post by holger »

You need to synchronize the accesses done by the CPU and the GE. Once you have started the GE, you have to wait for completion (and until it has flushed all pending VRAM accesses) before accessing VRAM from the CPU. When you have rendered using CPU memory accesses, you need to flush the caches before you restart the GE.
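In pseudocode, the per-frame ordering this describes (using the pspsdk function names, and assuming double buffering) would be roughly:

```
per frame:
    sceGuSync(0, 0)                 // wait until the GE has finished the
                                    // previous list and its VRAM writes
    ... CPU renders into the draw buffer ...
    sceKernelDcacheWritebackAll()   // push cached CPU writes to memory
    sceGuStart(...)                 // now it is safe to queue GE commands
    ... sceGuDrawArray etc. ...
    sceGuFinish()
    sceGuSwapBuffers()
```

The point is simply that the flush sits between the last CPU write and the next GE kick, not sprinkled arbitrarily through the frame.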
skeezixcodejedi
Posts: 29
Joined: Tue Aug 30, 2005 10:37 am

Post by skeezixcodejedi »

Makes perfect sense in retrospect :) Thanks for the tips my friends,

jeff
--
Have you played Atari today?