OSSC optimized modes for 240p game ports?


Viewing 5 posts - 1 through 5 (of 5 total)

    I’ve tried using my OSSC with various 240p ports, such as the Mega Man X Collection on PS2 and the Wii’s Virtual Console. It generally works, but when I try the optimized sampling modes I can never seem to dial in a proper sampling phase, whereas I’ve gotten them to work with original hardware. Does anyone have experience with this kind of thing?


    I don’t know about the PS2, but I was testing this on the Wii a couple of years ago. There, 240p always outputs as a standard 263-line frame at 59.82 Hz, which means the optimal sampling rate is 429 samples per line (half of the standard 858 used for 480i/480p). In other words, the console is prescaling the games to this output timing.
    Therefore, optimal timing for 256px-wide games is out of the question, though you might still get decent results with 320px-wide content: its native 426–427 sample timing is close enough to 429 that pixel interpolation will be minimal.
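    The arithmetic above can be sketched like this (my own illustration; 341 and ~427 are the usual per-line sample totals for 256px- and 320px-wide 240p consoles, not figures from this thread):

    ```python
    # The Wii outputs 240p on a fixed 429-sample line: half the standard
    # 858-sample total used for 480i/480p (BT.601 timing).
    wii_total = 858 / 2  # 429 samples per line

    # Typical native per-line sample totals for 240p sources (assumed values):
    native = {
        "256px-wide (e.g. SNES, 341 samples/line)": 341,
        "320px-wide (e.g. Genesis H40, ~427 samples/line)": 427,
    }

    for name, n in native.items():
        ratio = wii_total / n
        note = "near 1:1, minimal interpolation" if abs(ratio - 1) < 0.01 \
               else "heavy interpolation"
        print(f"{name}: resampling ratio {ratio:.4f} ({note})")
    ```

    The 320px case lands within half a percent of 1:1, which is why it can still look decent, while the 256px case is resampled by ~26% and can never line up sample-for-sample.
    
    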


    I recently got the Mega Man X Collection for the GameCube, and I tried to play it on the Wii. I have a special adapter that outputs HDMI from the Wii, but it stopped working once the games were launched (240p). I then used the (original) component cable to feed the signal through the OSSC, but it didn’t look that great.

    I tried the 240p test suite for the Wii, and 429 is indeed the proper sampling for it. I had to adjust it a bit (to 429.15) to get a signal. The game, however, did not look right with that sampling.

    I then ran the game in Dolphin to see if I could find the actual resolution that way. With the software renderer I get screenshots at 256×240 (1:1). With the OpenGL renderer I get screenshots at 584×480 (the “pixels” are no longer square and are bigger than an actual pixel in the image). It seems safe to conclude that the actual output is indeed 256×240. The OSSC does show 263p.

    I use that output (256×240) for the SNES, and it looks great there. I tried a sampling rate of 342, as used in the optimal settings for the GameCube, but it didn’t make much of a difference compared to the standard 341. Mega Man X has a small checkerboard pattern (just two pixels high) in the sky of the very first level; I used that pattern to find the right sampling and arrived at 345. Any sampling phase between 90 and 180 looks about the same, but I just can’t get a perfect image. There’s still some bleeding (if that’s the right word). I can make it worse, but not better (with sampling phase or LPF filter settings).

    This is a screenshot from Dolphin:

    This is a picture of the game running on my TV through OSSC with 345 sampling:

    This is a closeup where the bleeding can be seen:

    Do you have any suggestions on how I can further improve the image quality?


    This is a limitation of the GameCube and Wii: they subsample the color to 4:2:2 internally and then filter it for output, so there will always be some color bleeding. They may also be scaling the picture up from 256 to 640 pixels rather than 512, which causes additional bleeding in bands across the picture even when the OSSC’s sample rate matches the game’s native resolution.
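    As a rough illustration of why 4:2:2 chroma subsampling bleeds on fine patterns (a minimal toy model, not the console’s actual filter, which also low-passes the chroma rather than simply averaging it):

    ```python
    # Toy model of horizontal 4:2:2 chroma subsampling (pair averaging).
    # A 1px checkerboard alternates color every pixel, so its chroma
    # channel alternates too; keeping only one chroma sample per pair
    # of pixels wipes the color detail out entirely.
    cb_channel = [100, -100] * 4  # Cb of a 1px-wide checkerboard row

    # one chroma sample per pixel pair (here, the pair average)
    subsampled = [(cb_channel[i] + cb_channel[i + 1]) // 2
                  for i in range(0, len(cb_channel), 2)]
    print(subsampled)  # [0, 0, 0, 0] -- the alternating color is gone
    ```

    Luma is kept at full resolution, which is why the checkerboard’s brightness pattern can still resolve cleanly while the colors smear into each other.
    
    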


    That’s very unfortunate, but it does explain a lot. I’ll have to take a more careful look at the 240p test suite again. I don’t remember noticing any color bleeding there, but I wasn’t looking for it either.

    It does seem to be related to 4:2:2 in this case rather than to scaling, since the checkerboard pattern in the sky looks fine all the way across after adjusting the sampling. Now I know where to look further. Thanks.
