Reply To: Getting CRT-like brightness from original Xbox to LCD monitor


#53267
TrantaLocked
Participant

    Here are some key points about the various issues I’ve had and my current solutions to them:

    -Displays handle the 720×480 Xbox signal in wildly different ways, so the optimal sampling settings differ per display. On my LG C1, adjusting h. and v. active is the most effective way to stretch the image, while on my old Samsung 2494 monitor I primarily changed the sample rate. In practice I end up using all of these settings depending on which game I’m calibrating for.

    -Modern 16:9 TVs usually block the side edges of the picture even when the aspect ratio is set to native/original; PC monitors don’t have this issue. On my LG C1, even if I change the sampling settings to calibrate for the full 1.363 frame, the sides are cut off by forced 4:3 borders, even in original aspect ratio mode! It’s annoying, but that’s how it is on both the LG C1 and my older Sharp LC-39 1080p TV, whereas I CAN see the full width on my PC monitors. I’m not referring to the 1.5 aspect ratio of the unprocessed 720×480 signal; it’s correct for a TV to squeeze the signal width, at least in 4:3 mode, but it’s not correct to force 4:3 borders, especially in an original aspect ratio mode where I want to see that small extra width for games that use the full signal width.

    -The visible frame aspect ratio differs by game, as most use less than the entire 720×480 signal; however, a calibration done correctly once will give the correct pixel aspect ratio for almost all games and scenarios. Timings are not necessarily compatible between different consoles, and there are cases where games that don’t officially support 480p, run as 480p via GSM on a modded PS2, behave differently per scene. In Ace Combat Zero: The Belkan War, for example, a single sampling profile calibrated for in-engine gameplay results in a squeezed pixel aspect ratio in some of the mission briefing scenes.

    As for exact ratios, the full visible frame of Burnout 3 on Xbox should be 1.33, while Ninja Gaiden’s, which uses the entire signal width and height, is 1.363. If you just want one OSSC profile, a good rule is to calibrate against Burnout 3 or a similar 1.33 game that has a border around the visible picture: fill the visible frame to a 4:3 space, which you may need to physically measure, or use the boundaries of your TV’s 4:3 masking pillars if it has them. You can also use a game with perfect circles for measuring; I suggest Halo 2 with its radar, crosshair and BR scope. Other full-signal games will then show a slight overscan and the correct pixel aspect ratio, as a CRT would, mimicking the original experience.

    I tend to keep one profile for full-height games and one for reduced-height games. In some games the width of UI elements is adjusted between 4:3 and 16:9 to accommodate both modes, so as a sanity check it’s good to reference perfect circles and squares that are actually rendered in-game, in case the UI happens to be unreliable.
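
    For anyone who wants to check those numbers, they fall straight out of standard 480p geometry. A quick back-of-envelope sketch (my own arithmetic, assuming the usual convention that a 704-wide active area maps to 4:3):

```python
# Back-of-envelope check of the visible-frame aspect ratios mentioned above.
# Assumption: a 704x480 active area maps to 4:3 on a correctly set-up
# display, the usual convention for 480p over component.
ref_width, height = 704, 480
par = (4 / 3) / (ref_width / height)     # pixel aspect ratio, ~0.909

# A game using the full 720-wide signal (e.g. Ninja Gaiden):
full_signal = (720 / height) * par       # ~1.3636

# A game whose visible picture fits the 704-wide reference (e.g. Burnout 3):
reference = (ref_width / height) * par   # exactly 4/3

print(round(full_signal, 3), round(reference, 3))
```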

    -Your display’s real aspect ratio is not necessarily a perfect 1.77, and this can vary by a surprising margin between displays. My LG C1 measures exactly 1.77, but my other displays measure as high as 1.79! The difference is indeed noticeable in side-by-side comparisons. As a result, sample rate and/or active window settings painstakingly tuned for a perfect pixel aspect ratio on one display will not necessarily translate correctly to a different display!
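
    To put a rough number on how much this matters, here is a sketch of the stretch error from reusing a profile tuned on a nominal 16:9 panel on a 1.79 panel. This assumes both panels scale the active window to fill their full physical width, so the panel aspect difference becomes pure horizontal stretch:

```python
# Rough pixel-aspect-ratio drift when an OSSC profile tuned on one display
# is reused on another. Assumption (mine): both panels scale the active
# window to fill their full physical width.
calibrated_ar = 1.778   # panel the profile was tuned on (nominal 16:9)
other_ar = 1.79         # second panel's measured physical aspect ratio

stretch_error = other_ar / calibrated_ar
print(f"horizontal stretch error: {(stretch_error - 1) * 100:.2f}%")
# Across ~720 samples, that is on the order of 5 sample widths of extra
# width, which is enough to see in a side-by-side circle test.
```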

    -I still strongly suspect there is something wrong with the way some of the Xbox Tony Hawk games’ brightness levels were mastered, or at least how they are output (**see my update on this below**). I’m waiting on a PS2 to inspect THPS4 and THUG and see the differences between the versions, but otherwise my hypothesis is still that Xbox games (at least over component, maybe not composite) seem to want more brightness than standard SDR content to look “correct.” It’s like they expected people with Xboxes and component cables to have fancy bright screens, so they output in “HDR-lite” for that use case. You usually want at least 150-200 nits, if not more depending on the game, with G/Y gain at 39 and pre-adc gain at 8 in the OSSC settings. For Burnout 3 on an LG C1, the sweet spot in a dark room is 65 to 75 pixel brightness, with 85 contrast and 50 brightness. But I would say the absolute minimum is 60 before games start looking unplayably dark.
    **Update**: I saw that the PS2 version of THPS4 is brighter than the Xbox version, though this compares the PS2 480i signal to the Xbox 480p signal. I did not see the same white clipping as in the video I linked earlier, but either way, the PS2 version still appears brighter than the Xbox version. It makes sense that it felt weird for me to play the Xbox version, as I grew up with the PS2 version.

    -If a game looks too yellow, you can try the color temperature setting on your display or the B/Pb offset in the OSSC settings. B/Pb gain is not optimal because it’s meant more for fixing problems with the console’s output: due to the way YCbCr works, it extends gain toward both blue and yellow. The color temperature method works best on my LG C1, and Warm 30 tends to look quite good. This isn’t usually going to be a problem, but some games are mastered to compensate for a slightly cool display.
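
    To see why gain pushes both directions while offset doesn’t, here is a small sketch using the standard BT.601 YCbCr-to-RGB conversion (the 1.2 gain value is just an illustration, not an actual OSSC setting):

```python
# In YCbCr, Cb encodes the blue-yellow axis as a signed deviation from
# neutral (128). Gain scales that deviation in BOTH directions; an offset
# shifts the whole axis one way, which is what fixes a one-sided tint.
# BT.601 conversion (full-range approximation):
def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402 * (cr - 128)
    g = y - 0.344 * (cb - 128) - 0.714 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return r, g, b

neutral   = ycbcr_to_rgb(128, 128, 128)  # grey stays grey at any Cb gain
bluish    = ycbcr_to_rgb(128, 160, 128)  # Cb above neutral -> blue cast
yellowish = ycbcr_to_rgb(128, 96, 128)   # Cb below neutral -> yellow cast

gain = 1.2  # illustrative Cb gain
bluish_gained    = ycbcr_to_rgb(128, 128 + gain * (160 - 128), 128)
yellowish_gained = ycbcr_to_rgb(128, 128 + gain * (96 - 128), 128)
# Gain amplified the blue cast AND the yellow cast at once, so it can't
# correct an image that only leans yellow; offset (or the display's color
# temperature control) can.
```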

    The OSSC is still an amazing device and one of my favorite pieces of gaming hardware. I decided not to install the Xbox HDMI mod because I don’t want to lose all of the picture control I have with the OSSC (for which I bought a set of Xbox Monster cables!).