Getting CRT-like brightness from original Xbox to LCD monitor



  • #46120
    TrantaLocked
    Participant

The solution to the problem detailed below is in post #47210.
I summarized my findings in post #53267.

TL;DR for the brightness problem: first of all, yes, THPS4 is actually slightly dimmer on the Xbox 480p version compared to the PS2 480i version, with both systems over component to the OSSC. However, Xbox games in general tend to benefit from a higher peak nit level. Whereas you’d usually be able to get by just fine on 100 nits for most SDR content, some Xbox games actually look optimal in the 200-250 nit range, with the average game looking optimal between 150 and 200 nits. There are signs that Xbox games were mastered (or at least output when in 480p mode) for the newer “high-end” CRTs popular in the early to mid 2000s, which came out of the box with oversaturated colors, a cooler white point, and the ability to reach ~300 nits peak brightness, at a time when pumping these settings was the new hip thing. So as weird as it sounds, I think many Xbox developers, and possibly Microsoft itself, thought Xbox owners were the type most likely to get in on this trend, and compensated for it.
      ———————————–

      Xbox games just don’t look as bright as I remember them, but when I turn up various settings it clips out the highest whites. I’ve messed with pre-ADC gain, then regular gain and offset. Maybe there’s a combination of these settings that will get what I want? Maybe it is correct for some of the whites to be clipped if I’m going for a CRT look? Or is there some way to change gamma without clipping on the OSSC?

It could also be that Xbox games are designed to be a bit darker and I’m basing my idea of some games on the PS2 versions. Take THPS4, which I played only on PS2 as a kid: it looks kind of dark from the Xbox through the OSSC at stock settings, but if the brightest white I can find (say, in a cloud) really does hit the full 255 on my monitor, then theoretically I should be seeing the full range correctly. I don’t know. Am I missing something?

      #46149
      TrantaLocked
      Participant

I was looking at a video comparison of the Xbox vs PS2 vs Gamecube vs PS1 versions of THPS4 and found a perfect spot in the video at 9:38 that showed a cloud in the background in San Francisco. This cloud, as well as some clouds in Alcatraz, has spots that appear to be intended to render at max brightness, which makes it easy to calibrate. In the video you can clearly see a difference in cloud brightness and structure between Xbox and PS2: the PS2 and Gamecube versions appear to clip the upper end of the range, hiding cloud details that are better preserved in the Xbox version.

I need to set the OSSC to a pre-ADC gain of 9 or 10 to match what I see in the video for the Xbox version, but for the PS2 version I need to set pre-ADC gain to at least 11. Unless the game was captured non-ideally, this shows, first, that a pre-ADC gain of 8 may be too low depending on your setup, and second, that to match the apparently slightly clipped PS2 version you need to set it even higher than 10. The Xbox version also appears darker than the PS2 version in other videos I’ve seen, so the PS2 version being brighter isn’t a fluke.

Is this total proof the PS2 version is clipped and thus brighter? Almost, but it needs substantiation. If someone could post a picture of that cloud running off their PS2 on a CRT, that would be awesome, to see for sure what PS2 users see.

A pre-ADC value of 11+ is not necessary for other games. Ninja Gaiden, for example, looks more natural at 8 or 9 pre-ADC gain, as the game is fine-tuned for the Xbox. I think 8 is too low and 9 maybe a tad too high, but I’d still choose 9 over 8 if I’m not touching the fine values. A fine G/Y gain value of 40 with pre-ADC at 8 should be a bit more precise.

        #46156
        BuckoA51
        Keymaster

An awful lot would depend on how the user had their setup configured in the video too. I think it’s fair to say that game developers don’t pay enough attention to calibration and clipping points when developing software. If you’re ramping up ADC gain like that, don’t forget to check blacks too; you might need to settle on more of a compromise value. If only the 240p test suite existed for Xbox!

          #46170
          TrantaLocked
          Participant

Despite the possibility of the video capture being bad, I still believe THPS4 on Xbox just looks wrong at default OSSC brightness settings (at least for my setup, which involves a modded X360 component cable to the OSSC and then HDMI->DVI to my monitor). Most other games look correct at stock settings, but on an instinctual level THPS4 looks dark and dull. This isn’t a problem with the OSSC, but rather a design choice by Neversoft.

I did check black levels and they match up correctly. I think gain doesn’t actually shift the lowermost black value, so true blacks are the same at any gain setting. The lighter blacks and greys benefited from the gain increase in THPS4.

            #46177
            TrantaLocked
            Participant

I don’t know whether the filtering of composite or the brighter display of a CRT would require a different setting to match that look on an LCD, but to maintain the “correct” color scale without clipping, stock settings are pretty much it, other than gain possibly benefiting from a tiny increase. I wish I had a PS2 and an old TV to test this, though. I don’t think I’ve even looked at a CRT TV in years.

              #46245
              BuckoA51
              Keymaster

If you can play DVDs on your Xbox or PS2, and you can find it, I recommend the Digital Video Essentials disc as a baseline for your calibration – https://www.amazon.com/Digital-Video-Essentials-Optimize-Entertainment/dp/B00005PJ70

It’s not as good as actually getting the professional equipment in, but for video games I’m not sure that’s practical anyway, as I don’t think developers took games as seriously as studios take movies when considering such things.

                I find CRTs have a certain glow to them that’s hard to match on fixed resolution displays, but most CRTs now are darker than they were in their heyday.

                #46545
                TrantaLocked
                Participant


                  Edit 4-28-2021
After looking at videos comparing cheap component cables to the official Microsoft HD AV pack, it looks reasonably possible that the green/yellow tint I’m seeing is due to the cable itself, which is a modded Xbox 360 component cable I got on eBay. The amount of tint I experience looks about the same as what I see in videos on YouTube. I wanted to get an XOSVP and Chimeric HDMI adapter, but with both out of stock I need to be on the waitlist. I also would have settled for the SCART YPbPr cable from retrogamingcables, but that is also out of stock. In the meantime I bought a set of used Monster component cables: unmodified, direct connection, built like a tank, with some of the best shielding on the market. Unless used Monster cables go bad for whatever reason, which I highly doubt, this should give me the absolute best picture possible over component. If that doesn’t get rid of the yellow tint I’m seeing, then I will for sure move on to replacing the CPU caps. I am also hoping for a brightness increase with the Monsters, but I’m more convinced the brightness issue is a gamma curve shape problem than signal gain loss.

                  Edit 4-21-2021
I checked into gamma again and found out the typical CRT gamma is usually around 2.5, but the CRT gamma curve is also not “perfect” like it is for modern displays, making CRT-experience recreation harder. Due to the shape of a typical CRT’s gamma curve, a lot of the upper values will be brighter than they would be for a modern 2.5 curve, which may help in finding the right middle-ground gamma value if a custom gamma curve can’t be used. The mode1/mode2/mode3 gamma settings on my Samsung 2494 correspond to roughly 2.0/2.1/2.2 at a high viewing angle and 2.1/2.2/2.3 at a low viewing angle (based on a test of a similar monitor). Choosing mode 3 for the highest possible gamma would be a reasonable starting point instead of the mode 2 I was previously using, but honestly both modes look decent in their own respects. I paired this with offsets of 128/128/(130 or 131 for blue), gains of 26/36/26, and a pre-ADC gain of between 8 and 12 depending on the game or scene. Colors look great. I would still like a custom gamma curve, either on the OSSC side or in the Xbox dashboard.
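To illustrate the curve-shape point, here's a minimal Python sketch (the 0.02 offset is an assumed illustrative value, not a measurement of any real CRT): a CRT's response is often modeled as gain-offset-gamma rather than a pure power law, and even a small offset makes the mid-to-upper values come out brighter than an ideal 2.5 power curve.

def power_law(v, gamma=2.5):
    # Ideal modern-display response: L = V^gamma
    return v ** gamma

def crt_gog(v, gamma=2.5, offset=0.02):
    # Simplified gain-offset-gamma CRT model, normalized so V=1 -> L=1:
    # L = ((1 - offset) * V + offset)^gamma
    return max((1 - offset) * v + offset, 0.0) ** gamma

for v in (0.25, 0.5, 0.75, 0.9):
    print(f"V={v:.2f}  pure 2.5: {power_law(v):.3f}  CRT-like: {crt_gog(v):.3f}")
# the CRT-like model comes out a few percent brighter through the mids and uppers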

                  Edit 4-15-2021
I think I was looking in the wrong place with the color gain values. I think all you need in some games is blue offset at around 131 and some extra Y and/or pre-ADC gain as usual. I would probably benefit more if the component chroma Pr and Pb gain settings did not scale outwards but rather upwards from a static baseline, and I am hoping the firmware devs add that form of gain scaling, which Y gain already seems to have. The new settings are listed below.

So I did do some calibration with that Digital Video Essentials DVD, which comes with a physical color filter for color level balance. The DVD helped me realize my monitor’s peak white gets slightly higher at a contrast setting of 78, while I usually have it set at 75. I’m not sure which one is correct, though, because it’s quite a small increase, there is no increase at 76 or 77, and the level usually increases every other step while this jump skipped two steps. I’m pretty sure 75 is correct because it preserves dark levels better.

I did notice a bit of color dullness in Burnout 3 compared to videos I saw on YouTube, so it’s possible my new OSSC settings below, made with the DVD, are a correct calibration for my specific setup, but I still need to test to verify. At the least, things look better, but something might still be off.

                  If anyone is using my monitor Samsung SyncMaster 2494SW or a similar model, here are my monitor settings:
                  Brightness: 100
Contrast: 75 (or 78) (78 appears to give the max white level, but 75 is already quite close to that level and feels like the “correct” max setting, especially because it preserves dark detail better according to this test)
                  Sharpness: 52
                  MagicBright: Custom
                  MagicColor: Off
                  Red/Green/Blue: 50/50/50
                  Color Tone: Custom
                  Color Effect: Off
Gamma: Mode 2 or Mode 3 (2.1/2.2 or 2.2/2.3 gamma depending on vertical viewing angle in my case), depending on what you like. To get closer to a CRT (which has a gamma curve around 2.5), pick the gamma mode that is darkest on your display, but do note that due to the difference between the gamma curve shapes of a CRT and a modern display, you might not want to pick an actual 2.5 gamma value even if you have access to it. I think a gamma between 2.2 and 2.4 will probably be your best starting point in order to meet a middle ground with the brighter upper values of a CRT’s gamma curve, which you can’t really replicate without a custom gamma curve somewhere in your chain. I think game visibility can also be quite enjoyable at a standard monitor gamma, so it’s really up to you. I do think that if you’re going to use the higher (darker) gamma setting, it is even more necessary to turn up pre-ADC gain to help some of those upper values look as bright as they would on a CRT. This gain increase will of course clip some of the white end, but that’s a sacrifice I’m willing to make – at least until there is a way to use custom gamma curve profiles within the Xbox softmod dashboard or on the OSSC. That, in addition to more chroma gain scaling options in the OSSC, should make for all the customization one needs to fix the picture coming from their 6th-gen consoles.
                  Image Size: Auto for 480p, Wide for anamorphic widescreen
                  PC/AV Mode: PC

                  And my OSSC settings that work pretty much for all games (some games will look better with a higher pre-ADC like THPS4):

                  480p->line2x->upsample2x->DTV mode with custom timings (below)->DVI output

OK, so bear with me on these color settings, because I’m still trying to optimize things to look both correct and “good,” at least to my eyes on my monitor. I’m aware that messing with the offsets may not be a great idea and that my pre-ADC gain is pretty high, but again, the picture just looks good at these settings. I just have a gut feeling things needed to be brighter, but maybe it’s something else, like modern games’ color and lighting design, that makes me feel this way.

It also might depend on the game being played, as I showed earlier that the PS2 and Gamecube versions of THPS4 are brighter, and thus clipped, compared to the Xbox version. So it’s possible that stock settings are good for one game but not another, and at this point I would reiterate my idea of using your gut as a factor to help determine whether things look right. If you change a setting and it feels more real or transparent, then maybe lessen the significance of another idea you believed was more “correct.”

I sort of realized that the color gain settings weren’t helping much because they were adding unwanted green due to the way the OSSC scales component chroma values (it increases the absolute value on the -128 to +128 scale, which means both ends scale outwards, negatively and positively). So instead I found that a simple tick of the Pb offset to 130 or 131 and some extra overall gain is all I need to get the picture looking great. I think positive-only scaling of blue chroma gain might be more correct if the option existed, but the offset solution is the best way I have of getting there. Sometimes I need no extra offset, so I judge based on each game.
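To picture the difference, here's a rough Python model of what I described (my own sketch of the behavior, not the OSSC firmware's actual math). Chroma gain scales the signed Cb value around its 128 midpoint, so both the blue and the yellow excursions grow, while an offset just shifts the whole channel toward blue:

# Rough model: Cb is stored 0-255 with 128 meaning zero blue-yellow chroma.
def chroma_gain(cb, gain):
    # Symmetric scaling around 128: bluish samples get bluer AND
    # yellowish samples get yellower (hence the unwanted green/yellow).
    return round(128 + (cb - 128) * gain)

def chroma_offset(cb, offset):
    # Plain shift: every sample moves the same amount toward blue.
    return cb + (offset - 128)

for cb in (100, 128, 156):  # yellowish, neutral, bluish
    print(f"Cb {cb}: gain 1.1 -> {chroma_gain(cb, 1.1)}, offset 130 -> {chroma_offset(cb, 130)}")
# gain pushes 100 down to 97 (more yellow) while offset lifts it to 102

This is what I usually use if I need more: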

                  Video LPF: Off
                  Color space: Auto
                  R/Pr offset: 128
                  G/Y offset: 128
B/Pb offset: 130 or 131 (I usually set it between 129 and 132 depending on the game.)
                  R/Pr gain: 26
                  G/Y gain: 36
                  B/Pb gain: 26
                  Pre-ADC gain: between 8 and 11 depending on how you’re feeling. If you want “perfect” gain without white clipping, set pre-ADC gain to 8 and G/Y gain to 39. This should look good for many games, but you may, like me, want more overall brightness and pre-ADC gain is a quick and effective way of doing this.
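For reference, here's a generic sketch of the trade-off (illustrative only; the OSSC's pre-ADC gain steps aren't expressed in these units, so the 1.1 factor is just an assumed ~10% boost): with a linear gain above unity applied before an 8-bit clamp, everything above 255/gain lands at full white.

def gained(level, gain):
    # Linear gain followed by an 8-bit clamp.
    return min(round(level * gain), 255)

gain = 1.1
print(f"inputs above {255 / gain:.0f} clip to 255")
print([gained(x, gain) for x in (200, 230, 240, 255)])  # [220, 253, 255, 255]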

                  Old settings:
                  Video LPF: Auto
                  Color space: Rec. 601
                  R/Pr offset: 130
                  G/Y offset: 128
                  B/Pb offset: 130
                  R/Pr gain: 35
                  G/Y gain: 26
                  B/Pb gain: 47

Pre-ADC gain: 9/10/11 depending on the game or how I’m feeling. The RGB balance changed slightly when changing pre-ADC gain, at least when I was testing with the test DVD, so the other offset and gain values would not remain perfect for all pre-ADC gain levels. I made the above settings based on a pre-ADC gain of 11, but things still look about the same at lower gain levels.

I had a run of Burnout 3 with the above settings and immediately felt the color balance was more transparent and realistic compared to the other profiles I had been working on. It was hard getting everything exactly right with the color filter card that came with the Digital Video Essentials DVD (though the filter did help a lot for getting blue and red gain correct), so I kind of started eyeballing it in order to get the white balance right. That required me to touch offsets, but if there’s a better way to achieve the right colors, I’m open to suggestions. I’ve seen reports that color from component is not quite as good as with RGsB, but I still don’t have enough experience with different output modes and hardware to know. I want to get a SCART cable that might let me enable this in forced progressive mode, but I’m afraid I’d just be wasting money when I already have a decent setup with my modded X360 component cable and OSSC. The thing I’ve learned to accept is that I just enjoy gaming on the Xbox more when I turn the gain up a couple of ticks higher than it should be, and I think that looks better than turning up gamma on my monitor.

                  H. sample rate: 780

This works for the SyncMaster 2494SW, but your display may stretch the standard 858 timing to the correct dimensions. You want to make sure games using an active picture width of 704 pixels (Halo 2, THPS4) fill a 1.333 space and games like Ninja Gaiden that use the full 720 pixels fill a 1.363 space. Some games like Burnout 3 put a border around the entire 1.333 active image in anticipation of overscan, and this will appear zoomed out on screen. You can solve this by lowering V. active to around 449 or 450, and your monitor should automatically adjust the picture to 1.333. If it doesn’t, then you may need to touch other settings like H. active, or even use a secondary scaler like a DSC 301 HD if all else fails. You can use perfectly circular objects, like the radar HUD in Halo 2, to verify the dimensions of your settings, which once calibrated should then work for every other Xbox game.
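Those aspect targets fall straight out of the standard 10:11 pixel aspect ratio for 480-line component video; here's a quick Python sanity check (just the arithmetic, nothing OSSC-specific):

# Display aspect ratio = active width x pixel aspect ratio / height.
PAR = 10 / 11  # standard BT.601-derived pixel aspect ratio for 480 lines

for name, width in (("704-wide (Halo 2, THPS4)", 704),
                    ("720-wide (Ninja Gaiden)", 720)):
    print(f"{name}: {width * PAR / 480:.4f}")
# -> 1.3333 and 1.3636, the 1.333 and 1.363 spaces described above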

The following settings should be adjusted to center the picture and, if you desire, to ensure the entire active picture of games using the full 720×480 space (Ninja Gaiden) can be seen. The sync length and back porch settings may be slightly different for your setup. There are also other ways of changing the picture position in the OSSC under “sync opt.” in case these fail to work properly, but sometimes when you change a setting like back porch, it won’t register on screen until you change h. sample rate back and forth.

                  H. sync length: 48
                  H. back porch: 41
                  H. active: 720
                  V. sync length: 6
                  V. back porch: 30 (Back porch and sync length settings may need to change slightly depending on your V. active.)
                  V. active: 480 (or 449 if filling the display area for bordered games like Burnout 3)
                  Sampling phase: 180 (this makes no difference in my case but it may depending on your setup)

                  #47210
                  TrantaLocked
                  Participant

I have a pretty conclusive update on the whole topic, and I will try to add more detail after I investigate further. But the gist is that the brightness problem is actually due to how the Xbox was designed to output in anticipation of a bright display, whether in how games were mastered on disc or in the GPU settings. Xbox games are intended to be viewed on bright displays of at least 250 nits (cd/m²). I verified all factors, including OSSC settings, full/limited range, testing display brightness with a brightness app, comparing the same game on an Xbox 360, etc.

My recently acquired Sony Trinitron 1080i can run up to a max of about 350 nits, while my monitor, unbeknownst to me, lost some of its factory brightness and now shines at a maximum of 180 nits. This brightness level is nearly unplayable for some Xbox games like Tony Hawk’s Underground without increasing the OSSC gain, which can then slightly screw up the picture, especially if your monitor can’t be set to 2.5 gamma, as the OSSC has no gamma setting. I believe the brightness and black level output between 480i and 480p from the Xbox may change due to the expected screen brightness of classic vs progressive CRTs. I probably never noticed the issue for PC games because monitors are usually run at a lower brightness than TVs, which the developer can account for in mastering. If you check the typical 1080p monitor, the max brightness is usually 250 nits, so for Xbox games to look at all correct you will need to run at max brightness. This includes running the same games on the Xbox 360 in backwards compatibility mode, as I don’t believe Microsoft corrects for this.
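As a back-of-the-envelope check (my own arithmetic, assuming a plain 2.2 power-law response, which is itself a simplification): if a game expects a ~250-nit display and your panel peaks at ~180 nits, matching mid-tone brightness takes roughly 16% extra signal gain, which in turn clips everything above about 86% of full white.

# Estimate the signal gain needed to match mid-tones when content
# expects a brighter display than you have (simple 2.2 power law).
target_nits, display_nits, gamma = 250.0, 180.0, 2.2

signal_gain = (target_nits / display_nits) ** (1 / gamma)
clip_point = 1 / signal_gain
print(f"~{signal_gain:.2f}x gain, clipping above {clip_point:.0%} of full white")
# -> ~1.16x gain, clipping above ~86%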

I’m also quite sure the yellow/green tint is due to a difference in the white temperature of different displays. Somehow the warmest setting on my Trinitron is still cooler than the neutral setting on my monitor, and my other laptop display is also a bit warmer than the Trinitron. I’ve also read that many CRTs were set neutral/cool by default for marketing reasons. However, the neutral setting on my Trinitron doesn’t look too blue; it actually looks perfectly white to me and far better for Xbox games than my monitor’s slightly reddish white, which looks wrong if I try to set it cooler in any way. The better way to compensate is with the blue offset in the OSSC, but some games need this less than others.

I also want to say that when I compared the OSSC against straight component into my Trinitron, the picture quality was equally great with both. The main problem for me is the lack of a gamma setting on the OSSC. My Trinitron does have a good range of gamma settings, but my monitor is unable to go higher than 2.2. I did notice that the Trinitron changes some gamma and black level settings, which can be accessed in the service menu, depending on whether it’s 480p over component or DVI, so when these are accounted for I didn’t notice any color or brightness problems with the OSSC. I am also planning on using the MakeMHz XboxHD+ with its native 1080i upscale capability for lagless gameplay on the Trinitron 1080i (HDPT set to 0).

                    #53267
                    TrantaLocked
                    Participant

                      I will be adding some key points about the various issues I’ve had and my current solutions to them:

-Displays can handle the 720×480 Xbox signal in wildly different ways, meaning the optimal sampling settings will differ. For my LG C1, I change h. and v. active as the most effective way of stretching the image, while for my old Samsung 2494 monitor I primarily changed the sample rate. But it’s typical to use a combination of all these settings depending on what game I’m calibrating for.

-Modern 16:9 TVs usually block the side edges of the picture even if you set the aspect ratio to native/original. PC monitors don’t have this issue. So on my LG C1, even if I change the sampling settings to calibrate for the full 1.363 frame, the sides are cut off by forced 4:3 borders, even in original aspect ratio mode! It’s annoying, but that’s how it is on both the LG C1 and my older Sharp LC-39 1080p TV. I CAN see the full width on my PC monitors. I’m not referring to the 1.5 aspect ratio of the unprocessed 720×480 signal; it’s right for a TV to squeeze the signal width, at least in 4:3 mode, but it’s not correct for it to force 4:3 borders, especially if it has an original aspect ratio mode, where I want to see that small extra width for games that use the full signal width.

-The visible frame aspect ratio differs depending on the game, as most tend to use less than the entire 720×480 signal; however, any calibration – once done correctly – will have the correct pixel aspect ratio for almost all games and scenarios. Timings are not necessarily compatible between different consoles, and there are cases in which games that don’t officially support 480p, and are run as such with GSM on a modded PS2, will have varying behavior per scene. In Ace Combat Zero: The Belkan War, for example, one single sampling profile calibrated for in-engine gameplay results in a squeezed pixel aspect ratio for some of the mission briefing scenes. As for exact ratios, the full visible frame of Burnout 3 for Xbox should be 1.33, while Ninja Gaiden’s, which uses the entire signal width and height, is 1.363. A good rule, if you just want to use one profile for the OSSC, is to calibrate according to Burnout 3 or a similar 1.33 game that has a border around the visible picture, by filling the visible frame to a 4:3 space, which you may need to either physically measure or judge by the boundaries of your TV’s 4:3 masking pillars if it uses them. You can also use a game with perfect circles for measuring; I suggest Halo 2 with its radar, crosshair, and BR scope. After that, you will see other full-signal games with a slight overscan and in the correct pixel aspect ratio, like CRTs would show, mimicking the original experience. I tend to keep one profile for full-height games and one for reduced-height games. There are some games where UI element widths are adjusted between 4:3 and 16:9 to accommodate both modes, so as a sanity check it’s good to reference perfect circles and squares that are actually rendered in-game, in case the UI happens to be unreliable.

                      -Your display’s real aspect ratio is not necessarily a perfect 1.77 and this can vary by a surprising margin between displays. My LG C1 measures at exactly 1.77, but my other displays measure as high as 1.79! And the difference is indeed noticeable in side by side comparisons. As a result, your sample rate and/or active window settings – painstakingly tuned for a perfect pixel aspect ratio on one display – will not necessarily translate correctly to a different display!

-I still strongly suspect there is something wrong with the way the brightness levels of some of the Xbox Tony Hawk games were mastered, or at least how they are output (**see my update on this below**). I’m waiting on a PS2 to inspect THPS4 and THUG to see the differences between the versions, but otherwise my hypothesis is still that Xbox games (at least over component, maybe not composite) seem to want more brightness than is standard for other SDR content to look “correct.” It’s like they expected people with Xboxes and component to have fancy bright screens, so they output in “HDR-lite” for that use case. You usually want at least 150-200 nits, if not more depending on the game, with G/Y gain at 39 and pre-ADC gain at 8 in the OSSC settings. For Burnout 3 on an LG C1, the sweet spot in a dark room is 65 to 75 pixel brightness, with 85 contrast and 50 brightness. But I would say the absolute minimum is 60 before games start looking unplayably dark.
**Update**: I saw that the PS2 version of THPS4 is brighter than the Xbox version; however, this is comparing the PS2 480i signal to the Xbox 480p signal. I did not see the same white clipping as in the video I linked earlier, but either way, the PS2 version still appears brighter than the Xbox version. It makes sense that it felt weird for me to play the Xbox version, as I had grown up with the PS2 version.

-If a game looks too yellow, you can try the color temperature setting on your display or the B/Pb offset in the OSSC settings. B/Pb gain is not optimal here, because it’s more suited to fixing problems with the console output: it extends gain for both blue and yellow due to the way YCbCr works. The color temperature method works best with my LG C1, and Warm 30 tends to look quite good. This isn’t usually going to be a problem, but some games are mastered to compensate for a slightly cool display.

The OSSC is still an amazing device and one of my favorite pieces of gaming hardware. I decided not to install the Xbox HDMI mod because I don’t want to lose all of the picture control I have with the OSSC (for which I bought a set of Xbox Monster cables!).

                      #53285
                      BuckoA51
                      Keymaster

Thank you for your feedback and research on this. Make sure your TV is in just scan mode (overscan off) to keep it from clipping the edges of the image, but I would think you’ve already done that.

I still use my OSSC with the Xbox HDMI mod (at least for now) via an HDMI to VGA transcoder – it just makes the cables less bulky!

                        #53303
                        TrantaLocked
                        Participant

Just scan mode is on. The LG C1 either stretches the image to 16:9 or forces 4:3 masking pillars, and nothing in between, even in original aspect ratio mode. You will of course still see the pixel aspect ratio you want; you just won’t see the full width of the 1.363 picture unless you run it at 1.33, which results in a squeezed pixel aspect ratio.

Does the MakeMHz HDMI mod give you the correct pixel aspect ratio out of the box, or at least let you tinker to achieve it? I feel most setups out there, whether through an OSSC or another scaler or converter, are inadvertently running the full signal at 1.33 rather than 1.363, and the visual difference is noticeable. I believe the mod’s settings app has an overscan feature, which is very useful, but not enough if it’s just outputting 720×480 and letting the display fit it into a perfect 4:3 space.

                          #64082
                          RandomBolt
                          Participant

No game will use the “full width” 720×480; that is not supported. The 480p output is EDTV 480p, where the 720 has margins (2 × 8 px), which leaves 704 px to be squished by 10:11 into 4:3, i.e. 640×480. This is according to spec for digital signals encoded over analog, like component or TV VGA ports (with a TV RGB signal).

Any game that rendered geometry with square pixels instead of 10:11 will never display correctly on any EDTV-compatible TV – CRT or LCD.

                            480p EDTV doesn’t change colorimetry, so if newer CRTs were more saturated, that should be a CRT problem.

The Xbox was not able to (and cannot) output correct monitor VGA signals with the Conexant chips. The filtering circuit on the motherboard would also need to be bypassed and further modified before the signal would be correct on a PC monitor.
Supposedly it can output sRGB/Rec. 709 for 720p/1080i HDTV with HDTV gamma curves; I would not think that’s what comes out of the encoder chip at 480p.

Gamma should be the same between 480i and 480p (not sure). It is possible there is some mismatch of IRE black levels at the output, so maybe try NTSC vs NTSC-J, as US NTSC has 7.5 IRE setup while NTSC-J and PAL have 0 IRE for black. This is a common problem when converting analog video to digital form; for HDTV, levels were all set to 16-235 regardless of region.
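A quick numeric illustration of that setup mismatch (the scenario is hypothetical; 16-235 is the standard studio range): if the digitizing side assumes 0 IRE black but the source actually uses 7.5 IRE setup, black lands well above code 16 and the picture looks washed out.

# Map an analog level onto the 16-235 studio range, assuming black sits
# at 'setup_ire' and white at 100 IRE.
def ire_to_code(ire, setup_ire=0.0):
    return round(16 + (ire - setup_ire) * (235 - 16) / (100 - setup_ire))

source_black = 7.5  # NTSC-US black with setup
print("decoded assuming 0 IRE black:  ", ire_to_code(source_black, 0.0))  # 32 -> raised blacks
print("decoded assuming 7.5 IRE black:", ire_to_code(source_black, 7.5))  # 16 -> correct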

                            #64093
                            Zacabeb
                            Participant

On the Xbox, both 720×480 and square-pixel 640×480 are in fact supported rendering resolutions, but how they’re handled for output differs between encoders, as mentioned in another thread. 🙂

Square-pixel 480p (780 samples per line) of course differs from VGA (800 samples per line), making them incompatible for proper aspect ratio and centering. But there’s nothing stopping the use of any pixel clock in analog video so long as the signal is within bandwidth, permitting NTSC square pixel and BT.601 to coexist.
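Those two sample counts check out against the familiar pixel clocks if you multiply by the nominal ~31.469 kHz line rate shared by 525-line progressive video and VGA 640×480 (a quick arithmetic check, nothing more):

H_FREQ = 31_468.5  # Hz, nominal 480p/VGA line rate

for name, samples in (("NTSC square-pixel 480p", 780),
                      ("VGA 640x480", 800)):
    print(f"{name}: {samples * H_FREQ / 1e6:.3f} MHz")
# -> 24.545 MHz and 25.175 MHz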

There is a dilemma when it comes to ITU-R BT.601 and the derived 480p and 576p standards: the pixel aspect ratios aren’t always adhered to in accordance with legacy practice. When converting HDTV to SDTV or EDTV, scaling down to cover the full 720-pixel raster including margins, instead of the correct 711, 704, or 702 pixels, was not only technically easier but also helpful, as it extended the picture into the blanking area for masking, suppressing ringing from the leading edge of active video. The vertical scaling was not modified to keep the proper pixel aspect ratio, leading to new pixel aspect ratios. The same can happen in upscaling to HDTV. The assumption was that the discrepancy would be small enough to go unnoticed by most viewers.

                              The standards have actually become quite a mess. 😄
