TrantaLocked
Forum Replies Created
The thing about the EVGA XR1 Lite is that I usually need to switch back and forth between OSSC profiles before it registers a picture in OBS. It works for most settings I throw at it, though I can sometimes get distortion with certain sample rate or active window settings on the OSSC side. So yes, 480i passthrough works, but you may need to switch between profiles to get the video feed to show in OBS. The XR1 Lite does not work with my USB 2.0 extension cable. For the longest time I thought USB 3.0 was just a bandwidth difference, but it uses five more pins than 2.0!
August 15, 2022 at 11:22 AM in reply to: Is outputting to a frame/proper de-interlacing possible on the 1.6 hardware? #54451

The OSSC was made at a time when this kind of open source, community-driven converter/scaler project was new. I assume it needed to be cheap and simple to get traction, which it has. People had been using commercial scalers for the longest time, and finally with the OSSC we had an open source scaler that did really well for the price. Now that the entire idea of open source scalers has been normalized, we’re seeing tons of products coming out like native HDMI mods and now the OSSC Pro, which will have the features you desire, hopefully with minimal lag.
I use my OSSC in passthrough mode for 480i so I don’t get jitter, but it does add some lag and minor movement distortion at the TV side, which has to do the deinterlacing itself. It’s still the best picture I have seen in a scaler, though I haven’t seen the RT5x. If you want a Framemeister alternative, look into the Extron DSC 301 HD and get a VGA to RCA adapter for the component in. Extron’s cable adapts to BNC, which requires RCA adapter caps, but it works great even with the OSSC if you want multiple consoles without needing a switch, and the OSSC can be configured for independent first-check AV inputs per profile. The Shmups forum has a thread with users who have the Extron software for configuration, and there are EDID editors out there if you find that useful; I needed one to set a 1440x960p output EDID.
LG OLED55C1PUB
-480i PS2 Lx3 does NOT work, but passthrough, Lx2 and Lx4 work. For 480i, passthrough is preferred as the C1 does the best job of cleanly deinterlacing compared to the other modes.
-480p Xbox works for passthrough and Lx2. At first 480p passthrough wasn’t working but somehow I did something to make it work, just not sure how or why. For 480p, Lx2 + upsample2x is preferred.
-I did not test other consoles. Maybe someone can suggest a way to get Lx3 to work with the PS2?
-Firmware: .86a

Like most TVs, my C1 55 OLED forces 4:3 pillar masks for most signals it deems as or close to “4:3”, even if the TV is set to original aspect ratio. Other signals are automatically stretched to the full 16:9 width, and this cannot be altered on the set with any zoom or aspect ratio mode. As a result, the only ways to resize the image are h. active, v. active, and h. sample rate, with sync length and back porch values for positioning. The end result of the pillar masks is that you cannot see the full width of a 720×480 signal either in 1.363 (which is the correct aspect ratio for the full Xbox 720×480 signal, ex. Ninja Gaiden) or 1.5 (which is not ideal but useful for testing purposes). You can see the full width in 1.33, but then the image will be slightly squeezed. In the case of 1.363, a few lines on both sides will be cut off by the pillar masks; however, games that use a smaller area of the signal will still be fully visible (ex. Burnout 3: Takedown).
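If you want to double-check those ratios, here is a minimal Python sketch, assuming the standard Rec. 601 pixel aspect ratio of 10:11 for 13.5 MHz-sampled NTSC, that reproduces the 1.33 and 1.363 figures:

```python
# Display aspect ratio of an NTSC active area, assuming the standard
# Rec. 601 pixel aspect ratio of 10:11 for 13.5 MHz sampling.
PAR = 10 / 11

def display_aspect(width_px: int, height_px: int) -> float:
    """Displayed aspect ratio of a pixel grid after PAR correction."""
    return width_px * PAR / height_px

print(f"{display_aspect(704, 480):.3f}")  # 1.333 -> classic 4:3 (Burnout 3, Halo 2)
print(f"{display_aspect(720, 480):.3f}")  # 1.364 -> full-width games (Ninja Gaiden)
print(f"{720 / 480:.3f}")                 # 1.500 -> raw grid with square pixels
```

So 1.363 is just the 10:11 pixel aspect ratio applied to the full 720-sample width.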
May 12, 2022 at 3:17 AM in reply to: Getting CRT-like brightness from original Xbox to LCD monitor #53303

Just scan mode is on. The LG C1 either stretches the image to 16:9 or forces 4:3 masking pillars and nothing in between, even in original aspect ratio mode. You will of course still see the pixel aspect ratio you want, you just won’t see the full width of the 1.363 picture unless you want to run it in 1.33, which results in a squeezed pixel aspect ratio.
Does the MakeMHz HDMI mod give you the correct pixel aspect ratio out of the box or at least let you tinker to achieve it? I feel most setups out there, whether through an OSSC or other scaler or converter, are inadvertently running the full signal in 1.33 rather than 1.363, and the visual difference is noticeable. I believe the mod’s settings app has an overscan feature which is very useful but not enough if it’s just outputting 720×480 and letting the display fit it in a perfect 4:3 space.
May 10, 2022 at 10:33 AM in reply to: Getting CRT-like brightness from original Xbox to LCD monitor #53267

I will be adding some key points about the various issues I’ve had and my current solutions to them:
-Displays can handle the 720×480 Xbox signal in wildly different ways, meaning that the optimal sampling settings will differ. For my LG C1, I change h. and v. active for the most effective way of stretching the image, while for my old Samsung 2494 monitor I primarily changed sample rate. But it’s typical to incorporate all of the settings depending on what game I’m calibrating for.
-Modern 16:9 TVs usually block the side edges of the picture even if you set aspect ratio to native/original. PC monitors don’t have this issue. So on my LG C1, even if I change sampling settings to calibrate for the full 1.363 frame, the sides are cut off by forced 4:3 borders, even in original aspect ratio mode! It’s annoying, but that’s how it is on both the LG C1 and my older Sharp LC-39 1080p TV. I CAN see the full width on my PC monitors. I’m not referring to the 1.5 aspect ratio of the unprocessed 720×480 signal; it’s right for a TV to squeeze the signal width, at least in 4:3 mode, but it’s not correct for it to force 4:3 borders, especially if it has an original aspect ratio mode, where I want to see that small extra width for games that use the full signal width.
-The visible frame aspect ratio differs depending on the game, as most tend to use less than the entire 720×480 signal; however, any calibration, once done correctly, will have the correct pixel aspect ratio for almost all games and scenarios. Timings are not necessarily compatible between different consoles, and there are cases in which games that don’t officially support 480p and are run as such with GSM on a modded PS2 will have varying behavior per scene. In Ace Combat Zero: The Belkan War, for example, one single sampling profile calibrated for in-engine gameplay results in a squeezed pixel aspect ratio for some of the mission briefing scenes. As for exact ratios, Burnout 3 for Xbox’s full visible frame should be 1.33, while Ninja Gaiden’s, which uses the entire signal width and height, is 1.363. A good rule if you just want to use one profile for the OSSC is to calibrate according to Burnout 3 or a similar 1.33 game that has a border around the visible picture by filling the visible frame to a 4:3 space, which you may need to either physically measure or bound with your TV’s 4:3 masking pillars if it uses them. You can also use a game with perfect circles for measuring, and I suggest Halo 2 with its radar, crosshair and BR scope. After that, you will see other full-signal games with a slight overscan and in the correct pixel aspect ratio like CRTs would show, mimicking the original experience. I tend to keep one profile for full-height games and one for reduced-height games. There are some games where the UI elements’ width is adjusted between 4:3 and 16:9 to accommodate both modes, so as a sanity check it’s good to reference perfect circles and squares that are actually rendered in-game, in case the UI happens to be unreliable.
-Your display’s real aspect ratio is not necessarily a perfect 1.77 and this can vary by a surprising margin between displays. My LG C1 measures at exactly 1.77, but my other displays measure as high as 1.79! And the difference is indeed noticeable in side by side comparisons. As a result, your sample rate and/or active window settings – painstakingly tuned for a perfect pixel aspect ratio on one display – will not necessarily translate correctly to a different display!
-I still strongly suspect there is something wrong with the way some of the Xbox Tony Hawk games’ brightness levels were mastered, or at least are output (**see my update on this below**). I’m waiting on a PS2 to inspect THPS4 and THUG to see the differences between the versions, but otherwise my hypothesis is still that Xbox games (at least over component, maybe not composite) seem to want more brightness than standard for other SDR content to look “correct.” It’s like they expected people with Xboxes and component to have fancy bright screens, so they output them in “HDR-lite” for that use case. You usually want at least 150-200 nits, if not more depending on the game, with G/Y gain at 39 and pre-ADC gain at 8 in the OSSC settings. For Burnout 3 on an LG C1, the sweet spot in a dark room is 65 to 75 pixel brightness, with 85 contrast and 50 brightness. But I would say the absolute minimum is 60 before games start looking unplayably dark.
**Update**: I saw that the PS2 version of THPS4 is brighter than the Xbox version; however, this is comparing the PS2 480i signal to the Xbox 480p signal. I did not see the same white clipping as I saw in the video I linked earlier, but either way, the PS2 version still appears brighter than the Xbox version. It makes sense that it felt weird for me to play the Xbox version, as I had grown up with the PS2 version.

-If a game looks too yellow you can try the color temperature setting on your display or the B/Pb offset in the OSSC settings (see the sketch at the end of this post). B/Pb gain is not optimal because it’s meant more for fixing problems with the console output, as it extends gain for both blue and yellow due to the way YCbCr works. The color temperature method works the best with my LG C1, and Warm 30 tends to look quite good. This isn’t usually going to be a problem, but some games are mastered to compensate for a slightly cool display.
The OSSC is still an amazing device and one of my favorites in gaming hardware. I decided to not install the Xbox HDMI mod because I don’t want to lose all of the picture control I have with the OSSC (for which I bought a set of Xbox monster cables!).
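Here is the sketch mentioned above on YCbCr offsets versus gain. This is a conceptual illustration of the behavior, not the OSSC’s actual code:

```python
# Cb sits on a scale centered at 128: values above 128 read as blue,
# values below as yellow. An offset shifts every pixel toward blue by the
# same amount, which is why B/Pb offset 130-131 can cancel a mild yellow
# cast. Gain instead multiplies the distance from 128, so it deepens
# yellow just as much as it deepens blue.

def offset_cb(cb: int, offset: int) -> int:
    return min(255, max(0, cb + (offset - 128)))  # uniform shift, clamped

def gain_cb(cb: int, gain: float) -> float:
    return 128 + (cb - 128) * gain                # scales both polarities outward

print([offset_cb(cb, 131) for cb in (90, 128, 170)])  # [93, 131, 173] - all nudged blue
print([gain_cb(cb, 1.25) for cb in (90, 128, 170)])   # [80.5, 128.0, 180.5] - both ends pushed out
```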
I’ve never had a problem with HDMI sources with audio not showing up on my DVI displays. My PS4 can be set to audio over HDMI and I still get a picture. I also have the OSSC with the non-audio firmware.
May 17, 2021 at 7:23 AM in reply to: HDMI switch routes power from PS4 to OSSC causing blinking backlight #47396

Ok so I tested two different HDMI switches, both of which do not cause the OSSC backlight to blink. The NEWCARE HDMI Switch 3 in 1 Out from Amazon and the RadioShack 1500467 from eBay are both working fine.
My advice is to avoid the Techole HDMI switch and any switch with a similar design that could use the same generic PCB. Pictures of the Techole PCB: https://imgur.com/a/CHSrIxM
May 7, 2021 at 9:32 AM in reply to: Getting CRT-like brightness from original Xbox to LCD monitor #47210

I have a pretty conclusive update on the whole topic, and I will try to add more detail after I investigate further. But the gist is that the brightness problem is actually due to how the Xbox was designed to output in anticipation of a bright display, whether in how games were mastered on disc or in the GPU settings. Xbox games are intended to be viewed on bright displays of at least 250 nits (cd/m²). I verified all factors including OSSC settings, full/limited range, testing display brightness with a brightness app, comparing the same game on an Xbox 360, etc.
My recently acquired Sony Trinitron 1080i can run up to a max of about 350 nits, while my monitor, unbeknownst to me, lost some of its factory brightness and now shines at a maximum of 180 nits. This brightness level is nearly unplayable for some Xbox games like Tony Hawk’s Underground without increasing the OSSC gain, which then can slightly screw up the picture, especially if your monitor can’t be set to 2.5 gamma, as the OSSC has no gamma setting. I believe the brightness and black level output between 480i and 480p from the Xbox may change due to the expected screen brightness of classic vs progressive CRTs. I probably never noticed the issue for PC games because monitors are usually run at a lower brightness than TVs, which the developer can account for in mastering. If you check the typical 1080p monitor, the max brightness is usually 250 nits, so for Xbox games to look at all correct you will need to run at max brightness. The same goes for running these games on the Xbox 360 in backwards compatibility mode, as I don’t believe Microsoft corrects for this.
I’m also quite sure the yellow/green tint is due to a difference in the white temperature of different displays. Somehow the warmest setting on my Trinitron is still cooler than the neutral setting on my monitor, and my other laptop display is also a bit warmer than the Trinitron. I’ve also read that many CRTs were set neutral/cool by default for marketing reasons. However, the neutral on my Trinitron doesn’t look too blue and it actually looks perfect white to me and far better for Xbox games than my monitor’s slightly red-white that looks wrong if I try to set it cooler in any way. The better way to compensate is with the blue offset in the OSSC, but some games need this less than others.
I also want to say that when I was comparing the OSSC vs straight component to my Trinitron, the picture quality was equally great on both. The main problem for me is the lack of a gamma setting on the OSSC. My Trinitron does have a good range of gamma settings, but my monitor is unable to go higher than 2.2. I did notice that the Trinitron changes some gamma and black level settings, accessible in the service menu, depending on whether it’s 480p over component vs DVI, so when these are accounted for I didn’t notice any color or brightness problems with the OSSC. I am also planning on using the MakeMHz XboxHD+ with its native 1080i upscale capability for lagless gameplay on the Trinitron 1080i (HDPT set to 0).
I’ve certainly not had any brightness issues using the official HDTV pack with my Xbox. I doubt the Monster cable could have gone bad, though. You could consider the MakeMHz HDMI mod instead now.
I just got a Sony Trinitron 1080i with DVI input and am waiting for the MakeMHz HD+ mod shipment. That could be a combination I put many hours on. The Sony CRTs have a ton of options, including gamma out the wazoo, so I am sure I will get an awesome picture for the Xbox.
I’m still leaning toward the brightness issue just having to do with gamma curve shape and not level, which isn’t really rectified with a standard gamma correction. Or it’s just my bias, whatever causes it. But that is why I bought the Sony CRT, because it “should” look correct, especially direct over component from the Xbox. That’s why I hope MakeMHz and citrus3000 are considering implementing custom gamma curves, or at the least a toggle between “modern standard” and “CRT,” because again, modern 2.5 isn’t the same as CRT 2.5. In my opinion that is the only true way to get it right, and I just talked to N64 freak, who mentioned how hard it was to get the levels right for different screens. If you implement a toggle as a user option that accurately compensates depending on the type of display, as in more accurate than just changing gamma level, it will make for a better product that everyone can rely on. Especially considering these mods will semi-permanently change your video output.
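As a rough illustration of what such a toggle could do, here is a sketch of a 256-entry LUT. The CRT curve here is a stand-in (a pure 2.5 power law with a small black lift), not measured CRT data, so treat the numbers as hypothetical:

```python
# Hypothetical "modern standard" vs "CRT" gamma toggle, as a 256-entry LUT
# that pre-compensates for the display's own gamma.

def make_lut(display_gamma: float, target_gamma: float, lift: float = 0.0):
    lut = []
    for code in range(256):
        v = code / 255
        crt = lift + (1 - lift) * v ** target_gamma          # desired CRT-ish light output
        lut.append(round(255 * crt ** (1 / display_gamma)))  # counter the display's gamma
    return lut

lut = make_lut(display_gamma=2.2, target_gamma=2.5, lift=0.01)
print(lut[64], lut[128], lut[192])  # 60 119 186 - mid-tones land darker, as on a CRT
```

A real implementation would replace the stand-in curve with measured CRT response data, which is exactly the hard part N64 freak was describing.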
I have an update on this here. Also, there is no difference between the modded Xbox 360 cable and the Monster. The color issue I had was due to display white temperature differences. With bad component cables you will see a more distinct pure green tint rather than a yellow one like I was seeing.
I tested the modded Xbox 360 component cable I got on eBay and nothing was shorted or connecting to anything it shouldn’t have. I still want to try a SCART cable with YPbPr connections to see if that looks or behaves differently.
April 8, 2021 at 3:12 AM in reply to: Getting CRT-like brightness from original Xbox to LCD monitor #46545

If you can play DVDs on your Xbox or PS2 and you can find it, I recommend the Digital Video Essentials disc as a baseline for your calibration: https://www.amazon.com/Digital-Video-Essentials-Optimize-Entertainment/dp/B00005PJ70
It’s not as good as actually getting the professional equipment in, but for videogames I’m not sure that’s practical anyway, as I don’t think developers took games as seriously as studios take movies when considering such things.
I find CRTs have a certain glow to them that’s hard to match on fixed resolution displays, but most CRTs now are darker than they were in their heyday.
Edit 4-28-2021
After looking at videos comparing cheap component cables to the official Microsoft HD AV pack, it looks reasonably possible that the green/yellow tint I’m seeing is due to the cable itself, which is a modded Xbox 360 component cable I got on eBay. The amount of tint I experience looks about the same as what I see in videos on YouTube. I wanted to get an XOSVP and Chimeric HDMI adapter, but with both out of stock I need to be on the waitlist. I also would have settled for the SCART YPbPr cable from retrogamingcables, but that is also out of stock. In the meantime I bought a set of used Monster component cables. Unmodified, direct connection, built like a tank with some of the best shielding on the market. Unless used Monster cables go bad for whatever reason, which I highly doubt, this should give me the absolute best picture possible over component. If that doesn’t get rid of the yellow tint I’m seeing, then I will for sure move to replacing the CPU caps. I am also hoping for a brightness increase with the Monsters, but I am more set on that being a gamma curve shape issue than signal gain loss.

Edit 4-21-2021
I checked into gamma again and found out the typical CRT gamma is usually around 2.5, but the CRT gamma curve is also not “perfect” like it is for modern displays, making CRT-experience recreation harder. Due to the shape of a typical CRT’s gamma curve, a lot of the upper values will be brighter than they would be for a modern 2.5 curve, which may help in finding the right middle-ground gamma value if a custom gamma curve can’t be used. The gamma settings on my Samsung 2494 of mode1/mode2/mode3 correlate to roughly 2.0/2.1/2.2 at a high viewing angle and 2.1/2.2/2.3 at a low viewing angle (test on a similar monitor). Choosing mode 3 for the highest possible gamma would be a reasonable starting point instead of the mode 2 I was previously using, but honestly both modes look decent in their own respects. I joined this with offsets of 128/128/(130 or 131 for blue), gains of 26/36/26, and pre-ADC gain of between 8 and 12 depending on the game or scene. Colors look great. I still would like a custom gamma curve, either at the OSSC side or in the Xbox dashboard.

Edit 4-15-2021
I think I was looking in the wrong place with color gain values. I think all you need in some games is blue offset at around 131 and some extra Y and/or pre-ADC gain as usual. I would probably benefit more if the component chroma Pr and Pb gain settings did not scale outwards but rather upwards with a static baseline, and I am hoping the firmware devs add that form of gain scaling, which Y gain already seems to have. I have the new settings later in bold.

So I did do some calibration with that Digital Video Essentials DVD, which comes with a physical color filter for color level balance. The DVD helped me realize my monitor’s peak white gets slightly higher at a contrast setting of 78, while I usually have it set at 75. I’m not sure which one is correct, though, because it’s quite a small increase and there is no increase at 76 nor 77, and the level usually increases every other step while this jump skips two steps. I’m pretty sure 75 is correct because it preserves dark levels better.
I did notice a bit of color dullness in Burnout 3 compared to videos I saw on YouTube, so it’s possible my new OSSC settings below made with the DVD are a correct calibration for my specific setup but I still need to test to verify. At the least, things look better but something might be off still.
If anyone is using my monitor, a Samsung SyncMaster 2494SW, or a similar model, here are my monitor settings:
Brightness: 100
Contrast: 75 (or 78) (78 appears to be max white level but 75 is already quite close to this level and feels like the “correct” max setting especially because it preserves dark detail better according to this test)
Sharpness: 52
MagicBright: Custom
MagicColor: Off
Red/Green/Blue: 50/50/50
Color Tone: Custom
Color Effect: Off
Gamma: Mode 2 or Mode 3 (2.1/2.2 or 2.2/2.3 gamma depending on vertical viewing angle in my case) depending on what you like. To get closer to a CRT (which has roughly a 2.5 gamma curve), pick the gamma mode that is darkest on your display, but do note that due to the difference between the gamma curve shapes of a CRT and a modern display, you might not want to actually pick a 2.5 gamma value even if you have access to it. I think a gamma between 2.2 and 2.4 will probably be your best starting point in order to meet a middle ground with the brighter upper values of a CRT’s gamma curve, which you can’t really replicate without a custom gamma curve at some point in your chain (see the sketch after this list). I think game visibility can also be quite enjoyable at a standard monitor gamma, so it’s really up to you. I do think that if you’re going to use the higher (darker) gamma setting, it is even more necessary to turn up pre-ADC gain to help some of those upper values look as bright as they would on a CRT. This gain increase will of course clip some of the white end, but that’s a sacrifice I’m willing to make, at least until there is a way to use custom gamma curve profiles within the Xbox softmod dashboard or on the OSSC. That, in addition to more chroma gain scaling options in the OSSC, should make for all the customization one needs to fix the picture coming from their 6th gen consoles.
Image Size: Auto for 480p, Wide for anamorphic widescreen
PC/AV Mode: PC
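On that middle-ground gamma point, here is a quick sketch using the classic gain-plus-offset CRT gun model. The coefficients are made-up examples, not measurements, but they show the shape difference I mean:

```python
# Classic CRT gun model, L = (gain * V + offset) ** 2.5, compared against
# a pure 2.5 power law. Coefficients are illustrative, not measured.
GAIN, OFFSET, GAMMA = 0.95, 0.05, 2.5

for v in (0.25, 0.5, 0.75, 0.9):
    crt_like = (GAIN * v + OFFSET) ** GAMMA
    pure_law = v ** GAMMA
    print(f"V={v:.2f}  CRT-ish={crt_like:.3f}  pure 2.5={pure_law:.3f}")
# V=0.25  CRT-ish=0.044  pure 2.5=0.031
# V=0.50  CRT-ish=0.200  pure 2.5=0.177
# V=0.75  CRT-ish=0.508  pure 2.5=0.487
# V=0.90  CRT-ish=0.779  pure 2.5=0.768
```

The CRT-ish curve sits above the pure power law everywhere except the endpoints, which is why a 2.2-2.4 display gamma can be a closer visual match than dialing in exactly 2.5.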
And my OSSC settings that work pretty much for all games (some games will look better with a higher pre-ADC like THPS4):
480p->line2x->upsample2x->DTV mode with custom timings (below)->DVI output
Ok so bear with me on these color settings, because I’m still trying to optimize things to look both correct and “good,” at least to my eyes on my monitor. I’m aware that messing with the offsets may not be a great idea and that my pre-ADC gain is pretty high, but again, the picture just looks good at these settings. I just have a gut feeling things needed to be brighter, but maybe it’s something else, like modern games’ color and lighting design, that makes me feel this way.
It also might depend on the game being played, as I showed earlier that the PS2 and Gamecube versions of THPS4 are brighter and thus clipped compared to the Xbox version. So it could be possible that stock settings are good for one game but not another, and at this point I would reiterate my idea of using your gut as a factor to help determine if things look right. If you change a setting and it feels more real or transparent, then maybe lessen the significance of another idea you believed was more “correct.”
I sort of realized that the color gain settings weren’t helping much because they were adding unwanted green due to the way the OSSC scales component chroma values (it increases the absolute value on the -128 to +128 scale, which means both ends scale outwards, negatively and positively; there’s a sketch of this below, after the old settings). So instead I found that just a simple tick to Pb offset 130 or 131 and some extra overall gain is all I need to get the picture looking great. I think it’s possible that, if the option were there, positive-only scaling of blue chroma gain would be more correct, but the offset solution is the best tool I have for getting there. Sometimes I need no extra offset, so I judge based on each game. This is what I usually use if I need more:
Video LPF: Off
Color space: Auto
R/Pr offset: 128
G/Y offset: 128
B/Pb offset: 130 or 131 (usually will increase to between 129 and 132 depending on the game.)
R/Pr gain: 26
G/Y gain: 36
B/Pb gain: 26
Pre-ADC gain: between 8 and 11 depending on how you’re feeling. If you want “perfect” gain without white clipping, set pre-ADC gain to 8 and G/Y gain to 39. This should look good for many games, but you may, like me, want more overall brightness, and pre-ADC gain is a quick and effective way of doing this.

Old settings:
Video LPF: Auto
Color space: Rec. 601
R/Pr offset: 130
G/Y offset: 128
B/Pb offset: 130
R/Pr gain: 35
G/Y gain: 26
B/Pb gain: 47
Pre-ADC gain: 9/10/11 depending on the game or how I’m feeling. The RGB balance changed slightly when changing pre-ADC gain, at least when I was testing with the test DVD, so the other offset and gain values would not remain perfect for all pre-ADC gain levels. I made the above settings based on a pre-ADC gain of 11, but things still look about the same at lower gain levels.

I had a run of Burnout 3 with the above settings and immediately felt the color balance was more transparent and realistic compared to the other profiles I had been working on. It was hard getting everything exactly right with the color filter card that came with the Digital Video Essentials DVD (though the filter did help a lot for getting blue and red gain correct), so I kind of started eyeballing it in order to get white balance correct. That required me to touch offsets, but if there’s a better way to achieve the right colors I’m open to suggestions. I’ve seen reports that color from component is not quite as good as with RGsB, but I still don’t have enough experience with different output modes and hardware to know. I want to get a SCART cable that might help me enable this in forced progressive mode, but I’m afraid I’d just be wasting money when I already have a decent setup with my modded X360 component cable and OSSC. But the thing I’ve learned to accept is that I just enjoy gaming on the Xbox more when I turn the gain up a couple ticks higher than it should be, and I think that looks better than turning up gamma on my monitor.
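Here is the sketch of the outward chroma scaling mentioned above, a conceptual illustration of the behavior I described rather than the OSSC firmware itself, with a hypothetical positive-only gain for comparison:

```python
# Outward chroma gain (both polarities grow: more blue AND more yellow)
# versus a hypothetical positive-only blue gain that leaves the yellow
# side untouched. Cb is signed around a 128 bias.

def outward_gain(cb: int, gain: float) -> float:
    return 128 + (cb - 128) * gain

def positive_only_gain(cb: int, gain: float) -> float:
    return 128 + max(cb - 128, 0) * gain + min(cb - 128, 0)

for cb in (96, 128, 160):
    print(cb, outward_gain(cb, 1.25), positive_only_gain(cb, 1.25))
# 96  -> 88.0 vs 96.0  (outward deepens yellow; positive-only leaves it alone)
# 128 -> 128.0 vs 128.0
# 160 -> 168.0 vs 168.0
```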
H. sample rate: 780
This works for the SyncMaster 2494SW, but your display may stretch the standard 858 timing to the correct dimension. You want to make sure games using an active picture width of 704 pixels (Halo 2, THPS4) fill a 1.333 space and games like Ninja Gaiden that use the full 720 pixels fill a 1.363 space. Some games like Burnout 3 put a border around the entire 1.333 active image in anticipation of overscan, and this will appear zoomed out on screen. You can solve this by lowering V. active to around 449 or 450, and your monitor should automatically adjust the picture to 1.333 (see the rough numbers below). If it doesn’t, then you may need to touch other settings like H. active, or even use a secondary scaler like a DSC 301 HD if all else fails. You can use perfectly circular objects like the radar HUD in Halo 2 to verify the dimensions of your settings, which once calibrated should then work for every other Xbox game.
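Some rough arithmetic on that V. active trick (illustrative numbers only, assuming the display refits the cropped active window to the same screen area):

```python
# Dropping V. active from 480 to 449 acts like a vertical zoom that can
# push a safety border like Burnout 3's offscreen.
v_full, v_cropped = 480, 449
zoom = v_full / v_cropped
print(f"vertical zoom: {zoom:.3f}")            # ~1.069, about a 7% zoom
print(f"lines cropped: {v_full - v_cropped}")  # 31
# If the horizontal side needs the same treatment to keep the pixel
# aspect ratio intact, the matching H. active would be about:
print(f"matching H. active: {720 / zoom:.0f}")  # ~674
```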
The following settings should be adjusted for picture centering and, if you desire, to ensure the entire active picture of games using the full 720×480 space (Ninja Gaiden) can be seen. The sync length and back porch settings may be slightly different for your setup. There are also other ways of changing picture position in the OSSC under “sync opt.” in case these fail to work properly, but note that sometimes when you change a setting like back porch it won’t register on screen until you change h. sample rate back and forth.
H. sync length: 48
H. back porch: 41
H. active: 720
V. sync length: 6
V. back porch: 30 (Back porch and sync length settings may need to change slightly depending on your V. active.)
V. active: 480 (or 449 if filling the display area for bordered games like Burnout 3)
Sampling phase: 180 (this makes no difference in my case but it may depending on your setup)

So the 85% number then is in reference to the full raster? I wasn’t sure if overscan percentages were meant to be a % of the full raster or a % of the active picture area. I don’t know why the Wikipedia article doesn’t make it clearer what exactly is being overscanned.
I’m still confused about the Xbox output resolution and overscan. So the full NTSC raster is 858 x 525, and the visible picture is 710.85 x 486. The Xbox output resolution is 720 x 480. Does this mean that if you had a CRT that showed the full raster without any overscan, there would be a thicker border than normal around the Xbox image, right? If so, why does Microsoft recommend an 85% overscan when the image is already smaller than the full raster? Or is that just a reference to 85% overscan of the full raster, not the 720 x 480 Xbox image? Or is the 720 x 480 image blanked automatically so it fills about the same space as the 710 x 486 NTSC visible area, or does the Xbox output image have this blanking information in it?
Edit: So the impression I’m getting is that there is a minimum blanking interval forced by the CRT, but the source signal can add additional blanking.
In analog television systems the vertical blanking interval can be used for datacasting (to carry digital data), since nothing sent during the VBI is displayed on the screen; various test signals…
This suggests that any kind of data sent outside the regular data interval of 710.85 samples, even if it’s formatted as video information, will not be projected by the CRT during the blanking period. However,
On many consoles there is an extended blanking period, as the console opts to paint graphics on fewer lines than the television would natively allow, permitting its output to be surrounded by a border. On some very early machines such as the Atari 2600, the programmer is in full control of video output and therefore may select their own blanking period, allowing arbitrarily few painted lines.
This then suggests that there is the ability to add extra blanking to the signal at any point. So in a game like Pong for instance, you could have any area that is known to never have active pixels be defined as a blank rather than as actual data, which I assume would lower the data processing need for the GPU.
https://en.wikipedia.org/wiki/Vertical_blanking_interval

So to answer my own question, the Xbox kind of does and doesn’t supply the blanking interval, and the 85% overscan is in reference to the 720 x 480 picture, though this was more of a safe number for the worst case scenario, as I believe CRTs usually overscanned no more than 5%. The Xbox sends data in a way that conforms to the minimum blanking interval set by the CRT, meaning there are valleys of non-data between each scanline (I think…). If the Xbox tried to supply data for a longer period on each scanline, it would simply be ignored by the CRT. The interesting thing is that this actually does happen for 720×480 signals from the Xbox; the extra 9 samples outside of the 710.85 (711) active picture area couldn’t naturally be seen on a CRT, even without overscan, unless you were to calibrate the CRT’s blanking interval to show this data. And some Xbox games actually do output video data over these 9 extra pixels, like Ninja Gaiden, while most games don’t. But the impression I’m getting is that most Xbox games that were authored “normally” are only using 704 x 480 pixels for active information, since this matches the aspect ratio of the 711 x 486 matrix (“1.46”, which when converted to real space with square pixels is 4:3 = 1.33), ensuring that if overscan is disabled the picture will still look perfectly 4:3 on the screen. Also, fun fact: the top and bottom 3 lines would be blank when playing Xbox on a CRT with overscan disabled, due to the actual area being 486 lines tall while the Xbox output signal is 480 lines tall.
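For reference, here is a quick sketch of where those numbers come from, using the standard BT.601 constants (the 52.655 µs active-line figure is the commonly cited value):

```python
# NTSC/BT.601 line arithmetic behind the 710.85 figure.
F_SAMPLE = 13.5e6     # Hz, BT.601 luma sampling rate
LINE_TOTAL = 858      # samples per full line, blanking included
ACTIVE_US = 52.655    # microseconds of analog active video per line

active_samples = ACTIVE_US * 1e-6 * F_SAMPLE
print(f"analog active width: {active_samples:.2f} samples")            # ~710.84
print(f"line period: {LINE_TOTAL / F_SAMPLE * 1e6:.3f} us")            # ~63.556 us
print(f"extra digital samples vs analog: {720 - active_samples:.0f}")  # ~9
print(f"704 x 480 AR: {704 / 480:.3f} vs 711 x 486 AR: {711 / 486:.3f}")  # 1.467 vs 1.463
```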
Info on format conversion: http://xpt.sourceforge.net/techdocs/media/video/dvd/dvd04-DVDAuthoringSpecwise/ar01s02.html
March 28, 2021 at 9:58 AM in reply to: Gamecube 480p 2x mode aspect ratio not 4:3. Any ideas? #46279

I wonder if there is a tag on the output EDID that the TV is latching onto. I mean, if you set the active area to something random like 1500×300 and h. sample rate to something very low like 300, is it still just resizing the image to the same aspect ratio as it would otherwise? If that’s happening, the TV is definitely latching onto an identifier on the output, like it thinks it knows better. It’s like if a TV saw “DTV” in the EDID and then proceeded to ignore literally everything else about the incoming data. Maybe there’s a way to change a part of the output EDID on the OSSC to fix this, but that’s a firmware feature request.
I went back and saw you mention it was sizing correctly over passthrough but not line2x. Do those settings also do nothing but move the picture for both as well?
March 28, 2021 at 6:12 AM in reply to: Gamecube 480p 2x mode aspect ratio not 4:3. Any ideas? #46276

Harrumph’s thread said that some games like Luigi’s Mansion were shipped with the wrong pixel aspect ratio and would even look wrong direct to a CRT, but to counteract this, either h. active and/or h. sample rate should help resize the image. What exactly happens on screen when you change those settings?