x2 mode works, but there’s a flickering image because of the deinterlacing. It’s especially annoying when reading text… I didn’t notice any better image, so why should I use x2? I’ve read that some people use x2… what’s the benefit? Would the quality only be noticeable on larger TVs?
I think a lot of people end up using line2x on 480i/576i because it’s the one mode their TV will accept. Support for 480p is pretty much universal, but there are lots of displays and devices that don’t accept 480i/576i over HDMI, even though it should be perfectly fine. Plus, there’s an issue with some metadata or flags when using 480i/576i that can cause incompatibility with passthrough; it’s fixed in firmware 0.84, which I don’t think is out yet. (It happened to me with my Onkyo AVR, and to several more people with Denon AVRs.)
(x3, x4 and x5 don’t work.)
Or does it depend on the TV’s model ?
If you’re still talking about 480i/576i, there’s no line5x available for those modes; and a lot of TVs (especially 1080p and lower-resolution models) don’t like the resulting 1440i/1728i or 960p/1152p modes, because they’re not HDTV modes (and they probably have unusual refresh rates). You can sometimes get a video processor, like the Extron DSC 301 HD, that can handle them, but TVs usually won’t like them.
If, instead, you’re talking about 240p/288p, the problems are similar; you probably have a picky display. (Also, the manual says that line3x/4x/5x generate non-standard modes that may not be accepted by consumer TVs.)