Currently the OSSC’s main status screen reports the following three values:
– lines/frame
– lines/second
– frames/second
Clearly, if you know any two of those, you can calculate the third.
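(For concreteness, and purely as an illustration with made-up numbers rather than actual OSSC readings, the relation is just lines/second = lines/frame × frames/second:)

```python
# Purely illustrative: the three values are related by
#   lines/second = lines/frame * frames/second
# The numbers below are hypothetical, not actual OSSC readings.
lines_per_frame = 525            # hypothetical total line count
frames_per_second = 59.94        # hypothetical refresh rate

lines_per_second = lines_per_frame * frames_per_second
print(lines_per_second)                      # 31468.5
print(lines_per_second / lines_per_frame)    # recovers 59.94
```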
Which of these does the OSSC measure “most accurately”? I’m guessing lines/frame, since it’s a small integer?
But between lines/second and frames/second, which does it measure more accurately (in terms of decimal places, percentage uncertainty, absolute Hz, or whatever)? I ask because it would be nice to have the option of displaying one of those two frequency values with more than just four digits (two decimal places), though that extra precision would only make sense if the additional digits were accurate.
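To make the percentage-uncertainty angle concrete: if lines/frame really is an exact integer count (my assumption, not something I’ve confirmed), then multiplying or dividing by it adds no error of its own, so lines/second and frames/second would carry the same relative uncertainty, and the extra digits would be equally meaningful (or meaningless) for either. A rough sketch of that reasoning, again with made-up numbers:

```python
# Sketch of how relative (percentage) uncertainty carries over,
# under my assumption that lines/frame is an exact integer count.
# All numbers here are hypothetical, not actual OSSC measurements.
lines_per_frame = 525            # assumed exact, so it adds no error
line_rate_hz = 31468.5           # hypothetical measured line rate
line_rate_rel_err = 1e-5         # hypothetical relative uncertainty of that measurement

frame_rate_hz = line_rate_hz / lines_per_frame
frame_rate_rel_err = line_rate_rel_err   # dividing by an exact constant preserves relative error

print(f"line rate:  {line_rate_hz:.3f} Hz +/- {line_rate_hz * line_rate_rel_err:.3f} Hz")
print(f"frame rate: {frame_rate_hz:.5f} Hz +/- {frame_rate_hz * frame_rate_rel_err:.5f} Hz")
```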