Currently the OSSC’s main status screen reports the following three values: lines/frame, lines/second, and frames/second.
Clearly, if you know any two of those, you can calculate the third.
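As a quick illustration of that relationship (the numbers below are hypothetical, NTSC-style figures, not taken from the OSSC itself):

```python
# Any one of the three values follows from the other two:
# frames/second = lines/second / lines/frame
lines_per_frame = 525          # total line count (example value)
lines_per_second = 15_734.26   # horizontal scan rate in Hz (example value)

frames_per_second = lines_per_second / lines_per_frame
print(round(frames_per_second, 4))  # ~29.97
```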
Which of these does the OSSC measure “most accurately”? I’m guessing lines/frame since it’s the smallest integer?
But between lines/second and frames/second, which does it measure more accurately (in terms of decimal places, percentage uncertainty, units of Hz, or whatever)? I ask because it would be nice to have the option of displaying just one of those two frequency values with more than the current four digits (two decimal places), though that functionality would only make sense if the additional digits were accurate.
The parameters are primarily calculated from digitizer chip registers, which report the number of 27MHz clocks per line (the longest one, in case the lines are of uneven length for some reason) and the number of lines per frame. However, these are not always the most accurate: the former has some variation, and the latter may be off by one in some rare cases. A more accurate measurement could be done by the FPGA itself (although it uses the same oscillator as its fixed external clock) by counting clocks per frame, which would really be required if additional digits were to be shown in the displayed numbers.
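A sketch of what that implies for accuracy, under stated assumptions: the 27MHz reference comes from the post above, while the specific register counts below are hypothetical 480i-style numbers chosen only for illustration. The point is that an integer clock count has roughly a ±1-count quantization error, so counting clocks over a whole frame shrinks the relative error by a factor of the line count compared to counting clocks over a single line.

```python
F_CLK = 27_000_000  # fixed 27 MHz reference oscillator (from the post)

def rates_from_registers(clocks_per_line, lines_per_frame):
    """Derive the two displayed frequencies from the two register readings."""
    lines_per_second = F_CLK / clocks_per_line
    frames_per_second = lines_per_second / lines_per_frame
    return lines_per_second, frames_per_second

# Hypothetical register values, roughly 480i-shaped:
cpl, lpf = 1716, 525
lps, fps = rates_from_registers(cpl, lpf)   # ~15734 Hz, ~29.97 Hz

# clocks-per-line is an integer, so its +/-1-count quantization error
# gives a relative uncertainty of about 1/cpl. Counting 27 MHz clocks
# across a whole frame spreads that same +/-1 count over lpf times as
# many clocks, reducing the relative error by the same factor.
per_line_err = 1 / cpl           # ~6e-4, i.e. ~0.06 %
per_frame_err = 1 / (cpl * lpf)  # ~1e-6, i.e. ~0.0001 %
```

On these example numbers, a per-frame count would support a couple of extra trustworthy decimal places in the displayed frequency, whereas a per-line count would not (setting aside the accuracy of the 27MHz oscillator itself, which limits both methods equally).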