“Then one can calculate the needed H.Samplerate (given the fixed linecount and H/V. refresh coming from OSSC). However, as this approach is dependent on scanline output and V.Refresh, it could be different from console to console.”
Is there a way to dial in H.Samplerate based on the information shown on the OSSC's display, without trial and error?
Whenever a source provides neither a checkerboard pattern nor a way to pause moving images, fine-tuning seems nearly impossible to me.
The post you quote is about adjusting the samplerate to get a non-compatible screen to display the OSSC image (i.e. to achieve a more compatible output pixel clock rate), not about achieving optimized sampling for a specific console.
To answer your question: no, there is no way to determine the optimal samplerate for a console based only on the information the OSSC can extract.
You need to know the pixel clock of the source; dividing it by the line rate gives the optimal number of samples per line. Of course, a very precise measurement of the line frequency would remove much of the guesswork, so long as the pixel clock is known.
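As a sketch of that division, here is a minimal calculation. The numbers used are the standard BT.601 SD case (13.5 MHz sampling clock, NTSC line rate of about 15.734 kHz), chosen only as a well-known illustration rather than the clock of any particular console:

```python
def optimal_samplerate(pixel_clock_hz: float, line_rate_hz: float) -> int:
    """Divide the source pixel clock by the horizontal line frequency
    and round to the nearest whole number of samples per line."""
    return round(pixel_clock_hz / line_rate_hz)

# BT.601 SD example: 13.5 MHz clock, ~15.734 kHz NTSC line rate
# gives the familiar 858 total samples per line.
print(optimal_samplerate(13_500_000, 15_734.26))
```

The same formula applies to any console once its pixel clock is known from documentation or measurement; the OSSC alone only supplies the line rate side of the division.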