I think this is something that could be improved in future OSSC revisions.
Of course, the SCART spec and the OSSC documentation are clear about what type of sync signal is expected. And that’s all good and fair – if you fry a device whose power jack is labelled 9V DC center-positive by hooking up a 12V DC supply, that’s your fault.
But I’d argue that the sync level issue is more insidious than that. Some cables have the attenuation resistor in the SCART head, some in the DIN plug. Some consoles output video-level sync and don’t need a resistor at all. Some RGB mods and RGB output devices have a jumper to switch between the two levels, and sync strippers can change the situation yet again. Many makers of cables, RGB mods & superguns have apparently gotten this wrong over the years. A $10 Radio Shack multimeter can’t tell you the amplitude of a sync pulse. And if you get it wrong, nothing may happen at first – the picture looks fine – but a year later some switch, TV or scaler is damaged.
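To make the difference concrete, here’s a rough back-of-the-envelope sketch of why that resistor matters. The component values are assumptions for illustration (a ~5V TTL sync source, the common ~470Ω series resistor, and the standard 75Ω input termination), not values from any specific console or cable:

```python
# Hedged sketch: estimating the sync amplitude the display/scaler sees,
# modelled as a simple voltage divider into a 75-ohm termination.
# Assumed typical values (hypothetical, for illustration only):
#   v_source = 5.0 V  (TTL-level sync from the console)
#   r_series = 470 ohm (attenuation resistor in the cable or plug)
#   r_term   = 75 ohm  (standard terminated video input)

def sync_level_at_input(v_source: float, r_series: float, r_term: float = 75.0) -> float:
    """Voltage-divider estimate of the sync pulse amplitude at the input."""
    return v_source * r_term / (r_series + r_term)

ttl = 5.0
print(f"with 470R resistor: {sync_level_at_input(ttl, 470):.2f} V")  # ~0.69 V, near video level
print(f"resistor missing:   {sync_level_at_input(ttl, 0):.2f} V")    # full 5 V hits the input
```

The point of the sketch: with the resistor in place somewhere along the chain, a TTL sync pulse arrives at roughly video-safe amplitude; with the resistor missing (or doubled up, giving an over-attenuated signal), the picture may still lock while the levels are far outside spec.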
My point is that verifying your sync signal has the correct magnitude is not easy for the average retro gamer. You either need complete knowledge and understanding of the video signal chain in your console, RGB mod & cable, or you need to own – and be able to operate – an oscilloscope to be sure you’re not outputting the wrong levels.
I think adding tolerance for TTL-level sync signals – or, even better, detecting and reporting them – would be valuable.