On Tue, Dec 8, 2020 at 01:16 AM, Dave Peterson wrote:
I've considered responding re. calibration for a while but decided to wait and see what others would say - call me lazy.
So far, I haven't seen mentioned what I wanted to say:
In a strict sense, "calibration" means verifying against a standard, nothing more. For electronic instruments, the standard would be the specification of the instrument to be calibrated (DUT), *not* the specification of the instrument one is using to verify the DUT with.
A DUT is considered "in spec" or "calibrated" if it performs within its published specifications (all of them). IOW, if the published spec for vertical sensitivity of a 'scope is +/- 2% and it performs within +/- 1.5%, it is *within spec* in that respect (or parameter, or feature, whatever you like to call it). Since there are only two possibilities: *within spec* or *not* within spec, if it performs within say +/- 0.01%, it's not "more within spec" and in a sense, not even "better calibrated". It's just more precise.
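To illustrate the binary nature of "in spec" - this is just a toy sketch with hypothetical numbers (a made-up +/- 2% vertical sensitivity spec), not anyone's actual calibration procedure:

```python
# Toy illustration: "in spec" is a yes/no check against the DUT's own
# published tolerance. The numbers below are hypothetical.
SPEC_TOLERANCE_PCT = 2.0  # assumed published spec: +/- 2%

def in_spec(measured_error_pct: float) -> bool:
    """A parameter is either within the published spec or it is not."""
    return abs(measured_error_pct) <= SPEC_TOLERANCE_PCT

print(in_spec(1.5))   # True  - within spec ("calibrated")
print(in_spec(0.01))  # True  - not "more" within spec, just more precise
print(in_spec(2.3))   # False - out of spec
```

The point of the boolean return type is exactly the point above: there is no "more True" - a +/- 0.01% result and a +/- 1.5% result both come out True against a +/- 2% spec.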
During (or after) a restoration, many hobbyists tend to start adjusting things like the low-voltage power supplies, because they have a pretty good DVM - but nothing else.
It's perfectly possible that the instrument was within spec ("calibrated") before that but no longer is after adjusting - and they have no way to check or correct that, because they'd need different instruments.
Many calibration labs distinguish between "calibrating" and "adjusting", with very different pricing. For many people (and in everyday speech) calibrating means the same as adjusting. Strictly speaking this isn't so, as I explained above. You need to be aware of that.
It would seem to make sense to try and adjust close to perfect but that's not a requirement to be "in calibration".
Naturally, adjusting as precisely as possible to the middle of the tolerance band (aiming at +/- 0%) seems to make sense, to allow as much drift as possible before you lose calibration status - but even that's *not* true in all cases. It may depend on temperature behavior or aging, for instance. If e.g. a particular model of quartz oscillator is known to age toward a lower frequency, it may make sense to adjust it a bit higher than exactly +/- 0%.
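The aging argument can be put in numbers. A hypothetical sketch (made-up spec and aging rate, not the data for any real oscillator): if the part is known to drift downward, set it high by half the expected drift over the interval, so it drifts *through* nominal instead of away from it.

```python
# Hypothetical numbers throughout - just to show the reasoning.
SPEC_PPM = 5.0             # assumed published tolerance: +/- 5 ppm
AGING_PPM_PER_YEAR = -2.0  # assumed aging: frequency falls ~2 ppm/year
INTERVAL_YEARS = 1.0       # assumed calibration interval

def adjustment_target_ppm() -> float:
    """Offset the set point by half the expected drift over the interval."""
    return -AGING_PPM_PER_YEAR * INTERVAL_YEARS / 2.0

def end_of_interval_error_ppm(target: float) -> float:
    """Error at the end of the interval, starting from the set point."""
    return target + AGING_PPM_PER_YEAR * INTERVAL_YEARS

target = adjustment_target_ppm()          # +1.0 ppm: start a bit high
end = end_of_interval_error_ppm(target)   # -1.0 ppm: end symmetrically low
```

Set dead on +/- 0% instead, and the same part ends the year at -2 ppm - still in spec here, but with half the margin eaten in one direction only.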
As another example, it may make very little sense to adjust to within, say, 0.1% where the spec says +/- 2%: the parameter could drift to, say, 0.3% within minutes (because of temperature variations and the like), yet stay within calibration (= spec) for a year, because the drift over the specified calibration interval would only be, say, +/- 1%.
BTW, I don't think it makes sense to calibrate, let alone adjust, a healthy 465/475 twice a year, as was suggested, unless it lives in an unhealthy or very unstable environment, because it's not expected to drift out of spec within such a relatively short period. Checking (calibrating!) a restored instrument for the first time after a few months may make sense, though.
Note: The drift seen in many old components, like carbon composition resistors, far exceeds the normal drift that the original calibration interval is based on.