But wouldn't you have the "standard" problem in both cases? (calibrator vs. signal generator)
On July 14, 2020 10:35:04 AM "Dave Daniel" <kc0wjn@...> wrote:
Quis custodiet ipsos custodes? ("Who watches the watchmen?")
On what basis is a “well calibrated” instrument calibrated? How accurate and with what precision was it calibrated? Against what standards?
The problem is that the calibration instruments, or the instruments used to calibrate the instruments, etc., need to be pretty much dead-on with respect to the actual values being measured. Metrology labs and other people who offer calibration services calibrate their calibration instruments such that they are traceable back to a root source, such as NIST.
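As a rough illustration (not any lab's actual procedure), each link in a traceability chain adds its own uncertainty, and for independent error sources these are conventionally combined by root-sum-square. The percentages below are made-up example figures:

```python
import math

def chain_uncertainty(step_uncertainties_pct):
    """Combine independent per-step calibration uncertainties (in percent)
    by root-sum-square; the coarsest link dominates the total."""
    return math.sqrt(sum(u * u for u in step_uncertainties_pct))

# Hypothetical chain: NIST standard -> metrology lab -> bench calibrator
total = chain_uncertainty([0.001, 0.01, 0.1])
print(f"combined uncertainty: {total:.4f}%")
```

Note how the result is barely worse than the last (coarsest) step alone, which is why each instrument in the chain needs to be substantially better than the one it calibrates.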
I try to keep one independently calibrated 'scope (my 2465B) around for measurement comparison. But then there are my 8566B, my 3456A, and my 8660D, which are uncalibrated.
On Jul 14, 2020, at 10:13, David Berlind <david@...> wrote:
Question for the hive:
I have a pile of scopes here that I'll be looking to calibrate at some point as a part of their resurrection. I see that the primary function of some Tek calibration fixtures is to generate square and sine waves. How is this significantly different from what a well-calibrated signal/function generator can do?