Re: 2465A channel 2 problems


Another question or two for those in the know: At the beginning of the Cal 02 procedure, it says to connect a 0.5 V standard-amplitude signal to CH 1, use the CH 2 Position control to vertically position the "trace" to within 1 division of the center graticule line, and then use the CH 1 Position and Volts/DIV VAR controls to obtain a 10-division horizontal signal. I don't get a "trace," only two dots. With the controls I can position the dots so that one is on the first graticule line and the other is on the tenth.
Question 1: Is this "normal," or am I doing something wrong? I've read and re-read the procedure and can't find any reference to this.

Also, at the beginning of the Cal 01 procedure, it states that "Upon entering Cal 01, the Input Coupling is automatically set to 50 ohm DC..." This doesn't appear to be true for the other Cal procedures (02, 03, 04, etc.), as I had to install a terminator on my generator to get the actual levels called out in the test procedures (monitored on another scope).
Question 2: Is this correct, or am I missing something here as well?

Thanks in advance for any input!


----- Original Message -----
From: "machineguy59 via Groups.Io" <>
To: <>
Sent: Wednesday, October 16, 2019 6:48 PM
Subject: Re: [TekScopes] 2465A channel 2 problems

How does Channel 2 work when using the scope in the "normal" mode? If it seems about right (just not precisely calibrated) when measuring known voltages then the attenuator is switching.
On Wednesday, October 16, 2019, 03:31:47 PM CDT, GerryR <totalautomation1@...> wrote:

A little history first: I acquired the 'scope recently, and everything was working fine. The old battery was dated 07-87. I decided to change the battery and took all known precautions, but somehow screwed up, and the scope now requires re-cal. I have attempted re-cal and went from a "Test 04 Failed 04" to a "Test 04 Failed 02," which is a limit failure. Channel 2 "calibrates" up to the 10 V input (Cal 02), and I get a Limit 130 error no matter what I try, so I assume that is my problem. In the non-calibrate run mode, the Channel 2 input only works up to the 500 mV range (high-impedance input with coax) and, with the 10X probe, up to 5 V. The display range readout works, but in the 1 V, 2 V, and 5 V ranges the channel appears to stay in the 100 mV, 200 mV, and 500 mV ranges, so it appears the attenuator isn't switching properly.

Is there a way to tell whether the problem is the attenuator itself, whether the proper outputs aren't reaching the drivers for the attenuator relays, or whether this is still a calibration problem? I have no way to monitor a "word" from the processor at this point. I did try monitoring Channel 1 at the drivers and comparing it to Channel 2, but found it difficult to get any meaningful data. Any help will be appreciated.
