AM700 FFT LF residual noise Problems


tss_steve_990
 

Fellow Tektronix lovers:

 

As I indicated in my previous post, I am trying to learn how to run the AM700 that I recently repaired, a job that involved a massive replacement of leaking electrolytics, the accompanying trace repair, and exhaustive cleaning.

 

I was running through the FFT performance verification given in the service manual and noticed much higher low-frequency residual noise in channel A than in channel B. At frequencies above about 200 Hz the two channels were essentially identical.

 

I changed the x axis on the FFT analyzer from linear to log scale so I could see the low-frequency residual more clearly, and the LF discrepancy became even more apparent once the low frequencies were no longer all squeezed together as they are on the linear scale.

With the bin width at 46.87 Hz/bin and the window set to Kaiser-Bessel, there is a very large discrepancy between the A channel and the B channel.

The A channel reads about -64 dBu from DC up to about 200 Hz and then heads down to -140 dBu or so.

The B channel reads about -110 dBu from DC up to about 200 Hz and then heads down to -140 dBu or so.

The B channel also moves around a lot more, even with averaging set to 32.
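In case anyone wants to reproduce the arithmetic away from the instrument, here is a rough sketch of how I understand a Kaiser-windowed, power-averaged FFT in dBu to work. It is not the AM700's firmware; the sample rate, FFT length, Kaiser beta, and the 0 dBu = 0.7746 V rms reference are all my assumptions:

```python
# Rough offline sketch of a Kaiser-windowed, power-averaged FFT in dBu.
# NOT the AM700's firmware; fs, n, beta and the dBu reference are assumptions.
import numpy as np
from scipy.signal import windows

fs = 48_000            # assumed sample rate, Hz
n = 1024               # 48 kHz / 1024 points ~ 46.87 Hz/bin
averages = 32          # power-average 32 records, as set on the instrument
beta = 12.0            # assumed Kaiser shape parameter ("Kaiser-Bessel")
dbu_ref = 0.7746       # 0 dBu = 0.7746 V rms

w = windows.kaiser(n, beta)

rng = np.random.default_rng(0)
x = 1e-6 * rng.standard_normal(averages * n)        # stand-in noise record, volts

power = np.zeros(n // 2 + 1)
for k in range(averages):
    seg = x[k * n:(k + 1) * n] * w
    amp = 2.0 * np.abs(np.fft.rfft(seg)) / w.sum()  # window-corrected peak amplitude
    power += amp ** 2                               # average powers, not dB values
vrms = np.sqrt(power / averages) / np.sqrt(2)       # per-bin rms volts

dbu = 20 * np.log10(np.maximum(vrms, 1e-12) / dbu_ref)
freqs = np.fft.rfftfreq(n, 1 / fs)
print(f"bin width = {fs / n:.2f} Hz")
for i in range(5):                                  # first few low-frequency bins
    print(f"{freqs[i]:7.1f} Hz : {dbu[i]:6.1f} dBu")
```

Feeding a captured record from each channel through something like this and comparing it with what the front panel shows would at least tell me whether the difference is in the signal or in the instrument's math.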

 

When I used the big knob to expand the measurement, the display began to look much less crazy, and the two channels began to take on much more similar characteristics:

At 0.4687 Hz/bin, 32 averages, Ch 1 (channel A): the baseline residual is -157 dBu; 60 Hz = -126 dBu; 120 Hz = -141 dBu; 180 Hz = -130 dBu.

At 0.4687 Hz/bin, 32 averages, Ch 2 (channel B): the baseline residual is -160 dBu; 60 Hz = -142 dBu; 120 Hz = -143 dBu; 180 Hz = -128 dBu.

 

With the exception of the 60 Hz spike, which is 16 dB higher in channel A than channel B, the rest of the residual is about the same.

 

That doesn't explain why, at 46.87 Hz/bin, channel A reads 46 dB higher than channel B.

 

At small bin widths, the -160 dBu level at 10 Hz seems to make sense, but as the bin size increases, the displayed value climbs in a way that looks erroneous to me, whether that is a display problem or a calculation problem.

 

On channel A, the 10 Hz level increases as follows:

2.344 Hz/bin = -125 dBu

4.687 Hz/bin = -78 dBu

9.375 Hz/bin = -68 dBu

46.87 Hz/bin = -64 dBu

 

On channel B, 46.87 Hz/bin = -105 to -120 dBu (it moves up and down a lot even with averaging), but that is WAY lower than channel A.
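As a rough sanity check on the numbers above, I assumed the residual behaves like broadband noise, in which case the power in each bin should only grow in proportion to the bin width, i.e. by 10*log10(bandwidth ratio). Going from 2.344 Hz/bin to 46.87 Hz/bin is 20:1, which should raise the 10 Hz reading by only about 13 dB, nowhere near the 61 dB climb I see on channel A:

```python
import math

# Expected rise of a flat, broadband noise floor as the bin width grows,
# compared with the 10 Hz readings I measured on channel A.  The broadband
# assumption is mine; a coherent low-frequency component would not scale this way.
bins = [2.344, 4.687, 9.375, 46.87]        # Hz/bin settings
chan_a = [-125.0, -78.0, -68.0, -64.0]     # measured 10 Hz level, dBu

ref_bw, ref_level = bins[0], chan_a[0]     # take the narrowest bin as the reference
for bw, level in zip(bins, chan_a):
    expected = ref_level + 10 * math.log10(bw / ref_bw)   # white-noise scaling
    print(f"{bw:6.3f} Hz/bin: measured {level:6.1f} dBu, "
          f"expected about {expected:6.1f} dBu if broadband")
```

By that reckoning the widest bin should sit somewhere around -112 dBu, which is roughly where channel B lands; channel A's -64 dBu is about 48 dB above that, which is why I keep going back and forth between a hardware explanation and a calculation one.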

 

My first impression was that there was an IC in the channel A measurement path that had excessive 1/f noise, but maybe this is a math (programming) problem unrelated to hardware.

 

Any light that anyone can shed on this discrepancy between the two channels would be greatly appreciated.

 

 

Steve Hogan

 

(714) 871-6636

 

 

 
