Re: VSWR tester


Dave Brown <tractorb@...>
 

Denis-

What you have is basically a return loss bridge with a built-in reference-arm
termination of 50 ohms plus an RF detector on the output port. So you get a
DC signal out of the thing that is proportional to the mismatch (with respect
to the reference termination) seen at the test port. Unfortunately, the
detector has a conversion characteristic that needs to be accounted for, as
detectors are definitely non-linear except perhaps over a narrow level range.
For a given (and constant) drive level at the swept RF source port, you
will get two extremes of DC output as the load on the test port is varied:
zero with a good 50 ohm load, and maximum DC with a complete mismatch, i.e.
an open or short circuit. These extremes correspond to very high return loss
(the well-matched condition) and zero return loss (a very bad mismatch).

The easiest way to calibrate the thing for all the 'in-between' values of
return loss (or VSWR; it's an easy conversion, right?) is to establish
reference calibrations on the scope screen with 'known return loss'
terminations.
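(Since Dave calls the return-loss-to-VSWR conversion "easy" but doesn't spell it out, here is a minimal sketch of the standard formulas. Both directions go through the reflection coefficient magnitude; the function names are my own.)

```python
import math

def return_loss_to_vswr(rl_db: float) -> float:
    """Convert return loss (dB, positive number) to VSWR."""
    gamma = 10 ** (-rl_db / 20)      # reflection coefficient magnitude
    return (1 + gamma) / (1 - gamma)

def vswr_to_return_loss(vswr: float) -> float:
    """Convert VSWR (e.g. 2.0 for 2:1) to return loss in dB."""
    gamma = (vswr - 1) / (vswr + 1)
    return -20 * math.log10(gamma)
```

For example, a 2:1 VSWR works out to about 9.5 dB return loss, and 20 dB return loss to about 1.22:1.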

This may sound difficult, but all you need is an adjustable attenuator (it
must be good to excellent over the frequency range of interest) plus an
open- or short-circuit termination. Connect the measuring port on the
autotester to the input of the attenuator, ideally directly, and the open or
short to the attenuator output. Then just set the attenuator for the
required return loss.

But bear in mind that the attenuator introduces TWICE its indicated value of
return loss. With, say, 6 dB in the attenuator, you have 12 dB return loss,
etc. Why TWICE? Think of it as 6 dB on the way out and another 6 dB on the
way back.
So you can set up whatever you want for maximum deflection on screen to
correspond to the worst-case expected return loss, and (usually) a few
other in-between levels as well.
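The attenuator-plus-open calibration points above can be tabulated with a short sketch (assuming an ideal attenuator and a perfect open, so the return loss is exactly twice the attenuator setting):

```python
import math

def cal_point(attenuator_db: float):
    """Return loss and VSWR presented by an attenuator terminated
    in an open or short: the signal passes through it twice."""
    rl = 2 * attenuator_db
    gamma = 10 ** (-rl / 20)
    vswr = (1 + gamma) / (1 - gamma)
    return rl, vswr

for a in (3, 6, 10):
    rl, vswr = cal_point(a)
    print(f"{a} dB attenuator -> {rl} dB return loss, VSWR {vswr:.2f}")
```

So a 6 dB setting gives a 12 dB return loss reference, roughly a 1.67:1 VSWR.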

And if you really want to use just a fixed-frequency source and a millivolt
meter, rather than a scope and sweeper, the same principle applies for
calibration at whatever spot frequencies you care to use.

If you tend to have particular worst-case VSWR values that you want on
screen (or on the meter), just make up known return loss or VSWR
terminations to calibrate maximum deflection against. E.g. use a 100 ohm
load (or, at low frequencies, a pair of 50 ohm loads on a coax tee to make
25 ohms; same VSWR) to cal the screen for 2:1 VSWR max. (approx. 10 dB
return loss), etc. Lower (better return loss) values can be filled in with
the attenuator, as above, to make a screen overlay in the first instance.
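The resistive-load examples above (100 ohms or 25 ohms both giving 2:1) follow from the usual reflection-coefficient formula; a quick sketch to verify any purely resistive calibration load:

```python
def load_vswr(z_load: float, z0: float = 50.0) -> float:
    """VSWR of a purely resistive load z_load on a z0 system."""
    gamma = abs(z_load - z0) / (z_load + z0)
    return (1 + gamma) / (1 - gamma)

# Both 100 ohms and 25 ohms reflect |gamma| = 1/3, hence 2:1 VSWR.
print(load_vswr(100.0))  # 2.0
print(load_vswr(25.0))   # 2.0
```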

The directivity spec is another matter: just consider it to be a limit on
the highest return loss that you can measure. Return loss readings over,
say, 30 dB will not be as accurate as lower values.
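To put a number on that limit, here is a rough uncertainty sketch (my addition, not from the email): the bridge's directivity leakage adds vectorially to the true reflected signal, so the measured reading only bounds the true return loss.

```python
import math

def rl_bounds(measured_rl_db: float, directivity_db: float):
    """Worst/best-case true return loss given a measured value and
    the bridge directivity; leakage adds or cancels vectorially."""
    gm = 10 ** (-measured_rl_db / 20)   # measured reflection magnitude
    gd = 10 ** (-directivity_db / 20)   # directivity leakage magnitude
    worst = -20 * math.log10(gm + gd)
    best = -20 * math.log10(gm - gd) if gm > gd else float("inf")
    return worst, best
```

With the 36 dB directivity from Denis's cal report, a 30 dB reading could correspond to anything from roughly 26 dB to 36 dB true return loss, which is why readings near the directivity figure stop being trustworthy.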

Regards,
Dave Brown
Christchurch, NZ

----- Original Message -----
From: "Denis Cobley" <denis.cobley@newteksupport.com>
To: <TekScopes@yahoogroups.com>
Sent: Friday, July 11, 2003 12:38 PM
Subject: [TekScopes] VSWR tester


Hi all
Slightly off subject but there seems to be a good knowledge base on the
group.
We have a Wiltron 97A50 VSWR Autotester to measure VSWR.
Only problem is we have no idea how to interpret the detected output back
to VSWR.
Anyone have a manual, or can anyone give me info on how to convert the
millivolt output to VSWR?
At 100 MHz into an open circuit we get -22 mVDC out; with a 50 ohm term we
get 0 VDC.
We need to know what this means in terms of VSWR (1.3:1, 2:1??).
The cal report that we have with it gives us readings in Directivity >36dB
but this is meaningless in terms of VSWR to us.

Regards
Denis Cobley
