Topics

Newbie material on HF+(more)

Pete Smith
 

OK, changing the fftOffset to -93 brought the amplitude of my XG-3 signal source to about the correct value, but the problem now is that I cannot see the noise floor on 20M and above, because the GUI's scale only goes down to -110. Is there another config setting that I can modify to make the scale go to ~ -140 dBFS, even if that means giving away 0 to -40 at the top end?

73, Pete N4ZR
Check out the Reverse Beacon Network 
at <http://reversebeacon.net>, now 
spotting RTTY activity worldwide. 
For spots, please use your favorite 
"retail" DX cluster.
On 4/5/2018 10:17 PM, jdow wrote:

You don't - from the GUI. Full scale means the magnitude of a sample coming out of the USB data stream is 1.00000.

If you are adventuresome you can edit the SDRSharp.exe.config file to change this line:
<add key="fftOffset" value="-40.0" />
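
For reference — and this is an illustration, not SDR#'s actual code — dBFS is just decibels relative to that full-scale sample magnitude of 1.0:

```python
import math

def dbfs(magnitude):
    """Decibels relative to full scale, where full scale is magnitude 1.0."""
    return 20 * math.log10(magnitude)

print(dbfs(1.0))  # a full-scale sample: 0 dBFS
print(dbfs(0.5))  # half of full scale: about -6 dBFS
```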

{^_^}

On 20180405 18:15, Pete Smith wrote:
But how do you "set" full scale? Clearly right now it is not correct. I'm looking for a measurement scale for my noise level and signal strengths that is comparable with other SDRs and hardware radios.

On 4/5/2018 7:10 PM, David Eckhardt wrote:
If 0 dBFS corresponds to 0 dBm, an S-9 signal (which is -73 dBm) should read -73 dBFS. If you have set full scale to -20 dBm, it should read -53 dBFS. Likewise, if you set full scale to -73 dBm, it should read 0 dBFS.
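
Dave's arithmetic can be sketched in one line — the full-scale reference levels below are assumptions chosen for illustration:

```python
def dbm_to_dbfs(signal_dbm, full_scale_dbm):
    """dBFS is the signal level relative to whatever absolute
    level full scale happens to represent."""
    return signal_dbm - full_scale_dbm

# S-9 is -73 dBm by convention
print(dbm_to_dbfs(-73, 0))    # full scale at 0 dBm   -> -73 dBFS
print(dbm_to_dbfs(-73, -20))  # full scale at -20 dBm -> -53 dBFS
print(dbm_to_dbfs(-73, -73))  # full scale at -73 dBm ->   0 dBFS
```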

Dave - WØLEV



On Thu, Apr 5, 2018 at 8:23 PM, Pete Smith <n4zr@...> wrote:

    Please pardon the extreme newbie questions - I bought an Airspy HF+
    recently, intending to use it on the IF of my K3 for spectrum display.  Of
    course, the K3 is off at Watsonville, so I've been experimenting,
    comparing the Airspy with a Red Pitaya and a QS1R.

    From what I've seen I like SDR# a lot.  Is there anywhere an SDR# manual
    specifically for the HF+?  I keep running into things in the Quick Start
    that are not germane to my HF+, running the latest firmware.

    Also, can anyone point me toward an explanation of the dBFS scale used for
    signal strength with the Airspy HF+? I've established that my Elecraft
    XG-3's S-9 signal equates to 20 dBFS on SDR#, and its -33 dBm (equivalent
    to 40 over 9) more than "pins the meter" on SDR#.  Is this normal?





-- 
Dave - WØLEV
Just Let Darwin Work







prog
 

On Fri, Apr 6, 2018 at 05:57 am, Pete Smith wrote:
changing the FFToffset to -93 brought the amplitude of my XG-3 signal source to about the correct value
For those who are late to the party: Don't touch the configuration file until you fully understand what you are doing. What you may think is proper calibration is not proper at all.
There's a very good reason why SDR# uses dBFS and not absolute dBm. This is mostly related to the nature of modern radios, with complex gain distribution, ADCs, DSP, and AGC loops. These are already hard to explain to experienced RF engineers, so I can't blame the end users.
Long story short: Revert these changes and learn how to use dBFS and SNR measurements. SNR is the only thing that should matter to you.

PS: "Q: But X software has dBm. A: What happens when the DSP decides to reduce the gain to increase the SNR?"
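
prog's PS can be illustrated numerically: if the DSP trims the gain, every absolute reading moves, but the signal-to-noise ratio does not. This is a toy sketch, not how SDR#'s AGC actually works:

```python
def readings_after_gain_change(signal_dbfs, noise_dbfs, gain_change_db):
    # A gain change moves the signal and the noise floor by the same amount...
    new_signal = signal_dbfs + gain_change_db
    new_noise = noise_dbfs + gain_change_db
    # ...so any dBm calibration tied to the old gain is now wrong,
    # while SNR (the difference) is unchanged.
    return new_signal, new_noise, new_signal - new_noise

print(readings_after_gain_change(-30.0, -90.0, -10.0))  # (-40.0, -100.0, 60.0)
```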

Pete Smith
 

This is all very well, and I am sure that it is technically correct, but what I am primarily interested in at this moment is measuring my background noise level and tracking what will hopefully be improvements as the power company does its thing. So yes, it's all about SNR, but the question is relating what I see now to what I saw last month or last year using a good hardware receiver.

I am also somewhat averse to being told "you don't know enough" when the documentation lags so far behind the hardware and software development.  I don't have an engineering degree, which makes product-specific documentation all the more important.

73, Pete N4ZR

Dana Myers
 


If you're interested in measuring your absolute background noise level,
I'd think you're going to need something to calibrate your receiver - regardless
of which receiver you're using. I mean, it sounds like what you really want
is an instrument more than a receiver, right?

73,
Dana  K6JQ

Alan G4ZFQ
 

what I am primarily interested in at this moment is measuring my background noise level
Pete,

If you looked back at the development notes by "prog" you would see that the first principle was to make the AGC do all the work, with S/N being what matters and true level measurements unsupported. Since release, more improvements have made this even better for most casual users.

But you may now disable the AGC, set the gain as you wish, and use something like HDSDR to calibrate levels.
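
With AGC off and the gain fixed, a single known-level source (such as Pete's XG-3) is enough to turn dBFS readings into approximate dBm at that one gain setting. A minimal sketch — the -53 dBFS reading below is an assumed example value, not a measured one:

```python
def calibration_offset(known_dbm, measured_dbfs):
    """Offset mapping dBFS to approximate dBm, valid only at this gain setting."""
    return known_dbm - measured_dbfs

def dbfs_to_dbm(reading_dbfs, offset_db):
    return reading_dbfs + offset_db

# Suppose the XG-3's -73 dBm output reads -53 dBFS:
offset = calibration_offset(-73, -53)  # -20 dB
print(dbfs_to_dbm(-110, offset))       # a -110 dBFS noise floor -> -130 dBm
```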

73 Alan G4ZFQ

jdow
 

What might constitute a "good" hardware receiver? One might expect, for the expense, that an ICOM IC-756proII would qualify. I can state without reservations that it does NOT qualify as giving a good, accurate signal strength reading. It does not even compensate for its own settings for preamp gain and input attenuation. Readings above its notion of S9 are "roughly (!)" 10 dB per 10 dB. Below S9 the steps are closer to 3 dB than to 6 dB, which makes it all horridly inaccurate, especially when you switch in the attenuation that is appropriate at 75 meters, for example. This is one of my pet peeves and ongoing wars with the ham community. Calibration and ham radio seldom go together. (Those for whom they do tend to read QEX or the former CQ magazine's "Communications Quarterly", or else they have college degrees in the field.) I do note that using manual settings will allow you to get startlingly accurate difference readings between two signal levels even if the absolute values are not identical. This is nice for A/B testing with antennas, for example. A tool that is limited is not a useless tool if you adequately understand the limitations.

And I suppose I just told you that you are mal-educated about things RF. That is curable, and reflects not at all on your intelligence or knowledge in other fields. I would not expect a psychologist to understand how radios work at anything near the level I knew even as a teenager, when I got into this hobby in the very late 50s. I knew a "shrink" in those days who was a ham and treated it as "plug it in and go." Anyway, RF is a rich enough subject that as you get your teeth into it, the learning becomes a reward in itself. It is also something you can put down, satisfied, when you know as much as you feel is worth the effort. From time to time I become overly prolix here and try to generate understandable documents digging deeper into the field. Sometimes I succeed. Nearly 60 years as both a ham and a graduate-level electronics engineer have generated a rich enough field of knowledge that I figure it's worth trying to pass along some of my insights and experience.

{^_^} Joanne


Pete Smith
 

Alan, I've clearly missed a whole block of info - where are these development notes? I suspect my confusion stems in part from the Airspy HF+ and SDR# sharing a developer, or so I've been told.  See, I really *am* a newbie.



Pete Smith
 

That's where the XG-3 comes in -- it has output levels of -107, -73, -33 and 0 dBm, which are adequately stable and accurate for my purposes.  That said, I'm less interested in the absolute level than being able to repeat the measurement conditions from one moment in time to another.



jdow
 

For that the manual settings would work well. And all I can suggest is experiment. I've not played with them, myself.

{^_^}


Leif Asbrink
 

Hello Pete,

This is all very well, and I am sure that it
is technically correct, but what I am primarily
interested in at this moment is measuring my
background noise level and tracking what will
hopefully be improvements as the power company
does its thing. So yes, it's all about SNR, but
the question is relating what I see now to what
I saw last month or last year using a good hardware
receiver.
I am afraid you are not attacking this problem
properly. What you really want to measure is the
system noise temperature. Ideally it should be the
sky noise temperature at the frequency you look at.
In a non-rural area it is expected to be 10 to 20
dB higher - and your power company might help to
bring it down. Also, ferrites on your neighbours'
equipment might help.

The easy way to measure the noise floor temperature
is to assume that your HF+ has NF = 5 dB. Set manual
gain, preamp on, and zero attenuation. Connect a
dummy load to your HF+ and evaluate the power (use
signal diagnostics). Then connect your antenna
and again use signal diagnostics. You might find
30 dB more noise, which would mean that your noise
is 35 dB above room temperature.
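
Leif's dummy-load comparison boils down to simple addition; a sketch under his stated assumptions (NF assumed 5 dB, and a rise large enough that the receiver's own noise is negligible once the antenna is connected):

```python
def antenna_noise_above_thermal(nf_db, rise_db):
    """Leif's quick estimate: the receiver's own floor sits nf_db above
    thermal noise (kT0B). If connecting the antenna raises the displayed
    floor by rise_db, the external noise is roughly nf_db + rise_db above
    room temperature. Approximate; good when rise_db is large."""
    return nf_db + rise_db

print(antenna_noise_above_thermal(5.0, 30.0))  # Leif's example: 35 dB above room temperature
```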

Have a look here:
https://www.itu.int/dms_pubrec/itu-r/rec/p/R-REC-P.372-8-200304-S!!PDF-E.pdf
Figure 2. You can see that 35 dB is what you can expect
at 7 MHz, 30 dB at 14 MHz.

Be aware that disabling the AGC could bring the front
end of your HF+ into saturation, with hopelessly incorrect
results as a consequence.

A better strategy would be to connect your antenna
and to also inject a weak signal. Look for S/N of the
weak signal and insert attenuation until it degrades.
Then evaluate the NF of your system by use of signal
diagnostics. Use the gain setting you found to be required
to avoid saturation from all the strong out-of-band signals.
Find the noise level and then the level of a known signal.
That would give (S+N)/N in a certain bandwidth. From that
you can compute S/N in 1 Hz bandwidth. Assuming S/N is much
larger than one you get the noise floor in dBm/Hz directly
from the signal power and S/N in 1 Hz bandwidth.
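
Leif's recipe above is easy to put into numbers. The signal level, S/N, and bandwidth below are hypothetical values for illustration:

```python
import math

def noise_floor_dbm_per_hz(signal_dbm, snr_db_in_bw, bandwidth_hz):
    """S/N measured in some bandwidth converts to S/N in 1 Hz by adding
    10*log10(bandwidth); the noise floor in dBm/Hz is then the known
    signal power minus that 1 Hz S/N. Assumes S/N >> 1, so that the
    measured (S+N)/N is close to S/N."""
    snr_1hz_db = snr_db_in_bw + 10 * math.log10(bandwidth_hz)
    return signal_dbm - snr_1hz_db

# A -73 dBm signal showing 40 dB S/N in a 500 Hz bandwidth:
print(noise_floor_dbm_per_hz(-73, 40, 500))  # about -140 dBm/Hz
```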

The HF+ gives you dBFS. You could use that, but only if
you make sure there is no out-of-band signal strong enough
to turn down the gain of the HF+ if you are in AGC mode,
or to saturate the front end if you are in manual mode.

I am also somewhat averse to being told "you don't
know enough", when the documentation so lags the
hardware and software development.  I don't have an
engineering degree, which makes product-specific
documentation all the more important.
Hmmm, I have tried to clarify the problem. It is not
really something you should expect from the HF+
documentation; it is not product specific.

Surely, if you insert a filter that ensures that no
out-of-band signal is affecting the gain of your HF+
(or saturating it) and then calibrate the dB scale to
fit your generators, everything would be fine for you,
but you would have to make sure that the same FFT size
is used always. There could also be other limitations.

Better to properly measure your antenna noise temperature. :-)

73

Leif




jdow
 

For what he seems to indicate he wants to do, measure a change in power-line radiated noise, he can work with manual settings. He needs to avoid too much gain; the symptom is sudden high-level spurious noise all over the spectrum with a small change in gain. Note the settings so he can return to them in the future. (And it may be a poor idea to update the HF+ driver between measurements, although he could step back to the old driver for making the measurement.) A good means of taking the measurement from time to time is to copy the sdrsharp folder to a "SDRSharp Noise Measurement" folder. Run the copy of sdrsharp within that folder. Set up and take the measurement. Exit. Then never use that folder again except to start the program, take a measurement without touching settings, and exit. For normal operation use the normal sdrsharp folder.

A second method that he can use is to observe the noise patterns. Take a picture of what the spectrum display looks like with only medium level signals present. Then in the future under about the same conditions he can compare what the spectrum and waterfall look like. Changes often can stand out pretty dramatically.
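
The frozen-folder procedure can be sketched in a few lines. The paths and function name here are hypothetical, and a plain file-manager copy does the same job:

```python
import shutil
from pathlib import Path

def freeze_measurement_setup(working_copy: str, measurement_copy: str) -> None:
    """Clone the whole sdrsharp folder, settings and all. The copy is then
    only ever started, used to take a reading without touching any settings,
    and closed again; day-to-day listening happens in the original folder."""
    src, dst = Path(working_copy), Path(measurement_copy)
    if not dst.exists():
        shutil.copytree(src, dst)

# e.g. freeze_measurement_setup(r"C:\sdrsharp", r"C:\SDRSharp Noise Measurement")
```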

{^_^}


doug
 


I don't think that any receiver since the Collins R388 has had an
accurate S-meter! And most radios I have seen do not give 6 dB per
S-unit results. (I had a Drake 2B receiver years ago that had
real 5 dB calibrations on its meter. A much more useful
calibration!) At any rate, what is the use of dBµV when you are
working in a 50 Ohm system? Nobody cares about microvolts at an
antenna jack in this modern age--not if they have any sense. RF into
a known impedance--50 Ohms--is the standard nowadays, and voltage into
a known impedance equals power: decibels relative to one
milliwatt, or dBm. 0 dBm is 30 dB below 1 Watt. Your receiver
may or may not be capable of that, but this is the
standard of the industry, and if you can calibrate it to
that reference, 0 dBm, then everyone should understand it. Now, s/n
at the receiving end might be a useful number IF THE RECEIVER IS
BASICALLY QUIET. But the basic story has been
around for at least a century: if you can't hear 'em, you can't work 'em!
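
The voltage-into-a-known-impedance arithmetic is easy to make concrete. The two example levels are the conventional -73 dBm S-9 reference and the 100 µV Collins S-9 level mentioned later in this thread:

```python
import math

def microvolts_to_dbm(uv, impedance_ohms=50.0):
    """Power delivered by uv microvolts RMS into a known impedance,
    expressed in dB relative to one milliwatt."""
    watts = (uv * 1e-6) ** 2 / impedance_ohms
    return 10 * math.log10(watts / 1e-3)

print(round(microvolts_to_dbm(50), 1))   # 50 uV into 50 ohms: -73 dBm, the common S-9 reference
print(round(microvolts_to_dbm(100), 1))  # 100 uV into 50 ohms: about -67 dBm
```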

Another thread on the forum points out that you can put a dummy load (50
Ohm termination) on your receiver to establish your receiver noise
floor. Anything over that is external noise.

My personal experience with a power company is that if you can find the
close source of interference, they will fix it. Ham magazines have
devoted many pages over many years to showing you
how to find the source of interference. And you can't expect overnight
service, but I had mine fixed in two weeks. (It was a 6-meter noise.)

--doug, WA2SAY


jdow
 

R-390A was pretty good. But finding one with the original meter is "difficult".

Do you have any idea how difficult it is to generate a calibrated S-meter and keep it calibrated? AGC loop gain control is not log-linear; it's not X volts per dB with X a nice constant. That means a special meter calibration is needed. As for the Drake line: tubes age, fairly rapidly, so stage gain vs AGC voltage will differ from nominal. This will upset the calibration. Basic tube gains vary considerably, which means you have to set up the individual stages fairly accurately. So even if Drake out of the factory was calibrated, I'm not sure any Drake you could buy today would be particularly well calibrated. But it would be better than the average rice box.

There is nothing magical about 50 ohms, although if a box is calibrated with an antenna presenting a 50 ohm impedance then it will have a constant bias if you put a 200 ohm source on it. That's sort of a nevermind; it can be calibrated out. A calibration error whose true-dB vs indicated-dB error varies is something you can't do anything about with simple adjustments. I handled that situation with my ProII by generating a software table within my remote control tool. That solved a LOT of problems, especially when I added in compensation for input attenuation or gain. 100 uV at 50 ohms was always S9 (specifically Collins Radio S9, as a matter of fact). That annoyed HELL out of some of the creatures on 75 at night. {^_-} It was accurate enough that path loss calculations produced reasonable comparisons with S-meter readings. (And it uncovered more than one legal-limit-plus operator. No, I didn't turn them in. At the times I saw them doing it the band was about as crowded as the local streets between 0300 and 0400 local time. So no real harm was done.)
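
The software-table fix jdow describes can be sketched as a lookup with interpolation between measured points. Every number below is made up for illustration, with S9 anchored at -73 dBm; the uneven steps stand in for a meter that is not log-linear:

```python
# (raw reading from radio, true level in dBm, as measured with a generator)
calibration = [
    (-20.0, -95.0),
    (-10.0, -83.0),
    (0.0,   -73.0),   # "S9" anchored at -73 dBm
    (10.0,  -64.0),
]

def corrected_dbm(raw):
    """Linear interpolation between the two nearest calibration points;
    readings beyond the table are clamped to its end points."""
    pts = sorted(calibration)
    if raw <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if raw <= x1:
            return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)
    return pts[-1][1]

print(corrected_dbm(-5.0))  # halfway between two table points -> -78.0
```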

I note that the S meter is something people relate to. Telling them that their signal is -30 dBm distresses them. It SOUNDS tiny to them despite that being a VERY strong signal. Some people just NEED to be tweaked that way, though. They earn it the hard way, good old obnoxious behavior.

It would be easier to find the sources IF they didn't object quite so strongly to hitting the poles with a really heavy hammer to see what that does to the noise. {^_-} It's an art to find the noise. Sonic detectors may be among the best tools. (I totally gave up on HF when I lived in Hermosa Beach. The ocean salt air does a number on insulators within weeks. 70 miles away and inland it's not so bad. But our UPSs radiate like banshees, too. Ah well.) And do NOT get a Plasma TV. They radiate horridly, I am told. LCD and OLED would be better. (Sadly, both age fairly ungracefully fast compared to what "should be".)

{^_^} Joanne - yes, I am opinionated. I plead too much injuneering eddicashun.

On 20180406 22:06, doug wrote:
On 04/06/2018 04:12 PM, jdow wrote:
What might constitute a "good" hardware receiver? One might expect, for
the expense, that an ICOM IC-756ProII would qualify. I can state
without reservation that it does NOT qualify as giving a good,
accurate signal strength reading. It does not even compensate for its
own preamp gain and input attenuation settings. Readings above
its notion of S9 are "roughly (!)" 10 dB per 10 dB. Below S9 the steps are
closer to 3 dB than to 6 dB, which makes it all horridly inaccurate,
especially when you switch in the attenuation that is appropriate at
75 meters, for example. This is one of my pet peeves and an ongoing war
with the ham community. Calibration and ham radio seldom go together.
(Those for whom it does tend to read QEX or the former CQ magazine's
"Communications Quarterly" or else they have college degrees in the
field.) I do note that using manual settings will allow you to give
startlingly accurate difference readings between two signal levels
even if the absolute values are not accurate. This is nice for A/B
testing with antennas, for example. A tool that is limited is not a
useless tool if you adequately understand the limitations.

And I suppose I just told you that you are mal-educated about things
RF. That is curable and reflects not at all on your intelligence or
knowledge in other fields. I would not expect a psychologist to
understand how radios work at anything near the level I knew even as a
teenager when I got into this hobby in the very late 50s. I knew a
"shrink" in those days who was a ham and treated it as "plug it in and
go." Anyway, RF is a rich enough subject that as you get your
teeth into it, the learning becomes a reward in itself. It is also
something you can put down satisfied when you know as much as you feel
is worth the effort. From time to time I become overly prolix here and
try to generate understandable documents digging deeper into the
field. Sometimes I succeed. Nearly 60 years as both a ham and a
graduate level electronics engineer has generated a rich enough field
of knowledge that I figure it's worth trying to pass along some of my
insights and experience.

{^_^}   Joanne
I don't think that any receiver since the Collins R388 had an
accurate S-meter! And most radios I have seen do not give 6 dB per
S-unit results. (I had a Drake 2B receiver years ago that had
real 5 dB calibrations on its meter. A much more useful
calibration!) At any rate, what is the use of dB/µV when you are
working in a 50 Ohm system? Nobody cares about microvolts at an
antenna jack in this modern age--not if they have any sense. RF into
a known impedance--50 Ohms--is the standard nowadays. And voltage into
a known impedance equals power: decibels relative to a power of one
milliwatt, 0 dBm, which is thirty dB below 1 watt. Your receiver may
or may not be capable of that, but this is the standard of the
industry, and if you can calibrate it to that reference, 0 dBm, then
everyone should understand it. Now an S/N at the receiving end might
be a useful number IF THE RECEIVER IS BASICALLY QUIET. But the basic
story has been around for at least a century: if you can't hear 'em,
you can't work 'em!
Another thread on the forum points out that you can put a dummy load
(a 50 Ohm termination) on your receiver to establish your receiver noise
floor. Anything over that is external noise.
My personal experience with a power company is that if you can find the
close source of interference, they will fix it. Ham magazines have
devoted many pages over many years to showing you
how to find the source of interference. And you can't expect overnight
service, but I had mine fixed in two weeks. (It was a 6-meter noise.)
--doug, WA2SAY
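The voltage-into-impedance arithmetic doug describes can be checked in a couple of lines (a quick sketch; RMS voltages and a 50 Ω system assumed):

```python
import math

def volts_to_dbm(v_rms, r_ohms=50.0):
    """Power in dBm delivered by v_rms (RMS volts) into r_ohms."""
    p_watts = v_rms ** 2 / r_ohms             # P = V^2 / R
    return 10.0 * math.log10(p_watts / 1e-3)  # decibels relative to 1 mW

# 50 uV at 50 ohms is the commonly quoted S9 reference:
print(round(volts_to_dbm(50e-6)))        # -73 dBm
# 0 dBm into 50 ohms corresponds to about 0.224 V RMS:
print(round(math.sqrt(1e-3 * 50.0), 3))  # 0.224
```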

On 20180406 07:00, Pete Smith wrote:
This is all very well, and I am sure that it is technically correct,
but what I am primarily interested in at this moment is measuring my
background noise level and tracking what will hopefully be
improvements as the power company does its thing. So yes, it's all
about SNR, but the question is relating what I see now to what I saw
last month or last year using a good hardware receiver.

I am also somewhat averse to being told "you don't know enough", when
the documentation so lags the hardware and software development.  I
don't have an engineering degree, which makes product-specific
documentation all the more important.

73, Pete N4ZR
Check out the Reverse Beacon Network
at<http://reversebeacon.net>, now
spotting RTTY activity worldwide.
For spots, please use your favorite
"retail" DX cluster.

On 4/6/2018 9:15 AM, prog wrote:
On Fri, Apr 6, 2018 at 05:57 am, Pete Smith wrote:

    changing the FFToffset to -93 brought the amplitude of my XG-3
signal
    source to about the correct value

For those who are late to the party: Don't touch the configuration
file until you fully understand what you are doing. What you may
think is proper calibration is not proper at all.
There's a very good reason why SDR# uses dBFS and not absolute
dBm. This is mostly related to the nature of modern radios, with
complex gain distribution, ADCs, DSP, and AGC loops. These are
already hard to explain to experienced RF engineers, so I can't
blame the end users.
Long story short: Revert these changes and learn how to use dBFS and
SNR measurements. SNR is the only thing that should matter to you.

PS: "Q: But X software has dBm. A: What happens when the DSP decides
to reduce the gain to increase the SNR?"
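prog's point, that SNR survives gain changes while absolute level readings do not, can be illustrated with a toy calculation (the amplitudes and gain here are arbitrary assumed values, not SDR# internals):

```python
import math

def dbfs(amplitude, full_scale=1.0):
    """Level relative to full scale; 0 dBFS is the ADC clipping point."""
    return 20.0 * math.log10(amplitude / full_scale)

signal, noise = 0.02, 0.0002   # arbitrary linear amplitudes
gain = 5.0                     # an AGC or front-end gain change

# Absolute levels move with gain...
before = dbfs(signal)
after = dbfs(signal * gain)

# ...but SNR, being a ratio, does not.
snr_before = dbfs(signal) - dbfs(noise)
snr_after = dbfs(signal * gain) - dbfs(noise * gain)

print(round(before), round(after))          # -34 -20
print(round(snr_before), round(snr_after))  # 40 40
```

Any dBm label attached to the -34 dBFS reading would be invalidated the moment the gain changed; the 40 dB SNR would not.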


Alan G4ZFQ
 

Alan, I've clearly missed a whole block of info - where are these development notes?
Pete,

Sorry, I do not think there are any official instructions, and I do not know of any unofficial ones.
The "notes" I refer to are just posts in this group from prog.
As I've been reading a long time I've just picked up the basics.

The changelog on the Airspy HF+ page gives an idea of the development but reading back posts is really the only way so far.
Used normally the HF+ is just another SDR, no special instructions. If you choose to use manual adjustments then what you do is your decision, the developer will say his AGC does it all.
Manual adjustment is shown in the software, SDR#, or if you want to calibrate with your XG1 use HDSDR with the DLL.
Lief's way of assuming a constant noise figure for the HF+ under fixed gain conditions is another way of measurement.
Note that the HF+ is an SDR, and this is being emphasised here. We old radio users have to get used to the fact that, while the basic principles are the same, 0 dBFS, the onset of digital overload, is more important than indicated signal strength.

73 Alan G4ZFQ



what I am primarily interested in at this moment is measuring my background noise level

David J Taylor
 

I don't think that any receiver since the Collins R388 had an accurate S-Meter!
[]

--doug, WA2SAY
======================================

I'm hoping that my IC-R8600 has reasonable calibration. It offers dBµV in both EMF and PD terms, and dBm (which I prefer). As expected, the reading on white noise varies with the bandwidth (mode) selected.

For digital radios, dBFS offers the best way to see how near to overload you are.

73,
David GM8ARV
--
SatSignal Software - Quality software written to your requirements
Web: http://www.satsignal.eu
Email: david-taylor@...
Twitter: @gm8arv

jdow
 

Test it with a good signal generator. It's a rice box, but it might be a VERY good rice box; its design has that potential. It would be nice to know how accurate it appears to be. When measuring weak signals, make sure there is at least one signal that is roughly 30 dB to 40 dB below full scale. That allows good potential for averaging (decimation, sort of) to fill in a few more bits of precision at low levels. (Full scale would be where the spectrum suddenly grows a whole lot of spurious signals.) The generator could also be set to provide just about any level near the middle of the spectrum display, give or take about 30 dB.

{^_^}
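The "averaging fills in a few more bits" idea can be demonstrated with synthetic noise spectra (a sketch using NumPy; per-bin power of complex Gaussian noise is modeled as exponentially distributed, which is an assumption of this toy, not a property of any particular SDR):

```python
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_frames = 1024, 100

# Power spectra of pure noise: exponential per-bin power with unit mean,
# as for |FFT|^2 of complex Gaussian noise.
spectra = rng.exponential(1.0, size=(n_frames, n_bins))

single = spectra[0]               # one noisy frame
averaged = spectra.mean(axis=0)   # 100-frame average

# Averaging leaves the mean noise level alone but shrinks the scatter,
# which is what lets a weak steady carrier stand out of the grass.
print(single.std(), averaged.std())
```

With 100 averages, the bin-to-bin scatter drops by roughly a factor of ten (the square root of the frame count), so a signal well below the single-frame grass becomes visible.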

On 20180407 00:14, David J Taylor via Groups.Io wrote:
I don't think that any receiver since the Collins R388 had an accurate S-Meter!
[]
--doug, WA2SAY
======================================
I'm hoping that my IC-R8600 has reasonable calibration. It offers dBµV in both EMF and PD terms, and dBm (which I prefer). As expected, the reading on white noise varies with the bandwidth (mode) selected.
For digital radios, dBFS offers the best way to see how near to overload you are.
73,
David GM8ARV

jdow
 

Hm, I find this "stuff" on the ICOM page for this receiver:

===8<---
RSSI (Received Signal Strength Indicator)

The IC-R8600 shows S-meter, dBµV, dBµV (emf) and dBm meter types in the RSSI. The dBµV, dBµV (emf) and dBm meters have a high ±3 dBµ accuracy (between 0.5–1100 MHz) that can be used for measuring signal strength level.
===8<---
Um, er, ah, (shuffle feet), stare at that cross-eyed, no matter what I do it's "odd". Their marketdroid screwed up. They are claiming +/- 3 dBu calibration. That means their levels are all accurate to within a roughly 1 uV range depending on how picky you are over the numbers. That means it's insanely accurate at full scale and "reasonable non-instrument" accurate at 1 uV and trash below 1 uV. I think they meant +/- 3 dB of reading which makes just a WHOLE LOT more sense. And would make it a quite reasonable piece of gear for those not needing precision instrumentation.
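For reference, dBµV and dBm differ by a fixed 107 dB in a 50 Ω system, which is why a fixed absolute error expressed in dBµV would behave so strangely compared with ±3 dB of reading (a quick sketch of the conversion):

```python
import math

def dbuv_to_dbm(dbuv, r_ohms=50.0):
    """Convert dBµV (voltage re 1 µV RMS) to dBm (power re 1 mW)."""
    v_rms = 1e-6 * 10.0 ** (dbuv / 20.0)
    return 10.0 * math.log10((v_rms ** 2 / r_ohms) / 1e-3)

print(round(dbuv_to_dbm(0.0)))    # 0 dBuV = 1 uV  -> -107 dBm
print(round(dbuv_to_dbm(107.0)))  # 107 dBuV       ->    0 dBm
```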

{^_^}

On 20180407 00:14, David J Taylor via Groups.Io wrote:
I don't think that any receiver since the Collins R388 had an accurate S-Meter!
[]
--doug, WA2SAY
======================================
I'm hoping that my IC-R8600 has reasonable calibration. It offers dBµV in both EMF and PD terms, and dBm (which I prefer). As expected, the reading on white noise varies with the bandwidth (mode) selected.
For digital radios, dBFS offers the best way to see how near to overload you are.
73,
David GM8ARV

David J Taylor
 

From: jdow

Hm, I find this "stuff" on the ICOM page for this receiver:

===8<---
RSSI (Received Signal Strength Indicator)

The IC-R8600 shows S-meter, dBµV, dBµV (emf) and dBm meter types in the RSSI.
The dBµV, dBµV (emf) and dBm meters have a high ±3 dBµ accuracy (between 0.5–1100
MHz) that can be used for measuring signal strength level.
===8<---
Um, er, ah, (shuffle feet), stare at that cross-eyed, no matter what I do it's
"odd". Their marketdroid screwed up. They are claiming +/- 3 dBu calibration.
That means their levels are all accurate to within a roughly 1 uV range
depending on how picky you are over the numbers. That means it's insanely
accurate at full scale and "reasonable non-instrument" accurate at 1 uV and
trash below 1 uV. I think they meant +/- 3 dB of reading which makes just a
WHOLE LOT more sense. And would make it a quite reasonable piece of gear for
those not needing precision instrumentation.

{^_^}
===================================

You can't expect perfection for such a low price! HI! It's just a whole lot better than S-points. Someday I might write a program to monitor and record the signal strengths of selected stations (VHF/UHF), both to see the effects of propagation and to check my own cables' and antennas' deterioration.

73,
David GM8ARV
--
SatSignal Software - Quality software written to your requirements
Web: http://www.satsignal.eu
Email: david-taylor@...
Twitter: @gm8arv
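A minimal logger along the lines David describes might look like the sketch below. The `read_dbm()` function is a hypothetical stand-in for whatever CAT/CI-V signal-strength query the receiver actually supports; here it just returns synthetic readings.

```python
import csv
import random  # only to fake readings for the stand-in below
import time

def read_dbm():
    """Hypothetical placeholder for a real CAT/CI-V signal-strength query."""
    return -93.0 + random.uniform(-2.0, 2.0)

def log_signal(path, samples=5, interval_s=0.0):
    """Append timestamped dBm readings to a CSV file."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(samples):
            writer.writerow([time.time(), round(read_dbm(), 1)])
            time.sleep(interval_s)

log_signal("beacon_log.csv", samples=3)
```

Run periodically (say, once a minute against a known beacon), the accumulated CSV makes propagation trends and slow antenna-system deterioration easy to plot.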

Martin Smith
 

On Fri, Apr 6, 2018 at 05:57 am, Pete Smith wrote:

XG-3 signal source to about the correct value
FYI: The Elecraft XG3 RF Signal Source is, on paper, calibrated from 1.5 MHz to 200 MHz, but its plastic case means that in many physical configurations you will get higher-than-expected signal levels (ref: https://youtu.be/0GiFWEEFAJc?t=209).
Once you know the problems are there, you can easily work around them.

Pete Smith
 

Hmmm...somehow I assumed that plastic case had a metallic layer sprayed on the inside.  Oh well, for my purposes, with a receiver that has a noise floor of -140 dBm or so and a metallic case, I get S-meter readings that agree with the stated levels of the XG-3, and the XG-3 signal (even the highest one) is not discernible when disconnected from the receiver.

73, Pete N4ZR
Check out the Reverse Beacon Network 
at <http://reversebeacon.net>, now 
spotting RTTY activity worldwide. 
For spots, please use your favorite 
"retail" DX cluster.
On 4/7/2018 2:21 PM, Martin Smith via Groups.Io wrote:

On Fri, Apr  6, 2018 at 05:57 am, Pete Smith wrote:
XG-3 signal source to about the correct value

FYI: The Elecraft XG3 RF Signal Source is, on paper, calibrated from 1.5 MHz to 200 MHz, but its plastic case means that in many physical configurations you will get higher-than-expected signal levels (ref: https://youtu.be/0GiFWEEFAJc?t=209).
Once you know the problems are there, you can easily work around them.