Re: Newbie material on HF+(more)

Pete Smith <n4zr@...>
 

Hmmm...somehow I assumed that plastic case had a metallic layer sprayed on the inside.  Oh well, for my purposes, with a receiver that has a noise floor of -140 dBm or so and a metallic case, I get S-meter readings that agree with the stated levels of the XG-3, and the XG-3 signal (even the highest one) is not discernible when disconnected from the receiver.

73, Pete N4ZR
Check out the Reverse Beacon Network 
at <http://reversebeacon.net>, now 
spotting RTTY activity worldwide. 
For spots, please use your favorite 
"retail" DX cluster.
On 4/7/2018 2:21 PM, Martin Smith via Groups.Io wrote:

On Fri, Apr  6, 2018 at 05:57 am, Pete Smith wrote:
XG-3 signal source to about the correct value

FYI: The Elecraft XG3 RF Signal Source is, on paper, calibrated from 1.5 MHz to 200 MHz, but the plastic case means that in many physical configurations you will get higher-than-expected signal levels (ref: https://youtu.be/0GiFWEEFAJc?t=209).
Once you know about the problems you can easily work around them.





Re: Newbie material on HF+(more)

Martin Smith
 

On Fri, Apr 6, 2018 at 05:57 am, Pete Smith wrote:

XG-3 signal source to about the correct value
FYI: The Elecraft XG3 RF Signal Source is, on paper, calibrated from 1.5 MHz to 200 MHz, but the plastic case means that in many physical configurations you will get higher-than-expected signal levels (ref: https://youtu.be/0GiFWEEFAJc?t=209).
Once you know about the problems you can easily work around them.

Re: ADSBSpy Output format

Support@...
 

No - you still don't get it. The timestamps output by airspy_adsb *ARE* cycle counts. In AVR mode the timestamp is just the lower 32 bits of the cycle count. It is totally independent of and unrelated to PC time. There is no need for airspy_adsb to call any Windows time APIs.

RE: Anyway, what comes to mind is that there may be several copies of the message sent one after the other.

Yes, but it takes either 64 µs or 120 µs for a message to be transmitted, therefore it's impossible for different messages to be separated by less than 1280 'ticks' of a 20 MSPS sampling clock.
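The 1280-tick floor follows directly from the frame timing; a quick sketch of the arithmetic (constants only, nothing taken from airspy_adsb internals):

```python
# Minimum spacing (in samples) between two distinct Mode S messages at a
# 20 MSPS sampling clock. An 8 us preamble plus 56 or 112 data bits at
# 1 us/bit gives 64 us (short) or 120 us (long) frames.
SAMPLE_RATE = 20_000_000           # samples per second (20 MSPS)
TICK_NS = 1e9 / SAMPLE_RATE        # 50 ns per sample 'tick'

def frame_samples(frame_us):
    """Number of sample ticks spanned by a frame of the given duration."""
    return int(frame_us * 1e-6 * SAMPLE_RATE)

short_frame = frame_samples(64)    # 1280 samples: the minimum separation
long_frame = frame_samples(120)    # 2400 samples
```

So two genuinely different messages can never be time-stamped fewer than 1280 ticks apart; anything closer must be the same frame decoded twice.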

RE: It grabs a buffer full, sets the timer at the start of a decode, finishes that decode and has enough material queued up over 16ms to decode again setting the timestamp when it starts the second decode

No - this is where your thinking is wrong. The 'time' at the start of the buffer is simply the number of samples received/processed since the program started running, and the timestamp of any message decoded from within the buffer is the 'time' at the start of the buffer plus the number of samples into the buffer at which the message starts. If each buffer contained (say) 20000 samples, then a buffer would hold 1 ms of data, so every new buffer would represent 1 ms of elapsed time. Provided you never miss a buffer, you can keep counting the number of buffers you've received, and you then know a timestamp in 1 ms steps, but with 50 ns +/- a few ppm accuracy.
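That bookkeeping can be sketched as follows; SAMPLES_PER_BUFFER and the function names are illustrative, not airspy_adsb's actual internals:

```python
# Sample-count timestamping as described above: the timestamp of a decoded
# message is the running sample count at the start of the current buffer
# plus the message's offset within that buffer.
SAMPLES_PER_BUFFER = 20_000        # 1 ms of data at 20 MSPS (illustrative)
NS_PER_SAMPLE = 50                 # 20 MSPS -> 50 ns per sample

def message_timestamp(buffers_received, offset_in_buffer):
    """Sample count for a message starting offset_in_buffer samples into
    the buffer that arrived after buffers_received earlier buffers."""
    return buffers_received * SAMPLES_PER_BUFFER + offset_in_buffer

def avr_timestamp(sample_count):
    """AVR mode reports only the lower 32 bits of the cycle count."""
    return sample_count & 0xFFFFFFFF

# A message 5000 samples into the 4th buffer started 65000 samples
# (3.25 ms) after the program began processing.
ts = message_timestamp(3, 5000)    # 65000 samples
elapsed_ns = ts * NS_PER_SAMPLE    # 3_250_000 ns
```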

RE: because reprocessing the same buffers several times and getting different results is rather scary. That shouldn't happen.

Yes - that's what I'm 'complaining' about. I understand why it can happen in a low SNR signal, but I can't do anything about it because airspy_adsb source isn't public. 

Re: Newbie material on HF+(more)

David J Taylor
 

From: jdow

Hm, I find this "stuff" on the ICOM page for this receiver:

===8<---
RSSI (Received Signal Strength Indicator)

The IC-R8600 shows S-meter, dBµV, dBµV (emf) and dBm meter types in the RSSI.
The dBµV, dBµV (emf) and dBm meter has a high ±3 dBµ accuracy (between 0.5–1100
MHz) that can be used for measuring signal strength level.
===8<---
Um, er, ah, (shuffle feet), stare at that cross-eyed, no matter what I do it's
"odd". Their marketdroid screwed up. They are claiming +/- 3 dBu calibration.
That means their levels are all accurate to within a roughly 1 uV range
depending on how picky you are over the numbers. That means it's insanely
accurate at full scale and "reasonable non-instrument" accurate at 1 uV and
trash below 1 uV. I think they meant +/- 3 dB of reading which makes just a
WHOLE LOT more sense. And would make it a quite reasonable piece of gear for
those not needing precision instrumentation.

{^_^}
===================================

You can't expect perfection for such a low price! HI! It's just a whole lot better than S-points. Someday I might write a program to monitor and record signal strengths of selected stations (VHF/UHF) both to see the effects of propagation and to check my own cables and antenna's deterioration.

73,
David GM8ARV
--
SatSignal Software - Quality software written to your requirements
Web: http://www.satsignal.eu
Email: david-taylor@...
Twitter: @gm8arv

Re: Newbie material on HF+(more)

jdow
 

Hm, I find this "stuff" on the ICOM page for this receiver:

===8<---
RSSI (Received Signal Strength Indicator)

The IC-R8600 shows S-meter, dBµV, dBµV (emf) and dBm meter types in the RSSI. The dBµV, dBµV (emf) and dBm meter has a high ±3 dBµ accuracy (between 0.5–1100 MHz) that can be used for measuring signal strength level.
===8<---
Um, er, ah, (shuffle feet), stare at that cross-eyed, no matter what I do it's "odd". Their marketdroid screwed up. They are claiming +/- 3 dBu calibration. That means their levels are all accurate to within a roughly 1 uV range depending on how picky you are over the numbers. That means it's insanely accurate at full scale and "reasonable non-instrument" accurate at 1 uV and trash below 1 uV. I think they meant +/- 3 dB of reading which makes just a WHOLE LOT more sense. And would make it a quite reasonable piece of gear for those not needing precision instrumentation.

{^_^}

On 20180407 00:14, David J Taylor via Groups.Io wrote:
I don't think that any receiver since the Collins R388 had an accurate S-Meter!
[]
--doug, WA2SAY
======================================
I'm hoping that my IC-R8600 has reasonable calibration.  It offers both dBuV in EMF and PD measures, and dBm (which I prefer).  As expected, the reading on white noise varies with the bandwidth (mode) selected.
For digital, dBFS offers the best way to see how near to overload you are.
73,
David GM8ARV

Re: Newbie material on HF+(more)

jdow
 

Test it with a good signal generator. It's a rice box, but it might be a VERY good rice box. Its design has that potential. It would be nice to know how accurate it appears to be. When measuring weak signals make sure there is at least one signal that is roughly 30 dB to 40 dB below full scale. That allows good potential for averaging (decimation, sort of) to fill in a few more bits of precision at low levels. (Full scale would be where the spectrum suddenly grows a whole lot of spurious signals.) The signal could also be set to provide just about any level near the middle of the spectrum display, give or take about 30 dB.
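The averaging (decimation, sort of) idea can be put in numbers: the mean of k independent readings has sqrt(k) less spread, so every 4x of averaging buys roughly one extra bit at the low end. A minimal sketch:

```python
import math

def averaged_sigma(single_sigma, k):
    """Standard deviation of the mean of k independent power readings:
    averaging shrinks the spread by sqrt(k)."""
    return single_sigma / math.sqrt(k)

def extra_bits(k):
    """Effective extra bits of resolution from averaging k readings:
    every 4x of averaging halves the noise, i.e. buys one more bit."""
    return math.log2(k) / 2

# Averaging 256 spectrum frames shrinks the noise spread 16-fold,
# worth about 4 extra bits at the weak-signal end of the display.
```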

{^_^}

On 20180407 00:14, David J Taylor via Groups.Io wrote:
I don't think that any receiver since the Collins R388 had an accurate S-Meter!
[]
--doug, WA2SAY
======================================
I'm hoping that my IC-R8600 has reasonable calibration.  It offers both dBuV in EMF and PD measures, and dBm (which I prefer).  As expected, the reading on white noise varies with the bandwidth (mode) selected.
For digital, dBFS offers the best way to see how near to overload you are.
73,
David GM8ARV

Re: Newbie material on HF+(more)

David J Taylor
 

I don't think that any receiver since the Collins R388 had an accurate S-Meter!
[]

--doug, WA2SAY
======================================

I'm hoping that my IC-R8600 has reasonable calibration. It offers both dBuV in EMF and PD measures, and dBm (which I prefer). As expected, the reading on white noise varies with the bandwidth (mode) selected.

For digital, dBFS offers the best way to see how near to overload you are.

73,
David GM8ARV
--
SatSignal Software - Quality software written to your requirements
Web: http://www.satsignal.eu
Email: david-taylor@...
Twitter: @gm8arv

Re: Newbie material on HF+(more)

Alan G4ZFQ
 

Alan, I've clearly missed a whole block of info - where are these development notes?
Pete,

Sorry, I do not think there are any official instructions and I do not know of any unofficial.
The "notes" I refer to are just posts in this group from prog.
As I've been reading a long time I've just picked up the basics.

The changelog on the Airspy HF+ page gives an idea of the development but reading back posts is really the only way so far.
Used normally the HF+ is just another SDR, no special instructions. If you choose to use manual adjustments then what you do is your decision; the developer will say his AGC does it all.
Manual adjustment is shown in the software, SDR#, or if you want to calibrate with your XG1 use HDSDR with the DLL.
Leif's way of assuming a constant noise figure for the HF+ under fixed gain conditions is another way of measurement.
Note that the HF+ is an SDR, and this is being emphasised here. Us old radio users have to get used to the fact that, while the basic principles are the same, 0 dBFS (the onset of digital overload) is more important than indicated signal strength.

73 Alan G4ZFQ




Re: Newbie material on HF+(more)

jdow
 

R-390A was pretty good. But finding one with the original meter is "difficult".

Do you have any idea how difficult it is to generate a calibrated S-Meter and keep it calibrated? AGC loop gain control is not log-linear. It's not X volts per dB with X a nice constant. That means a special meter calibration is needed. Addressing the Drake line: tubes age, fairly rapidly. So stage gain vs AGC voltage will differ from nominal. This will upset the calibration. Basic tube gains vary considerably. That means you have to set up the individual stages fairly accurately. So if Drake out of the factory is calibrated, I'm not sure any Drake you could buy today would be particularly well calibrated. But it would be better than the average rice box.

There is nothing magical about 50 ohms, although if a box is calibrated with an antenna presenting a 50 ohm impedance then it will have a constant bias if you put a 200 ohm source on it. That's sort of a nevermind; it can be calibrated out. A calibration error whose true-dB vs indicated-dB error varies is something you can't do anything about with simple adjustments. I handled that situation with my ProII by generating a software table within my remote control tool. That solved a LOT of problems, especially when I added in compensation for input attenuation or gain. 100 uV at 50 ohms was always S9 (specifically Collins Radio S9, as a matter of fact). That annoyed HELL out of some of the creatures on 75 at night. {^_-} It was accurate enough that path loss calculations produced reasonable comparisons with S-meter readings. (And it uncovered more than one legal-limit++ operator. No, I didn't turn them in. At the times I saw them doing it the band was about as crowded as the local streets between 0300 and 0400 local time, so no real harm was done.)
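A software calibration table of the kind described can be sketched as piecewise-linear interpolation between measured points; every number below is made up for illustration, not a real ProII calibration:

```python
# Hypothetical calibration points: (raw meter reading, true level in dBm),
# as would be measured once against a signal generator. Illustrative only.
CAL_TABLE = [(0, -121), (3, -103), (6, -91), (9, -73), (12, -53), (15, -33)]

def true_dbm(raw, table=CAL_TABLE):
    """Piecewise-linear interpolation of a raw meter reading to dBm,
    clamping to the end points outside the calibrated range."""
    if raw <= table[0][0]:
        return table[0][1]
    if raw >= table[-1][0]:
        return table[-1][1]
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if x0 <= raw <= x1:
            return y0 + (raw - x0) * (y1 - y0) / (x1 - x0)
```

A table like this absorbs the non-log-linear AGC behaviour in one place; compensation for switched-in attenuation or preamp gain is then just a constant offset added to `raw` before the lookup.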

I note that the S meter is something people relate to. Telling them that their signal is -30 dBm distresses them. It SOUNDS tiny to them despite that being a VERY strong signal. Some people just NEED to be tweaked that way, though. They earn it the hard way, good old obnoxious behavior.

It would be easier to find the sources IF they didn't object quite so strongly to hitting the poles with a really heavy hammer to see what that does to the noise. {^_-} It's an art to find the noise. Sonic detectors may be among the best tools. (I totally gave up on HF when I lived in Hermosa Beach. The ocean salt air does a number on insulators within weeks. 70 miles away and inland it's not so bad. But our UPSs radiate like banshees, too. Ah well.) And do NOT get a Plasma TV. They radiate horridly, I am told. LCD and OLED would be better. (Sadly, both age fairly ungracefully fast compared to what "should be".)

{^_^} Joanne - yes, I am opinionated. I plead too much injuneering eddicashun.

On 20180406 22:06, doug wrote:
On 04/06/2018 04:12 PM, jdow wrote:
What might constitute a "good" hardware receiver? One might expect for
the expense that an ICOM IC-756proII would qualify. I can state
without reservations that it does NOT qualify as giving a good
accurate signal strength reading. It does not even compensate for its
own settings for preamp gain and input attenuation. Readings above
its notion of S9 are "roughly (!)" 10dB/10dB. Below S9 the steps are
closer to 3dB than to 6dB, which makes it all horridly inaccurate,
especially when you switch in the attenuation that is appropriate at
75 meters, for example. This is one of my pet peeve ongoing wars with
the ham community. Calibration and ham radio seldom go together.
(Those for whom it does tend to read QEX or the former CQ magazine's
"Communications Quarterly" or else they have college degrees in the
field.) I do note that using manual settings will allow you to give
startlingly accurate difference readings between two signal levels
even if the absolute values are not identical. This is nice for A/B
testing with antennas, for example. A tool that is limited is not a
useless tool if you adequately understand the limitations.

And I suppose I just told you that you are mal-educated about things
RF. That is curable and reflects not at all on your intelligence or
knowledge in other fields. I would not expect a psychologist to
understand how radios work at anything near the level I knew even as a
teenager when I got into this hobby in the very late 50s. I knew a
"shrink" in those days who was a ham and treated it as "plug it in and
go." Anyway, RF is a rich enough subject that as you get your
teeth into it the learning becomes a reward in itself. It is also
something you can put down satisfied when you know as much as you feel
is worth the effort. From time to time I become overly prolix here and
try to generate understandable documents digging deeper into the
field. Sometimes I succeed. Nearly 60 years as both a ham and a
graduate level electronics engineer has generated a rich enough field
of knowledge that I figure it's worth trying to pass along some of my
insights and experience.

{^_^}   Joanne
   I don't think that any receiver since the Collins R388 had an
accurate S-Meter! And most radios I have seen do not give 6-dB per
S-Unit results. (I had a Drake 2B receiver years ago that had
   real 5dB calibrations on its meter. A much more useful
calibration!)  At any rate, what is the use of dB/µV when you are
working in a 50 Ohm system? Nobody cares about microvolts at an
  antenna jack in this modern age--not if they have any sense. RF into
a known impedance--50 Ohms--is the standard nowadays. And voltage into
an impedance equals power: decibels relative to
  power--in milliwatts. 0dBm. Thirty dB below 1 Watt in a 50 Ohm
system. Your receiver may or may not be capable of that, but this is the
standard of the industry, and if you can calibrate it to
   that reference, 0dBm, then everyone should understand it.  Now an S/N
at the receiving end might be a useful number IF THE RECEIVER IS
BASICALLY QUIET.  But  the basic story has been
 around for at least a century: if you can't hear 'em, you can't work 'em!
Another thread on the forum points out that you can put a dummy load (50
Ohm termination) on your receiver to establish your receiver noise
floor. Anything over that is external noise.
My personal experience with a power company is that if you can find the
close source of interference, they will fix it. Ham magazines have
devoted many pages over many years to showing you
how to find the source of interference.  And you can't expect overnite
service, but I had mine fixed in two weeks. (It was a 6-meter noise.)
--doug, WA2SAY

On 20180406 07:00, Pete Smith wrote:
This is all very well, and I am sure that it is technically correct,
but what I am primarily interested in at this moment is measuring my
background noise level and tracking what will hopefully be
improvements as the power company does its thing. So yes, it's all
about SNR, but the question is relating what I see now to what I saw
last month or last year using a good hardware receiver.

I am also somewhat averse to being told "you don't know enough", when
the documentation so lags the hardware and software development.  I
don't have an engineering degree, which makes product-specific
documentation all the more important.

73, Pete N4ZR
Check out the Reverse Beacon Network
at<http://reversebeacon.net>, now
spotting RTTY activity worldwide.
For spots, please use your favorite
"retail" DX cluster.

On 4/6/2018 9:15 AM, prog wrote:
On Fri, Apr 6, 2018 at 05:57 am, Pete Smith wrote:

    changing the FFToffset to -93 brought the amplitude of my XG-3
signal
    source to about the correct value

For those who are late to the party: Don't touch the configuration
file until you fully understand what you are doing. What you may
think is proper calibration is not proper at all.
There's a very good reason why SDR# uses dBFS and not absolute
dBm's. This is mostly related to the nature of modern radios with
complex gain distribution, ADCs, DSP, and AGC loops. These are
already hard to explain to experienced RF engineers, so I can't
blame the end users.
Long story short: Revert these changes and learn how to use dBFS and
SNR measurements. SNR is the only thing that should matter to you.

PS: "Q: But X software has dBm. A: What happens when the DSP decides
to reduce the gain to increase the SNR?"


Re: Newbie material on HF+(more)

doug
 

On 04/06/2018 04:12 PM, jdow wrote:
What might constitute a "good" hardware receiver? One might expect for
the expense that an ICOM IC-756proII would qualify. I can state
without reservations that it does NOT qualify as giving a good
accurate signal strength reading. It does not even compensate for its
own settings for preamp gain and input attenuation. Readings above
its notion of S9 are "roughly (!)" 10dB/10dB. Below S9 the steps are
closer to 3dB than to 6dB, which makes it all horridly inaccurate,
especially when you switch in the attenuation that is appropriate at
75 meters, for example. This is one of my pet peeve ongoing wars with
the ham community. Calibration and ham radio seldom go together.
(Those for whom it does tend to read QEX or the former CQ magazine's
"Communications Quarterly" or else they have college degrees in the
field.) I do note that using manual settings will allow you to give
startlingly accurate difference readings between two signal levels
even if the absolute values are not identical. This is nice for A/B
testing with antennas, for example. A tool that is limited is not a
useless tool if you adequately understand the limitations.

And I suppose I just told you that you are mal-educated about things
RF. That is curable and reflects not at all on your intelligence or
knowledge in other fields. I would not expect a psychologist to
understand how radios work at anything near the level I knew even as a
teenager when I got into this hobby in the very late 50s. I knew a
"shrink" in those days who was a ham and treated it as "plug it in and
go." Anyway, RF is a rich enough subject that as you get your
teeth into it the learning becomes a reward in itself. It is also
something you can put down satisfied when you know as much as you feel
is worth the effort. From time to time I become overly prolix here and
try to generate understandable documents digging deeper into the
field. Sometimes I succeed. Nearly 60 years as both a ham and a
graduate level electronics engineer has generated a rich enough field
of knowledge that I figure it's worth trying to pass along some of my
insights and experience.

{^_^} Joanne


I don't think that any receiver since the Collins R388 had an
accurate S-Meter! And most radios I have seen do not give 6-dB per
S-Unit results. (I had a Drake 2B receiver years ago that had
real 5dB calibrations on its meter. A much more useful
calibration!) At any rate, what is the use of dB/µV when you are
working in a 50 Ohm system? Nobody cares about microvolts at an
antenna jack in this modern age--not if they have any sense. RF into
a known impedance--50 Ohms--is the standard nowadays. And voltage into
an impedance equals power: decibels relative to
power--in milliwatts. 0dBm. Thirty dB below 1 Watt in a 50 Ohm
system. Your receiver may or may not be capable of that, but this is the
standard of the industry, and if you can calibrate it to
that reference, 0dBm, then everyone should understand it. Now an S/N
at the receiving end might be a useful number IF THE RECEIVER IS
BASICALLY QUIET. But the basic story has been
around for at least a century: if you can't hear 'em, you can't work 'em!

Another thread on the forum points out that you can put a dummy load (50
Ohm termination) on your receiver to establish your receiver noise
floor. Anything over that is external noise.

My personal experience with a power company is that if you can find the
close source of interference, they will fix it. Ham magazines have
devoted many pages over many years to showing you
how to find the source of interference. And you can't expect overnite
service, but I had mine fixed in two weeks. (It was a 6-meter noise.)

--doug, WA2SAY

On 20180406 07:00, Pete Smith wrote:
This is all very well, and I am sure that it is technically correct,
but what I am primarily interested in at this moment is measuring my
background noise level and tracking what will hopefully be
improvements as the power company does its thing. So yes, it's all
about SNR, but the question is relating what I see now to what I saw
last month or last year using a good hardware receiver.

I am also somewhat averse to being told "you don't know enough", when
the documentation so lags the hardware and software development. I
don't have an engineering degree, which makes product-specific
documentation all the more important.

73, Pete N4ZR
Check out the Reverse Beacon Network
at<http://reversebeacon.net>, now
spotting RTTY activity worldwide.
For spots, please use your favorite
"retail" DX cluster.

On 4/6/2018 9:15 AM, prog wrote:
On Fri, Apr 6, 2018 at 05:57 am, Pete Smith wrote:

changing the FFToffset to -93 brought the amplitude of my XG-3
signal
source to about the correct value

For those who are late to the party: Don't touch the configuration
file until you fully understand what you are doing. What you may
think is proper calibration is not proper at all.
There's a very good reason why SDR# uses dBFS and not absolute
dBm's. This is mostly related to the nature of modern radios with
complex gain distribution, ADCs, DSP, and AGC loops. These are
already hard to explain to experienced RF engineers, so I can't
blame the end users.
Long story short: Revert these changes and learn how to use dBFS and
SNR measurements. SNR is the only thing that should matter to you.

PS: "Q: But X software has dBm. A: What happens when the DSP decides
to reduce the gain to increase the SNR?"

Re: ADSBSpy Output format

jdow
 

You might explore larger FFTs if your machine has the horsepower. Note that if the decode process takes very little time, the stamps you see may be an artifact of the operating system's time granularity. Lacking source for ADSBSpy there's not much you can do there unless Youssef configures the tool to use the multimedia timer features. Then instead of the OS ticking at a slow rate (16 ms, I believe) it can be made to tick at 1 ms.

But that might not even be a small enough tick to catch separate 50 ns decodes.

Anyway, what comes to mind is that there may be several copies of the message sent one after the other. But, ADSBSpy does not process them all at once. It grabs a buffer full, sets the timer at the start of a decode, finishes that decode and has enough material queued up over 16ms to decode again setting the timestamp when it starts the second decode after a really short period of time. This repeats perhaps for three transmissions or more of the same message. I suspect this is what is happening because reprocessing the same buffers several times and getting different results is rather scary. That shouldn't happen.

Given your intended use I'm not sure what the timestamp is good for, as it's potentially less accurate than your cycle counting. But to do the cycle counting you have to get inside ADSBSpy so you can count buffers and points in the buffer that correspond to signal reception. I suppose ADSBSpy could produce relative 64-bit timestamps corresponding to a relative sample number counting up from 0 for an AirSpy or AirSpy R2. That would likely give the kind of timer you seem to want. It's certainly better than what will come from the OS at the time the buffer is read, as reading can be seriously deferred by other running processes and ADSBSpy may have to wait a whole tick to get a chance to run.

{^_^}

On 20180406 18:13, Support@... wrote:
RE: OK, I see what you are doing. It still looks like it requires some reasonable accuracy, hundreds of ms level, for the time tags. That is certainly a bit better than what you get with GPS techniques (transit time). That "feels like" a mile/km sort of accuracy with several stations reporting. Since the bearing from the polling station is calculated, and that's all within one machine, absolute time is not needed. You primarily need stable time.
Yes, stable time. And since the ADS-B frames are decoded from a stable 20 MSPS / 50 ns stream of samples, the granularity of the airspy_adsb timestamp can be (and is) 50 ns +/- a few parts per million. This is important for MLAT, but MLAT does require multiple receiver sites. I'm not trying to implement MLAT - I'm after beam-finding.
If a typical maximum radar rotation rate is 13 RPM (4.6 sec/rev), and we want one-degree resolution, then each FFT bin needs to be (4.6 sec / 360 degrees) 12.8 ms. So realistically, millisecond-resolution timestamps are plenty good enough for beam finding. My 16384-long FFT array can therefore hold samples for around 163 seconds, which is 35 revs of a 13 RPM radar head. However, I wasn't 'complaining' about the resolution of the timestamps - I was just pointing out that it's impossible for 3 or 4 frames to be decoded within a few ticks of a 50 ns resolution timer - they must be the same frame decoded several times but with differing results. The issue is that it happens often enough for the spurious DF11 decodes to contaminate my FFT array.
RE: I bet there's still a considerable amount of jitter in the readings.
Yes - the interrogating beam isn't particularly narrow, and worse there can be side-lobes which elicit DF-11 replies from aircraft when the head isn't pointing directly at the plane.  However, providing the radar is illuminating sufficient aircraft to populate the FFT array, the FFT conversion can and does produce a very accurate average.

Re: Newbie material on HF+(more)

jdow
 

For what he seems to indicate he wants to do, measure a change in power-line radiated noise, he can work with manual settings. He needs to avoid too much gain; the symptom is sudden high-level spurious noise all over the spectrum with a small change in gain. He should note the settings so he can return to them in the future. (And it may be a poor idea to update the HF+ driver between measurements, although he could step back for making the measurement.) A good means of taking the measurement from time to time is to copy the sdrsharp folder to an "SDRSharp Noise Measurement" folder. Run the copy of sdrsharp within that folder. Set up and take the measurement. Exit. Then never use that folder again except to start the program, take a measurement without touching settings, and exit. For normal operation use the normal sdrsharp folder.

A second method that he can use is to observe the noise patterns. Take a picture of what the spectrum display looks like with only medium level signals present. Then in the future under about the same conditions he can compare what the spectrum and waterfall look like. Changes often can stand out pretty dramatically.

{^_^}

On 20180406 18:09, Leif Asbrink wrote:
Hello Pete,

This is all very well, and I am sure that it
is technically correct, but what I am primarily
interested in at this moment is measuring my
background noise level and tracking what will
hopefully be improvements as the power company
does its thing. So yes, it's all about SNR, but
the question is relating what I see now to what
I saw last month or last year using a good hardware
receiver.
I am afraid you are not attacking this problem
properly. What you really want to measure is the
system noise temperature. Ideally it should be the
sky noise temperature at the frequency you look at.
In a non-rural area it is expected to be 10 to 20
dB higher - and your power company might help to
bring it down. Also ferrites on your neighbours
equipment might help.
To measure the noise floor temperature the easy way
is to assume that your HF+ has NF=5dB. Set manual
gain, Preamp on and zero attenuation. Connect a
dummy load to your HF+ and evaluate the power (use
signal diagnostics.) Then connect your antenna
and again use signal diagnostics. You might find
30 dB more noise which would mean that your noise
is 35 dB above room temperature.
Have a look here:
https://www.itu.int/dms_pubrec/itu-r/rec/p/R-REC-P.372-8-200304-S!!PDF-E.pdf
Figure 2. You can see that 35 dB is what you can expect
at 7 MHz, 30 dB at 14 MHz.
Be aware that disabling the AGC could bring the front
end of your HF+ into saturation, with hopelessly incorrect
results as a consequence.
A better strategy would be to connect your antenna
and to also inject a weak signal. Look for S/N of the
weak signal and insert attenuation until it degrades.
Then evaluate the NF of your system by use of signal
diagnostics. Use the gain setting you found to be required
to avoid saturation from all the strong out-of-band signals.
Find the noise level and then the level of a known signal.
That would give (S+N)/N in a certain bandwidth. From that
you can compute S/N in 1 Hz bandwidth. Assuming S/N is much
larger than one you get the noise floor in dBm/Hz directly
from the signal power and S/N in 1 Hz bandwidth.
The HF+ gives you dBFS; you could use that, but only if
you make sure there is no out-of-band signal strong enough
to turn down the gain of the HF+ if you are in AGC mode,
or to saturate the front end if you are in manual mode.
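Leif's recipe for getting the noise floor in dBm/Hz from a known signal level and a measured S/N (valid when S/N is much larger than one, as he assumes) can be sketched as follows; the example numbers at the end are made up:

```python
import math

KTB_ROOM_DBM_HZ = -174.0   # thermal noise floor at ~290 K, in dBm/Hz

def snr_1hz_db(snr_db, bandwidth_hz):
    """Convert S/N measured in a given bandwidth to S/N in 1 Hz
    (valid when S/N >> 1, so (S+N)/N ~ S/N)."""
    return snr_db + 10 * math.log10(bandwidth_hz)

def noise_floor_dbm_hz(signal_dbm, snr_db, bandwidth_hz):
    """Noise floor from a known signal level and the measured S/N."""
    return signal_dbm - snr_1hz_db(snr_db, bandwidth_hz)

def db_above_room(noise_dbm_hz):
    """How far the measured floor sits above room-temperature noise."""
    return noise_dbm_hz - KTB_ROOM_DBM_HZ

# Example (made-up numbers): a -73 dBm signal showing 30 dB S/N in a
# 500 Hz bandwidth implies a noise floor near -130 dBm/Hz, i.e. about
# 44 dB above room temperature.
```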

I am also somewhat averse to being told "you don't
know enough", when the documentation so lags the
hardware and software development.  I don't have an
engineering degree, which makes product-specific
documentation all the more important.
Hmmm, I have tried to clarify the problem. It is not
really something you should expect from the HF+
documentation, it is not product specific.
Surely, if you insert a filter that ensures that no
out-of-band signal is affecting the gain of your HF+
(or saturating it) and then calibrate the dB scale to
fit your generators, everything would be fine for you,
but you would have to make sure that the same FFT size
is used always. There could also be other limitations.
Better properly measure your antenna noise temperature:-)
73
Leif

73, Pete N4ZR
Check out the Reverse Beacon Network
at <http://reversebeacon.net>, now
spotting RTTY activity worldwide.
For spots, please use your favorite
"retail" DX cluster.

On 4/6/2018 9:15 AM, prog wrote:
On Fri, Apr 6, 2018 at 05:57 am, Pete Smith wrote:

changing the FFToffset to -93 brought the amplitude of my XG-3
signal source to about the correct value

For those who are late to the party: Don't touch the configuration
file until you fully understand what you are doing. What you may think
is proper calibration is not proper at all.
There's a very good reason why SDR# uses dBFS and not absolute dBm's.
This is mostly related to the nature of modern radios with complex
gain distribution, ADCs, DSP, and AGC loops. These are already hard to
explain to experienced RF engineers, so I can't blame the end users.
Long story short: Revert these changes and learn how to use dBFS and
SNR measurements. SNR is the only thing that should matter to you.

PS: "Q: But X software has dBm. A: What happens when the DSP decides
to reduce the gain to increase the SNR?"

Re: ADSBSpy Output format

Support@...
 

RE: OK, I see what you are doing. It still looks like it requires some reasonable accuracy, hundreds of ms level, for the time tags. That is certainly a bit better than what you get with GPS techniques (transit time). That "feels like" a mile/km sort of accuracy with several stations reporting. Since the bearing from the polling station is calculated, that's all within a single machine; absolute time is not needed. You primarily need stable time.

Yes, stable time. And since the ADSB frames are decoded from a stable 20MSPS/50nS stream of samples, the granularity of the airspy_adsb timestamp can be (and is) 50nS, +/- a few parts per million. This is important for MLAT, but MLAT does require multiple receiver sites. I'm not trying to implement MLAT - I'm after beam-finding.

If a typical maximum radar rotation rate is 13 RPM (4.6 sec/rev), and we want one degree resolution, then each FFT bin needs to be (4.6 sec / 360 degrees) 12.8mS. So realistically, millisecond-resolution timestamps are plenty good enough for beam finding. My 16384-long FFT array can therefore hold samples for around 163 seconds, which is 35 revs of a 13 RPM radar head. However, I wasn't 'complaining' about the resolution of the timestamps - I was just pointing out that it's impossible for 3 or 4 frames to be decoded within a few ticks of a 50nS-resolution timer - they must be the same frame decoded several times but with differing results. The issue is that it happens often enough for the spurious DF11 decodes to contaminate my FFT array.
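A quick numeric check of that bin-size arithmetic (the 13 RPM head, one-degree target, 16384-entry array, and the 10 ms shift interval are all figures from this thread):

```python
# Quick check of the beam-finding bin-size arithmetic discussed above
# (13 RPM head, one-degree resolution, 16384-entry FFT array,
# shifting one bin every 10 ms as described elsewhere in the thread).

RPM = 13
rev_seconds = 60.0 / RPM            # ~4.6 s per revolution
bin_seconds = rev_seconds / 360.0   # one bin per degree: ~12.8 ms
fft_bins = 16384
window_seconds = fft_bins * 0.010   # array span at one bin per 10 ms
revs_in_window = window_seconds / rev_seconds

print(f"bin width     : {bin_seconds * 1000:.1f} ms")   # ~12.8 ms
print(f"window length : {window_seconds:.1f} s")        # ~163.8 s
print(f"revolutions   : {revs_in_window:.1f}")          # ~35.5
```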

RE: I bet there's still a considerable amount of jitter in the readings.

Yes - the interrogating beam isn't particularly narrow, and worse, there can be side-lobes which elicit DF-11 replies from aircraft when the head isn't pointing directly at the plane.  However, provided the radar is illuminating sufficient aircraft to populate the FFT array, the FFT conversion can and does produce a very accurate average.

Re: Newbie material on HF+(more)

Leif Asbrink
 

Hello Pete,

This is all very well, and I am sure that it
is technically correct, but what I am primarily
interested in at this moment is measuring my
background noise level and tracking what will
hopefully be improvements as the power company
does its thing. So yes, it's all about SNR, but
the question is relating what I see now to what
I saw last month or last year using a good hardware
receiver.
I am afraid you are not attacking this problem
properly. What you really want to measure is the
system noise temperature. Ideally it should be the
sky noise temperature at the frequency you look at.
In a non-rural area it is expected to be 10 to 20
dB higher - and your power company might help to
bring it down. Also ferrites on your neighbours'
equipment might help.

The easy way to measure the noise floor temperature
is to assume that your HF+ has NF = 5 dB. Set manual
gain, preamp on and zero attenuation. Connect a
dummy load to your HF+ and evaluate the power (use
signal diagnostics). Then connect your antenna
and again use signal diagnostics. You might find
30 dB more noise which would mean that your noise
is 35 dB above room temperature.
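The arithmetic behind that estimate, as a quick sketch (the 5 dB NF and 30 dB rise are the assumed figures from the text above, not measurements):

```python
# Sketch of the noise-rise arithmetic in the text above. With an
# assumed receiver NF of 5 dB, the noise seen on a dummy load sits
# 5 dB above room temperature (290 K). A 30 dB rise when the antenna
# is connected therefore puts the antenna noise roughly
# NF + rise = 35 dB above room temperature.

nf_db = 5.0           # assumed HF+ noise figure
noise_rise_db = 30.0  # antenna reading minus dummy-load reading

antenna_noise_above_room_db = nf_db + noise_rise_db
print(f"antenna noise is about {antenna_noise_above_room_db:.0f} dB "
      f"above room temperature")
```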

Have a look here:
https://www.itu.int/dms_pubrec/itu-r/rec/p/R-REC-P.372-8-200304-S!!PDF-E.pdf
Figure 2. You can see that 35 dB is what you can expect
at 7 MHz, 30 dB at 14 MHz.

Be aware that disabling the AGC could bring the front
end of your HF+ into saturation, with hopelessly incorrect
results as a consequence.

A better strategy would be to connect your antenna
and to also inject a weak signal. Look for S/N of the
weak signal and insert attenuation until it degrades.
Then evaluate the NF of your system by use of signal
diagnostics. Use the gain setting you found to be required
to avoid saturation from all the strong out-of-band signals.
Find the noise level and then the level of a known signal.
That would give (S+N)/N in a certain bandwidth. From that
you can compute S/N in 1 Hz bandwidth. Assuming S/N is much
larger than one you get the noise floor in dBm/Hz directly
from the signal power and S/N in 1 Hz bandwidth.
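As a numeric sketch of that last step (the levels and bandwidth below are invented example values, not HF+ readings):

```python
import math

# Numeric sketch of the (S+N)/N -> noise-floor arithmetic above.
# Every number here is an invented example, not an HF+ measurement.

signal_dbm = -73.0          # level of the known injected signal
s_plus_n_over_n_db = 40.0   # (S+N)/N read in the measurement bandwidth
bandwidth_hz = 500.0        # measurement bandwidth

# With (S+N)/N >> 1 we can treat the reading as S/N directly:
snr_db = s_plus_n_over_n_db
# Referring S/N to a 1 Hz bandwidth adds 10*log10(bandwidth) dB:
snr_1hz_db = snr_db + 10.0 * math.log10(bandwidth_hz)
# The noise floor in dBm/Hz then falls straight out of the signal level:
noise_floor_dbm_hz = signal_dbm - snr_1hz_db
print(f"noise floor: {noise_floor_dbm_hz:.1f} dBm/Hz")
```

With these example numbers the result comes out near -140 dBm/Hz.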

The HF+ gives you dBFS; you could use that, but only if
you make sure there is no out-of-band signal strong enough
to turn down the gain of the HF+ if you are in AGC mode
or to saturate the front end if you are in manual mode.

I am also somewhat averse to being told "you don't
know enough", when the documentation so lags the
hardware and software development.  I don't have an
engineering degree, which makes product-specific
documentation all the more important.
Hmmm, I have tried to clarify the problem. It is not
really something you should expect from the HF+
documentation; it is not product-specific.

Surely, if you insert a filter that ensures that no
out-of-band signal is affecting the gain of your HF+
(or saturating it) and then calibrate the dB scale to
fit your generators, everything would be fine for you,
but you would have to make sure that the same FFT size
is always used. There could also be other limitations.

Better properly measure your antenna noise temperature:-)

73

Leif



73, Pete N4ZR
Check out the Reverse Beacon Network
at <http://reversebeacon.net>, now
spotting RTTY activity worldwide.
For spots, please use your favorite
"retail" DX cluster.

On 4/6/2018 9:15 AM, prog wrote:
On Fri, Apr 6, 2018 at 05:57 am, Pete Smith wrote:

changing the FFToffset to -93 brought the amplitude of my XG-3
signal source to about the correct value

For those who are late to the party: Don't touch the configuration
file until you fully understand what you are doing. What you may think
is proper calibration is not proper at all.
There's a very good reason why SDR# uses dBFS and not absolute dBm.
This is mostly related to the nature of modern radios with complex
gain distribution, ADCs, DSP, and AGC loops. These are already hard to
explain to experienced RF engineers, so I can't blame the end users.
Long story short: Revert these changes and learn how to use dBFS and
SNR measurements. SNR is the only thing that should matter to you.

PS: "Q: But X software has dBm. A: What happens when the DSP decides
to reduce the gain to increase the SNR?"

Re: ADSBSpy Output format

jdow
 

OK, I see what you are doing. It still looks like it requires some reasonable accuracy, hundreds of ms level, for the time tags. That is certainly a bit better than what you get with GPS techniques (transit time). That "feels like" a mile/km sort of accuracy with several stations reporting. Since the bearing from the polling station is calculated, that's all within a single machine; absolute time is not needed. You primarily need stable time. I bet there's still a considerable amount of jitter in the readings.

Thanks for the clue. Learning is good.
{^_^} Joanne


Re: ADSBSpy Output format

Support@...
 

RE: Post four? All I see is the one post, this one

I access the forum via a web interface, not via email. The webpage for this thread is here: https://airspy.groups.io/g/main/topic/14826964

RE: That is to say most of the techniques I can think of to solve this timing issue don't apply ....etc.

The absolute time rarely matters, and as you say, unless you have either GPS or rubidium clocks, getting within 1mS accuracy is challenging. Luckily, what you actually need for MLAT is accurate delta time - the time between messages received at the same location. The airspy_adsb timestamps are implicit in the sampling. If you are sampling at 20MSPS then you will get 20 million samples per second. So if you know how many samples there were between two successive DF frames, then you also know how much time has elapsed between them. Delays in Airspy processing, USB transport, PC operating systems, cables etc don't matter - provided no samples are ever lost, meaning your PC can keep up with a 40 MB/second USB transfer rate. The accuracy of the delta timestamps is determined solely by the accuracy of the sampling clock rate, which for Airspy is very good - a few ppm.
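A sketch of that delta-time idea, assuming (per this thread) that the timestamp is a free-running 20 MSPS sample counter reported as its lower 32 bits in AVR mode; the helper below is illustrative, not airspy_adsb code:

```python
# Sketch: turning sample-count timestamps into elapsed time at 20 MSPS.
# Assumes the timestamp is a free-running sample counter, reported as
# its lower 32 bits, so the subtraction is done modulo 2**32 to
# survive counter wraparound. Illustrative only.

SAMPLE_RATE = 20_000_000  # 20 MSPS -> one tick every 50 ns

def delta_seconds(ts_a: int, ts_b: int) -> float:
    """Elapsed time from timestamp ts_a to the later timestamp ts_b."""
    ticks = (ts_b - ts_a) % (1 << 32)  # modular arithmetic handles wrap
    return ticks / SAMPLE_RATE

# Two frames 1280 ticks apart - the minimum spacing for back-to-back
# 64 uS messages at 20 MSPS - are 64 microseconds apart:
print(delta_seconds(1000, 2280))  # -> 6.4e-05
```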

Re: Newbie material on HF+(more)

jdow
 

For that the manual settings would work well. And all I can suggest is experiment. I've not played with them, myself.

{^_^}


Re: ADSBSpy Output format

Support@...
 

RE: Getting the rotation time for the radar head is the first step. Here in the US, normal rotations are supposed to be either 5 or 12 rpm. As an experiment, I threw together a program to look for consecutive interrogations from the same radar and plane at those speeds (with a plus or minus factor, I had a known radar spinning just over 13 rpm). My goal at the time was to identify various radars and approximate the rotation speeds; comparing my results with those from PlanePlotter gave reasonably close results.

Yep - but you're not really that interested in the rotation frequency - you want the radar head 'phase' and there is a better way of getting the rotation 'phase'. Assume you're in space directly over the radar site. As the radar spins it sends out ADSB interrogations. Some of these are DF17's and you can receive these and decode the plane's latitude and longitude. However, you can't be sure which radar site sent out the DF17 request, so you just keep a list of planes and their locations.

Then suppose you receive a DF-11 with an SI/II that corresponds to the radar site you're directly overhead looking down on. If you know the location of that plane (from DF-17's), and you know the location of the radar head (Google earth), then you can work out the angle (aka phase) of the plane from the radar head. From this you can calculate the real (East/West) component (Cos(theta)) and imaginary (North/South) component (Sin(theta)) and feed the results into the first (time) BIN of an FFT array.  You shift the array along at regular (time) intervals - if you want one degree accuracy and the radar head spins at 13RPM then you need to shift by one BIN every 10mS or so. Every time you get another DF-11 from a known position aircraft you shift the array along by the correct 'time' and populate the first BIN with the new I/Q for the angle of dangle to the plane. Once there are more than a few dozen planes in the time array you can do an FFT to translate your time array into a frequency/phase array. The peak in the output will correspond to the rotational frequency of the radar head.

The longer the FFT array (I'm using 16384 entries), and the more DF-11 I/Q samples you can put into the array, then the more accurate the FFT will be.

Then suppose an SI/II arrives from a plane whose position you don't know. You shift the time array along to the correct time BIN. Do a forward FFT to convert the time array to a frequency array, filter the frequency array to extract just the area around the rotational frequency peak, and then do an inverse FFT from the frequency back to the time array. The first BIN in the new time array will contain the I/Q of the unknown plane, and from that you can take the arctan to get the phase (or bearing) of the plane from the radar head. If the same plane emits SI/II's from multiple radar sites, then Bob's your auntie - you can see where radar beams intersect and the plane must be somewhere close to there.

There are lots of ifs and buts, but it does work. You just need a reliable stream of II/SI DF-11's with not too many spurious entries, and the FFT effectively averages all the samples to give you a high reliability/probability result.
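For what it's worth, the scheme described above can be sketched roughly as follows. This is an illustrative reconstruction under several assumptions of mine (one bin per 10 ms, numpy's FFT, a crude few-bin filter around the peak, made-up method names); it is not the poster's actual code.

```python
import numpy as np

# Rough reconstruction of the beam-finding scheme described above.
# Assumptions: the time array advances one bin per 10 ms, bearings
# are radians measured at the radar head, and the "filter" step keeps
# a few spectral bins around the rotation-frequency peak.

N_BINS = 16384        # length of the time array, as in the post
BIN_SECONDS = 0.010   # shift one bin every ~10 ms

class BeamFinder:
    def __init__(self):
        # complex "time" array: the newest sample lives in bin 0
        self.bins = np.zeros(N_BINS, dtype=complex)

    def shift_to_now(self, elapsed_bins: int):
        """Shift the array along by the elapsed number of time bins."""
        self.bins = np.roll(self.bins, elapsed_bins)
        self.bins[:elapsed_bins] = 0.0

    def add_known_plane(self, bearing_rad: float):
        """DF-11 from a plane whose bearing from the head is known:
        deposit cos(theta) + j*sin(theta) into the newest bin."""
        self.bins[0] += np.exp(1j * bearing_rad)

    def estimate_bearing(self) -> float:
        """DF-11 from an unknown plane (array already shifted to now):
        forward FFT, keep only the rotation-frequency peak region,
        inverse FFT, then read the phase of the newest bin."""
        spectrum = np.fft.fft(self.bins)
        peak = int(np.argmax(np.abs(spectrum[1:]))) + 1   # skip DC
        filtered = np.zeros_like(spectrum)
        lo, hi = max(peak - 3, 1), min(peak + 4, N_BINS)  # crude filter
        filtered[lo:hi] = spectrum[lo:hi]
        recovered = np.fft.ifft(filtered)
        return float(np.angle(recovered[0]))
```

With enough known-position DF-11s deposited via `add_known_plane`, `estimate_bearing` returns the head's pointing angle at the newest bin; the 3-bin filter width is an arbitrary choice.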

Re: Newbie material on HF+(more)

Pete Smith <n4zr@...>
 

That's where the XG-3 comes in -- it has output levels of -107, -73, -33 and 0 dBm, which are adequately stable and accurate for my purposes.  That said, I'm less interested in the absolute level than being able to repeat the measurement conditions from one moment in time to another.

73, Pete N4ZR
Check out the Reverse Beacon Network 
at <http://reversebeacon.net>, now 
spotting RTTY activity worldwide. 
For spots, please use your favorite 
"retail" DX cluster.
On 4/6/2018 1:43 PM, Dana Myers wrote:


If you're interested in measuring your absolute background noise level,
I'd think you're going to need something to calibrate your receiver - regardless
of which receiver you're using. I mean, it sounds like what you really want
is an instrument more than a receiver, right?

73,
Dana  K6JQ


Re: Newbie material on HF+(more)

Pete Smith <n4zr@...>
 

Alan, I've clearly missed a whole block of info - where are these development notes? I suspect my confusion stems partly from the Airspy HF+ and SDR# sharing a developer, or so I've been told.  See, I really *am* a newbie.

73, Pete N4ZR
Check out the Reverse Beacon Network 
at <http://reversebeacon.net>, now 
spotting RTTY activity worldwide. 
For spots, please use your favorite 
"retail" DX cluster.
On 4/6/2018 1:49 PM, Alan G4ZFQ wrote:


what I am primarily interested in at this moment is measuring my background noise level

Pete,

If you looked back at the development notes by "prog" you would see that the first principle was to make the AGC do all the work, with S/N being what matters and true absolute measurements unsupported. Since release, more improvements have made this even better for most casual users.

But you may now disable the AGC, set the gain as you wish, and use something like HDSDR to calibrate levels.

73 Alan G4ZFQ