Topics

Understanding the frequency calibration

Patrick
 

Hi!
 
My Airspy HF+ calibration is fine: less than 5 Hz off at 10 MHz after running for more than 1 hour. I use the RWM or CHU time stations as calibration sources.
 
But it reads about +1 Hz high on reliable MW stations (Talksport 1089 reads 1089.001 kHz, BBC 5 Live 909 reads 909.001 kHz, etc.).
Even stranger: on 198 kHz, the BBC also reads +1 Hz.
 
To my knowledge, with an SDR the error should be proportional to frequency as you move up or down the band.
 
How can my Airspy show +5 Hz at 10 MHz and +1 Hz on 198 kHz?
Note: I found this when listening to some SDR# recordings. I will make sure it also occurs when listening live.
 
The values were given by both the SDR# IF Spectrum (very narrow BW for maximum accuracy) and the SDR Console Data File Analyser.
See attachments.
 
Thanks

prog
 

On Tue, Oct 29, 2019 at 11:44 AM, Patrick wrote:
Note: I found this when listening to some SDR# recordings.
The center frequency is saved with a 1 Hz resolution. Coincidence!

Patrick
 

So that's most likely the explanation.
I will have to remember to systematically check a reliable MW frequency every time I take a reading from a recording.

Thanks for the prompt reply.




Patrick
 


Hi!

This time I ran LIVE listening, again using the RWM time station as the calibration source on HF, and the BBC on LF (confirmed against the TDF 162 kHz time station).
The values obtained are as follows:

14.996 MHz : +3.4 Hz (usual value)
 9.996 MHz : +2.3 Hz (as expected)
 4.996 MHz : +1.1 Hz (as expected)
 0.198 MHz : +1 Hz   (???)

Same results as previously with the recorded file.
Screenshots are attached.

prog
 

On Wed, Oct 30, 2019 at 04:49 PM, Patrick wrote:
  0.198 : +1 Hz    (???)
Assuming a sample rate of 768 ksps, your calculation is only valid when the center frequency is >= 300 kHz. Below that, you should use 300 kHz in the offset calculation, because the physical LO is fixed there. For other sample rates, you should use different offsets in the calculation.
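This rule can be sketched numerically. The Python below is purely illustrative (the helper function and the ~0.23 ppm error figure, inferred from the +2.3 Hz reading at 9.996 MHz reported above, are not Airspy's actual code):

```python
def expected_offset_hz(tuned_hz: float, ppm_error: float,
                       lo_floor_hz: float = 300_000) -> float:
    """Expected absolute reading error for a given oscillator ppm error.

    Below the floor the physical LO stays parked at lo_floor_hz
    (300 kHz for a 768 ksps sample rate, per the explanation above),
    so the error stops shrinking with the tuned frequency.
    """
    effective_hz = max(tuned_hz, lo_floor_hz)
    return effective_hz * ppm_error / 1e6

# With a ~0.23 ppm error:
# at 9.996 MHz -> about +2.3 Hz, matching the measured value
# at 198 kHz   -> computed from the 300 kHz floor, about +0.07 Hz,
#                 well below the 1 Hz resolution of the recording metadata
```

So the +1 Hz seen at 198 kHz cannot come from the oscillator error alone, which is consistent with the 1 Hz file-resolution explanation given earlier.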

Patrick
 

All I actually need is accurate readings on the MW band, so it should be fine.

<< your calculation is only valid when center frequency is >= 300 kHz >>
I set the center frequency to 500 kHz and the reading remains 198.001.
I tried different sample rates, with the same result. That's probably too complicated for my tiny brain!

Just out of curiosity, how can I simply read the correct frequency, i.e. 198.000, or 162.000 for TDF?

Thanks.



prog
 

On Wed, Oct 30, 2019 at 05:30 PM, Patrick wrote:
Just out of curiosity, how can I simply read the correct frequency, i.e. 198.000, or 162.000 for TDF?
Buy a real spectrum analyzer.

Monroe Pattillo
 

If you’re specifically in the US (not north of the border in Canada) and near (within about 1 mile of) the final approach flight path of an ILS-equipped airport, you could try the 75 MHz, 400 Hz tone Outer Marker (OM) beacon as a short-range RF calibration source in that band.

 

https://en.wikipedia.org/wiki/Marker_beacon

 

We’ve used it as a calibration source for amateur radio astronomy receivers. I don’t know whether the ILS OM transmitter frequency is aligned and maintained to atomic-clock accuracy, but it should be at least as accurate as a local rubidium source. Transmission frequency accuracy is required because the ILS OM is used for aircraft landing alignment. That should be sufficiently accurate for most SDR frequency calibrations in that frequency range. It’s certainly a zero-cost calibration source, and unlike an internal calibration source it includes any preamplifier and the antenna in the reception path.

 

RWM, Beta, and TDF are possible but more challenging to receive in the US, and even when received, the propagation delay over the distance to the transmitters leaves them subject to inaccuracies from atmospheric propagation distortion.

 

Monroe


kb3cs
 

US stations are required to maintain their carrier frequency within 20 Hz. Typical accuracy is within +/- 6 Hz.

  - 45 (base 17) -



 

 

SV1BTL
 

A practical guide:
- Tune to the WWV stations (2500, 5000, 10000, and 15000 kHz), e.g. 5 MHz USB.
- Then re-tune 1 kHz down, to 4999 kHz USB. You will hear a 1 kHz audio tone.
- Compare what you hear with a 1 kHz sine tone (you can find many samples on the net - https://www.youtube.com/watch?v=PyD9cMarVJk).
- The two tones must be identical.
 
You can also use many other stable carriers, such as DCF77, the German longwave time signal on 77.5 kHz. Tune 1 kHz down (76.5 kHz USB) and you will again hear a 1 kHz audio tone.
 
I used this method when I installed the server, calibrating as follows:
- AirSpy Discovery: frequency_correction_ppb = -50
- AirSpy HF+: frequency_correction_ppb = -200
They have been on air 24/7, and there has been no need to re-calibrate them so far.

I find this a practical way to calibrate an SDR if you don't have a well-calibrated spectrum analyzer.
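The frequency_correction_ppb values above can be computed from the offset measured against any known carrier. A minimal sketch (the sign convention — a receiver that reads high needs a negative correction — is inferred from the negative values quoted above, not from documented behavior):

```python
def correction_ppb(f_true_hz: float, f_measured_hz: float) -> int:
    """Correction, in parts per billion, to cancel a measured offset.

    f_true_hz:     the station's known carrier frequency
    f_measured_hz: where the carrier actually appears on the SDR

    Returns the value to enter as frequency_correction_ppb, assuming
    a receiver reading high needs a negative correction.
    """
    error_ppb = (f_measured_hz - f_true_hz) / f_true_hz * 1e9
    return -round(error_ppb)

# A carrier at a true 9,996,000 Hz appearing at 9,996,002.3 Hz
# (a +2.3 Hz error) calls for a correction of about -230 ppb.
```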

jdow
 

Pray tell, what makes you think marker beacons are precise frequency references? The US NTIA specifies that they be within 50 ppm of the correct frequency. There are far, far better sources of frequency accuracy: police radios are within 5 ppm, aircraft are within 5 ppm, and portions of the digital TV signal are very precisely known. Google for it.

{^_^}


Monroe Pattillo
 

Yes, DVB-T channel frequency is required to be very accurate (10^-12), far better than the US NTIA ILS allocation of 50 ppm, which dates back to just after WW II, or the 20 ppm required by current FAA ILS certification, or maybe even the FAA airport certification empirical test results, which are likely well under 20 ppm assuming they use commonly available, inexpensive TCXO technology at 1 ppm.

 

I believe DVB-T stations use GPS + PPS + OCXO (while the GPS location solution might drift, the atomic clocks behind the PPS are accurate to about 10 ns, and the OCXO provides short-duration holdover). That should be more accurate than our current SDR receivers could fully exploit, although the typical AFC lock range for consumer TVs appears to be +/- 70 kHz, which is far sloppier than we would want for a calibration source. If DVB-T transmissions are so accurate, why wouldn’t we abandon all other publicly accessible calibration sources and use only the leading edge of a DTV channel? Maybe someone out there knows the reason and is willing to educate the rest of us.

 

The AirSpy R2 and Discovery specs say 0.5 ppm. I’m not sure what they use, maybe a TCXO (10^-6). A contemporary (less expensive than they used to be) rubidium source is 10^-9.
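To put these stability classes in context, a ppm spec converts directly to a worst-case absolute error at a given carrier. Illustrative Python, using figures quoted in this thread:

```python
def worst_case_error_hz(spec_ppm: float, carrier_hz: float) -> float:
    """Worst-case frequency error implied by a ppm stability spec."""
    return spec_ppm * carrier_hz / 1e6

# Figures from this thread:
# NTIA's 50 ppm at the 75 MHz marker beacon -> up to 3,750 Hz off
# Airspy's 0.5 ppm spec at 10 MHz           -> up to 5 Hz off
#   (consistent with the "less than 5 Hz off at 10 MHz" reported above)
# a 10^-9 (0.001 ppm) rubidium at 10 MHz    -> about 0.01 Hz
```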

 

Contemporary airport ILS transmitters, in order to improve their Class (I, II, III), might use GPS + PPS + a local rubidium source to assure certification compliance at a sustainably improved class, but who knows, since the certification data is locked behind an FAA employee login. If we knew for sure, it might make the ILS OM more usable than its US NTIA allocation or its post-WW II original specs suggest at first glance. The ILS OM is also far easier to demodulate and decode than a DVB-T signal, particularly since we bypass the built-in demodulation of our SDRs.

 

Does anyone out there have certification test equipment that can check the frequency accuracy of the ILS OM of a local US airport?

 

Monroe


jdow
 

DVB-T in the US? (The equivalent in the US is also very precise.)
{^_-}


Joe M.
 

That would be ATSC for those who want to Google it.

Joe M.

On 10/31/2019 9:26 AM, jdow wrote:
DVB-T in the US? (The equivalent in the US is also very precise.)
{^_-}

Monroe Pattillo
 

Yes, it's ATSC DTV here, which is also very precise in its transmission characteristics. It's like buying GNSS receivers: they always have to support the US variant for commercial marketplace success. It's akin to buying an SDR receiver device that says it's DVB-T chip based: it's all just DTTV. Disable the built-in channelization and demodulation and the result is a frequency-agile receiver with outboard, software-based channel definition and demodulation, which is put into products and just called an SDR ;>)

Regardless of the DTTV national flavor, I guess the question still stands. If the frequency of these transmissions is so accurate, why wouldn't we abandon all other publicly accessible calibration sources and use only the leading edge of a DTTV channel for SDR calibration? Maybe someone out there knows the reason why we don't and is willing to educate the rest of us.

Monroe


Curt Faulk
 

I'm a retired FAA air traffic controller, and my son is currently a fully certified FAA ILS technician. I've forwarded this topic to him to see if he can provide any useful information, specifically regarding the frequency accuracy of ILS marker beacons.

If he has anything useful, I'll post it here.

BryonB
 

I don't think most of our ATSC TV broadcasters in the US are in the frequency range of the HF+/Discovery.

Most are now UHF. There are a few in the high-VHF range, but only in some locations.






Joe M.
 

Channels 2-13 are. Channels 14-36 are not.

Joe M.

On 10/31/2019 11:15 AM, Bryon NF6M wrote:
I don't think most of our ATSC TV broadcasters in the US are in the
frequency range of the HF+/Discovery.

Most are now UHF. There are a few in HI-VHF range, but only in some
locations.

On Thu, 31 Oct 2019 7:54 am Monroe Pattillo,
<@monroepattillo <mailto:@monroepattillo>>
wrote:

Yes, it's ATSC DTV here which is also very precise in its
transmission characteristics. It's like buying GNSS receivers they
always have to support the US variant for commercial marketplace
success. It's akin to buying a SDR receiver device and it says it's
DVB-T chip based, but it's all just DTTV, disable the built-in
channelization and demodulation and the result is a software
frequency agile receiver with outboard software based channel
definition and demodulation, which is put into products and just
called a SDR ;>)

Regardless of the DTTV national flavor I guess the question still
stands. If the frequency of their transmissions are so accurate why
wouldn’t we abandon all other publicly accessible calibration
sources and use only the leading edge of a DTTV channel for SDR
calibration? Maybe someone out there might know the reason why we
don't and be willing to educate the rest of us.

Monroe
-----Original Message-----
From: airspy@groups.io <mailto:airspy@groups.io>
[mailto:airspy@groups.io <mailto:airspy@groups.io>] On Behalf Of jdow
Sent: Thursday, October 31, 2019 9:26 AM
To: airspy@groups.io <mailto:airspy@groups.io>
Subject: Re: [airspy] Understanding the frequency calibration

DVB-T in the US? (The equivalent in the US is also very precise.)
{^_-}

On 20191031 00:19:37, Monroe Pattillo wrote:
> Yes, DVB-T channel frequency is required to be very accurate (10^-12) and is
> far more accurate than the US NTIA ILS allocation of 50ppm, which dates back
> to post-WW II, or the current FAA ILS certification required 20ppm, or maybe
> even FAA airport certification empirical testing results, which are likely
> << 20ppm assuming they use commonly available inexpensive TCXO technology at
> 1ppm.
>
> I believe DVB-T stations use GPS+PPS+OCXO (while GPS location might drift,
> the atomic clock putting out the PPS is accurate to about 10ns, and the OCXO
> provides a short duration lock). That should be more accurate than our
> current SDR receivers could fully utilize. Although the typical AFC lock
> range for consumer TVs appears to be +/-70kHz, which is far sloppier than we
> would want for a calibration source. If DVB-T transmissions are so accurate,
> why wouldn’t we abandon all other publicly accessible calibration sources
> and use only the leading edge of a DTV channel? Maybe someone out there
> might know the reason and be willing to educate the rest of us.
>
> The AirSpy R2 and Discovery spec says 0.5ppm. I’m not sure what they use,
> maybe TCXO (10^-6). A contemporary (less expensive than they used to be)
> Rubidium source is 10^-9.
>
> Contemporary airport ILS transmitters, in order to improve their Class (I,
> II, III), might use GPS+PPS+local Rubidium source to assure certification
> compliance at a sustainable improved class, but who knows, when the
> certification data is locked behind an FAA employee login. If we knew for
> sure, it might make ILS OM more usable than what might appear at first
> glance from its US NTIA allocation or its post-WW II original specs. ILS OM
> is far easier to demodulate and decode than a DVB-T signal, particularly
> since we disengage that built-in demodulation of our SDRs.
>
> Does anyone out there have certification test equipment that can check the
> frequency accuracy of the ILS OM of a local US airport?
>
> Monroe
>
> -----Original Message-----
> From: airspy@groups.io [mailto:airspy@groups.io] On Behalf Of jdow
> Sent: Wednesday, October 30, 2019 9:23 PM
> To: airspy@groups.io
> Subject: Re: [airspy] Understanding the frequency calibration
>
> Pray tell what makes you think marker beacons are precise frequency
> references? The US NTIA specifies they be within 50 ppm of the correct
> frequency. There are far far better sources of frequency accuracy. Police
> radios are within 5 ppm. Aircraft are within 5 ppm. Portions of the digital
> TV signal are very precisely known. Google for it.
>
> {^_^}
>
> On 20191030 09:40:09, Monroe Pattillo wrote:
>
> > If you’re specifically in the US (not North of the US in Canada) and near
> > (within 1 mile of) the final approach flight path of an ILS supportive
> > airport, you could try the 75MHz 400Hz tone Outer Marker (OM) beacon as a
> > short range RF calibration source in that band.
> >
> > https://en.wikipedia.org/wiki/Marker_beacon
> >
> > We’ve used it as a calibration source for amateur radio astronomy
> > receivers. I don’t know if the ILS OM transmitter frequency is atomic
> > clock accurately aligned and maintained. It should be at least local
> > Rubidium source accurate. Transmission frequency accuracy is required as
> > the ILS OM is used for aircraft landing alignments. This should be
> > sufficiently accurate for most SDR frequency calibrations in that
> > frequency range. It’s certainly a zero cost calibration source, and unlike
> > an internal calibration source it does include any preamplifier and the
> > antenna in the reception equipment path.
> >
> > RWM, Beta, and TDF are possible but more challenging to receive in the US,
> > and even if received there is sufficient propagation delay, given the
> > distance to the transmitters, to be subject to inaccuracies due to
> > atmospheric propagation distortion.
> >
> > Monroe
> >
> > *From:* airspy@groups.io [mailto:airspy@groups.io] *On Behalf Of* prog
> > *Sent:* Wednesday, October 30, 2019 11:56 AM
> > *To:* airspy@groups.io
> > *Subject:* Re: [airspy] Understanding the frequency calibration
> >
> > On Wed, Oct 30, 2019 at 04:49 PM, Patrick wrote:
> >
> >     0.198 : +1 Hz (???)
> >
> > Assuming a sample rate of 768 ksps, your calculation is only valid when
> > center frequency is >= 300 kHz. Below that, you should use 300 kHz for the
> > calculation of the offset, because the physical LO is fixed there. For
> > other sample rates, you should use different offsets in the calculation.
>








Monroe Pattillo
 

Thanks.  That might be interesting information.  In the 1990s the primary concern for ILS certification accuracy was its navigational antenna placement and pointing accuracy, to small fractions of a degree, and not so much attaining high precision for its transmitter frequency or tone.  Receivers in aircraft have to be tolerant back to the post-WW II specs because there are legacy airports still in operation.  With the advent of higher accuracy GNSS, ILS, while still in use, has dropped a peg.  Improving ILS transmitter accuracy using contemporary technologies would improve the navigational accuracy a little bit.  However, once next-gen GNSS satellites and GNSS+IMU receivers are ubiquitous, the original ILS inaccuracies will likely be the cause of its fading away.  Canada has long had its ILS upgrade program.

 

                https://www.canso.org/nav-canada-announces-final-stage-country-wide-ils-replacement-program

 

Here’s an ILS conformance analyzer they use, where the 75 MHz Marker Beacon has a frequency tolerance of 0.0004%, i.e. 300 Hz or 4 ppm.

 

                https://www.indracompany.com/sites/default/files/indra-normarc_7710.pdf

 

That’s far better than the US NTIA channel allocation of 50ppm, and far better than the US FAA certification requirement of 20ppm, but better still would be welcome for our SDR calibration.
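As a quick sanity check on those ppm figures, here is a small sketch (Python) converting each tolerance quoted in this thread into a worst-case error in Hz at the 75 MHz marker-beacon frequency. The function name is illustrative, not from any library.

```python
# Convert a ppm tolerance to a worst-case absolute frequency error in Hz
# at a given carrier. Evaluated here at the 75 MHz ILS marker beacon.

CARRIER_HZ = 75_000_000  # ILS marker beacon carrier

def worst_case_error_hz(ppm: float, carrier_hz: int = CARRIER_HZ) -> float:
    """Worst-case error in Hz for a tolerance expressed in parts per million."""
    return carrier_hz * ppm / 1e6

print(worst_case_error_hz(50))  # US NTIA allocation -> 3750.0 Hz
print(worst_case_error_hz(20))  # FAA certification  -> 1500.0 Hz
print(worst_case_error_hz(4))   # Normarc analyzer   -> 300.0 Hz (= 0.0004%)
```

The 4 ppm row reproduces the 300 Hz figure from the Normarc datasheet, which is the consistency check behind the percentages above.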

 

But still, the question remains: why not use DTTV as a highly accurate SDR calibration source?  It’s local.  It’s free.

 

Monroe

From: airspy@groups.io [mailto:airspy@groups.io] On Behalf Of Curt Faulk
Sent: Thursday, October 31, 2019 10:49 AM
To: airspy@groups.io
Subject: Re: [airspy] Understanding the frequency calibration

 

I'm a retired FAA air traffic controller and my son is currently a fully-certified FAA ILS technician.  I've forwarded this topic to him to see if he can provide any useful information to us, specifically regarding frequency accuracy of ILS marker beacons.

If he gives me anything useful, I'll post it here.