Re: WSPR data analysis #wspr #qcx40


Hans Summers
 

Hi Ted, all

I have taken a different approach whenever I needed to analyze a download of a month's worth of WSPRnet data. I used Excel with some VBA code that opens the downloaded WSPRnet csv file and reads it sequentially, one line at a time. Then I apply a filter for whatever rows I want to look at, and any row that matches gets written out into another csv file. 
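(For illustration, the same line-by-line filter idea sketched in a few lines of Python; the file names and callsign here are just placeholders, and this is not the VBA code Hans mentions.)

# A rough Python equivalent of the line-by-line filter described above.
# File names and the callsign are placeholders; adjust to suit.
INPUT_CSV = "wsprspots.csv"      # the big WSPRnet monthly download
OUTPUT_CSV = "filtered.csv"      # rows that pass the filter
CALLSIGN = "G0UPL"               # whatever you want to match on

with open(INPUT_CSV, "r", encoding="utf-8") as src, \
     open(OUTPUT_CSV, "w", encoding="utf-8") as dst:
    for line in src:             # read one row at a time, never the whole file
        if CALLSIGN in line:     # keep only rows mentioning the callsign
            dst.write(line)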

I've always done it this way. 

To an extent it depends what software you have available and what you're familiar with. I had Excel available and am familiar with VBA having used it professionally for many years. So that was the solution I chose. 

Ted, let me know if you're interested and I can find the VBA code I used and email it to you.

73 Hans G0UPL 

On Thu, Jun 11, 2020, 16:51 Graham, VE3GTC <colonelkrypton@...> wrote:
I do just what Michael has described.

You don't need a computer with Linux, however, although having one makes it easier to get going.

On a Windows computer you can install many of the common *NIX command line tools, and there are other WIN32 GNU command line tool sources as well.

BUT, these are not for everyone.

An alternative already comes with Windows.

There is a command line tool called FINDSTR.

Open a cmd.exe window and enter:  help findstr  and you will see a simple listing of how to use this tool.

findstr will search a specified file for a specified string, just like using grep in Michael's example.

For example, at the C:\ prompt:   findstr callsignToSearchFor wsprFile.csv > results.csv

This will search wsprFile.csv for the string callsignToSearchFor and will put the matching lines in the file results.csv.

This will reduce that very large .csv file down to something more manageable that Excel can import.

Also, SQLite is a very good alternative to Microsoft Access if you wish to go that way.
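(If you do try SQLite, here is a minimal sketch using Python's built-in csv and sqlite3 modules to load an already-filtered csv; the database name and the column list are illustrative assumptions, not the real WSPRnet layout.)

# Load a filtered WSPR csv into SQLite for further querying.
# Column names below are illustrative only, not the actual WSPRnet schema.
import csv
import sqlite3

conn = sqlite3.connect("wspr.db")
conn.execute("""CREATE TABLE IF NOT EXISTS spots (
    spot_id TEXT, timestamp TEXT, reporter TEXT, reporter_grid TEXT,
    snr TEXT, frequency TEXT, call_sign TEXT, grid TEXT, power TEXT)""")

with open("results.csv", newline="", encoding="utf-8") as f:
    for row in csv.reader(f):
        if len(row) >= 9:        # skip any short or malformed rows
            conn.execute("INSERT INTO spots VALUES (?,?,?,?,?,?,?,?,?)",
                         row[:9])
conn.commit()
conn.close()

Once loaded, ordinary SQL queries (for example, filtering by call_sign or snr) can take the place of Excel filters.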

cheers, Graham ve3gtc


On Thu, Jun 11, 2020 at 1:24 PM Michael Babineau <mbabineau.ve3wmb@...> wrote:
Ted :

The simple approach that I have used to deal with these massive WSPR .csv files is to pre-process the file on a Linux machine
to extract the relevant data into a new file, discarding all of the records that I don't care about. Then I transfer that new file to my
Windows machine and open it with Excel.

The grep utility on Linux/Unix is line-based, so it reads one line at a time and processes it very efficiently. It can search using any
regular expression, so it is a simple matter to grep the .csv for your callsign and then use ">" to redirect the output
to a new text file.  Something like:  grep "VE3WMB" wspr.csv > newcsvfile.csv  would search wspr.csv for VE3WMB and output
only those lines. The '>' takes those output lines and redirects them to a new file named newcsvfile.csv.

You can do this on any Linux/Unix box, even a Raspberry Pi. If you don't have a Linux machine you could create a USB stick
with a live Linux distribution and boot from that without installing Linux on your computer. Or, if this seems too difficult, find someone
you know who is a Linux user and get them to pre-process the file for you.

The advantage of this approach is that you separate the "wheat from the chaff" before you try to do anything with the data.
With a database you are still loading a pile of information that you really just want to discard.

Cheers

Michael VE3WMB 
