RTL-SDRs are great devices. They are known for being small, inexpensive, low in power demand, operable from low frequencies through L band, and workable with numerous signal processing packages. They are NOT known for having precise and stable clock oscillators. Frequency errors of 30 to 50 parts per million are common. For users tuning VHF narrowband voice or digital modes, such an error is enough to push signals to the edge of the receiver passband. In some cases, signals may fall outside a narrow passband and be completely missed.
One solution is to only use RTL-SDRs sold with upgraded clock oscillators, such as the NESDR SMArt or RTL-SDR.com V3. Another way to put the RTL-SDR on frequency is to check it against a known signal and manually program the offset into applications controlling the dongle. That is often accurate enough, but still a broad approximation. A better solution, also applicable to upgraded devices, is to automate the process of checking against known signals and get a more precise measurement based on the mean value of a series of phase measurements. For that, there is a nifty application named Kalibrate-RTL.
Kalibrate-RTL uses mobile telephone calibration signals, which are referenced to GNSS for precise frequency and time. Kalibrate-RTL also eliminates human factors in measuring offsets: it takes several readings and averages them to compute a mean offset. That is quite good enough for most users. For critical applications demanding minimal drift and precision tuning, an SDR better than an RTL dongle should be considered.
I have written a Bash script, called sdr-params.sh, which you can use to execute Kalibrate-RTL and save the offset to a file. That saved ppm offset may then be read by other software or written into their configuration files so that you do not need to poke around in the menus to manually program a correction. This is easily done for programs like Gqrx, where the ppm correction is easy to find and edit with a tool like sed.
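As a sketch of that sed edit, here is one way it might look. The key name "corr_freq", the Gqrx config path, and the offset file location are all assumptions for illustration; verify them against your own Gqrx installation before use.

```shell
#!/bin/sh
# Hypothetical example: patch a saved ppm offset into a config file.
# "corr_freq", the config path, and the offset file are assumptions;
# check your Gqrx version's actual key and paths.
PPM=$(cat "$HOME/.config/sdr/ppm-offset")
sed -i "s/^corr_freq=.*/corr_freq=${PPM}/" "$HOME/.config/gqrx/default.conf"
```

The same pattern works for any application that keeps its correction in a plain-text key=value configuration file.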
It is a simple matter to calibrate a dedicated receiver at boot time, as the script can be executed from a system init script. Here is a snippet of code which would run from a launcher in /etc/xdg/autostart/ or the script /etc/init.d/rc.local:
# calibrate the rtl-sdr
sh -c "kal.sh gsm850" &
The above example could also be run regularly as a cron job, but be aware that some scripting would be necessary to stop other processes which may be using the device, then start them up again after the calibration finishes. Calibration of SDR devices with very large offsets may fail, as the cell tower signals may fall outside of the receiver passband. In that case, use your favorite SDR application to manually find the strongest GSM signals, then invoke kal manually on one of them.
If you ran the utility manually, you would enter something similar to what is shown below. Parameter "f" is for frequency, where a strong GSM signal happens to be, while parameter "g" represents the SDR gain as a percentage of the min-max range.
kal -v -f 872.6 -g 55
What returns, after a brief interval, is a batch of data conveying statistics of the ppm offset of the radio. Use the average, but be aware of the standard deviation. On a good device, the ppm offset will be a single-digit number. RTL-SDRs are still usable even if the offset is 30 or 40 ppm.
Using GSM-850 channel 145 (872.6MHz)
average         [min, max]      (range, stddev)
- 4.093kHz      [-4115, -4084]  (31, 5.723481)
overruns: 0
not found: 0
average absolute error: 4.812 ppm
To actually let the computer handle the matter of using the offset data, the process should be scripted, writing the average ppm offset of the RTL-SDR to a file or environment variable. Tools on your system which need the offset can then be configured to read that file or variable. For more scripts and snippets useful for SDR operating, see the Skywave Linux-v4 Github repository.
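The extraction can be sketched with standard text tools, keying on kal's final summary line ("average absolute error: 4.812 ppm"). The output file path below is an assumption; put it wherever your other tools expect to find it.

```shell
#!/bin/sh
# Sketch: run kal, pull the average ppm from its summary line, and save
# it for other tools to read. The output path is an assumption.
mkdir -p "$HOME/.config/sdr"
kal -v -f 872.6 -g 55 2>/dev/null \
  | awk '/average absolute error/ {print $4}' \
  > "$HOME/.config/sdr/ppm-offset"
```

Other programs can then read that file directly, or a session script can export it, for example: export SDR_PPM=$(cat ~/.config/sdr/ppm-offset).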