Stop wget clobbering local file if server unavailable

Chris asked:

I have a cron job on a server which once a day uses wget to download “Earth Orientation Parameters” and leap-second data from a NASA server. Specifically:

wget https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/usno_finals.erp -O .eops
wget https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/ut1ls.dat -O .ut1ls

This works fine. However, it seems that when the server is unavailable, wget clobbers my local files (file size 0). Is there any way to tell wget to abort if the server is not available and leave the local file unaffected? (The files contain predictions a couple of months ahead, so missing the update for a few days until the server comes back is not a problem.)

My answer:


That’s the documented behavior of -O, so you shouldn’t be using it if this is not the behavior you want.
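
You can see why: -O opens (and truncates) the named output file before the transfer even starts, so a failed connection still leaves a zero-byte file behind. A quick way to demonstrate this, using a deliberately bogus hostname (bad.example.invalid is a placeholder, not a real server):

wget https://bad.example.invalid/usno_finals.erp -O .eops
ls -l .eops

The DNS lookup fails, yet .eops now exists with size 0, because -O truncated it before wget ever reached the network.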

By default, wget names the downloaded file after the name supplied by the server (this is version-dependent) or, if none was given, after the basename of the URL. Since you want the local copy under a different name, you can take advantage of this.

For example, you can download the file and then only copy it over the existing file if the download was successful.

wget https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files/usno_finals.erp && \
mv usno_finals.erp .eops

While the server was down, wget timed out: no usno_finals.erp was created, wget returned a non-zero exit status, and so mv was never called.

When someone at Goddard gets their head out of their … whatever … and fixes their server, you’ll be able to see that the file gets created as expected.
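
If you want the cron job itself to be a little more defensive, here is a minimal sketch along the same lines that fetches both files through a scratch directory, so a failed or partial download never touches the live copies. The script layout and variable names are mine, not part of the original answer:

#!/bin/sh
# Fetch EOP and leap-second data into a scratch directory, then
# move each file into place only if its download succeeded.

BASE=https://gemini.gsfc.nasa.gov/500/oper/solve_apriori_files
TMP=$(mktemp -d) || exit 1
trap 'rm -rf "$TMP"' EXIT

# wget exits non-zero on failure, so each mv runs only after a good download.
wget -q "$BASE/usno_finals.erp" -O "$TMP/usno_finals.erp" &&
    mv "$TMP/usno_finals.erp" .eops

wget -q "$BASE/ut1ls.dat" -O "$TMP/ut1ls.dat" &&
    mv "$TMP/ut1ls.dat" .ut1ls

Note that -O is harmless here: clobbering a file inside the throwaway directory costs nothing, and the real .eops and .ut1ls are only replaced by mv after wget reports success.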


View the full question and answer on Server Fault.

This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.