Several Missing/Corrupted SRTM 01s Data tiles

I have been making topographic maps of the northern area of Korea for a few years, but this code no longer works (I am not sure when it broke, as I last used it perhaps eight months ago):

```
grdcut @earth_relief_01s -V -R126/128/40.5/42
grdgradient -N1 -A100 -fg
grdimage -JM -CgrayC+h0.5 -P >
```

grdimage returns the error:

```
grdimage [ERROR]: Passing zmax <= zmin prevents automatic CPT generation!
grdimage [ERROR]: Failed to read CPT grayC+h0.5.
```

I suspect this may have to do with missing data in the SRTM files on the remote server, as a sample test of the region from N40E124 to N43E131 returned similar errors throughout. Is this something that can be rectified on the GMT side, or should I download the SRTM files directly from USGS?
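For completeness, the fully spelled-out classic-mode pipeline I run is below; the filenames (`korea_relief.grd`, `korea_grad.grd`, `korea.ps`) and the `-JM15c` map width are placeholders, not my exact script:

```shell
# Cut the 1 arc-second remote relief tile set to the region of interest;
# placeholder output names are used throughout.
gmt grdcut @earth_relief_01s -R126/128/40.5/42 -Gkorea_relief.grd -V
# Compute an illumination grid from the relief (geographic coordinates).
gmt grdgradient korea_relief.grd -N1 -A100 -fg -Gkorea_grad.grd
# Plot the shaded relief with the grayC CPT (PostScript output).
gmt grdimage korea_relief.grd -Ikorea_grad.grd -JM15c -CgrayC+h0.5 -P > korea.ps
```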

The files on the server are fine. However, we have noticed that if libcurl fails during a connection you may end up with corrupted files locally. Have a look at what you have in `~/.gmt/server/earth/earth_relief/earth_relief_01s` and see if any of those tiles have unusually small sizes; if so, just delete them and redownload via your script.
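Something like this will flag suspect tiles (the 100 kB cutoff is just a rough guess at "unusually small"; healthy 01s tiles are several MB each, and you should adjust the path if your GMT user directory lives elsewhere):

```shell
# Cache directory for the earth_relief_01s tiles; adjust if your
# GMT user directory is not ~/.gmt.
CACHE=~/.gmt/server/earth/earth_relief/earth_relief_01s
# List any cached file under ~100 kB -- likely a truncated download.
if [ -d "$CACHE" ]; then
    find "$CACHE" -type f -size -100k -print
fi
# Once you have eyeballed the list, delete the suspects the same way:
#     find "$CACHE" -type f -size -100k -delete
```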

Thank you for the response, Paul!
I am stumped by this. I deleted all my cached files for the Korean peninsula and ran a couple of scripts again. Anything below N40 produced topographic images; anything above did not (grdimage again returned the zmax <= zmin error). I did not receive any libcurl errors in the process.
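For reference, rather than deleting individual tiles by hand, I believe GMT can also wipe the downloaded remote data wholesale (this removes all cached datasets, not just the Korean tiles):

```shell
# Remove all downloaded remote datasets from the GMT user directory;
# tiles are re-fetched from the server on the next request.
gmt clear data
# "gmt clear cache" removes only the cache directory, and
# "gmt clear all" wipes the entire user directory.
```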

I am guessing something on your computer is preventing the connection. I just ran your example over a weak wifi connection at the apartment where I am staying in Oxford, and it happily went to Hawaii to get the required files. So it is not a problem with GMT.

BTW, you have some stray @ characters in front of files that you are creating; the @ prefix is only for remote files.

Thank you again, Paul. I appreciate you sparing the time to help me out.

I must still have a local cache of the earth_relief files for N40 and above somewhere. I cannot for the life of me figure out what else in my local setup would have changed to cause this to fail so specifically.

It also worked fine for me.