Figure.grdimage causes segfault with certain images

I am working with a bunch of images downloaded from elevation.alaska.gov. After downloading them, I use gdal.Warp() to convert them to EPSG:4326 so they work with grdimage (if I don't, I get a "Your grid y's or latitudes appear to be outside the map region and will be skipped" error). I then loop through the files, calling grdimage() on each one to add them to the figure.
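For context, the workflow described above looks roughly like this. It's only a sketch: the paths and glob pattern are placeholders, and the gdal/pygmt calls assume those packages are installed.

```python
import glob
import os


def latlon_name(src):
    """Derive an output filename for the reprojected copy of `src`."""
    root, _ = os.path.splitext(src)
    return root + "_latlon.tiff"


def plot_tiles(pattern, bounds):
    from osgeo import gdal  # GDAL Python bindings
    import pygmt

    fig = pygmt.Figure()
    fig.basemap(projection="M8i", region=bounds, frame=("WeSn", "afg"))
    for src in sorted(glob.glob(pattern)):
        dst = latlon_name(src)
        # Reproject from the source CRS (Alaska Albers) to geographic
        # lat/lon so grdimage positions the tile correctly.
        gdal.Warp(dst, src, dstSRS="EPSG:4326")
        fig.grdimage(dst, dpi=100, monochrome=True)
    return fig


# Usage (hypothetical paths):
# fig = plot_tiles("/path/to/downloads/*.tif", [-183, -176.37, 50.155, 53.228])
# fig.show()
```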

Unfortunately, with some of the files produced this way, calling grdimage causes a segfault with the following error message being printed:

grdimage (gmt_gdalread): Could not reallocate memory [137438953472.00 Gb, 18446744073709451063 items of 8 bytes]

I have made one such image that produces this issue available here: https://univalaska-my.sharepoint.com/:i:/g/personal/ijbrewster_alaska_edu/EUk4DMuOx9JKor3FIAohiLMBfcwPvz3h6WZ2lHGHa-dmXw?e=1UMyyz

and a minimal code example that reproduces the issue:

import pygmt
test_file = '/path/to/23.tiff'
bounds = [
    -183,
    -176.37,
    50.155,
    53.228
]

fig = pygmt.Figure()
fig.basemap(projection="M8i", region=bounds, frame=('WeSn', 'afg'))
fig.grdimage(test_file, dpi=100, monochrome=True)

I am so far unable to find anything special/different about this file relative to all the ones that work. Perhaps someone here can shed some light on the issue?

I’m also happy to entertain suggestions for process improvements - for example, is there a way I can use the original .tif files in grdimage directly, rather than having to run them through gdal.Warp()? If so, maybe that would solve the issue.

I have tried feeding all the files to gdal.Warp() to produce a single large file, and this does appear to work. However, a) it is extremely slow, b) it uses a LOT of memory, and c) it produces a 22 GB TIFF from only around 4 GB of source files for the bounds listed in the example code, presumably because the individual files have very little coverage relative to the target area. I suspect the final file size is also responsible for the slowness, since 22 GB has to be shuffled to/from disk. As such, I would prefer to work with the individual files, if possible.
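For what it's worth, if the single-mosaic route is ever revisited: gdal.Warp accepts a list of input files, and GTiff creation options like COMPRESS can shrink the output considerably. This is only a sketch; the paths, region, and helper are placeholders.

```python
def warp_bounds(region):
    """Convert a GMT-style [west, east, south, north] region to GDAL's
    outputBounds order (minX, minY, maxX, maxY)."""
    west, east, south, north = region
    return (west, south, east, north)


def build_mosaic(sources, out_path, region):
    from osgeo import gdal  # GDAL Python bindings

    return gdal.Warp(
        out_path,
        sources,                              # list of input tiles
        dstSRS="EPSG:4326",
        outputBounds=warp_bounds(region),     # clip to the map region
        creationOptions=["COMPRESS=DEFLATE", "TILED=YES"],
    )


# Usage (hypothetical):
# build_mosaic(["a.tif", "b.tif"], "mosaic.tiff",
#              [-183, -176.37, 50.155, 53.228])
```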

Running gmt grdinfo 23.tiff gives:

$ gmt grdinfo 23.tiff  
23.tiff: Title: Grid imported via GDAL
23.tiff: Command:
23.tiff: Remark:
23.tiff: Gridline node registration used [Geographic grid]
23.tiff: Grid file format: gd = Import/export through GDAL
23.tiff: x_min: -176.380881942 x_max: -175.870955995 x_inc: 6.57121066617e-05 name: x n_columns: 7761
23.tiff: y_min: 51.6975655499 y_max: 52.0521480774 y_inc: 6.57121066617e-05 name: y n_rows: 5397
23.tiff: v_min: 0 v_max: 808.335205078 name: z
23.tiff: scale_factor: 1 add_offset: 0
+proj=longlat +datum=WGS84 +no_defs

It shows the grid bounds are [-176.38, -175.87, 51.69, 52.05]. Perhaps the segfault happens because you are giving the wrong region/bounds?

Interesting. Indeed, that appears to be the problem. By clipping the file to the desired region using the outputBounds option of gdal.Warp, grdimage now runs without error. Odd, but at least it's functional.
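A sketch of that workaround: intersect the requested map region with the grid's own extent before warping, so gdal.Warp is never asked for a sub-region that overflows the file. The bounds below are taken from the example and the grdinfo output; the paths in the commented call are placeholders.

```python
def clamp_region(region, grid_bounds):
    """Intersect two [west, east, south, north] boxes; None if disjoint."""
    west = max(region[0], grid_bounds[0])
    east = min(region[1], grid_bounds[1])
    south = max(region[2], grid_bounds[2])
    north = min(region[3], grid_bounds[3])
    if west >= east or south >= north:
        return None  # no overlap: skip this file entirely
    return [west, east, south, north]


map_region = [-183, -176.37, 50.155, 53.228]      # from the example code
grid_extent = [-176.380881942, -175.870955995,
               51.6975655499, 52.0521480774]      # from gmt grdinfo
clipped = clamp_region(map_region, grid_extent)

# With GDAL available, the clipping step itself would look like:
# from osgeo import gdal
# w, e, s, n = clipped
# gdal.Warp("23_clipped.tiff", "23.tiff", outputBounds=(w, s, e, n))
```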

Of course, now that it’s running without error, it reveals an inherent problem with my process: apparently portions of each image are zero (or perhaps NaN?), which is fine in and of itself. However, when layering multiple images using grdimage, those zero/NaN areas can “overlap” non-zero areas from previous images, resulting in an incorrect output image. See this result of adding individual files, vs the result of using Warp to first combine the files into a single file to be added.

Individual:
Output-individual.pdf (77.9 KB)
Combined:
Output-combined.pdf (62.7 KB)

Guess I’m off to investigate that issue now. Might be back with another question soon. Thanks again for the pointer!

Just had to set the transparency option to True in grdimage. Broke the monochrome option I was also using, but I can deal with that. So this is now resolved.

Some of the elevation datasets on that site have very high resolution, so you will get an error if you use a large boundary region. You can’t plot all of Alaska with 1 meter pixels.

Sure, but this error occurs when trying to plot just a single file, over a relatively small region of interest, whenever a portion of the file falls outside said region of interest. It doesn’t appear to have anything to do with the total area being plotted.

Consider that using the same file, with the same region of interest, works properly if I first use gdal.Warp() to trim the file to the desired region of interest.

GMT might be incorrectly estimating the region when you are asking it to reproject the data. At high latitudes, some projections don’t work well. It seems that gdal.Warp is doing a better job.

Does it also happen with plain GMT? I would bet it does not.

I’ll take that bet:

israel@tokuuke Downloads % gmt begin ak_test                                         
israel@tokuuke Downloads % gmt basemap -JM8i -R-183/-176.37/50.155/53.228 -BWeSn -Bafg
israel@tokuuke Downloads % gmt grdimage '23.tiff' -E100 -M                            
grdimage (gmt_gdalread): Could not reallocate memory [137438953472.00 Gb, 18446744073709451063 items of 8 bytes]
ERROR: Caught signal number 11 (Segmentation fault: 11) at
0   libgmt.6.2.0.dylib                  0x0000000102da4fb4 gmt_gdalread + 7115
1   ???                                 0x00007ff756d01700 0x0 + 140700290127616
Stack backtrace:
0   libgmt.6.2.0.dylib                  0x0000000102d6522f sig_handler + 555
1   libsystem_platform.dylib            0x00007fff20382d7d _sigtramp + 29
2   ???                                 0x00007fee016196c0 0x0 + 140660202116800
3   libgmt.6.2.0.dylib                  0x0000000102da9dc9 gmt_gdal_read_grd + 682
4   libgmt.6.2.0.dylib                  0x0000000102db9327 gmtlib_read_grd + 178
5   libgmt.6.2.0.dylib                  0x0000000102d92ed7 gmtapi_import_grid + 3991
6   libgmt.6.2.0.dylib                  0x0000000102d8ca3d gmtapi_import_data + 375
7   libgmt.6.2.0.dylib                  0x0000000102d761df GMT_Read_Data + 4431
8   libgmt.6.2.0.dylib                  0x0000000102f6447f GMT_grdimage + 10079
9   libgmt.6.2.0.dylib                  0x0000000102d8366f GMT_Call_Module + 1947
10  gmt                                 0x0000000102d4d5df main + 1454
11  libdyld.dylib                       0x00007fff20358f5d start + 1
israel@tokuuke Downloads % 

Yep, there it is. Same segfault with plain GMT.

Hmmmm…perhaps. Mercator does go to infinite width at high latitudes. However, using a different map projection, such as -JS-180/90/8i doesn’t seem to help things. It appears I get this issue whenever the image I am passing to grdimage has portions that lie outside the region of interest, which is where gdal.Warp() comes into the picture - using that command with the proper arguments will trim the image to the bounds before I pass it to grdimage, thereby preventing the issue.

Ah, old ones do not count :slight_smile:

Is that actually the same thing, though? My issue is with grdimage, not grdcut; but perhaps grdimage uses grdcut under the hood? Still, the crash is in a different place in the code (gmt_gdalread vs gmt_gdal_read_grd, although gmt_gdal_read_grd is referenced…), and that bug report doesn't mention an error about trying to allocate an insane amount of memory. I suppose it could be the same issue, just presenting somewhat differently… :slight_smile:

Would need to dive into the debugger to confirm, but I'm (100 - eps)% sure it's the same thing. The root of the problem is in reading sub-regions of an image.

Gotcha. I’ll go ahead and subscribe to that issue then, as it would be nice if I could ditch the gdal.Warp() step to pre-trim the images. Though I might still need it to convert the projection of the tiffs to lat/long so the images get positioned correctly.

Thanks for the pointer!

???
Mind elaborating? Projected images are actually easier to deal with, since they don't require any projection.

Sure, although it is arguably a different topic :slight_smile: The GeoTIFFs I’m working with are originally in the Alaska Albers projection. The map I am making is (currently) in a mercator projection, which I defined earlier in a call to basemap() (as shown in the example code, actually). If I try to use one of these GeoTIFFs directly, without first converting it to lat/long (which I have done already on the example TIFF I made available), I get a warning about there being no data within my map area, and indeed nothing draws. By first converting to lat/lon, things work as expected (as long as I avoid the crash, of course! :slight_smile: ). I tried specifying the source projection in the grdimage command, but that didn’t work either (though I don’t recall the exact issue with that one).

Maybe I was supposed to specify the destination projection in the grdimage command as well as the basemap? I can play around with this aspect more if it should be working, as in that case that issue is probably just me doing something wrong.

Not that I’ve run into :slight_smile: I’ll probably be changing the frame parameters at some point anyway, so those specific values are of no consequence to me personally. Just what I found in an example somewhere.

Sorry, forget it; this was something that showed up while I was checking whether the Julia wrapper could handle this case. Apparently it was a stale recollection of a previous error, because it vanished when I started afresh.


Yep, you cannot go from one projection to another using the GMT projection machinery. Though I think you can do it using proj4 syntax: +proj=data_projection +to +proj=destination_proj (sorry, don't remember the details).

OK, you won the bet. This is a new story. Your file, though it has a .tif extension, is a grid and not an image (the bug I mentioned affects images only).

The problem, however, is that you are asking for a sub-region that in fact overflows the grid limits … and shit happens because gmt_gdalread does not catch this case.

23.tiff: x_min: -176.380881942 x_max: -175.870955995 x_inc: 6.57121066617e-05 name: x n_columns: 7761
23.tiff: y_min: 51.6975655499 y_max: 52.0521480774 y_inc: 6.57121066617e-05 name: y n_rows: 5397

requested -R -183/-176.37/50.155/53.228
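To make the overflow concrete, the requested region can be checked against the numbers grdinfo reports:

```python
# Grid extent reported by `gmt grdinfo 23.tiff`.
grid = {"x_min": -176.380881942, "x_max": -175.870955995,
        "y_min": 51.6975655499, "y_max": 52.0521480774}
# Region passed via -R in the example.
requested = {"west": -183, "east": -176.37, "south": 50.155, "north": 53.228}

# True if any edge of the requested region lies outside the grid.
overflows = (requested["west"] < grid["x_min"]
             or requested["east"] > grid["x_max"]
             or requested["south"] < grid["y_min"]
             or requested["north"] > grid["y_max"])
```

Here the requested region extends well past the grid on the west, south, and north sides, which is exactly the case gmt_gdalread fails to catch.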