I have a global, 1 arcmin resolution grid (pixel-registered). I'm interested in computing the standard deviation within each 15 arcmin sub-grid, to get some sense of the variability of a parameter of interest within each sub-grid. The parameter of interest could be geoid, topography, temperature, whatever… Each sub-grid is a 15x15 chunk of the original grid, with chunks adjacent to one another (non-overlapping). The original grid is 21600 x 10800; I want to form a 1440 x 720 grid that represents the variability within each sub-grid.
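For concreteness, here's the quantity I'm after, sketched in numpy rather than GMT (the `block_std` helper and the array shapes are just illustrative, not part of any GMT workflow):

```python
import numpy as np

def block_std(grid, block=15):
    """Reduce a 2-D array to per-block standard deviations.

    Assumes both dimensions are exact multiples of `block`, which holds
    for pixel registration (blocks share no edge nodes). For the global
    1-arcmin case: (10800, 21600) in -> (720, 1440) out.
    """
    nrows, ncols = grid.shape
    assert nrows % block == 0 and ncols % block == 0
    # Reshape so each block x block chunk occupies its own pair of axes,
    # then take the standard deviation over those two axes at once.
    blocks = grid.reshape(nrows // block, block, ncols // block, block)
    return blocks.std(axis=(1, 3))

# Tiny demo on a small grid: 30 x 45 input -> 2 x 3 blocks of 15 x 15
demo = np.arange(30 * 45, dtype=float).reshape(30, 45)
out = block_std(demo)
print(out.shape)  # (2, 3)
```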

The grdfilter command doesn't quite seem able to provide this type of quantity. If I'm understanding it correctly, the boxcar option of grdfilter, -Fb15/15 -Dp, provides the mean of each sub-grid, but I don't believe there is a way to have it dump the variance (or standard deviation) instead.

One way to do this (kind of clunky, but doable) would be to write a set of loops that invokes grdcut on each sub-grid and then feeds each piece into grdinfo (with the -L2 -C options) to "manually" build a 15 arcmin grid of parameter standard deviations. Is that the best (or only) way?

Or is there another way to do this, using another obvious GMT command that I’m just not thinking of?

Thanks!