Just another strange (and perhaps rather advanced) question as I push the capabilities of GMT beyond its design!
Say we have a time series of geographic grids, each consisting of x, y, z (lon, lat, z). Equivalently, the data could live in a 3-D netCDF cube with time as the third dimension.
I’ve already built a simple set of GMT grdmath commands that compute the linear trend within each cell, but over the whole grid at once (an application of Cramer’s rule). It works great and is very fast, even when using time-varying grids that provide cell-by-cell, time-varying observational weights.
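For what it’s worth, the per-cell math behind that can be sketched in NumPy, as a stand-in for the grdmath accumulations (the array names and synthetic data here are made up for illustration, not actual grdmath syntax): each running sum is itself a grid, and Cramer’s rule for the 2x2 normal equations is applied element-wise.

```python
import numpy as np

# Hypothetical stack of grids z[k] observed at times t[k]; in grdmath these
# sums would be accumulated grid-by-grid (ADD, MUL, etc.) over the files.
rng = np.random.default_rng(0)
ny, nx, nt = 4, 5, 24
t = np.arange(nt, dtype=float)                  # observation times
true_slope = rng.normal(size=(ny, nx))          # per-cell slopes (synthetic)
z = true_slope[None, :, :] * t[:, None, None]   # z_k = slope * t_k, intercept 0

# Per-cell accumulators; each is the same shape as one grid:
St  = np.full((ny, nx), t.sum())                # sum of t
Stt = np.full((ny, nx), (t * t).sum())          # sum of t^2
Sz  = z.sum(axis=0)                             # sum of z
Stz = (t[:, None, None] * z).sum(axis=0)        # sum of t*z
N   = float(nt)

# Cramer's rule for [N St; St Stt][a; b] = [Sz; Stz], element-wise per cell:
det       = N * Stt - St * St
slope     = (N * Stz - St * Sz) / det
intercept = (Stt * Sz - St * Stz) / det
```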
My next wish is to estimate not only the linear trend but also a seasonal cycle, consisting of an amplitude and a phase. Not asking for much, am I?
One approach would be to use grdtrack to sample the grids through time, cell by cell; extract the needed data (time, z); fit a seasonal model externally on a per-cell basis (via a Fortran routine); and then reassemble grids of amplitude, phase, and linear trend. That is what I would characterize as the slow way.
Instead, I’m trying to conceive of a way that GMT’s grdmath command could do the same kind of thing across the whole grid in one shot, quickly and efficiently.
Any ideas? One issue I see is that seasonal cycles are functions of sines and/or cosines, which are non-linear in the amplitude and phase parameters. In that form the solution would have to be linearized and would probably require iteration, and handling independent iterations over thousands of cells is rather daunting. As I think more about this, it might not be simply doable.
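That said, if the period of the cycle is known in advance (e.g. annual), the model can be rewritten as z(t) = a + b*t + C*cos(w*t) + S*sin(w*t), which is linear in the unknowns (a, b, C, S); the amplitude and phase fall out afterwards as hypot(C, S) and atan2(S, C), with no iteration needed. A minimal NumPy sketch of that idea, using synthetic data and made-up names rather than grdmath syntax (the 4x4 normal-equations matrix is shared by every cell, so one solve handles the whole grid):

```python
import numpy as np

rng = np.random.default_rng(1)
ny, nx, nt = 3, 4, 36
t = np.arange(nt) / 12.0          # time in years, monthly samples
w = 2.0 * np.pi                   # known annual frequency (1 cycle/year)

# Synthetic per-cell trend, seasonal amplitude, and phase "grids":
b_true   = rng.normal(size=(ny, nx))
A_true   = rng.uniform(0.5, 2.0, size=(ny, nx))
phi_true = rng.uniform(-np.pi, np.pi, size=(ny, nx))
z = (b_true[None] * t[:, None, None]
     + A_true[None] * np.cos(w * t[:, None, None] - phi_true[None]))

# Design matrix (nt x 4), identical for every cell:
G = np.column_stack([np.ones(nt), t, np.cos(w * t), np.sin(w * t)])

# Normal equations G^T G m = G^T z, solved once for all cells at once:
GtG = G.T @ G                              # 4 x 4
Gtz = np.einsum('ki,kyx->iyx', G, z)       # 4 x ny x nx
m = np.linalg.solve(GtG, Gtz.reshape(4, -1)).reshape(4, ny, nx)

a, b, C, S = m                             # intercept, trend, cos/sin coeffs
amplitude = np.hypot(C, S)                 # A = sqrt(C^2 + S^2)
phase = np.arctan2(S, C)                   # phi, since A*cos(wt - phi)
                                           #   = A*cos(phi)*cos(wt) + A*sin(phi)*sin(wt)
```

The per-cell arithmetic here is all sums, products, and a fixed-size linear solve, which is the same flavor of element-wise grid algebra that the 2x2 Cramer's-rule trend fit uses, just with a 4x4 system.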
Paul pointed me to the script in test/gmtmath/lsfit.sh as a starting point. But I was hoping to chomp through a whole bunch of grids through time and obtain whole grids of cell-by-cell estimates of seasonal-cycle amplitude and phase.