Strange problem that likely has an easy solution, but it eludes me.

I need to create a full (n x m) grid from an (n x 1) column vector. In other words, the column vector is repeated across all m columns.

Simple example, given a column vector:

```
| 1 |
| 2 |
| 3 |
```

create a matrix of repeated columns:

```
| 1 1 1 1 |
| 2 2 2 2 |
| 3 3 3 3 |
```
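In shell terms, the expansion I'm after looks something like this (a rough sketch with awk; `m=4` and `col.txt` holding the column values are placeholders):

```bash
# Print each value of the (n x 1) column m times, one grid row per line:
# 1 1 1 1
# 2 2 2 2
# 3 3 3 3
m=4
awk -v m="$m" '{for (i = 1; i <= m; i++) printf "%s ", $1; print ""}' col.txt
```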

I also need to create a full (n x m) grid from a (1 x m) row vector. In other words, the row is repeated for all n rows.

Example, given a row:

```
| 1 2 3 |
```

create a matrix of repeated rows:

```
| 1 2 3 |
| 1 2 3 |
| 1 2 3 |
| 1 2 3 |
```
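The row case is even simpler to expand: just emit the row n times (again a sketch; `n=4` and `row.txt` are placeholders):

```bash
# Repeat the (1 x m) row n times, one grid row per line:
n=4
for i in $(seq "$n"); do cat row.txt; done
```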

Is it possible to fool grdmath into doing either of these in one command, given -R & -I parameters that correspond to the (n x m) grid?

In my case I'm not working with sequential integers, but with mixed values read in as a column or as a row.

One possible solution is to read in the column (or row), then repeat it (m or n times, depending on the case) into a temporary file (via cat >>), then use xyz2grd with the appropriate -Z specification to form the grid. Maybe that's the easiest way, since the final grids aren't that huge.
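For reference, a sketch of that temporary-file route (file names and the -R/-I values are placeholders matching the toy examples above; I think -ZTLa, i.e. ASCII values in scanline order starting at the top-left, is the right -Z specification, but that would need checking):

```bash
# Column case: expand each of the n values to m copies in scanline
# order, then assemble the n x m grid (here 3 rows x 4 columns).
awk -v m=4 '{for (i = 1; i <= m; i++) print $1}' col.txt > tmp.txt
gmt xyz2grd tmp.txt -R0/3/0/2 -I1 -ZTLa -Gcols.grd

# Row case: repeat the whole row n times (here 4 rows x 3 columns).
for i in $(seq 4); do cat row.txt; done > tmp.txt
gmt xyz2grd tmp.txt -R0/2/0/3 -I1 -ZTLa -Grows.grd
```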