How exactly should one approach grid image interpolation in PyGMT?

Suppose I have 1-degree-wide grid cells on the Cartesian region [(0,0), (0,2), (2,0), (2,2)] (just for simplicity), so that makes 4 grid cells, and I stored their values in a text file in [longitude, latitude, z-value] form like below:

0.5 0.5 2.43
0.5 1.5 8.61
1.5 0.5 -1.52
1.5 1.5 4.79

What is the correct method to interpolate these grid cells (let’s say with a bicubic spline)? I have previously used grdsample, but it only interpolates within the bounds of the latitudes and longitudes in the text file, leaving the outer portion vacant.

Do you want to extrapolate beyond your data?

Not extrapolation; what I meant was that only an inner subset region of the data is getting interpolated.
In the above example the appropriate result would be an interpolated image extending from 0 → 2 in both the latitude and longitude directions.
But the plot is restricted to a smaller extent of 0.5 → 1.5 in both directions, which is not what I want.

In your example, the data fall in the range 0.5 → 1.5. If you go below 0.5 (to 0) or above 1.5 (to 2), then you’re extrapolating, no?
Well, semantics aside, in both cases you make assumptions about how your data behave between two points… so I guess you could generate those extra “outside” points using the same assumptions?

Now, if you do already have the outside points, you can always do the interpolation on your small subset of interest (0 → 2), then add that layer on top of your broader area.