FFT of time series

Hi there! I am using the spectrum1d module for the first time and I cannot retrieve the frequency in the right format. I would like to compute the FFT of a time series with absolute time in the first column (60480 entries, evenly spaced every 30 s), as shown below:

2021-08-25T00:00:00 5.53984247442
2021-08-25T00:00:30 5.53670365209
2021-08-25T00:01:00 5.53347448891
2021-08-25T00:01:30 5.53012074013
2021-08-25T00:02:00 5.52680340069
2021-08-25T00:02:30 5.52368781039
2021-08-25T00:03:00 5.52078531109
2021-08-25T00:03:30 5.51790240359
2021-08-25T00:04:00 5.51496586614
2021-08-25T00:04:30 5.51203250058

I have tried setting the first column to absolute time with -fi0T, giving the 30 s interval with -D30 --TIME_UNIT=s, and I probably need to play with -S as well. Instead of frequency, the output's first column is a fraction of a second in absolute-time format:

gmt spectrum1d data.txt -fi0T -S512 -D30 --TIME_UNIT=s | head
1970-01-01T00:00:00.000000 8740.38596612 804.191910744
1970-01-01T00:00:00.000130 4190.3700378 385.550672532
1970-01-01T00:00:00.000195 9863.25967141 907.506106946
1970-01-01T00:00:00.000260 28392.4204681 2612.35086819
1970-01-01T00:00:00.000325 9863.26059556 907.506191976
1970-01-01T00:00:00.000390 4190.36999882 385.550668945
1970-01-01T00:00:00.000455 8740.38625725 804.19193753
1970-01-01T00:00:00.000520 434.131393957 39.9438830932
1970-01-01T00:00:00.000585 8740.38610155 804.191923205
1970-01-01T00:00:00.000651 4190.37058127 385.550722536

This message might be helpful:

spectrum1d [INFORMATION]: To prevent loss of time-series precision we have changed FORMAT_CLOCK_OUT to hh:mm:ss.xxxxxx

I would appreciate any hint on how to tweak spectrum1d to get the desired result. Thank you in advance.

Looks like a bug: it takes the frequencies and formats them as UNIX time (note the 1970 epoch in the first column). Maybe see if adding -foff changes things.
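For what it's worth, the misformatted values look consistent with frequencies in Hz: with 512-point segments (-S512) sampled every 30 s, the frequency step should be 1/(512 × 30) Hz, i.e. about 65 µs when misread as clock time, which matches the increments between the output rows. A quick numpy cross-check (segment length and interval taken from the command above):

```python
import numpy as np

# Cross-check (outside GMT): expected frequency step for -S512 with 30 s sampling.
N = 512              # segment length from -S512
dt = 30.0            # sampling interval in seconds
df = 1.0 / (N * dt)  # frequency resolution in Hz
print(df)            # ~6.51e-05 Hz

# The FFT frequencies for such a segment:
freqs = np.fft.rfftfreq(N, d=dt)
print(freqs[2], freqs[3])  # 0.000130..., 0.000195... -- matching the
                           # "fractions of a second" in the output above
```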

Thanks a lot, it worked! Now I am wondering: is there a simple FFT function in GMT, rather than the Welch's-method averaging incorporated in spectrum1d?
Thank you, Anuar.

I think you control that with -S, but if you do no averaging your uncertainty estimates will be large. Perhaps @WHFSmith has some wisdom here?
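To illustrate the tradeoff with a numpy sketch (not GMT itself; the sinusoid-plus-noise signal here is made up): a single raw periodogram keeps full frequency resolution but each spectral estimate has roughly 100% relative scatter, while averaging periodograms over short segments, which is conceptually what -S512 does, shrinks that scatter at the cost of resolution.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 30.0    # 30 s sampling, as in the thread
n = 60480    # number of samples, as in the thread
t = np.arange(n) * dt
# Hypothetical signal: a 1 mHz sinusoid buried in white noise.
x = np.sin(2 * np.pi * 1e-3 * t) + rng.standard_normal(n)

# Single raw FFT periodogram: full resolution, chi-squared with 2 dof
# per bin, so the relative scatter of each estimate is close to 1.
raw = np.abs(np.fft.rfft(x)) ** 2 * dt / n

# Welch-style: average periodograms of 512-sample segments,
# trading resolution for lower variance (118 segments here).
seg = 512
nseg = n // seg
segs = x[:nseg * seg].reshape(nseg, seg)
welch = np.mean(np.abs(np.fft.rfft(segs, axis=1)) ** 2, axis=0) * dt / seg

# Relative scatter of the noise floor, away from the signal bin:
print(np.std(raw[100:1000]) / np.mean(raw[100:1000]))    # close to 1
print(np.std(welch[100:200]) / np.mean(welch[100:200]))  # much less than 1
```

The second number is roughly 1/sqrt(number of segments) of the first, which is why spectrum1d's averaged estimates come with usable error bars while a single raw FFT does not.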
