A common way to do that is a Savitzky-Golay filter [0], which does a similar job: it smooths the data and can also provide smooth derivatives of it. It looks like this post's technique can do that too, so maybe it'd be useful for my ice strain-rate field project.
[0] - https://en.wikipedia.org/wiki/Savitzky%E2%80%93Golay_filter
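For concreteness, here's a rough sketch of the smoothing-plus-derivative use (assumes SciPy; the signal and the window/polynomial choices are made up for illustration, not taken from the post):

    import numpy as np
    from scipy.signal import savgol_filter

    x = np.linspace(0, 2 * np.pi, 200)          # regularly sampled abscissa
    y = np.sin(x) + 0.1 * np.random.randn(200)  # noisy signal

    # Smoothed signal: local least-squares cubic fit over a 21-sample window.
    y_smooth = savgol_filter(y, window_length=21, polyorder=3)

    # First derivative of the same local fit; delta is the sample spacing.
    dy_dx = savgol_filter(y, window_length=21, polyorder=3, deriv=1,
                          delta=x[1] - x[0])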
The usual implementation is a convolution filter, built on the premise that the underlying data is regularly sampled.
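Roughly what that looks like in code (SciPy assumed, parameters arbitrary): the weights come out as a fixed FIR kernel, so the filter step is a plain convolution, which is exactly where the regular-sampling premise gets baked in.

    import numpy as np
    from scipy.signal import savgol_coeffs

    y = np.sin(np.linspace(0, 2 * np.pi, 200)) + 0.1 * np.random.randn(200)

    # The Savitzky-Golay weights are a fixed kernel once the window length and
    # polynomial order are chosen ...
    kernel = savgol_coeffs(window_length=21, polyorder=3)

    # ... so smoothing the interior is an ordinary convolution; the edges and
    # any irregular spacing need separate treatment.
    y_smooth = np.convolve(y, kernel, mode='same')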
The pain-in-the-arse occasional reality is missing data and|or present-but-glitched|spiked data .. both of which require a "sensible infill" before you can continue with a convolution.
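By "sensible infill" I mean something as simple as the sketch below (illustrative only; genuinely glitched or spiked samples would need an outlier pass before the interpolation):

    import numpy as np

    y = np.sin(np.linspace(0, 2 * np.pi, 200))
    y[50:55] = np.nan  # simulated dropout

    # Linearly interpolate across the gaps so the series is dense again
    # before handing it to a convolution filter.
    idx = np.arange(y.size)
    good = ~np.isnan(y)
    y_filled = np.interp(idx, idx[good], y[good])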
This is a nice implementation and a potentially useful bit of kit. The elephant in the room (from my PoV) is: how come the application domain is irregularly sampled data?
Generally (engineering, geophysics, etc.) great lengths are taken to clock data samples like a metronome, in time and|or space as required.
I'm assuming that your gridded GeoTIFF data field is regularly sampled along both the X and Y axes?