
146 points hugohadfield | 1 comment

This little project came about because I kept running into the same problem: cleanly differentiating sensor data before doing analysis. There are a ton of ways to solve this, but I've always personally been a fan of using Kalman filters for the job, as it's easy to get the double whammy of resampling/upsampling to a fixed, consistent rate along with smoothing/outlier rejection. I wrote a little numpy-only Bayesian filtering/smoothing library recently (https://github.com/hugohadfield/bayesfilter/), so this felt like a fun and very useful first thing to try it out on! If people find kalmangrad useful I would be more than happy to add a few more features etc., and I would be very grateful if people sent in any bugs they spot. Thanks!
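
For anyone wondering what the Kalman approach to differentiation looks like in practice, here's a minimal numpy-only sketch of the general idea: a constant-velocity state model filtered forward over irregular timestamps, then smoothed backward with an RTS pass, with the derivative read out of the state. This is just a toy illustration under made-up noise parameters (`meas_var`, `accel_var`), not kalmangrad's or bayesfilter's actual API.

    import numpy as np

    def kalman_derivative(t, y, meas_var=1e-2, accel_var=1.0):
        """Smoothed [value, derivative] estimates at each (possibly irregular) timestamp."""
        n = len(t)
        H = np.array([[1.0, 0.0]])            # we only observe the value itself
        x = np.array([y[0], 0.0])             # state: [value, derivative]
        P = np.eye(2)
        xs_f, Ps_f, Fs = [], [], []

        # Forward Kalman filter: per-step dt handles irregular sampling
        for k in range(n):
            if k > 0:
                dt = t[k] - t[k - 1]
                F = np.array([[1.0, dt], [0.0, 1.0]])
                Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                                          [dt**3 / 2, dt**2]])
                x = F @ x
                P = F @ P @ F.T + Q
            else:
                F = np.eye(2)
            S = H @ P @ H.T + meas_var
            K = P @ H.T / S
            x = x + (K * (y[k] - H @ x)).ravel()
            P = (np.eye(2) - K @ H) @ P
            xs_f.append(x.copy()); Ps_f.append(P.copy()); Fs.append(F)

        # Backward Rauch-Tung-Striebel pass smooths the forward estimates
        xs, Ps = xs_f[:], Ps_f[:]
        for k in range(n - 2, -1, -1):
            F = Fs[k + 1]
            dt = t[k + 1] - t[k]
            Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                                      [dt**3 / 2, dt**2]])
            P_pred = F @ Ps_f[k] @ F.T + Q
            G = Ps_f[k] @ F.T @ np.linalg.inv(P_pred)
            xs[k] = xs_f[k] + G @ (xs[k + 1] - F @ xs_f[k])
            Ps[k] = Ps_f[k] + G @ (Ps[k + 1] - P_pred) @ G.T
        return np.array(xs)   # column 0: smoothed value, column 1: derivative

    # Usage: noisy sin(t) at jittery sample times; column 1 should track cos(t)
    t = np.sort(np.random.uniform(0, 10, 200))
    y = np.sin(t) + 0.05 * np.random.randn(len(t))
    est = kalman_derivative(t, y)

The nice part of doing it this way is that the state model gives you the resampling/smoothing for free: you can re-run the predict step at whatever timestamps you like once the filter is tuned.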
1. magicalhippo (No.41869987)
Looks really cool.

I stumbled across this[1] page recently, which has a method that's apparently better than the "traditional" Savitzky-Golay filters.

The idea seems to be to start with the desired frequency response, with lower frequencies close to the ideal differentiator, and higher frequencies tending smoothly to zero. This is then used to derive the filter coefficients through a set of equations.

The author generalizes it to irregularly sampled data near the end, so it would be interesting to compare the approaches.
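
If anyone wants to play with it, here's a rough sketch of applying the shortest central filter from that page on regularly sampled data. Treat the coefficients as my reading of the N=5 case, (2(f1 - f-1) + (f2 - f-2)) / (8h), rather than gospel, and see the page itself for the derivation, the longer filters, and the irregular-sampling variant.

    import numpy as np

    def smooth_diff_n5(y, h):
        """First derivative of regularly sampled y (spacing h) via the
        5-point smooth noise-robust differentiator."""
        kernel = np.array([1.0, 2.0, 0.0, -2.0, -1.0]) / (8.0 * h)
        # 'valid' convolution: output has len(y) - 4 points, aligned with y[2:-2]
        return np.convolve(y, kernel, mode="valid")

    # Usage: derivative of a noisy sine should stay close to cos(x)
    x = np.linspace(0, 10, 500)
    h = x[1] - x[0]
    y = np.sin(x) + 0.01 * np.random.randn(x.size)
    dy = smooth_diff_n5(y, h)   # compare against np.cos(x[2:-2])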

Just thought I'd throw it out there.

[1]: http://www.holoborodko.com/pavel/numerical-methods/numerical...