
146 points | hugohadfield | 1 comment

This little project came about because I kept running into the same problem: cleanly differentiating sensor data before doing analysis. There are a ton of ways to solve this problem; I've personally always been a fan of using Kalman filters for the job, as it's easy to get the double whammy of resampling/upsampling to a fixed, consistent rate along with smoothing/outlier rejection. I wrote a little numpy-only Bayesian filtering/smoothing library recently (https://github.com/hugohadfield/bayesfilter/), so this felt like a fun and very useful first thing to try it out on! If people find kalmangrad useful I would be more than happy to add a few more features, and I would be very grateful if people sent in any bugs they spot. Thanks!
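For anyone curious what the Kalman-filter approach to differentiation looks like in practice, here's a minimal numpy-only sketch (this is not the kalmangrad API; the function name and the `q`/`r` tuning parameters are made up for illustration). The state is [value, derivative] with a constant-derivative process model, which handles uneven sample spacing naturally since `dt` enters the transition matrix at each step:

```python
import numpy as np

def kalman_derivative(t, z, q=1.0, r=1.0):
    """Estimate a signal and its first derivative from noisy samples.

    State is [value, derivative]; constant-derivative process model with
    process noise scale q and measurement noise variance r (hypothetical
    tuning knobs for this sketch, not kalmangrad's interface).
    """
    x = np.array([z[0], 0.0])      # initial state estimate
    P = np.eye(2)                  # initial state covariance
    H = np.array([[1.0, 0.0]])     # we observe the value only
    out = np.zeros((len(z), 2))
    out[0] = x
    for k in range(1, len(z)):
        dt = t[k] - t[k - 1]       # uneven spacing is fine
        F = np.array([[1.0, dt], [0.0, 1.0]])
        # discrete white-noise-acceleration process covariance
        Q = q * np.array([[dt**4 / 4, dt**3 / 2],
                          [dt**3 / 2, dt**2]])
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new measurement
        y = z[k] - H @ x           # innovation
        S = H @ P @ H.T + r        # innovation covariance
        K = P @ H.T / S            # Kalman gain
        x = x + (K * y).ravel()
        P = (np.eye(2) - K @ H) @ P
        out[k] = x
    return out[:, 0], out[:, 1]    # smoothed values, derivative estimates
```

Running a backward smoothing pass (RTS smoother) on top of this filter is what buys the extra accuracy the post alludes to; the forward pass alone already gives a usable derivative.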
seanhunter No.41868638
This looks cool. When you say "there are tons of ways to solve this problem", presumably the canonical way is some sort of Fourier Analysis?
replies(1): >>41868846 #
1. hugohadfield No.41868846
I guess I'm not totally sure what the canonical way would be; probably convolution with the N'th derivative of a Gaussian smoothing kernel, where the smoothing response is chosen by frequency analysis, or something along those lines. You could also just smooth the signal and then differentiate it numerically (probably equivalent but less efficient). I would personally go for this Bayesian filtering approach, or some kind of local polynomial approximation like splines or the Savitzky-Golay filter people are talking about in this comment section, because it would probably be easier to deal with missing data etc.
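To make the two alternatives mentioned above concrete, here's a small sketch using scipy (`gaussian_filter1d` and `savgol_filter` are scipy's public API; the test signal, noise level, and window/sigma choices are just for illustration). Both assume evenly spaced samples, which is exactly the limitation that motivates the filtering approach:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import savgol_filter

dt = 0.01
t = np.arange(0.0, 2.0, dt)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * t) + 0.05 * rng.normal(size=t.shape)

# Derivative via convolution with the first derivative of a Gaussian
# kernel; gaussian_filter1d works in sample units, so divide by dt.
dx_gauss = gaussian_filter1d(x, sigma=5, order=1) / dt

# Derivative via a local polynomial (Savitzky-Golay) fit; delta=dt
# scales the result to the actual sample spacing.
dx_sg = savgol_filter(x, window_length=21, polyorder=3, deriv=1, delta=dt)
```

Note the trade-off both share: wider windows (larger `sigma` or `window_length`) suppress more noise but attenuate the true derivative near fast features, which is the frequency-response choice alluded to above.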