
146 points by hugohadfield | 1 comment

This little project came about because I kept running into the same problem: cleanly differentiating sensor data before doing analysis. There are a ton of ways to solve this problem; I've always personally been a fan of using Kalman filters for the job, as it's easy to get the double whammy of resampling/upsampling to a fixed, consistent rate and also smoothing/outlier rejection. I wrote a little NumPy-only Bayesian filtering/smoothing library recently (https://github.com/hugohadfield/bayesfilter/), so this felt like a fun and very useful first thing to try it out on! If people find kalmangrad useful I would be more than happy to add a few more features, and I would be very grateful if people sent in any bugs they spot. Thanks!
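To give a flavour of the approach, here is a minimal NumPy-only sketch of the general technique (a generic constant-acceleration Kalman filter plus RTS smoother, not kalmangrad's actual API; the noise parameters are purely illustrative):

    import numpy as np

    def kalman_derivatives(y, dt, process_noise=1e-2, meas_noise=1e-1):
        """Estimate a signal and its first two derivatives with a
        constant-acceleration Kalman filter plus RTS smoother (sketch)."""
        # State is [position, velocity, acceleration]; we only observe position.
        F = np.array([[1.0, dt, 0.5 * dt**2],
                      [0.0, 1.0, dt],
                      [0.0, 0.0, 1.0]])
        H = np.array([[1.0, 0.0, 0.0]])
        Q = process_noise * np.diag([dt**3, dt**2, dt])   # crude process noise
        R = np.array([[meas_noise]])

        n = len(y)
        x, P = np.zeros(3), np.eye(3)
        xf, Pf, xp, Pp = [], [], [], []

        # Forward Kalman filter pass.
        for k in range(n):
            x_pred, P_pred = F @ x, F @ P @ F.T + Q
            xp.append(x_pred)
            Pp.append(P_pred)
            S = H @ P_pred @ H.T + R
            K = P_pred @ H.T @ np.linalg.inv(S)
            x = x_pred + K @ (y[k] - H @ x_pred)
            P = (np.eye(3) - K @ H) @ P_pred
            xf.append(x)
            Pf.append(P)

        # Backward Rauch-Tung-Striebel smoothing pass.
        xs, Ps = [xf[-1]], [Pf[-1]]
        for k in range(n - 2, -1, -1):
            C = Pf[k] @ F.T @ np.linalg.inv(Pp[k + 1])
            xs.insert(0, xf[k] + C @ (xs[0] - xp[k + 1]))
            Ps.insert(0, Pf[k] + C @ (Ps[0] - Pp[k + 1]) @ C.T)

        xs = np.array(xs)
        return xs[:, 0], xs[:, 1], xs[:, 2]   # smoothed signal, 1st and 2nd derivative

Resampling to a fixed rate falls out of the same machinery: you predict the state forward to whatever timestamps you like and only apply the update step where a measurement actually exists.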
fisian No.41866638
Great work!

I would've needed this recently for some data analysis, to estimate the mass of an object from position measurements. I tried calculating the 2nd derivative with a Savitzky-Golay filter, but still had some problems and ended up using a different approach (also a Kalman filter, but with a physics-based model of my setup).
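For reference, the Savitzky-Golay route is roughly a one-liner with SciPy (window length and polynomial order below are placeholder choices, not tuned values):

    import numpy as np
    from scipy.signal import savgol_filter

    t = np.linspace(0.0, 10.0, 501)                    # uniform time base
    dt = t[1] - t[0]
    pos = np.sin(t) + 0.01 * np.random.randn(t.size)   # noisy position signal

    # Second derivative (acceleration) from a local polynomial fit.
    accel = savgol_filter(pos, window_length=51, polyorder=3, deriv=2, delta=dt)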

My main problem was that I had repeated values in my measurements (the sensor's sampling rate was lower than, and not an integer divisor of, the acquisition pipeline's rate). This made it especially clear that np.gradient wasn't suitable, because it produced erratic switches between zero and the calculated derivative. Applying np.gradient twice made the data look like random noise.
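A toy illustration of that failure mode (made-up numbers, just to show the effect):

    import numpy as np

    t = np.arange(0.0, 1.0, 0.01)        # acquisition pipeline at 100 Hz
    true_pos = 0.5 * 9.81 * t**2         # e.g. free fall
    # Sensor only refreshes every ~3 samples, so readings get repeated.
    held = true_pos[(np.arange(t.size) // 3) * 3]

    vel = np.gradient(held, t)           # flips between zero and inflated slopes
    acc = np.gradient(vel, t)            # second pass looks like random noise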

I will try using this library when I next get the chance.

replies(2): >>41868323 >>41868449
1. radarsat1 No.41868323
Did you try prefiltering to remove the repeated values?
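Something along these lines (a rough sketch, dropping the held samples so differences span real updates on the resulting irregular time base):

    import numpy as np

    def drop_repeats(t, y, eps=0.0):
        """Keep only the first sample of each run of (near-)identical readings."""
        keep = np.concatenate(([True], np.abs(np.diff(y)) > eps))
        return t[keep], y[keep]

    # t_clean, y_clean = drop_repeats(t, held)
    # vel = np.gradient(y_clean, t_clean)   # differences now span real updates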