
230 points roryclear | 6 comments

This runs YOLOv8 + ByteTrack with tinygrad. Detections (depending on user config) are saved and can be sent to the companion iOS app along with a notification. All video processing is done locally, all footage is encrypted before leaving your computer, and the notifications + videos part is optional. Because it uses tinygrad, it runs well on my Apple Silicon Macs and should be able to run on a lot of hardware (or will be able to once I remove the other deps).
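In rough terms, the per-frame flow is something like the sketch below (illustrative only; every function here is a stub standing in for the real detector, tracker, and notification code, not the project's actual implementation):

    import numpy as np

    # All names below are illustrative stubs, not the project's actual code.
    def yolov8_detect(frame):        # stands in for the YOLOv8 forward pass on tinygrad
        return []                    # list of (box, score, class) tuples
    def bytetrack_update(dets):      # stands in for ByteTrack's per-frame association
        return dets                  # tracked objects with persistent IDs
    def matches_user_config(tracks): # e.g. only "person" detections in a chosen zone
        return bool(tracks)
    def save_encrypt_and_notify(frame, tracks):
        pass                         # save clip locally, encrypt, then (optionally) push to the iOS app

    def read_frames():
        # placeholder frame source; real frames are decoded locally (e.g. via ffmpeg)
        for _ in range(3):
            yield np.zeros((720, 1280, 3), dtype=np.uint8)

    for frame in read_frames():
        dets = yolov8_detect(frame)
        tracks = bytetrack_update(dets)
        if matches_user_config(tracks):
            save_encrypt_and_notify(frame, tracks)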
snickerdoodle12 No.45004203
How does this compare with Frigate?
replies(3): >>45004242 #>>45004269 #>>45005215 #
roryclear No.45004242
Fewer features, easier setup, and more GPUs supported. (I haven't used Frigate myself, though, only watched videos.)
replies(2): >>45004287 #>>45009819 #
1. diggan No.45004287
Where can I find the list of supported GPUs? Frigate has been able to handle everything I've tried so far, from Nvidia and AMD GPUs to even Intel iGPUs.
replies(2): >>45004334 #>>45004471 #
2. d0ugal No.45004334
I have used Frigate for years; I think early on it didn't support all of those GPUs, so it might be that said videos are out of date.
replies(1): >>45004615 #
3. serf No.45004471
Same here -- it's also one of the only things that supports Coral devices and RPi video cores.

I would imagine any pre-CUDA, GPGPU-compute-capable device probably won't cut it.

4. roryclear No.45004615
Maybe my view of Frigate and TensorFlow (assuming Frigate still uses it) is outdated then. When I say GPU support, I'm referring to tinygrad vs TensorFlow; of course, Google's TensorFlow is best for Google's TPUs. I've had better luck using tinygrad on my personal devices, but I'm biased, and it's been a while since I've used TensorFlow.
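To be concrete about what "GPU support" means here: tinygrad picks a backend per machine, and a quick sanity check looks roughly like this (assuming a recent tinygrad install; the exact backend names and env-var overrides are worth double-checking against the tinygrad docs):

    from tinygrad import Tensor, Device

    # Which backend tinygrad chose on this machine, e.g. "METAL" on Apple Silicon,
    # "CUDA"/"AMD" on those GPUs, or "CPU" as a fallback.
    print(Device.DEFAULT)

    # Run a tiny op to confirm the backend actually compiles and executes.
    x = Tensor.rand(4, 4)
    print((x @ x).numpy())

    # Backends can also be forced with env vars, e.g. `CPU=1 python app.py`.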
replies(1): >>45007150 #
5. threecheese No.45007150
This would be a good point of differentiation to make on your GitHub page, or for a technical audience on your website. Frigate is SOTA in many folks' minds, and showing that you use tinygrad over TensorFlow may be a good "modern-ness" signal for that audience.

Edit: another solution in this space lists its supported ML runtimes, which would be good info for folks wanting to run on specific hardware: https://github.com/boquila/boquilahub

replies(1): >>45008521 #
6. roryclear No.45008521
A supported-runtimes list would be nice, but I don't have access to much hardware to test on. I aim to remove most dependencies and support anything that can run tinygrad + ffmpeg.
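For reference, the ffmpeg half of "tinygrad + ffmpeg" can be as simple as piping raw frames out of ffmpeg and handing them to the model, along these lines (a minimal sketch; the URL and resolution are made up, and real code would probe them from the stream):

    import subprocess
    import numpy as np

    W, H = 1280, 720                         # assumed stream resolution (illustrative)
    cmd = [
        "ffmpeg", "-hide_banner", "-loglevel", "error",
        "-i", "rtsp://camera.local/stream",  # any input ffmpeg can decode
        "-f", "rawvideo", "-pix_fmt", "rgb24", "-",
    ]
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    frame_size = W * H * 3
    while True:
        buf = proc.stdout.read(frame_size)
        if len(buf) < frame_size:
            break
        frame = np.frombuffer(buf, dtype=np.uint8).reshape(H, W, 3)
        # ...hand `frame` to the tinygrad YOLOv8 forward pass here...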