
684 points | prettyblocks | 1 comment

I mean anything in the 0.5B-3B range that's available on Ollama (for example). Have you built any cool tooling that uses these models as part of your workflow?
kaspermarstal (No.42790190)
I built an Excel Add-In that allows my girlfriend to quickly filter 7000 paper titles and abstracts for a review paper that she is writing [1]. It uses Gemma 2 2B, which is a wonderful little model that can run on her laptop's CPU. It works surprisingly well for this kind of binary classification task.

The nice thing is that she can copy/paste the titles and abstracts into two columns and write e.g. "=PROMPT(A1:B1, "If the paper studies diabetic neuropathy and stroke, return 'Include', otherwise return 'Exclude'")" and then drag down the formula across 7000 rows to bulk process the data on her own, because it's just Excel. There is a GIF in the README on the GitHub repo that shows it.
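
Roughly, the sheet looks something like the sketch below (the exact cells are arbitrary; the COUNTIF tally is just ordinary Excel layered on top of the results):

    Column A: paper title, Column B: abstract
    C2:  =PROMPT(A2:B2, "If the paper studies diabetic neuropathy and stroke, return 'Include', otherwise return 'Exclude'")
         (drag C2 down across all 7000 rows)
    E1:  =COUNTIF(C:C, "Include")    tally of papers that made the cut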

[1] https://github.com/getcellm/cellm

basmok (No.42796501)
Can someone hack this together as pure matrix multiplication?

Like, either as a table in the background or as a regular script?

On most computers you can't compile anything or install add-ins without administrative rights, and LLM chat sites are blocked to prevent company data from being sent out.

It should run in native Excel or Google Sheets.

I mean pure, without compilation, just like they do the matrix calculations straight in Excel without admin rights here:

Lesson 1: Demystifying how LLMs work, from architecture to Excel

https://youtu.be/FyeN5tXMnJ8

As far as I know, in Google Sheets the scripts also run on Google's servers and are not limited by the local computer's power, so larger models could be deployed there.

Can someone hack this into Excel/Google Sheets?
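
To make the "pure matrix multiplication" idea concrete, here is a toy single layer using only built-in functions (MMULT, EXP and SUMPRODUCT exist in both Excel and Google Sheets; the cell ranges are made up):

    A 1x4 "embedding" vector in A1:D1, a 4x4 weight matrix in F1:I4
    K1:  =MMULT(A1:D1, F1:I4)                   spills a 1x4 row of logits into K1:N1
                                                (older Excel needs Ctrl+Shift+Enter)
    K3:  =EXP(K1)/SUMPRODUCT(EXP($K$1:$N$1))    softmax over the logits (drag across to N3)

A real model is many such layers plus attention and a tokenizer, which is presumably what the spreadsheet in the video builds out, so it will be slow and tiny, but nothing in it needs admin rights.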