
Embeddings are underrated (2024)

(technicalwriting.dev)
484 points | jxmorris12
minimaxir No.43964828
> I don’t know. After the model has been created (trained), I’m pretty sure that generating embeddings is much less computationally intensive than generating text.

An embedding is generated after a single pass through the model, so functionally it's the equivalent of generating a single token from a text generation model.
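
A rough sketch of what that single pass looks like with the Hugging Face transformers API (the MiniLM checkpoint and plain mean pooling here are just illustrative choices, not anything from the article):

    import torch
    from transformers import AutoModel, AutoTokenizer

    name = "sentence-transformers/all-MiniLM-L6-v2"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)

    inputs = tokenizer("Embeddings are underrated.", return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs)                          # one forward pass
        embedding = out.last_hidden_state.mean(dim=1)  # pool token states into one vector

    print(embedding.shape)  # torch.Size([1, 384]) for this model

Generating N tokens with a decoder would instead take roughly N forward passes, which is where the cost difference comes from.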

replies(2): >>43965007 #>>43969096 #
energy123 No.43965007
I might be wrong, but aren't embedding models usually bidirectional rather than causal, so the attention mechanism itself is more expensive?
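
Concretely, the bidirectional vs. causal distinction comes down to the attention mask each layer applies; a toy sketch, something like:

    import torch

    n = 5  # sequence length

    # causal (decoder-style): token i attends only to tokens <= i
    causal = torch.tril(torch.ones(n, n, dtype=torch.bool))

    # bidirectional (encoder-style): every token attends to every token
    bidirectional = torch.ones(n, n, dtype=torch.bool)

    print(causal.int())
    print(bidirectional.int())

With the full mask, every token sees every other token in a single pass, whereas a causal model only looks backwards.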
replies(2): >>43965051 #>>43965189 #
breadislove No.43965051
yes exactly