slacker news
You could have designed state of the art positional encoding
(fleetwood.dev)
213 points | Philpax | 1 comment | 17 Nov 24 20:31 UTC
1. Scene_Cast2 | 18 Nov 24 12:39 UTC | No. 42171847
>>42166948 (OP)
If you're interested in positional embeddings for Transformers, check out this repo - https://github.com/gazelle93/Attention-Various-Positional-En... - it implements various popular ones.
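For context on what "popular positional encodings" means here: the baseline most of them build on is the fixed sinusoidal encoding from "Attention Is All You Need". This is a minimal NumPy sketch of that scheme (illustrative only, not code from the linked repo):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Fixed sinusoidal positional encoding (Vaswani et al., 2017).

    Returns a (seq_len, d_model) array: even dimensions get sin, odd
    dimensions get cos, at geometrically spaced wavelengths from 2*pi
    up to 10000 * 2*pi.
    """
    positions = np.arange(seq_len)[:, None]             # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]            # (1, d_model // 2)
    angles = positions / (10000 ** (dims / d_model))    # (seq_len, d_model // 2)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

pe = sinusoidal_positional_encoding(128, 64)
print(pe.shape)  # (128, 64)
```

The encoding is simply added to the token embeddings before the first attention layer; later schemes the repo covers (learned absolute, relative, rotary) mostly change where and how position enters the attention computation rather than this additive recipe.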