
171 points by pizza | 1 comment
benob No.43602745
A variant I have been thinking of: each parameter matrix (or block) is the sum of a random matrix (regenerated deterministically from a seed) and a low-rank matrix (a LoRA). I'd like to experiment with training from scratch in that setting.
replies(1): >>43602864
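The idea above can be sketched in a few lines of NumPy. This is a minimal illustration under my own assumptions (names, shapes, and the 1/sqrt(d_in) scaling are mine, not from the comment): the dense base weight is regenerated from a seed and never stored, while only the low-rank factors A and B would be trained.

```python
import numpy as np

def frozen_plus_lora(seed, d_out, d_in, rank):
    """Materialize W = W_random(seed) + B @ A.

    The random base is regenerated deterministically from `seed`, so only
    the low-rank factors A and B need to be stored and trained.
    All names and scalings here are illustrative assumptions.
    """
    base_rng = np.random.default_rng(seed)
    W_random = base_rng.standard_normal((d_out, d_in)) / np.sqrt(d_in)

    # Trainable low-rank factors (LoRA-style init): B starts at zero,
    # so the initial effective weight equals the random base exactly.
    lora_rng = np.random.default_rng(seed + 1)
    A = lora_rng.standard_normal((rank, d_in)) / np.sqrt(d_in)
    B = np.zeros((d_out, rank))

    return W_random + B @ A, A, B

W, A, B = frozen_plus_lora(seed=42, d_out=8, d_in=16, rank=2)
```

Storage drops from d_out*d_in floats per matrix to rank*(d_out+d_in) plus one seed, which is the usual LoRA accounting, except here the base is random rather than pretrained.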
1. sadiq No.43602864
There's a related write-up here you might find interesting: https://wandb.ai/learning-at-home/LM_OWT/reports/Parameter-s...

It covers some experiments on weight tying, one of which is exactly this combination of LoRA and random weights.