
114 points roboboffin | 6 comments
1. sigmar No.42197504
>AlphaQubit, a recurrent-transformer-based neural-network architecture that learns to predict errors in the logical observable based on the syndrome inputs (Methods and Fig. 2a). This network, after two-stage training—pretraining with simulated samples and finetuning with a limited quantity of experimental samples (Fig. 2b)—decodes the Sycamore surface code experiments more accurately than any previous decoder (machine learning or otherwise)

>One error-correction round in the surface code. The X and Z stabilizer information updates the decoder’s internal state, encoded by a vector for each stabilizer. The internal state is then modified by multiple layers of a syndrome transformer neural network containing attention and convolutions.

I can't seem to find a detailed description of the architecture beyond this bit in the paper and the figure it references. Gone are the days when Google handed out ML methodologies like candy... (note: not criticizing them for being protective of their IP, just pointing out how much things have changed since 2017)
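From those two fragments it's really only possible to guess at the details. Purely as a speculative sketch (every name, shape, and hyperparameter below is invented; the quotes only confirm per-stabilizer state vectors, a recurrent per-round update, and layers mixing attention with convolutions), it might look something like:

    # Speculative sketch of an AlphaQubit-style decoder, reconstructed only
    # from the quoted figure caption. Nothing here is from the paper itself.
    import torch
    import torch.nn as nn

    class SyndromeTransformerLayer(nn.Module):
        """Self-attention across the per-stabilizer state vectors, plus a 2D
        convolution mixing spatially neighbouring stabilizers."""
        def __init__(self, d_model, n_heads, grid):
            super().__init__()
            self.grid = grid
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.conv = nn.Conv2d(d_model, d_model, kernel_size=3, padding=1)
            self.norm1 = nn.LayerNorm(d_model)
            self.norm2 = nn.LayerNorm(d_model)

        def forward(self, x):                        # x: (B, grid*grid, d_model)
            a, _ = self.attn(x, x, x)
            x = self.norm1(x + a)
            B, N, D = x.shape
            img = x.transpose(1, 2).reshape(B, D, self.grid, self.grid)
            c = self.conv(img).reshape(B, D, N).transpose(1, 2)
            return self.norm2(x + c)

    class ToyDecoder(nn.Module):
        def __init__(self, grid=8, d_model=64, n_heads=4, n_layers=3):
            super().__init__()
            # pretend the stabilizers form a square grid, one state vector each
            self.state0 = nn.Parameter(torch.zeros(grid * grid, d_model))
            self.embed = nn.Embedding(2, d_model)    # syndrome bit (0/1) -> vector
            self.layers = nn.ModuleList(
                [SyndromeTransformerLayer(d_model, n_heads, grid)
                 for _ in range(n_layers)])
            self.readout = nn.Linear(d_model, 1)

        def forward(self, syndromes):                # (B, rounds, grid*grid) of 0/1
            state = self.state0.expand(syndromes.shape[0], -1, -1)
            for r in range(syndromes.shape[1]):      # recurrent: one round at a time
                state = state + self.embed(syndromes[:, r])
                for layer in self.layers:
                    state = layer(state)
            # single logit: did the logical observable flip?
            return self.readout(state.mean(dim=1)).squeeze(-1)

    model = ToyDecoder()
    fake = torch.randint(0, 2, (2, 5, 64))           # 2 shots, 5 rounds, 8x8 grid
    print(model(fake).shape)                         # torch.Size([2])

The two-stage training they mention would then just be the same loop run twice: first on cheap simulated syndrome data, then finetuned on the small experimental set.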

replies(1): >>42198250 #
2. jncfhnb No.42198250
Eh. It was always sort of muddy. We never actually had an implementation of doc2vec as described in the paper.
replies(2): >>42198387 #>>42199098 #
3. myownpetard No.42198387
That's because attention is all we need.
replies(1): >>42199642 #
4. dekhn No.42199098
Wait. Are you saying you were a paper author who described a method in their paper that wasn't actually implemented? I.e., your methods section contained a false description?
replies(1): >>42199345 #
5. jncfhnb No.42199345{3}
No, I'm saying the original doc2vec paper described an approach that the ML community never seemed to actually implement. There were things called doc2vec, but they were not what the paper described. Folks mostly seemed to just not notice.
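For concreteness: the PV-DM model in the original paper concatenates a learned paragraph vector with the context word vectors to predict the next word, whereas popular implementations (gensim with its default dm_concat=0, for instance) sum or average them instead. A toy version of the paper's concatenating variant, with all sizes invented:

    # Toy PV-DM in the concatenating form the original paper describes
    # (Le & Mikolov 2014); sizes and training details are made up.
    import torch
    import torch.nn as nn

    class PVDMConcat(nn.Module):
        def __init__(self, n_docs, vocab, dim=50, context=4):
            super().__init__()
            self.doc_vecs = nn.Embedding(n_docs, dim)   # one vector per paragraph
            self.word_vecs = nn.Embedding(vocab, dim)
            self.out = nn.Linear(dim * (context + 1), vocab)  # concat, not average

        def forward(self, doc_id, context_ids):         # context_ids: (B, context)
            d = self.doc_vecs(doc_id)                   # (B, dim)
            w = self.word_vecs(context_ids).flatten(1)  # (B, context*dim)
            return self.out(torch.cat([d, w], dim=1))   # logits over next word

    model = PVDMConcat(n_docs=1000, vocab=20000)
    logits = model(torch.tensor([3]), torch.randint(0, 20000, (1, 4)))
    loss = nn.functional.cross_entropy(logits, torch.tensor([17]))

Averaging collapses that dim * (context + 1) input back down to dim. That's the kind of quiet divergence from the paper I mean.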
6. griomnib No.42199642{3}
…and a green line by the GOOG ticker.