I wonder if you could creatively combine this model with search algorithms to advance the state of the art in computer chess. I wouldn't be surprised to see such a bot pop up on TCEC in a couple of years.
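
To make that idea concrete, here is a minimal sketch of one way "combine the model with search" could look: a shallow negamax loop that calls a learned value head at the leaves instead of a handcrafted evaluation. `model_value` is a hypothetical placeholder, not DeepMind's or Lc0's actual API; move generation uses python-chess (`pip install chess`).

```python
import chess


def model_value(board: chess.Board) -> float:
    """Hypothetical learned evaluation: score in [-1, 1] from the
    side-to-move's perspective. Plug a real value network in here."""
    raise NotImplementedError


def negamax(board: chess.Board, depth: int) -> float:
    if board.is_checkmate():
        return -1.0                      # side to move is mated
    if board.is_game_over():
        return 0.0                       # stalemate, repetition, etc.
    if depth == 0:
        return model_value(board)        # leaf: ask the network instead of searching deeper
    best = -float("inf")
    for move in board.legal_moves:
        board.push(move)
        best = max(best, -negamax(board, depth - 1))
        board.pop()
    return best


def pick_move(board: chess.Board, depth: int = 2) -> chess.Move:
    """Return the legal move with the best negamax score for the side to move."""
    best_move, best_score = None, -float("inf")
    for move in board.legal_moves:
        board.push(move)
        score = -negamax(board, depth - 1)
        board.pop()
        if score > best_score:
            best_move, best_score = move, score
    return best_move
```

Real engines would add a policy head to order and prune moves (as Lc0's MCTS does) plus transposition tables and time management, but the sketch shows where a learned evaluator slots into a search loop.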
replies(3):
- The Leela open-source community had already used a transformer architecture to train Lc0 long before the paper (and published it, too!), and got much better results than DeepMind's new massive model
- The top engines with search (Stockfish NNUE, Lc0) beat DeepMind's model by clear margins under normal competition conditions
- Speaking of efficiency, Stockfish NNUE can run on a commodity PC at only slightly lower Elo. AlphaZero and DeepMind's new model cannot even run on such hardware to begin with.