TY - JOUR
T1 - A universal training scheme and the resulting universality for machine learning phases
AU - Tseng, Yuan Heng
AU - Jiang, Fu Jiun
AU - Huang, C. Y.
N1 - Publisher Copyright:
© The Author(s) 2022. Published by Oxford University Press on behalf of the Physical Society of Japan.
PY - 2023/1/1
Y1 - 2023/1/1
AB - An autoencoder (AE) and a generative adversarial network (GAN) are trained only once, on a one-dimensional (1D) lattice of 200 sites. Moreover, the AE contains only one hidden layer consisting of two neurons, and the generator and the discriminator of the GAN are each made up of two neurons as well. The training set employed to train both of the considered unsupervised neural networks (NNs) is composed of two artificial configurations. Remarkably, despite their simple architectures, both the built AE and the built GAN have precisely determined the critical points of several models, including the three-dimensional classical O(3) model, the two-dimensional generalized classical XY model, the two-dimensional two-state Potts model, and the one-dimensional Bose–Hubbard model. In addition, the built AE and GAN are several thousand times faster in calculation than conventional unsupervised NN approaches. The results presented here, together with those reported previously in the literature, suggest that, where phase transitions are concerned, an elegant universal NN that is extremely efficient and applicable to a broad range of physical systems can be constructed with ease. In particular, since an NN trained on two configurations can be applied to many models, it is likely that, where machine learning is concerned, the majority of phase transitions belong to a class having two elements, i.e., the Ising class.
UR - http://www.scopus.com/inward/record.url?scp=85164161549&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85164161549&partnerID=8YFLogxK
DO - 10.1093/ptep/ptac173
M3 - Article
AN - SCOPUS:85164161549
SN - 2050-3911
VL - 2023
JO - Progress of Theoretical and Experimental Physics
JF - Progress of Theoretical and Experimental Physics
IS - 1
M1 - 013A03
ER -
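
The record above describes an unusually small architecture: an AE whose single hidden layer holds two neurons, trained once on two artificial configurations of a 200-site 1D lattice and then reused across many models to locate critical points. The sketch below is a minimal, illustrative PyTorch rendering of such an AE, not the authors' published code: the choice of the two artificial training configurations (all-zeros and all-ones here), the activation, the loss, the optimizer, and the epoch count are all assumptions made for this sketch. The two-neuron generator and discriminator of the GAN could be written analogously and are omitted for brevity.

```python
import torch
import torch.nn as nn

L = 200  # number of sites on the 1D lattice, as stated in the abstract

class TinyAE(nn.Module):
    """AE with a single hidden layer of two neurons, matching the
    architecture described in the record; the sigmoid activation is an
    assumption, not a published detail."""
    def __init__(self, n_sites: int = L):
        super().__init__()
        self.encoder = nn.Linear(n_sites, 2)   # one hidden layer, two neurons
        self.decoder = nn.Linear(2, n_sites)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(torch.sigmoid(self.encoder(x)))

# Training set of exactly two artificial configurations; the abstract does
# not spell them out, so all-zeros and all-ones are used here as a guess.
train = torch.stack([torch.zeros(L), torch.ones(L)])

model = TinyAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)  # illustrative settings
loss_fn = nn.MSELoss()

for epoch in range(2000):  # epoch count is arbitrary; training is cheap
    opt.zero_grad()
    loss = loss_fn(model(train), train)
    loss.backward()
    opt.step()

# In the universal-scheme approach, the trained AE is subsequently applied
# to real Monte Carlo configurations of a target model, and the behavior of
# its (site-averaged) output as a function of temperature or coupling is
# used to locate the critical point.
with torch.no_grad():
    print(model(train).mean(dim=1))
```

The point of this design is economy: with only two training configurations and two hidden neurons, training completes in seconds, which is the source of the several-thousand-fold speed-up over conventional unsupervised NN approaches claimed in the abstract.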