TY - JOUR
T1 - Loss functions of generative adversarial networks (GANs)
T2 - Opportunities and challenges
AU - Pan, Zhaoqing
AU - Yu, Weijie
AU - Wang, Bosi
AU - Xie, Haoran
AU - Sheng, Victor S.
AU - Lei, Jianjun
AU - Kwong, Sam
N1 - Funding Information:
Manuscript received December 10, 2019; revised April 3, 2020; accepted April 24, 2020. Date of publication May 21, 2020; date of current version July 22, 2020. This work was supported in part by the National Natural Science Foundation of China under Grant 61971232, in part by the Six Talent Peaks Project of Jiangsu Province under Grant XYDXXJS-041, and in part by the Natural Science Foundation of Tianjin under Grants 18ZXZNGX00110 and 18JCJQJC45800. Paper no. TETCI-2019-0268. (Weijie Yu and Bosi Wang contributed equally to this work.) (Corresponding author: Jianjun Lei.) Zhaoqing Pan is with the School of Computer and Software, Nanjing University of Information Science and Technology, Nanjing 210044, China, and also with the State Key Laboratory of Integrated Services Networks, Xidian University, Xi’an 710071, China (e-mail: zqpan3-c@my.cityu.edu.hk).
Publisher Copyright:
© 2020 IEEE.
PY - 2020/8
Y1 - 2020/8
N2 - Recently, Generative Adversarial Networks (GANs) have rapidly become a key and promising research direction in computational intelligence. To improve the modeling ability of GANs, loss functions are used to measure the differences between samples generated by the model and real samples, and to guide the model toward its learning goal. In this paper, we survey the loss functions used in GANs and analyze their pros and cons. First, the basic theory of GANs and their training mechanism are introduced. Second, the loss functions used in GANs are summarized, covering not only the objective functions of GANs but also application-oriented GAN loss functions. Third, experiments on and analyses of representative loss functions are discussed. Finally, several suggestions on how to choose an appropriate loss function for a specific task are given.
AB - Recently, Generative Adversarial Networks (GANs) have rapidly become a key and promising research direction in computational intelligence. To improve the modeling ability of GANs, loss functions are used to measure the differences between samples generated by the model and real samples, and to guide the model toward its learning goal. In this paper, we survey the loss functions used in GANs and analyze their pros and cons. First, the basic theory of GANs and their training mechanism are introduced. Second, the loss functions used in GANs are summarized, covering not only the objective functions of GANs but also application-oriented GAN loss functions. Third, experiments on and analyses of representative loss functions are discussed. Finally, several suggestions on how to choose an appropriate loss function for a specific task are given.
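N1 - Background for the abstract: the canonical loss that the surveyed GAN variants build on is the minimax objective of Goodfellow et al. (2014); the expression below restates that standard objective rather than any formulation specific to this article.
$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big] + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]$
Here $D$ is the discriminator, $G$ is the generator, $p_{\mathrm{data}}$ is the real-data distribution, and $p_z$ is the noise distribution fed to $G$; alternative losses covered by the survey (e.g., least-squares and Wasserstein formulations) replace the log terms with other distance measures between generated and real samples.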
KW - Loss functions
KW - computational intelligence
KW - deep learning
KW - generative adversarial networks (GANs)
KW - machine learning
UR - http://www.scopus.com/inward/record.url?scp=85085747332&partnerID=8YFLogxK
U2 - 10.1109/TETCI.2020.2991774
DO - 10.1109/TETCI.2020.2991774
M3 - Article
AN - SCOPUS:85085747332
VL - 4
SP - 500
EP - 522
JO - IEEE Transactions on Emerging Topics in Computational Intelligence
JF - IEEE Transactions on Emerging Topics in Computational Intelligence
SN - 2471-285X
IS - 4
M1 - 9098081
ER -