Loss functions of generative adversarial networks (GANs): Opportunities and challenges

Zhaoqing Pan, Weijie Yu, Bosi Wang, Haoran Xie, Victor S. Sheng, Jianjun Lei, Sam Kwong

Research output: Contribution to journal › Article › peer-review

35 Scopus citations

Abstract

Generative Adversarial Networks (GANs) have recently become a key and promising research direction in computational intelligence. To improve the modeling ability of GANs, loss functions are used to measure the differences between samples generated by the model and real samples, and to drive the model toward the learning objective. In this paper, we survey the loss functions used in GANs and analyze their pros and cons. First, the basic theory of GANs and their training mechanism are introduced. Second, the loss functions used in GANs are summarized, covering not only the objective functions of GANs but also application-oriented GAN loss functions. Third, experiments and analyses of representative loss functions are discussed. Finally, several suggestions on how to choose an appropriate loss function for a specific task are given.
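For reference, many of the loss functions the paper surveys extend or replace the minimax objective of the original GAN (Goodfellow et al., 2014); in the conventional notation (which may differ from the paper's own), the generator G and discriminator D play the two-player game

\[
\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\log D(x)\big] + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big].
\]

In practice the generator is often trained with the non-saturating variant, maximizing \(\mathbb{E}_{z \sim p_z(z)}[\log D(G(z))]\) instead, which provides stronger gradients early in training.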

Original language: English
Article number: 9098081
Pages (from-to): 500-522
Number of pages: 23
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence
Volume: 4
Issue number: 4
DOIs
State: Published - Aug 2020

Keywords

  • Loss functions
  • computational intelligence
  • deep learning
  • generative adversarial networks (GANs)
  • machine learning

