Extended Kalman filter training of neural networks on a SIMD parallel machine

Shuhui Li, Donald C. Wunsch, Edgar O'Hair, Michael G. Giesselmann

Research output: Contribution to journal › Article › peer-review

17 Scopus citations

Abstract

The extended Kalman filter (EKF) algorithm has been shown to be advantageous for neural network training. However, unlike backpropagation (BP), the EKF algorithm requires many matrix operations, which greatly increase its computational complexity. This paper presents a method for performing EKF training on a SIMD parallel machine. We use a multistream decoupled extended Kalman filter (DEKF) training algorithm, which makes efficient use of the parallel resources and yields improved trained network weights. Based on the overall design of the DEKF algorithm and the goal of maximizing the use of the parallel resources, the multistream DEKF training is implemented on a MasPar SIMD parallel machine. The performance of the parallel DEKF training algorithm is studied, and comparisons are performed to investigate pattern-mode and batch-form training for both the EKF and BP training algorithms.
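The abstract notes that the decoupled EKF partitions the network weights into groups, each with its own covariance matrix, which is what reduces the cost of the matrix operations relative to a fully coupled EKF. The following is a minimal NumPy sketch of a single DEKF weight-update step under that standard formulation; the function name, shapes, and the noise parameters `q` and `r` are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hypothetical sketch of one decoupled EKF (DEKF) update step.
# The weights are split into groups; each group i keeps its own
# covariance P_i, which is what makes the filter "decoupled".

def dekf_step(weights, covs, jacobians, error, q=1e-4, r=1.0):
    """One DEKF update.

    weights:   list of per-group weight vectors w_i, shape (n_i,)
    covs:      list of per-group covariances P_i, shape (n_i, n_i)
    jacobians: list of H_i = d(output)/d(w_i), shape (m, n_i)
    error:     target minus network output, shape (m,)
    q, r:      assumed process- and measurement-noise levels
    """
    m = error.shape[0]
    # Global scaling matrix: couples the groups through the output.
    a = r * np.eye(m)
    for P, H in zip(covs, jacobians):
        a += H @ P @ H.T
    a_inv = np.linalg.inv(a)
    new_w, new_P = [], []
    for w, P, H in zip(weights, covs, jacobians):
        K = P @ H.T @ a_inv          # Kalman gain for group i
        new_w.append(w + K @ error)  # weight update
        # Covariance update with artificial process noise q.
        new_P.append(P - K @ H @ P + q * np.eye(P.shape[0]))
    return new_w, new_P
```

Because each group's gain `K` only involves that group's covariance `P_i`, the per-group updates are independent once the shared matrix `a_inv` is formed, which is the property a SIMD machine can exploit by assigning groups (or training streams) to processor arrays.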

Original language: English
Pages (from-to): 544-562
Number of pages: 19
Journal: Journal of Parallel and Distributed Computing
Volume: 62
Issue number: 4
DOIs
State: Published - 2002

Keywords

  • Backpropagation
  • Extended Kalman filter
  • Massively parallel processing
  • Neural network
  • Wind power
