A New Initialization Method for Neural Networks with Weight Sharing

Xiaofeng DING, Hongfei YANG, Hui HU, Raymond H. CHAN, Yaxin PENG, Tieyong ZENG*

*Corresponding author for this work

Research output: Book Chapters | Papers in Conference Proceedings › Conference paper (refereed) › Research › peer-review

1 Citation (Scopus)

Abstract

A proper initialization of parameters in a neural network can facilitate its training. The Xavier initialization introduced by Glorot and Bengio, later generalized to the Kaiming initialization by He, Zhang, Ren and Sun, is now widely used. However, from experiments we find that networks with heavy weight sharing are difficult to train even with the Xavier or the Kaiming initialization. We also notice that a certain simple network can be decomposed in two ways, where one decomposition is difficult to train while the other is easy to train, even when both are properly initialized by the Xavier or the Kaiming initialization. In this paper we propose a new initialization method that increases the training speed and training stability of neural networks with heavy weight sharing. We also propose a simple yet efficient method to adjust learning rates layer by layer, which is indispensable to our initialization.
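For context, the two baseline schemes the abstract compares against can be sketched as follows. This is a minimal NumPy sketch of the standard Gaussian variants of Xavier (Glorot & Bengio, 2010) and Kaiming (He et al., 2015) initialization; the function names are illustrative and not from the paper, which proposes a different scheme for weight-shared networks.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=None):
    """Xavier/Glorot initialization: weight variance 2 / (fan_in + fan_out),
    chosen to keep activation and gradient variances roughly constant
    across layers for symmetric activations such as tanh."""
    rng = rng or np.random.default_rng(0)
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_out, fan_in))

def kaiming_init(fan_in, fan_out, rng=None):
    """Kaiming/He initialization: weight variance 2 / fan_in, which
    compensates for ReLU zeroing out half of the pre-activations."""
    rng = rng or np.random.default_rng(0)
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_out, fan_in))
```

Note that both schemes derive the variance from the fan-in/fan-out of a single layer; when a weight matrix is shared across several positions in the network, its effective fan changes, which is the failure mode the paper's method addresses.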

Original language: English
Title of host publication: Mathematical Methods in Image Processing and Inverse Problems, IPIP 2018
Editors: Xue-Cheng TAI, Suhua WEI, Haiguang LIU
Publisher: Springer
Pages: 165-179
Number of pages: 15
ISBN (Electronic): 9789811627019
ISBN (Print): 9789811627002
DOIs
Publication status: Published - 2021
Externally published: Yes
Event: International Workshop on Image Processing and Inverse Problems, IPIP 2018 - Beijing, China
Duration: 21 Apr 2018 – 24 Apr 2018

Publication series

Name: Springer Proceedings in Mathematics and Statistics
Volume: 360
ISSN (Print): 2194-1009
ISSN (Electronic): 2194-1017

Conference

Conference: International Workshop on Image Processing and Inverse Problems, IPIP 2018
Country/Territory: China
City: Beijing
Period: 21/04/18 – 24/04/18

Bibliographical note

Publisher Copyright:
© 2021, Springer Nature Singapore Pte Ltd.

Keywords

  • Learning rate
  • Neural networks
  • Weight sharing
  • Xavier initialization
