
Research Vishwa Dec 2022

By Dr. Sachin S. Bhosale and Dr. (Mrs.) Vijaya S. Bhosale

It is harder to train deeper neural networks. We present a residual learning framework that makes it easier to train networks that are much deeper than those used before. We explicitly reformulate the layers as learning residual functions with reference to the layer inputs, as opposed to learning unreferenced functions. We provide a wealth of empirical evidence to show that these residual networks are simpler to optimize and can benefit from significantly increased depth. On the ImageNet dataset, we evaluate residual networks with up to 152 layers, 8× deeper than VGG networks [40], but still with lower complexity. An ensemble of these residual nets achieves 3.57 percent error on the ImageNet test set. This result took first place in the ILSVRC 2015 classification task. In addition, the findings of our investigation into CIFAR-10 with 100 and 1000 layers are presented.
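The core idea above, that layers learn a residual function F(x) added back to the input x rather than an unreferenced mapping, can be sketched in a few lines. This is a hypothetical, minimal illustration (a two-layer dense transformation stands in for the paper's convolutional blocks; the function and weight names are not from the original):

```python
import numpy as np

def residual_block(x, weights, activation=np.tanh):
    """Minimal sketch of residual learning: the block models a residual
    F(x) and adds the identity shortcut, so the output is F(x) + x.
    This is an illustrative stand-in, not the paper's exact architecture."""
    # F(x): a small two-layer transformation in place of conv layers
    h = activation(x @ weights[0])
    f = h @ weights[1]
    # Identity shortcut: add the input back, so the layers only need
    # to fit the residual, not the full mapping
    return f + x

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
# Near-zero weights make F(x) ~ 0, so the block is close to the identity
w = [rng.standard_normal((4, 4)) * 0.01, rng.standard_normal((4, 4)) * 0.01]
y = residual_block(x, w)
```

When the weights are small, the block defaults to the identity mapping, which is one intuition for why very deep residual networks remain easy to optimize: extra layers can start as near no-ops instead of having to learn an identity from scratch.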

Details

Publication Date
Dec 30, 2022
Language
English
ISBN
9781387371112
Category
Education & Language
Copyright
All Rights Reserved - Standard Copyright License
Contributors
By (author): Dr. Sachin S. Bhosale, By (author): Dr. (Mrs.) Vijaya S. Bhosale

Specifications

Format
EPUB
