Show simple item record

dc.contributor.advisor    Robert Freund.    en_US
dc.contributor.author    Ang, Jun Siong.    en_US
dc.contributor.other    Massachusetts Institute of Technology. Engineering and Management Program.    en_US
dc.contributor.other    System Design and Management Program.    en_US
dc.date.accessioned    2019-09-17T19:49:40Z
dc.date.available    2019-09-17T19:49:40Z
dc.date.copyright    2019    en_US
dc.date.issued    2019    en_US
dc.identifier.uri    https://hdl.handle.net/1721.1/122241
dc.description    Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, System Design and Management Program, 2019    en_US
dc.description    Cataloged from PDF version of thesis.    en_US
dc.description    Includes bibliographical references (pages R-1 to R-2).    en_US
dc.description.abstract    With vast improvements in computational power, increased accessibility to big data, and rapid innovations in computing algorithms, the use of neural networks for both engineering and business purposes was met with renewed interest beginning in the early 2000s. Amidst substantial development, the Softplus and Rectified Linear Unit (ReLU) activation functions were introduced in 2000 and 2001 respectively, with the latter emerging as the more popular choice of activation function in neural networks. Notably, the ReLU activation function maintains a high degree of gradient propagation while offering greater model sparsity and computational efficiency than Softplus. As an alternative to the ReLU, a family of modified Softplus activation functions - the "Smoothing" activation functions of the form g(z) = μ log(1 + e^(z/μ)) - has been proposed. In theory, the Smoothing activation function retains the high degree of gradient propagation and model simplicity characteristic of the ReLU function, while eliminating possible issues associated with the non-differentiability of ReLU at the origin. In this research, the performance of the Smoothing family of activation functions is examined vis-à-vis the ReLU activation function.    en_US
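
As a minimal illustration of the activation function defined in the abstract (not part of the thesis record itself), the following NumPy sketch evaluates g(z) = μ log(1 + e^(z/μ)) and checks numerically that it approaches ReLU as μ shrinks; the function names and the use of np.logaddexp for numerical stability are assumptions, not taken from the thesis:

    import numpy as np

    def smoothing(z, mu=1.0):
        # Smoothing activation: g(z) = mu * log(1 + exp(z / mu)).
        # np.logaddexp(0, x) computes log(exp(0) + exp(x)) stably for large |x|.
        return mu * np.logaddexp(0.0, z / mu)

    def relu(z):
        # ReLU(z) = max(0, z), the baseline the thesis compares against.
        return np.maximum(0.0, z)

    z = np.linspace(-5.0, 5.0, 101)
    for mu in (1.0, 0.1, 0.01):
        gap = np.max(np.abs(smoothing(z, mu) - relu(z)))
        print(f"mu = {mu}: max |g(z) - ReLU(z)| = {gap:.4f}")

The printed gap shrinks with μ (it is bounded by μ log 2, attained at z = 0), which is the sense in which the Smoothing family is a differentiable approximation of ReLU.
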
dc.description.statementofresponsibility    by Jun Siong Ang.    en_US
dc.format.extent    78, A-1 to A-26, B-1 to B-10, R-1 to R-2 pages    en_US
dc.language.iso    eng    en_US
dc.publisher    Massachusetts Institute of Technology    en_US
dc.rights    MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source but further reproduction or distribution in any format is prohibited without written permission.    en_US
dc.rights.uri    http://dspace.mit.edu/handle/1721.1/7582    en_US
dc.subject    Engineering and Management Program.    en_US
dc.subject    System Design and Management Program.    en_US
dc.title    Evaluation of the smoothing activation function in neural networks for business applications    en_US
dc.type    Thesis    en_US
dc.description.degree    S.M. in Engineering and Management    en_US
dc.contributor.department    Massachusetts Institute of Technology. Engineering and Management Program    en_US
dc.identifier.oclc    1119537063    en_US
dc.description.collection    S.M. in Engineering and Management, Massachusetts Institute of Technology, System Design and Management Program    en_US
dspace.imported    2019-09-17T19:49:38Z    en_US
mit.thesis.degree    Master    en_US
mit.thesis.department    SysDes    en_US

