Show simple item record

dc.contributor.author: Edelman, Alan
dc.contributor.author: Johnson, Steven G.
dc.date.accessioned: 2024-07-15T23:59:17Z
dc.date.available: 2024-07-15T23:59:17Z
dc.date.issued: 2022
dc.identifier.other: 18.S096-IAP-JanuaryIAP2022
dc.identifier.other: 18.S097
dc.identifier.uri: https://hdl.handle.net/1721.1/155680
dc.description.abstract: We all know that calculus courses such as 18.01 Single Variable Calculus and 18.02 Multivariable Calculus cover univariate and vector calculus, respectively. Modern applications such as machine learning require the next big step, matrix calculus. This class covers a coherent approach to matrix calculus showing techniques that allow you to think of a matrix holistically (not just as an array of scalars), compute derivatives of important matrix factorizations, and really understand forward and reverse modes of differentiation. We will discuss adjoint methods, custom Jacobian matrix-vector products, and how modern automatic differentiation is more computer science than mathematics in that it is neither symbolic nor based on finite differences.
dc.language.iso: en_US
dc.rights: Attribution-NonCommercial-NoDerivs 3.0 United States
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/3.0/us/
dc.subject: matrix calculus
dc.subject: modes of differentiation
dc.subject: applied mathematics
dc.subject: calculus
dc.subject: linear algebra
dc.subject: adjoint methods
dc.subject: Jacobian matrix-vector products
dc.subject: modern automatic differentiation
dc.title: 18.S096 Matrix Calculus for Machine Learning and Beyond, January IAP 2022
dc.type: Learning Object
dc.contributor.department: Massachusetts Institute of Technology. Department of Mathematics
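The abstract notes that modern automatic differentiation is neither symbolic nor based on finite differences. A minimal sketch of the forward-mode idea is dual-number arithmetic: carry a (value, derivative) pair through every operation so the exact derivative falls out of ordinary evaluation. This example is illustrative only and is not drawn from the course materials; the names `Dual` and `derivative` are assumptions.

```python
class Dual:
    """Dual number: a value paired with its derivative (a + b*eps, eps**2 = 0)."""

    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate f'(x) exactly by seeding the dual (derivative) part with 1."""
    return f(Dual(x, 1.0)).deriv


# d/dx (x*x + 3*x) at x = 2 is 2*2 + 3 = 7
print(derivative(lambda x: x * x + 3 * x, 2.0))  # prints 7.0
```

No symbolic expression is ever built and no step size is chosen: the chain rule is applied numerically, operation by operation, which is the sense in which the abstract calls automatic differentiation "more computer science than mathematics".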

