dc.contributor.author | Edelman, Alan | |
dc.contributor.author | Johnson, Steven G. | |
dc.date.accessioned | 2024-07-15T23:59:17Z | |
dc.date.available | 2024-07-15T23:59:17Z | |
dc.date.issued | 2022 | |
dc.identifier.other | 18.S096-IAP-JanuaryIAP2022 | |
dc.identifier.other | 18.S097 | |
dc.identifier.uri | https://hdl.handle.net/1721.1/155680 | |
dc.description.abstract | We all know that calculus courses such as 18.01 Single Variable Calculus and 18.02 Multivariable Calculus cover univariate and vector calculus, respectively. Modern applications such as machine learning require the next big step, matrix calculus. This class covers a coherent approach to matrix calculus, showing techniques that allow you to think of a matrix holistically (not just as an array of scalars), compute derivatives of important matrix factorizations, and really understand forward and reverse modes of differentiation. We will discuss adjoint methods, custom Jacobian matrix-vector products, and how modern automatic differentiation is more computer science than mathematics, in that it is neither symbolic nor based on finite differences. | en_US
dc.language.iso | en_US | en_US |
dc.rights | Attribution-NonCommercial-NoDerivs 3.0 United States | * |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/3.0/us/ | * |
dc.subject | matrix calculus | en_US |
dc.subject | modes of differentiation | en_US |
dc.subject | applied mathematics | en_US |
dc.subject | calculus | en_US |
dc.subject | linear algebra | en_US |
dc.subject | adjoint methods | en_US |
dc.subject | Jacobian matrix-vector products | en_US
dc.subject | modern automatic differentiation | en_US |
dc.title | 18.S096 Matrix Calculus for Machine Learning and Beyond, January IAP 2022 | en_US |
dc.type | Learning Object | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Department of Mathematics | |
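As an illustrative aside, not part of the catalog record or the course materials: the abstract notes that modern automatic differentiation is neither symbolic nor based on finite differences, and distinguishes forward from reverse modes. A minimal sketch of forward-mode automatic differentiation with dual numbers, in Python, is below; the names Dual and f are hypothetical and chosen only for this example.

import math

class Dual:
    """A number a + b*eps with eps**2 == 0; b carries the derivative."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        # Product rule propagated exactly, not symbolically and not by differencing.
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

    __rmul__ = __mul__

def sin(x):
    # Chain rule for sin: d(sin u) = cos(u) du
    if isinstance(x, Dual):
        return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)
    return math.sin(x)

def f(x):
    return x * sin(x) + x * x   # f(x) = x sin x + x^2

# Seeding deriv=1.0 yields f(3) and f'(3) = sin(3) + 3 cos(3) + 6 in one forward pass.
out = f(Dual(3.0, 1.0))
print(out.value, out.deriv)

Reverse mode, by contrast, records the computation and propagates sensitivities backward, which is what makes vector-Jacobian products cheap for functions with many inputs and few outputs; the course materials develop that direction in detail.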