Complete Linear Algebra: Theory And Implementation In Code
Last updated 9/2022
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English | Size: 8.51 GB | Duration: 34h 0m
Learn concepts in linear algebra and matrix analysis, and implement them in MATLAB and Python.
What you'll learn
Understand theoretical concepts in linear algebra, including proofs
Implement linear algebra concepts in scientific programming languages (MATLAB, Python)
Apply linear algebra concepts to real datasets
Ace your linear algebra exam!
Apply linear algebra on computers with confidence
Gain additional insights into solving problems in linear algebra, including homework problems and applications
Be confident in learning advanced linear algebra topics
Understand some of the important maths underlying machine learning
Understand the math underlying most of AI (artificial intelligence)
Requirements
Basic understanding of high-school algebra (e.g., solve for x in 2x=5)
Interest in learning about matrices and vectors!
(optional) Computer with MATLAB, Octave, or Python (or Jupyter)
Description
You need to learn linear algebra! Linear algebra is perhaps the most important branch of mathematics for the computational sciences, including machine learning, AI, data science, statistics, simulations, computer graphics, multivariate analyses, matrix decompositions, and signal processing.

You need to know applied linear algebra, not just abstract linear algebra! The way linear algebra is presented in 30-year-old textbooks is different from how professionals use it on computers to solve real-world problems in machine learning, data science, statistics, and signal processing. For example, the "determinant" of a matrix is important in linear algebra theory, but should you actually use the determinant in practical applications? The answer may surprise you, and it's in this course!

If you are interested in learning the mathematical concepts of linear algebra and matrix analysis, but also want to apply those concepts to data analyses on computers (e.g., statistics or signal processing), then this course is for you! You'll see all the math concepts implemented in MATLAB and in Python.

Unique aspects of this course
- Clear and comprehensible explanations of concepts and theories in linear algebra.
- Several distinct explanations of the same ideas, which is a proven technique for learning.
- Visualizations using graphs, numbers, and spaces that strengthen the geometric intuition of linear algebra.
- Implementations in MATLAB and Python. Come on, in the real world you never solve math problems by hand! You need to know how to implement math in software.
- Beginning-to-intermediate topics, including vectors, matrix multiplication, least-squares projections, eigendecomposition, and singular value decomposition.
- Strong focus on modern, applications-oriented aspects of linear algebra and matrix analysis.
- Intuitive visual explanations of diagonalization, eigenvalues and eigenvectors, and singular value decomposition.
- Improve your coding skills!
You do need a little coding experience for this course (I do not teach elementary Python or MATLAB), but you will definitely improve your scientific and data-analysis programming skills. Everything is explained in MATLAB and in Python (mostly using numpy and matplotlib; also sympy, scipy, and some other relevant toolboxes).

Benefits of learning linear algebra
- Understand statistics, including least-squares, regression, and multivariate analyses.
- Improve mathematical simulations in engineering, computational biology, finance, and physics.
- Understand data compression and dimension reduction (PCA, SVD, eigendecomposition).
- Understand the math underlying machine learning and linear classification algorithms.
- Gain deeper knowledge of signal-processing methods, particularly filtering and multivariate subspace methods.
- Explore the link between linear algebra, matrices, and geometry.
- Gain more experience implementing math and understanding machine-learning concepts in Python and MATLAB.
- Linear algebra is a prerequisite for machine learning and artificial intelligence (AI).

Why I am qualified to teach this course: I have been using linear algebra extensively in my research and teaching (in MATLAB and Python) for many years, and I have written several textbooks about data analysis, programming, and statistics that rely extensively on concepts in linear algebra.

So what are you waiting for?? Watch the course introductory video and free sample videos to learn more about the contents of this course and about my teaching style. If you are unsure whether this course is right for you and want to learn more, feel free to contact me with questions before you sign up.

I hope to see you soon in the course!

Mike
Overview
Section 1: Introductions
Lecture 1 What is linear algebra?
Lecture 2 Linear algebra applications
Lecture 3 An enticing start to a linear algebra course!
Lecture 4 How best to learn from this course
Lecture 5 Maximizing your Udemy experience
Section 2: Get the course materials
Lecture 6 How to download and use course materials
Section 3: Vectors
Lecture 7 Algebraic and geometric interpretations of vectors
Lecture 8 Vector addition and subtraction
Lecture 9 Vector-scalar multiplication
Lecture 10 Vector-vector multiplication: the dot product
Lecture 11 Dot product properties: associative, distributive, commutative
Lecture 12 Code challenge: dot products with matrix columns
Lecture 13 Code challenge: is the dot product commutative?
Lecture 14 Vector length
Lecture 15 Dot product geometry: sign and orthogonality
Lecture 16 Code challenge: Cauchy-Schwarz inequality
Lecture 17 Code challenge: dot product sign and scalar multiplication
Lecture 18 Vector Hadamard multiplication
Lecture 19 Outer product
Lecture 20 Vector cross product
Lecture 21 Vectors with complex numbers
Lecture 22 Hermitian transpose (a.k.a. conjugate transpose)
Lecture 23 Interpreting and creating unit vectors
Lecture 24 Code challenge: dot products with unit vectors
Lecture 25 Dimensions and fields in linear algebra
Lecture 26 Subspaces
Lecture 27 Subspaces vs. subsets
Lecture 28 Span
Lecture 29 Linear independence
Lecture 30 Basis
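The course implements these vector operations in MATLAB and Python. As a quick preview of this section, here is a minimal NumPy sketch of the dot product, vector length, and unit vectors; the specific vectors are illustrative, not taken from the course materials:

```python
import numpy as np

# Illustrative vectors (not from the course materials)
v = np.array([3.0, 4.0])
w = np.array([-4.0, 3.0])

dot = np.dot(v, w)           # dot product: 3*(-4) + 4*3 = 0
length = np.linalg.norm(v)   # vector length (Euclidean norm): 5
unit_v = v / length          # unit vector pointing in the direction of v

# A zero dot product means the vectors are orthogonal
print(dot, length, unit_v)
```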
Section 4: Introduction to matrices
Lecture 31 Matrix terminology and dimensionality
Lecture 32 A zoo of matrices
Lecture 33 Matrix addition and subtraction
Lecture 34 Matrix-scalar multiplication
Lecture 35 Code challenge: is matrix-scalar multiplication a linear operation?
Lecture 36 Transpose
Lecture 37 Complex matrices
Lecture 38 Diagonal and trace
Lecture 39 Code challenge: linearity of trace
Lecture 40 Broadcasting matrix arithmetic
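As a preview of the transpose, trace, and broadcasting lectures above, here is a minimal NumPy sketch with an illustrative matrix (not from the course materials):

```python
import numpy as np

A = np.arange(6.0).reshape(3, 2)   # a 3x2 matrix

At = A.T                           # transpose: swap rows and columns
tr = np.trace(A.T @ A)             # trace: sum of the diagonal elements

# Broadcasting: add a row vector to every row of A
r = np.array([10.0, 20.0])
B = A + r

print(At.shape, tr, B)
```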
Section 5: Matrix multiplications
Lecture 41 Introduction to standard matrix multiplication
Lecture 42 Four ways to think about matrix multiplication
Lecture 43 Code challenge: matrix multiplication by layering
Lecture 44 Matrix multiplication with a diagonal matrix
Lecture 45 Order-of-operations on matrices
Lecture 46 Matrix-vector multiplication
Lecture 47 2D transformation matrices
Lecture 48 Code challenge: Pure and impure rotation matrices
Lecture 49 Code challenge: Geometric transformations via matrix multiplications
Lecture 50 Additive and multiplicative matrix identities
Lecture 51 Additive and multiplicative symmetric matrices
Lecture 52 Hadamard (element-wise) multiplication
Lecture 53 Code challenge: symmetry of combined symmetric matrices
Lecture 54 Multiplication of two symmetric matrices
Lecture 55 Code challenge: standard and Hadamard multiplication for diagonal matrices
Lecture 56 Code challenge: Fourier transform via matrix multiplication!
Lecture 57 Frobenius dot product
Lecture 58 Matrix norms
Lecture 59 Code challenge: conditions for self-adjoint
Lecture 60 Code challenge: The matrix asymmetry index
Lecture 61 What about matrix division?
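The "layering" perspective on matrix multiplication mentioned above can be sketched in a few lines of NumPy: the product C = AB equals the sum of outer products of A's columns with B's rows. The random matrices here are illustrative, not from the course materials:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))
B = rng.standard_normal((2, 4))

C = A @ B  # standard matrix multiplication

# "Layering": build C as a sum of outer products
# (columns of A times the corresponding rows of B)
C_layers = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))

assert np.allclose(C, C_layers)
```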
Section 6: Matrix rank
Lecture 62 Rank: concepts, terms, and applications
Lecture 63 Computing rank: theory and practice
Lecture 64 Rank of added and multiplied matrices
Lecture 65 Code challenge: reduced-rank matrix via multiplication
Lecture 66 Code challenge: scalar multiplication and rank
Lecture 67 Rank of A^TA and AA^T
Lecture 68 Code challenge: rank of multiplied and summed matrices
Lecture 69 Making a matrix full-rank by "shifting"
Lecture 70 Code challenge: is this vector in the span of this set?
Lecture 71 Course tangent: self-accountability in online learning
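A small NumPy sketch of the rank ideas in this section: computing rank, the fact that rank(A^T A) = rank(A), and making a reduced-rank matrix full rank by "shifting" with a scaled identity. The matrix is illustrative, not from the course materials:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])   # rank 1: the second column is 2x the first

print(np.linalg.matrix_rank(A))        # 1
print(np.linalg.matrix_rank(A.T @ A))  # rank(A^T A) = rank(A) = 1

# Make a square reduced-rank matrix full rank by "shifting": A + lambda*I
S = A.T @ A + 0.01 * np.eye(2)
print(np.linalg.matrix_rank(S))        # 2
```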
Section 7: Matrix spaces
Lecture 72 Column space of a matrix
Lecture 73 Column space, visualized in code
Lecture 74 Row space of a matrix
Lecture 75 Null space and left null space of a matrix
Lecture 76 Column/left-null and row/null spaces are orthogonal
Lecture 77 Dimensions of column/row/null spaces
Lecture 78 Example of the four subspaces
Lecture 79 More on Ax=b and Ax=0
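The four subspaces can be explored numerically. As one illustrative sketch (using SVD, which the course covers later, to extract a null-space basis): the null space dimension plus the rank equals the number of columns (the rank-nullity theorem). The matrix is illustrative, not from the course materials:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])    # rank-1 matrix

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))    # numerical rank
N = Vt[r:].T                  # right singular vectors past the rank
                              # span the null space of A

assert np.allclose(A @ N, 0)          # A maps null-space vectors to zero
assert r + N.shape[1] == A.shape[1]   # rank-nullity theorem
```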
Section 8: Solving systems of equations
Lecture 80 Systems of equations: algebra and geometry
Lecture 81 Converting systems of equations to matrix equations
Lecture 82 Gaussian elimination
Lecture 83 Echelon form and pivots
Lecture 84 Reduced row echelon form
Lecture 85 Code challenge: RREF of matrices with different sizes and ranks
Lecture 86 Matrix spaces after row reduction
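Reduced row echelon form can be computed symbolically with sympy (one of the toolboxes the course uses). Here is a sketch with an illustrative 2-equation system, not one from the course materials:

```python
import sympy as sp

# A small illustrative system: x + 2y = 5, 3x + 4y = 6
M = sp.Matrix([[1, 2, 5],
               [3, 4, 6]])   # augmented matrix [A | b]

R, pivots = M.rref()         # reduced row echelon form
print(R)       # the last column holds the solution: x = -4, y = 9/2
print(pivots)  # pivot column indices: (0, 1)
```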
Section 9: Matrix determinant
Lecture 87 Determinant: concept and applications
Lecture 88 Determinant of a 2x2 matrix
Lecture 89 Code challenge: determinant of small and large singular matrices
Lecture 90 Determinant of a 3x3 matrix
Lecture 91 Code challenge: large matrices with row exchanges
Lecture 92 Find matrix values for a given determinant
Lecture 93 Code challenge: determinant of shifted matrices
Lecture 94 Code challenge: determinant of matrix product
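A quick numerical taste of this section: the 2x2 determinant formula ad - bc, and the fact that a singular matrix has determinant zero in theory but only approximately zero in floating-point practice (one reason the course questions the determinant's practical usefulness). The matrices are illustrative:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(np.linalg.det(A))    # ad - bc = 1*4 - 2*3 = -2

# A reduced-rank (singular) matrix has determinant 0 in theory,
# but expect floating-point round-off in practice
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(S))    # ~0 (possibly a tiny nonzero number)
```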
Section 10: Matrix inverse
Lecture 95 Matrix inverse: Concept and applications
Lecture 96 Computing the inverse in code
Lecture 97 Inverse of a 2x2 matrix
Lecture 98 The MCA algorithm to compute the inverse
Lecture 99 Code challenge: Implement the MCA algorithm!!
Lecture 100 Computing the inverse via row reduction
Lecture 101 Code challenge: inverse of a diagonal matrix
Lecture 102 Left inverse and right inverse
Lecture 103 One-sided inverses in code
Lecture 104 Proof: the inverse is unique
Lecture 105 Pseudo-inverse, part 1
Lecture 106 Code challenge: pseudoinverse of invertible matrices
Lecture 107 Why should you avoid the inverse?
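The last lecture's question ("Why should you avoid the inverse?") can be previewed in code: solving Ax = b via an explicit inverse gives the same answer as a direct solver on well-behaved matrices, but the solver is the numerically preferred route. The random system is illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))   # random square matrix (invertible
                                  # with probability 1)
b = rng.standard_normal(4)

x_inv = np.linalg.inv(A) @ b      # explicit inverse: avoid in practice
x_solve = np.linalg.solve(A, b)   # direct solver (LU): faster, more stable

assert np.allclose(x_inv, x_solve)
assert np.allclose(A @ x_solve, b)
```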
Section 11: Projections and orthogonalization
Lecture 108 Projections in R^2
Lecture 109 Projections in R^N
Lecture 110 Orthogonal and parallel vector components
Lecture 111 Code challenge: decompose vector to orthogonal components
Lecture 112 Orthogonal matrices
Lecture 113 Gram-Schmidt procedure
Lecture 114 QR decomposition
Lecture 115 Code challenge: Gram-Schmidt algorithm
Lecture 116 Matrix inverse via QR decomposition
Lecture 117 Code challenge: Inverse via QR
Lecture 118 Code challenge: Prove and demonstrate the Sherman-Morrison inverse
Lecture 119 Code challenge: A^TA = R^TR
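Two of this section's core ideas in a minimal NumPy sketch: projecting one vector onto another (the residual is orthogonal to the projection target), and QR decomposition, which numerically stabilizes Gram-Schmidt. The vectors and matrix are illustrative:

```python
import numpy as np

# Project b onto the line spanned by a: proj = (a.b / a.a) * a
a = np.array([1.0, 2.0])
b = np.array([3.0, 1.0])
proj = (a @ b) / (a @ a) * a
assert np.isclose((b - proj) @ a, 0.0)  # residual is orthogonal to a

# QR decomposition: Q has orthonormal columns
A = np.random.default_rng(4).standard_normal((4, 3))
Q, R = np.linalg.qr(A)
assert np.allclose(Q.T @ Q, np.eye(3))  # orthonormal columns
assert np.allclose(Q @ R, A)            # exact reconstruction
```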
Section 12: Least-squares for model-fitting in statistics
Lecture 120 Introduction to least-squares
Lecture 121 Least-squares via left inverse
Lecture 122 Least-squares via orthogonal projection
Lecture 123 Least-squares via row-reduction
Lecture 124 Model-predicted values and residuals
Lecture 125 Least-squares application 1
Lecture 126 Least-squares application 2
Lecture 127 Code challenge: Least-squares via QR decomposition
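A sketch of least-squares model fitting with hypothetical data (a line with noise, not a dataset from the course): the left-inverse solution from the normal equations matches NumPy's dedicated least-squares solver, and the residuals are orthogonal to the design matrix columns:

```python
import numpy as np

# Hypothetical data: fit y = b0 + b1*x (intercept + slope)
rng = np.random.default_rng(2)
x = np.arange(10.0)
y = 2.0 + 0.5 * x + rng.normal(0, 0.1, 10)

X = np.column_stack([np.ones_like(x), x])  # design matrix

# Via the left inverse (normal equations): beta = (X^T X)^-1 X^T y
beta_li = np.linalg.inv(X.T @ X) @ X.T @ y
# Preferred numerical route
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

assert np.allclose(beta_li, beta_ls)
residuals = y - X @ beta_ls
assert np.allclose(X.T @ residuals, 0)  # residuals orthogonal to columns
```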
Section 13: Eigendecomposition
Lecture 128 What are eigenvalues and eigenvectors?
Lecture 129 Finding eigenvalues
Lecture 130 Shortcut for eigenvalues of a 2x2 matrix
Lecture 131 Code challenge: eigenvalues of diagonal and triangular matrices
Lecture 132 Code challenge: eigenvalues of random matrices
Lecture 133 Finding eigenvectors
Lecture 134 Eigendecomposition by hand: two examples
Lecture 135 Diagonalization
Lecture 136 Matrix powers via diagonalization
Lecture 137 Code challenge: eigendecomposition of matrix differences
Lecture 138 Eigenvectors of distinct eigenvalues
Lecture 139 Eigenvectors of repeated eigenvalues
Lecture 140 Eigendecomposition of symmetric matrices
Lecture 141 Eigenlayers of a matrix
Lecture 142 Code challenge: reconstruct a matrix from eigenlayers
Lecture 143 Eigendecomposition of singular matrices
Lecture 144 Code challenge: trace and determinant, eigenvalues sum and product
Lecture 145 Generalized eigendecomposition
Lecture 146 Code challenge: GED in small and large matrices
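A minimal NumPy sketch of eigendecomposition and diagonalization, including the trace/determinant relations tested in the code challenge above. The symmetric matrix is illustrative:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # symmetric, so eigenvalues are real
evals, evecs = np.linalg.eig(A)  # eigenvalues are 1 and 3

# Diagonalization: A = V @ Lambda @ V^-1
assert np.allclose(A, evecs @ np.diag(evals) @ np.linalg.inv(evecs))

# Trace = sum of eigenvalues; determinant = product of eigenvalues
assert np.isclose(np.trace(A), evals.sum())
assert np.isclose(np.linalg.det(A), evals.prod())
```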
Section 14: Singular value decomposition
Lecture 147 Singular value decomposition (SVD)
Lecture 148 Code challenge: SVD vs. eigendecomposition for square symmetric matrices
Lecture 149 Relation between singular values and eigenvalues
Lecture 150 Code challenge: U from eigendecomposition of A^TA
Lecture 151 Code challenge: A^TA, Av, and singular vectors
Lecture 152 SVD and the four subspaces
Lecture 153 Spectral theory of matrices
Lecture 154 SVD for low-rank approximations
Lecture 155 Convert singular values to percent variance
Lecture 156 Code challenge: When is UV^T valid, what is its norm, and is it orthogonal?
Lecture 157 SVD, matrix inverse, and pseudoinverse
Lecture 158 SVD, (pseudo)inverse, and left-inverse
Lecture 159 Condition number of a matrix
Lecture 160 Code challenge: Create matrix with desired condition number
Lecture 161 Code challenge: Why you avoid the inverse
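Several of this section's themes in one NumPy sketch: full SVD reconstruction, the relation between squared singular values and the eigenvalues of A^T A, the condition number as a ratio of singular values, and a rank-1 low-rank approximation. The random matrix is illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vt)  # exact reconstruction

# Squared singular values equal the eigenvalues of A^T A
eig_ATA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(s**2, eig_ATA)

# Condition number = largest / smallest singular value
assert np.isclose(np.linalg.cond(A), s[0] / s[-1])

# Rank-1 (low-rank) approximation: keep only the largest singular value
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])
```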
Section 15: Quadratic form and definiteness
Lecture 162 The quadratic form in algebra
Lecture 163 The quadratic form in geometry
Lecture 164 The normalized quadratic form
Lecture 165 Code challenge: Visualize the normalized quadratic form
Lecture 166 Eigenvectors and the quadratic form surface
Lecture 167 Application of the normalized quadratic form: PCA
Lecture 168 Quadratic form of generalized eigendecomposition
Lecture 169 Matrix definiteness, geometry, and eigenvalues
Lecture 170 Proof: A^TA is always positive (semi)definite
Lecture 171 Proof: Eigenvalues and matrix definiteness
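A minimal sketch of the quadratic form and the eigenvalue test for definiteness proved above: for a symmetric matrix, all-positive eigenvalues mean x^T A x > 0 for every nonzero x. The matrix and vector are illustrative:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # symmetric

x = np.array([1.0, -1.0])
qf = x @ A @ x               # quadratic form x^T A x = 3.0

# A symmetric matrix is positive definite iff all eigenvalues are > 0,
# which guarantees a positive quadratic form for every nonzero x
assert np.all(np.linalg.eigvalsh(A) > 0)
assert qf > 0
```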
Section 16: Bonus section
Lecture 172 Bonus lecture
Who this course is for
Anyone interested in learning about matrices and vectors
Students who want supplemental instruction/practice for a linear algebra course
Engineers who want to refresh their knowledge of matrices and decompositions
Biologists who want to learn more about the math behind computational biology
Data scientists (linear algebra is everywhere in data science!)
Statisticians
Anyone who wants to know the important math underlying machine learning
Anyone who studied theoretical linear algebra and wants to implement the concepts on computers
Computational scientists (statistics, biology, engineering, neuroscience, psychology, physics, etc.)
Anyone who wants to learn about eigendecomposition, diagonalization, and singular value decomposition!
Artificial intelligence students