Math, math, math, math. Math?

Teaching


Spring 2018 - Math 235.04 Linear Algebra

Announcements

Check here frequently throughout the course for announcements, such as exam locations or assignment extensions.

The final exam for all sections will be held Monday 5/7/18, 10:30AM-12:30PM, in Boyden gym.
The syllabus for the final exam consists of the following sections of the textbook: 4.5, 4.6, 5.1, 5.2, 5.3, 5.5, 6.1, 6.2, and 6.3.

Resources

Main course webpage: where you can find the course syllabus, coursewide policy, and expectations about grading.

Check back in for our sections policy/expectations, links to assignments, and other resources.

Section policy and expectations
Note that I have additional office hours during the add/drop period. I can be found in LGRT 1323L according to the following schedule:
Mon 1/22: 1:20-2:20 pm
Wed 1/24: 9:30-10:30 am and 1:20-3:20 pm
Mon 1/29: 1:20-2:20 pm
Wed 1/31: 9:30-10:30 am and 1:20-3:20 pm
Thu 2/1: 2:30-3:30 pm

Quizzes and additional resources

Quiz 1 solutions

Quiz 2 solutions

Quiz 3 solutions

Quiz 4 solutions

Quiz 5 solutions

Quiz 6 solutions

Challenge Problems - Hard copy must be received by Friday, May 4, 5:00 pm (in my mailbox in LGRT).

Additional resources of interest:
Gilbert Strang's linear algebra lectures through MIT OCW (remark: I'd consider using Strang's textbook were it entirely up to me; these lectures are well worth watching!)
Linear algebra toolkit
Expository article about Google's use of linear algebra in web ranking
Original article about Google's development
A paper by Terence Parr and Jeremy Howard on the matrix calculus behind deep learning

Course Overview

On the surface, linear algebra, like calculus, teaches a set of tools for solving a specific class of computational problems. At times, it may seem that much of what you are trying to learn and master in this course is how to quickly perform strings of calculations. But that is surely not the foremost goal of this course. As I hope you'll discover, linear algebra is also a bridge into higher mathematics, and in many ways quite different from a subject concerned merely with solving a certain class of problems.

Indeed, linear algebra gives a unifying framework and an indispensable set of tools for many branches of mathematics. Consider calculus, where the notion of linear approximation led to a fruitful understanding of single variable functions, their rates of change, and how to measure accumulated change via the integral. At the heart of single variable calculus is this notion of linearization. Common themes also arose whenever new operations were defined and their properties examined. The limit of a sum of functions is the sum of the limits of the separate functions, provided these limits exist, and so it is also with derivatives and integrals of sums. If we scale functions by constants, we can move the constants outside of well-defined limit/derivative/integral operations, or pass them back in. These collective properties of limits, derivatives, and integrals are given the name linearity. Linearity arises naturally in the study of systems of linear equations, such as those for finding intersections of lines in a plane, or intersections of planes in three dimensional space. Our linear algebra course begins by studying such systems of equations.
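The linearity just described can be written down compactly. If L stands for any of the three operations (the limit, the derivative, or the integral), then for functions f, g and a constant c:

```latex
L(f + g) = L(f) + L(g), \qquad L(c\,f) = c\,L(f),
```

provided the operations in question are defined. For example, \((f+g)' = f' + g'\) and \(\int c f\,dx = c \int f\,dx\). Any operator satisfying these two identities is called linear.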

First we investigate systems of linear equations, understanding their solutions geometrically while learning algebraic, algorithmic methods to determine solutions if they exist and to decide whether they are unique. After this investigation we will uncover the general properties that make an object "linear" (it's the linearity discussed above!). The course will become a bit more abstract here, as we investigate spaces of linear objects, called vector spaces, and study maps between them. This formalizes the pattern described in the properties of the operations of calculus; we will come to understand derivatives as linear operators on functions, or equivalently as linear maps on the space of differentiable functions. As further examples of vector spaces, we will discuss spaces of polynomials.

We will explore matrices as a tool to encode linear systems. We will see that spaces of matrices are also vector spaces in a natural way, and that matrices give a simple notation for expressing linear transformations. We will describe determinants of square matrices, give an algorithm for their general computation, and discuss their geometric meaning.

We will investigate eigenvalues and eigenvectors, which play a role in linear algebra analogous to fixed points. We will discover that the eigenvalues are crucial objects of study (the "bread and butter" of mathematical physics, as they represent energy states in quantum mechanics, and are indispensable in solving ordinary and partial differential equations). We will use eigenvectors and eigenvalues to produce a decomposition involving diagonal matrices when the eigenspaces span the original vector space. We will also discuss inner product structures on vector spaces, of which the dot product is an example. This will lead us to an algorithm for producing orthonormal bases of an inner product space.
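Diagonalization can be sketched concretely in a few lines of NumPy (a hypothetical example for illustration, not part of the course materials): when the eigenvectors of a matrix A span the space, A factors as P D P^-1 with D diagonal, which in turn makes powers of A cheap to compute.

```python
import numpy as np

# A diagonalizable 2x2 matrix (a made-up example); its eigenvalues are 5 and 2
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns are eigenvectors
eigvals, P = np.linalg.eig(A)
D = np.diag(eigvals)

# Since the eigenvectors span R^2, A factors as P D P^{-1}
assert np.allclose(P @ D @ np.linalg.inv(P), A)

# Diagonalization makes powers easy: A^5 = P D^5 P^{-1}
A5 = P @ np.diag(eigvals**5) @ np.linalg.inv(P)
assert np.allclose(A5, np.linalg.matrix_power(A, 5))
```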

Whenever possible, applications will be injected into the discussion so that we may apply our theory in interesting ways. But there are many more topics that belong in a linear algebra course than one can hope to cover in under 2000 minutes of lecture! Thus, I highly encourage you to come to my office hours to discuss additional problems and topics, and to find interesting applications of the subject. If you are a mathematics major I especially encourage you to look into more advanced resources on linear algebra; often it is expected later on in your mathematical career that you have an advanced grasp of linear algebra, beyond the level of this course, and even beyond the level of the 500 level courses offered here. Some extra topics to consider:
Jordan canonical form
LU and QR decomposition
Jordan-Chevalley decomposition (semi-simple/nilpotent decomposition)
Symmetric and Skew symmetric linear operators
Bilinear maps and their relation to quadratic forms
Tensor product of vector spaces and multilinear maps
Exact sequences of vector spaces
Cokernels and Quotient vector spaces
Hermitian matrices
Symplectic linear algebra
...Wikipedia's list of linear algebra topics...

So while calculus is in many ways a box of tools for approaching functions in local and global ways, underneath it all linear algebra provides the scaffolding, turning up in the algorithms to calculate derivatives, in changes of variables, in the structure of vector arithmetic, and as a unifying structure for understanding operators like limits, derivatives, and integrals, and the sets of functions these operators work upon. Linear algebra is the framework necessary to understand many of the methods of approximation used by mathematicians. It underpins much of the blossoming field of machine learning, and is also indispensable for its computational methods, which find their way into many disciplines within the sciences. By the end of the course, I hope you will not only have had plenty of practice solving linear problems, but will also come away with an understanding and appreciation of the theory of linear systems, vector spaces, and linear maps.

Lectures and course notes

Lecture notes for lecture 1 - Introducing real linear systems of equations, matrices and augmented matrices, row operations, and the problems of existence and uniqueness of solutions to linear systems of equations.

Lecture notes for lecture 2 - The description of the Gauss-Jordan elimination algorithm, a complete solution to the problem of intersection of two lines in R2, and examples and discussion of row reduction, consistency, inconsistency, free variables, existence, and uniqueness.
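To make the algorithm concrete, here is a minimal Python/NumPy sketch of Gauss-Jordan elimination with partial pivoting (the function and the example system are my own illustrations, not taken from the lecture notes):

```python
import numpy as np

def rref(M, tol=1e-12):
    """Reduce an augmented matrix to reduced row echelon form (a minimal sketch)."""
    A = M.astype(float)
    rows, cols = A.shape
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Pick the row with the largest entry in this column (partial pivoting)
        pivot = pivot_row + np.argmax(np.abs(A[pivot_row:, col]))
        if abs(A[pivot, col]) < tol:
            continue                                    # free column, move on
        A[[pivot_row, pivot]] = A[[pivot, pivot_row]]   # swap rows
        A[pivot_row] /= A[pivot_row, col]               # scale the pivot to 1
        for r in range(rows):                           # clear the rest of the column
            if r != pivot_row:
                A[r] -= A[r, col] * A[pivot_row]
        pivot_row += 1
    return A

# Intersection of two lines in R2: x + y = 3 and x - y = 1
aug = np.array([[1, 1, 3],
                [1, -1, 1]])
print(rref(aug))  # unique solution x = 2, y = 1
```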

Notes on vector algebra from 233 - Contains a discussion of vector arithmetic and elementary properties of vectors (which are stated sufficiently generally to work as axioms of a vector space), the dot product, cross product, lines and planes. The notation and approach to a number of topics differ from those of this course (e.g. the use of angle bracket notation instead of writing real vectors as column vectors). The problems are purely optional, and were meant to provide some challenge and practice as extra credit when I taught 233 in fall 2017. Some problems, such as those in the section on dot products, will be relevant to discussions of orthogonality and inner products at the end of the course. Also, the vector spaces we refer to as Rn are given their own notation (to distinguish the space of points with n real coordinates from that space with the added structure of being a vector space; this is a distinction that for whatever reason is emphasized in many vector calculus courses, but which we brush aside for linear algebra).

Lecture notes for lecture 5 - Introducing matrix-vector products and the equation Ax=b. Also contains a discussion of dot products and geometry of planes, plus two challenge problems.

Lecture notes for lecture 6 - Describing general solution sets for Ax=b, using particular solutions and solutions of the homogeneous equation Ax=0. Further discussion of lines and planes, from the perspective of building parameterizations from their equations. Expands upon a challenge problem from the preceding lecture.

Lecture notes for lectures 7 and 8 - Linear dependence and independence, with a challenge problem on the idea that linearly independent sets are minimal generating sets for spans.

Lecture notes for lectures 9 and 10 - Introduction to linear transformations, featuring many 2D and 3D examples, as well as some formalities of definition. Review of terminology for describing functions. Features challenge problems on describing various geometric transformations of the plane and 3-space using matrices.

Lecture notes on Matrix Algebra - Representing a linear map by a matrix, computing matrix sums and products.

Lecture notes on Matrix Inverses - First, there is an introduction to inverses of functions. There is then the definition of inverses for square matrices, discussion of computation, properties, and characterization via the Inverse Matrix Theorem.

Lecture notes on determinants of square matrices - Discussion of determinants in 2 and 3 dimensions first, and relation to areas and volume, cross products and triple products. Then a discussion of multilinear maps, alternating multilinear maps, and the definition of determinants of square matrices in higher dimensions. There is a discussion of methods of computation, such as cofactor expansion and via row reduction. Finally discussion of hypervolumes and a brief mention of Cramer's rule and the adjugate formula for matrix inverses.
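As a small illustration of the 2-dimensional case discussed in these notes, the familiar ad - bc formula agrees with the general algorithm and computes the signed area of the parallelogram spanned by the columns (the example matrix is my own):

```python
import numpy as np

# Columns (3, 1) and (1, 2) span a parallelogram in the plane
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]  # ad - bc
assert np.isclose(det, np.linalg.det(A))     # matches the general computation
print(det)  # 5.0: the parallelogram has (signed) area 5
```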

Lecture notes Introducing Vector Spaces and Subspaces - Formal definition of a vector space (over an arbitrary field) is given, plus specific examples over the reals, the complex numbers, and the field of 2 elements. Polynomial spaces and function spaces are briefly introduced. Subspaces are defined and a test for identifying them is given. Linear transformations between vector spaces are defined, and the images and kernels of linear maps are seen to be subspaces of the codomain and domain respectively. For linear maps from Rn to Rm given via matrix multiplication we can associate the column space and the null space of the matrix, which are terms for the associated image and kernel.

Lectures on Basis, Dimension, and coordinates for vector spaces - A recollection of the ideas of linear independence and dependence, rephrased for the general case of vectors in an abstract vector space V, followed by discussion and definition of basis and dimension. Linear maps are studied using bases and the idea of linear extension. Isomorphisms are introduced, as well as coordinates relative to a basis.

Lectures on Row Space, Column Space, Null Space, and The Rank-Nullity Theorem - The relationship between the row space, column space, and null space is explored, culminating in the rank-nullity theorem (also called the rank theorem or the fundamental theorem of linear algebra). Some applications are demonstrated as well, such as Euler's formula for graphs.
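A quick numerical sanity check of the rank-nullity theorem (a hypothetical NumPy example, not taken from the lecture notes): the rank counts the nonzero singular values, and the remaining right-singular vectors span the null space.

```python
import numpy as np

# A made-up 3x4 matrix; its third row is the sum of the first two,
# so the rank is 2 even though there are 3 rows
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 1.0, 1.0]])

U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))        # dim(column space)

null_basis = Vt[rank:]               # remaining right-singular vectors span the null space
assert np.allclose(A @ null_basis.T, 0)

nullity = null_basis.shape[0]        # dim(null space)
assert rank + nullity == A.shape[1]  # rank-nullity: rank + nullity = number of columns
```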

Lectures Introducing Eigenvectors and Eigenvalues - Definitions of eigenvalues and eigenvectors, a discussion of the characteristic equation and multiplicities (both algebraic and geometric), as well as the equivalence relation of similarity. There is a brief introduction to applications at the end, which is probably better understood after studying the lectures on diagonalization - the slides of the final section were covered later in the course, after diagonalization was presented.

Lecture on Diagonalization (hand-written notes) - The lecture notes that were given to guest lecturer Angelica Simonetti while I was away at GSTGC 2018 at UIC (Chicago).

Lecture notes on the dot product, orthogonality, and inner product spaces - Summarizes the last week of lectures in this course. Topics include inner products, orthogonal projections onto a subspace, distances, orthonormal bases, and orthogonal transformations of 3-space.
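The algorithm for producing orthonormal bases referred to here and in the course overview is the Gram-Schmidt process; a minimal NumPy sketch (my own illustration, not taken from the notes) subtracts from each new vector its projections onto the directions already found, then normalizes:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for q in basis:
            w = w - np.dot(q, v) * q   # remove the component along each earlier direction
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

# Two independent vectors in R3 (a made-up example)
Q = gram_schmidt(np.array([[1.0, 1.0, 0.0],
                           [1.0, 0.0, 1.0]]))

# The rows of Q are orthonormal: Q Q^T is the identity
assert np.allclose(Q @ Q.T, np.eye(2))
```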