List 11.1 Presentation Topics
- Graph Theory: Adjacency Matrices (Justin S, Satvir, Min Chan)
In graph theory, a graph is a collection of \(n\) vertices with some edges that may connect two vertices together. (This is not like the graph of a function.) A graph's adjacency matrix is a symmetric matrix that records which vertices are connected to which using \(0\)s and \(1\)s. Explain how the \(k\)th power of an adjacency matrix provides a count of how many ways there are to get from one vertex to another in \(k\) steps. Demonstrate how an eigenvector for the dominant eigenvalue determines which vertices are “the most connected”.
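For a quick numerical check of these two claims, a sketch along the following lines may help; the four-vertex graph here is an invented example, not part of the assignment.

```python
import numpy as np

# Adjacency matrix of a small undirected graph on vertices 0..3 (invented example)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0]])

k = 3
Ak = np.linalg.matrix_power(A, k)
print(Ak)  # entry (i, j) counts the walks of length k from vertex i to vertex j

# Dominant eigenvalue and eigenvector (A is symmetric, so eigh applies)
vals, vecs = np.linalg.eigh(A)
dominant = vecs[:, np.argmax(vals)]
print(np.abs(dominant))  # larger entries suggest "more connected" vertices
```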
- Statistics: Linear Regression (Austin, Nahele, Nick)
In statistics, we might record pairs of variables for many individuals, and we might believe that in general these variables have a linear relationship (in the sense of \(y=mx+b\)). Explain how we can take a data set and interpret the hunt for \(m\) and \(b\) as a flawed attempt to solve an equation of the form \(A\vec{c}=\vec{b}\) for \(\vec{c}\text{.}\) Then explain how we can find a “least-squares” solution: a vector \(\vec{c}\) such that the distance between \(A\vec{c}\) and \(\vec{b}\) is as small as possible.
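A minimal sketch of this least-squares computation, with an invented data set, might look like this; the normal equations \(A^TA\vec{c}=A^T\vec{b}\) are one standard way to find the least-squares solution.

```python
import numpy as np

# Invented data: x values and noisy y values that are roughly linear
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

# The (usually unsolvable) system A c = y, with c = [m, b]^T
A = np.column_stack([x, np.ones_like(x)])

# Least-squares solution: solve the normal equations A^T A c = A^T y
c = np.linalg.solve(A.T @ A, A.T @ y)
m, b = c
print(m, b)  # slope and intercept that minimize ||A c - y||
```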
- Computer Graphics: 3D Moving Objects on a 2D screen (Justin L, Adam)
In simple computer graphics situations, a simple 3D object is represented on a 2D screen. Demonstrate how this is done. Start by reading Section 2.7. You are interested in the parts on 3D graphics, but the first part will help you understand the rest better. Take an object in \(\mathbb{R}^3\) (a cube, with its eight vertices, would be fine). Create an interesting rotation matrix for action in \(\mathbb{R}^3\text{,}\) apply it to the object, and project both the original object and its rotated version onto a 2D screen using a center of projection. I recommend using GeoGebra software to demonstrate.
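If a non-GeoGebra sanity check is useful, here is a rough sketch of the same pipeline; the cube, the rotation angle, and the center of projection \((0,0,d)\) are all invented choices.

```python
import numpy as np

# Eight vertices of a unit cube, one vertex per column
cube = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)]).T

# Rotation about the z-axis by 30 degrees
t = np.pi / 6
R = np.array([[np.cos(t), -np.sin(t), 0],
              [np.sin(t),  np.cos(t), 0],
              [0,          0,         1]])
rotated = R @ cube

def project(points, d=5.0):
    """Perspective projection onto the plane z = 0 from the center of projection (0, 0, d)."""
    x, y, z = points
    return np.vstack([d * x / (d - z), d * y / (d - z)])

print(project(cube))     # 2D image of the original cube
print(project(rotated))  # 2D image of the rotated cube
```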
- Calculus: Fourier Series (Alex, Jeremy, Ian T)
This topic allows you to prove that \(\sum\limits_{n=1}^{\infty}\frac{1}{n^2}=\frac{\pi^2}{6}\text{.}\) Let \(I\) be the set of integrable functions on \([-\pi,\pi]\text{.}\) Show that we can define an inner product on \(I\) according to \(\langle f,g\rangle=\int_{-\pi}^{\pi}f(t)g(t)\,dt\text{.}\) Convince the audience that the collection
\begin{equation*} S=\left\{\frac{1}{\sqrt{2\pi}},\frac{\sin(x)}{\sqrt{\pi}},\frac{\cos(x)}{\sqrt{\pi}},\frac{\sin(2x)}{\sqrt{\pi}},\frac{\cos(2x)}{\sqrt{\pi}},\ldots\right\} \end{equation*}
is an orthonormal set. Find the projection of \(f\) onto \(S\text{,}\) where \(f(x)=x^2\) (restricted to \([-\pi,\pi]\)). This projection, \(\mathrm{proj}_{\mathrm{Span}(S)}(f)\text{,}\) is known as the Fourier series for \(f\text{.}\) An advanced theorem tells us that \(f=\mathrm{proj}_{\mathrm{Span}(S)}(f)\) for inputs in \((-\pi,\pi)\text{.}\) Evaluate both sides at \(x=\pi/3\) to get a famous and interesting equation.
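Before doing the exact computation, a numerical sanity check of the projection can be reassuring. The sketch below approximates the inner products against the orthonormal set \(S\) with a crude Riemann sum and compares the truncated series with \(f(x)=x^2\) at \(x=\pi/3\); the grid size and truncation point \(N\) are arbitrary choices.

```python
import numpy as np

N = 200                                   # how many sine/cosine pairs to keep
x = np.linspace(-np.pi, np.pi, 20001)
dx = x[1] - x[0]
f = x**2

def inner(g, h):
    """Approximate <g, h> = integral of g*h over [-pi, pi] by a Riemann sum."""
    return np.sum(g * h) * dx

# Projection of f onto Span(S), one orthonormal basis function at a time
series = inner(f, np.full_like(x, 1 / np.sqrt(2 * np.pi))) * (1 / np.sqrt(2 * np.pi))
for n in range(1, N + 1):
    series = series + inner(f, np.cos(n * x) / np.sqrt(np.pi)) * np.cos(n * x) / np.sqrt(np.pi)
    series = series + inner(f, np.sin(n * x) / np.sqrt(np.pi)) * np.sin(n * x) / np.sqrt(np.pi)

i = np.argmin(np.abs(x - np.pi / 3))
print(series[i], (np.pi / 3) ** 2)        # the two values should be close
```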
- Cryptography: Encryption Key Matrices (Josh, Mike C.)
Invertible matrices can sometimes be used for encryption. A message (or part of a message) is converted into a vector, and the matrix transforms the vector into a new vector that is seemingly nonsense. Explain how this kind of encryption and the corresponding decryption work. You will need to explain a little bit about modular arithmetic (for example, mod \(26\)). You should recognize and explain that there are special conditions on which matrices may be used for this process.
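A small Hill-cipher-style sketch of this idea follows; the key matrix and the message are invented, and the key's inverse mod \(26\) was found with the adjugate formula. Note the special condition: the key's determinant must be coprime to \(26\).

```python
import numpy as np

key = np.array([[3, 3],
                [2, 5]])          # det = 9, gcd(9, 26) = 1, so a valid key
key_inv = np.array([[15, 17],
                    [20,  9]])    # inverse of key mod 26

def encode(text):
    """Turn an even-length string of capital letters into 2-vectors (columns)."""
    return np.array([ord(c) - ord('A') for c in text]).reshape(-1, 2).T

def decode(nums):
    """Turn columns of numbers mod 26 back into letters."""
    return ''.join(chr(int(n) % 26 + ord('A')) for n in nums.T.flatten())

msg = encode("HELP")
cipher = key @ msg % 26
print(decode(cipher))                 # seemingly nonsense
print(decode(key_inv @ cipher % 26))  # recovers "HELP"
```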
- Vector Calculus: The Second Derivative Test (Elijah, Jae, Jared)
If \(f\) is a function of two variables, the graph of \(z = f(x,y)\) is a surface in \(\mathbb{R}^3\text{.}\) In vector calculus, we learn that potential extrema of \(f\) happen at critical points: points \((x_0,y_0)\) where \(\nabla f(x_0,y_0) = \vec{0}\text{.}\) We also learn a second derivative test for discerning whether a given critical point is a local maximum, local minimum, or saddle point for \(f\text{.}\) Present this second derivative test and explain why it works, using eigenvalues and a diagonalization of the Hessian matrix. (The second derivative test for two-variable functions can be found in Stewart's Calculus book, in Section 11.7.)
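For a concrete illustration of the eigenvalue point of view, here is a hedged sketch; the function \(f(x,y)=x^3-3x+y^2\) and its two critical points \((\pm1,0)\) are invented for the example.

```python
import numpy as np

def hessian(x, y):
    # Hessian of f(x, y) = x^3 - 3x + y^2
    return np.array([[6 * x, 0],
                     [0,     2]])

for (x0, y0) in [(1, 0), (-1, 0)]:          # the two critical points of this f
    vals = np.linalg.eigvalsh(hessian(x0, y0))
    if np.all(vals > 0):
        kind = "local minimum"
    elif np.all(vals < 0):
        kind = "local maximum"
    elif np.any(vals > 0) and np.any(vals < 0):
        kind = "saddle point"
    else:
        kind = "test inconclusive"
    print((x0, y0), kind)
```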
- Classifying all \(2\times2\) matrices (James W, Ross)
As we know, \(2\times2\) matrices provide linear transformations on the plane. Classify all \(2\times2\) matrices by what they do geometrically to the plane. Get started by considering what kinds of eigenvalues a matrix could have:
  - two distinct real
  - one repeated real
  - two complex conjugates (this may take the bulk of your explanation)
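As a starting point for gathering examples of each kind, a quick sketch like the following may help; the three matrices here are invented representatives, not a classification.

```python
import numpy as np

examples = {
    "two distinct real":      np.array([[2.0, 0.0], [0.0, 0.5]]),   # a scaling
    "one repeated real":      np.array([[1.0, 1.0], [0.0, 1.0]]),   # a shear
    "two complex conjugates": np.array([[0.0, -1.0], [1.0, 0.0]]),  # a rotation
}

for label, M in examples.items():
    print(label, np.linalg.eigvals(M))
```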
- Recursive Sequences: Fibonacci and Tribonacci (Erin, McKenzie, Jen)
Explain how powers of the matrix \(\begin{bmatrix}0&1\\1&1\end{bmatrix}\) produce the Fibonacci sequence. (It's not enough to just demonstrate some powers of this matrix; you need to give an explanation.) Use this matrix and diagonalization to find an explicit formula for \(F_n\text{,}\) the \(n\)th term of the Fibonacci sequence, and show how the formula can be used to quickly find very large Fibonacci numbers. Explain how the same method could be used to find formulas for other, similar sequences defined by recurrence relations. For example, the Tribonacci relation is \(T_{n+1} = T_n + T_{n-1} + T_{n-2}\text{,}\) and the first terms in the sequence are \(T_0 = 0\text{,}\) \(T_1 = 1\text{,}\) and \(T_2 = 1\text{.}\)
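A numerical sketch of both claims (not a substitute for the explanation asked for above): powers of the matrix produce Fibonacci numbers, and diagonalizing gives a Binet-style explicit formula.

```python
import numpy as np

A = np.array([[0, 1],
              [1, 1]])

n = 10
print(np.linalg.matrix_power(A, n))    # F_n appears as the (0, 1) and (1, 0) entries

# Diagonalize A = P D P^{-1}; then A^n = P D^n P^{-1}, which yields the explicit formula
phi = (1 + np.sqrt(5)) / 2             # the eigenvalues of A are phi and 1 - phi
psi = 1 - phi
F_n = (phi**n - psi**n) / np.sqrt(5)   # Binet's formula, read off from the diagonalization
print(round(F_n))                      # matches the matrix computation: F_10 = 55
```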
- Quadric Curves (Laura, James H, Andrew)
Explain how to take a generic quadratic equation in two variables
\begin{equation*} Ax^2+Bxy+Cy^2+Dx+Ey+F=0 \end{equation*}
and determine what it looks like. By using basic algebra and linear algebra (in particular, diagonalizability of symmetric matrices) you can identify the “center” of the curve; whether it is an ellipse, hyperbola, parabola, or something “degenerate”; what the orientation of the primary axes is; and what the “eccentricity” is.
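A small sketch of the diagonalization step, with an invented conic \(5x^2+4xy+2y^2-6=0\): write the quadratic part as \(\vec{x}^{\,T}M\vec{x}\) for a symmetric matrix \(M\) and read off the type and principal axes from its eigenvalues and eigenvectors.

```python
import numpy as np

A, B, C = 5.0, 4.0, 2.0            # quadratic part of 5x^2 + 4xy + 2y^2 - 6 = 0
M = np.array([[A,     B / 2],
              [B / 2, C    ]])

vals, vecs = np.linalg.eigh(M)
print(vals)   # both positive here: an ellipse (opposite signs: hyperbola; a zero: parabolic/degenerate)
print(vecs)   # the columns are the directions of the principal axes
```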
- Markov Chains (Mohammed, Marcus, Ryan)
A Markov chain captures the idea of a collection of values iterating in a regular way. Present what a Markov chain is, what a stochastic matrix is, and what can be said about the eigenvalues of a stochastic matrix. Give several application examples.
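One possible illustration, with invented transition probabilities: the eigenvector for eigenvalue \(1\) of a stochastic matrix gives a steady-state vector, and repeated iteration approaches it.

```python
import numpy as np

# Column-stochastic matrix: column j holds the probabilities of moving from state j
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

vals, vecs = np.linalg.eig(P)
steady = vecs[:, np.argmin(np.abs(vals - 1))]
steady = steady / steady.sum()     # normalize the eigenvector for eigenvalue 1
print(steady)                      # [2/3, 1/3]

x = np.array([0.5, 0.5])
for _ in range(50):
    x = P @ x
print(x)                           # iteration converges to the same steady state
```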
- Linear Systems of Differential Equations (Oleg, Khoa, Junboom)
A linear differential equation in \(n\) functions of time is an equation like
\begin{equation*} w_1'=a_1w_1+a_2w_2+\cdots+a_nw_n \end{equation*}
where the \(w_i\) are functions of time and the \(a_i\) are constants. Explain how linear algebra can be used to solve a system of linear differential equations, focusing on systems of two equations in two unknowns. Explore the paths taken by solution curves where \(x=w_1(t)\) and \(y=w_2(t)\text{.}\) Explore how eigenvalues and eigenspaces affect these paths.
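A brief sketch for a \(2\times2\) system \(\vec{w}\,'=A\vec{w}\), with an invented coefficient matrix: when \(A\) is diagonalizable, solutions are combinations of \(e^{\lambda t}\) times eigenvectors, and the eigenvalues govern the shape of the paths.

```python
import numpy as np

A = np.array([[0.0,  1.0],
              [-2.0, -3.0]])

vals, vecs = np.linalg.eig(A)
print(vals)   # eigenvalues -1 and -2: every solution decays toward the origin

def solution(t, c1=1.0, c2=1.0):
    """General solution w(t) = c1 e^{l1 t} v1 + c2 e^{l2 t} v2 for this diagonalizable A."""
    return c1 * np.exp(vals[0] * t) * vecs[:, 0] + c2 * np.exp(vals[1] * t) * vecs[:, 1]

print(solution(0.0))   # (x, y) = (w1, w2) at t = 0; evaluating over a range of t traces a path
```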
- Sums of Powers (Suhwan, Gloria, Ian M)
Discuss two bases for polynomials of degree \(k\) or less: \(\{1,x,x^2,\ldots,x^{k}\}\) and \(\left\{\binom{x}{0},\binom{x}{1},\binom{x}{2},\ldots,\binom{x}{k}\right\}\text{.}\) Explain how a change of basis matrix makes it easy to come up with simple formulas for sums of powers like \(1^2+2^2+3^2+\cdots+n^2\text{.}\)
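A hedged sketch of the idea for \(k=2\): write \(x^2\) in the binomial-coefficient basis by solving the triangular change-of-basis system, then sum using the identity \(\sum_{x=0}^{n}\binom{x}{j}=\binom{n+1}{j+1}\text{.}\) The numbers below are only a check of the invented example.

```python
from math import comb

# Find coefficients c_j with x^2 = sum_j c_j * C(x, j) by matching values at x = 0, 1, 2
# (forward substitution through the triangular change-of-basis system)
c = []
for j in range(3):
    c.append(j**2 - sum(c[i] * comb(j, i) for i in range(j)))
# c == [0, 1, 2], i.e. x^2 = C(x, 1) + 2*C(x, 2)

n = 10
formula = sum(c[j] * comb(n + 1, j + 1) for j in range(3))
print(formula, sum(x**2 for x in range(1, n + 1)))   # both 385
```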