This is calculus. I'm Robert Ghrist, professor,
Mathematics and Electrical and Systems Engineering,
at the University of Pennsylvania, and you are about to have a dream
of multivariable calculus. You're having a dream. A very strange dream in which
the ghosts of calculus past have come to haunt you,
showing you series everywhere you look. The ghost of calculus future now
comes to show you what you might see in multivariable calculus. Calculus, as ever, begins with functions,
functions that have inputs and outputs. But in multivariable calculus,
one considers functions with multiple inputs and perhaps multiple outputs. Such functions are common in so
many applications. In dynamics, in market systems, in looking
at digital images, anything having to do with data will involve functions with
multiple inputs and multiple outputs. How do we do calculus for such functions? All of those inputs and
outputs can lead to notational complexity. But with the appropriate data structure, the story of calculus
will remain the same. What is the right data structure for doing
calculus with multivariate functions? The appropriate data structure
is that of a matrix. A matrix is simply an array of numbers. For example, a four by three matrix
consists of four rows and three columns. Certain matrices are especially
useful in calculus. These are the square matrices: for example, a three by three matrix,
or a two by two matrix, or even a one by one matrix, which you are
used to thinking of as simply a number. Certain matrices have
wonderful properties. For example, the identity matrix,
often denoted I, is a square matrix with ones along the diagonal and
zeros off the diagonal. Why is this called the identity matrix? This is connected to matrix algebra. One of the first tasks in multivariable
calculus is learning matrix algebra. For example,
you can multiply matrices together. We could take, say, a two
by three matrix and a three by three matrix and
multiply them together as follows. What one does is multiply
the rows of the first matrix and the columns of the second matrix
in a particular manner: the first term in the first row is multiplied by
the first term in the first column. To this is added the product of the second terms,
the third terms, et cetera. So, for example,
the first row of the first matrix, (3, 1, 0), is multiplied by the first column
of the second matrix, (2, 1, -2). The answer is three times two,
plus one times one, plus zero times negative two, or seven. One fills in all of the other
slots of the product matrix through a similar method.
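The row-by-column rule just described can be sketched in a few lines of Python. Note that only the first row (3, 1, 0) of the first matrix and the first column (2, 1, -2) of the second come from the lecture; the remaining entries are invented for illustration.

```python
# Row-by-column matrix multiplication, using plain Python lists of lists.

def mat_mul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B."""
    m, n, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

# A 2 x 3 matrix whose first row is (3, 1, 0); the second row is invented.
A = [[3, 1, 0],
     [2, 4, 5]]
# A 3 x 3 matrix whose first column is (2, 1, -2); the rest is invented.
B = [[ 2, 0, 1],
     [ 1, 3, 0],
     [-2, 1, 4]]

C = mat_mul(A, B)
print(C[0][0])  # 3*2 + 1*1 + 0*(-2) = 7
```

The entry in row i, column j of the product comes from row i of the first matrix paired against column j of the second, just as in the worked example.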
In the end, matrix algebra has some very nice
properties. The identity matrix is something like the number one,
in that it does not change a matrix when you multiply by the identity. Other matrices have similar,
interesting numerical properties. For example,
consider the matrix A, the two by two matrix with entries 0, -1, 1, 0. Since A is a square matrix,
we can multiply it by itself. What happens when we square A? We will get negative ones on the diagonal,
and zeros off the diagonal. That is, A squared is something
like negative one in matrix algebra. So, this matrix A is something
like the square root of -1. You may wish to remember that little fact.
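A quick numerical check of that fact, sketched in plain Python (the `mat_mul` helper is just the row-by-column rule written out by hand):

```python
# Square the matrix A = [[0, -1], [1, 0]] and compare it with -I.

def mat_mul(A, B):
    """Row-by-column product of two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[0, -1],
     [1,  0]]

A_squared = mat_mul(A, A)
print(A_squared)  # [[-1, 0], [0, -1]]: negative the identity matrix
```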
As data structures, matrices work well with vectors. One way to think of a vector is as
a difference between two points. One considers two points in a dimensional
space and looks at their difference. This gives an object that has both
a magnitude or a length, and a direction. For example,
a planar vector has two components: the change in the x direction, and
the change in the y direction. A vector with four components is something
that you might call four-dimensional. And you might represent it as
a column vector, or an n by 1 matrix. Now vectors also have an algebra. You can add vectors together in a way that
you've probably done before in geometry class, by moving them head to tail and
looking at the resultant vector. Of course, vector addition is commutative,
unlike matrix multiplication, and it has some wonderful properties.
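Componentwise, head-to-tail addition is a one-liner; the two example vectors below are invented for illustration:

```python
# Add two planar vectors componentwise; the order of addition doesn't matter.

def vec_add(v, w):
    return [vi + wi for vi, wi in zip(v, w)]

v = [3, 1]    # an example planar vector (invented)
w = [-2, 4]   # another example planar vector (invented)

print(vec_add(v, w))  # [1, 5]
print(vec_add(w, v))  # [1, 5] -- the same resultant vector
```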
Vectors relate to calculus by encoding rates of change in multiple variables. This leads to the first key idea
in multivariable calculus. I want you to remember that the derivative
of a function at a point is not a number. It is, rather, a matrix. When you have multiple inputs and
multiple outputs, you can think of the rates
of change of a particular output with respect to change
in a particular input. These are called partial derivatives, and they are extremely useful in calculus.
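As a rough sketch of the idea, one can estimate this matrix of partial derivatives numerically by nudging one input at a time. The example map f, the base point, and the step size h below are all invented for illustration.

```python
# Approximate the matrix of partial derivatives of a map from R^2 to R^2
# at a point, by nudging one input at a time (forward differences).

def jacobian(f, x, h=1e-6):
    fx = f(x)
    J = []
    for i in range(len(fx)):          # one row per output
        row = []
        for j in range(len(x)):       # one column per input
            x_step = list(x)
            x_step[j] += h            # nudge only the j-th input
            row.append((f(x_step)[i] - fx[i]) / h)
        J.append(row)
    return J

# A made-up example map: f(x, y) = (x*y, x + y**2).
f = lambda p: [p[0] * p[1], p[0] + p[1] ** 2]
J = jacobian(f, [2.0, 3.0])
# The exact partials at (2, 3) are [[y, x], [1, 2*y]] = [[3, 2], [1, 6]].
print([[round(entry, 3) for entry in row] for row in J])
```

Each row collects the rates of change of one output; each column corresponds to one input, which is exactly the matrix shape described above.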
Why is the data structure of a matrix important? It is because the algebra of
matrices mirrors what functions and their derivatives do. For example, the chain rule is
manifested as matrix multiplication. This and other examples of matrix algebra are extremely useful in
multivariable calculus. We also need matrices, to solve systems
of ordinary differential equations. When there are multiple variables,
then we need multivariable calculus. Consider, for example, the simple system x prime equals minus y and y prime equals
x, where x and y are functions of t. This is a coupled system: the derivative of x depends upon y, and the derivative of y depends upon x. We can recast this as a matrix equation by defining a vector variable, capital X, that has x(t) in the first slot and
y(t) in the second slot. Then what is the derivative
of this vector, capital X? We can write that as a product
of the two by two matrix A, 0, -1, 1, 0, with the vector X. This is the multivariate
analog of the simple ODE: X prime equals AX. But now X and X prime are vectors, and
A is a square matrix. What's the solution to
this equation going to be? Well, we've seen an equation
like this before. Let's see if the solution
bears out the pattern. What I want you to remember is that
the solution to X prime equals AX is the exponential whether we
are talking about scalars or vectors. Here the solution is X of t
equals e to the At times X0, where, as you may recall,
X0 is the initial condition, in this case, initial conditions,
since there's more than one variable. And what is e to the At? How do I exponentiate a matrix? Well, how do we
exponentiate anything else? Of course, we do so
via a series expansion. Since A is a square matrix, you
can take it to the second power, the third power,
the fourth power, et cetera. The exponential of a matrix A,
e to the A, is 1 plus A, plus one-half A squared, plus one over
three factorial A cubed, et cetera. Wait a minute,
what do I mean by 1 at the beginning? I mean the identity matrix I, and so the solution to our differential
equation involves multiplying the matrix A by t and then exponentiating this. This is the solution to our linear system.
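As a sketch of the whole story, the series for e to the At can be summed numerically. For A = 0, -1, 1, 0, the result should match the rotation matrix with entries cos t, -sin t, sin t, cos t, since the solutions of x' = -y, y' = x trace circles. The helper functions and the cutoff of 30 series terms are choices made here for illustration.

```python
import math

# Sum the matrix exponential series e^{At} = I + At + (At)^2/2! + ...
# for the 2 x 2 matrix A = [[0, -1], [1, 0]] from the lecture.

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm(M, terms=30):
    n = len(M)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # I
    term = [row[:] for row in result]                               # current M^k/k!
    for k in range(1, terms):
        term = mat_mul(term, M)                                 # build up M^k
        term = [[entry / k for entry in row] for row in term]   # divide by k -> M^k/k!
        result = [[r + t for r, t in zip(row_r, row_t)]
                  for row_r, row_t in zip(result, term)]
    return result

t = 1.0
A = [[0.0, -1.0],
     [1.0,  0.0]]
At = [[entry * t for entry in row] for row in A]

E = expm(At)
# Should agree with the rotation matrix [[cos t, -sin t], [sin t, cos t]].
print([[round(entry, 6) for entry in row] for row in E])
```

Applying this matrix to the initial condition X0 rotates it through angle t, which is exactly the circular motion the coupled system describes.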
Remember this; you will see it again some day. Now, of course, calculus doesn't
end with derivatives or differential equations; we still have
the notion of integrals to worry about. You've already seen a little bit of
what will come in the sense of multiple integrals, double integrals,
triple integrals, or even more. Without going into any details,
the one thing that I want you to remember is that when you're integrating and you want to use substitution,
you're going to have to use derivatives. That means you're going to have to understand derivatives, and the matrices
associated with them, very well. If you have a vector of variables,
X, and a new vector of variables, U, that is related to X by some function,
then what is dU? What is dX? And how are they related? Here's a hint: it has something
to do with matrices and derivatives.
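Here is a hedged peek at the answer: under a change of variables, small volumes scale by the determinant of the matrix of partial derivatives. The linear map below, with its particular coefficients, is invented for illustration.

```python
# For a linear map u = a*x + b*y, v = c*x + d*y, the matrix of partial
# derivatives is [[a, b], [c, d]], and areas scale by |a*d - b*c|.
a, b, c, d = 2.0, 1.0, 0.0, 3.0   # made-up coefficients

def F(x, y):
    return (a * x + b * y, c * x + d * y)

# Map the unit square's edge vectors and compute the image
# parallelogram's area via the 2D cross product.
u1, v1 = F(1.0, 0.0)   # image of the edge (1, 0)
u2, v2 = F(0.0, 1.0)   # image of the edge (0, 1)
area = abs(u1 * v2 - u2 * v1)
print(area)            # equals |a*d - b*c| = 6.0
```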
Remember, you'll see this change of variables formula again. The last set of grand ideas in
multivariate calculus concerns fields. We'll begin by looking at vector fields. A vector field is an arrangement of
a vector at every point in space. Such objects model say
the motion of a fluid or certain quantities, intromegnetics. How would you talk about
the derivative of such a field. What does it mean to
differentiate a field of vectors? How do you integrate with
respect to a vector field? What does that even mean? We'll spend a lot of time
answering those questions. What I want you to remember when you
see multivariable calculus is that it tells the same story as single
variable calculus only with a few new characters and
data structures and a lot more action. I hope that when you do take
multivariable calculus that you remember having had this wonderful, weird dream
about what multivariable calculus is, how it follows single-variable calculus,
and how important the subject is. Till then, sweet dreams.