
This is Calculus.

I'm Robert Ghrist, Professor of Mathematics
and Electrical & Systems Engineering at the
University of Pennsylvania.
And you are about to have a dream of
multivariable calculus.
You're having a dream, a very strange
dream in which the ghosts of calculus
past come to haunt you, showing you
series everywhere you look.
The ghost of calculus future now comes to
show you what you might see in
multivariable calculus.
Calculus as ever, begins with functions,
functions that have inputs and outputs.
But in multivariable calculus, one
considers functions with multiple inputs
and perhaps multiple outputs.
Such functions are common in so many
applications, in dynamics, in market
systems, in looking at digital images.
Anything having to do with data will
involve functions with multiple inputs
and multiple outputs.
How do we do calculus for such functions?
All of those inputs and outputs can lead
to notational complexity.
But with the appropriate data structure,
the story of calculus will remain the
same.
What is the right data structure for
doing calculus with multivariate
functions?
The appropriate data structure is that of
a matrix.
A matrix is simply an array of numbers.
For example, a 4-by-3 matrix consists of
four rows and three columns, and certain
matrices are especially useful in
calculus.
These are the square matrices.
For example, a 3-by-3 matrix or a 2-by-2
matrix or even a 1-by-1 matrix, which you
are used to thinking of as simply a
number.
Certain matrices have wonderful
properties.
For example, the identity matrix, often
denoted I, is a square matrix
consisting of ones along the diagonal and
zeroes off the diagonal.
Why is this called the identity matrix?
This is connected to Matrix Algebra.
One of the first tasks of multivariable
calculus is learning Matrix Algebra.
For example, you can multiply matrices
together.
We could take, say, a 2-by-3 matrix and a
3-by-3 matrix and multiply them together as
follows.
What one does is multiply the rows of
the first matrix against the columns of the
second in a particular manner.
The first entry of the row is
multiplied by the first entry of the
column; to this is added the product of
the second entries, then the third,
et cetera.
So for example, the first row of the
first matrix shown, 3, 1, 0, is multiplied by
the first column of the second matrix: 2, 1,
negative 2.
The answer is 3 times 2 plus 1 times 1
plus 0 times negative 2 or 7.
One fills in all of the other slots of
the product matrix through a similar
method.
Now, in the end, there are some very nice
properties to Matrix Algebra.
The identity matrix is something like the
number 1, in that it does not change a
matrix when you multiply by the identity.
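This can be checked numerically. Here is a sketch in Python with numpy; the lecture only specifies the first row (3, 1, 0) of the first matrix and the first column (2, 1, negative 2) of the second, so every other entry below is made up for illustration.

```python
import numpy as np

# First row (3, 1, 0) and first column (2, 1, -2) are from the lecture;
# the remaining entries are illustrative placeholders.
M = np.array([[3, 1, 0],
              [5, 2, 4]])
N = np.array([[ 2, 0, 1],
              [ 1, 3, 0],
              [-2, 1, 2]])

P = M @ N
print(P[0, 0])   # 3*2 + 1*1 + 0*(-2) = 7

# The identity matrix acts like the number 1: multiplying by it changes nothing.
I = np.eye(3, dtype=int)
print(np.array_equal(I @ N, N))  # True
```

The `@` operator is numpy's matrix multiplication, which carries out exactly the row-times-column recipe described above.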
Other matrices have similar, interesting
numerical properties.
For example, consider the 2-by-2 matrix A
with entries 0, negative 1, 1, 0.
Since A is a square matrix, we can
multiply it by itself.
What happens when we square A?
We will get negative 1s on the diagonal
and 0s off the diagonal.
That is, A squared is something like
negative 1 in Matrix Algebra.
So this matrix A is something like the
square root of negative 1.
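A one-line numerical check of that little fact, in Python with numpy:

```python
import numpy as np

# The matrix from the lecture, with entries 0, -1, 1, 0.
A = np.array([[0, -1],
              [1,  0]])

A2 = A @ A
print(A2)
# Negative 1s on the diagonal, 0s off the diagonal:
# A squared equals -I, so A behaves like the square root of -1.
print(np.array_equal(A2, -np.eye(2, dtype=int)))  # True
```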
You may wish to remember that little fact.
As a data structure, matrices work well
with vectors.
One way to think about vectors is as the
difference between two points.
One considers two points in n-dimensional
space and looks at their
difference.
This gives an object that has both a
magnitude or length and a direction.
For example, a planar vector has two
components to it, the change in the x
direction and the change in the y
direction.
A vector with four components is
something that you might call
four-dimensional, and you might represent
it as a column vector or an n-by-1
matrix.
Now, vectors also have an algebra: you
can add vectors together in a way that
you've probably done before in geometry
class, by moving them head to tail and
looking at the resultant vector.

Of course, vector addition is


commutative, unlike matrix multiplication
and it has some wonderful properties.
Vectors relate to calculus by encoding
rates of change in multiple variables.
This leads to the first key idea in
multivariable calculus.
I want you to remember that the
derivative of a function at a point is
not a number.
It is rather a matrix.
When you have multiple inputs and
multiple outputs, you can think of the
rates of change of a particular output
with respect to change in a particular
input.
These are called partial derivatives and
they are extremely useful in calculus.
But why is the data structure of a matrix
important? It is because the algebra of
matrices mirrors what functions and their
derivatives do.
For example, the chain rule is manifested
as matrix multiplication. This and other
examples of Matrix Algebra are extremely
useful in multivariable calculus.
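That claim about the chain rule can be tested numerically: the derivative matrix (the matrix of partial derivatives) of a composition is the product of the derivative matrices. The maps f and g below are made-up examples, and the partial derivatives are approximated by finite differences rather than computed exactly.

```python
import numpy as np

# Two illustrative maps from the plane to the plane.
def g(p):
    x, y = p
    return np.array([x * y, x + y])

def f(q):
    u, v = q
    return np.array([np.sin(u), u * v])

def jacobian(func, p, h=1e-6):
    """Matrix of partial derivatives, via central finite differences."""
    p = np.asarray(p, dtype=float)
    n = len(func(p))
    J = np.zeros((n, len(p)))
    for j in range(len(p)):
        dp = np.zeros_like(p)
        dp[j] = h
        J[:, j] = (func(p + dp) - func(p - dp)) / (2 * h)
    return J

p = np.array([0.5, 1.5])
# Chain rule: derivative of f∘g at p equals J_f(g(p)) times J_g(p).
lhs = jacobian(lambda q: f(g(q)), p)
rhs = jacobian(f, g(p)) @ jacobian(g, p)
print(np.allclose(lhs, rhs, atol=1e-4))  # True
```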
We also need matrices to solve systems of
ordinary differential equations.
When there are multiple variables, then
we need multivariable calculus.
Consider, for example, the simple system x
prime equals minus y and y prime equals x,
where x and y are functions of t.
This is a coupled system.
The derivative of x depends upon y, the
derivative of y depends upon x.
We can recast this as a matrix equation
by defining a vector variable, capital X
that has x of t in the first slot and y
of t in the second slot.
Then what is the derivative of this
vector, capital X?
We can write that as a product of the
2-by-2 matrix A, 0, negative 1, 1, 0, with
the vector X.
This is the multivariate analogue of the
simple ODE x prime equals ax.
But now, X and X prime are vectors and A
is a square matrix.
What's the solution to this equation
going to be?
Well, we've seen an equation like this
before.
Let's see if the solution bears out the
pattern.
What I want you to remember is that the
solution to X prime equals AX is the
exponential, whether we're talking about
scalars or vectors.

Here, the solution is X of t equals e to
the At times X naught.
Where, as you may recall, X naught is the
initial condition (in this case, initial
conditions, since there's more than one
variable). And what is e to the At?
How do I exponentiate a matrix?
Well, how do we exponentiate anything
else?
Of course, we do so via a series
expansion.
Since A is a square matrix, you can take
it to the 2nd power, the 3rd power, the
4th power, etcetera.
The exponential of a matrix A, e to the A,
is 1 plus A plus one-half A squared
plus one over three factorial A cubed,
et cetera.
Oh, wait a minute, what do I mean by 1 at
the beginning?
I mean the identity matrix I.
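That series can be carried out numerically, truncating after a fixed number of terms. For the particular matrix A above, a standard fact (not proved in this lecture) is that e to the At is the rotation matrix through angle t, which gives something to check against.

```python
import numpy as np
from math import factorial

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

def expm_series(M, terms=20):
    """Matrix exponential via the series I + M + M^2/2! + M^3/3! + ..."""
    result = np.eye(M.shape[0])
    power = np.eye(M.shape[0])
    for k in range(1, terms):
        power = power @ M
        result = result + power / factorial(k)
    return result

t = 1.0
E = expm_series(A * t)
# For this A, e^{At} is rotation through angle t.
R = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
print(np.allclose(E, R))  # True
```

In practice one would use a library routine (e.g. scipy's `linalg.expm`) rather than the raw series, but the series is the definition.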
And so, the solution to our differential
equation involves multiplying the matrix
A by t, and then exponentiating this.
This is the solution to our linear
system.
Remember this, you will see it again
someday.
Now, of course, calculus doesn't end with
derivatives or differential equations.
We still have the notion of integrals to
worry about.
You've already seen a little bit of what
will come in the sense of multiple
integrals, double integrals, triple
integrals, or even more.
Without going into any details, the one
thing that I want you to remember is that
when you are integrating and you want to
do u-substitution, you are going to have
to use derivatives.
That means you are going to have to
understand derivatives and the matrices
associated with them very well.
If you have a vector of variables X, and
a new vector of variables U that is
related to X by some function, then what
is dU? What is dX?
And how are they related?
Here's a hint: it has something to do
with matrices and derivatives.
Remember, you'll see this change of
variables formula again.
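As a small preview of that hint, consider the familiar change to polar coordinates, where x equals r cosine theta and y equals r sine theta. The matrix of partial derivatives of this change of variables has a determinant, and that determinant is exactly the factor r in "dx dy equals r dr d-theta." A numerical sketch, approximating the partial derivatives by finite differences:

```python
import numpy as np

# The polar change of variables (x, y) = (r cos θ, r sin θ).
def polar(p):
    r, theta = p
    return np.array([r * np.cos(theta), r * np.sin(theta)])

def jacobian(func, p, h=1e-6):
    """2x2 matrix of partial derivatives, via central finite differences."""
    p = np.asarray(p, dtype=float)
    J = np.zeros((2, 2))
    for j in range(2):
        dp = np.zeros(2)
        dp[j] = h
        J[:, j] = (func(p + dp) - func(p - dp)) / (2 * h)
    return J

r, theta = 2.0, 0.8  # an arbitrary sample point
J = jacobian(polar, (r, theta))
# The determinant of the derivative matrix is r, the factor in dx dy = r dr dθ.
print(np.isclose(np.linalg.det(J), r))  # True
```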
The last set of grand ideas in
multivariate calculus concerns fields.
We'll begin by looking at vector fields.
A vector field is an arrangement of a
vector at every point in space. Such
objects describe, say, the motion of a
fluid or certain quantities in
electromagnetics.
How would you talk about the derivative
of such a field?
What does it mean to differentiate a
field of vectors?
How do you integrate with respect to a
vector field?
What does that even mean?
We'll spend a lot of time answering those
questions.
What I want you to remember when you see
multivariate calculus, is that it tells
the same story as single variable
calculus, only with a few new characters
and data structures and a lot more
action.
I hope that when you do take
multivariable calculus, that you remember
having had this wonderful, weird dream
about what multivariable calculus is, how
it follows single variable calculus and
how important the subject is.
Till then, sweet dreams.
