
Lecture Notes II compiled by Dr. Abhijit Kar Gupta, kg.abhi@gmail.com

Vector Subspace/ Spanning Set/ Inner Products


[In the Lecture Notes-I we provided a formal introduction to Vector Space with some examples. Here we continue with the structure and provide more practical examples.]
Subspaces:

A vector subspace is a vector space that is embedded in a larger vector space. In other words, a subspace is a subset of a vector space that is itself a vector space! Note: being a subset, the vectors in it automatically satisfy most of the vector space axioms. The additional condition that is required is the closure property, and this automatically implies the inclusion of the zero element. Symbolically, we write

W ⊆ V is a subspace of V, where V is a vector space defined over a field F.

Conditions:
Let V be a vector space over a field F. Now if W is a subspace of V, then the following conditions have to be satisfied:
(i) 0 ∈ W, where 0 is the zero vector (zero vector inclusion).
(ii) u + v ∈ W for all u, v ∈ W (closure property under addition).
(iii) αu ∈ W for all u ∈ W and α ∈ F (closure property under scalar multiplication).

Examples:
#1. Straight lines or planes through the origin constitute subspaces of the three dimensional Euclidean space.
R^3 = {(x, y, z) : x, y, z ∈ R} (3D Euclidean space, written symbolically).
Now we can write, W = {(x, y, 0) : x, y ∈ R} (the xy-plane), which is isomorphic to R^2.
The above is a 2D vector subspace of R^3. In the same way, the x-axis {(x, 0, 0) : x ∈ R}, which is isomorphic to R, is a 1D vector subspace of R^3.


#2. To prove that W = {(x, y, z) ∈ R^3 : ax + by + cz = 0}, a plane through the origin, is a subspace of R^3.

Proof:
(i) Zero vector inclusion: 0 = (0, 0, 0), which satisfies the constraint, a·0 + b·0 + c·0 = 0, so 0 ∈ W.
(ii) Closure property under addition: Consider two vectors, u = (u_1, u_2, u_3) ∈ W and v = (v_1, v_2, v_3) ∈ W, with the conditions that a·u_1 + b·u_2 + c·u_3 = 0 and a·v_1 + b·v_2 + c·v_3 = 0.
u + v = (u_1 + v_1, u_2 + v_2, u_3 + v_3) and a(u_1 + v_1) + b(u_2 + v_2) + c(u_3 + v_3) = 0 + 0 = 0. Condition satisfied. Thus, u + v ∈ W.
(iii) Closure under scalar multiplication: Let u ∈ W. Then αu = (αu_1, αu_2, αu_3) and a(αu_1) + b(αu_2) + c(αu_3) = α(a·u_1 + b·u_2 + c·u_3) = 0, which implies αu ∈ W for all u ∈ W and all α ∈ R.

Therefore, we can say W is a subspace of R^3.
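
These three checks can also be run numerically; a minimal sketch, assuming the particular plane x + y + z = 0 as the constraint (the helper in_W and the sample vectors are ours, for illustration):

import numpy as np

def in_W(vec, tol=1e-12):
    # Membership test for the illustrative subspace W = {(x, y, z) : x + y + z = 0}
    return abs(np.sum(vec)) < tol

u = np.array([1.0, 2.0, -3.0])   # satisfies x + y + z = 0
v = np.array([0.5, -2.0, 1.5])   # satisfies x + y + z = 0

print(in_W(np.zeros(3)))   # zero vector inclusion -> True
print(in_W(u + v))         # closure under addition -> True
print(in_W(2.7 * u))       # closure under scalar multiplication -> True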

Spanning Set:

Let v_1, v_2, ..., v_n be a set of elements from a vector space V. The vector subspace W consisting of all linear combinations of v_1, v_2, ..., v_n is called the vector space spanned by the set {v_1, v_2, ..., v_n}. Now the set of vectors {v_1, v_2, ..., v_n} will be called the spanning set for W. In mathematical terms,
W = span{v_1, v_2, ..., v_n} = {c_1 v_1 + c_2 v_2 + ... + c_n v_n}, where c_1, c_2, ..., c_n ∈ F.
Note the following: Every element of W is a linear combination of {v_1, v_2, ..., v_n} only; W is the smallest subspace of V containing {v_1, v_2, ..., v_n}.

Example: R^3 is spanned by the unit vectors along the x, y and z-axes. More elaborately, we can say the vectors e_1 = (1, 0, 0), e_2 = (0, 1, 0), e_3 = (0, 0, 1) make a spanning set for R^3.


Consider any vector u = (x, y, z) ∈ R^3 as a linear combination of the above vectors: u = x(1, 0, 0) + y(0, 1, 0) + z(0, 0, 1) = x e_1 + y e_2 + z e_3.
Now we can check, every u ∈ R^3 is a linear combination of {e_1, e_2, e_3}. Thus {e_1, e_2, e_3} is a spanning set for R^3.
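
A minimal code sketch of this check (the sample vector u below is ours, chosen only for illustration):

import numpy as np

# Columns are the spanning vectors e_1, e_2, e_3 of R^3
E = np.column_stack([[1, 0, 0], [0, 1, 0], [0, 0, 1]]).astype(float)

u = np.array([2.0, -1.0, 4.0])        # an arbitrary vector of R^3
coeffs = np.linalg.solve(E, u)        # coefficients c with E @ c = u
print(coeffs)                         # -> [ 2. -1.  4.], i.e. u = 2 e_1 - 1 e_2 + 4 e_3
print(np.allclose(E @ coeffs, u))     # True: u is a linear combination of the spanning set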


Basis Set:

A basis set is a set of vectors {v_1, v_2, ..., v_n} which is linearly independent and a spanning set of V.

Example: Consider the following vector subspace of R^3, say
W = {(a, b, c) ∈ R^3 : c = a + b}.
From the constraint, we have c = a + b. So we write,
W = {(a, b, a + b)}, where a, b ∈ R.
For the following choices,
v_1 = (1, 0, 1) and v_2 = (0, 1, 1),
any element of W can be written as (a, b, a + b) = a v_1 + b v_2.
We can check that v_1 and v_2 are independent and they span the space. In this case {v_1, v_2} can be chosen as a basis set.

H.W. Problems:
#1. Prove the above claim (in the example) that {v_1, v_2} forms a basis set for W.
#2. Prove that if u and v are two linearly independent elements of a two dimensional vector space V, then {u, v} forms a basis set for V.
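
The independence-and-span check in the example can also be done numerically; a small sketch with the illustrative vectors above (the element of W chosen below is arbitrary):

import numpy as np

# Candidate basis of the illustrative subspace W = {(a, b, a + b)}
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])

M = np.column_stack([v1, v2])         # 3 x 2 matrix with v1, v2 as columns
print(np.linalg.matrix_rank(M))       # -> 2: v1 and v2 are linearly independent

a, b = 2.5, -1.0
w = np.array([a, b, a + b])           # an arbitrary element of W
coeffs, *_ = np.linalg.lstsq(M, w, rcond=None)
print(np.allclose(M @ coeffs, w))     # True: w = a*v1 + b*v2, so {v1, v2} spans W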

SUM of two Vector Spaces:

Definition: Consider two subspaces U and W of a vector space V. Their sum is
U + W = {u + w : u ∈ U, w ∈ W}.
The vector space U + W consists of all sums u + w.




Example: Consider two vector subspaces of R^3,
U = {(x, y, 0) : x, y ∈ R} and W = {(0, y, z) : y, z ∈ R}.
Then U + W = {(x, y, z) : x, y, z ∈ R} = R^3.

DIRECT SUM:

The direct sum V = U ⊕ W is when every vector v in V can be written in a unique way (i.e., one and only one way) as v = u + w, with u ∈ U and w ∈ W.

Example: U = {(x, y, 0) : x, y ∈ R} (the xy-plane) and W = {(0, 0, z) : z ∈ R} (the line along the z-axis).
Any vector (x, y, z) ∈ R^3 can be uniquely written as (x, y, z) = (x, y, 0) + (0, 0, z).

Note the difference between the above two cases:
Ordinary SUM (1st case): (x, y, z) = (x, y, 0) + (0, 0, z) or (x, y, z) = (x, y - a, 0) + (0, a, z) for any real a, so we can write the sum in more than one way and it is not unique.
Direct SUM (2nd case): (x, y, z) = (x, y, 0) + (0, 0, z) is the only possibility, so it is unique!
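
Both situations are easy to see in code; a minimal sketch (the vector r is arbitrary, chosen for illustration):

import numpy as np

r = np.array([3.0, 5.0, 7.0])

# Direct sum: xy-plane + z-axis, the split is forced (unique)
u = np.array([r[0], r[1], 0.0])           # component in the xy-plane
w = np.array([0.0, 0.0, r[2]])            # component along the z-axis
print(np.allclose(u + w, r))              # True, and no other split exists

# Ordinary sum: xy-plane + yz-plane, the split is NOT unique
for a in (0.0, 1.0, 2.0):
    u1 = np.array([r[0], r[1] - a, 0.0])  # in the xy-plane
    w1 = np.array([0.0, a, r[2]])         # in the yz-plane
    print(np.allclose(u1 + w1, r))        # True for every a: many valid splits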

THEOREM: The vector space V is the direct sum of its subspaces U and W, V = U ⊕ W, if and only if (i) V = U + W, and (ii) U ∩ W = {0}, i.e. the intersection consists of only the zero element.

Proof: Consider any v ∈ V. Since V = U + W, there is a vector u ∈ U and a vector w ∈ W for which v = u + w.
To show uniqueness, also let v = u' + w' such that u' ∈ U and w' ∈ W.
Thus we can have u + w = u' + w', that is, u - u' = w' - w.
But u - u' ∈ U and w' - w ∈ W [using closure property], so u - u' = w' - w ∈ U ∩ W = {0}, which implies u = u' and w = w'.
Hence the decomposition is unique, and V = U ⊕ W.
Conversely, if every v has a unique decomposition and x ∈ U ∩ W, then 0 = x + (-x) = 0 + 0 forces x = 0, so U ∩ W = {0}.

A very useful example from Physics:

Consider U to be the vector space of symmetric matrices and W to be the vector space of anti-symmetric matrices. Any (square) matrix A can be written as the sum of a symmetric and an anti-symmetric matrix:
A = (1/2)(A + A^T) + (1/2)(A - A^T) = A_S + A_A,
where A_S = (1/2)(A + A^T) is a symmetric matrix [A_S^T = A_S] and A_A = (1/2)(A - A^T) is an anti-symmetric matrix [A_A^T = -A_A].
We can say V = U + W, where V is the vector space of all square matrices.
Next, we show that U ∩ W = {0}. Suppose some element B belongs to the intersection of the two sets, B ∈ U ∩ W. Thus we will have B^T = B and also B^T = -B, which means B = -B. Hence B = 0. Therefore, U ∩ W = {0}, and V = U ⊕ W.
NOTE: The general forms of, for example, 2 x 2 real symmetric and anti-symmetric matrices are (a  b; b  c) and (0  b; -b  0) respectively.
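
A minimal code sketch of this decomposition (the sample matrix A is arbitrary):

import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [4.0, 3.0, -1.0],
              [5.0, 2.0, 6.0]])

A_S = 0.5 * (A + A.T)    # symmetric part:      A_S.T == A_S
A_A = 0.5 * (A - A.T)    # anti-symmetric part: A_A.T == -A_A

print(np.allclose(A_S + A_A, A))    # True: A = A_S + A_A
print(np.allclose(A_S.T, A_S))      # True
print(np.allclose(A_A.T, -A_A))     # True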

Inner products: Inner product of two vectors is a generalization of the dot product that we already know.


Inner Product:

[The real inner product is a function (mapping) from V x V to R.]
For the real inner product, the following axioms hold:
(i) ⟨αu + βv, w⟩ = α⟨u, w⟩ + β⟨v, w⟩, for all u, v, w ∈ V and for all α, β ∈ R   [Linearity]
(ii) ⟨u, v⟩ = ⟨v, u⟩, for all u, v ∈ V   [Symmetry]
(iii) ⟨u, u⟩ ≥ 0, and ⟨u, u⟩ = 0 if and only if u = 0   [Positive definite]

Note: Linearity and symmetry together is called bilinearity.

Let u = (u_1, u_2, ..., u_n) and v = (v_1, v_2, ..., v_n) be two vectors in R^n.
We define the inner product
⟨u, v⟩ = u_1 v_1 + u_2 v_2 + ... + u_n v_n,
so that ⟨u, u⟩ = u_1^2 + u_2^2 + ... + u_n^2, whose square root is the Euclidean Norm of u (see below).

If the vectors are represented by column matrices,
u = (u_1, u_2, ..., u_n)^T and v = (v_1, v_2, ..., v_n)^T,
then ⟨u, v⟩ = u^T v, where u^T = [u_1 u_2 ... u_n] is the transpose of u and is a 1 x n row matrix.

Now suppose A is an n x n matrix, u is a column vector in R^n and v is another vector in R^n.
Now consider the inner product ⟨Au, v⟩ = (Au)^T v = u^T A^T v = ⟨u, A^T v⟩.
Also, we can have ⟨u, Av⟩ = ⟨A^T u, v⟩.

Example: take u = (1, 2) and v = (2, -1). Then ⟨u, v⟩ = 1·2 + 2·(-1) = 0, so u and v are orthogonal.
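
A quick numerical check of the identity ⟨Au, v⟩ = ⟨u, A^T v⟩ (random matrix and vectors, purely for illustration):

import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
u = rng.standard_normal(n)
v = rng.standard_normal(n)

lhs = np.dot(A @ u, v)        # <Au, v>
rhs = np.dot(u, A.T @ v)      # <u, A^T v>
print(np.isclose(lhs, rhs))   # True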


NORM of a vector is defined as ‖u‖ = √⟨u, u⟩ (inner product with itself).
This is the Euclidean Norm when ⟨u, u⟩ = u_1^2 + u_2^2 + ... + u_n^2.
Following rules hold for a NORM:
‖u‖ ≥ 0 for all u, and ‖u‖ = 0 if and only if u = 0.
‖αu‖ = |α| ‖u‖, for all u and all α.

Note: The Euclidean Norm corresponds to p = 2, which we normally use. Let us define the p-Norm:
‖u‖_p = (|u_1|^p + |u_2|^p + ... + |u_n|^p)^(1/p), for p ≥ 1.
The Euclidean Norm is our special case p = 2; it may be called the 2-norm.
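
In code, the p-norm can be computed directly from the definition or with numpy.linalg.norm (the vector and the p values are chosen only for illustration):

import numpy as np

u = np.array([3.0, -4.0, 12.0])

def p_norm(x, p):
    # (|x_1|^p + ... + |x_n|^p)^(1/p), the p-norm for p >= 1
    return np.sum(np.abs(x) ** p) ** (1.0 / p)

for p in (1, 2, 3):
    print(p, p_norm(u, p), np.linalg.norm(u, ord=p))   # the two computations agree

print(np.linalg.norm(u))   # default is the Euclidean (2-)norm: sqrt(9 + 16 + 144) = 13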

Two Important Theorems on inner product and sum of two vectors

CAUCHY-SCHWARZ Inequality: For two vectors u and v,
|⟨u, v⟩| ≤ ‖u‖ ‖v‖.

Proof: According to definition, ⟨u, v⟩ = u_1 v_1 + u_2 v_2 + ... + u_n v_n [Euclidean, defined on the Real field].
Consider any two real numbers a and b. Now, (a - b)^2 ≥ 0 implies ab ≤ (a^2 + b^2)/2.
Now considering a = |u_i| / ‖u‖ and b = |v_i| / ‖v‖, we set
|u_i| |v_i| / (‖u‖ ‖v‖) ≤ (1/2) [ u_i^2 / ‖u‖^2 + v_i^2 / ‖v‖^2 ].   ...(1)
Here we considered the norms ‖u‖ = (u_1^2 + ... + u_n^2)^(1/2) and ‖v‖ = (v_1^2 + ... + v_n^2)^(1/2), and also assumed ‖u‖ ≠ 0 and ‖v‖ ≠ 0.
From (1), summing on both sides over i = 1, ..., n:
Σ_i |u_i| |v_i| / (‖u‖ ‖v‖) ≤ (1/2) [ Σ_i u_i^2 / ‖u‖^2 + Σ_i v_i^2 / ‖v‖^2 ] = (1/2)(1 + 1) = 1.
In the above, we have considered the following identity: Σ_i u_i^2 = ‖u‖^2 and Σ_i v_i^2 = ‖v‖^2.
Hence, |⟨u, v⟩| = |Σ_i u_i v_i| ≤ Σ_i |u_i| |v_i| ≤ ‖u‖ ‖v‖. (proved)
[Note that |Σ_i u_i v_i| ≤ Σ_i |u_i| |v_i|; think of the identity |a + b| ≤ |a| + |b|.]

MINKOWSKI's Inequality: For the real numbers u_1, ..., u_n and v_1, ..., v_n,
( Σ_i (u_i + v_i)^2 )^(1/2) ≤ ( Σ_i u_i^2 )^(1/2) + ( Σ_i v_i^2 )^(1/2), i.e. ‖u + v‖ ≤ ‖u‖ + ‖v‖.

Proof:
‖u + v‖^2 = Σ_i (u_i + v_i)^2 = Σ_i u_i^2 + 2 Σ_i u_i v_i + Σ_i v_i^2 ≤ ‖u‖^2 + 2 Σ_i |u_i| |v_i| + ‖v‖^2   ...(2)
[Since Σ_i u_i v_i ≤ Σ_i |u_i| |v_i|.]
Now we apply the Cauchy-Schwarz inequality, Σ_i |u_i| |v_i| ≤ ‖u‖ ‖v‖. From (2),
‖u + v‖^2 ≤ ‖u‖^2 + 2 ‖u‖ ‖v‖ + ‖v‖^2 = ( ‖u‖ + ‖v‖ )^2.
Hence ‖u + v‖ ≤ ‖u‖ + ‖v‖. (proved)
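
Both inequalities are easy to verify numerically; a small sketch with random vectors (purely illustrative):

import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

inner = np.dot(u, v)
nu, nv = np.linalg.norm(u), np.linalg.norm(v)

print(abs(inner) <= nu * nv)               # Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||
print(np.linalg.norm(u + v) <= nu + nv)    # Minkowski:      ||u + v|| <= ||u|| + ||v||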

Note: ‖u + v‖^2 = ⟨u + v, u + v⟩ = ‖u‖^2 + 2⟨u, v⟩ + ‖v‖^2.
Now if the inner product ⟨u, v⟩ = 0, the vectors are orthogonal, and we get back the Pythagoras Theorem: ‖u + v‖^2 = ‖u‖^2 + ‖v‖^2.
Also, under the new symbols, we rediscover: if ‖u‖ = 1, i.e., if ⟨u, u⟩ = 1, then u is called a unit vector. For any vector u, û = u/‖u‖ is the normalized (unit) vector. For any two vectors u and v, the non-negative real number d(u, v) = ‖u - v‖ is called the distance between u and v.
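
A two-line sketch of normalization and the distance function (the vectors are chosen only for illustration):

import numpy as np

u = np.array([3.0, 4.0])
v = np.array([0.0, 1.0])

u_hat = u / np.linalg.norm(u)      # normalized (unit) vector along u
print(np.linalg.norm(u_hat))       # -> 1.0
print(np.linalg.norm(u - v))       # d(u, v) = ||u - v||, the distance between u and v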

Inner Product defined on a Complex vector space, C^n:
⟨u, v⟩ = u_1* v_1 + u_2* v_2 + ... + u_n* v_n,
where u = (u_1, ..., u_n) and v = (v_1, ..., v_n) are complex vectors and * denotes complex conjugation.
Note that ⟨u, v⟩ = ⟨v, u⟩* and ⟨u, u⟩ ≥ 0 (a real number).
The bilinearity is lost! (The inner product is now linear in one argument and conjugate-linear in the other.)
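
A short sketch of the complex inner product with numpy (np.vdot conjugates its first argument; which argument carries the conjugate is only a convention, and the sample vectors are illustrative):

import numpy as np

u = np.array([1 + 2j, 3 - 1j])
v = np.array([2 - 1j, 1j])

ip_uv = np.vdot(u, v)          # <u, v> = sum_i conj(u_i) * v_i
ip_vu = np.vdot(v, u)

print(np.isclose(ip_vu, np.conj(ip_uv)))    # True: <v, u> = <u, v>*
print(np.isclose(np.vdot(u, u).imag, 0.0), np.vdot(u, u).real >= 0)   # <u, u> is real and non-negative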

NOTE: A real inner product space is sometimes called Euclidean space and a complex inner product space is called a Unitary space. ==========
