
Schedule

Week  Date           Classification of Topic            Topic
1     9 Feb. 2010    Introduction to Numerical Methods  Measuring errors, binary representation,
                     and Types of Errors                propagation of errors, and Taylor series
2     14 Feb. 2010   Nonlinear Equations                Bisection Method
3     21 Feb. 2010   Nonlinear Equations                Newton-Raphson Method
4     28 Feb. 2010   Interpolation                      Lagrange Interpolation
5     7 March 2010   Interpolation                      Newton's Divided Difference Method
6     14 March 2010  Differentiation                    Newton's Forward and Backward Difference
7     21 March 2010  Regression                         Least Squares
8     28 March 2010  Systems of Linear Equations        Gauss-Jordan Method
9     11 April 2010  Systems of Linear Equations        Gauss-Seidel Method
10    18 April 2010  Integration                        Composite Trapezoidal and Simpson Rules
11    25 April 2010  Ordinary Differential Equations    Euler's Method
12    2 May 2010     Ordinary Differential Equations    Runge-Kutta 2nd and 4th Order Methods
Linear Regression
What is Regression?

Given $n$ data points $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$, best fit $y = f(x)$ to the data. The best fit is generally based on minimizing the sum of the squares of the residuals, $S_r$.

The residual at a point is

$\epsilon_i = y_i - f(x_i)$

and the sum of the squares of the residuals is

$S_r = \sum_{i=1}^{n} \left( y_i - f(x_i) \right)^2$

Figure. Basic model for regression.
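As a quick illustration (not part of the original slides), the sketch below computes the residuals and $S_r$ for a candidate fit $f(x)$; the data values and the candidate line are made up for the example.

```python
# Minimal sketch: sum of squared residuals S_r for a candidate model f(x).
# The data and the candidate coefficients below are illustrative only.

def sum_squared_residuals(x, y, f):
    """Return S_r = sum_i (y_i - f(x_i))^2."""
    return sum((yi - f(xi)) ** 2 for xi, yi in zip(x, y))

if __name__ == "__main__":
    x = [1.0, 2.0, 3.0, 4.0]        # hypothetical data
    y = [1.9, 4.1, 6.2, 7.8]
    f = lambda t: 0.1 + 2.0 * t     # candidate straight line
    print("S_r =", sum_squared_residuals(x, y, f))
```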
Least Squares Criterion

The least squares criterion minimizes the sum of the squares of the residuals in the model, and also produces a unique line:

$S_r = \sum_{i=1}^{n} \epsilon_i^2 = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)^2$

where the residual at a typical point $x_i$ is $\epsilon_i = y_i - a_0 - a_1 x_i$.

Figure. Linear regression of y vs. x data showing residuals at a typical point, $x_i$.
Finding Constants of Linear Model

Minimize the sum of the squares of the residuals:

$S_r = \sum_{i=1}^{n} \epsilon_i^2 = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)^2$

To find $a_0$ and $a_1$, we minimize $S_r$ with respect to $a_0$ and $a_1$:

$\frac{\partial S_r}{\partial a_0} = 2\sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)(-1) = 0$

$\frac{\partial S_r}{\partial a_1} = 2\sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)(-x_i) = 0$

giving the normal equations

$n a_0 + a_1 \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} y_i$

$a_0 \sum_{i=1}^{n} x_i + a_1 \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i y_i$
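Because the two normal equations are linear in $a_0$ and $a_1$, they can be solved as a 2x2 linear system. A minimal sketch, assuming NumPy is available (the function name is illustrative, not from the slides):

```python
import numpy as np

def fit_line_normal_equations(x, y):
    """Solve the 2x2 normal equations for a0, a1 of y ~ a0 + a1*x."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    # [ n       sum(x)   ] [a0]   [ sum(y)   ]
    # [ sum(x)  sum(x^2) ] [a1] = [ sum(x*y) ]
    A = np.array([[n, x.sum()],
                  [x.sum(), (x * x).sum()]])
    b = np.array([y.sum(), (x * y).sum()])
    a0, a1 = np.linalg.solve(A, b)
    return a0, a1
```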
Finding Constants of Linear Model

Solving the normal equations for $a_1$ and $a_0$ directly yields

$a_1 = \frac{n\sum_{i=1}^{n} x_i y_i - \sum_{i=1}^{n} x_i \sum_{i=1}^{n} y_i}{n\sum_{i=1}^{n} x_i^2 - \left(\sum_{i=1}^{n} x_i\right)^2}$

and

$a_0 = \frac{\sum_{i=1}^{n} x_i^2 \sum_{i=1}^{n} y_i - \sum_{i=1}^{n} x_i \sum_{i=1}^{n} x_i y_i}{n\sum_{i=1}^{n} x_i^2 - \left(\sum_{i=1}^{n} x_i\right)^2}$

or, more simply,

$a_0 = \bar{y} - a_1\bar{x}$
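These closed-form expressions translate directly into code. A minimal sketch in pure Python (no libraries; the function name is illustrative, not from the slides):

```python
def linear_regression(x, y):
    """Closed-form least-squares constants a0, a1 for y ~ a0 + a1*x."""
    n = len(x)
    sx = sum(x)
    sy = sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    a0 = sy / n - a1 * (sx / n)   # a0 = y_bar - a1 * x_bar
    return a0, a1
```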
Example 1

The torque $T$ needed to turn the torsion spring of a mousetrap through an angle $\theta$ is given below.

Table. Torque vs. angle for a torsional spring.

Angle, θ (radians)   Torque, T (N-m)
0.698132             0.188224
0.959931             0.209138
1.134464             0.230052
1.570796             0.250965
1.919862             0.313707

Find the constants of the model

$T = a_0 + a_1\theta$

Figure. Data points for angle (radians) vs. torque (N-m) data.
Example 1 cont.

The following table shows the summations needed for the calculation of the constants in the regression model.

Table. Tabulation of data for calculation of important summations.

θ (radians)   T (N-m)    θ² (radians²)   θT (N-m·radians)
0.698132      0.188224   0.487388        0.131405
0.959931      0.209138   0.921468        0.200758
1.134464      0.230052   1.2870          0.260986
1.570796      0.250965   2.4674          0.394215
1.919862      0.313707   3.6859          0.602274
Σ             6.2831     1.1921          8.8491          1.5896

Using the equations described for $a_0$ and $a_1$, with $n = 5$:

$a_1 = \frac{5\sum_{i=1}^{5}\theta_i T_i - \sum_{i=1}^{5}\theta_i \sum_{i=1}^{5}T_i}{5\sum_{i=1}^{5}\theta_i^2 - \left(\sum_{i=1}^{5}\theta_i\right)^2} = \frac{5(1.5896) - (6.2831)(1.1921)}{5(8.8491) - (6.2831)^2} = 9.6091\times10^{-2}\ \text{N-m/rad}$
Example 1 cont.

Use the average torque and average angle to calculate $a_0$:

$\bar{T} = \frac{\sum_{i=1}^{5} T_i}{n} = \frac{1.1921}{5} = 2.3842\times10^{-1}$

$\bar{\theta} = \frac{\sum_{i=1}^{5} \theta_i}{n} = \frac{6.2831}{5} = 1.2566$

Using $a_0 = \bar{T} - a_1\bar{\theta}$:

$a_0 = 2.3842\times10^{-1} - (9.6091\times10^{-2})(1.2566) = 1.1767\times10^{-1}\ \text{N-m}$
Example 1 Results

Using linear regression, the trend line

$T = 1.1767\times10^{-1} + 9.6091\times10^{-2}\,\theta$

is found from the data.

Figure. Linear regression of torque versus angle data.
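As an independent cross-check of Example 1 (not part of the original slides), a degree-1 polynomial fit in NumPy recovers the same constants up to rounding (a0 ≈ 0.1177 N-m, a1 ≈ 0.0961 N-m/rad):

```python
import numpy as np

# Cross-check of Example 1 with NumPy's built-in least-squares polynomial fit.
theta = np.array([0.698132, 0.959931, 1.134464, 1.570796, 1.919862])  # radians
T = np.array([0.188224, 0.209138, 0.230052, 0.250965, 0.313707])      # N-m

a1, a0 = np.polyfit(theta, T, 1)   # degree-1 fit returns [slope, intercept]
print(f"a0 = {a0:.5f} N-m, a1 = {a1:.5f} N-m/rad")
```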
Nonlinear Regression
Some popular nonlinear regression models:

1. Exponential model: $y = ae^{bx}$
2. Power model: $y = ax^b$
3. Saturation growth model: $y = \dfrac{ax}{b + x}$
4. Polynomial model: $y = a_0 + a_1 x + \ldots + a_m x^m$
Exponential Model

Given $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$, best fit $y = ae^{bx}$ to the data.

Figure. Exponential model $y = ae^{bx}$ fitted through the data points, with residual $y_i - f(x_i)$ at a typical point $(x_i, y_i)$.
Finding Constants of Exponential Model

Consider the equation $y = ae^{bx}$. Taking the natural logarithm of both sides gives

$\ln y = \ln\left(ae^{bx}\right) = \ln a + \ln e^{bx} = \ln a + bx$

Letting $Y = \ln y$ and $A = \ln a$, this becomes the linear model

$Y = A + bx$
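A minimal sketch of this linearization (assuming all $y_i > 0$ so the logarithm exists; the function name is illustrative): take logs, fit the straight line $Y = A + bx$ by least squares, then recover $a = e^{A}$.

```python
import math

def fit_exponential(x, y):
    """Fit y ~ a*exp(b*x) by linear least squares on Y = ln(y) = ln(a) + b*x."""
    Y = [math.log(yi) for yi in y]        # requires y_i > 0
    n = len(x)
    sx, sY = sum(x), sum(Y)
    sxx = sum(xi * xi for xi in x)
    sxY = sum(xi * Yi for xi, Yi in zip(x, Y))
    b = (n * sxY - sx * sY) / (n * sxx - sx ** 2)
    A = sY / n - b * (sx / n)             # intercept of the linearized model
    return math.exp(A), b                 # a = e^A, and b
```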
Example 2: Exponential Model

Many patients get concerned when a test involves the injection of radioactive material. For example, for scanning a gallbladder, a few drops of the Technetium-99m isotope are used. Half of the Technetium-99m is gone in about 6 hours. It, however, takes about 24 hours for the radiation levels to reach what we are exposed to in day-to-day activities. The relative intensity of radiation as a function of time is given below.

Table. Relative intensity of radiation as a function of time.

t (hrs)                 0      1      3      5      7      9
Relative intensity, γ   1.000  0.891  0.708  0.562  0.447  0.355
Example 2: Exponential Model cont.

The relative intensity γ is related to time by the equation

$\gamma = A e^{\lambda t}$

Find:
a) the value of the regression constants $A$ and $\lambda$
b) the radiation intensity after 24 hours
Table. Summations needed for the calculation of the regression constants, with $Y = \ln\gamma$.

t (hrs)   γ       Y = ln γ   t²    tY
0         1.000    0.000      0     0
1         0.891   -0.115      1    -0.115
3         0.708   -0.345      9    -1.035
5         0.562   -0.576     25    -2.88
7         0.447   -0.805     49    -5.635
9         0.355   -1.035     81    -9.315

$\sum t_i = 25$, $\sum\gamma_i = 3.963$, $\sum Y_i = -2.876$, $\sum t_i^2 = 165$, $\sum t_i Y_i = -18.98$
Using the equations described for the linear model with $n = 6$:

$\lambda = \frac{6\sum t_i Y_i - \sum t_i \sum Y_i}{6\sum t_i^2 - \left(\sum t_i\right)^2} = \frac{6(-18.98) - (25)(-2.876)}{6(165) - (25)^2} = \frac{-113.88 + 71.9}{990 - 625} = \frac{-41.98}{365} = -0.1151$
Calculating the Other Constant

The value of $A$ can now be calculated:

$\ln A = \bar{Y} - \lambda\bar{t} = \frac{-2.876}{6} + 0.1151\left(\frac{25}{6}\right) = -0.47933 + 0.47958 = 0.00025$

$A \approx 0.9998$
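As a cross-check of Example 2 (not in the original slides), the sketch below repeats the calculation from the raw data. Because it carries full precision rather than the slides' rounded sums, its results (roughly λ ≈ -0.115 and A ≈ 1.00) may differ from the slide values in the last digit.

```python
import math

# Relative radiation intensity data from Example 2.
t = [0, 1, 3, 5, 7, 9]                           # hours
gamma = [1.000, 0.891, 0.708, 0.562, 0.447, 0.355]

Y = [math.log(g) for g in gamma]                 # Y = ln(gamma)
n = len(t)
st, sY = sum(t), sum(Y)
stt = sum(ti * ti for ti in t)
stY = sum(ti * Yi for ti, Yi in zip(t, Y))

lam = (n * stY - st * sY) / (n * stt - st ** 2)  # decay constant lambda
A = math.exp(sY / n - lam * (st / n))            # A = e^(ln A)
print(f"lambda = {lam:.4f} per hour, A = {A:.4f}")
```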
Plot of data
Plot of data and regression curve
Relative Intensity After 24 hrs

The relative intensity of radiation after 24 hours is

$\gamma = 0.9998\,e^{-0.1151(24)} = 6.3160\times10^{-2}$

This result implies that only

$\frac{6.316\times10^{-2}}{0.9998}\times100\% = 6.317\%$

of the initial radioactive intensity is left after 24 hours.
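The 24-hour prediction itself is a one-line evaluation of the fitted model; a small sketch using the slide's constants (it agrees with the roughly 6.3% figure above up to rounding):

```python
import math

A, lam = 0.9998, -0.1151           # regression constants from the slides
gamma_24 = A * math.exp(lam * 24)  # relative intensity after 24 hours
print(f"gamma(24) = {gamma_24:.4f} ({100 * gamma_24 / A:.3f}% of the initial intensity)")
```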
