
NAME : SHASHANK NATHANI ROLL NUMBER : 14HS20034

Assignment 1
QUESTION 1

> library(readr)
Warning message:
package ‘readr’ was built under R version 3.4.2
> y <- read_csv("C:/Users/ssnii/Desktop/compstatsassn/y.csv",
+ col_names = FALSE)
Parsed with column specification:
cols(

## Declaring the column names for the given dataset

> colnames(y)="y"
> colnames(x1)="x1"
> colnames(x2)="x2"
> colnames(x3)="x3"

# Considering A as our data frame for the model, consisting of all the data

>A=cbind(y,x1,x2,x3,x4)
>attach(A)

# Applying OLS Model in the given data set (LINEAR MODEL)

> summary(lm(y~.,data=A))

# We have the coefficient estimates and their standard errors. We observe
that the intercept and x2 are highly significant, while x1, x3, and x4 are not.

Call:
lm(formula = y ~ ., data = A)

Residuals:
Min 1Q Median 3Q Max
-2.6502 -0.6483 -0.0167 0.6248 2.4452

Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 2.9915 0.3984 7.509 2.08e-12 ***
x1 -9.6719 7.2728 -1.330 0.185
x2 -7.2027 0.0362 -198.957 < 2e-16 ***
x3 -12.5531 7.2752 -1.725 0.086 .
x4 11.9443 7.2751 1.642 0.102
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 1.002 on 195 degrees of freedom


Multiple R-squared: 0.9955, Adjusted R-squared: 0.9954
F-statistic: 1.077e+04 on 4 and 195 DF, p-value: < 2.2e-16
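The numbers in this summary follow from the standard OLS formulas. As a minimal sketch (in Python/numpy, since the session above is R output) on hypothetical data of the same shape, not the assignment's data, the Estimate, Std. Error, and t value columns can be reproduced directly:

```python
import numpy as np

# Hypothetical X and y with the same shape (n = 200, intercept plus four
# regressors); the coefficients are illustrative, not the assignment's.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 4))])
y = X @ np.array([3.0, -9.7, -7.2, -12.6, 11.9]) + rng.normal(size=n)

beta = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS estimates
resid = y - X @ beta
dof = n - X.shape[1]                          # 200 - 5 = 195 degrees of freedom
s2 = resid @ resid / dof                      # residual variance estimate
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
t = beta / se                                 # the t values reported by summary()
print(np.column_stack([beta, se, t]))
```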

# Now we apply the SVD decomposition to the design matrix A[,-1],
which breaks it into u (200 X 4) and v (4 X 4) matrices. Here d
holds the singular values of our diagonal matrix. We observe that 0.0795 is
a value close to zero, hinting towards multicollinearity.

> svd(A[,-1])
$d
[1] 161.24569935 32.85624568 16.85541157 0.07954737

$u
[,1] [,2] [,3] [,4]
[1,] 0.05780458 0.0564109861 0.0026570418 0.0042855465
[2,] 0.06452557 -0.0704556000 -0.0366304492 0.0236052161
[3,] 0.08230665 -0.0275312559 0.0319415373 0.1351153783
[4,] 0.05826374 0.1474434974 -0.0526143376 -0.0878959464
[5,] 0.06959511 -0.1511292556 0.0062840469 -0.0422175300
[6,] 0.04908226 -0.0368268939 0.0247028696 0.0467500569
[7,] 0.07831935 0.0487506011 -0.0119791788 -0.0639642147
[8,] 0.07602570 -0.0022458088 -0.0513619738 0.0433050336
[9,] 0.08537546 0.0920991582 0.0285224005 0.1002800806
[10,] 0.09590011 -0.0727802604 0.0094690332 -0.2011997400
[11,] 0.06942317 -0.0060941899 -0.0859824587 -0.1132361781
[12,] 0.05563935 0.1069182408 0.0567077284 0.0003120502
[13,] 0.07199020 -0.0979492437 0.0161269556 -0.0121491900
[14,] 0.08963891 0.0685948235 -0.0991096870 -0.0404548623
[15,] 0.06949340 0.0105331780 0.1118287444 -0.0414038372
[16,] 0.06532159 0.0494228251 0.1714630312 0.0223006613
[17,] 0.05334378 0.0255066490 0.2017036896 0.0503946351
[18,] 0.08444119 -0.0330489281 0.0240711429 -0.0891800961
[19,] 0.06061751 0.0048098253 -0.0724928408 -0.0075572797
[20,] 0.05624898 -0.0487861305 0.0815759256 0.0831860497
[21,] 0.06640319 -0.0809382125 0.0497601278 0.0244678457
[22,] 0.06228226 0.0057134308 -0.0011009707 -0.0217234523
[23,] 0.05732448 0.0203197808 -0.0049393712 -0.0158232353
[24,] 0.09913196 -0.0070487481 -0.0806980025 0.0385556886
[25,] 0.07718840 -0.1436499742 -0.0973364519 -0.0225610513
[26,] 0.06095134 0.1411179383 -0.0421700132 0.0346401787
[27,] 0.06421400 0.0369702316 -0.0565073101 0.0491124356
[28,] 0.08734707 -0.1345889151 0.0022444794 -0.0629474218
[29,] 0.07587409 -0.2273129267 -0.0162867187 -0.0839104922
[30,] 0.08398180 -0.0514589070 0.0193947002 -0.0877500853
[31,] 0.07466953 -0.0106036927 -0.0785856849 0.1339254267
[32,] 0.06748267 -0.0181301167 0.0244666481 0.1127259304
[33,] 0.06834953 0.0507027051 0.0541777823 0.0122688900
[34,] 0.05265151 -0.0015179811 0.1527101276 -0.0447741475
[35,] 0.04811017 -0.0535231766 0.1043638499 0.0304504657
[36,] 0.04161246 -0.0138123654 0.0343509754 0.1582531788
[37,] 0.06369181 0.0144750231 -0.0637290440 0.1707953773
[38,] 0.08286899 -0.0593866520 0.0446881248 0.0918288122
[39,] 0.07687779 -0.0072268721 0.0703517323 -0.0383938929
[40,] 0.05302920 0.1189446343 -0.0005277546 -0.0843488710
[41,] 0.06960232 -0.0180370929 -0.0045750497 0.0023928740
[42,] 0.04793846 0.0074891760 0.0424083624 0.0033033669
[43,] 0.05209664 0.0337584076 0.0317891832 -0.0441323472
[44,] 0.04565065 0.1113279652 -0.0127247130 -0.0549737969
[45,] 0.08108139 -0.0127017689 0.0074600593 -0.0707350156
[46,] 0.07472938 0.0058324779 -0.0190222815 -0.0519720625
[47,] 0.07045008 -0.0181306932 -0.0003527473 0.1017131564
[48,] 0.05970639 0.0206260484 0.0624241962 -0.0729978738
[49,] 0.07012201 -0.0708309911 -0.0266109119 -0.0702252540
[50,] 0.05598563 0.0710818133 0.0186892605 0.1219479340
[51,] 0.06711437 0.0196931484 0.0277229473 -0.1199963076
[52,] 0.04138648 -0.0553474355 -0.0941884157 0.0420250675
[53,] 0.07355683 -0.0059600325 -0.0278795514 0.0362434715
[54,] 0.07114784 0.0085799660 -0.0445714686 0.0356679121
[55,] 0.06375070 0.0764869309 -0.0197890792 -0.1306523345
[56,] 0.08193067 0.0141132046 -0.1238074485 0.0240787059
[57,] 0.10321809 -0.1120149684 -0.0255172271 -0.1276477888
[58,] 0.05902548 0.1993782172 0.0812515429 -0.0310889631
[59,] 0.05346440 -0.0118118774 0.0397530741 -0.0499182699
[60,] 0.05453159 0.1336550740 -0.0132325044 -0.0499111783
[61,] 0.07775922 -0.1368118112 -0.0023999816 0.0017083886
[62,] 0.08771597 0.0390979910 -0.0528792106 0.0356056195
[63,] 0.06622718 0.0125295932 0.0390031139 -0.1195310607
[64,] 0.05971875 -0.0726006648 0.0489986470 -0.0443794462
[65,] 0.07575449 -0.0053903501 -0.1624658195 0.0411165848
[66,] 0.06588211 -0.0155330519 -0.1026167212 -0.0531116685
[67,] 0.03364266 0.0709677011 -0.0023384996 0.0863879024
[68,] 0.06800230 -0.0893159159 -0.0699622196 -0.0548516737
[69,] 0.07820406 -0.0074096992 -0.0583233557 -0.0759389163
[70,] 0.08952941 -0.0527795675 -0.1376093690 0.1064204488
[71,] 0.07443615 -0.0915211331 0.0461592154 -0.0369256269
[72,] 0.08316369 -0.0359723182 -0.1525374130 0.0160042597
[73,] 0.05321302 0.0330865573 0.0200272874 0.0132451996
[74,] 0.06918761 0.1427286275 -0.0217260905 -0.0526875881
[75,] 0.05342870 -0.1500422631 0.0977187488 0.0886281009
[76,] 0.06514174 0.0291548062 0.0309403472 0.1023314196
[77,] 0.09562229 -0.0094758388 0.0420626222 0.0454231469
[78,] 0.07959320 0.0175980810 -0.0499178225 0.1060085687
[79,] 0.05240523 0.1050491392 -0.0003298555 0.0721547866
[80,] 0.07017784 0.0966521950 0.0123069526 -0.1220625874
[81,] 0.09538180 -0.0856012112 -0.1030033388 0.0526633198
[82,] 0.06367163 -0.0200520538 0.0844336606 0.0079973034
[83,] 0.07150235 0.0621381274 0.0232911075 0.0031771415
[84,] 0.09160267 -0.0704686008 -0.0273651513 -0.1170069720
[85,] 0.05714148 0.0310995596 0.0590399462 0.0055453041
[86,] 0.07557063 -0.0703380941 -0.0421923696 0.0951264545
[87,] 0.06351859 0.0323150838 -0.0221103935 -0.0155430688
[88,] 0.06905307 -0.0502128798 -0.1994519220 0.0920700628
[89,] 0.08901371 -0.0132609310 0.0146496555 -0.1276515618
[90,] 0.07826679 0.0123723392 0.0957664348 -0.0024056649
[91,] 0.07160742 0.0602002197 -0.0602294468 -0.0007751069
[92,] 0.08123847 0.0306760354 0.0053400577 -0.0396860652
[93,] 0.04246850 0.0994886639 0.0413110858 -0.0733645222
[94,] 0.06658601 0.0162196591 0.0267506192 0.0440665063
[95,] 0.07411189 -0.0095225030 -0.0464542119 0.0013648956
[96,] 0.08125505 -0.1021178988 -0.0783100276 -0.0430445588
[97,] 0.06929132 0.0905753628 -0.0026706146 0.0514766606
[98,] 0.05865548 0.0562753872 0.0013354770 0.0157665548
[99,] 0.04438829 -0.0555219332 -0.0135833808 -0.0600864006
[100,] 0.06919565 0.0115661066 0.0269492288 0.0044868795
[101,] 0.04320024 -0.0132847823 0.1510811093 0.0329117646
[102,] 0.07833351 0.0192340900 -0.0010267277 0.1102218189
[103,] 0.06538792 0.0066547427 0.0576223627 -0.0652057275
[104,] 0.06318742 -0.0701163707 0.0270275232 0.1023793912
[105,] 0.06554020 0.0266766013 0.0043692949 -0.0111111212
[106,] 0.07947537 0.1576430997 -0.0990254858 0.0531083773
[107,] 0.09662775 -0.0152470715 0.0156036091 -0.0041653924
[108,] 0.05589028 0.0210109208 0.0875765790 0.0506391752
[109,] 0.07567749 -0.0049622970 0.1203954265 -0.0130402927
[110,] 0.07852124 -0.1030493831 -0.0108556877 -0.0603626044
[111,] 0.09243404 0.0813173537 -0.1206578691 0.0444892237
[112,] 0.06446190 -0.0104805600 0.0520987679 -0.0246370515
[113,] 0.06898206 0.0450651753 -0.0973283558 -0.0287101040
[114,] 0.05039383 -0.0480787159 -0.0237347534 0.0299055473
[115,] 0.09419208 -0.0120784471 -0.0368142535 0.0577213828
[116,] 0.06451399 0.0357244133 0.0156299368 -0.0358384849
[117,] 0.08350184 0.0353567676 0.0097340908 0.0107254823
[118,] 0.08166854 0.0056237734 0.1661813759 -0.0660703871
[119,] 0.08570658 0.0364291806 -0.0034549742 -0.1303586686
[120,] 0.07069791 -0.0011900012 -0.0582266254 0.0562620359
[121,] 0.05065102 0.0864333891 -0.0210787989 -0.0820838863
[122,] 0.08318480 0.0843448594 -0.0926609149 -0.0454479328
[123,] 0.07164409 -0.0528743338 0.0370049053 0.0203771214
[124,] 0.04223590 0.0402953272 0.0632792103 -0.1756672672
[125,] 0.05661049 0.1245628408 -0.0280225361 -0.0643729511
[126,] 0.08630931 -0.0924572179 0.0515530562 0.1552507118
[127,] 0.06463769 0.0023088221 0.1750425005 0.0365126940
[128,] 0.06920754 -0.0670053756 0.0770289237 0.0045695798
[129,] 0.08964951 -0.0890854555 0.1486228270 0.0014182217
[130,] 0.06915623 0.0630784854 0.0586331659 0.0122824782
[131,] 0.07416211 -0.0965933714 0.1698548059 0.1433783611
[132,] 0.07103713 0.0008764964 0.0523939464 0.0059968784
[133,] 0.05596632 0.0485989128 -0.0134072650 0.0106365655
[134,] 0.05841474 0.0030884086 0.0604418203 -0.0403815860
[135,] 0.06109859 -0.0216250063 -0.0257227850 0.0021427641
[136,] 0.06122816 0.0181081904 0.0558872346 0.0312976781
[137,] 0.06400835 -0.0111901805 0.0533415632 0.0351578408
[138,] 0.08673410 -0.0667735277 0.0222091056 -0.0340094499
[139,] 0.06795486 0.0234798171 0.0345957347 0.0556667323
[140,] 0.06325691 0.0810631222 0.0671178289 0.0029783820
[141,] 0.06869320 -0.0092727859 0.1345013866 0.0439063112
[142,] 0.06218286 -0.0683690583 0.0243377714 0.1202745585
[143,] 0.07594860 -0.0303642935 -0.0922488955 -0.0251584401
[144,] 0.08367738 0.0161818204 -0.0328580812 0.0694396367
[145,] 0.05675508 0.1231170442 0.0585002949 0.1711742690
[146,] 0.07199017 0.1206917052 -0.0570051536 0.0205743334
[147,] 0.05417716 0.0459078190 -0.0210213012 0.0059399241
[148,] 0.08661020 0.0182304312 -0.0882942828 0.0619561632
[149,] 0.08248620 -0.0724188661 -0.0293605629 0.0754966084
[150,] 0.07674649 0.0554822833 0.0745302161 -0.1129001748
[151,] 0.06268605 0.0729112417 -0.0569461414 0.0326056008
[152,] 0.06774830 -0.0056720596 0.0020854434 -0.0478140318
[153,] 0.06081316 0.0732820367 -0.0582812774 -0.0777509707
[154,] 0.08782352 -0.0494214966 -0.1147064197 0.0481894531
[155,] 0.06170270 -0.0490902975 0.0397454902 0.0609392356
[156,] 0.07308545 0.0834404680 -0.0075994465 -0.1073097824
[157,] 0.07231331 -0.0747124243 0.0348519711 -0.0454122360
[158,] 0.05634136 0.0135989594 -0.0401223928 -0.0391818862
[159,] 0.06412393 0.0433216138 0.0002747462 0.0505804713
[160,] 0.04582104 0.1073015559 0.0701168933 0.0279102439
[161,] 0.08994903 -0.0102424023 -0.0364506461 -0.0055449513
[162,] 0.06586782 0.0343379331 -0.0456946717 -0.0287606938
[163,] 0.06007411 0.0364787896 0.0526418805 0.0423662555
[164,] 0.04464681 0.0582148250 0.1038226839 -0.0255673237
[165,] 0.08431296 0.0594252554 0.1411721099 -0.0435054528
[166,] 0.08060927 -0.0069001968 0.0598991882 0.0619190408
[167,] 0.08506266 -0.0330005009 -0.0763281751 -0.0101100327
[168,] 0.06825658 -0.0737095583 0.0567385180 0.0388054798
[169,] 0.06337789 -0.0215663775 -0.0383054951 -0.0845809192
[170,] 0.04952295 0.2452477064 0.0065804824 0.0943275014
[171,] 0.06510853 -0.0724258902 0.0562791052 -0.1009212071
[172,] 0.07463571 -0.0553992677 0.1417173479 -0.1222461094
[173,] 0.07733478 -0.1128746850 -0.0068101142 -0.0673886581
[174,] 0.05563294 0.1338255408 0.0778240537 -0.0066186307
[175,] 0.07428453 0.0461323953 0.0067451244 -0.0672321358
[176,] 0.09991450 -0.0072773854 0.0022684295 -0.0035291548
[177,] 0.04812830 0.0603203317 0.0877276958 -0.0635490265
[178,] 0.10803874 0.0229883272 0.1102525342 0.1499165853
[179,] 0.08055405 -0.0141595153 -0.0775823207 -0.0314349815
[180,] 0.06136839 -0.1461431347 -0.1421949828 0.0026026171
[181,] 0.06589155 0.0642656033 -0.0343731393 0.1269990017
[182,] 0.06910213 -0.0724053706 -0.1049342992 0.0634137509
[183,] 0.07068726 0.0721423500 -0.1628734220 -0.0800913573
[184,] 0.05506906 0.0029808775 0.0414650931 -0.0535053071
[185,] 0.05091413 -0.0946327949 0.0562485461 -0.0699719563
[186,] 0.07780668 0.1087997129 -0.0271837381 0.0540583351
[187,] 0.07071809 -0.0558116502 0.0096918730 0.0327735106
[188,] 0.06530989 -0.0214364150 0.0355534369 0.0819524473
[189,] 0.07563266 -0.0172738712 0.0761022231 0.0838267682
[190,] 0.07880945 -0.0766729665 -0.0512831877 -0.0770048362
[191,] 0.05606026 0.0379499932 -0.0471722432 0.0857953858
[192,] 0.07322781 -0.0071723142 -0.0104233927 0.0026952043
[193,] 0.07832999 0.0856682976 -0.0218390354 0.0100099247
[194,] 0.08935895 0.0343059192 -0.0619750161 -0.1122099647
[195,] 0.07069278 -0.0309831035 -0.0995440875 -0.0586933292
[196,] 0.05280620 -0.1024779200 0.1072459025 -0.0685917444
[197,] 0.07783774 0.0780717131 -0.0417945979 -0.0348290249
[198,] 0.07347134 0.0805745028 -0.0877734056 -0.0471176202
[199,] 0.07327113 -0.0799796024 -0.0456021773 0.0310427183
[200,] 0.07571697 -0.0674596680 -0.0353230876 0.1054071294

$v
[,1] [,2] [,3] [,4]
[1,] -0.001885687 0.2272177 0.7843609 0.5771884340
[2,] 0.863558576 0.4848767 -0.1384238 0.0000520533
[3,] -0.355635192 0.4726741 -0.5627145 0.5774557304
[4,] -0.357472573 0.6998899 0.2212897 -0.5774066058

> sigma=svd(A[,-1])$d
> sigma=sigma[order(sigma,decreasing=TRUE)]
>sigma
[1] 161.24569935 32.85624568 16.85541157 0.07954737

# Now we compute the cumulative proportion of the total accounted for by
the leading singular values (initialising a to zero first)

> a=0
> for(i in 1:3)
+ {
+ a=a+sigma[i]
+ print(a/sum(sigma)) ## Cumulative ratios
+ }
[1] 0.764064
[1] 0.9197536
[1] 0.9996231
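The same cumulative-ratio check can be sketched in Python/numpy. The data here are hypothetical: x4 is built as a near-exact combination of x1 and x3, which reproduces the near-zero trailing singular value seen above.

```python
import numpy as np

# Hypothetical 200 x 4 design matrix: x4 is almost x1 + x3, so the
# matrix is nearly rank-deficient, like the assignment's data.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
x3 = rng.normal(size=200)
x4 = x1 + x3 + rng.normal(scale=0.01, size=200)
X = np.column_stack([x1, x2, x3, x4])

# numpy returns the singular values already sorted in decreasing order.
sigma = np.linalg.svd(X, compute_uv=False)

# Cumulative share of the total, mirroring the R loop above.
ratios = np.cumsum(sigma) / sigma.sum()
print(ratios)        # last entry is 1.0 by construction

# A tiny trailing singular value means a huge condition number:
print(sigma[0] / sigma[-1])
```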

> r=2
> B=svd(A[,-1])
> attach(B)

# We create a diagonal matrix of the singular values

> d=diag(d)


>d
[,1] [,2] [,3] [,4]
[1,] 161.2457 0.00000 0.00000 0.00000000
[2,] 0.0000 32.85625 0.00000 0.00000000
[3,] 0.0000 0.00000 16.85541 0.00000000
[4,] 0.0000 0.00000 0.00000 0.07954737

# We split u, d, v into the leading r components (ui, di, vi) and the remaining ones (un, dn, vn)

> ui=u[,c(1:r)]
> di=d[c(1:r),c(1:r)]
> vi=v[,c(1:r)]
>un=u[,c(3:4)]
> dn=d[c(3:4),c(3:4)]
> vn=v[,c(3:4)]

# We now check whether the original matrix equals the sum of Xi and Xn

> Xi=ui%*%di%*%t(vi)
> Xn=un%*%dn%*%t(vn)
>head(Xi+Xn)
[,1] [,2] [,3] [,4]
[1,] 0.43888609 8.941502 -2.4637083 -2.0249820
[2,] -1.02880456 7.947891 -4.4458761 -5.4772057
[3,] 0.19793280 10.947666 -5.4441661 -5.2643988
[4,] 0.38339124 10.584655 -0.5562738 -0.1600114
[5,] -1.06827808 7.268442 -6.3995314 -7.4614788
[6,] 0.03888057 6.190131 -3.6186914 -3.5860160
>head(A[,-1])
x1 x2 x3 x4
1 0.43888609 8.941502 -2.4637083 -2.0249820
2 -1.02880456 7.947891 -4.4458761 -5.4772057
3 0.19793280 10.947666 -5.4441661 -5.2643988
4 0.38339124 10.584655 -0.5562738 -0.1600114
5 -1.06827808 7.268442 -6.3995314 -7.4614788
6 0.03888057 6.190131 -3.6186914 -3.5860160
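The identity X = Xi + Xn holds for any matrix and any split point r; a short Python/numpy sketch on a hypothetical 6 x 4 matrix (not the assignment's data) makes the check explicit:

```python
import numpy as np

# Hypothetical 6 x 4 matrix; the split works for any X and any r.
rng = np.random.default_rng(1)
X = rng.normal(size=(6, 4))

U, d, Vt = np.linalg.svd(X, full_matrices=False)
r = 2

# Leading-r part (ui di vi') and remainder (un dn vn'), as above.
Xi = U[:, :r] @ np.diag(d[:r]) @ Vt[:r, :]
Xn = U[:, r:] @ np.diag(d[r:]) @ Vt[r:, :]

print(np.allclose(Xi + Xn, X))   # True: the two parts sum back to X
```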

# They turn out to be identical. We now calculate Wr = X %*% vi for the estimation of Beta_PCA
> Wr=(Xi+Xn)%*%vi
> head(Wr)
[,1] [,2]
[1,] 9.320740 1.8534532
[2,] 10.404471 -2.3149065
[3,] 13.271593 -0.9045737
[4,] 9.394777 4.8444398
[5,] 11.221912 -4.9655400
[6,] 7.914304 -1.2099935
> g=solve(t(Wr)%*%Wr)%*%t(Wr)%*%A[,1]
>g
[,1]
[1,] -5.749695
[2,] -3.199006

# Calculating the values of Beta_PCA

> beta_PCA=vi%*%g
> beta_PCA
[,1]
[1,] -0.7160285
[2,] -6.5163216
[3,] 0.5327063
[4,] -0.1835938

>predictions=as.matrix(A[,-1])%*%beta_PCA

# Plotting our prediction from PCA

> plot(A[,1],predictions)
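The estimation above (regress y on the scores Wr = X vi, then map g back via beta_PCA = vi g) can be sketched in Python/numpy on hypothetical data with a known coefficient vector; the names Vr, Wr, g mirror the R objects above:

```python
import numpy as np

# Hypothetical data, just to exercise the principal-components steps
# used above; not the assignment's data.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=200)

U, d, Vt = np.linalg.svd(X, full_matrices=False)
r = 2
Vr = Vt[:r, :].T                  # 4 x r, the vi matrix above

Wr = X @ Vr                       # component scores, as Wr = X %*% vi
g = np.linalg.lstsq(Wr, y, rcond=None)[0]   # solve(t(Wr)%*%Wr)%*%t(Wr)%*%y
beta_pca = Vr @ g                 # back to the original coordinates

predictions = X @ beta_pca
print(beta_pca)
```

Because beta_pca = Vr g, the estimate is constrained to the span of the leading r right singular vectors, which is exactly what trades a little bias for the large variance reduction shown in the covariance comparison below.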

# Now we calculate the covariance matrices of the estimates with and
without PCA. It turns out that the covariance matrix without PCA is far
larger than the one with PCA.

> sigma_sq_est=(sigma(lm(y~.,data=A)))^2
> cov_beta_PCA=sigma_sq_est*vi%*%solve(t(Wr)%*%Wr)%*%t(vi)
> X=as.matrix(A[,-1]) # design matrix, not defined earlier
> cov_beta=sigma_sq_est*solve(t(X)%*%X)
> cov_beta_PCA
[,1] [,2] [,3] [,4]
[1,] 4.800143e-05 0.0001023707 9.988156e-05 0.0001478826
[2,] 1.023707e-04 0.0002473786 2.012338e-04 0.0003036053
[3,] 9.988156e-05 0.0002012338 2.126092e-04 0.0003124893
[4,] 1.478826e-04 0.0003036053 3.124893e-04 0.0004603700

> cov_beta
x1 x2 x3 x4
x1 52.845314294 0.004484408 52.86610505 -52.862305904
x2 0.004484408 0.000315502 0.00524424 -0.004572028
x3 52.866105054 0.005244240 52.89337876 -52.887675343
x4 -52.862305904 -0.004572028 -52.88767534 52.883682096

ASSIGNMENT 2

QUESTION 1

A=cbind(Problem1_Output_Training,cbind(1,Problem1_Input_Training))
A=as.matrix(A)

# Test inputs (with an intercept column), test outputs, and the beta matrix;
# beta[1,] is initialised with the OLS estimate on the training data
B=as.matrix(cbind(1,Problem1_Input_Test))
Yn=as.matrix(Problem1_Output_Test)
beta=matrix(nrow = 51,ncol=3)
beta[1,]=solve(t(A[,-1])%*%A[,-1])%*%t(A[,-1])%*%A[,1]
> colnames(beta)=c("int","coef1","coef2")

# DEFINING THE Mn MATRIX

>Mn=solve(t(A[,-1])%*%A[,-1]);

# APPLYING THE RECURSION FOUR TIMES, ONCE FOR EACH FORGETTING FACTOR LAMBDA


> for(i in 1:50)
{
Zn=t(as.matrix(B[i,])) #Zn as our row vector
a=Mn%*%t(Zn)%*%Zn%*%Mn
b=0.99+as.numeric(Zn%*%Mn%*%t(Zn))
c=a/b
Mn=(1/0.99)*(Mn-c) #Applying the matrix inversion lemma
beta[i+1,]=beta[i,]+(Mn%*%t(Zn)%*%(as.matrix(Yn[i])-(Zn%*%as.matrix(beta[i,]))) ) #Recursion on Beta values
}
prediction=as.matrix(A[,-1])%*%(beta[51,]) #Calculating prediction error
> l=0
> for(i in 1:50)
+ {l=l+(A[i,1]-prediction[i])^2}
l
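The update in the loop above is recursive least squares with a forgetting factor: the matrix inversion lemma keeps Mn = (Z'Z)^-1 current without re-inverting at each step. A Python/numpy sketch on hypothetical data (names Z, lam, n0 are illustrative); with lam = 1 the recursion must land on the batch OLS solution, which gives a built-in check:

```python
import numpy as np

# Hypothetical stream of rows; with lam = 1 the recursion must
# reproduce batch OLS on all the data.
rng = np.random.default_rng(3)
Z = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])
y = Z @ np.array([-12.0, 0.35, -0.73]) + rng.normal(scale=0.5, size=100)

lam = 1.0
n0 = 10                                  # initialise from the first n0 rows
M = np.linalg.inv(Z[:n0].T @ Z[:n0])     # Mn = (Z'Z)^-1 so far
beta = M @ Z[:n0].T @ y[:n0]

for i in range(n0, 100):
    z = Z[i]
    # Matrix inversion lemma (Sherman-Morrison) update of Mn:
    M = (M - np.outer(M @ z, z @ M) / (lam + z @ M @ z)) / lam
    # Recursive update of beta using the new prediction error:
    beta = beta + M @ z * (y[i] - z @ beta)

beta_batch = np.linalg.lstsq(Z, y, rcond=None)[0]
print(beta, beta_batch)                  # identical (to rounding) when lam = 1
```

With lam < 1, older rows are geometrically down-weighted, which is why the beta tables below drift more for lambda = 0.2 than for lambda = 0.99.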

> Mn=solve(t(A[,-1])%*%A[,-1]); # Repeating the same procedure for the remaining lambdas


> for(i in 1:50)
{
Zn=t(as.matrix(B[i,]))
a=Mn%*%t(Zn)%*%Zn%*%Mn
b=1+as.numeric(Zn%*%Mn%*%t(Zn))
c=a/b
Mn=(Mn-c)
beta[i+1,]=beta[i,]+(Mn%*%t(Zn)%*%(as.matrix(Yn[i])-(Zn%*%as.matrix(beta[i,]))) )
}
prediction=as.matrix(A[,-1])%*%(beta[51,])
> l=0
> for(i in 1:50)
+ {l=l+(A[i,1]-prediction[i])^2}
l

> write.csv(beta,file="table.csv")
> Mn=solve(t(A[,-1])%*%A[,-1]);
> for(i in 1:50)
{
Zn=t(as.matrix(B[i,]))
a=Mn%*%t(Zn)%*%Zn%*%Mn
b=0.95+as.numeric(Zn%*%Mn%*%t(Zn))
c=a/b
Mn=(1/0.95)*(Mn-c)
beta[i+1,]=beta[i,]+(Mn%*%t(Zn)%*%(as.matrix(Yn[i])-(Zn%*%as.matrix(beta[i,]))) )
}
prediction=as.matrix(A[,-1])%*%(beta[51,])
> l=0
> for(i in 1:50)
+ {l=l+(A[i,1]-prediction[i])^2}
l

> write.csv(beta,file="table.csv")
> Mn=solve(t(A[,-1])%*%A[,-1]);
> for(i in 1:50)
{
Zn=t(as.matrix(B[i,]))
a=Mn%*%t(Zn)%*%Zn%*%Mn
b=0.2+as.numeric(Zn%*%Mn%*%t(Zn))
c=a/b
Mn=(1/0.2)*(Mn-c)
beta[i+1,]=beta[i,]+(Mn%*%t(Zn)%*%(as.matrix(Yn[i])-(Zn%*%as.matrix(beta[i,]))) )
}
> write.csv(beta,file="table.csv")
prediction=as.matrix(A[,-1])%*%(beta[51,])
> l=0
> for(i in 1:50)
+ {l=l+(A[i,1]-prediction[i])^2}
l

We have the following prediction errors, taking the error sum of squares as our prediction error:

LAMBDA=0.2    LAMBDA=0.95   LAMBDA=0.99   LAMBDA=1
14847.78      13454.95      13372.36      13658.05

WE GET THE FOLLOWING 4 SETS OF BETA VALUES CORRESPONDING TO THE DIFFERENT LAMBDAS. FOR QUESTION
2 WE APPLY THE SAME CODE; THE BETA VALUES AND CORRESPONDING ERROR VALUES ARE GIVEN
ON THE NEXT PAGE.

QUESTION 2
int coef1 coef2 int coef1 coef2 int coef1 coef2 int coef1 coef2
-12.5663 0.347823 -0.72613 -12.5663 0.347823 -0.72613 -12.5663 0.347823 -0.72613 -12.5663 0.347823 -0.72613
-12.7984 0.362517 -0.7165 -12.5809 0.348649 -0.72551 -12.5794 0.348562 -0.7256 -12.5887 0.349111 -0.72529
-13.109 0.381924 -0.74924 -12.598 0.349499 -0.72711 -12.5948 0.349282 -0.727 -12.6162 0.350319 -0.72761
-13.266 0.368426 -0.75299 -12.6066 0.349075 -0.72746 -12.602 0.348942 -0.72729 -12.6275 0.34982 -0.72809
-13.3461 0.36826 -0.74768 -12.6129 0.348814 -0.72743 -12.6071 0.348742 -0.72728 -12.6352 0.349539 -0.72808
-13.1927 0.342621 -0.75637 -12.6176 0.349501 -0.72738 -12.6108 0.349269 -0.72725 -12.641 0.350294 -0.72804
-13.0746 0.338088 -0.82752 -12.6036 0.34978 -0.72838 -12.6008 0.349457 -0.72797 -12.6259 0.350527 -0.72914
-15.25 0.191581 -1.0209 -12.5658 0.34828 -0.73196 -12.5743 0.348373 -0.73048 -12.5862 0.348741 -0.73291
-12.1519 0.318418 -0.87078 -12.5727 0.347469 -0.73215 -12.5789 0.347833 -0.73061 -12.5927 0.347973 -0.73312
-11.8637 0.304671 -0.85185 -12.5756 0.347751 -0.73233 -12.581 0.348025 -0.73074 -12.5954 0.348202 -0.73328
-13.3412 0.389813 -0.67519 -12.5928 0.350287 -0.73229 -12.5924 0.34964 -0.73073 -12.6131 0.350495 -0.73329
-14.4923 0.353299 -0.64265 -12.6168 0.350945 -0.73237 -12.6073 0.350064 -0.73079 -12.6348 0.351122 -0.73342
-14.5138 0.317216 -0.76091 -12.6249 0.352421 -0.73159 -12.6125 0.350973 -0.73033 -12.6422 0.352351 -0.73279
-14.2044 0.341619 -0.79024 -12.6327 0.351995 -0.73099 -12.6165 0.350752 -0.73002 -12.648 0.352041 -0.73231
-14.6063 0.313159 -0.83071 -12.6332 0.352033 -0.73094 -12.6169 0.350781 -0.72998 -12.6484 0.352074 -0.73227
-19.6987 0.234836 -1.09951 -12.6514 0.352756 -0.72954 -12.6269 0.3512 -0.72923 -12.6617 0.352657 -0.73128
-11.9473 0.393081 -0.62633 -12.6093 0.357383 -0.72815 -12.6055 0.353618 -0.72849 -12.6334 0.355791 -0.7302
-11.4009 0.363721 -0.57597 -12.5843 0.354419 -0.72785 -12.5931 0.352148 -0.72837 -12.6149 0.353786 -0.72998
-11.9813 0.374512 -0.42509 -12.6147 0.357847 -0.72588 -12.6089 0.353939 -0.72739 -12.6356 0.356053 -0.72875
-13.3353 0.389028 -0.66355 -12.6125 0.357736 -0.72564 -12.608 0.353896 -0.7273 -12.6334 0.355952 -0.72854
-9.91357 0.558485 -0.70195 -12.5779 0.357344 -0.728 -12.5926 0.353658 -0.72834 -12.614 0.355623 -0.72986
-15.9032 -0.14749 -0.83863 -12.6378 0.356422 -0.72195 -12.619 0.353348 -0.72567 -12.6456 0.355308 -0.72649
-12.4702 0.118655 -0.64692 -12.6391 0.356423 -0.7219 -12.62 0.353352 -0.72564 -12.6466 0.355314 -0.72646
-14.6775 -0.25414 -0.83592 -12.6509 0.356419 -0.72307 -12.6244 0.353365 -0.72608 -12.6518 0.355326 -0.72695
-13.1825 0.415442 -0.70142 -12.6757 0.358828 -0.7198 -12.6361 0.354463 -0.72461 -12.6654 0.356634 -0.72522
-14.779 0.269535 -0.69854 -12.7166 0.357878 -0.72305 -12.6516 0.354139 -0.72588 -12.6844 0.356265 -0.72676
-15.262 0.190742 -0.67763 -12.6989 0.357003 -0.72124 -12.6447 0.353816 -0.7252 -12.6747 0.355854 -0.72584
-12.0789 0.335864 -0.74008 -12.6832 0.35816 -0.72165 -12.6397 0.354178 -0.72535 -12.669 0.356258 -0.726
-15.076 0.725643 -0.48391 -12.7543 0.359267 -0.72418 -12.6664 0.354656 -0.72633 -12.7007 0.356805 -0.7272
-14.0294 0.337864 -0.89934 -12.697 0.359595 -0.72957 -12.6473 0.354708 -0.72822 -12.6796 0.356823 -0.72936
-13.8563 0.373305 -0.88659 -12.7955 0.349934 -0.73575 -12.6809 0.351512 -0.73046 -12.7183 0.353285 -0.73198
-13.2155 0.315565 -0.86136 -12.8367 0.354863 -0.7392 -12.6979 0.353333 -0.73185 -12.738 0.355195 -0.73353
-14.361 0.312545 -0.80993 -12.888 0.353901 -0.74015 -12.7164 0.353044 -0.73224 -12.7583 0.3549 -0.73399
-15.1333 0.518359 -0.51907 -12.851 0.35577 -0.73817 -12.7067 0.353487 -0.7317 -12.7469 0.355399 -0.73334
-12.1181 0.494492 -0.78575 -12.7998 0.356689 -0.7406 -12.6916 0.353734 -0.73241 -12.7306 0.355644 -0.7341
-14.2285 0.929768 -0.75137 -12.83 0.356512 -0.73923 -12.701 0.35369 -0.73199 -12.7403 0.355611 -0.73367
-13.2338 0.858155 -0.66613 -12.7968 0.35759 -0.73705 -12.6927 0.353923 -0.73145 -12.7307 0.35587 -0.73303
-13.0272 0.673503 -0.66831 -12.8979 0.367954 -0.74792 -12.7307 0.357037 -0.73529 -12.7728 0.359016 -0.73709
-12.6462 0.430966 -0.87392 -12.9178 0.36525 -0.74957 -12.7389 0.356065 -0.73596 -12.781 0.35806 -0.73777
-13.5447 0.436896 -0.82887 -12.9019 0.36474 -0.75094 -12.7345 0.355906 -0.73637 -12.7764 0.357886 -0.73821
-13.1531 0.437423 -0.80854 -12.9078 0.364217 -0.75134 -12.7392 0.35552 -0.73671 -12.7809 0.357537 -0.73853
-12.6517 0.337047 -0.75592 -12.8949 0.363286 -0.75002 -12.7399 0.355553 -0.73678 -12.7806 0.357526 -0.73851
-11.349 0.305761 -0.82794 -12.8259 0.357209 -0.75146 -12.7233 0.354151 -0.73702 -12.763 0.356081 -0.73873
-11.6702 0.284619 -0.82247 -12.8199 0.357796 -0.75154 -12.7211 0.354361 -0.73703 -12.761 0.356274 -0.73874
-11.6058 0.293596 -0.78624 -12.8116 0.358196 -0.75072 -12.7221 0.354318 -0.73712 -12.7614 0.356261 -0.73877
-13.8254 0.405285 -0.69341 -12.877 0.359097 -0.74731 -12.7371 0.354516 -0.73636 -12.7756 0.356466 -0.73807
-13.847 0.319172 -0.67803 -12.9755 0.347196 -0.74504 -12.7639 0.351397 -0.73597 -12.801 0.353498 -0.73772
-15.5434 0.349967 -0.78785 -13.0769 0.351139 -0.75032 -12.7936 0.352295 -0.73768 -12.8297 0.354334 -0.73937
-14.9508 0.29636 -0.86735 -13.0336 0.349466 -0.75479 -12.784 0.351911 -0.73881 -12.8209 0.35396 -0.74043
-14.0737 0.403904 -0.86341 -12.981 0.35004 -0.75889 -12.7709 0.352095 -0.73988 -12.8089 0.354113 -0.74142
-12.9093 0.448267 -0.80084 -12.9582 0.351057 -0.75848 -12.7677 0.352238 -0.73981 -12.8057 0.354253 -0.74135
lambda = 0.2    lambda = 0.95    lambda = 0.99    lambda = 1
lambda=1 int coef1 coef2 lambda=0.99 int coef1 coef2 lambda=0.95 int coef1 coef2 lambda=0.2 int coef1 coef2
1 -11.9645 -0.01692 -0.76235 1 -11.9645 -0.01692 -0.76235 1 -11.9645 -0.01692 -0.76235 1 -11.9645 -0.01692 -0.76235
2 -11.9652 -0.01688 -0.76233 2 -11.9652 -0.01688 -0.76233 2 -11.9652 -0.01688 -0.76233 2 0.211949 -0.04158 -0.7563
3 -11.9854 -0.01617 -0.76379 3 -11.9857 -0.01616 -0.76382 3 -11.9874 -0.0161 -0.76394 3 -0.00695 -0.00768 -0.64377
4 -12.0164 -0.01722 -0.76524 4 -12.0177 -0.01724 -0.76531 4 -12.0232 -0.01732 -0.7656 4 -0.00095 0.066412 -0.33502
5 -12.0316 -0.01765 -0.76532 5 -12.0334 -0.01768 -0.76539 5 -12.0414 -0.01784 -0.76569 5 -0.01424 0.040378 0.086174
6 -12.0734 -0.01332 -0.76538 6 -12.0771 -0.01315 -0.76545 6 -12.0938 -0.01238 -0.76575 6 1.002888 -0.70423 -0.82811
7 -12.0609 -0.01321 -0.76633 7 -12.0638 -0.01304 -0.76646 7 -12.0769 -0.01223 -0.76704 7 0.684903 -0.64991 -0.78235
8 -12.0439 -0.0141 -0.76795 8 -12.0458 -0.01399 -0.76819 8 -12.0536 -0.01347 -0.7693 8 -0.05467 -2.03874 -1.99192
9 -12.0109 -0.01038 -0.76658 9 -12.0102 -0.00999 -0.76671 9 -12.0057 -0.00811 -0.76731 9 8.800387 3.828918 1.669004
10 -12.0273 -0.00931 -0.76742 10 -12.0278 -0.00884 -0.76761 10 -12.0287 -0.00662 -0.76851 10 -3.81457 0.284609 5.077896
11 -11.9893 -0.01322 -0.76714 11 -11.9863 -0.01312 -0.76732 11 -11.9692 -0.01276 -0.76811 11 -7.02424 13.62675 -8.60027
12 -11.9752 -0.01363 -0.76697 12 -11.9707 -0.01356 -0.76713 12 -11.9467 -0.01336 -0.76785 12 -0.72082 -3.07369 -2.8494
13 -12.0084 -0.00875 -0.76458 13 -12.0073 -0.00818 -0.76445 13 -11.9993 -0.00537 -0.76362 13 1.623791 0.589328 0.995114
14 -11.9801 -0.00727 -0.76725 14 -11.9754 -0.00651 -0.76745 14 -11.9489 -0.00274 -0.76839 14 2.827058 0.919555 0.597803
15 -11.9922 -0.00632 -0.76619 15 -11.989 -0.00545 -0.76625 15 -11.9698 -0.00111 -0.76642 15 -1.03885 0.281367 2.948475
16 -11.99 -0.00643 -0.76635 16 -11.9864 -0.00558 -0.76644 16 -11.9656 -0.0013 -0.76674 16 0.141127 -0.58636 -0.22505
17 -12.0303 -0.01053 -0.76819 17 -12.0337 -0.01032 -0.76858 17 -12.055 -0.00973 -0.77069 17 -4.5115 22.26136 -3.20914
18 -12.0154 -0.01183 -0.76792 18 -12.0164 -0.01181 -0.76825 18 -12.0237 -0.01233 -0.76996 18 2.095951 0.211368 3.010813
19 -12.0128 -0.01209 -0.76804 19 -12.0134 -0.01211 -0.76839 19 -12.019 -0.01278 -0.7702 19 -3.02894 -1.19778 0.43773
20 -12.0331 -0.01139 -0.76979 20 -12.037 -0.01132 -0.77044 20 -12.0605 -0.01157 -0.77404 20 8.763001 0.012823 -7.77055
21 -12.0763 -0.01048 -0.76692 21 -12.0879 -0.01029 -0.76703 21 -12.1577 -0.01008 -0.7673 21 5.408016 2.17131 6.6704
22 -12.0586 -0.0104 -0.76904 22 -12.0669 -0.01018 -0.76958 22 -12.1159 -0.00965 -0.7725 22 0.163817 0.598596 0.668805
23 -12.0672 -0.01033 -0.7688 23 -12.0771 -0.0101 -0.7693 23 -12.1356 -0.00958 -0.77199 23 1.591455 -7.6175 1.980789
24 -12.0516 -0.01036 -0.76743 24 -12.0577 -0.01012 -0.76758 24 -12.0899 -0.00945 -0.76793 24 -0.27743 3.569372 -0.4762
25 -12.0578 -0.00971 -0.76659 25 -12.0651 -0.00935 -0.76657 25 -12.1031 -0.00798 -0.76597 25 -3.79491 1.224088 -2.02389
26 -12.1186 -0.01063 -0.77133 26 -12.1396 -0.01052 -0.77236 26 -12.2626 -0.01101 -0.77795 26 11.09589 2.677226 -4.58014
27 -12.1387 -0.00996 -0.77305 27 -12.1631 -0.00974 -0.77438 27 -12.2996 -0.00971 -0.78122 27 1.79168 4.629402 13.76293
28 -12.1312 -0.00945 -0.77324 28 -12.1535 -0.00908 -0.77462 28 -12.2736 -0.00787 -0.78175 28 -0.02942 -7.22058 2.233329
29 -12.1418 -0.00928 -0.77367 29 -12.1661 -0.00889 -0.77513 29 -12.2941 -0.0076 -0.78254 29 -3.34508 2.469994 -0.57894
30 -12.1151 -0.0094 -0.77662 30 -12.1323 -0.00902 -0.77881 30 -12.2129 -0.00758 -0.7907 30 -0.71008 2.14325 -0.72889
31 -12.15 -0.01225 -0.77909 31 -12.1759 -0.01261 -0.78185 31 -12.3049 -0.01577 -0.79671 31 -2.90547 -5.90936 2.847882
32 -12.1011 -0.01596 -0.77565 32 -12.1141 -0.01743 -0.77748 32 -12.1662 -0.02863 -0.78673 32 7.501074 -0.45235 -1.22584
33 -12.0909 -0.01585 -0.77538 33 -12.1006 -0.01726 -0.77713 33 -12.1299 -0.02803 -0.78593 33 -1.71659 4.008377 -0.67371
34 -12.1268 -0.0172 -0.77749 34 -12.1461 -0.01904 -0.77978 34 -12.2301 -0.03269 -0.79148 34 -10.3639 -18.0606 -48.0758
35 -12.1519 -0.01748 -0.77637 35 -12.1792 -0.01943 -0.7783 35 -12.32 -0.03396 -0.78735 35 -0.61095 -4.83611 2.059359
36 -12.188 -0.01746 -0.77489 36 -12.2266 -0.01944 -0.77634 36 -12.445 -0.03421 -0.78191 36 2.67681 0.731832 -0.74194
37 -12.1845 -0.01739 -0.77466 37 -12.221 -0.01931 -0.77597 37 -12.4177 -0.0334 -0.78009 37 -8.7573 2.650685 1.000629
38 -12.2057 -0.01612 -0.77648 38 -12.2475 -0.01764 -0.77828 38 -12.4723 -0.02861 -0.78527 38 2.038545 -5.15959 1.572944
39 -12.2135 -0.01695 -0.77715 39 -12.2556 -0.01855 -0.77898 39 -12.4659 -0.02774 -0.78471 39 4.571235 5.825725 -3.13159
40 -12.2091 -0.01713 -0.77757 40 -12.2496 -0.01879 -0.77954 40 -12.4442 -0.0285 -0.78652 40 -1.24389 1.367277 -0.74003
41 -12.2358 -0.01901 -0.77956 41 -12.2831 -0.02123 -0.78202 41 -12.5066 -0.03377 -0.79087 41 8.820008 2.307126 15.81125
42 -12.2299 -0.01925 -0.77905 42 -12.2744 -0.02162 -0.78126 42 -12.4789 -0.03568 -0.78831 42 7.100968 -7.50652 -4.68617
43 -12.2249 -0.01963 -0.77909 43 -12.2676 -0.02215 -0.78133 43 -12.4598 -0.03729 -0.78868 43 -2.02611 0.008736 -0.66916
44 -12.2004 -0.01728 -0.77916 44 -12.2322 -0.01876 -0.7815 44 -12.333 -0.02509 -0.79053 44 3.572672 7.708125 -4.69403
45 -12.2318 -0.01843 -0.78199 45 -12.2724 -0.02028 -0.78516 45 -12.413 -0.02865 -0.79851 45 -2.28187 -2.16802 1.275871
46 -12.1937 -0.0191 -0.78379 46 -12.2182 -0.02122 -0.78775 46 -12.2321 -0.03167 -0.80774 46 2.751858 -2.56766 3.216445
47 -12.1574 -0.0148 -0.7842 47 -12.1667 -0.01507 -0.78844 47 -12.0827 -0.01306 -0.81135 47 -1.30507 -2.55741 -1.46385
48 -12.138 -0.01533 -0.78312 48 -12.1392 -0.01588 -0.78692 48 -11.9974 -0.01642 -0.80716 48 -4.25477 5.740752 7.84953
49 -12.1511 -0.0147 -0.78149 49 -12.1599 -0.01493 -0.78446 49 -12.103 -0.01202 -0.79647 49 4.922996 4.314649 0.155759
50 -12.1639 -0.01482 -0.78039 50 -12.1792 -0.0151 -0.78284 50 -12.1804 -0.01259 -0.79043 50 0.201244 0.179588 0.684707
51 -12.156 -0.01451 -0.78019 51 -12.1677 -0.01465 -0.78256 51 -12.1459 -0.01116 -0.78975 51 -12.078 14.73817 -16.9637
PREDICTION ERRORS FOR QUESTION NUMBER TWO ARE THE FOLLOWING:

LAMBDA=0.2    LAMBDA=0.95   LAMBDA=0.99   LAMBDA=1
3043374       7660.766      7643.352      7637.368

QUESTION 3 (LOCALLY WEIGHTED REGRESSION)

#Loading data
> Problem1_Input_Training <- read_csv("C:/Users/ssnii/Desktop/compstatsassn/Problem1_Input_Training.csv",
+ col_names = FALSE)
Parsed with column specification:
cols(
X1 = col_double(),
X2 = col_double()
)
> View(Problem1_Input_Training)
> library(readr)
> Problem3_Output_Training <- read_csv("C:/Users/ssnii/Desktop/compstatsassn/Problem3_Output_Training.csv",
+ col_names = FALSE)
Parsed with column specification:
cols(
X1 = col_double()
)
> View(Problem3_Output_Training)
> colnames(Problem3_Output_Training)="Y1"
A=cbind(Problem3_Output_Training,cbind(1,Problem1_Input_Training))

# Declaring the distance and weight data frames


di=data.frame()
wi=data.frame()
yc=matrix(nrow=1,ncol=1)
B=cbind(1,Problem1_Input_Test)
D=A[,-1]
C=D
A=as.matrix(A) # Complete data set
B=as.matrix(B) # Independent Variables
C=as.matrix(C)
D=as.matrix(D)
p=matrix(nrow = 50,ncol=3)
PART A
> for(k in 1:50) # Applying 50 loops for 50 local regressions
{
for(j in 1:(49+k))
{ di[j,1]=0;
for(i in 1:3)
{
di[j,1]=di[j,1]+((A[j,-1][i]-B[k,i])^2) # Squared Euclidean distance to the kth test point
}
di[j,1]=sqrt(di[j,1])
wi[j,1]=ifelse(di[j,1]<=10,1,0) # Binary weights: 1 within radius 10, 0 otherwise
C[j,]=sqrt(wi[j,1])*C[j,]
}
p[k,]=t(solve(t(C)%*%C)%*%t(C)%*%A[,1]) # The beta values; p is the beta matrix
yc[1]=D[k,]%*%p[k,]
A=rbind(A,cbind(yc,t(D[k,]))) # Adding the predicted value to grow the data set
C=A[,-1]} # Following are the predicted Y1 values and the further 50 beta coefficients
int coef1 coef2 Y1 1 X1 X2
1 -12.8864 -0.00788 -0.76807 1 -14.3925 1 -14.1248 2.105721
2 -12.9114 -0.00699 -0.76815 2 -11.0169 1 -2.73974 -2.44132
3 -12.9253 -0.00722 -0.7707 3 -31.5888 1 -4.59807 24.25932
4 -12.8874 -0.00681 -0.76764 4 -22.1541 1 6.843273 12.01095
5 -12.8998 -0.00724 -0.7684 5 -21.4291 1 -0.11879 11.10112
6 -12.8811 -0.00638 -0.7698 6 2.36366 1 11.96018 -19.9026
7 -12.8842 -0.00865 -0.76824 7 -17.732 1 -24.9815 6.591471
8 -12.8979 -0.00728 -0.76838 8 -12.044 1 -3.56602 -1.0775
9 -12.9033 -0.00795 -0.76839 9 -8.16796 1 18.91049 -6.35829
10 -12.9001 -0.00759 -0.76858 10 -19.4152 1 15.49839 8.323533
11 -12.9023 -0.00776 -0.76841 11 -9.41117 1 15.99707 -4.70497
12 -12.8986 -0.00733 -0.76851 12 -29.9011 1 -8.69068 22.20691
13 -12.8992 -0.0073 -0.76816 13 5.801499 1 1.262911 -24.3567
14 -12.8978 -0.00738 -0.76838 14 -5.32086 1 14.76574 -10.0028
15 -12.899 -0.00747 -0.7685 15 -21.0602 1 13.63978 10.48724
16 -12.9057 -0.00848 -0.76773 16 1.697607 1 23.69639 -19.2831
17 -12.9126 -0.00841 -0.76792 17 -4.931 1 14.57483 -10.5533
18 -12.8984 -0.00745 -0.76834 18 -9.81631 1 -19.5045 -3.82212
19 -12.9003 -0.00768 -0.76812 19 4.490307 1 21.45933 -22.8551
20 -12.9211 -0.00603 -0.76981 20 -30.9356 1 -22.3379 23.57636
21 -12.8938 -0.00763 -0.76803 21 -28.1224 1 -15.188 19.97886
22 -12.9065 -0.00709 -0.76757 22 6.134162 1 1.163332 -24.8172
23 -12.9083 -0.00706 -0.76855 23 -17.1458 1 -8.47094 5.591494
24 -12.8988 -0.00718 -0.76852 24 4.177243 1 22.84749 -22.433
25 -12.8955 -0.00679 -0.76861 25 -0.16728 1 22.89926 -16.7623
26 -12.9016 -0.00746 -0.76852 26 -19.673 1 18.04015 8.635882
27 -12.8862 -0.00706 -0.76798 27 -15.6244 1 4.188865 3.526967
28 -12.8968 -0.00733 -0.76866 28 1.314187 1 6.697482 -18.5519
29 -12.9046 -0.00789 -0.76873 29 -16.3689 1 21.17424 4.289092
30 -12.8977 -0.00755 -0.76863 30 0.212411 1 -10.1659 -16.9567
31 -12.8801 -0.00541 -0.76585 31 -31.5723 1 16.79394 24.28852
32 -12.8854 -0.00808 -0.76895 32 -3.70017 1 -8.87477 -11.8519
33 -12.8933 -0.00764 -0.76855 33 -2.77308 1 -13.1469 -13.0371
34 -12.8822 -0.00691 -0.77025 34 6.044851 1 10.36036 -24.6655
35 -12.8908 -0.00792 -0.76896 35 1.806469 1 -13.806 -18.9711
36 -12.8968 -0.00726 -0.76835 36 -28.8111 1 -5.45329 20.76397
37 -12.9054 -0.00734 -0.7682 37 -7.12892 1 4.695252 -7.56435
38 -12.8921 -0.00804 -0.769 38 5.893279 1 -22.3243 -24.1947
39 -12.9072 -0.00671 -0.76751 39 1.327315 1 -5.3694 -18.4995
40 -12.8935 -0.00774 -0.76851 40 -7.55843 1 -24.7756 -6.69274
41 -12.9154 -0.00724 -0.76969 41 -27.4245 1 -3.94514 18.88763
42 -12.9043 -0.00691 -0.76836 42 -10.4408 1 -11.8846 -3.09937
43 -12.8836 -0.0081 -0.76782 43 -22.8348 1 -16.2603 13.13189
44 -12.8966 -0.0073 -0.76845 44 0.709695 1 16.85663 -17.8665
45 -12.8875 -0.00659 -0.76816 45 -12.552 1 16.5066 -0.57847
46 -12.9015 -0.00735 -0.76829 46 -6.22526 1 3.669664 -8.72475
47 -12.9116 -0.00513 -0.76702 47 0.183197 1 -23.6527 -16.9143
48 -12.8909 -0.00711 -0.768 48 -19.8141 1 2.351146 8.992732
49 -12.8935 -0.00702 -0.76808 49 -20.9697 1 9.233113 10.43034
50 -12.9054 -0.00782 -0.76749 50 4.18438 1 16.13809 -22.4316
PART B
for(k in 1:50)
{
for(j in 1:(49+k))
{ di[j,1]=0;
for(i in 1:3)
{
di[j,1]=di[j,1]+((A[j,-1][i]-B[k,i])^2)
}
di[j,1]=sqrt(di[j,1])
wi[j,1]=exp(-0.5*(di[j,1]^2)) # Exponential (Gaussian) weights in this case
C[j,]=sqrt(wi[j,1])*C[j,]
}
p[k,]=t(solve(t(C)%*%C)%*%t(C)%*%(sqrt(wi[,1])*A[,1]))
yc[1]=B[k,]%*%p[k,]
A=rbind(A,cbind(yc,t(B[k,])))
C=A[,-1]
}

PART C
for(k in 1:50)
{
for(j in 1:(49+k))
{ di[j,1]=0;
for(i in 1:3)
{
di[j,1]=di[j,1]+((A[j,-1][i]-B[k,i])^2)
}
di[j,1]=sqrt(di[j,1])
wi[j,1]=ifelse(di[j,1]!=0,1/di[j,1],10) # Inverse-distance weights; a zero distance gets weight 10
C[j,]=sqrt(wi[j,1])*C[j,]
}
p[k,]=t(solve(t(C)%*%C)%*%t(C)%*%(sqrt(wi[,1])*A[,1]))
yc[1]=B[k,]%*%p[k,]
A=rbind(A,cbind(yc,t(B[k,])))
C=A[,-1]
}
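The three parts differ only in the kernel that maps a distance to a weight. Side by side (the cap of 10 at zero distance in part (c) is taken from the code above):

```r
# Distance kernels used in parts (a), (b) and (c)
w_binary  <- function(d, h = 10) as.numeric(d <= h)  # (a) rectangular window
w_gauss   <- function(d) exp(-0.5 * d^2)             # (b) Gaussian decay
w_inverse <- function(d) ifelse(d != 0, 1 / d, 10)   # (c) inverse distance, capped at d = 0
```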

Following are the tables of predicted values of Y and the corresponding X for part (b) and part (c), in that order. They also contain the beta values estimated at each iteration.

When we compare all three sets of predicted y values with Problem3_Output_Test, we get no conclusive result: there is no clear relation between the predictions and the test output, and their behaviour is not the same.
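One way to make that comparison concrete is a mean squared error per weighting scheme. A sketch, where yhat stands for the 50 predictions from one of the three parts and ytest for Problem3_Output_Test (the exact object names are assumptions):

```r
# Sketch: mean squared prediction error against the held-out output
mse <- function(yhat, ytest) mean((yhat - ytest)^2)
```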
Y1 1 X1 X2 int coef1 coef2
1 -11.0169 1 -2.73974 -2.44132 1 -12.8864 -0.00788 -0.76807
2 -31.5888 1 -4.59807 24.25932 2 -12.9114 -0.00699 -0.76815
3 -22.1541 1 6.843273 12.01095 3 -12.9253 -0.00722 -0.7707
4 -21.4291 1 -0.11879 11.10112 4 -12.8874 -0.00681 -0.76764
5 2.36366 1 11.96018 -19.9026 5 -12.8998 -0.00724 -0.7684
6 -17.732 1 -24.9815 6.591471 6 -12.8811 -0.00638 -0.7698
7 -12.044 1 -3.56602 -1.0775 7 -12.8842 -0.00865 -0.76824
8 -8.16796 1 18.91049 -6.35829 8 -12.8979 -0.00728 -0.76838
9 -19.4152 1 15.49839 8.323533 9 -12.9033 -0.00795 -0.76839
10 -9.41117 1 15.99707 -4.70497 10 -12.9001 -0.00759 -0.76858
11 -29.9011 1 -8.69068 22.20691 11 -12.9023 -0.00776 -0.76841
12 5.801499 1 1.262911 -24.3567 12 -12.8986 -0.00733 -0.76851
13 -5.32086 1 14.76574 -10.0028 13 -12.8992 -0.0073 -0.76816
14 -21.0602 1 13.63978 10.48724 14 -12.8978 -0.00738 -0.76838
15 1.697607 1 23.69639 -19.2831 15 -12.899 -0.00747 -0.7685
16 -4.931 1 14.57483 -10.5533 16 -12.9057 -0.00848 -0.76773
17 -9.7767 1 -19.5045 -3.82212 17 -12.9126 -0.00841 -0.76792
18 4.489972 1 21.45933 -22.8551 18 -12.8852 -0.00874 -0.7687
19 -30.9338 1 -22.3379 23.57636 19 -12.8997 -0.00774 -0.76814
20 -28.121 1 -15.188 19.97886 20 -12.9204 -0.0061 -0.76983
21 6.135243 1 1.163332 -24.8172 21 -12.8931 -0.0077 -0.76805
22 -17.1447 1 -8.47094 5.591494 22 -12.9058 -0.00716 -0.76759
23 4.176798 1 22.84749 -22.433 23 -12.9077 -0.00713 -0.76856
24 -0.16783 1 22.89926 -16.7623 24 -12.8982 -0.00725 -0.76853
25 -19.6737 1 18.04015 8.635882 25 -12.8948 -0.00685 -0.76862
26 -15.6241 1 4.188865 3.526967 26 -12.9009 -0.00753 -0.76853
27 1.314761 1 6.697482 -18.5519 27 -12.8855 -0.00713 -0.768
28 -16.3697 1 21.17424 4.289092 28 -12.8962 -0.00739 -0.76868
29 0.214111 1 -10.1659 -16.9567 29 -12.904 -0.00796 -0.76875
30 -31.5732 1 16.79394 24.28852 30 -12.897 -0.00762 -0.76865
31 -3.69867 1 -8.87477 -11.8519 31 -12.8794 -0.00548 -0.76587
32 -2.77125 1 -13.1469 -13.0371 32 -12.8847 -0.00814 -0.76897
33 6.045303 1 10.36036 -24.6655 33 -12.8926 -0.00771 -0.76857
34 1.808479 1 -13.806 -18.9711 34 -12.8815 -0.00697 -0.77027
35 -28.8105 1 -5.45329 20.76397 35 -12.8901 -0.00798 -0.76898
36 -7.12842 1 4.695252 -7.56435 36 -12.8961 -0.00733 -0.76837
37 5.896081 1 -22.3243 -24.1947 37 -12.9048 -0.0074 -0.76822
38 1.32872 1 -5.3694 -18.4995 38 -12.8914 -0.00811 -0.76903
39 -7.55587 1 -24.7756 -6.69274 39 -12.9066 -0.00678 -0.76753
40 -27.4239 1 -3.94514 18.88763 40 -12.8928 -0.00781 -0.76853
41 -10.4392 1 -11.8846 -3.09937 41 -12.9147 -0.0073 -0.76971
42 -22.8332 1 -16.2603 13.13189 42 -12.9036 -0.00697 -0.76838
43 0.709581 1 16.85663 -17.8665 43 -12.883 -0.00817 -0.76783
44 -12.5524 1 16.5066 -0.57847 44 -12.896 -0.00737 -0.76846
45 -6.22466 1 3.669664 -8.72475 45 -12.8869 -0.00666 -0.76818
46 0.185906 1 -23.6527 -16.9143 46 -12.9008 -0.00742 -0.76831
47 -19.8137 1 2.351146 8.992732 47 -12.9109 -0.0052 -0.76704
48 -20.9698 1 9.233113 10.43034 48 -12.8903 -0.00717 -0.76802
49 4.184403 1 16.13809 -22.4316 49 -12.8929 -0.00709 -0.7681
50 -12.9047 -0.00789 -0.76751
Y1 1 X1 X2 int coef1 coef2
1 -14.4509 1 -14.1248 2.105721 1 -12.9171 -0.00595 -0.76834
2 -11.0213 1 -2.73974 -2.44132 2 -12.9154 -0.00687 -0.76816
3 -31.6174 1 -4.59807 24.25932 3 -12.9351 -0.00713 -0.77146
4 -22.1751 1 6.843273 12.01095 4 -12.8987 -0.00709 -0.76829
5 -21.4349 1 -0.11879 11.10112 5 -12.9037 -0.00719 -0.76858
6 2.364168 1 11.96018 -19.9026 6 -12.8834 -0.00631 -0.7699
7 -17.7385 1 -24.9815 6.591471 7 -12.8871 -0.00853 -0.76834
8 -12.0472 1 -3.56602 -1.0775 8 -12.9009 -0.00719 -0.76848
9 -8.16947 1 18.91049 -6.35829 9 -12.9061 -0.00791 -0.76848
10 -19.4182 1 15.49839 8.323533 10 -12.9029 -0.00755 -0.7687
11 -9.41249 1 15.99707 -4.70497 11 -12.905 -0.00771 -0.76851
12 -29.9094 1 -8.69068 22.20691 12 -12.9021 -0.00724 -0.76869
13 5.802515 1 1.262911 -24.3567 13 -12.9015 -0.00724 -0.76829
14 -5.32199 1 14.76574 -10.0028 14 -12.9006 -0.00733 -0.76847
15 -21.064 1 13.63978 10.48724 15 -12.9019 -0.00743 -0.76863
16 1.698943 1 23.69639 -19.2831 16 -12.9082 -0.00841 -0.76784
17 -4.93167 1 14.57483 -10.5533 17 -12.9152 -0.00835 -0.76803
18 -9.79402 1 -19.5045 -3.82212 18 -12.8924 -0.00822 -0.76868
19 4.49155 1 21.45933 -22.8551 19 -12.9024 -0.00765 -0.76824
20 -30.9506 1 -22.3379 23.57636 20 -12.9257 -0.00586 -0.77009
21 -28.1291 1 -15.188 19.97886 21 -12.8966 -0.00757 -0.76819
22 6.14013 1 1.163332 -24.8172 22 -12.9076 -0.0071 -0.76786
23 -17.1506 1 -8.47094 5.591494 23 -12.9117 -0.00699 -0.7687
24 4.179074 1 22.84749 -22.433 24 -12.9011 -0.00714 -0.76866
25 -0.16642 1 22.89926 -16.7623 25 -12.8978 -0.00675 -0.76874
26 -19.6773 1 18.04015 8.635882 26 -12.9043 -0.00747 -0.76869
27 -15.6277 1 4.188865 3.526967 27 -12.889 -0.00704 -0.76813
28 1.314787 1 6.697482 -18.5519 28 -12.8991 -0.00729 -0.7688
29 -16.372 1 21.17424 4.289092 29 -12.9072 -0.00788 -0.76889
30 0.212114 1 -10.1659 -16.9567 30 -12.9001 -0.00752 -0.76877
31 -31.5851 1 16.79394 24.28852 31 -12.8837 -0.00551 -0.76616
32 -3.70125 1 -8.87477 -11.8519 32 -12.8879 -0.00805 -0.7691
33 -2.77404 1 -13.1469 -13.0371 33 -12.8958 -0.00761 -0.7687
34 6.046802 1 10.36036 -24.6655 34 -12.8845 -0.00688 -0.77041
35 1.806828 1 -13.806 -18.9711 35 -12.8932 -0.0079 -0.76912
36 -28.8213 1 -5.45329 20.76397 36 -12.9009 -0.00723 -0.76863
37 -7.13023 1 4.695252 -7.56435 37 -12.908 -0.00731 -0.76836
38 5.894328 1 -22.3243 -24.1947 38 -12.8946 -0.00802 -0.76917
39 1.329286 1 -5.3694 -18.4995 39 -12.9092 -0.00672 -0.76772
40 -7.56095 1 -24.7756 -6.69274 40 -12.8961 -0.0077 -0.76867
41 -27.4357 1 -3.94514 18.88763 41 -12.9202 -0.0072 -0.77002
42 -10.4451 1 -11.8846 -3.09937 42 -12.908 -0.00681 -0.76851
43 -22.8417 1 -16.2603 13.13189 43 -12.8869 -0.00804 -0.76801
44 0.710565 1 16.85663 -17.8665 44 -12.8992 -0.00727 -0.76861
45 -12.5547 1 16.5066 -0.57847 45 -12.8905 -0.00659 -0.76833
46 -6.22688 1 3.669664 -8.72475 46 -12.9045 -0.00733 -0.76845
47 0.182811 1 -23.6527 -16.9143 47 -12.9143 -0.0051 -0.76719
48 -19.8191 1 2.351146 8.992732 48 -12.8942 -0.00709 -0.7682
49 -20.9744 1 9.233113 10.43034 49 -12.8965 -0.00701 -0.76827
50 4.186507 1 16.13809 -22.4316 50 -12.9079 -0.00779 -0.76767
