
KEY FORMULAS
Prem S. Mann • Introductory Statistics, Fifth Edition

CHAPTER 2 • ORGANIZING DATA

• Relative frequency of a class = f / Σf
• Percentage of a class = (Relative frequency) × 100
• Class midpoint or mark = (Upper limit + Lower limit) / 2
• Class width = Upper boundary − Lower boundary
• Cumulative relative frequency = Cumulative frequency / Total observations in the data set
• Cumulative percentage = (Cumulative relative frequency) × 100
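The frequency-distribution measures above can be computed directly. A minimal sketch; the class labels, frequencies, and limits below are hypothetical:

```python
freqs = {"A": 8, "B": 12, "C": 20}   # hypothetical class frequencies
total = sum(freqs.values())          # Σf = 40

rel = {c: f / total for c, f in freqs.items()}   # relative frequency = f / Σf
pct = {c: r * 100 for c, r in rel.items()}       # percentage = relative frequency × 100

midpoint = (10 + 20) / 2     # class midpoint = (upper limit + lower limit) / 2 → 15.0
width = 20.5 - 10.5          # class width = upper boundary − lower boundary → 10.0
```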

CHAPTER 3 • NUMERICAL DESCRIPTIVE MEASURES

• Mean for ungrouped data: μ = Σx / N and x̄ = Σx / n
• Mean for grouped data: μ = Σmf / N and x̄ = Σmf / n, where m is the midpoint and f is the frequency of a class
• Median for ungrouped data = Value of the ((n + 1)/2)th term in a ranked data set
• Range = Largest value − Smallest value
• Variance for ungrouped data: σ² = (Σx² − (Σx)²/N) / N and s² = (Σx² − (Σx)²/n) / (n − 1), where σ² is the population variance and s² is the sample variance
• Standard deviation for ungrouped data: σ = √[(Σx² − (Σx)²/N) / N] and s = √[(Σx² − (Σx)²/n) / (n − 1)], where σ and s are the population and sample standard deviations, respectively
• Variance for grouped data: σ² = (Σm²f − (Σmf)²/N) / N and s² = (Σm²f − (Σmf)²/n) / (n − 1)
• Standard deviation for grouped data: σ = √[(Σm²f − (Σmf)²/N) / N] and s = √[(Σm²f − (Σmf)²/n) / (n − 1)]
• Chebyshev's theorem: For any number k greater than 1, at least (1 − 1/k²) of the values for any distribution lie within k standard deviations of the mean.
• Empirical rule: For a specific bell-shaped distribution, about 68% of the observations fall in the interval (μ − σ) to (μ + σ), about 95% fall in the interval (μ − 2σ) to (μ + 2σ), and about 99.7% fall in the interval (μ − 3σ) to (μ + 3σ).
• Interquartile range: IQR = Q₃ − Q₁, where Q₃ is the third quartile and Q₁ is the first quartile
• The kth percentile: Pₖ = Value of the (kn/100)th term in a ranked data set
• Percentile rank of xᵢ = (Number of values less than xᵢ / Total number of values in the data set) × 100
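The ungrouped-data mean and variance above can be sketched using the card's shortcut form of s²; the data values are hypothetical:

```python
import math

def sample_mean(xs):
    # x_bar = Σx / n
    return sum(xs) / len(xs)

def sample_variance(xs):
    # s² = (Σx² − (Σx)²/n) / (n − 1), the shortcut form from the card
    n = len(xs)
    return (sum(x * x for x in xs) - sum(xs) ** 2 / n) / (n - 1)

data = [2, 4, 4, 4, 5, 5, 7, 9]
print(sample_mean(data))                   # 5.0
print(math.sqrt(sample_variance(data)))    # s, the sample standard deviation
```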

CHAPTER 4 • PROBABILITY

• Classical probability rule for a simple event: P(Eᵢ) = 1 / (Total number of outcomes)
• Classical probability rule for a compound event: P(A) = (Number of outcomes in A) / (Total number of outcomes)
• Relative frequency as an approximation of probability: P(A) = f / n
• Conditional probability of an event: P(A | B) = P(A and B) / P(B) and P(B | A) = P(A and B) / P(A)
• Condition for independence of events: P(A) = P(A | B) and/or P(B) = P(B | A)
• For complementary events: P(A) + P(Ā) = 1
• Multiplication rule for dependent events: P(A and B) = P(A) P(B | A)
• Multiplication rule for independent events: P(A and B) = P(A) P(B)
• Joint probability of two mutually exclusive events: P(A and B) = 0
• Addition rule for mutually nonexclusive events: P(A or B) = P(A) + P(B) − P(A and B)
• Addition rule for mutually exclusive events: P(A or B) = P(A) + P(B)
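A quick numeric check of the multiplication, addition, and conditional-probability rules for two independent events; the probabilities 0.5 and 0.3 are hypothetical:

```python
p_a, p_b = 0.5, 0.3

p_and = p_a * p_b            # multiplication rule for independent events
p_or = p_a + p_b - p_and     # addition rule for mutually nonexclusive events
p_a_given_b = p_and / p_b    # conditional probability P(A | B)

# For independent events, P(A | B) should equal P(A)
print(p_and, p_or, p_a_given_b)   # 0.15 0.65 0.5
```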

CHAPTER 5 • DISCRETE RANDOM VARIABLES AND THEIR PROBABILITY DISTRIBUTIONS

• Mean of a discrete random variable x: μ = Σx P(x)
• Standard deviation of a discrete random variable x: σ = √(Σx² P(x) − μ²)
• n factorial: n! = n(n − 1)(n − 2) … 3 · 2 · 1
• Number of combinations of n items selected x at a time: ₙCₓ = n! / (x!(n − x)!)
• Binomial probability formula: P(x) = ₙCₓ pˣ qⁿ⁻ˣ
• Mean and standard deviation of the binomial distribution: μ = np and σ = √(npq)
• Hypergeometric probability formula: P(x) = (ᵣCₓ · ₙ₋ᵣCₙ₋ₓ) / ₙCₙ, i.e., P(x) = (rCx · N−rCn−x) / NCn
• Poisson probability formula: P(x) = λˣ e^(−λ) / x!
• Mean, variance, and standard deviation of the Poisson probability distribution: μ = λ, σ² = λ, and σ = √λ

CHAPTER 6 • CONTINUOUS RANDOM VARIABLES AND THE NORMAL DISTRIBUTION

• z value for an x value: z = (x − μ) / σ
• Value of x when μ, σ, and z are known: x = μ + zσ
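The binomial formulas above can be sketched with the standard library; n = 10 and p = 0.5 are hypothetical parameters:

```python
from math import comb, sqrt

def binom_pmf(n, p, x):
    # P(x) = nCx * p^x * q^(n-x)
    q = 1 - p
    return comb(n, x) * p ** x * q ** (n - x)

n, p = 10, 0.5
mu = n * p                       # μ = np
sigma = sqrt(n * p * (1 - p))    # σ = √(npq)
print(binom_pmf(n, p, 5))        # 0.24609375
```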

CHAPTER 7 • SAMPLING DISTRIBUTIONS

• Mean of x̄: μ_x̄ = μ
• Standard deviation of x̄ when n/N ≤ .05: σ_x̄ = σ / √n
• z value for x̄: z = (x̄ − μ) / σ_x̄
• Population proportion: p = X / N
• Sample proportion: p̂ = x / n
• Mean of p̂: μ_p̂ = p
• Standard deviation of p̂ when n/N ≤ .05: σ_p̂ = √(pq / n)
• z value for p̂: z = (p̂ − p) / σ_p̂

CHAPTER 8 • ESTIMATION OF THE MEAN AND PROPORTION

• Margin of error for the point estimation of μ: ±1.96σ_x̄ or ±1.96s_x̄, where σ_x̄ = σ / √n and s_x̄ = s / √n
• Confidence interval for μ for a large sample: x̄ ± zσ_x̄ if σ is known; x̄ ± z s_x̄ if σ is not known
• Confidence interval for μ for a small sample: x̄ ± t s_x̄, where s_x̄ = s / √n
• Margin of error for the point estimation of p: ±1.96s_p̂, where s_p̂ = √(p̂q̂ / n)
• Confidence interval for p for a large sample: p̂ ± z s_p̂, where s_p̂ = √(p̂q̂ / n)
• Maximum error of the estimate for μ: E = zσ_x̄ or z s_x̄
• Determining sample size for estimating μ: n = z²σ² / E²
• Maximum error of the estimate for p: E = z s_p̂, where s_p̂ = √(p̂q̂ / n)
• Determining sample size for estimating p: n = z²pq / E²
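The large-sample confidence interval for μ above can be sketched as follows; the summary statistics (x̄ = 50, s = 10, n = 100) are hypothetical:

```python
from math import sqrt

def ci_mean_large(xbar, s, n, z=1.96):
    # x̄ ± z * s/√n : large-sample CI for μ when σ is not known
    sx = s / sqrt(n)
    return xbar - z * sx, xbar + z * sx

lo, hi = ci_mean_large(xbar=50.0, s=10.0, n=100)
print(lo, hi)   # an interval of width 2 * 1.96 * 1.0 around 50
```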

CHAPTER 9 • HYPOTHESIS TESTS ABOUT THE MEAN AND PROPORTION

• Test statistic z for a test of hypothesis about μ for a large sample: z = (x̄ − μ) / σ_x̄ if σ is known, where σ_x̄ = σ / √n; or z = (x̄ − μ) / s_x̄ if σ is not known, where s_x̄ = s / √n
• Test statistic for a test of hypothesis about μ for a small sample: t = (x̄ − μ) / s_x̄, where s_x̄ = s / √n
• Test statistic for a test of hypothesis about p for a large sample: z = (p̂ − p) / σ_p̂, where σ_p̂ = √(pq / n)

CHAPTER 10 • ESTIMATION AND HYPOTHESIS TESTING: TWO POPULATIONS

• Mean of the sampling distribution of x̄₁ − x̄₂: μ_{x̄₁−x̄₂} = μ₁ − μ₂
• Confidence interval for μ₁ − μ₂ for two large and independent samples: (x̄₁ − x̄₂) ± zσ_{x̄₁−x̄₂} if σ₁ and σ₂ are known, or (x̄₁ − x̄₂) ± z s_{x̄₁−x̄₂} if σ₁ and σ₂ are not known, where σ_{x̄₁−x̄₂} = √(σ₁²/n₁ + σ₂²/n₂) and s_{x̄₁−x̄₂} = √(s₁²/n₁ + s₂²/n₂)
• Test statistic for a test of hypothesis about μ₁ − μ₂ for two large and independent samples: z = [(x̄₁ − x̄₂) − (μ₁ − μ₂)] / σ_{x̄₁−x̄₂}; if σ₁ and σ₂ are not known, replace σ_{x̄₁−x̄₂} by its point estimator s_{x̄₁−x̄₂}
• For two small and independent samples taken from two populations with equal standard deviations:
  Pooled standard deviation: s_p = √[((n₁ − 1)s₁² + (n₂ − 1)s₂²) / (n₁ + n₂ − 2)]
  Estimate of the standard deviation of x̄₁ − x̄₂: s_{x̄₁−x̄₂} = s_p √(1/n₁ + 1/n₂)
  Confidence interval for μ₁ − μ₂: (x̄₁ − x̄₂) ± t s_{x̄₁−x̄₂}
  Test statistic: t = [(x̄₁ − x̄₂) − (μ₁ − μ₂)] / s_{x̄₁−x̄₂}
• For two small and independent samples selected from two populations with unequal standard deviations:
  Degrees of freedom: df = (s₁²/n₁ + s₂²/n₂)² / [(s₁²/n₁)² / (n₁ − 1) + (s₂²/n₂)² / (n₂ − 1)]
  Estimate of the standard deviation of x̄₁ − x̄₂: s_{x̄₁−x̄₂} = √(s₁²/n₁ + s₂²/n₂)
  Confidence interval for μ₁ − μ₂: (x̄₁ − x̄₂) ± t s_{x̄₁−x̄₂}
  Test statistic: t = [(x̄₁ − x̄₂) − (μ₁ − μ₂)] / s_{x̄₁−x̄₂}
• For two paired or matched samples:
  Sample mean for paired differences: d̄ = Σd / n
  Sample standard deviation for paired differences: s_d = √[(Σd² − (Σd)²/n) / (n − 1)]
  Mean and standard deviation of the sampling distribution of d̄: μ_d̄ = μ_d and s_d̄ = s_d / √n
  Confidence interval for μ_d: d̄ ± t s_d̄, where s_d̄ = s_d / √n
  Test statistic for a test of hypothesis about μ_d: t = (d̄ − μ_d) / s_d̄
• For two large and independent samples, confidence interval for p₁ − p₂: (p̂₁ − p̂₂) ± z s_{p̂₁−p̂₂}, where s_{p̂₁−p̂₂} = √(p̂₁q̂₁/n₁ + p̂₂q̂₂/n₂)
• For two large and independent samples, for a test of hypothesis about p₁ − p₂ with H₀: p₁ − p₂ = 0:
  Pooled sample proportion: p̄ = (x₁ + x₂) / (n₁ + n₂) or (n₁p̂₁ + n₂p̂₂) / (n₁ + n₂)
  Estimate of the standard deviation of p̂₁ − p̂₂: s_{p̂₁−p̂₂} = √[p̄q̄(1/n₁ + 1/n₂)]
  Test statistic: z = [(p̂₁ − p̂₂) − (p₁ − p₂)] / s_{p̂₁−p̂₂}

CHAPTER 11 • CHI-SQUARE TESTS

• Expected frequency for a category for a goodness-of-fit test: E = np
• Degrees of freedom for a goodness-of-fit test: df = k − 1, where k is the number of categories
• Expected frequency for a cell for an independence or homogeneity test: E = (Row total)(Column total) / Sample size
• Degrees of freedom for a test of independence or homogeneity: df = (R − 1)(C − 1), where R and C are the total number of rows and columns, respectively, in the contingency table
• Test statistic for a goodness-of-fit test and a test of independence or homogeneity: χ² = Σ(O − E)² / E
• Confidence interval for the population variance σ²: (n − 1)s² / χ²_{α/2} to (n − 1)s² / χ²_{1−α/2}
• Test statistic for a test of hypothesis about σ²: χ² = (n − 1)s² / σ²
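The pooled two-proportion z test above can be sketched directly from the card's formulas; the counts (40 of 100 vs. 30 of 100) are hypothetical:

```python
from math import sqrt

def two_prop_z(x1, n1, x2, n2):
    # H0: p1 − p2 = 0. Pooled p̄ = (x1 + x2)/(n1 + n2), q̄ = 1 − p̄
    p1, p2 = x1 / n1, x2 / n2
    pbar = (x1 + x2) / (n1 + n2)
    sp = sqrt(pbar * (1 - pbar) * (1 / n1 + 1 / n2))   # s for p̂1 − p̂2
    return (p1 - p2) / sp

z = two_prop_z(40, 100, 30, 100)
print(z)   # about 1.48, not significant at alpha = .05 (two-tailed)
```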


CHAPTER 12 • ANALYSIS OF VARIANCE

Let:
  k = the number of different samples (or treatments)
  nᵢ = the size of sample i
  Tᵢ = the sum of the values in sample i
  n = the number of values in all samples = n₁ + n₂ + n₃ + …
  Σx = the sum of the values in all samples = T₁ + T₂ + T₃ + …
  Σx² = the sum of the squares of the values in all samples

• For the F distribution: degrees of freedom for the numerator = k − 1; degrees of freedom for the denominator = n − k
• Between-samples sum of squares: SSB = (T₁²/n₁ + T₂²/n₂ + T₃²/n₃ + …) − (Σx)² / n
• Within-samples sum of squares: SSW = Σx² − (T₁²/n₁ + T₂²/n₂ + T₃²/n₃ + …)
• Total sum of squares: SST = SSB + SSW = Σx² − (Σx)² / n
• Variance between samples: MSB = SSB / (k − 1)
• Variance within samples: MSW = SSW / (n − k)
• Test statistic for a one-way ANOVA test: F = MSB / MSW
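The one-way ANOVA F statistic above can be computed with the card's shortcut sums of squares; the three small samples are hypothetical toy data:

```python
def one_way_f(samples):
    # SSB = Σ(Ti²/ni) − (Σx)²/n, SSW = Σx² − Σ(Ti²/ni): shortcut forms from the card
    k = len(samples)
    n = sum(len(s) for s in samples)
    total = sum(sum(s) for s in samples)                    # Σx
    sum_sq = sum(x * x for s in samples for x in s)         # Σx²
    t_term = sum(sum(s) ** 2 / len(s) for s in samples)     # Σ(Ti²/ni)
    ssb = t_term - total ** 2 / n
    ssw = sum_sq - t_term
    msb = ssb / (k - 1)          # variance between samples
    msw = ssw / (n - k)          # variance within samples
    return msb / msw

f = one_way_f([[1, 2, 3], [2, 3, 4], [4, 5, 6]])
print(f)   # 7.0
```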

CHAPTER 13 • SIMPLE LINEAR REGRESSION

• Simple linear regression model: y = A + Bx + ε
• Estimated simple linear regression model: ŷ = a + bx
• Sum of squares of xy, xx, and yy: SSxy = Σxy − (Σx)(Σy)/n; SSxx = Σx² − (Σx)²/n; SSyy = Σy² − (Σy)²/n
• Least squares estimates of A and B: b = SSxy / SSxx and a = ȳ − b x̄
• Standard deviation of the sample errors: sₑ = √[(SSyy − b · SSxy) / (n − 2)]
• Error sum of squares: SSE = Σe² = Σ(y − ŷ)²
• Total sum of squares: SST = Σy² − (Σy)² / n
• Regression sum of squares: SSR = SST − SSE
• Coefficient of determination: r² = b · SSxy / SSyy
• Confidence interval for B: b ± t s_b, where s_b = sₑ / √SSxx
• Test statistic for a test of hypothesis about B: t = (b − B) / s_b
• Linear correlation coefficient: r = SSxy / √(SSxx · SSyy)
• Test statistic for a test of hypothesis about ρ: t = r √[(n − 2) / (1 − r²)]
• Confidence interval for μ_{y|x}: ŷ ± t s_ŷₘ, where s_ŷₘ = sₑ √(1/n + (x₀ − x̄)² / SSxx)
• Prediction interval for y_p: ŷ ± t s_ŷₚ, where s_ŷₚ = sₑ √(1 + 1/n + (x₀ − x̄)² / SSxx)
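The least squares estimates above follow directly from the shortcut sums of squares; the perfectly linear toy data (y = 2x + 1) are hypothetical:

```python
def least_squares(xs, ys):
    # b = SSxy / SSxx and a = ȳ − b·x̄, using the card's shortcut forms
    n = len(xs)
    ssxy = sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys) / n
    ssxx = sum(x * x for x in xs) - sum(xs) ** 2 / n
    b = ssxy / ssxx
    a = sum(ys) / n - b * sum(xs) / n
    return a, b

a, b = least_squares([1, 2, 3, 4], [3, 5, 7, 9])
print(a, b)   # 1.0 2.0
```

Since the toy data lie exactly on y = 2x + 1, SSE = 0 and r² = 1 here; with real data the residual formulas above quantify the scatter around ŷ.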

CHAPTER 14 • NONPARAMETRIC METHODS

• Test statistic for a sign test about the population proportion for a large sample: z = [(X ± .5) − μ] / σ, where μ = np and σ = √(npq). Here, use (X + .5) if X < n/2 and (X − .5) if X > n/2.
• Test statistic for a sign test about the median for a large sample and the sign test about the median difference between paired data for a large sample: z = [(X ± .5) − μ] / σ, where μ = np, p = .5, and σ = √(npq). Let X₁ be the number of plus signs and X₂ the number of minus signs. Then, if the test is two-tailed, either of the two values can be assigned to X; if the test is left-tailed, X = smaller of the values of X₁ and X₂; if the test is right-tailed, X = larger of the values of X₁ and X₂. Also, we use (X + .5) if X < n/2 and (X − .5) if X > n/2.
• Test statistic for the Wilcoxon signed-rank test for a large sample: z = (T − μ_T) / σ_T, where μ_T = n(n + 1)/4 and σ_T = √[n(n + 1)(2n + 1) / 24]
• Test statistic for the Wilcoxon rank sum test for large and independent samples: z = (T − μ_T) / σ_T, where μ_T = n₁(n₁ + n₂ + 1)/2 and σ_T = √[n₁n₂(n₁ + n₂ + 1) / 12]
• Test statistic for the Kruskal-Wallis test: H = [12 / (n(n + 1))] (R₁²/n₁ + R₂²/n₂ + … + Rₖ²/nₖ) − 3(n + 1), where nᵢ = size of the ith sample, n = n₁ + n₂ + … + nₖ, k = number of samples, and Rᵢ = sum of the ranks for the ith sample
• Test statistic for the Spearman rho rank correlation coefficient test: r_s = 1 − 6Σd² / [n(n² − 1)], where dᵢ = difference in ranks of xᵢ and yᵢ
• Test statistic for the runs test for randomness for a large sample: z = (R − μ_R) / σ_R, where R is the number of runs in the sequence, μ_R = 2n₁n₂ / (n₁ + n₂) + 1, and σ_R = √[2n₁n₂(2n₁n₂ − n₁ − n₂) / ((n₁ + n₂)²(n₁ + n₂ − 1))]

Table VII • Standard Normal Distribution Table: the entries give the areas under the standard normal curve from 0 to z, for z = 0.00 to 3.09. This table is the same as Table VII of Appendix C.

Table VIII • The t Distribution Table: the entries give the critical values of t for the specified number of degrees of freedom (df = 1 to 40, plus the large-sample row) and areas in the right tail (.10, .05, .025, .01, .005, .001). This table is an abbreviated version of Table VIII in Appendix C; for degrees of freedom from 41 to 70, use Table VIII of Appendix C.
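The Spearman rho formula above is simple to evaluate once the ranks are in hand; the two rank sequences below are hypothetical:

```python
def spearman_rho(rank_x, rank_y):
    # r_s = 1 − 6Σd² / (n(n² − 1)), where d is the difference in ranks
    n = len(rank_x)
    d2 = sum((rx - ry) ** 2 for rx, ry in zip(rank_x, rank_y))
    return 1 - 6 * d2 / (n * (n * n - 1))

rho = spearman_rho([1, 2, 3, 4, 5], [2, 1, 4, 3, 5])
print(rho)   # 0.8
```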
