



What is Probability & Mathematical Statistics? It is the mathematical machinery necessary to answer questions about uncertain events. Scientists, engineers, and others need to make their results and findings about these uncertain events precise...

Random experiment. A random experiment is an experiment, trial, or observation that can be repeated numerous times under the same conditions... Its outcome must in no way be affected by any previous outcome and cannot be predicted with certainty; that is, it is uncertain (we don't know ahead of time what the answer will be) and repeatable (ideally). The sample space is the set containing all possible outcomes of a random experiment, often called S. (In set theory this is usually called U, but it is the same thing.)

Discrete probability: finite probability. This is where there are only finitely many possible outcomes. Moreover, many of the examples we consider will be ones where all the outcomes are equally likely, that is, uniform finite probability. One example is tossing a fair cubical die: it comes up with one of the six outcomes 1, 2, 3, 4, 5, or 6, each with the same probability. Another is flipping a fair coin: it comes up with one of the two outcomes H or T.
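The die and coin examples above can be sketched in a few lines of Python. This is a minimal illustration (the function name `uniform_measure` is ours, not the text's): each of the n equally likely outcomes gets the same measure 1/n, and the measures sum to 1.

```python
from fractions import Fraction

# Uniform finite probability: assign measure 1/n to each of n outcomes.
def uniform_measure(outcomes):
    n = len(outcomes)
    return {outcome: Fraction(1, n) for outcome in outcomes}

die = uniform_measure([1, 2, 3, 4, 5, 6])
coin = uniform_measure(["H", "T"])

print(die[3])             # 1/6
print(coin["H"])          # 1/2
print(sum(die.values()))  # 1 -- the measures of all outcomes sum to 1
```

Using `Fraction` keeps the probabilities exact, matching the 1/6 and 1/2 in the text rather than floating-point approximations.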

Terminology and notation. We'll call the tossing of a die a trial or an experiment. Strictly, we should probably reserve the word experiment for a series of trials, but when there's only one trial under consideration, the experiment is just the trial. The possible values that can occur when a trial is performed are called the outcomes of the trial. Following the notation in the text, we'll generally denote the outcomes ω1, ω2, ..., ωn, where n is the number of possible outcomes. With a cubical die, the n = 6 outcomes are ω1 = 1, ω2 = 2, ω3 = 3, ω4 = 4, ω5 = 5, and ω6 = 6. With a coin, the n = 2 outcomes are ω1 = H, ω2 = T. Usually, we'll introduce a symbol for the outcome of an experiment, such as X, and call this symbol a random variable. So, if you flip a coin and an H comes up, then X = ω1 = H, but if a T comes up, then X = ω2 = T.

Let's take the case of tossing a fair die. When it's tossed, all six outcomes are equally likely, so they should all have the same probability. We'll want the sum of all the probabilities of the outcomes to be 1, so for a fair die, each outcome will have probability 1/6. We'll use a couple of notations to express the assumption that each of the probabilities of the outcomes of a fair die is 1/6. Sometimes, to emphasize that we have a measure for each outcome, we'll use the notation m(1) = m(2) = m(3) = m(4) = m(5) = m(6) = 1/6. But more often we use a random variable and a probability notation: P(X=1) = P(X=2) = P(X=3) = P(X=4) = P(X=5) = P(X=6) = 1/6. The probability notation is more convenient, since we can use it to express probabilities other than X having a particular value. For example, P(X ≥ 5) expresses the probability that the die comes up with a value greater than or equal to 5, that is, either 5 or 6. Since each of 5 and 6 has probability 1/6, P(X ≥ 5) = 2/6 = 1/3. Thus, the probability notation allows us to express probabilities of sets of outcomes, not just single outcomes.

The set of all possible outcomes is called a sample space. For tossing a fair die, the sample space is Ω = {1, 2, 3, 4, 5, 6}. But a sample space is more than just a set; it also has a measure for each of its elements. That's the function m : Ω → [0, 1] which assigns to each outcome ωi its measure m(ωi). This function m is called a distribution function or a probability mass function. Sometimes we come across a different symbol for it, like f. From m, we can derive probabilities for each subset E of the sample space Ω.

A subset E of the sample space, that is, a set E of some of the outcomes, is called an event. The probability P(E) of an event E is defined as the sum of the measures of all the outcomes in the event. So, for example, the probability of the event E that the outcome is greater than or equal to 5 is P(E) = P(X ≥ 5) = m(5) + m(6) = 1/6 + 1/6 = 1/3.
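The definition above, that P(E) is the sum of the measures of the outcomes in E, can be sketched directly; here m is the distribution function of the fair die:

```python
from fractions import Fraction

# Distribution function (probability mass function) of a fair die:
# m(i) = 1/6 for each outcome i in the sample space {1, ..., 6}.
m = {i: Fraction(1, 6) for i in range(1, 7)}

# The probability of an event E (a subset of the sample space) is the
# sum of the measures of the outcomes in E.
def P(event):
    return sum(m[outcome] for outcome in event)

E = {outcome for outcome in m if outcome >= 5}  # the event "X >= 5"
print(P(E))        # 1/3
print(P(set(m)))   # 1 -- the whole sample space has probability 1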

Symmetry leads to uniform probabilities. So far, our two examples are uniform. A fair coin has two sides, and we assign the same probability to each because of the physical symmetry of the coin. Likewise, we assign the same probability to each of the six sides of a fair die because of the symmetry of the die. There are lots of other situations where symmetry implies that the probabilities of the outcomes should be equal. For instance, some of the examples we'll look at involve choosing balls from urns. Say an urn has 5 balls in it, and you take out one of the balls at random. There are five possibilities, and each ball has the same probability of being chosen. Thus, we get a uniform probability where the probability of choosing any particular ball is 1/5. The symmetry of the situation is required, however, to conclude that the probability is uniform. For instance, if you throw a ball at the moon, there are two outcomes: either it hits the moon or it doesn't. But the two outcomes are not symmetric, and it would be foolish to assume that the probability that your ball hits the moon is 1/2.

The frequency concept of probability. When there's no symmetry, we can't assume the probability is uniform. Suppose you bend a coin. It's no longer symmetric, so we can't assume that H and T have the same probability. But you can get some idea of what the probability is by performing the experiment of flipping the coin repeatedly. Suppose you flip the coin 1000 times and you get 412 heads and 588 tails. It would be reasonable to conclude that P(H) is near 0.412 and P(T) is near 0.588. However, it would be foolish to believe those are the exact probabilities. In Probability, we assume that the basic probabilities we use are known. But in some of the situations we'll consider, we won't know the basic probabilities, and the problem will be to determine by experiment what those probabilities are; that is the province of Statistics. Thus, the conclusion in the last paragraph that P(H) is near 0.412 is a statement in Statistics rather than a statement in Probability.
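The bent-coin idea can be simulated. This sketch assumes a hypothetical true bias p_true = 0.412 (borrowing the number from the example above) and estimates P(H) by the relative frequency of heads over 1000 flips; the estimate comes out near, but almost surely not exactly equal to, the true probability.

```python
import random

# Simulate a bent coin with an assumed (hypothetical) bias and estimate
# P(H) by the relative frequency of heads over many flips.
random.seed(1)  # fixed seed so the sketch is reproducible
p_true = 0.412
flips = ["H" if random.random() < p_true else "T" for _ in range(1000)]
estimate = flips.count("H") / len(flips)
print(estimate)  # near 0.412, but not exactly the true value
```

Repeating the simulation with more flips makes the estimate tend to cluster closer to p_true; quantifying how close, and with what confidence, is exactly the kind of question Statistics addresses.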
