Abstract
Our survey sought to identify whether there was any correlation between console and video game
preference, and which groups showed significant engagement with gaming. We compared data
gathered with two different sampling methods (SRS vs. biased) to see whether the difference in
approach skewed the responses. We learned that taking a simple random sample significantly
limited the chance of errors in our data collection compared to the biased version. In the end we found that
Hypothesis
Our hypothesis was based on social stigma: we thought that boys would own more video
games than girls and play them more often each week. Also, since we developed the survey
around the time the popular shooting game Black Ops came out, we expected to see a rise
in playing time and increased use of the PlayStation 3 and Xbox, specifically among boys.
Non-biased survey:
Hello
This is Tina, Emily, Amrita, and Eamon. For one of our AP Stat projects, we need to survey
people, and you were one of the people who were randomly selected. It would be greatly appreciated
if you could take some time to fill out this survey. Thanks, and here’s the URL.
https://spreadsheets.google.com/viewform?
formkey=dE44cDNJT0prX0hKUTd1OTJRMkFBZmc6MQ
(As an error, we had to have everybody retake the survey, because we didn’t know how to sort
the data into biased and non-biased responses after sending everybody the same link.)
So sorry, but could you just answer the survey through here? Thanks.
Because we knew that Facebook was an easy way to reach many or all of the people we
were surveying, we sent out our surveys through Facebook, explained who we were, and
asked recipients to take five minutes to answer the survey questions. We all sent the
same message to everyone to avoid any possible source of bias.
Because we were gathering data in two different ways (biased and non-biased), we
realized that we could not send the same survey link to everyone, or else we would
not be able to tell which answers came from the biased survey and which came from
the non-biased survey. Therefore, we had to send out the survey questions themselves
and have everybody reply with their responses.
Also, after we collected the data, we saw that some people did not respond to the survey.
This could have been because they were not frequent Facebook users and did not see
the message, or because they thought the survey was junk mail or spam.
We also limited our communication to simple channels such as Facebook. In our SRS
sampling, some of the people we randomly selected did not have a Facebook account,
and we did not have their email addresses.
In those cases, we had to skip that person and randomly select another, because there
was no way to contact them.
Since all four members of the group were juniors and we all distributed the messages to our
friends, only the upperclassmen were represented.
This put pressure on our friends to take part in the survey and to answer the questions in a
way that matched how we saw them.
Since most of us sent messages to our friends in the biased survey, and our group has a ratio
of 3 girls to 1 boy, a significantly higher number of messages were sent to female students
than to male students.
We copied and pasted the survey into the biased message as well, so each person had to type
their responses rather than clicking a button.
We also let the recipients know they had been selected to participate in the biased part
of our survey.
Letting our friends know they were not randomly selected could have potentially
influenced their answers as well. Some might have purposely skewed the data and answered
inaccurately, since they knew we were not going to use their data as the main part of our
random sampling. Others might have felt pressured, wondering how we would use their biased data.
SRS Analysis
We took 3 girls and 3 boys from each grade. This even distribution across genders assured our
group that we were receiving equal amounts of input from both sides, preventing bias. Inside
the Facebook message there was a link that took recipients to a survey that submitted their
input anonymously, so they would not feel pressured by any interviewer bias. We used a
template to send the exact same message to each person, so the message content affected
all recipients of the survey invitation equally.
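The selection procedure described above (3 girls and 3 boys drawn at random from each grade) can be sketched in Python with the standard library's `random.sample`. The roster below is a placeholder, not our actual student list, and the names are made up for illustration:

```python
import random

# Hypothetical roster: in the real survey this would be the Gunn High
# School student list; the names here are placeholders for illustration.
roster = {
    (grade, gender): [f"{gender}_{grade}_{i}" for i in range(30)]
    for grade in (9, 10, 11, 12)
    for gender in ("girl", "boy")
}

random.seed(1)  # fixed seed so the example is reproducible

# Draw 3 girls and 3 boys at random, without replacement, from each grade.
sample = []
for grade in (9, 10, 11, 12):
    for gender in ("girl", "boy"):
        sample.extend(random.sample(roster[(grade, gender)], 3))

print(len(sample))  # 4 grades x 2 genders x 3 students = 24
```

Strictly speaking, drawing separately within each fixed grade/gender group like this is a stratified random sample, where each stratum is sampled by SRS.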
Purpose
The purpose of this survey was to see whether a person’s video game preference was influenced
by the type of console that person purchased, and to identify the video game habits of Gunn High
School students. In addition, we wanted to see how data from an unbiased survey compared to
data from a biased survey.
Conclusion
Open questions for the conclusion analysis:
Length vs. getting to the point: how much should there be?
How many graphs?
Do we need a box-and-whisker plot? (What kinds of graphs?)
Introduction
On November 9, video game publisher Activision released the long-awaited game “Call of Duty:
Black Ops.” This event coincided with the start of our AP Statistics survey project, resulting
in our survey about video games. We surveyed Gunn students from all grades in order to find
out their video game habits, such as their favorite consoles, most-played games, and frequency
of playing time. We looked to see whether the release of a popular video game would increase
students’ use of their PlayStation 3s or Xboxes.