An Evaluation of the Everyone's a Critic (or they should be) Information Literacy Workshop

Prepared by: Kayla Hennis, Jennifer Dodson, Julia Murphy, Gina Madera, and Josselyn Hernandez

Table of Contents
Executive Summary
Introduction
Evaluation Methods
Results
Conclusions
Project Cost
References
Appendix A
Appendix B
Appendix C
Appendix D

Executive Summary
Overview
This formative evaluation report responds to an RFP requesting a determination of the effectiveness of the workshop Everyone's a Critic (or they should be). The report provides the Everyone's a Critic (or they should be) team with feedback on instruction, assesses the effectiveness of the workshop, and reports student outcomes. The workshop was originally designed for first-year college students but could also be used as a professional development tool for computer teachers, media specialists, and academic librarians. This report describes the data sources, evaluation procedures, results, conclusions, a finalized budget, and the appendices.

Program Description
The workshop, Everyone's a Critic (or they should be), was designed to provide learners with the critical thinking and information literacy skills necessary to be informed researchers and consumers of information. The 50- to 60-minute workshop provided students with the skills needed to evaluate the quality of information and sources used in research, gave students tools to locate and define different types of sources, and helped students identify potential biases and decide whether those biases affected credibility.

Data Sources
Three data sources were used to gather information on the effectiveness of the Everyone's a Critic (or they should be) workshop. Eighteen students from an ENGL 1100 class at Columbus State Community College participated in the workshop on April 4, 2018. They were assessed during the workshop via pre and post tests and afterward through an electronic learner satisfaction survey. A subject matter expert and workshop instructor were interviewed and asked to share their observations and comments. In addition to the participants and subject matter expert, the ENGL 1100 professor was also interviewed. For this workshop, the SME and workshop instructor were the same person, Ms. Kayla Hennis.

Evaluation Procedures
Multiple data collection instruments were used. A pre test was administered before the workshop began, and a post test was given after the workshop. The course instructor provided observational data based on student participation, student performance, and course flow. A learner satisfaction survey was provided to the participants after the workshop. The evaluators included their expert judgment and recommended changes for the workshop.

Results
Four types of data were used in the final analysis: pre test results, post test results, the learner satisfaction survey, and expert judgment. With 18 participants, the pre test also gauged how students viewed information literacy and which sources they thought were credible; only six participants knew which type of source is considered scholarly. Of the 18 participants, only 17 students answered the post test questions. Of the 13 questions, ten were answered correctly by at least 76% of students, indicating a growth in knowledge after participating in the workshop. The learner satisfaction survey drew eight student responses. Of those eight, 75% strongly agreed that information literacy is an important part of the curriculum. Fifty percent of the participants noted that the presentation helped them to evaluate the quality of sources. Overall, the results of the 18 participants showed adequate growth and understanding, which is an indication of the importance of the content presented in the workshop.

Introduction
Overview
The Association of College and Research Libraries (2015) defines information literacy as "the set of integrated abilities encompassing the reflective discovery of information, the understanding of how information is produced and valued, and the use of information in creating new knowledge and participating ethically in communities of learning" (p. 12).

As a Reference and Instruction Librarian, Ms. Kayla Hennis' primary responsibility is to provide information literacy and research instruction to a variety of courses. In her experience, many students entering college are unable to critically assess whether an online or print resource could be considered credible. According to Smith et al. (2013), learners entering two- or four-year colleges are unable to effectively and efficiently locate, access, and use information. Learners also lack the knowledge of how to locate, access, and evaluate credible sources in a variety of formats, including being able to differentiate between opinion and fact-based literature and how the use of each can affect a source's credibility. The Everyone's a Critic (or they should be) workshop was created as a way to increase information literacy by teaching students about the various types of sources, how to efficiently locate and critically examine a source for credibility, and how to determine if a source is fact or opinion and how facts can be used to influence bias.

This final report details the data sources and evaluation procedures used to gather data, as well as the evaluators' professional recommendations for changes to the workshop.

Program Description
The focus of this evaluation is Everyone's a Critic (or they should be), a 50- to 60-minute, one-time workshop offered to adult learners, ages 17-33, entering a two-year or four-year undergraduate program. The four-topic workshop was originally intended to provide the participants with the skills to think critically about information and apply the learned knowledge to assignments and real-world scenarios. Topics include fact vs. opinion, credibility of sources, different source types, and locating sources through a basic internet search and a library database. Due to time constraints and instructor input, the fact vs. opinion section of the workshop was excluded. The instructor of the workshop was Ms. Kayla Hennis.

The workshop began with the six question pre test to gauge the students' understanding of information literacy. The first topic Ms. Hennis introduced was the different types of sources; she explained how anything can be a source depending on the context of the information and research needs. Ms. Hennis emphasized that authors write for many different reasons and stressed the importance of evaluating a source for reliability.

The second part of the workshop involved the information cycle. Information changes as new facts, data, and knowledge come to light, which means what was credible last year might not be credible today. Sources are published in a cycle, all focused on a key event or idea. The further removed from the event a source is, the more credible it tends to be, because the author and publisher have had time to verify facts and research, and the work may have undergone peer review.

Lastly, the workshop discussed the 5 W's of evaluating information and sources: Who, What, When, Where, and Why. Once this information was presented, Ms. Hennis provided some in-class practice, evaluating a website on traumatic brain injuries. Finally, Ms. Hennis introduced the different types of library databases and how to conduct an in-depth search using the Academic Search Complete database. The workshop concluded with the 13 question post test. After the workshop, Ms. Hennis sent the students a learner satisfaction survey via Google Forms.

Program Objectives
The main goal of the Everyone's a Critic (or they should be) workshop was to provide learners with the critical thinking skills needed to effectively locate and evaluate information. The purpose of this evaluation was to determine if the workshop was successful in reaching this goal. The questions that guided this evaluation are as follows:
1. Does the workshop provide learners with the tools to evaluate the quality of information and sources when researching?
2. Does the workshop provide learners with the tools to locate and define the different types of sources?
3. Does the workshop provide learners with the tools to identify potential biases in information and how they can affect credibility?

Evaluation Methods
This evaluation centered on answering the following questions: does the workshop provide learners with the tools to evaluate information and sources; to locate and define the different types of sources; and to identify potential bias in a source and how it can affect credibility? Various data sources and evaluation instruments were used to collect information.

Participants
Students
Eighteen students from a basic freshman English course (ENGL 1100) at Columbus State Community College attended the workshop. The prerequisite for ENGL 1100 is successful completion of high school or a GED equivalent. Their ages ranged from 18 to 30, with one student older than 40. Eight of the students were English as a Second Language learners. The students were given pre and post workshop assessments during the instruction session to measure knowledge retention and concept understanding. An electronic survey was sent after workshop completion to gauge learner satisfaction and attitude.

Course Instructor
The professor of the English course was interviewed prior to the start of the workshop and was
encouraged to share feedback afterwards. The course instructor has been teaching the basic
English composition course at Columbus State Community College for five years.

Librarian
An interview with the subject matter expert and workshop instructor was conducted after the
workshop to gather observations and expert judgment. The librarian has been teaching
information literacy courses for over five years and works with students one-on-one to develop
their critical thinking and research skills.

Data Sources
Participant Performance and Attitudes
Data was collected through two assessments and one student survey: a six question pre test, a 13 question post test, and a 15 question learner satisfaction survey. The pre and post test assessments were administered during the workshop through TurningPoint software embedded in the PowerPoint presentation. The learner satisfaction survey was administered two days after the workshop via a Google form. Appendix A includes all student pre and post test assessment questions and answers. Appendix C includes all Google form data from the learner satisfaction survey.

Interviews and Expert Reviews

Observations and expert judgment were gathered through interviews with the course instructor and subject matter expert (librarian). For this evaluation, the workshop instructor and the subject matter expert are the same person. All interviews lasted 10-20 minutes and were guided by the expert judgment rubric in Appendix B.

The evaluation team tallied correct versus incorrect answers from the pre and post test responses and produced comparative data to determine growth. The evaluation team also collected the post workshop learner satisfaction survey data in order to build recommendations for changes to the workshop based on feedback.
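As an illustrative sketch of the comparison described above (the actual tally was compiled by hand from the TurningPoint exports, so the question and responses below are invented placeholders, not real workshop data), the growth calculation might look like this:

```python
# Hypothetical sketch of the pre/post comparison used to determine growth.
# Responses are invented placeholders, not actual workshop data.

def percent_correct(responses, answer_key):
    """Return the share of responses matching the answer key, as 0-100."""
    if not responses:
        return 0.0
    correct = sum(1 for r in responses if r == answer_key)
    return 100.0 * correct / len(responses)

# One pre test question and its matching post test question.
pre_responses = ["journal", "website", "journal", "book"]
post_responses = ["journal", "journal", "journal", "website"]

pre_score = percent_correct(pre_responses, "journal")    # 50.0
post_score = percent_correct(post_responses, "journal")  # 75.0
growth = post_score - pre_score                          # 25.0 percentage points
```

Comparing the per-question percentages in this way yields the growth figures discussed in the Results section.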

Evaluation Procedures
The evaluation procedures focused on the data sources detailed above: participant performance and attitude, and interviews and expert reviews. The information gathered from the workshop participants (students), the ENGL 1100 professor (course instructor), the workshop instructor, and a subject matter expert (librarian) was used to answer the evaluation questions and make recommendations for improvement.

The original workshop was designed to focus on four topics: fact vs. opinion, credibility of sources, different source types, and locating sources through a basic internet search and a library database. Upon request of the ENGL 1100 instructor, the content of the workshop was changed to the topics mentioned in the program description: the different source types and the information cycle (how information changes as new knowledge comes to light); how to evaluate information and sources using the 5 W's of Evaluation (Who, What, When, Where, and Why); and how to locate sources using a library database. The ENGL 1100 instructor felt that the fact and opinion topic was not suited for college-level learners.

The workshop was conducted in a 30-seat classroom with student computers, an instructor station, and a projector. The primary mode of demonstration was a PowerPoint presentation with the pre and post tests added in via the TurningPoint software. Students were given the pre test before the workshop began in order to get a sense of prerequisite knowledge and misconceptions surrounding information literacy. Throughout the workshop, Ms. Hennis observed students as they participated in the workshop and worked on the in-class activity. The ENGL 1100 instructor was also present during the workshop and observed the students. Near the end of the workshop a post test was given to the students to measure growth. Two days later, Ms. Hennis sent the students the learner satisfaction survey via Google Forms.

The evaluation team collected the pre and post test scores along with the learner satisfaction survey results. Ms. Hennis also filled in the Expert Judgment Rubric that can be found in Appendix B. This rubric allowed the evaluation team to quantify Ms. Hennis' observations of student growth, engagement, and satisfaction. The pre and post tests were compared to determine growth, while the learner satisfaction survey results were used along with the Expert Judgment Rubric to make recommendations for changes to the workshop.

Results
Pre Test Results
Quantitative data was collected from the pre and post tests administered during the workshop. These tests were used to measure growth and confidence levels in information literacy. The pre test was designed to show that information literacy is fluid and that the credibility of information depends on what content is being searched for and when. The pre test also gauged how students viewed information literacy and which sources they thought were credible prior to attending the workshop. Prior to the start of instruction, only six students knew what type of source is considered scholarly: journals.

When the pre test results were examined, it was difficult to determine the starting point for student growth because several questions had multiple correct answers, and the test did not reflect that more than one answer was possible. However, of the six questions, three did have a single correct answer. Even though there were 18 participants in the workshop, only 17 students actually participated in answering the questions. Forty-three percent of students correctly answered "Your assignment asks you to use scholarly information to support your thesis. Which would be the most appropriate source to use?" Eighty-five percent of students correctly answered "When evaluating sources for their credibility, you should consider all BUT the following:", and 85% correctly answered "Do you think grammar and formatting errors affect a source's credibility?" These responses indicated that while students believed they understood some aspects of what makes a source credible, not all realized what scholarly information or sources are. See Appendix A for the pre and post test questions and participant answers.

Post Test Results

The post test was also designed to show that information literacy is fluid and that it depends on the content being searched for and when. When the post test results were examined, it was difficult to determine the end point for student growth because several questions had multiple correct answers and the test did not allow for multiple answers to be chosen. Ms. Hennis did mention to the students that some of the questions had no right or wrong answer. To score the test, a response was considered correct if the student chose at least one of the correct answers. Of the 18 participants, only 17 students answered the post test questions. Of the 13 questions, ten were answered correctly by at least 76% of students. One question had 59% correct, another 52% correct, and the last one had 18% correct. While difficult to determine, the overall percentages indicate a growth in knowledge.
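The lenient scoring rule described above, where a response counts as correct if it matches at least one accepted answer, can be sketched as follows. This is a reconstruction for illustration only; the actual scoring was performed manually by the evaluation team.

```python
# Sketch of the lenient scoring rule: because TurningPoint allowed only one
# selection per question, a response is counted correct if it matches ANY
# of the question's accepted answers.

def is_correct(response, accepted_answers):
    """True if the single selected answer is among the accepted answers."""
    return response in accepted_answers

# Q10 in Appendix A lists B, C, and D as correct answers.
q10_accepted = {"B", "C", "D"}
choose_c = is_correct("C", q10_accepted)  # True
choose_a = is_correct("A", q10_accepted)  # False
```

Under this rule, a student who selected any one of the acceptable options received credit for the question, which is why the percentages above overstate precision for the multi-answer items.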

When looking at comparative data, there appears to be growth between pre test answers and post test answers.

Pre test Question 5 asked the learners, in a yes or no format, if they think grammar and formatting errors affect a source's credibility. Post test Question 17 asked the learners, "While reviewing a source for a research paper, you notice many typographical and grammatical errors. What might this indicate?" Question 17 asked learners to apply the knowledge acquired in the workshop when choosing an answer: "the source may be inaccurate; the source may not be relevant; the author's personal biases are evident; the source is peer-reviewed; not sure." Since Question 17 was not a standard yes or no question, the subject matter expert analyzed the responses and concluded that 16 students believe grammar and formatting can affect a source's credibility and one student does not. Fourteen students responded to Question 5 and 17 responded to Question 17.

Pre test Question 6 asked the learners to decide the best order for researching a topic; the choices were: Books – Social Media – Websites – Database; Encyclopedia – Database – Website – Social Media; Social Media – Books – Website – Database; Website – Encyclopedia – Social Media – Database; Not Sure. Post test Question 18 asked the learners where they think the best place to start looking for information or sources on a topic is. The subject matter expert analyzed the data from the two questions and broke it down into five possible answers: Websites/Social Media, Books, Databases, Encyclopedias, and Not Sure. The data shows that prior to the workshop learners were more likely to start their research online or with an encyclopedia. After the workshop, learners were more comfortable using library databases for their research. See Appendix A for the pre and post test questions and participant answers.

User Satisfaction Survey

Of the 18 participants in the workshop, eight students responded to the learner satisfaction survey. Seventy-five percent of respondents strongly agreed that information literacy is an important part of the curriculum. Fifty percent of the participants noted that the presentation helped them to evaluate the quality of sources, while one participant disagreed. Although this participant disagreed that information literacy was important to the college curriculum and also disagreed that the presentation taught how to evaluate the quality of sources, this person commented that the course was "excellent" and would not change anything. That same participant said they were likely to use the information presented for future assignments and found the presentation useful. Sixty-two and a half percent of participants strongly agreed that the presentation taught them to be aware of potential bias. When asked what they would change about the presentation, one student was quoted as saying "time/pacing because had less time to think about the answer choices," another said to add a hands-on activity, while another felt there was too much repetition.

Observations and Interview Data

Ms. Hennis observed the students during the Everyone's a Critic (or they should be) workshop. She observed that some students were more interested than others. Those who were interested were more engaged with the material and attentive, responding to questions when prompted or making eye contact. A few of the English language learners took notes during the presentation and were very observant, asking questions when necessary. In addition, about halfway through the workshop, some students became disinterested. Based on the data from the learner satisfaction survey, this could have been because of concept repetition. Students also exhibited frustration with the clickers while answering the pre and post test questions, which may have been due to technology malfunction or too many questions.

The interview with the ENGL 1100 course instructor indicated that the information presented in the workshop was extremely helpful and benefited the students greatly. The professor also indicated that the information presented was helpful not only for their class, but also for future classes and projects. Prior to the start of the workshop, the professor commented that a few of their students were hesitant to begin researching for their final project because of the "must use credible sources" requirement. The professor's comments, before and after the workshop, imply that the content covered provided their students with the necessary skills to evaluate and locate credible sources.

Conclusions
Discussion
The goal of this evaluation was to determine if the Everyone's a Critic (or they should be) workshop was successful in providing learners with the critical thinking skills needed to effectively locate and evaluate information. Using the questions stated in our program objectives to guide this evaluation, the overall results are positive.

The first question this evaluation sought to answer was whether or not the workshop provided learners with the tools to evaluate the quality of information and sources when researching. The learner satisfaction survey and interviews gave good insight into this question. In the learner satisfaction survey, 75% of respondents agreed that the workshop helped them to evaluate the quality of sources, which supports the conclusion that the workshop meets this goal. While one student disagreed, that respondent also said they would not make any changes to the workshop, so their response may not be an accurate indicator of their true sentiment. Another question on the survey asked respondents to rate their comfort level with evaluating the quality of information and sources. All respondents rated themselves a seven or higher on a ten-point scale, so all learners felt confident in their ability post workshop. The post workshop interview with the ENGL 1100 instructor also revealed their strong confidence that the workshop would be instrumental in helping their students evaluate the quality of information and sources in their upcoming assignments. They also voiced that the content was applicable beyond their class and would benefit students in all their research while in school. The learner satisfaction survey and results can be found in Appendix C.

The second question this evaluation expected to answer was whether or not the workshop provided learners with the tools to locate and define the different types of sources. Based on the results of the post test, most students grasped the concepts surrounding how to identify and evaluate sources. Additionally, the post workshop interviews were a good data source for this question, showing that the ENGL 1100 professor felt the students would be able to use the presented information immediately. Since this workshop was a "one-shot" instruction session, i.e., the workshop instructor does not grade the final assignments, there is no definitive way to know if the students used certain sources in their projects. Follow-up with the ENGL 1100 professor after the assignments have been graded would be necessary in order to see what sources learners used and how they were located (i.e., web vs. database vs. print).

The third question this evaluation proposed to answer was whether or not the workshop provided learners with the tools to identify potential biases in information and how they can affect credibility. A majority (62.5%) of workshop participants strongly agreed that the presentation taught them to be aware of potential bias. Questions surrounding bias had a majority of correct answers on the post test, so there is evidence that students understand the concept of bias. The learner satisfaction survey asked, using a five-point Likert-type scale rating from strongly disagree to strongly agree, whether "the presentation taught me to be aware of potential bias and how it can affect credibility," with 87% of respondents selecting either agree or strongly agree. This data supports the conclusion that the workshop provided learners with the skills to identify potential biases and their effect on credibility. See Appendix C for the learner satisfaction survey.

Although the results of the learner satisfaction survey were overwhelmingly positive, it is important to note that only eight of the 18 participants (44%) responded to the survey, so the sample size is limited. For future workshops, the learner satisfaction survey should be given to the participants before they leave the workshop rather than afterward, in order to collect more responses. Additionally, because the questions on the pre and post tests were not identical, it is hard to gauge whether learning gains resulted from the workshop or from students' prior knowledge. Making the pre and post test questions identical would resolve this issue and strengthen the evaluation results. Additional data and evaluation may be needed to answer the three evaluation questions more accurately and, in turn, improve client confidence in the workshop.

The above data shows that the Everyone's a Critic (or they should be) workshop is successful in providing learners with the critical thinking skills needed to locate and evaluate information. With the recommendations below taken into consideration, the workshop can improve pacing and knowledge retention through a more active learning approach; see Appendix D for an example.

Recommendations
There are several recommendations based on observations, feedback, and the librarian's experience teaching the workshop. Modifications to the content may be required to improve timing and flow and to address student learning preferences. The fact or opinion exercise should be removed to reduce time and align better with the learners' experience. Since students are exposed to fact and opinion well before college, covering it again in this workshop does not seem appropriate for the age or skill level of participants. Instead, including information about detecting bias may be more useful for students looking to synthesize large amounts of information quickly for projects and assignments. The content structure should be modified to improve information flow, with the information cycle appearing before evaluating sources and databases, to follow how students are likely to use the concepts in real-life practice. Additionally, the workshop instructor observed students losing interest when completing the 13 post test questions one after another, leading her to question the validity of the answers. The recommendation is to present questions in a knowledge check format rather than a post test consisting of 13 consecutive questions.

Survey data revealed participants felt rushed in selecting answers to questions, so pacing and timing should be evaluated throughout the workshop. It may be beneficial to pilot the workshop with a control group to ensure timing is correct. Further, a suggestion to add a hands-on activity led to the recommendation that students be asked to evaluate screenshots of articles and websites rather than being asked knowledge questions. The activity should provide an opportunity to apply their newly acquired knowledge. See Appendix D for an example hands-on activity.

In addition to a hands-on activity, the pre and post test questions need to be modified to include only questions with one correct answer, or an additional statement should be added to the majority of questions directing the student to select all that apply. The print versions of the post test questions allow for this, but the current software used to embed them in the PowerPoint only allowed one answer per person per question, so making this change would require the use of Poll Everywhere or similar software to ensure a positive experience. The final recommendation concerns the need for an analysis to identify repetitive content. Some of the content may need to be rephrased to ensure it is articulated in a unique way, or potentially removed if it is deemed redundant.

Project Cost
The total budget for the project is $5100. A detailed budget outlines evaluator time spent on
data collection, compilation, and analysis. Printing costs were also included. The hourly rate
was billed at $50 per person.

Activity | Man Hours | Team Members | Cost
Create participant evaluation instruments | 4 hours x 5 members = 20 hours | 5 ECET Members | $1000
Finalize evaluation tools | 1 hour x 5 members = 5 hours | 5 ECET Members | $250
Survey participants | 2 hours | K. Hennis | $100
Compile results from evaluation tools | 2 hours x 5 members = 10 hours | 5 ECET Members | $500
Analyze evaluation tool results | 2 hours x 5 members = 10 hours | 5 ECET Members | $500
Compile final report | 10 hours x 5 members = 50 hours | 5 ECET Members | $2500
Printing | 1 hour | K. Hennis | $50
Supplies and instructional materials (Tips for Evaluating Information handout; Evaluating Information rubric, 2 per student or one double-sided sheet) | N/A | | $200
Total | 98 hours | | $5100
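The budget arithmetic above can be double-checked with a short script; the figures are taken directly from the table (billable hours at the $50 hourly rate, plus the flat $200 supplies line). This is a verification sketch only, not part of the billed work.

```python
# Verify the project budget: 98 billable hours at $50/hour plus a flat
# $200 supplies line should reproduce the reported $5100 total.

HOURLY_RATE = 50  # dollars per person-hour

billable_hours = {
    "Create participant evaluation instruments": 20,  # 4 hours x 5 members
    "Finalize evaluation tools": 5,                   # 1 hour x 5 members
    "Survey participants": 2,
    "Compile results from evaluation tools": 10,      # 2 hours x 5 members
    "Analyze evaluation tool results": 10,            # 2 hours x 5 members
    "Compile final report": 50,                       # 10 hours x 5 members
    "Printing": 1,
}
SUPPLIES = 200  # flat cost; no hours billed

total_hours = sum(billable_hours.values())         # 98
total_cost = total_hours * HOURLY_RATE + SUPPLIES  # 5100
```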

References
Association of College and Research Libraries. (2015). Framework for information literacy for higher education. Retrieved from http://www.ala.org/acrl/sites/ala.org.acrl/files/content/issues/infolit/Framework_ILHE.pdf
Hennis, K., Moon, P., & Richardson, J. (2017). Everyone's a Critic (or they should be) (EDT 502 project). Arizona State University, Tempe, Arizona.
Smith, J., Given, L., Julien, H., Ouellette, D., & DeLong, K. (2013). Information literacy proficiency: Assessing the gap in high school students' readiness for undergraduate academic work. Library and Information Science Research, 35(2), 88-96. doi:10.1016/j.lisr.2012.12.001

Appendix A - Pre Test Questions 1-7 & Post Test Questions 8-19

Q1: No Correct Answer
Q2: Correct Answer: Journal
Q3: No Correct Answer
Q4: Correct Answer: Length
Q5: Correct Answer: Yes
Q6: No Correct Answer
Q7: Correct Answer: B
Q8: Correct Answer: D
Q9: Correct Answer: D
Q10: Correct Answers: B, C, D
Q11: Correct Answer: D
Q12: Correct Answers: A, C
Q13: Correct Answers: A, B, D
Q14: Correct Answer: C
Q15: Correct Answer: B
Q16: Correct Answer: C
Q17: Correct Answer: A
Q18: Correct Answers: A, B, C, D
Q19: Correct Answer: False

Appendix B - Expert Judgment Rubric

User Performance | Not Effective (5 Points) | Somewhat Effective (10 Points) | Effective (15 Points)
Score: 12/15 | Learners showed no growth from pretest to posttest | Learners showed little growth from pretest to posttest | Learners showed adequate growth from pretest to posttest
Score: 12/15 | Learners exhibited a lack of understanding on in-class practice | Learners exhibited little understanding on in-class practice | Learners exhibited a good understanding on in-class practice

Comments: A majority of learners exhibited growth and understanding of the concepts while others seemed disinterested and unwilling to participate. Some of the questions could have been reworded in order to help with comprehension. After the presentation, students were provided time to do individual research; most seemed comfortable with searching and evaluating the information they came across, while a few needed a bit more guidance. Those who needed more guidance were English Language Learners, so the language barrier could have contributed to the misunderstanding.

Observations (Not Effective = 5 points, Somewhat Effective = 10 points, Effective = 15 points)

Score: 10/15
  Not Effective: Learners did not engage in discussions and did not participate when asked questions.
  Somewhat Effective: Learners engaged in some discussions and participated little when asked questions.
  Effective: Learners were engaged in discussions and participated when asked questions.

Score: N/A
  Not Effective: Learners were unable to accurately fill in the evaluation rubric and had many questions.
  Somewhat Effective: Learners had difficulty filling in the evaluation rubric, but were eventually able to do so after questions were asked.
  Effective: Learners were able to fill in the evaluation rubric with ease and asked very few clarifying questions.

Score: 10/15
  Not Effective: Learners do not appear to understand the content provided; the instructor has to stop and readdress topics to clarify for the whole class.
  Somewhat Effective: Learners have difficulty understanding the provided content, but understand once the instructor answers questions.
  Effective: Learners understand the concepts with relative ease; very few questions are asked.

Score: 10/15
  Not Effective: Learners are unable to participate in their own discussion groups.
  Somewhat Effective: Learners have difficulty participating in their own discussion group, but gain momentum with instructor prompting.
  Effective: Learners have no difficulty participating in their own group discussions.

Comments: I was informed by the professor, prior to the start of the workshop, that this group of
students was very quiet and not very forthcoming or willing to participate in open discussions.
Three students frequently responded when I prompted or asked for an answer, but the majority
did not seem comfortable sharing. Those who did respond seemed to grasp the concepts and
knowledge fairly quickly. Once the presentation was over and the class was instructed to start
researching a topic, discussion did pick up. I witnessed several students ask their peers where
they started searching and whether they had found anything worthwhile.

I passed out the rubric and explained its purpose, but students did not fill it out in class. They
were instructed to use it while researching for their final writing assignment.

Learner Satisfaction (Not Effective = 5 points, Somewhat Effective = 10 points, Effective = 15 points)

Score: 12/15
  Not Effective: Learners do not feel they learned anything in the workshop.
  Somewhat Effective: Learners feel they learned a little bit, but still have many questions.
  Effective: Learners feel they learned a lot in the workshop.

Score: 10/15
  Not Effective: Learners do not feel the workshop was a valuable learning experience.
  Somewhat Effective: Learners feel the workshop was somewhat valuable.
  Effective: Learners feel the workshop was a valuable experience.

Comments: Based on the results of the learner satisfaction survey, the participants felt the
workshop was valuable and feel comfortable evaluating information and sources for future
assignments.

Total Points: 74/105



Appendix C - User Satisfaction Survey Results



What was your overall assessment of the workshop/presentation?
(8 responses)
I learned how to find out credible sources. 
Great outlook into better and a successful research paper 
Very informational 
Great!!! 
I didn't even get to see this thing! 
Excellent 
Very helpful 
Informative and well put together! 

If you could change anything in the presentation, what would you change and why?
(8 responses)
N/A (2) 
Time pacing because had less time to think about the answer choices. 
Add a hands on activity 
I'd listen to it and be there as well. 
Nothing 
I felt there was a lot of repetition so maybe just reword some things instead of repeating it over and 
over again 
Nothing comes to mind 

 

Appendix D - Website Recommendations for Hands-On Activity

Below is a list of “suspect” websites that could be used as a hands-on activity for students to
evaluate websites and detect potential bias.

● Help Save the Endangered Pacific Northwest Tree Octopus, zapatopi.net/treeoctopus:
The Pacific Northwest Tree Octopus does not exist; it is not a real animal.

● Save the Rennents, savetherennents.com: Rennent is not an animal; it is a group of
enzymes.

● Save the Guinea Worms, deadlysins.com/guinea-worm: According to the CDC,
“Dracunculiasis, also known as Guinea worm disease (GWD), is an infection caused by
the parasite Dracunculus medinensis. A parasite is an organism that feeds off of another
organism to survive. GWD is spread by drinking water containing Guinea worm larvae.
Larvae are immature forms of the Guinea worm.”
(https://www.cdc.gov/parasites/guineaworm/gen_info/faqs.html)

● Uncyclopedia, en.uncyclopedia.co: Uncyclopedia is a parody of Wikipedia, claiming to be
the “content-free encyclopedia.” Its articles are displayed in a format similar to those
found on Wikipedia.

Examples of how to integrate the above websites into the workshop:

1. The workshop instructor provides screenshots of the home page and “about us” section of
each website and has learners evaluate them based on just those images. Once
completed, learners discuss their findings.
2. The workshop instructor divides the participants into groups and gives each group an iPad
with one of the websites already loaded on it. The groups are given time to click through
the website and evaluate it based on what they see. Once finished, the instructor brings
each website up on the projector and has one person from each group share their
findings with the class.
