
DIRECTED LEARNING EXPERIENCE

Directed Learning Experience:

Focus on Classroom‐Based Assessment

Katherine Hallford

University of Nevada, Las Vegas

Culminating Experience

April 1st, 2019



Clark County School District is vast, operating 360 schools and serving approximately

320,000 students as of the 2018-2019 school year (CCSD, 2018). Due to the sheer size of the

school district, the socioeconomic status of my current school, and other factors, transience is

high in my assigned school. It is not unheard of to see several students come and go each year,

with some of those students potentially returning and leaving again several times. In my teaching

career, I have served at Cahlan Elementary in North Las Vegas, teaching English Language Arts

and Social Studies to the fifth grade population. Our school is high-performing, with 100% free

and reduced lunch, qualifying us as a Title One institution. My current homeroom and class of

record comprises twenty-seven students, thirteen identifying as male and fourteen

identifying as female. Within that population, seven students are identified as English

Language Learners, all from Spanish-speaking families. In my classroom, I have a variety

of students with unique needs, including vision services. Their reading levels range from first

grade through eighth grade, and all require unique instruction to achieve academic success.

On an otherwise typical Monday morning, the principal enters my classroom with a new

student, Allison. Allison is a fifth grader who is 12 years old and has just moved to Las Vegas

from California. Typically, with a new student my first reaction would be to check Infinite

Campus for her academic history in order to gauge where she is, but due to the transfer to a

new state, I have no information on Allison or her academic history beyond what she is able to

tell me. Unfortunately, that also means that I won’t have those records for weeks, if at all. I know

that in order to be successful in getting to know Allison as a person and scholar, I will have to act

quickly to assess her throughout the course of her first week at our school.

To perform a complete assessment on Allison in this crucial first week, I have to tap into

my education and determine which assessments will be the most relevant to my grade level and

academic specialties. This will provide me with more information about Allison’s strengths and

areas of growth. In order to do this, I will provide her with several assessments. These

assessments include a reading interest inventory, AIMSweb fluency assessment, Words Their

Way spelling inventory, comprehension assessment, and an informal writing assessment. These

assessments will allow me to create a more complete picture of who Allison is as a scholar. Once

completed, I will be able to use these assessments as a guide for differentiation and scaffolding

as I integrate her into our classroom curriculum and community.

Assessments

Assessments are, or should be, the root of all classroom instruction. Without assessments,

there is no true way to understand what our students have learned from what we’ve taught. These

assessments don’t have to be the typical summative assessments the general public may think of,

but instead can be made up of many formative assessments as we teach and observe. In the

instance of assessing my student, Allison, she will be completing many formative assessments in

order for me to start building an educational background on her.

Pellegrino, Chudowsky, and Glaser (2001) believe that assessment is most effective when

three components are addressed: first, assisting learning; second, measuring the achievement of

each student; and third, analyzing and evaluating data to improve curriculum. In this, I agree:

without utilizing all three of these components, the purpose of assessment loses meaning and

becomes aimless. Oftentimes, students who are behind their peers

in reading by the time they reach me find it impossible to catch up. To avoid this, I have made

strategic decisions in the assessments I have chosen to give to my new student. My goal in doing

so is to identify areas of strength and needs in order to understand where I can help.

Interest Inventory

My first step in assessing Allison would be to assign her a reading interest inventory.

Flynt and Cooter describe the affective domain as “one of the most important and often ignored

aspects of reading assessment” (Flynt & Cooter, 2014, p. 2). The affective domain comprises the

elements of reading that tap into students’ interests and attitudes toward reading. In utilizing this

survey, you get a baseline of several areas of importance. First is the general attitude a student

has about reading. Knowing whether or not they like reading will let you know how much they

are reading outside of the classroom. Additionally, the reading interest survey provides you with

knowledge of that student’s interests. This will allow you to begin the process of suggesting

books to that reader that are tailored to who they are as a person. Knowing whether they are

familiar with a certain genre or strongly dislike another, and selecting books based on their

opinions, is important to ensuring they feel like a welcome member of the classroom reading

community with a powerful voice. In providing this assessment to Allison, I

am able to quickly determine where her interests lie and can begin thinking of how I can

differentiate reading and writing assignments to allow her choice in what she is reading and

writing about in the classroom.

AIMSweb Fluency Passages

My next step in assessing Allison’s reading and writing knowledge is to assess her using

the AIMSweb Reading Curriculum Based Measurement (RCBM) passages. It is safe to assume

that by the fifth grade, Allison has been exposed to reading instruction of some sort, having been

enrolled in school prior to arriving in our classroom. These passages are required testing three

times per year at our school site. In using the RCBM to assess Allison, not only will I be able to

get a grasp on her reading abilities, I will be able to hear her read aloud and determine her

reading fluency, including any areas of strength or areas that may need additional support.

Rasinski (2004) states, “This initial fluency assessment gives teachers baseline information

against which to measure subsequent progress” (p. 20). In utilizing these passages to assess

Allison, not only will I be able to establish her personal baseline, I will be able to compare it to

that of typical grade level peers.

Fluency is described as “the ability to read quickly, accurately, and with proper

expression” and is a vital skill in reading achievement (NRP, 2000, pp. 3-5). This means that the

fluent reader will be able to read a passage aloud in a way that is akin to having a conversation

with another individual. Additionally, Rasinski states that it is important that reading fluency

also represents a “solid understanding of the passage” being read aloud (2004, p. 23). The RCBM

assessment analyzes student fluency by taking note of the words read correctly, along with any

miscues. These passages were curated by the AIMSweb staff and written by teachers with

experience in those grade levels; these teachers wrote passages based on length (350 words

in 5th grade) and the Fry readability model for grade-level appropriateness (AIMSweb Technical

Manual, 2012, p. 3). From that pool, three passages per grade level were chosen because their

piloted words-read-correctly (WRC) scores matched grade-level means and standard deviations

(AIMSweb Technical Manual, 2012, p. 4).
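
The words-read-correctly tally described above follows the general curriculum-based measurement convention: a one-minute timed read, with miscues subtracted from the words attempted. The function names and example numbers below are my own illustration, not AIMSweb materials:

```python
# Illustrative sketch of an RCBM-style fluency tally (not AIMSweb's own code).

def words_read_correctly(total_words_attempted: int, errors: int) -> int:
    """Words read correctly (WRC) in a one-minute timed read."""
    if errors > total_words_attempted:
        raise ValueError("errors cannot exceed words attempted")
    return total_words_attempted - errors

def accuracy(total_words_attempted: int, errors: int) -> float:
    """Proportion of attempted words read correctly."""
    return words_read_correctly(total_words_attempted, errors) / total_words_attempted

# Hypothetical example: a reader attempts 120 words in one minute with 6 miscues.
wrc = words_read_correctly(120, 6)   # 114 words read correctly
acc = accuracy(120, 6)               # 0.95 accuracy
print(wrc, round(acc, 2))
```

Comparing a student's WRC against the published grade-level means and standard deviations is what turns this raw tally into a benchmark placement.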

With these research-based passages, I will be able to assess Allison on the same criteria

used in our school community and nationally. This is a benefit to both Allison and me, as just

handing her a passage and assessing fluency based on that will not likely yield results that are

backed by research and data. Additionally, many schools utilize this assessment, or something

similar, creating a more comfortable testing environment for Allison. This assessment is also

a benefit to Allison and me as it is the schoolwide screener for response to intervention (RTI)

placements. Depending on the score Allison received on this assessment, it will allow us to

determine her need for RTI services.

However, the AIMSweb assessments are not without negative attributes. First and

foremost, by assessing Allison using these passages, I will only be assessing her words read

correctly and incorrectly. It does not allow me to assess her reading skills in a true running

record fashion, instead only focusing on Allison’s automaticity and accuracy. Additionally, in

my personal experience, the RCBM assessments don’t take accuracy into account if the reader

in question has a slower rate of reading. Slow yet accurate readers can be flagged with at-risk

scores when they are, in fact, high-achieving students. As with any assessment, it is not

without flaws.

By utilizing the AIMSweb assessment, I will be able to see how Allison might begin to

fit into the classroom ecosystem. While fluency is not a complete picture, it allows me to

reassess any pre-determined assessments to see if there may be other, more pertinent assessments

that may be of further benefit to Allison and myself. If flagged for RTI services, Allison can then

quickly be placed into the necessary group and intervention services can be strategically targeted

to her individual needs.

San Diego Quick Assessment

After assessing Allison’s reading fluency, I will move on to using the San Diego Quick

Assessment (SDQ) to gain a better understanding of her instructional, independent, and

frustration reading levels. The San Diego Quick Assessment differs from the RCBM

or other reading assessments in that it assesses student knowledge of words out of context. This,

according to the San Diego Quick Assessment of Reading Ability, is beneficial because “weak

readers overrely on context and recognize words in context more easily than out of context”

(San Diego Quick Assessment Protocol, p. 68). This assessment was first discussed in a 1969

article in Journal of Reading, where Margaret La Pray and Ramon Ross state: “The graded word

list has two uses: 1) to determine a reading level; 2) to detect errors in word analysis. One can

use the test information to group students for corrective practice or to select appropriate reading

materials for those students. The list is remarkably accurate when used for these purposes” (La

Pray & Ross, 1969, p. 305). With this assessment, I will be able to further understand Allison’s

decoding skills and areas of need.
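
The level decision applied to each graded word list can be sketched briefly. The cutoffs below reflect the common convention for ten-word graded lists (0-1 errors = independent, 2 = instructional, 3 or more = frustration); the actual SDQ protocol should be consulted before scoring real students:

```python
# Hedged sketch of graded-word-list level decisions (illustrative cutoffs).

def sdqa_level(errors: int) -> str:
    """Classify performance on one ten-word graded list."""
    if errors <= 1:
        return "independent"
    if errors == 2:
        return "instructional"
    return "frustration"

# Hypothetical student: walk up the graded lists, noting the level at each.
errors_by_grade = {"grade 3": 0, "grade 4": 1, "grade 5": 2, "grade 6": 4}
for grade, errors in errors_by_grade.items():
    print(grade, sdqa_level(errors))
```

In practice, testing stops once the frustration level is reached, and the instructional-level list suggests where to begin the next assessment.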

The benefits of utilizing this assessment are simple: it adds another layer of

understanding to who and where Allison is as a reader. It is a great first tool when assessing

students’ reading levels in the classroom setting. While the SDQ touts itself as a more

reliable predictor of student ability than reading words in context, there is conflicting data

unsupportive of that claim. A study by Ardoin et al. found that “participants' oral reading rate

of words in context was significantly greater than their rate of reading words out of context”

(Ardoin et al., 2013, p. 255). With this assessment, I would utilize the data it produces, along with

the data from the AIMSweb fluency benchmark passages to guide me on a starting place for my

next assessment, the Flynt/Cooter Comprehension Assessment.

Flynt/Cooter Comprehension Assessment

After assessing Allison using the SDQ, I will use those results to determine a good starting place

the Flynt/Cooter comprehension assessment. The comprehension assessment, unlike the

AIMSweb fluency assessment, utilizes silent reading to assess student reading comprehension.

This assessment is done as a two-part process. After reading the passage, the student retells what

they remember from the passage, while the administrator checks off comprehension questions

addressed in the retelling. Afterwards, the administrator asks the students the comprehension

questions that were not addressed in the student’s retelling of the passage. According to the

Flynt/Cooter manual, “this process greatly reduces the time required for testing and is generally

more thorough than questioning alone” (Flynt & Cooter, 2014, p. 4).
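
The two-part protocol above amounts to simple bookkeeping: the examiner checks off any comprehension questions the student covers spontaneously during the retelling, then asks only the remainder directly. The question texts below are invented examples, not Flynt/Cooter items:

```python
# Minimal sketch of the retell-then-question bookkeeping (invented questions).

questions = [
    "Who is the main character?",
    "Where does the story take place?",
    "What problem does the character face?",
    "How is the problem resolved?",
]

# Part one: questions the examiner checked off while listening to the retelling.
addressed_in_retelling = {
    "Who is the main character?",
    "What problem does the character face?",
}

# Part two: ask directly only what the retelling did not cover.
to_ask_directly = [q for q in questions if q not in addressed_in_retelling]
print(to_ask_directly)
```

This is why the manual describes the process as faster than questioning alone: the retelling pre-answers part of the question set.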

My reasoning for using this assessment is to create a deeper understanding of where

Allison is as a reader, moving from oral reading of fluency passages and word lists to whether

she truly understands as she is reading. I have found, in my experience, that some students are

fluent readers but are unable to comprehend what they read for a myriad of reasons. This is

backed by a study by Cain and Oakhill (2007), as referenced in Farrall’s Reading

Assessment: Linking Language, Literacy, and Cognition:

…about 10% of school-age children struggle with true comprehension deficits-deficits

that do not have their origins in word recognition, decoding, or reading fluency. These

children often go unnoticed by their teachers. They do not stumble over words, and they

do not read with painstaking efforts. However, their responses to questions based on text

may be superficial and fragmented, and they may also have difficulty formulating a well-

organized summary or narrative. (Cain 2009)

Keeping this in mind, it is important to move beyond fluency assessments and assess all areas of

reading, the “Big Five” areas of reading identified by the National Reading Panel (2000):

phonemic awareness, phonics, comprehension, vocabulary, and fluency. The AIMSweb

passages, along with the SDQ, allow informal assessment of three of the five areas, while the

comprehension assessment adds assessment of comprehension and vocabulary knowledge.



The benefits to utilizing a reading comprehension assessment are vast. First and foremost,

it allows us to look at a student’s inferential skills. Farrall states: “Inferential thinking is the heart

and soul of reading comprehension” (2012 p. 235). By utilizing a comprehension assessment, we

are not only looking at Allison’s skills at recalling information, but her ability to make inferences

from texts. As an added layer to our understanding of her comprehension, by assessing Allison

using the comprehension assessment, we are able to begin drawing conclusions about her background

knowledge by looking at both her inferential and retelling abilities. According to Farrall, this is

fundamental in learning experiences throughout life, stating:

These events, facts, and concepts are stored in long-term memory, and they become the

tools with which we interpret all new experience. Experience, coupled with rich

opportunities for language input, provides children with words, facts, concepts, and

knowledge of structure that enables them to interact and process the world about them.

(2012 p. 236)

Having an understanding of Allison’s background knowledge, comprehension skills, and

inferential knowledge is imperative to understanding who she is as a student and person.

However, these assessments are not without criticism. Morris (2015) gives one area of

deficit these assessments have:

Comprehension is a difficult area to assess, especially when the examiner is limited to six

to eight questions on a short 150-250 word passage. Paris and Carpenter (2003) pointed

out that passages in informal reading inventories can vary in difficulty, length, and

familiarity, thereby influencing the reader’s comprehension. In addition, question

difficulty can vary by item or passage, thus affecting comprehension performance and

measurement reliability. (pp. 7-8)



When assessing our students, it is important to see them as a complete picture whenever possible,

and acknowledging the deficits of our assessments allows us to do so with as little bias from the

testing medium as possible. Once this assessment is completed, I will have a fairly accurate and

thorough idea of who Allison is as a reader. I will use this data to guide me in many ways, from

screening her for RTI to identifying student areas of need to place her in small group lessons and

centers during the school day. Without this assessment, and those preceding it, her placement in

these areas may not be appropriate, causing her to feel either overwhelmed or bored and

shortchanging her academically.



Words Their Way Spelling Inventory

After wrapping up the basic reading assessments with Allison, my next step will be to

conduct a spelling inventory. Bear, Invernizzi, Templeton, and Johnston compare literacy to a

braid of interwoven threads and state that orthography is a thread that strengthens that bond

(2017, p. 3). Spelling is often put on the back burner but is still fundamental in literacy

development, as evidenced in the above reference. Words Their Way is a spelling program that

uses developmental spelling research to build students’ recognition of patterns in words,

thereby increasing vocabulary and reading development. This is done through word study; Bear

et al. state that “through an informed analysis of students’ spelling attempts, teachers can

differentiate and provide timely instruction in phonics, spelling, and vocabulary that is essential

to move students forward in reading and writing” (2017, p. 5). This word study system aims to

have students make meaning with words, recognizing patterns and using hands-on activities to

solidify learning. This differs from rote memorization of weekly word lists accompanied by

weekly spelling tests.

The assessment portion of the Words Their Way program is the spelling inventory. This

assessment has three formats: primary, elementary, and upper-level inventories. While these

assessments do require students to spell words called out by the teacher, they are chosen with

research in mind. These assessments will place students in the best developmental level for them,

allowing meaningful word study to take place. In Allison’s case, she will receive the elementary

spelling inventory. This inventory, once completed, will place her in one of the following

categories: emergent, letter name-alphabetic, within word pattern, syllables and affixes, and

derivational relations. Once assessed with this inventory, Allison will be able to integrate into

classroom word study seamlessly, joining classmates in respective spelling groups during small

group instruction.
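
The placement logic can be sketched roughly: features on the inventory are grouped by developmental stage, and instruction targets the earliest stage where the student begins missing features. The stage names come from the inventory described above, but the miss tallies and the two-miss threshold below are illustrative, not the official feature-guide scoring:

```python
# Rough sketch of stage placement from a spelling inventory (illustrative
# threshold; not the official Words Their Way feature-guide scoring).

STAGES = [
    "emergent",
    "letter name-alphabetic",
    "within word pattern",
    "syllables and affixes",
    "derivational relations",
]

def placement(missed_by_stage: dict, threshold: int = 2) -> str:
    """Return the first stage where missed features reach the threshold."""
    for stage in STAGES:
        if missed_by_stage.get(stage, 0) >= threshold:
            return stage
    return STAGES[-1]  # no weaknesses found: instruct at the highest stage

# Hypothetical student who begins missing within-word-pattern features.
misses = {"emergent": 0, "letter name-alphabetic": 0,
          "within word pattern": 3, "syllables and affixes": 5}
print(placement(misses))
```

A placement computed this way maps directly onto the small-group spelling groups already running in the classroom.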

While there are many benefits to the Words Their Way assessment, and it is used in

classrooms all over the country, there are vocal critics of the program as well. J. Richard

Gentry is one of the most outspoken, arguing that the spelling stages provided are not

developmentally sound and that they often set low expectations for spelling achievement

(Gentry, 2000, p. 328). While this criticism is valid, I have found in my classroom that Words Their

Way can be an effective assessment and classroom program, if a bit time consuming on the

teacher end of things. This program has been especially effective with students that are learning

English as a second or other language, as it bridges any gaps they may have.

Informal Writing Assessment

The final assessment I will give Allison is an informal writing assessment. The informal

writing assessment allows me to build upon Allison’s orthographic knowledge and observe her

writing skills in a low-risk environment. Writing, and writing well, is incredibly important.

Graham and Harris describe the importance as such: “Writing is an indispensable tool for

learning and communicating. We use writing as a medium to gather, preserve, and transmit

information. Just as important, writing about what we are learning helps us understand and

remember it better” (Graham & Harris, 2016, p.5). By collecting an informal writing sample, I

will be able to see where Allison’s strengths and weaknesses lie. In particular, I will use two

protocols to assess her writing. First, I will assess using the Writing Continuum for Higher-Order

Concerns by Scott, Nagelhout and Spies (2016). This will allow me to place Allison on a

developmental level of writing by completing an analysis of her written skills. The second

protocol I will use will be the writing rubric provided by the department of education that is

utilized schoolwide. By using this rubric to score Allison, I will be able to see where she is

scoring in comparison to her peers. This will allow me to place Allison in an appropriate writing

partnership during workshop sessions and pull her in the appropriate small group conferencing

session.
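
Comparing Allison's rubric score against her peers, as described above, is a simple aggregation. The 4-point scale and the class scores below are invented for illustration; any schoolwide rubric with numeric scores would work the same way:

```python
# Hypothetical comparison of one student's rubric score to the class
# (invented scores on an assumed 4-point scale).
from statistics import mean

class_scores = [3, 2, 4, 3, 3, 2, 1, 4, 3, 2]  # hypothetical peer scores
allison_score = 2

class_mean = mean(class_scores)
below_or_at = sum(s <= allison_score for s in class_scores)
print(f"class mean: {class_mean:.1f}")
print(f"peers scoring at or below Allison: {below_or_at} of {len(class_scores)}")
```

A comparison like this is what informs the writing-partnership and small-group conferencing placements, even though the underlying rubric scoring remains subjective.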

The difficulties in utilizing an informal writing inventory to assess student writing are

limited, but present. First and foremost, it is up to the student to provide a truly accurate writing

sample. If the student is feeling less than motivated, has issues with the writing process, or

simply does not like to write, then gaining insight from a sample will be difficult. As Graham

and Harris state: “For example, even when a teacher assigns a writing task, the student must still

decide to do the task, determine how much effort to commit, formulate intentions and goals, and

decide how to accomplish it” (2016, p. 10). Additionally, the grading or analysis of the writing

sample is subjective; interpretation of the writing sample, continuum, and skills may vary

significantly from teacher to teacher. While the writing assessments are full of valuable

information for teachers to analyze and utilize in classroom instruction, the subjective nature of

the data makes the assessment less than ideal for highly accurate results.

Reflection

After administering all of the assessments to Allison, I will ideally have an assessment

portfolio that paints a near-complete portrait of Allison as a learner. With the variety of

assessments given, I should have enough data to fully integrate her into classroom lessons and

routines with little to no struggles academically. Without these assessments, the classroom

environment may be far more difficult for Allison to adjust to, and for me to adjust to an

additional child in an overcrowded classroom. Each classroom is its own distinct ecosystem, and

any changes or disruptions to established routines can prove to be problematic if not handled

appropriately.

While assessment is incredibly important to each classroom and drives instruction, it is

also important to note that in today’s classroom, a battery of assessments similar to the one

suggested above is nearly impossible to complete for each student in a timely manner.

Classrooms are overcrowded, with teachers and administration reaching for assessments that are

quick, easy, and readily available to assess each student in the classroom. In my fifth grade class,

that means assessing each of my 28+ students three times per year with assessments that can be

given to whole classes at once, with scoring that is simple to complete. The assessments given to

Allison will allow for a student profile that is more complete than simple benchmarks and an

SBAC score could possibly give an instructor regarding a student. However, by implementing

and integrating these assessments into everyday learning, teachers can be better informed to

instruct each student in their classroom in spite of the wide range of abilities and learning needs.

References

AIMSweb Technical Manual [PDF]. (2012).

Ardoin, S. P., Eckert, T. L., Christ, T. J., White, M. J., Morena, L. S., January, S.-A. A., & Hine,

J. F. (2013). Examining Variance in Reading Comprehension Among Developing

Readers: Words in Context (Curriculum-Based Measurement in Reading) Versus Words

Out of Context (Word Lists). School Psychology Review, 42(3), 243–261.

Cain, K. (2009). Making sense of text: Skills that support text comprehension and its

development. Perspectives on Language and Literacy, 35(2), 11-14.

Cain, K., & Oakhill, J. (2007). Reading Comprehension Difficulties: Correlates, Causes, and

Consequences. In K. Cain & J. Oakhill (Eds.), Challenges in language and literacy.

Children's comprehension problems in oral and written language: A cognitive

perspective (pp. 41-75). New York, NY, US: Guilford Press.

Cooter, R. B., Flynt, E. S., & Cooter, K. S. (2014). The Flynt/Cooter Comprehensive Reading

Inventory-2: Assessment of K-2 reading skills in English and Spanish. Boston: Pearson.

Farrall, M. L. (2012). Reading assessment: Linking language, literacy, and cognition. Retrieved

from https://ebookcentral.proquest.com

Fast Facts 2018-2019. (n.d.). Retrieved February 17, 2019, from https://newsroom.ccsd.net/wp-

content/uploads/2018/10/Fast-Facts-2018-19-Eng.pdf

Gentry, J. (2000). A Retrospective on Invented Spelling and a Look Forward. The Reading

Teacher, 54(3), 318-332. Retrieved from

http://www.jstor.org.ezproxy.library.unlv.edu/stable/20204910

Graham, S., & Harris, K. (2016). A Path to Better Writing: Evidence-Based Practices in the

Classroom. The Reading Teacher, 69(4), 359-365.

Informational Writing Rubric for Grades 3-5[PDF]. (2018, August). Nevada Department of

Education.

Bear, D. R., Invernizzi, M., Templeton, S., & Johnston, F. (2017). Words their way. Pearson.

La Pray, M., & Ross, R. (1969). The Graded Word List: Quick Gauge of Reading

Ability. Journal of Reading, 12(4), 305-307. Retrieved from

http://www.jstor.org.ezproxy.library.unlv.edu/stable/40011379

Morris, D. (2015). Morris informal reading inventory: Preprimer through grade 8. New York:

The Guilford Press.

Pellegrino, J. W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know:

The science and design of educational assessment. Washington, DC: National Academy Press.

Rasinski, T. V. (2004). Assessing Reading Fluency. Place of publication not identified:

Distributed by ERIC Clearinghouse.

San Diego Quick Assessment of Reading Ability[PDF]. (n.d.).

Scott, C., Nagelhout, E., Spies, T. (2016). Writing continuum: Key features for higher-

order concerns: Purpose, audience, focus, and organization.


