A Survey of School Psychologists' Application of the Problem-Solving Model to

Counseling Services

By

Rebecca Cole

A Dissertation Submitted to the University at Albany,


State University of New York in Partial Fulfillment of
The Requirements for the Degree of
Doctor of Psychology

School of Education
Department of Educational and Counseling Psychology
2012

UMI Number: 3511677

All rights reserved


INFORMATION TO ALL USERS
The quality of this reproduction is dependent on the quality of the copy submitted.
In the unlikely event that the author did not send a complete manuscript
and there are missing pages, these will be noted. Also, if material had to be removed,
a note will indicate the deletion.

UMI 3511677
Copyright 2012 by ProQuest LLC.
All rights reserved. This edition of the work is protected against
unauthorized copying under Title 17, United States Code.

ProQuest LLC.
789 East Eisenhower Parkway
P.O. Box 1346
Ann Arbor, MI 48106 - 1346
Acknowledgements
As I put the finishing touches on my dissertation and my graduate experience at
the University at Albany, I am filled with gratitude for all the people who have helped me
to get to this point in my career. I would like to begin by recognizing Dr. Deborah
Kundert, my dissertation chair. She has worn many hats over the years that we have
known each other, beginning as one of my professors, and then becoming my supervisor,
advisor, and mentor. Deb was a natural choice as a chairperson, as she has always
exuded the qualities of professionalism, attention to detail, and a strong work ethic that I
greatly admire and seek to develop in myself. Thank you, Deb, for helping me to be a
woman on a mission, and for devoting so much time, energy, and support to me over the
years. I would not have achieved my dissertation goals without you. Many thanks also
go to Dr. Kevin Quinn and Dr. Stacy Williams, my committee members, for carefully
reading and providing valuable feedback that helped to shape my dissertation. Stacy has
also played multiple roles in my development as a school psychologist, and I am
fortunate to be able to take so many lessons away from our experiences together. Thank
you, Stacy, for building me back up and for helping me to stay the course when things
got challenging.
Every endeavor I have pursued has been supported by my parents, Mary and Gary
Cole, who have always taught me that anything worth having is worth working hard for.
Thanks, Mom and Dad, for always being in my corner, for always picking up the phone
(no matter what time it was), for all those signatures, and for all your words and gestures
of support over the years. My sisters, Aleta, Colleen, and Rita, have always reminded me
that, despite the sacrifice, all the hard work would be worth it, and as I take time to


celebrate with them, it is clear that they were right. I have also been blessed with a large
extended family, who always asked, "How is school going?" at regular intervals, while
also wishing me luck and marveling at my achievements, before finally asking if they
would need to call me doctor after graduation. When I think of the invaluable support
my family has provided, I am also reminded of my grandmother, the late Ann Malloy,
who, as I was getting ready to return to Albany after visiting (always too soon), would
remind me that she loved me and that the problems will still be there on Monday. You
were right, Grandma, and I know you wish you were here to celebrate with me as much
as I wish you were.
Thanks also go out to my friends for always being there for me, and for never
holding a grudge over my inability to keep in touch, or when I cancelled plans at the last
minute to study, complete an assignment, or work on my dissertation. Special thanks go
out to Romann Weber, for playing the role of boyfriend, best friend, lifeline, coach, and
therapist. Your support has proved integral in getting me through my dissertation,
whether this involved well-placed humor, defusing statistics or formatting tantrums, or
just reassuring me that I knew what I was doing and that everything would work out. I
am indebted to you for your unwavering support and the time and energy you sacrificed
from your own work to help me get where I am. Thank you, and I love you very much.
Finally, I wish to acknowledge the myriad teachers, administrators, service providers,
paraprofessionals, families and students that I have worked with over the years as a
school psychologist and psychologist in training, and as a teacher. These experiences
have motivated me and taught me more than any class or lecture could have, and for this I
am truly grateful.


A Survey of School Psychologists' Application of the Problem-Solving Model to


Counseling Services
Rebecca Cole
University at Albany, State University of New York, 2012
Dissertation Chairperson: Deborah K. Kundert
Abstract
Given the current focus on student outcomes, use of the problem-solving model to plan
interventions is one method by which school psychologists can hold themselves
accountable for implementing counseling interventions that have a positive impact on
student behavioral outcomes and mental health. This study surveyed school
psychologists about their general counseling practices, as well as their application of the
problem-solving model when designing and implementing counseling interventions. A
total of 283 Nationally Certified School Psychologists completed an online survey based
on current research and best practices for behavioral interventions. Results indicated that
counseling is typically provided to general and special education students in group and
individual formats. Students are infrequently recommended for discontinuation of
counseling, in most cases because counseling goals have reportedly been met. Respondents most often
indicated using many general components of the problem-solving model including
defining and establishing the behavior of concern (e.g., behavioral definition, problem
validation), as well as those involved in determining what should be done about it (e.g.,
goal setting, intervention plan development). They less frequently reported engaging in
problem analysis or in monitoring and determining the effectiveness of counseling
interventions (e.g., formative and summative evaluation, decision-making plan).


These results suggest that school psychologists apply many of the steps of the problem-solving model in accordance with federal special education laws, especially when
defining target behaviors and planning interventions. These results, however, also call
into question the degree to which these school psychologists engage in progress
monitoring and data-based decision making. The quality and frequency of baseline and
progress monitoring data collection may not enable accurate comparison of student
behavior and demonstration that behavioral improvement has been made. Further
research is needed to determine barriers and facilitators to objective data gathering in
practice settings. Implications for school psychology training programs include
identifying which practices to emphasize so that new and current practitioners can shift
from the role of assessor to that of researcher and active problem-solver, consistently
and effectively implementing all aspects of the problem-solving model and consulting
with other school professionals to gather and evaluate behavioral data to demonstrate
accountability.

TABLE OF CONTENTS
Page
TITLE ................................................................................................................................. i
ACKNOWLEDGEMENTS ............................................................................................... ii
ABSTRACT ...................................................................................................................... iv
TABLE OF CONTENTS .................................................................................................. vi
LIST OF TABLES ............................................................................................................ ix
LIST OF APPENDICES ................................................................................................... xi
CHAPTER 1 INTRODUCTION .................................................................................... 1
Purpose of Study .................................................................................................... 6
Significance of Study ............................................................................................. 6
CHAPTER 2 REVIEW OF RELEVANT LITERATURE ............................................. 8
Overview ................................................................................................................ 8
School Psychology: History, Approaches, Training, and Current Status ............ 11
Growth and Development of the Field of School Psychology ................ 11
Refinements in the Provision of School Psychological Services............. 15
Problem-Solving Approaches .................................................................. 16
Training .................................................................................................... 18
Contemporary Roles and Functions ......................................................... 20
Section Summary ..................................................................................... 22
School Psychologists as Counselors .................................................................... 23
Definitions of Counseling and Intervention ............................................. 23
Rationale for School Psychologists and Counseling ............................... 25
Current Counseling Practices of School Psychologists ........................... 29
Section Summary ..................................................................................... 32
Current Best Practices in Counseling .................................................................. 33
Evidence-Based Practices/Evidence-Based Interventions ....................... 34
Factors Supporting the Exclusive Use of Evidence-Based Practices ...... 39
Current Evidence-Based Practices Related to Counseling ...................... 40
Section Summary ..................................................................................... 45
Designing and Evaluating Direct Interventions ................................................... 46

Comparing Academic and Social Emotional Behavioral Interventions .. 46


EBIs for Childhood Behavioral Disorders ............................................... 66
Section Summary ..................................................................................... 77
Legislation Impacting the Field of School Psychology ....................................... 78
Contemporary Legislation ....................................................................... 88
Current Status and Future Directions ....................................................... 93
Section Summary ..................................................................................... 94
Survey Research................................................................................................... 95
Chapter Summary .............................................................................................. 101
Research Questions ............................................................................................ 104
CHAPTER 3 METHODOLOGY ............................................................................... 106
Overview ............................................................................................................ 106
Participants ......................................................................................................... 106
Instrumentation .................................................................................................. 106
Survey .................................................................................................... 106
Cover Letter Email ................................................................. 107
Follow-up Email .................................................................................... 107
Procedure ........................................................................................................... 108
CHAPTER 4 - RESULTS ............................................................................................ 112
Overview ............................................................................................................ 112
Data Analysis Plan ............................................................................................. 112
Respondent Characteristics/Demographic Data ................................................ 112
General Counseling Practices ............................................................................ 115
Comparison of Demographic Variables and General Counseling Practices ..... 117
Components of the Problem-Solving Model Used ............................................ 123
Behavioral Definition and Baseline Data Collection ............................. 125
Goals, Intervention Planning, Measurement, and Decision-Making ..... 125
Progress Monitoring, Formative Evaluation, Treatment Integrity, and
Summative Assessment ......................................................................... 132
Comparison of Demographic Variables and Use of the Problem-Solving
Model ................................................................................................................ 137

Comments .......................................................................................................... 138


CHAPTER 5 DISCUSSION ..................................................................................... 139
Overview ............................................................................................................ 139
Respondent Characteristics ................................................................................ 139
Counseling Practices of School Psychologists .................................................. 143
Use of the Problem-Solving Model ................................................................... 145
Implications for School Psychology .................................................................. 153
Limitations and Directions for Future Research ................................................ 155
Summary ............................................................................................................ 160
REFERENCES .............................................................................................................. 162
APPENDICES ............................................................................................................... 213


LIST OF TABLES
Table

Page

1. Summary of Important People and Events Contributing to the Development


of the Field of School Psychology ...................................................................... 12
2. Definitions of Evidence-Based Interventions ...................................................... 36
3. Clauses Recommending the Use of Evidence-Based Interventions .................... 41
4. An Outline for Planning Interventions Aligned With the Problem-Solving
Model .................................................................................................................. 43
5. An Outline for Planning Academic and Behavioral Interventions ...................... 50
6. Table of Evidence-Based Interventions ............................................................... 67
7. Criteria for Classifying Evidence-Based Psychosocial Treatments..................... 76
8. Selected Federal Legislation Impacting School Psychology ............................... 80
9. Court Decisions Impacting the Practice of School Psychology........................... 84
10. Expository Reports Impacting the Practice of School Psychology ..................... 86
11. Agencies and Initiatives Impacting the Practice of School Psychology .............. 87
12. Key Provisions of NCLB ..................................................................................... 89
13. Research Findings Guiding IDEA 2004 .............................................................. 92
14. Guidelines and Considerations for Creating Internet Surveys Following the
TDM Framework ............................................................................................... 102
15. Summary of Feedback from Pilot Subjects ....................................................... 109
16. Mailing and Response Data ............................................................................... 111
17. Demographic Characteristics of Respondents ................................................... 113
18. Estimates of Time Allocation ........................................................................... 116
19. Counseling Practices of School Psychologists ................................................. 118
20. Multinomial Regression Predicting Type of Counseling from Graduate Degree,
Years of Experience, and Time Spent Counseling ............................................ 120
21. Multinomial Regression Predicting Students Served in Counseling from Graduate
Degree, Years of Experience, and Time Spent Counseling .............................. 120
22. Multinomial Regression Predicting the Number of Students Discontinued from
Counseling from Graduate Degree, Years of Experience, and Time Spent
Counseling ........................................................................................................ 120

23. Multinomial Regression Predicting Reasons for Discontinuation from Counseling


from Graduate Degree, Years of Experience, and Time Spent Counseling ..... 121
24. Chi Square Analysis Comparing Type of Counseling and Years of
Experience ......................................................................................................... 121
25. Chi Square Analysis Comparing Number of Students Discontinued from
Counseling Each Year and Years of Experience ............................................... 122
26. Use of Intervention Components of the General Problem-Solving Model........ 123
27. Use of Specific Components of Behavioral Definition Composition ................ 126
28. Use of Specific Types of Baseline Data Collection ........................................... 127
29. Average Number of Baseline Data Points Collected to Establish a Stable Pattern
of Student Behavior ........................................................................................... 127
30. Use of Specific Criteria When Writing Behavioral Goals ................................. 128
31. Use of Specific Components of Counseling Intervention Plans ........................ 129
32. Use of Specific Components for Measuring Target Behaviors ......................... 130
33. Use of Specific Decision-Making Components ................................................. 131
34. Use of Specific Progress Monitoring Techniques ............................................. 133
35. Number of Progress Monitoring Data Points Collected to Establish a Stable
Pattern of Student Behavior ............................................................................... 133
36. Use of the Same Method for Baseline Data Point and Progress Monitoring Data
Point Collection ................................................................................................. 134
37. Sources of Data Considered During Formative Assessment ............................. 134
38. Use of Methods to Measure Treatment Integrity ............................................... 135
39. Sources of Data Considered During Summative Assessment ........................... 136
40. Comparison of Demographic Characteristics .................................................... 141
41. Use of General and Specific Intervention Components of the Problem-Solving
Model ................................................................................................................ 146

LIST OF APPENDICES
Appendix

Page

A. School Psychologists Survey ............................................................................. 213


B. Cover Letter Email ............................................................................. 225
C. Follow-Up Email ............................................................................................... 226
D. Pilot Survey Feedback Questions ...................................................................... 227
E. Data Analysis Table ........................................................................................... 228
F. Non-Significant Chi-Square Results .................................................................. 243
G. Logistic Regression Models ............................................................................... 248


CHAPTER 1: Introduction
As the field of school psychology has grown over time, the duties of the school
psychologist have changed in response to the needs of students, mandates from state and
federal government, and research defining practices proven to be effective for meeting
the academic and behavioral needs of students (Fagan, 2008). The emergence and
growth of the field of school psychology occurred in tandem with the development of the
modern American public school. Efforts at special education in response to the growth
and diversity of the student body in American schools created the need for a variety of
new positions within the school, including attendance officers, guidance counselors,
school nurses, school psychologists, school social workers, and vocational counselors
(Fagan, 1992).
The work of early school psychologists developed as part of several movements,
particularly child study, the beginnings of psychology as a distinct field, and the
emergence of psychological and educational testing (Reisman, 1966; Wallin & Ferguson,
Another sign of development was that the mandatory provision of psychological
services to explore deficits in student behavior was written into state laws and
regulations by departments of education (Hollingworth, 1932).
school setting, such as population increases after World War II, and special education
laws requiring psychoeducational evaluations (Fagan, 2008), also helped promote growth
in the field, as school psychologists began to encounter larger student bodies with diverse
educational and behavioral needs. Current problem-solving approaches used to address
these needs look at the child's response to intervention, based on direct assessment of
observable behaviors, with less time devoted to understanding underlying traits or skills

(Fagan, 2008; Lichtenstein, 2008). At this time, however, the extent to which the
problem-solving approach informs the current practices of school psychologists is
unknown.
To ensure compliance with federal special education requirements, states passed
credentialing standards requiring that school psychologists complete certain courses and
gain specific experiences before working with children. These laws and credentialing
standards shaped training in school psychology. Over time, training experiences have
expanded to prepare school psychologists to conduct a variety of assessments and design
interventions based on their results, along with providing consultation and prevention
services in schools through a combination of classwork, practica, and internship
experiences (Fagan, 1986, 2008).
In order to meet the needs of students, the activities and responsibilities of the
school psychologist have also evolved. The primary and most enduring role of the school
psychologist has been assessment for the purposes of determining appropriate placement
and educational experiences. The second role entails designing and implementing direct
interventions for academic and behavioral issues, while the third involves applying
interventions on an indirect or systems-level basis as a consultant. Research on time
allotted to these different activities reveals that most school psychologists spend almost
half of their time on assessment, while devoting the remaining half to direct
interventions, systems-level consultation, and research (Bramlett, Murphy, Johnson,
Wallingsford, & Hall, 2002; Fisher, Jenkins, & Crumbley, 1986; Goldwasser, Meyers,
Christenson, & Graden, 1983; Hartshorne & Johnson, 1985; Lacayo, Sherwood, & Morris,
1981; Meacham & Peckham, 1978; Reschly & Wilson, 1995; Smith, 1984).

It is important to recognize that the field of school psychology emerged in direct


response to an expanding population of students in American public schools and their
corresponding needs. Over time, school psychology as a field has evolved in response to
these needs, as seen through refinements in problem-solving practices, credentialing
standards, training programs, and daily activities and responsibilities. Currently, school
psychologists and professional bodies are focusing attention on the behavioral and mental
health needs of students in schools (Agresta, 2004; Hosp & Reschly, 2002; NASP, 2006)
as research reveals that many times these needs go unmet (Ringel & Sturm, 2001),
preventing students from benefitting from instruction and developing academic skills
(Crespi, 2009; Farrell, Guerra, & Tolan, 1996; Haertel, Walberg, & Weinstein, 1983;
Wang, Haertel, & Walberg, 1990).
Counseling is one activity that school psychologists have engaged in to directly
address the social, emotional, and behavioral needs of the students with whom they work
(Doll & Cummings, 2008). Contemporary research on counseling practices reveals that
the most common theoretical orientation guiding school psychologists is cognitive
behavioral and that this service is most often delivered in a group format, with
individual and class-wide sessions also offered (Hanchon & Fernald, 2011; Yates, 2003).
Common referral issues addressed using counseling include behavioral issues, emotional
difficulties, academic problems, and social skills deficits (Hanchon & Fernald, 2011).
In an effort to inform the counseling practices of school psychologists,
professional organizations regularly publish research detailing best practices when
working to meet the needs of students. Several current themes within the realm of
counseling and direct interventions are the application of evidence-based interventions,

data-based decision making, and accountability. The literature on evidence-based


interventions provides the most support for treatments using cognitive behavior therapy
(CBT) and behavior therapy (BT; e.g., Behavioral classroom management [BCM;
Barkley et al., 2000; Miranda, Presentación, & Soriano, 2002; MTA Cooperative Group,
1999]). Furthermore, there is a significant overlap between the activities involved in
CBT and BT and best practices for designing counseling and direct interventions. For
example, best practices recommend designing interventions using a deliberate problem-solving approach in which the problem behavior is defined in objective and measurable
terms (Miltenberger, 2005), collecting direct measures of observable behavior before and
during the intervention (Upah, 2008), and using regularly gathered assessments of
behavior to make decisions related to continuing or revising the intervention (Kazdin,
1982). When applied to planning and implementing counseling as a direct intervention,
the use of evidence-based interventions and regular measurements of student behaviors
are two practices that will allow school psychologists to determine whether or not their
interventions are effective (Upah, 2008). This provides a measure of accountability for
meeting the behavioral and mental health needs of the students with whom school
psychologists work.
The current focus on accountability extends beyond school psychologists and
their professional organizations, as seen by federal legislation guiding the delivery of
instruction and assessment in American public schools. In a similar fashion to the growth
and evolution of school psychology, federal education legislation has been created and
shaped in response to the needs of students. A variety of court cases (e.g., PARC v.
Pennsylvania, 1972; Mills v. Board of Education, 1972) and expository reports have

publicized weaknesses and inequalities in the education provided to specific groups of


students. To address these inequalities, federal laws have been passed and reauthorized
over time (e.g., The Elementary and Secondary Education Act, The Individuals With
Disabilities Education Act) that have defined specific educational practices that must be
followed in schools to ensure access to adequate educational experiences for all students.
Specific examples of some of these practices include evidence-based instruction, positive
behavior supports, and early intervention services (20 U.S.C. 1400 [5][E]).
A prime example of accountability mandated by federal legislation is the No
Child Left Behind Act (NCLB). This piece of legislation specifies standards for effective
instruction that schools must meet, and mandates regular measurement of student
progress using standardized assessment. In addition, NCLB includes a series of
corrective actions for schools that do not demonstrate adequate gains in student progress
that range from resources and assistance for remediation planning, to staff replacement
and school closure. More contemporary actions by the federal government include grant
programs providing states with federal assistance contingent on their ability to implement
standards-based assessments, progress monitoring and data-based decision making
systems related to student academic achievement, and the use of student achievement
data to assess the performance of teachers and administrators (U.S. Department of
Education, 2009). Drafts of the next reauthorization of NCLB have thus far focused on
improving teacher and administrator effectiveness, the delivery of effective instruction
and interventions, and the implementation of standards and assessments to prepare
students for future success in college and careers (U.S. Department of Education, Office
of Planning, Evaluation and Policy Development, 2010). Judging from these goals and

the laws already in place, it would appear that accountability for student outcomes is one
theme that will continue to guide the activities of school professionals, including school
psychologists.
Purpose of Study
The purpose of the current study was to survey school psychologists about their
current counseling practices to determine whether they are implementing research and
best practice guidelines related to the use of evidence-based interventions, progress
monitoring, and data-based decision making. In addition, this study explored whether
specific aspects of counseling practices varied according to demographic characteristics.
Significance of Study
The notion of accountability is a common theme that is repeatedly mentioned not
only in the literature describing best practices in counseling and mental health, but also in
current and proposed federal education legislation (Wright & Wright, 2009). School
psychologists have expressed a desire to spend more time involved in counseling as a
direct intervention to meet the mental health and behavioral needs of their students
(Agresta, 2004). The use of evidence-based interventions, progress monitoring, and data-based decision making are measures of accountability that can be applied to the design
and implementation of counseling as a direct intervention and may provide assurance to
school psychologists that their efforts are addressing the mental health and behavioral
needs of their students. In addition, although, at this time, accountability standards and
consequences apply to educational practices and outcomes, as well as managed care and
the private sector (Kazdin, 2008), in future, accountability standards and consequences
may also govern school psychologists as they address mental health and behavioral

needs. Designing and implementing counseling interventions that allow school


psychologists to determine whether or not these interventions have been effective
provides a measure of accountability for meeting the needs of students, and anticipates a
time when the federal government may mandate such accountability.
Information on the effectiveness of an intervention also promotes the efficient
allocation of resources, in terms of space and materials, time spent by the school
psychologist, and time that the student spends away from classroom instruction and
independent opportunities for socialization with peers. With respect to time as a
resource, data on the effectiveness of an intervention allow school psychologists to make
data-based decisions on continuing, discontinuing, or changing the focus of an
intervention. In addition, these data allow school psychologists to clearly demonstrate to
students, families, teachers, and administrators whether the student has made
improvements, whether counseling is justified as a necessary service for the student, and
whether the mental health needs of the student population are being addressed. On a
larger scale, information on the current counseling practices of school psychologists, with
a specific focus on accountability and data-based decision making, is valuable in terms of
informing the focus of training programs preparing future school psychologists.

CHAPTER 2: Review of Relevant Literature


Overview
Public education has always been shaped by legislation. Throughout the history
of public education, the actions of federal, state, and local governments have been guided by evidence documenting the needs of students in American public schools. Early examples of government involvement focused on providing access to education, beginning with compulsory attendance laws in the early 1900s. Compulsory attendance laws and the
resulting increase in student needs between 1890 and 1930 were the major forces behind
the emergence and establishment of the field of school psychology (Fagan, 1992). The
Elementary and Secondary Education Act (ESEA; 1965) was designed to improve
educational outcomes and opportunities for children living in poverty by providing
funding for the construction of schools, instructional materials, and teacher salaries.
Around this time, similar bills were passed to improve educational access and outcomes
for students from diverse cultural backgrounds (e.g., Title VIII Bilingual Education Act
(1967); Bilingual Act of 1974 [Stewner-Manzanares, 1988]), as well as students with
educational exceptionalities (e.g., Rehabilitation Act of 1973; Education for All
Handicapped Children Act [1975; Beyer, 1989]).
Over time, after taking into account the results of implementation, pieces of
legislation were either discontinued or amended in an effort to improve the quality of
educational experiences provided to students. The ESEA, for example, has been
amended several times as law makers, educators, and researchers have attempted to
determine the best way to provide quality instruction (e.g., the Stennis Amendment to the
ESEA [1970; Crespino, 2006]), the appropriate amount of funding for specialized education programs (e.g., ESEA Federal Amendments [1974]), and how federal funding
should be distributed (e.g., the Educational Consolidation Improvement Act [1982; Gray,
Caulley, & Smith, 1982]). Issues related to funding and appropriate educational
opportunities are two themes that can also be traced throughout the evolution of
legislation focusing on students with educational exceptionalities. Once access to public
education had been secured, quality educational opportunities were defined initially as
appropriate placements and services (e.g., the Education of the Handicapped Act
Amendments [1986; House Committee on Education and Labor, 1986]), with later
specifications mandating educational experiences with non-disabled peers (e.g.,
Individuals With Disabilities Education Act [1990]; School of Public Health and Health
Professions, University at Buffalo, 2005), and an explicit connection between special
education and the general curriculum (e.g., Individuals With Disabilities Education Act;
National Center for Children and Youths with Disabilities [1998]).
Currently, legislation is focused on accountability for student outcomes. An early
example of accountability can be seen in the 1988 amendments to the ESEA, the
Hawkins-Stafford School Improvement Amendments (House Committee on Education
and Labor, 1988), as they made federal funding contingent on documented gains in student
achievement, with increased regulation for schools unable to demonstrate student
improvement. Since this time, standards defining adequate progress and corrective
actions when gains in achievement have not been made have become more specific, and
have had a significant impact on public education (Nelson & Weinbaum, 2009). A
contemporary example of accountability legislation is the No Child Left Behind Act
(NCLB; 2002), with provisions detailing appropriate training for teachers and paraprofessionals, standards-based instruction and assessment, and corrective actions for schools in need of improvement. In addition, current legislation focused on students with
educational exceptionalities mandates the provision of evidence-based instruction,
intervention, and behavioral supports, as well as high achievement standards (e.g., the
Individuals With Disabilities Education Improvement Act of 2004; Wright & Wright,
2009).
As school psychologists are among the professionals responsible for ensuring positive student outcomes, their work has been and continues to be influenced by federal legislation. As such, this chapter will explore the growth and
development of school psychology over time, in order to examine accountability as it
relates to school psychologists working to ensure positive social-emotional-behavioral
student outcomes through the design and implementation of counseling as a direct
intervention. This chapter begins with a brief history of the social factors that have
shaped the training, everyday practices, and responsibilities of school psychologists from
their early days to present times. Following this, the role of the school psychologist as a
counselor will be discussed, including the factors that have established counseling and
mental health services in schools, the current status of these services, and research and
best practices when designing and implementing counseling as an intervention. Because
of its impact on the roles and functions of the school psychologist, federal education
legislation, including actions that have already been taken, as well as research and
legislative mandates intended to guide next steps, is also chronicled. This chapter
concludes with a discussion of the advantages and disadvantages of internet survey
research, as well as the proposed research questions guiding this study.

School Psychology: History, Approaches, Training, and Current Status


This section will discuss selected facets of the growth and development of the
practice of school psychology from its origination in the early 1900s to the present.
Selected events and individuals whose actions were vital to the establishment of school
psychology are described briefly to provide necessary context and background. This
context will be followed by a depiction of the changes in the problem-solving approach
guiding school psychologists in their work with children and families. Details on the
evolution of school psychology training programs have also been included. This section
concludes with a chronicle of the changes in the role and function of the school
psychologist over time. The reader will notice that the field of school psychology has
evolved over time in response to the needs of students, as evidenced by federal mandates,
credentialing standards, and training guidelines from professional organizations.
Although the exact form is difficult to predict, it can safely be assumed that the field of
school psychology will continue to change as school psychologists strive to meet the
needs of students in schools today. In Table 1, the reader is provided with a brief
historical context of the foundation and early development of the field of school
psychology.
Growth and development of the field of school psychology. The profession of
school psychology emerged as a response to increased diversification in the academic and
social-emotional needs of students in American schools as a result of the passage of
compulsory education laws. Early growth and solidification of roles in the field occurred
in tandem with improved methods of understanding and addressing educational needs.
Knowledge detailing normative development and learning in children was gained through the child study movement and in early psychological clinics. This knowledge was applied to the creation of standardized psychoeducational tests, which provided an objective way of measuring student abilities. Early psychological tests were then piloted and improved as their use became mandated by state and federal laws. The remainder of this section will provide greater depth related to changes in service provision, problem-solving approaches, training, and roles and functions, to describe how the field of school psychology has developed and evolved.

Table 1
Summary of Important People and Events Contributing to the Development of the Field of School Psychology

Event/Person: Social phenomena, including compulsory attendance laws, immigration, and a rapid increase in the number of students attending American public schools (Grant & Eiden, 1980)
Implications:
- School staff were presented with challenges related to truancy, learning difficulties, and discipline (Irwin, 1915; Reisner, 1915)
- Schools began to offer medical and psychological examinations (Wallin, 1914), as well as special education classes (Fagan, 1992; Van Sickle, Witmer, & Ayers, 1911)

Event/Person: G. Stanley Hall, the child study movement, and data collection on child development
Implications:
- The establishment of a scientific understanding of children's development and behavior (Benjamin & Baker, 2003; Hollingworth, 1932)
- Parents and school staff were now armed with more precise knowledge related to children's behavior and development
- Hall advocated for the practice of child-centered, individualized instruction, wherein the curriculum of the school was revised to best meet the needs of the student (Minton, 1987)

Event/Person: Lightner Witmer and the establishment of the psychological clinic
Implications:
- In-depth diagnosis and treatment for children who struggled in school (Benjamin & Baker, 2003)
- Creation of an applied field of psychology focused on the diagnosis and resolution of education problems (Witmer, 1897)

Event/Person: Creation and proliferation of psychoeducational tests, including the Binet-Simon Scales in 1905 (Cutts, 1955), the Intelligence Quotient (Benjamin & Baker, 2003), age norms for children (Hollingworth, 1932), and tests suitable for group administration (Cutts, 1955)
Implications:
- Developments and refinements in the use of psychoeducational testing allowed school psychologists to classify students in an objective and standardized fashion

Event/Person: Clauses mandating the provision of psychological services for students
Implications:
- Services were provided for students who came into contact with their state's juvenile justice system (Hollingworth, 1922) or were being considered for placement in special education classes (Hollingworth, 1932; University of the State of New York, 1931)
- The profession and expertise of the school psychologist were given legitimacy

Event/Person: Thayer Conference in 1954 (Cutts, 1955)
Implications:
- Assessed the development and future of the profession
- Defined the roles and function of the school psychologist
- Delineated qualifications and training standards
- Increased the number of school psychology practitioners (Fagan, 2008)
- Improved the service ratio of school psychologists to students (Charvat, 2005; Fagan, 1988; Reschly & Connolly, 2000)

Event/Person: Establishment, growth, and development of professional organizations representing school psychology, including Division 16 of the American Psychological Association (APA) in 1945 and the National Association of School Psychologists in 1969
Implications:
- Promoted growth and development of the field
- Provided representation for school psychologists in matters of public policy
- Promoted the use of best practices through the dissemination of information
- Established nationally recognized training and certification standards (Benjamin & Baker, 2003)
Refinements in the provision of school psychological services. The discussions
held and consensus reached at the Thayer Conference had a profound impact on the field
of school psychology that helped to solidify the professional identity of the school
psychologist (French, 1984). Fagan (2008) provided an excellent description of the
evolution of the field of school psychology that serves to explain the forces behind the
growth in the number of school psychology practitioners and the refinement in service
ratios. Societal factors impacting the growth in the number of school psychology
practitioners included an increase in the population after World War II, and the passage
of special education laws requiring the provision of psychological services (Fagan, 2008).
Early school psychological services were provided in the schools through external
agencies. By 1960, however, the school was the primary employment location for school
psychologists. In the 1980s, federal laws regarding evaluations for students with
disabilities, increases in insurance reimbursement for psychologists, and growth in the
number of school psychologists earning doctoral degrees contributed to variety in the
locations where school psychological services were to be provided (Fagan, 2008).
Contemporary practice settings include public and private schools, private practices, state departments of education, colleges or universities, and medical institutions (Curtis, Lopez, Batsche, & Smith, 2006).
Problem-solving approaches. The evolution in the practice of school
psychology can also be understood by looking at changes in the problem-solving
approach used by practitioners at different points in time. Fagan (2008) described how
this focus initially centered on the child, expanded to consider more ecological variables,
and currently focuses on the child's response to intervention. The work of early school
psychologists was largely based on the medical model, as problems children experienced
were thought to be a direct result of their skills, abilities, or personality characteristics
(Irwin, 1915; Reisner, 1915). The responsibility of the school psychologist was to study
the child, diagnose the problem, and design a student-centered intervention (Hutt, 1923).
To understand the problem, school psychologists used questionnaires, interviews,
informal assessments, analysis of work samples, health and developmental histories,
classroom observation, and medical examinations (Fagan, 1992, 2008). The advent of
standardized testing added specificity and objectivity to the diagnosis of problems and
classification of students based on abilities and behaviors. Interventions at this time
included counseling, academic skill remediation, and instructional or alternative
placement special education (Fagan, 2008).
Gradually, the exclusive focus on the child was broadened, such that assessment
and intervention included other variables considered significant to development and
behavior. School psychologists adopted an ecological approach, as they began to
consider how the interactions of factors such as parents or guardians, the home
environment, the child's teacher, and the instruction delivered in the classroom were affecting the behavior and educational achievement of the students with whom they
worked. In addition to standardized and unstandardized measures of student intelligence
and academic achievement, practitioners now used formal and informal measures to
gather information on the student's peer relationships, feelings about school, relationship
with parents or guardians, instructional environment, and behavior with and without
peers (Fagan, 2008). Child-centered interventions enjoyed continued use, and were
refined by improvements in special education and therapies for groups and individuals,
informed by developments in child-centered, rational-emotive, and behavior modification
theories (Fagan, 2008). Special education placements at this time became less restrictive,
as mainstreaming and inclusion practices became popular. Interventions were now also
provided indirectly, as school psychologists consulted with teachers and parents (Porter
& Holzberg, 1978).
Although the ecological approach had a significant impact on the practice of
school psychology, more contemporary approaches focus on assessment and the design
of corresponding interventions based on directly observable academic skills and
behaviors, with less of a focus on underlying abilities or traits (Lichtenstein, 2008).
School psychologists still consider a range of ecological variables related to peers,
teachers, classroom environment, instruction, and parenting, but only as these relate to
the creation of outcome-based assessments and interventions (Batsche, Castillo, Dixon, &
Forde, 2008). Provisions in special education laws requiring functional behavioral
assessment and treatment accountability reinforced a focus on outcomes, along with
increased priority placed on the use of empirically supported interventions as articulated
by clinical, counseling, and school psychology. Current problem-solving approaches now directly assess academic skills embedded in the school curriculum, with
interventions directly linked to skill development, rather than relying entirely on
normative assessment measures (Hosp, 2008). The movement towards outcome-based
and empirically valid practices is exemplified in the response to intervention (RTI)
model. Procedures used within the RTI model include pre-referral assessment and screening; assessment and intervention based directly on classroom curriculum; criterion-referenced measurement; evidence-based practices; and problem-solving consultation (Casbarro, 2008). The use of RTI represents a distinct shift from child-centered approaches to assessment and intervention, but is also one supported by current
federal legislation (Lichtenstein, 2008).
Training. Growth and refinement in the field of school psychology were
accompanied and made possible by increases and changes in preparation programs. At
the time of the Thayer Conference, 28 institutions offered training in school psychology
(Cutts, 1955). As time progressed, the number of institutions offering school psychology
training programs increased from 28 in 1954 to 211 in 1984 (Fagan, 1986). The National
Association of School Psychologists (NASP) program directory supplements these data,
indicating that in 1989, there were 231 programs, with more current figures from 2008
listing 238 institutions that provided programs in school psychology (Fagan, 2008).
A look at the development of school psychology training programs provides
another lens for examining the evolution of the field. Early school psychology training
programs provided their students with a foundation in clinical psychology, educational
psychology, or a blend of the two (Fagan, 2008). Up until the 1960s, the content offered
in training programs was primarily determined by the background and interests of faculty members, and their opinions on skills necessary to deliver psychological services in public schools (Fagan, 1986). State credentialing provisions also influenced training.
National standards did not exist at this time. As such, classes focused on traditional
psychology or education with specialized training in areas such as psychoeducational
assessment (Cutts, 1955). As federal special education laws were passed, states
incorporated compliance provisions into their credentialing regulations, which in turn
impacted coursework provided to prospective school psychologists.
In addition to the effect of state credentials, over time, school psychology training
programs were further impacted by the creation of accreditation standards established by
the American Psychological Association (APA), the National Council for Accreditation
for Teacher Education (NCATE), and NASP. To address these new accreditation
standards, content offered by training programs became more specific, and began to
prepare students to offer different types of assessments, as well as corresponding
interventions, along with refining research methods, and providing consultation and
prevention services in schools (Fagan, 2008). More recent changes in training programs
included the extension of field experience requirements to include practica and internship
experiences, and courses taught by faculty members trained specifically in school
psychology (Fagan, 1986, 2008). Currently, standards specified by state psychology
boards have considerable influence over doctoral-level training programs. Most boards
require students to complete a minimum of two years of supervised preparation.
Contemporary coursework continues to provide training in assessment, intervention, and
consultation, while also focusing on topics such as the use of technology in schools, crisis
intervention, prevention, the use of empirically based practices, and data-based problem-solving.
Contemporary roles and functions. Over time, the role and function of the
school psychologist has also changed. Fagan (2008) described four roles that school
psychologists have played at various points in the history of the field. The first, and most
enduring role, is that of the assessor, whose primary responsibility has been the
administration of psychoeducational assessments for the purpose of special education
placement. This role has evolved, such that the school psychologist is now part of a team
of service providers who use a vast array of assessment tools to determine placement and
services for a variety of students in schools. At times, the role of the assessor has also
included identifying and designing interventions for students considered to be at-risk for
academic and social-emotional difficulties in school. The second role is that of the direct
interventionist, where the school psychologist provides individual and group intervention
in the form of academic remediation or counseling. The range of interventions provided
by school psychologists has expanded over the years as a result of improvements in
training and internships, acceptability of more direct interventions such as psychotherapy
in schools, and a focus on the need for the provision of mental health services to students
in schools (Fagan, 2008; Sandoval, 1993; Talley & Short, 1996). The role of the
consultant is the third function, and while this role was practiced by early school
psychologists, only in recent decades has it become a well-researched and theory-based
process (Fagan, 2008). An extension of the consultant role is the more recent function of
the school psychologist as a systems-level interventionist who designs assessments,
interventions, and prevention efforts at the school level, as opposed to the individual level
(NASP, 2010).

Despite a proliferation in roles and functions, assessment activities continue to be the mainstay of the practice of school psychology (Merrell, Ervin, & Gimpel, 2006b).
National surveys from the past several decades documenting how school psychologists
spend their time reveal some consistencies in time allocation. For example, respondents
reported spending approximately 50% of their time on assessment, 20-25% engaged in direct
intervention, 20-25% on consultation, and the remaining time involved in systems-level
or research activities (Bramlett, Murphy, Johnson, Wallingsford, & Hall, 2002; Fisher,
Jenkins, & Crumbley, 1986; Goldwasser, Meyers, Christenson, & Graden, 1983; Hartshorne & Johnson, 1985; Lacayo, Sherwood, & Morris, 1981; Meacham & Peckham,
1978; Reschly & Wilson, 1995; Smith, 1984). Interestingly, studies comparing actual
with desired time allocation reveal that school psychologists would like to spend less
time on assessment, and devote more time and resources to interventions, consultation,
research, and program evaluation (Hosp & Reschly, 2002; Reschly & Wilson, 1995).
Current school psychology practitioners have also expressed a desire to address
the mental health needs of students in school (Agresta, 2004; Hosp & Reschly, 2002).
For example, data from national school psychologist surveys indicated that two-thirds of
their respondents provided the following services to address the mental health needs of
their students: individual counseling, crisis intervention, assessment or evaluation,
behavior-management consultation, case management, referrals to specialized programs,
group counseling, and substance use and/or violence prevention (Brener, Martindale, &
Weist, 2001; Foster et al., 2005). Specifically related to student mental health, Agresta's (2004) research on desired roles revealed that school psychologists would like to spend
more time each week on parent education and consultation, and individual and group counseling activities because they see the need for and value of such services in their schools.
In addition to the practitioner focus on mental health, professional bodies, such as
NASP, have also responded to concerns and research in this area. Recent NASP position
papers (2006) have recommended that schools provide a range of comprehensive mental
health services where there is found to be a need, in order to promote academic
achievement, school connectedness and community, respectful behavior, student well-being, and a positive school climate. Current training guidelines (NASP, 2000) have
recommended that school psychologists receive training in designing and implementing
prevention and intervention programs to foster the mental health and physical well-being of the student bodies they serve. Because of school psychologists' training in mental health and education, NASP (2006) has advocated for their continued role expansion as effective providers of a wide range of interventions to be delivered in the
school building, such as universal prevention and intervention, specific interventions for
students at risk, and comprehensive interventions as designed or reinforced by
community agencies.
Section Summary. Although the exact direction that the field of school
psychology will take in the future is difficult to predict, a brief look at the history of the
field makes it clear that the role and function of the school psychologist will continue to
evolve. The effect of federal legislation, state credentialing regulations, and training
standards from professional bodies such as APA, NCATE, and NASP will likely continue
to influence training programs and the daily functioning of school psychologists. The
argument can also be made that societal factors, which were largely responsible for the emergence and establishment of the field of school psychology, will also continue to
influence future directions. Concern for the mental health and well-being of children in
schools is beginning to exert considerable influence on school psychologists, training
programs, and professional bodies.
School Psychologists as Counselors
In addition to focusing on educational needs, one of the roles of the school
psychologist has always been addressing the social, emotional, and behavioral needs of
students in school (Doll & Cummings, 2008). Counseling has always been one of the
roles played by the school psychologist, although the time spent on this activity has periodically been limited by assessment responsibilities; the school psychologist's comfort with and willingness to offer counseling; and the presence of other providers of counseling services within a school building (Murphy, 2008). An
understanding of current research and best practices related to evidence-based
interventions, and to designing and implementing counseling as a direct intervention, is one
way to promote this role in cases where school psychologists report experiencing
discomfort. Given the current focus on mental health needs and student well-being,
knowledge describing effective practice is essential to ensuring that counseling
interventions produce positive social-emotional outcomes for students.
Definitions of counseling and intervention. Many definitions of counseling
have been offered by different authors and professional organizations. For example, the
American Counseling Association (ACA; 2011) defines counseling as "a professional relationship that empowers diverse individuals, families, and groups to accomplish mental health, wellness, education, and career goals" (n. pag). According to the Merriam-Webster dictionary (2011), counseling entails providing "professional guidance of the individual by utilizing psychological methods especially in collecting case history data, using various techniques of the personal interview, and testing interests and aptitudes" (n. pag). Of more relevance to the school psychologist are the definition and
descriptions of counseling provided by Velleman and Aris (2010). According to these
authors, counseling is primarily about enabling individuals, "as far as possible, to overcome obstacles, to take control of their own lives, [and] to learn how to take maximum responsibility and decision-making power for themselves and their futures" (pp. 19-20). Furthermore, these authors describe how the different approaches to counseling can be grouped into two different categories. The first approach is "a more directive or prescriptive, training or skills oriented, external-action oriented style, where the helper directs, instructs or guides the person in need to an appropriate action" (Velleman & Aris, 2010, p. 20), while the second approach involves "a more facilitative, internal, insight oriented, personal-exploration oriented style, where the helper is less directive and seeks to encourage and support the person in need to work with and discharge emotion and to reach their own realizations of appropriate actions" (Velleman & Aris, 2010, p. 20).
Implied in the definitions provided for counseling is the notion of change, from
one maladaptive behavior or circumstance to another that is considered to be more
adaptive. The idea of change is also found in the definition of an intervention, where intervening involves "coming in or between by way of hindrance or modification" (Merriam-Webster, 2011). Following this line of reasoning, counseling can be thought of
as a specific type of intervention. For the purposes of this discussion, an intervention is being defined as a set of strategies or procedures designed to improve the performance of one or more students, with the objective of narrowing the divide between current performance and expectation (Upah, 2008). Furthermore, social-emotional and
behavioral interventions are those that focus on the presence or absence of behaviors that
impede learning and academic achievement, in order to develop attitudes and skills
necessary for effective functioning in society and career success (Forman & Burke,
2008).
Rationale for school psychologists and counseling. Doll and Cummings (2008)
provided important context related to the priority that has been placed on the provision of
counseling and mental health services in schools. In the 1970s, counselors, school
psychologists, and social workers focused their attention on offering guidance to address
minor student issues, while referring more significant cases of behavioral issues to
community mental health providers. The focus of the school was on meeting the
educational needs of the student body, as it was assumed that only a small percentage of
the student body required mental health services, which they were receiving through
outside agencies (President's Commission on Mental Health, 1978).
Over time, evidence was revealed that served to challenge the notion that the
mental health needs of children were being adequately addressed outside of the school
building. For example, epidemiological data gathered during the 1980s suggested that at
least 20% of school children had a diagnosable psychiatric disorder, while only one-fourth of these children were receiving mental health services to address their needs
(Doll, 1996; U. S. Department of Health and Human Services, 1999).
Research from the past two decades is rife with examples to support the
evaluation by Tolan and Dodge (2005) that students in America are currently
experiencing a mental health crisis. For instance, Huang et al. (2005) reported that 1 in 5
children has a diagnosable mental disorder. Furthermore, 3-7% of children have been
diagnosed with Attention-Deficit/Hyperactivity Disorder (Root & Resnick, 2003).
According to the American Psychiatric Association (2000), some of the issues currently
found to have a negative impact on children include family disputes, child abuse,
attention disorders, and violence. One in seven children is reported to have been
punched, kicked, or choked by a parent (Moore, 1994). In addition, Crespi and
Howe (2002) estimate that approximately 80% of children have been exposed to some
form of spousal abuse. One in six families has had to cope with the effects of parental
alcoholism, resulting in 28-34 million people who have directly experienced life in an
alcoholic family (Newcomb, Galaif, & Locke, 2001).
It is reported that more than 8 million children are in need of psychological
services (Carnegie Council on Adolescent Development, 1996), but that most youth with
a psychological disorder never receive mental health care (Farmer, Burns, Phillips, Angold,
& Costello, 2003; Ringel & Sturm, 2001). For example, a 2002 study by Kataoka,
Zhang, and Wells described the percentages of children who accessed mental health
services across three cross-sectional, nationally representative samples comprised of
more than 11,500 households. In their sample, it was found that 15-21% of youth ages 6-17 had a mental health problem, while only 6-7.5% of those same youth (or 29-49% of
the entire sample) were receiving some form of mental health treatment. One reason for
this may be an inability to pay for such care, as it is estimated that 1 in 7 adolescents lack
health insurance or third-party reimbursement for mental health services in the private
sector (Crespi & Howe, 2002).
On average, children spend six hours of each day in school (Crespi, 2009).
Observations have revealed that children who exhibit problematic behaviors in school
come from families who have experienced difficulties in one or more aspects of
functioning (Fergusson, Horwood, & Lynskey, 1994), such that children who learn
aggressive behaviors at home tend to bring them to school (Farrell, Guerra, & Tolan,
1996). Furthermore, academic performance and difficulties with behavioral adjustment
have been correlated with conflict children experience at home (Crespi, 2009). When
developing an understanding of a child's behavior, however, one useful perspective
comes from Bandura's Social Learning Theory (Bandura, 1977). According to the
reciprocal determinism tenet of this theory, a students behaviors, cognitions and
personality variables, and environment can be understood as interconnected factors that
have a bi-directional effect on each other (Bandura, 2004). The strength of the influence
of one factor relative to the others will vary depending on the child, his or her
environment, and the specific circumstances of that environment (Bandura, 2004).
Therefore, the factors causing and maintaining negative behaviors come from a variety of
different sources, and form a complex constellation of potential cause-and-effect
relationships.
Of note to school psychologists and mental health professionals is that children
have tended to access more mental health services at school than in any other venue
(Burns et al., 1995; Hoagwood & Erwin, 1997; Leaf et al., 1996). To illustrate this,
Farmer et al. (2003) pointed to epidemiological research on mental health prevalence
rates and service delivery, stating that 11-12% of youth in any given year sought services
through the education sector, while only 7% pursued these services through specialty
mental health providers, or in a medical facility (4%). These results suggest that schools
may be seen as the primary source for mental health services for children and youth. In
addition, school-based early intervention programs have been found to be effective in
reducing delinquent behavior in adolescents (Crespi & Rigazio-DiGilio, 1996),
suggesting that schools may be an ideal location for the provision of psychological
services (Crespi & Fischetti, 1997). Developmental research highlights the fact that
student mental health and psychological well-being are necessary conditions for
educational success at school (Haertel, Walberg, & Weinstein, 1983; Wang, Haertel, &
Walberg, 1990). As these findings have become better known, schools have
responded by becoming the default provider of mental health services for most children
and adolescents (Hoagwood & Johnson, 2003).
The provision of counseling and mental health services in schools is also
mandated by federal legislation. For example, according to IDEA, counseling and
psychological services are two related services that schools must provide to students, if it
is found that these services are necessary for students with disabilities to benefit from
special education (Wright & Wright, 2009). These authors have explained specific legal
requirements related to the provision of counseling in schools. Counseling services can
only be provided by social workers, psychologists, guidance counselors, or other
qualified professionals. The legal definition of psychological services covers a variety of
responsibilities carried out by the school psychologist. Responsibilities specifically
related to counseling include administering and interpreting assessments, obtaining and
sharing information about a child's behavior and conditions necessary for learning,
collaborating on the development of positive behavioral intervention strategies, and
planning and implementing psychological services such as psychological counseling for
children and parents.
In addition to legal mandates, professional organizations representing school
psychology, such as NASP and APA, specify that school psychologists provide
counseling to the students with whom they work. For example, according to the NASP
(2010) publication listing the professional services of school psychologists, some of the
direct services that school psychologists deliver to children, families, and schools include
interventions and mental health services designed specifically to develop social and life
skills. According to these standards, mental health services may also be provided
indirectly at the systems level through the implementation of prevention and responsive
services designed by school psychologists (NASP, 2010). Furthermore, the APA (2011)
defines one of the roles of the school psychologist as one who is able to "conceptualize
children's development from multiple theoretical perspectives and translate current
scientific findings to alleviate cognitive, behavioral, social, and emotional problems
encountered in schooling" (n. pag).
Current counseling practices of school psychologists. Several researchers have
gathered data describing the counseling practices of school psychologists using surveys
(Curtis et al., 2008; Hanchon & Fernald, 2011; Yates, 2003) and focus groups (Suldo,
Friedrich, & Michalowski, 2010). The most common theoretical orientation guiding the
counseling process of the school psychologists sampled was cognitive behavioral,
followed by behavioral, brief solution-focused, and reality-based (Hanchon & Fernald,
2011; Yates, 2003). Focus group responses indicated that group counseling was the most
commonly utilized format (Suldo et al., 2010), with a range of 53-74% of school
psychologists indicating that they offered this service (Hanchon & Fernald, 2011; Yates,
2003). The average number of student groups counseled each year was 8.8 (Curtis et al.,
2008), with group sizes ranging from 2-4 students (Yates, 2003). Respondents reported
seeing 1-5 groups each week, while offering each group between 5-16 sessions (Yates,
2003). Specific issues addressed during group counseling sessions included social skills
development, anger management, study skills, anxiety, grief, and organizational skills
(Suldo et al., 2010).
The second most commonly used counseling format was individual counseling
(Suldo et al., 2010), with 61-88% of school psychologists providing this service
(Hanchon & Fernald, 2011; Yates, 2003). School psychologists met with an average of
9.9 students each year for individual counseling (Curtis et al., 2008). Students received 5
or more sessions each year, with session lengths ranging from 30-45 minutes each (Yates,
2003). Specific concerns addressed through individual counseling included crisis
intervention, suicide assessment and intervention, threat assessment, de-escalation, and
various other intervention components (Suldo et al., 2010). A range of 32-52% of those
sampled administered classroom counseling sessions (Hanchon & Fernald, 2011; Yates,
2003), making this the third most common counseling format. Issues addressed during
classroom sessions included teaching social skills, family issues, girls' issues, violence
prevention, study skills, art and music therapies, and self-esteem issues (Yates, 2003).
Respondents from two of the surveys also provided family counseling (8-31%; Hanchon
& Fernald, 2011; Yates, 2003). In addition, respondents to the Hanchon and Fernald
survey (2011) also provided crisis response counseling (51%), individual counseling with
adults (7%), and brief solution-focused counseling sessions (45%).
In addition to the provision of direct counseling services, school psychologists
interviewed by Suldo and colleagues (2010) during their focus group described offering a
variety of indirect mental health services to students, teachers, and parents. For example,
after individual counseling, consultation with school staff, teams, and parents was the
next most frequently delivered service; school psychologists also indicated that they
commonly designed behavioral interventions. These school psychologists also
provided case management services in collaboration with psychiatrists and other
outpatient agencies, in addition to engaging in their own social-emotional behavioral
assessments. These school psychologists, moreover, offered counseling services to other
school employees to address various personal needs affecting their ability to fulfill their
educational responsibilities. Furthermore, school psychologists delivered in-service
professional development and training to school staff and parents, and implemented
prevention measures such as school- and class-wide screening, and drug education. The
final service they administered involved support groups for parents.
Several researchers have surveyed school psychologists about the types of referral
issues with which they are confronted. A discrepancy in responses was noted in this area,
depending on when data on referral issues had been collected. For example, according to
the responses provided by Bramlett, Murphy, Johnson, and Wallingsford (2002) and
Yates (2003), the most common referral issue that school psychologists in their samples
received involved academic problems, followed by externalizing behaviors (e.g., conduct,
anger, ADHD), peer relationship problems, and self-esteem issues. Data collected
recently, however, indicated that behavioral issues were the most common referral
received by school psychologists, followed by emotional difficulties, academic problems,
and social skills deficits (Hanchon & Fernald, 2011). This change suggests that the focus
on meeting the mental health needs of students may be warranted, as behavioral issues
may be having a more significant impact on the ability of students to benefit from
instruction and participate in school. Although not as frequently, internalizing behaviors,
such as depression, anxiety, suicidal thoughts and thought disorders, as well as
difficulties related to trauma and abuse, were also referred to school psychologists (Yates,
2003). Referrals also seemed to vary by age. Depression, motivation, school refusal,
substance abuse, and truancy referrals were more frequently received by school
psychologists working with students in grades 6-12 than those who worked in elementary
schools (Yates, 2003).
Section summary. Although school psychologists continue to report allocating a
majority of their time to assessment activities (Bramlett, Murphy, Johnson, Wallingsford,
& Hall, 2002; Fisher, Jenkins, & Crumbley, 1986; Goldwasser, Meyers, Christenson, &
Graden, 1983; Hartshorne & Johnson, 1985; Lacayo, Sherwood, & Morris, 1981;
Meacham & Peckham, 1978; Reschly & Wilson, 1995; Smith, 1984), several factors have
converged to focus attention on student mental health and emotional well-being.
Research from different fields and specialties has documented examples of student
exposure to a variety of stressors (American Psychiatric Association, 2000) and the
difficulty some students experience functioning efficiently in school as a result (Crespi,
2009). Furthermore, student mental health needs have not consistently been addressed
outside of school (Kataoka, Zhang, & Wells, 2002). At the same time, there is also
evidence citing the success of prevention and early intervention programs delivered in the
school setting (Crespi & Fischetti, 1997; Crespi & Rigazio-DiGilio, 1996).
These factors have prompted school psychologists to offer counseling and mental
health services on a direct and indirect basis (Curtis et al., 2008; Hanchon & Fernald,
2011; Suldo, Friedrich, & Michalowski, 2010; Yates, 2003). Professional organizations
have also responded by delineating training standards and guidelines (NASP, 2010), as
have federal legislators, who have passed legal mandates regulating these services
(Wright & Wright, 2009) in an attempt to address student mental health needs. Given the
needs of students, knowledge detailing the current counseling practices of school
psychologists is valuable. Determining whether school psychologists' counseling
practices have adequately addressed student needs, however, may require establishing a
solid foundation of research documenting effective practices in counseling, and ensuring
that those practices and strategies proven to be effective are those being used.
Current Best Practices in Counseling
A variety of different authors and professional organizations provide guidance for
school psychologists as they implement direct and indirect mental health services. As an
example, NASP regularly publishes information detailing best practices which help to
translate research findings into steps that can be taken within the school setting. At this
time, data-based decision making and accountability are two practices that form the basis
for service delivery within the field of school psychology. These practices are also
heavily researched and a popular focus of discussion in relation to educational and mental
health interventions. Doll and Cummings (2008) discussed the importance of data-based
decision making and accountability in relation to the provision of population-based
mental health services, stressing that, based on an assessment of the needs of the school
population, school mental health teams should identify indicators of student emotional
well-being early in the process of planning services. In addition, once these indicators
have been specified, methods for regularly evaluating whether these objectives are being
met must also be determined. These authors recommended continuous and formative
assessment to inform the actions of mental health providers.
In their publication for the NASP School Psychology Forum, Coffee and Ray-Subramanian (2009) described the use of Goal Attainment Scaling (GAS) as one method
for regular progress monitoring of behavioral interventions that can be completed by a
variety of school professionals familiar with the student, or even by the student him- or
herself. According to these authors, additional benefits of using GAS include its utility as
a repeated measure to monitor student behavior on a daily or weekly basis due to its
sensitivity to small changes in behavior, as well as being a tool to evaluate the overall
effectiveness of a given intervention.
Furthermore, Doll and Cummings (2008) supported the exclusive selection and
use of evidence-based treatments as school psychologists provide direct and indirect
mental health services to students. In relation to this, as part of best practices related to
brief individual counseling, Murphy (2008) recommended developing clear and
meaningful goals with the student, while evaluating progress towards goals regularly
throughout the counseling process using feedback from the client, and by comparing
precounseling and postcounseling data gathered using observations, behavior rating
scales, grades, and discipline records.
Evidence-based practices/evidence-based interventions. Over the past 10
years, practitioners in the fields of mental health and education have expressed significant
interest in psychosocial treatments that have been empirically proven to successfully
address a variety of child and adolescent behavior problems (Silverman & Hinshaw,
2008). Throughout the literature, these types of psychosocial treatments are referred to as
evidence-based practices (APA, 2005), evidence-based treatments (Doll & Cummings,
2008), evidence-based interventions (Kratochwill & Shernoff, 2004), and empirically-supported treatments (Association for Behavioral and Cognitive Therapies [ABCT] &
Society of Clinical Child and Adolescent Psychology [SCCAP], 2010b). Presented in
Table 2 are examples of definitions for the different terminology used to describe
evidence-based interventions. Despite the differences noted among these definitions, a
common theme in the focus on evidence-based interventions is the use of strategies,
therapies, or practices that have been empirically proven to achieve a specific result, or
produce a specific behavior. To maintain clarity, I will reference the definition provided
by Kratochwill and Shernoff (2004), who maintain that evidence-based interventions
(EBIs) are those whose contextual applications have been demonstrated, and which have
been proven to be efficacious when implemented and evaluated in practice settings.
Several issues have been noted in the research literature surrounding EBIs. The
first relates directly to terminology. Throughout the last several decades of the
evidence-based practice movement, one key stakeholder, the American Psychological
Association Task Force on Psychological Intervention Guidelines, decided to replace the
phrase "empirically validated" with "empirically supported," as it was their opinion that a
particular treatment could never be completely validated (Chambless & Hollon, 1998; Lonigan,
Elbert, & Johnson, 1998). Currently, this group advocates for the use of the term
"evidence based," as it is their view that having the word "evidence" in their definition
facilitates the use and understanding of evidence-based interventions by professionals in
other fields (Silverman & Hinshaw, 2008). At this time, the term "evidence based" is also
used by the APA Presidential Task Force on Evidence-Based Practice (2006), as well as
researchers and advocates in the field of medicine (Sackett, Rosenberg, Gray, Haynes, &
Richardson, 1996).

Table 2

Definitions of Evidence-Based Interventions

Evidence-based practice in psychology (APA, 2005): "the integration of the best
available research with clinical expertise in the context of patient characteristics, culture,
and preferences" (n. pag)

Evidence-based practice (Association for Behavioral and Cognitive Therapies [ABCT] &
Society of Clinical Child and Adolescent Psychology [SCCAP], 2010a): "treatments that
are based directly on scientific evidence that has revealed the strongest contributors and
risk factors for psychological symptoms. Most EBPs have been studied in several
large-scale clinical trials, involving thousands of children and/or adolescents and careful
comparison of the effects of EBPs vs. other types of psychological treatments. Dozens of
multi-year studies have shown that EBPs can reduce symptoms significantly for many
years following the end of psychological treatment; similar evidence for other types of
therapies is not currently available" (n. pag)

Evidence-based interventions (Forman & Burke, 2008): "interventions [that] are
empirically supported substantiated with findings in the research literatures that
demonstrate that they are likely to produce predictable, beneficial, and effective results"
(p. 799)

Evidence-based interventions (Kratochwill & Shernoff, 2004): "an intervention should
carry the evidence-based designation when information about its contextual application
in actual practice is specified and when it has demonstrated efficacy under the conditions
of implementation and evaluation in practice" (p. 35)

Empirically-supported treatments (Ollendick & King, 2004): "treatments of scientific
value" (p. 5)

Empirically-supported treatments (Association for Behavioral and Cognitive Therapies
[ABCT] & Society of Clinical Child and Adolescent Psychology [SCCAP], 2010b):
"interventions that have been found to be efficacious for one or more psychological
conditions, like major depression, panic disorder, or obsessive-compulsive disorder,
within a given population" (n. pag)
Another issue is related to distinctions made by researchers between efficacy and
effectiveness studies (Hoagwood, Hibbs, Brent, & Jensen, 1995; Weisz, Donenberg, Han,
& Weiss, 1995). Efficacy studies provide information on whether a specific treatment
has been found to reduce symptoms or impairment when the researchers have used
experimental methods, such as random assignment, control groups, and manualized
treatment protocols (Silverman & Hinshaw, 2008). On the other hand, effectiveness
studies describe interventions or treatments that have been found to decrease symptoms
or impairment in environments where such interventions are most likely to be delivered,
such as classrooms or mental health centers (Silverman & Hinshaw, 2008).
One additional distinction that has recently gained importance in the literature on
EBIs involves defining mediators and moderators related to treatment outcomes
(Hinshaw, 2002; Holmbeck, 1997; Kraemer, Wilson, Fairburn, & Agras, 2002).
Moderators of treatment outcome are factors that are independent of the treatment
condition but exert a significant influence on the differential effects of a treatment
condition (Kraemer, Wilson, Fairburn, & Agras, 2002); thus, moderation details for
whom or under what conditions therapeutic change takes place (Silverman & Hinshaw,
2008). Mediators of treatment outcome, in contrast, are not factors independent of the
treatment condition, but instead are a consequence of the treatment that explains why one
treatment produces improved outcomes compared to another (Kraemer, Wilson, Fairburn,
& Agras, 2002); mediation describes how therapeutic change is produced when a specific
intervention is used (Silverman & Hinshaw, 2008).
These distinctions and terminology are important contextual factors within the
evidence-based practice movement. Their significance stems from a continued gap
between research and practice, or the difference between what clinical scientists know
about which treatments successfully reduce symptoms, and what clinicians and
practitioners are actually using when working with children and adolescents (Herschell,
McNeil, & McNeil, 2004; Kazdin, Kratochwill, & VandenBos, 1986; Weisz, Weiss, &
Donenberg, 1992). Researchers have cautioned against relying on the assumption that
treatments or interventions found to be successful using efficacy studies will also be
effective when used in routine practice settings (Hoagwood, Burns, Kiser, Ringeisen, &
Schoenwald, 2001). Although a variety of factors maintain the gap between research and
practice, several recommendations have been offered for narrowing it: increased
dissemination of research detailing effectiveness studies, reporting the efficacy of
interventions along with consideration of their generalizability or transportability to
non-research settings, and developing practice guidelines related to mediation and
moderation of treatment outcomes, all to ensure that children and adolescents are
receiving the highest quality treatment available (Hoagwood, Burns, Kiser, Ringeisen, &
Schoenwald, 2001; Silverman & Hinshaw, 2008).
Factors supporting the exclusive use of evidence-based practices. The
movement supporting the use of EBIs has been seen in several different fields, such as
medicine, education, social work, nursing, and dentistry (Kazdin, 2008). Nemade, Reiss,
and Dombeck (2007) described the role that insurance companies have played in
promoting the use of EBIs in mental health care because these treatments provide a
measure of accountability as they have scientific evidence supporting their use.
Furthermore, when compared to other treatments such as psychotherapy, EBIs tend to be
shorter term and provide a scientifically-based method for clinicians to justify the
number of sessions they will need to address a specific behavior or disorder. Within the
field of education, the federal No Child Left Behind Act (2002) specified that all school
practices must be based on scientifically-based research, the definition of which was
clarified by the U.S. Department of Education (2003) in the following way:
"scientifically based research means there is reliable evidence that the program or
practice works" (n. pag). In addition, professional organizations representing the fields
of school psychology and social work have affirmed their commitment to the use of EBIs
through various ethical standards and practice models described in Table 3.
Current evidence-based practices related to counseling. Research findings by
Kazdin (2003) indicated that counseling has been used as a feature of interventions for
many issues addressed in schools today, including externalizing and internalizing
problems, learning and mental disabilities, and with profound forms of psychopathology,
such as autism. In line with the movement promoting the use of EBIs, and in response to
research findings indicating that one of the factors preventing some school psychologists
from delivering counseling and mental health services is a lack of adequate training and
experience with these tasks (Suldo, Friedrich, & Michalowski, 2010), this section will
discuss best practices related to designing, implementing, and evaluating evidence-based
interventions.
Table 3

Clauses Recommending the Use of Evidence-Based Interventions

National Association of School Psychologists (NASP) Model for Comprehensive and
Integrated School Psychological Services (2010): "NASP's mission is accomplished
through identification of appropriate evidence-based education and mental health
services for all children; implementation of professional practices that are empirically
supported, data driven, and culturally competent" (p. 1); "School psychologists collect
and use assessment data to understand students' problems and to select and implement
evidence-based instructional and mental health services" (p. 4)

American Psychological Association (APA) Ethical Principles of Psychologists and
Code of Conduct (2010): "2.04 Bases for Scientific and Professional Judgments:
Psychologists' work is based upon established scientific and professional knowledge of
the discipline."

Code of Ethics of the National Association of Social Workers (NASW; 2008): "4.01
Competence (c) Social workers should base practice on recognized knowledge, including
empirically based knowledge, relevant to social work and social work ethics."

In relation to indirect, population-based mental health services, Doll and
Cummings (2008) recommended first conducting an assessment of the mental health
needs of the student body, and then identifying mental health resources in the
community. Furthermore, these authors suggested offering a continuum of mental health
services within the school building following from the three-tiered service model
proposed by Walker et al. (1996) and Osher, Dwyer, and Jackson (2004). Depending on
identified needs, universal services would be provided to the entire school body, with
more individualized and direct services provided to approximately 15-20% of the student
body considered to be at a high functional or demographic risk (Doll & Cummings,
2008). In addition, highly intensive, direct educational and social-emotional services,
along with referral to a private mental health provider, may be necessary for the
remaining 1-5% of students whose needs have not been met through interventions
delivered in the first two tiers.
Although a number of suggestions have been offered to help school psychologists
plan and implement interventions, these suggestions seem to rely on a deliberate
problem-solving approach. As an example of this, Upah (2008) provided steps to take
when implementing an intervention that are aligned with the problem-solving model
(Bergan & Kratochwill, 1990), which may be helpful to school psychologists as they plan
counseling interventions. These steps can be found in Table 4.
Table 4

An Outline for Planning Interventions Aligned with the Problem-Solving Model

What is the problem?
    Behavioral definition
    Baseline data
    Problem validation

Why is it happening?
    Problem analysis steps

What should be done about it?
    Goal setting
    Intervention plan development
    Measurement strategy
    Decision-making plan

Did it work?
    Progress monitoring
    Formative evaluation
    Treatment integrity
    Summative evaluation

As part of her explanation of the different intervention components listed in Table
4, Upah (2008) recommended that any behavior that is the focus of intervention first be
operationally defined, or described using specific, observable, and measurable terms.
Over the past few decades, several researchers have acknowledged the link between
grounding interventions in a behavioral definition of the target behavior and the success
of the intervention designed to change that behavior (Baer, Wolf, & Risley, 1987; Deno,
1995; Flugum & Reschly, 1994; Reynolds, Gutkin, Elliott, & Witt, 1984; Steege &
Wacker, 1995). Behavioral definitions allow for a common understanding among those
involved in the intervention of when the target behavior does and does not occur, and as
such, are necessary for reliable measurement of the target behavior during
implementation (Kazdin, 1982; Steege & Wacker, 1995). Behavioral definitions must
satisfy three conditions: they must be objective, describing observable actions that can be
seen or heard; clear and unambiguous, so that someone unfamiliar with the student or the
intervention could repeat or accurately summarize the definition; and complete,
describing examples and nonexamples of the behavior so that anyone observing the
child's behavior can tell when the target behavior is and is not occurring (Hawkins &
Dobes, 1977; Howell & Nolet, 2000; Kazdin, 1982; Reschly, Tilly, & Grimes, 2000).
In addition to the intervention components listed in Table 4, Forman and Burke
(2008) provided additional recommendations designed to help improve the effectiveness
of counseling interventions. Once goals have been formulated, during the intervention
plan development phase, these authors proposed that school psychologists conduct a
review of the literature on EBIs shown to be effective in remediating the
identified problem, and select an intervention from these sources. Forman and Burke
(2008) also suggested that school psychologists identify intervention implementers and
stakeholders, assess their perceptions, attitudes, and beliefs related to the intervention,
and develop administrative and stakeholder support for the intervention. In addition, they
proposed that intervention implementers be provided with training, technical assistance,
and resources necessary for effective implementation. Along with providing training,
school psychologists should also consider the manner in which the normal functioning of
the school (e.g., the school's mission, policies and procedures, other programs already in
place) will impact stakeholders and intervention implementers, and acknowledge that
implementation should be rewarded, supported, and expected.
Section summary. Two themes noted in the literature on best practices in
counseling involve accountability and data-based decision making (e.g., Doll &
Cummings, 2009; Forman & Burke, 2008). The exclusive use of evidence-based
interventions (EBIs), or those interventions that have been found to be effective when
used in practice settings, has been advocated in a variety of different fields (Kazdin,
2008), and in particular, by professional organizations representing school psychologists
(APA, 2010; NASP, 2010) and social workers (NASW, 2008). One way for school
psychologists to hold themselves accountable for their work with students is through the
use of EBIs as they implement counseling as a direct intervention. Furthermore, the
problem-solving model (Bergan & Kratochwill, 1990), as applied to designing and
implementing interventions (Upah, 2008), includes components such as establishing a
behavioral definition for the problem behavior, collecting baseline data before
implementation, and regular progress monitoring of the student's behavior during the
intervention. These behavioral data are then used for formative and summative
assessment, providing information on whether the intervention is working or needs
revision, and on whether and when it should be continued (Upah, 2008).
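These components can be illustrated with a short sketch. The following Python fragment is a minimal, hypothetical example of how baseline and progress-monitoring data might feed a formative decision; the function name, decision labels, and data are assumptions for illustration only, not drawn from any instrument or source described here.

```python
from statistics import mean

def formative_check(baseline, intervention, goal, direction="decrease"):
    """Compare intervention-phase data to baseline to suggest a formative decision.

    baseline, intervention: lists of repeated measures of the target behavior
    (e.g., daily frequency counts). goal: the criterion level set during goal
    setting. direction: whether the intervention aims to decrease or increase
    the behavior. All names here are illustrative assumptions.
    """
    base_mean, int_mean = mean(baseline), mean(intervention)
    if direction == "decrease":
        improving = int_mean < base_mean
        met = int_mean <= goal
    else:
        improving = int_mean > base_mean
        met = int_mean >= goal
    if met:
        return "consider summative evaluation / fading the intervention"
    return "continue monitoring" if improving else "revise the intervention"

# e.g., disruptive call-outs per day: baseline 8-10, intervention 4-6, goal 2
print(formative_check([8, 10, 9], [6, 5, 4], goal=2))  # -> continue monitoring
```

The comparison of intervention-phase mean to baseline mean mirrors the "change in mean" criterion for visual analysis (Kazdin, 1982) discussed later in this chapter.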
In addition to considering the broad guidelines provided by Upah (2008) and
other researchers, it may be helpful to look at the research literature for more specific
guidelines related to designing direct behavioral interventions and EBIs. With these
more specific guidelines in mind, researchers are better able to evaluate whether the gap
between research and practice (Herschell & McNeill, 2004; Kazdin, Kratochwill, &
VandenBos, 1986; Weisz, Weiss, & Donenberg, 1992) extends to school psychologists in
terms of their application of current research related to accountability and data-based
decision making in the provision of counseling services.
Designing and Evaluating Direct Interventions
This section will discuss literature on designing and implementing direct
interventions. The current status of research related to academic and behavioral
interventions is a necessary starting point for school psychologists looking to align their
work with students with best practices. Although research describing effective
social-emotional-behavioral intervention practices is still in its early
stages, available guidelines reinforce the importance of repeated measures of
student behavior to inform interventions, as well as the use of evidence-based techniques
and strategies. As such, this section concludes with a review of evidence-based
interventions for behavioral issues commonly seen in children.
Comparing academic and social-emotional-behavioral interventions. Under
traditional assessment models, teachers would identify students displaying behavioral
problems and school psychologists would conduct evaluations for special education
eligibility (Kamphaus, DiStefano, Dowdy, Eklund, & Dunn, 2010) with the result that
only students demonstrating high levels of need would be provided with services (Cash &
Nealis, 2004). Large numbers of students with behavioral and emotional problems (Mills
et al., 2006), however, called into question the utility and effectiveness of the teacher
referral system for several reasons: teachers may not be adequately trained to recognize
developing problem behaviors; teachers vary in their ability to address problem
behaviors, leading to different rates of referral; some students are not identified in an
effective and efficient manner (Tilly, 2008); and, some teachers consider behavior
problems and difficulties with social-emotional adjustment to be beyond their area of
responsibility (Severson, Walker, Hope-Doolittle, Kratochwill, & Gresham, 2007).
A problem-solving approach to identifying and supporting students with
behavioral problems has been recommended as an alternative to the teacher referral
system (Tilly, 2008). Although there are a variety of problem-solving approaches in use
in different contexts, some common features have been noted across different models,
such as the importance of universal screening and periodic assessment (Schwanz &
Barbour, 2005). Despite the existence of multiple problem-solving approaches,
traditional school-based assessment practices have not always meshed well with the
problem-solving approach to assessment and data-based decision making (Gresham et al.,
2010). For example, standardized ability and achievement tests and behavioral
measures have proved useful for making eligibility decisions, but they do not
possess the treatment validity needed to inform instruction, are neither feasible for
nor designed to progress-monitor a student's response to intervention (Fuchs &
Fuchs, 1998; Gresham, 2002; Gresham & Witt, 1997), and were not designed to
measure response to intervention in order to make special education eligibility or exit
decisions (Gresham, 2007; Shinn, 2008).
Furthermore, traditional school-based assessment practices do not always fit in
with the RTI paradigm (Briesch, Chafouleas, & Riley-Tillman, 2010). According to the
RTI framework, problem behaviors must be systematically and proactively defined
through the screening process and then regularly measured as part of progress monitoring
to determine whether the intervention was successful or other remediation strategies are
necessary. Some potential problems arise, however, when an RTI paradigm is applied to
behavioral assessment, as many available psychometrically sound assessment measures
are not feasible for repeated administration with large groups of students because of their
length and the frame of reference raters must consider when providing responses
(Briesch, Chafouleas, & Riley-Tillman, 2010).
The current link between social-emotional-behavior (SEB) assessment and
intervention is tenuous (Merrell, 2010). Following the problem-solving model, the
strength of current SEB assessments lies in the practitioner's ability to use these measures
to determine problem behaviors and the factors maintaining them; the measures offer
little guidance, however, on how to address these problems or on determining whether
what has been done was effective (Merrell, 2010). Many rating scales exist for common
DSM-IV disorders (Merrell, 2008; Pelham, Fabiano, & Massetti, 2005), but most are not
sensitive enough to change to be useful as a repeated measure of the target behavior of an
intervention (Volpe & Gadow, 2010). This is especially true in cases when the target
behavior is not the sole focus of the rating scale, or when a given scale is not best suited
to the referral concern (Volpe & Gadow, 2010). In addition, limited time and resources
may prevent practitioners from collecting enough data to make informed decisions
related to student behavior, underscoring the need for continued research and
development of feasible and psychometrically sound behavioral assessment measures
(Briesch, Chafouleas, & Riley-Tillman, 2010).


School-based problem-solving teams are able to make decisions effectively when
their actions are guided by clearly defined and measurable academic and SEB outcomes
(Newton, Horner, Algozzine, Todd, & Algozzine, 2009). There is a current impetus in
the field of education for the development of psychometrically sound, data-based
decision-making tools to be utilized within problem-solving frameworks (National Center
on Response to Intervention, 2011), but the result has been a focus on academic skills,
while research, knowledge, and resources related to SEB assessment are lacking
(Briesch, Chafouleas, & Riley-Tillman, 2010). Although it may appear logical to
compare problem-solving tools for academic difficulties with those used for SEB
difficulties, doing so is problematic for a number of reasons (Chafouleas, Volpe,
Gresham, & Cook, 2010). While there exists a variety of objective and measurable
indicators of academic growth, consensus has not been reached regarding general
outcome measures for behavior (Chafouleas, Volpe, Gresham, & Cook, 2010). The fact
that a general outcome measure of behavioral health does not exist the way that general
outcome measures exist for assessment of academic abilities has limited and complicated
efforts to develop feasible methods for monitoring SEB problems (Shinn, 2007; Volpe &
Gadow, 2010). The literature on designing academic and behavioral interventions
following the problem-solving model is summarized in Table 5.
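Two of the quantitative decision rules summarized in Table 5 can be sketched concretely: the weekly goal-line calculation attributed to Williams (2010), in which the median of at least three baseline data points is multiplied by 1.25 for each successive week, and the aimline rule under which a run of consecutive data points below the goal line suggests the intervention is not working (Fuchs, Fuchs, Hintze, & Lembke, 2006). The Python below is a minimal illustrative reading of those procedures, not an implementation from any of the cited sources; it assumes a fluency-type measure where higher scores are better, and the function and parameter names are invented for the example.

```python
from statistics import median

def weekly_goals(baseline, weeks, growth=1.25):
    """Weekly targets per the goal-line procedure attributed to Williams (2010):
    take the median of at least three baseline data points and multiply by 1.25
    (25% improvement) for each successive week of the intervention."""
    assert len(baseline) >= 3, "at least 3 baseline data points are needed"
    goal = median(baseline)
    goals = []
    for _ in range(weeks):
        goal *= growth
        goals.append(goal)
    return goals

def needs_revision(points, goals, run=3):
    """Aimline rule: a run of consecutive data points below the goal line
    suggests the intervention is not working (Fuchs et al., 2006).
    Assumes higher scores indicate better performance."""
    below = 0
    for actual, target in zip(points, goals):
        below = below + 1 if actual < target else 0
        if below >= run:
            return True
    return False

# e.g., words read correctly per minute, baseline median 20
print(weekly_goals([18, 20, 22], weeks=3))  # -> [25.0, 31.25, 39.0625]
```

Under these assumptions, observed scores of 20, 22, and 24 over the same three weeks would trigger the revision rule, since all three fall below the goal line.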
A review of the literature on designing academic and behavioral interventions
reveals that, although comparing the two types of interventions is problematic, they
share some common elements because both are grounded in the problem-solving model.
For example, both approaches stress the importance of grounding

Table 5

An Outline for Planning Academic and Behavioral Interventions

What is the problem?

Behavioral definition
RTI literature on designing academic interventions:
- Operationally define the behavior in clear language that includes topography (how the behavior looks), frequency (how often the behavior occurs), duration (how long the behavior lasts), and intensity (the degree to which the behavior is problematic) (Brown-Chidsey & Steege, 2010)
- Create codes in order to quickly and accurately identify and record behaviors during data collection and documentation (Brown-Chidsey & Steege, 2010)
- Determine and describe the location(s) where the behavior(s) of concern will be observed and recorded (Brown-Chidsey & Steege, 2010)
- Define the replacement behavior, or target skill that needs improvement, describing what the student needs to be able to do in concrete, measurable terms (Batsche, Castillo, Dixon, & Forde, 2008)
Literature on behavioral interventions:
- Define target behaviors by identifying exactly what the student says and does that makes up the problem and the desirable behavior (Miltenberger, 2005)
- Use active verbs describing the observable actions of the student, with a thorough definition including all the different responses encompassed within the behavior being defined (Miltenberger, 2005)
- Identify the dimensions of the behavior by describing its frequency (number of times the behavior occurs), latency (time that passes between the presentation of a stimulus and the student's response), intensity (strength or force with which the behavior is displayed), topography (configuration, form, or shape of the behavior), accuracy (how well the behavior fits a standard or is correct), and duration (the amount of time between the onset and ending of the behavior) (Sulzer-Azaroff & Mayer, 1991)

Baseline data
RTI literature on designing academic interventions:
- At least 3 data points are needed to establish a baseline level of behavior or performance (Hayes, Barlow, & Nelson-Gray, 1999)
- Baseline data should be stable over time; variable or unstable baseline trends provide an unreliable basis for determining whether the intervention was effective (Hayes, Barlow, & Nelson-Gray, 1999)
- Avoid overlap of baseline and intervention data, as this weakens confidence in the effectiveness of the intervention (Hayes, Barlow, & Nelson-Gray, 1999)
- The level of baseline data should be serious enough to warrant an intervention and to make clear any behavior changes caused by the intervention (Hayes, Barlow, & Nelson-Gray, 1999)
- Trends in baseline data should not be in the same direction as the trends that would be caused by an effective intervention (Hayes, Barlow, & Nelson-Gray, 1999)
- Procedures for recording behavior during baseline are the same as those used to record the same behavior during intervention (Brown-Chidsey & Steege, 2010)
- A phase change line is used to separate baseline and intervention phases during graphic representation of data (Brown-Chidsey & Steege, 2010)
Literature on behavioral interventions:
- Direct, accurate, objective, and systematic measurement of the target behavior as it occurs in a natural setting to establish the student's level of functioning before implementing an intervention (Upah, 2008)
- Collection of baseline data consists of repeated measures of behavior over multiple sessions/days/weeks to establish a stable pattern of behavior (no new highs or lows for three consecutive data points) (Sulzer-Azaroff & Mayer, 1991)
- Social-emotional-behavioral assessment strategies fall into one of six categories: direct behavioral observation, third-party behavior rating scales, sociometric techniques, interviews, objective self-report measures, or projective-expressive techniques (Merrell, 2010)

Problem validation
RTI literature on designing academic interventions:
- Compare the child's rate of progress to a projected rate of progress needed to obtain proficiency within a set period of time (Shinn, 1989)
Literature on behavioral interventions:
- Determine the magnitude of the problem by establishing the difference between the student's current level of performance and environmental expectations (Howell & Nolet, 2000)
- Compare the student's baseline performance level to a standard for appropriate and acceptable performance (Upah, 2008)
- Determine the standard of appropriate and acceptable performance using the behavior of typical peers, building or district norms, teacher/classroom expectations, criteria for a future environment, or school policy (Upah, 2008)

Why is it happening?

Problem analysis steps
RTI literature on designing academic interventions:
- Identify any changeable variables contributing to the problem behavior (Heartland Area Education Agency, 2007)
- Determine whether the difficulty is specific to the child or to the classroom in which he/she resides, and whether the problem is caused by a deficit in skill or in performance, by providing the opportunity for additional reinforcement contingent upon the performance of a suitable replacement behavior (Batsche, Castillo, Dixon, & Forde, 2008)
- Collect relevant information detailing student instruction, curriculum, environment, and learner characteristics through reviews, interviews, observations, and tests (Heartland Area Education Agency, 2007)
- Use this information to assess the student's skills, evaluating both the rate (the speed with which the skills can be successfully demonstrated) and the accuracy with which the skill is completed (Heartland Area Education Agency, 2007)
Literature on behavioral interventions:
- Develop a hypothesis statement by describing the events that occur just before and immediately after the problem behavior (such as setting events or immediate antecedents), describing the target behavior, and identifying the presumed function of the behavior (Kern, 2005)
- Ground the hypothesis in data collected about the student, manipulating different variables systematically to test different hypotheses (Kern, 2005)
- Use hypotheses to develop interventions (Kern, 2005)

What should be done about it?

Goal setting
RTI literature on designing academic interventions:
- Using a norm-referenced approach to evaluate level and a criterion approach to address slope of growth, compute target rates of growth by identifying scores that predict passing on state tests; target benchmark scores are calculated to predict a proficient score on a state test, and a rate of growth necessary to achieve these scores is calculated (VanDerHeyden & Burns, 2010)
- Goals for the rate of student performance can be derived using instructional-level criteria or national norms (e.g., scores at or above the 25th percentile) (VanDerHeyden & Burns, 2010)
- Accuracy goals can be calculated with instructional-level criteria using the percentage of items correctly completed (Gickling & Thompson, 1985); 93-97% known material can be set as a criterion for instructional-level accuracy (Burns, 2007; Gickling & Armstrong, 1978; Treptow, Burns, & McComas, 2007)
- Goals can be set for math and other skills aside from reading comprehension and evaluated using drill tasks, where 90% is considered an adequate criterion for accuracy for skills such as multiplication facts, letter sounds, and comprehension questions, with the exception of reading fluency (VanDerHeyden & Burns, 2010)
- Goal lines can be calculated by estimating student progress at a rate of 25% improvement in a given skill each week; after collecting at least 3 data points and establishing a baseline, select the median score and multiply it by 1.25 (25%) to set a target for the following week, repeating this procedure for each week of the intervention (Williams, 2010)
Literature on behavioral interventions:
- Use clear and measurable criteria to define what the student will be able to do if the intervention is effective (Upah, 2008)
- Include the timeframe (when the expected progress will be accomplished), condition (specific circumstances in which the behavior will occur, as established during the problem identification and analysis stages), behavior (written in objective, observable, and measurable terms), and criteria (the standard of how well the behavior is to be performed) (Upah, 2008)
- For visual representation, choose the central baseline data point, draw a vertical line after the last baseline data point, draw a second endpoint at the highest performance level on the date when the student is expected to have met the goal, and draw a straight line connecting both endpoints (Upah, 2008)

Intervention plan development
RTI literature on designing academic interventions:
- Using hypotheses confirmed by data during problem analysis, develop interventions for as many confirmed hypotheses as possible given available resources, in order to remove barriers to learning (Batsche, Castillo, Dixon, & Forde, 2008)
- Interventions selected must be evidence-based and should be selected based on evidence showing their effectiveness when previously implemented for similar problems (Batsche, Castillo, Dixon, & Forde, 2008)
- The intervention implementation plan should detail the personnel responsible for supporting teachers as they implement an intervention, the responsibilities of the support personnel, and when and where these activities will occur (Batsche, Castillo, Dixon, & Forde, 2008)
- Tier II interventions should be delivered to a small group of students (2-8 in elementary school, 8-10 in middle school, or 10-12 in high school), 3-5 times a week in 20-30 minute sessions, designed to last at least 8 weeks (VanDerHeyden & Burns, 2010)
- Tier III interventions should be delivered on an individual basis, with intervention sessions occurring daily (VanDerHeyden & Burns, 2010); other researchers recommend that Tier III interventions be delivered in groups of 1-3 students, 5 times per week, in 30-90 minute sessions (Brown-Chidsey & Steege, 2010)
Literature on behavioral interventions:
- Provide a clear description of the procedures to be used during the intervention: specific strategies based on the problem and the literature on EBIs, what will be done, how each step will be completed, the materials needed for each step, who will do what and when, and where procedures will take place (Upah, 2008)

Measurement strategy
RTI literature on designing academic interventions:
- To facilitate accurate decision making during an intervention, collect direct evidence of intervention implementation (e.g., observation of implementation, completed worksheets, log-in records at a computer, a self-monitoring checklist completed by the teacher indicating that the intervention steps have been completed for the day, or a student score on an assessment activity tracking the intervention effects) and student learning data (e.g., direct assessments of the skills targeted for intervention and/or a criterion-level skill reflecting what is necessary for success in the classroom) (VanDerHeyden & Burns, 2010)
Literature on behavioral interventions:
- Define the target behaviors; decide when and where recording will occur; decide who will record the target behaviors; choose the most appropriate recording method; and choose the most appropriate recording instrument (Miltenberger, 2004)

Decision-making plan
RTI literature on designing academic interventions:
- Once a predetermined amount of data has been collected, review the data to determine whether the intervention produced the desired outcomes by comparing the student's behavior/performance to the goals set for the student (Brown-Chidsey & Steege, 2010)
- Data can be analyzed with respect to level (a score value obtained on a given measure or at a given point in time, compared to the performance of others) and slope (the rate of progress a student is making, used to determine whether and when a student will meet a goal within a predetermined period of time) (Brown-Chidsey & Steege, 2010)
- Decisions about how to proceed with instruction and interventions are made based on whether slope and level data show progress toward a specific goal (Brown-Chidsey & Steege, 2010)
- Task difficulty is increased systematically as student learning improves; generalization should be assessed at routine intervals (VanDerHeyden & Burns, 2010)
- An aimline can be used to document the expected rate of progress and thus evaluate the student's response to intervention: a line is drawn connecting the initial level of performance and the desired level at the goal date, and student data are plotted in a time-series graph, with progress measured by comparing new data points to the aimline; data points that coincide with the aimline indicate that the student is making adequate progress; 3 consecutive data points above the aimline suggest that the goal should be revised to be more challenging, while 3 consecutive data points below the aimline suggest that the intervention is not working (Fuchs, Fuchs, Hintze, & Lembke, 2006; Mirkin, Deno, Tindal, & Kuehnle, 1982; Shinn, 1989) or is not at the right intensity, such that the student may require placement in a different tier (Fuchs et al., 2006)
- Changes to an intervention to increase success include increasing the amount of time for intervention sessions, reducing the number of students in the intervention group, or increasing the number of times the student practices a specific skill during the intervention (Brown-Chidsey & Steege, 2010)
- When a student has achieved six or more data points indicating grade-level performance of a specific skill, the intervention can be faded or discontinued (Brown-Chidsey & Steege, 2010)
Literature on behavioral interventions:
- Determine how decisions will be made by deciding the frequency of data collection, how the data will be summarized for evaluation purposes, how many data points or how much time should pass before the data are analyzed, and the decision rules for responding to specific data patterns (Tilly & Flugum, 1995)
- Two aspects of the data evaluated to judge the effects of an intervention are level (how much the behavior is occurring during baseline and intervention phases, as judged by its frequency, duration/intensity, or percentage of intervals of occurrence) and trend (whether the level of the behavior is increasing or decreasing within the baseline and intervention phases) (Miltenberger, 2005)

Did it work?

Progress monitoring
RTI literature on designing academic interventions:
- Progress monitoring assessment in Tier II should typically occur on a weekly basis (Fuchs, Kovaleski, & Carruth, 2009), or no less than once every other week (VanDerHeyden & Burns, 2010); progress monitoring data in Tier III are collected at least once each week (VanDerHeyden & Burns, 2010)
- A minimum of 3-4 weeks of intervention and 8 data points should be collected before examining student progress to determine whether changes need to be made to the intervention (Fuchs, Kovaleski, & Carruth, 2009)
Literature on behavioral interventions:
- Throughout intervention implementation, student performance should be assessed for continuous evaluation and to make needed modifications (Upah, 2008)
- Data gathering for progress monitoring should entail the same procedures used to gather baseline data during problem identification (Upah, 2008)
- Methods of progress monitoring include checklists, frequency counts, observations, percentage calculations, permanent products, portfolios, rating scales, rubrics, and time measures (e.g., duration or latency) (Upah, 2008)
- Using factor-derived or individualized methods to abbreviate longer behavior rating scales may allow for the creation of more feasible progress-monitoring behavior rating scales that retain the technical qualities of their original formats (Volpe & Gadow, 2010)
- At least 3-5 structured direct observations within or across days may be needed to obtain a dependable estimate of academic engagement (Briesch, Chafouleas, & Riley-Tillman, 2010)

Formative evaluation
RTI literature on designing academic interventions:
- Assessments conducted at regular intervals along the way to achieving a specific learning goal indicate whether a student is making progress in meeting an instructional goal (Brown-Chidsey & Steege, 2010)
- When analysis shows 4 consecutive data points below the goal line, teams can conclude that the student is not likely to achieve his/her year-end goal and should consider making instructional changes (Fuchs, Kovaleski, & Carruth, 2009)
Literature on behavioral interventions:
- Conduct ongoing evaluation to determine whether an intervention will be successful and, when necessary, make modifications to increase the chances that goals for the student will be met (Upah, 2008)
- Visually analyze intervention performance data compared to baseline data (Upah, 2008) with respect to change in mean (is the average rate of performance higher or lower during the intervention than during baseline?), change in level (does the student's performance change in the desired direction from the end of baseline to the start of the intervention?), change in trend (has the student's performance trend increased or decreased over time?), and latency of change (is there a change in performance after the first week of intervention implementation?) (Kazdin, 1982)
- Evaluate student progress relative to the goal line for the intervention: if 3-4 data points are at or above the goal line, consider raising the goal, programming for maintenance and generalization of the skill, and/or discussing discontinuation; if 3-4 data points are below the goal line, consider revising the intervention with respect to its pace, the materials being used, the number of response opportunities, or the implementation or adjustment of reinforcement of desired behaviors; if data are variable, consider ways of motivating the student to display the desired behavior more consistently (Upah, 2008)

Treatment integrity
RTI literature on designing academic interventions:
- Technically adequate RTI implementation occurs when intervention integrity is managed in such a way that decisions are made in a timely manner, and systems are in place to verify proper intervention management, to detect and correct problems in implementation, and to potentially take over implementation to prevent delays in identification and decision making (VanDerHeyden & Burns, 2010)
- Direct evidence of correct intervention implementation could include observation of implementation, completed worksheets, log-in records at a computer, a self-monitoring checklist completed by the teacher indicating that the intervention steps have been completed for the day, or a student score on an assessment activity tracking the intervention effects (VanDerHeyden & Burns, 2010)
Literature on behavioral interventions:
- Treatment integrity is the degree to which the intervention was implemented as planned (Gresham, 1989; Telzrow, 1995; Yeaton & Sechrest, 1981)
- Approaches to measuring treatment integrity include self-reports, logs, checklists, permanent products, and direct observations of implementation by those not administering the intervention (Upah, 2008)

Summative evaluation
RTI literature on designing academic interventions:
- Assessment that allows practitioners to determine whether an instructional goal has been met (Brown-Chidsey & Steege, 2010)
- Using curriculum-based measures, post-intervention level can be evaluated with a criterion-referenced comparison (e.g., a score above the 25th percentile on a national norm, or scoring within the low-risk category on the DIBELS standard) (VanDerHeyden & Burns, 2010)
- Rate of growth can be numerically determined using local norms (e.g., placing at or above the 25th percentile for a specific grade, or scoring within 1 standard deviation of the average rate of growth for a specific grade) (VanDerHeyden & Burns, 2010)
- A student who scores below a criterion for post-intervention level and whose slope of growth was more than 1 standard deviation below the mean would provide evidence of an ineffective intervention; these scores could justify intervention delivered in a more intensive tier (VanDerHeyden & Burns, 2010)
Literature on behavioral interventions:
- Evaluation that occurs after the intervention has been completed to determine whether it was effective in producing positive student outcomes (Upah, 2008)
- To conduct summative evaluations, teams can reference the decision rules established during plan implementation and/or compare the student's performance at baseline and post-intervention (Upah, 2008)
- Social validity: collecting subjective ratings of improvement in the target behavior or of improvement in the student's life (Kazdin, 1977; Wolf, 1978)

intervention design in a clear and objective behavioral definition of the problem.


Collection of data measuring the student's behavior before and during intervention
implementation is essential. Practitioners are advised to use only those interventions,
techniques, and strategies that have a strong evidence base in the literature. Intervention
development also involves creating a clear plan that describes the different
intervention components, materials, and responsibilities of those involved with
implementation. The importance of measuring treatment integrity throughout
intervention implementation is also stressed for both intervention types. Despite this
common foundation, guidelines for academic assessment related to goal setting,
measurement of behavior, and data-based decision making have received more empirical
validation than those for SEB assessment, a direct result of the lack of a general outcome
measure and of limited research.
It is recommended that the behavioral targets to be measured fit with the
purpose and objectives of the intervention by including both short-term performance
objectives and long-term, broad general objectives (Kratochwill & Bergan, 1990).
Potential long-term objectives of SEB interventions could include reductions in the
severity of symptoms associated with an SEB disorder and the level of functional
impairment (Pelham, Fabiano, & Massett, 2005). Despite these recommendations,
researchers have not yet identified a task, behavior, or skill that can be manipulated and is
a sensitive indicator of SEB problems in children, making it difficult to accurately define
appropriate methods of measurement, define target behaviors, and design comprehensive
models to integrate screening and progress monitoring (Chafouleas, Volpe, Gresham, &
Cook, 2010). Without accurately defining and measuring target behaviors, practitioners

64

are also unable to design formative assessment measures or determine decision rules
specifying what an appropriate response to the intervention would look like (Chafouleas,
Volpe, Gresham, & Cook, 2010).
A variety of different methods of behavioral assessment are in existence, each
with its own set of strengths and weaknesses with respect to psychometric defensibility,
flexibility, feasibility, and repeatability (Chafouleas, Volpe, Gresham, & Cook, 2010).
As such, there is no single best assessment method, and while a combination of different
methods may be the best approach, at this point in time, researchers have not yet defined
a clear set of guidelines based on empirical evidence (Chafouleas, Volpe, Gresham, &
Cook, 2010). Some researchers have proposed academic engagement as a target behavior
serving as the foundation of general outcome measures (Briesch, Chafouleas, & Riley-Tillman, 2010). Consensus has not been reached, however, on the definition of a general
outcome measure for behavior in school-based assessment, or whether it is possible to
establish such a construct in behavioral assessment, and as such, discussion of the most
appropriate behavioral targets to measure using the problem-solving model is on-going
(Chafouleas, Volpe, Gresham, & Cook, 2010).
Another deviation from academic interventions involves the difficulty inherent in
measuring the growth or development of a specific skill related to a target behavior
(Chafouleas, Volpe, Gresham, & Cook, 2010). Unlike academic domains, benchmarks
for desirable or appropriate behaviors have not been established or agreed upon
universally. For some students, no change in behavior is desirable, while for others,
desirable behavior is explicitly tied to the target behavior and the context in which it
occurs (Chafouleas, Volpe, Gresham, & Cook, 2010). Furthermore, visual analysis of
data is commonly recommended when evaluating interventions; however, quantitative
methods of analysis may provide a more defensible method of evaluation (Chafouleas,
Volpe, Gresham, & Cook, 2010). An assessment of social validation is another
potentially useful metric of intervention effectiveness (Gresham, 2005), while other
researchers recommend evaluating the level of impairment and quality of life pre- and
post-intervention (Kazdin, 2005).
EBIs for childhood behavioral disorders. As stated previously, promotion of
the use of EBIs has occurred in a variety of different fields. As such, school
psychologists have many treatment options available to them. This section describes
EBIs that have been identified to address common childhood behavior disorders. Listed
in Table 6 are the EBIs as well as the level of empirical support they have received. The
criteria used to evaluate each intervention before assigning it a level of empirical support
are also listed in Table 7.
Several conclusions can be drawn from the EBIs listed in Table 6. The well-established category contained the fewest treatments, while the possibly
efficacious category contained the largest number of interventions. This suggests that a
large number of treatment strategies are in need of further research. Across all three
categories, cognitive behavior therapy (CBT) and behavior therapy (BT) have received
the most empirical support as treatments for common childhood disorders. Given these
results, it is interesting to note that the most common theoretical orientations guiding the
current counseling practices of school psychologists were cognitive behavioral and
behavioral, suggesting that there may be some agreement between evidence-based
practices and the common counseling practices of school psychologists. In addition to

66

67

Table 6

Table of Evidence-Based Interventions

Behavior/Intervention | Classification

Anxiety General Symptoms

CBT
Individual CBT (Barrett, Dadds, & Rapee, 1996; Flannery-Schroeder & Kendall, 2000; Kendall, 1994; Kendall, Flannery-Schroeder, Panichelli-Mindel, Southam-Gerow, Henin, & Warman, 1997)
Group CBT (without parents) (Barrett, 1998; Flannery-Schroeder & Kendall, 2000; Mendlowitz, Manassis, Bradley, Scapillato, Miezitis, & Shaw, 1999; Rapee, Abbott, & Lyneham, 2006)
Group CBT with parents (Barrett, 1998; Mendlowitz, Manassis, Bradley, Scapillato, Miezitis, & Shaw, 1999; Silverman, Kurtines, Ginsburg, Weems, Lumpkin, & Carmichael, 1999; Spence, Holmes, March, & Lipp, 2006)
Classification: Probably Efficacious (Silverman, Pina, & Viswesvaran, 2008)

CBT
Individual CBT with parents (Cornwall, Spence, & Schotte, 1996)
Individual CBT with cognitive parent training (Nauta, Scholing, Emmelkamp, & Minderaa, 2003)
Group CBT with parental anxiety management for anxious parents (Cobham, Dadds, & Spence, 1998)
Family CBT (Bogels & Siqueland, 2006; Wood, Piacentini, Southam-Gerow, Chu, & Sigman, 2006)
Parent group CBT (without youth involvement) (Mendlowitz, Manassis, Bradley, Scapillato, Miezitis, & Shaw, 1999; Thienemann, Moore, & Tompkins, 2006)
Group CBT with parents plus internet (Spence, Holmes, March, & Lipp, 2006)
Classification: Possibly Efficacious (Silverman, Pina, & Viswesvaran, 2008)

(table continues)

68

Table 6 continued

Behavior/Intervention | Classification

Child and Adolescent PTSD

CBT
Trauma-focused CBT (Cohen, Deblinger, Mannarino, & Steer, 2004; Cohen & Mannarino, 1996, 1997; Cohen, Mannarino, & Knudsen, 2005; Deblinger, Lippman, & Steer, 1996; Jaberghaderi, Greenwald, Rubin, Zand, & Dolatabadi, 2004; King et al., 2000; Kolko, 1996)
Classification: Well-Established (Silverman, Pina, & Viswesvaran, 2008)

CBT
School-based group CBT (Kataoka et al., 2003; Stein et al., 2003)
Classification: Probably Efficacious (Silverman, Pina, & Viswesvaran, 2008)

Child and Adolescent OCD

CBT
Individual CBT (Pediatric OCD Treatment Study Team, 2004)
Individual CBT, plus sertraline (Pediatric OCD Treatment Study Team, 2004)
Classification: Probably Efficacious (Barrett, Farrell, Pina, Peris, & Piacentini, 2008)

CBT
Family-focused individual CBT (Barrett, Healy-Farrell, & March, 2004)
Family-focused group CBT (Barrett, Healy-Farrell, & March, 2004)
Classification: Possibly Efficacious (Barrett, Farrell, Pina, Peris, & Piacentini, 2008)

School Refusal

CBT
Individual CBT for school refusal (Heyne, King, Tonge, Rollings, Young, Pritchard, et al., 2002; Last, Hansen, & Franco, 1998)
Individual CBT for school refusal with parent/teacher training (Heyne, King, Tonge, Rollings, Young, Pritchard, et al., 2002; King, Tonge, Heyne, Pritchard, Rollings, Young, et al., 1998)
Parent/teacher training for school refusal (Heyne, King, Tonge, Rollings, Young, Pritchard, et al., 2002)
Classification: Possibly Efficacious (Silverman, Pina, & Viswesvaran, 2008)

(table continues)

69

Table 6 continued

Behavior/Intervention | Classification

Child and Adolescent PTSD (continued)

CBT
Resilient Peer Treatment (Fantuzzo et al., 1996; Fantuzzo, Manz, Atkins, & Meyers, 2005)
Group CBT (Deblinger, Stauffer, & Steer, 2001)
Cognitive Processing Therapy (Ahrens & Rexford, 2002)
Eye Movement Desensitization and Reprocessing (Chemtob, Nakashima, & Carlson, 2002; Jaberghaderi, Greenwald, Rubin, Zand, & Dolatabadi, 2004)
Client Centered Therapy (Cohen, Deblinger, Mannarino, & Steer, 2004)
Family Therapy (Kolko, 1996)
Child Parent Psychotherapy (Lieberman, Van Horn, & Ippen, 2005)
Classification: Possibly Efficacious (Silverman, Pina, & Viswesvaran, 2008)

Social Phobia

CBT
Group CBT for SOP (Social Phobia; Gallagher, Rabian, & McCloskey, 2003; Hayward, Varady, Albano, Thienemann, Henderson, & Schatzberg, 2000; Spence, Donovan, & Brechman-Toussaint, 2000)
Social Effectiveness Training for Children for SOP (Beidel & Morris, 2000)
Classification: Probably Efficacious (Silverman, Pina, & Viswesvaran, 2008)

Specific Phobia

CBT
Emotive imagery for SP of darkness (Cornwall, Spence, & Schotte, 1996)
In-vivo behavioral exposures with EMDR for SP of spiders (Muris, Merckelbach, Holdrinet, & Sijsenaar, 1998)
Exposures plus contingency management for SP (Silverman, Kurtines, Ginsburg, Weems, Rabian, & Serafini, 1999)
Exposures plus self-control for SP (Silverman, Kurtines, Ginsburg, Weems, Rabian, & Serafini, 1999)
Classification: Possibly Efficacious (Silverman, Pina, & Viswesvaran, 2008)

(table continues)

70

Table 6 continued

Behavior/Intervention | Classification

Specific Phobia (continued)

One-session exposure treatment for SP (Ost, Svensson, Hellstrom, & Lindwall, 2001)
One-session exposure treatment with parents for SP (Ost, Svensson, Hellstrom, & Lindwall, 2001)

Depression: Children

CBT
Individual CBT (Asarnow, Scott, & Mintz, 2002; Gillham, Reivich, Jaycox, & Seligman, 1995; Jaycox, Reivich, Gillham, & Seligman, 1994; Kahn, Kehle, Jenson, & Clark, 1990; Nelson, Barnard, & Cain, 2003; Roberts, Kane, Thomson, Bishop, & Hart, 2003; Stark, Reynolds, & Kaslow, 1987; Stark, Rouse, & Livingston, 1991; Weisz, Thurber, Sweeney, Proffitt, & LeGagnoux, 1997; Yu & Seligman, 2002)
CBT group, child only (Gillham, Reivich, Jaycox, & Seligman, 1995; Jaycox, Reivich, Gillham, & Seligman, 1994; Kahn, Kehle, Jenson, & Clark, 1990; Roberts, Kane, Thomson, Bishop, & Hart, 2003; Stark, Reynolds, & Kaslow, 1987; Weisz, Thurber, Sweeney, Proffitt, & LeGagnoux, 1997; Yu & Seligman, 2002)
CBT child group, plus parent component (Asarnow, Scott, & Mintz, 2002; Stark, Rouse, & Livingston, 1991)
Classification: Well-Established (David-Ferdon & Kaslow, 2008)

CBT
Penn Prevention Program (PPP), including culturally relevant modifications as seen in the Penn Optimism Program (POP; Gillham, Reivich, Jaycox, & Seligman, 1995; Jaycox, Reivich, Gillham, & Seligman, 1994; Roberts, Kane, Thomson, Bishop, & Hart, 2003; Yu & Seligman, 2002)
Self-Control Therapy (Stark, Reynolds, & Kaslow, 1987; Stark, Rouse, & Livingston, 1991)
Behavior Therapy (Kahn, Kehle, Jenson, & Clark, 1990; King & Kirschenbaum, 1990)
Classification: Probably Efficacious (David-Ferdon & Kaslow, 2008)

(table continues)

71

Table 6 continued

Behavior/Intervention | Classification

Depression: Adolescents

CBT
Group Cognitive Behavior Therapy, adolescent only (Clarke, Hawkins, Murphy, Sheeber, Lewinsohn, & Seeley, 1995; Clarke, Rohde, Lewinsohn, Hops, & Seeley, 1999; Kowalenko et al., 2005; Lewinsohn, Clarke, Hops, & Andrews, 1990; Lewinsohn, Clarke, Rohde, Hops, & Seeley, 1996; Reynolds & Coats, 1986)
Interpersonal Psychotherapy (IPT)
Individual IPT (Mufson, Dorta, Wickramaratne, Nomura, Olfson, & Weissman, 2004; Mufson, Weissman, Moreau, & Garfinkel, 1999; Rossello & Bernal, 1999)
Classification: Well-Established (David-Ferdon & Kaslow, 2008)

CBT
Adolescent group CBT, plus parent component (Clarke, Hawkins, Murphy, Sheeber, Lewinsohn, & Seeley, 1995; Clarke, Rohde, Lewinsohn, Hops, & Seeley, 1999; Kowalenko et al., 2005; Lewinsohn, Clarke, Hops, & Andrews, 1990; Lewinsohn, Clarke, Rohde, Hops, & Seeley, 1996)
Individual CBT (Rossello & Bernal, 1999; Wood, Harrington, & Moore, 1996)
Individual CBT, plus parent/family component (Brent et al., 1997; Melvin, Tonge, King, Heyne, Gordon, & Klimkeit, 2006; Treatment for Adolescents with Depression Study (TADS) Team, 2004)
Adolescent Coping with Depression (CWD-A; Clarke, Hawkins, Murphy, Sheeber, Lewinsohn, & Seeley, 1995; Clarke et al., 2001; Clarke, Rohde, Lewinsohn, Hops, & Seeley, 1999; Lewinsohn, Clarke, Hops, & Andrews, 1990; Lewinsohn, Clarke, Rohde, Hops, & Seeley, 1996; Rohde, Clarke, Mace, Jorgensen, & Seeley, 2004)
Interpersonal Psychotherapy (IPT)
IPT for Depressed Adolescents (IPT-A; Mufson, Dorta, Wickramaratne, Nomura, Olfson, & Weissman, 2004; Mufson, Weissman, Moreau, & Garfinkel, 1999)
Classification: Probably Efficacious (David-Ferdon & Kaslow, 2008)

(table continues)

72

Table 6 continued

Behavior/Intervention | Classification

Child and Adolescent ADHD

Behavioral parent training (BPT; Barkley et al., 2000; Bor, Sanders, & Markie-Dadds, 2002; Hoath & Sanders, 2002; MTA Cooperative Group, 1999; Sonuga-Barke, Daley, Thompson, Laver-Bradbury, & Weeks, 2001)
Behavioral classroom management (BCM; Barkley et al., 2000; Miranda, Presentacion, & Soriano, 2002; MTA Cooperative Group, 1999)
Behavioral peer interventions (BPI; Pelham et al., 2000)
Classification: Well-Established (Pelham & Fabiano, 2008)

Oppositional Defiant Disorder and Conduct Disorder

Behavior Therapy
Parent Management Training (Bernal, Klinnert, & Schultz, 1980; Christensen, Johnson, Phillips, & Glasgow, 1980; Patterson, Reid, Jones, & Conger, 1975)
Classification: Well-Established (Eyberg, Nelson, & Boggs, 2008)

Behavior Therapy
Helping the Noncompliant Child (Peed, Roberts, & Forehand, 1977; Wells & Egan, 1988)
Triple P (Positive Parenting Program): Standard (Bor, Sanders, & Markie-Dadds, 2002; Sanders, Markie-Dadds, Tully, & Bor, 2000); Enhanced (Bor, Sanders, & Markie-Dadds, 2002; Sanders, Markie-Dadds, Tully, & Bor, 2000)
Incredible Years: Parent training (Webster-Stratton & Hammond, 1997; Webster-Stratton, Reid, & Hammond, 2004); Child training (Webster-Stratton & Hammond, 1997; Webster-Stratton, Reid, & Hammond, 2001; Webster-Stratton, Reid, & Hammond, 2004)
CBT
Anger Control Training (Lochman, Coie, Underwood, & Terry, 1993; Robinson, Smith, & Miller, 2002)
Rational-emotive mental health program (Block, 1978)
Classification: Probably Efficacious (Eyberg, Nelson, & Boggs, 2008)

(table continues)

73

Table 6 continued

Behavior/Intervention | Classification

Oppositional Defiant Disorder and Conduct Disorder (continued)

Parent-Child Interaction Therapy (Nixon, Sweeney, Erickson, & Touyz, 2003; Schuhmann, Foote, Eyberg, Boggs, & Algina, 1998)
Problem-Solving Skills Training: Standard (Kazdin, Bass, Siegel, & Thomas, 1989; Kazdin, Esveldt-Dawson, French, & Unis, 1987b; Kazdin, Siegel, & Bass, 1992); Problem-Solving Skills Training and Practice (Kazdin, Bass, Siegel, & Thomas, 1989); Problem-Solving Skills Training and Parent Management Training (Kazdin, Esveldt-Dawson, French, & Unis, 1987a)
Group Assertiveness Training: Counselor-led (Huey & Rank, 1984); Peer-led (Huey & Rank, 1984)
Multidimensional Treatment Foster Care (Chamberlain & Reid, 1998; Leve, Chamberlain, & Reid, 2005)
Multisystemic Therapy
Multisystemic Therapy (Borduin et al., 1995; Henggeler, Melton, Brondino, Scherer, & Hanley, 1997; Henggeler, Melton, & Smith, 1992; Henggeler, Pickrel, & Brondino, 1999)
Classification: Probably Efficacious (Eyberg, Nelson, & Boggs, 2008)

CBT
Group Anger Control Training (Feindler, Marriot, & Iwata, 1984)
Reaching Educators, Children, and Parents (RECAP; Weiss, Harris, Catron, & Han, 2003)
Behavior Therapy
Triple P (Positive Parenting Program), standard group treatment (Leung, Sanders, Leung, Mak, & Lau, 2003)
First Step to Success Program (Walker, Kavanagh, Stiller, Golly, Severson, & Feil, 1998)
Self-administered Treatment, plus Signal Seat (Hamilton & MacQuiddy, 1984)
Incredible Years: Parent Training and Child Training (Webster-Stratton & Hammond, 1997); Parent Training and Teacher Training (Webster-Stratton, Reid, & Hammond, 2004); Parent Training, Teacher Training, and Child Training (Webster-Stratton, Reid, & Hammond, 2004); Teacher Training and Child Training (Webster-Stratton, Reid, & Hammond, 2004)
Classification: Possibly Efficacious (Eyberg, Nelson, & Boggs, 2008)

(table continues)

74

Table 6 continued

Behavior/Intervention | Classification

Adolescent Substance Abuse

CBT
Group CBT (Battjes, Gordon, O'Grady, Kinlock, Katz, & Sears, 2004; Kaminer & Burleson, 1999; Kaminer, Burleson, Blitz, Sussman, & Rounsaville, 1998; Kaminer, Burleson, & Goldberger, 2002; Liddle, Rowe, Dakof, Ungaro, & Henderson, 2004; Waldron, Ozechowski, Turner, & Brody, 2005; Waldron, Slesnick, Brody, Turner, & Peterson, 2001)
Multidimensional Family Therapy (Dennis et al., 2004; Liddle, Dakof, Diamond, Parker, Barrett, & Tejeda, 2001; Liddle, Rowe, Dakof, Ungaro, & Henderson, 2004)
Functional Family Therapy (Waldron, Ozechowski, Turner, & Brody, 2005; Waldron, Slesnick, Brody, Turner, & Peterson, 2001)
Classification: Well-Established (Waldron & Turner, 2008)

Family Therapy
Brief Strategic Family Therapy (Santisteban et al., 2003)
Integrated Behavioral Family Therapy (Hops et al., 2007; Waldron et al., 2007; Waldron, Ozechowski, Turner, & Brody, 2005; Waldron, Slesnick, Brody, Turner, & Peterson, 2001)
Multisystemic Therapy (Henggeler, Pickrel, & Brondino, 1999; Henggeler, Clingempeel, Brondino, & Pickrel, 2002)
Classification: Probably Efficacious (Waldron & Turner, 2008)

Adolescent Anorexia Nervosa

Family Therapy for AN (Russell, Szmukler, Dare, & Eisler, 1987)
Classification: Probably Efficacious (Keel & Haedt, 2008)

Psychoanalytic Therapy (Self Psychology) for AN (Bachar, Latzer, Kreitler, & Berry, 1999)
Cash's Body Image Therapy, plus Virtual Reality (Perpina et al., 1999)
Classification: Possibly Efficacious (Keel & Haedt, 2008)

(table continues)

75

Table 6 continued

Behavior/Intervention | Classification

Adolescent Bulimia Nervosa

CBT
Guided self-care for binge-eating in BN (Schmidt et al., 2007)
Family Therapy for BN (Le Grange, Crosby, Rathouz, & Leventhal, 2007)
Classification: Possibly Efficacious (Keel & Haedt, 2008)

Child and Adolescent Bipolar Disorder

Family Therapy
Family-Focused Treatment for Adolescents (Miklowitz et al., 2008)
Psychotherapy
Multi-Family Psychoeducational Psychotherapy (Fristad, Verducci, Walters, & Young, 2009; Young & Fristad, 2007)
Classification: Probably Efficacious (Association for Behavioral and Cognitive Therapies [ABCT] & Society of Clinical Child and Adolescent Psychology [SCCAP], 2010b)

CBT
Child and Family-Focused CBT (West et al., 2009)
Dialectical Behavior Therapy (Goldstein, Axelson, Birmaher, & Brent, 2007)
Psychotherapy
Individual Family Psychoeducation (Young & Fristad, 2007)
Classification: Possibly Efficacious (ABCT & SCCAP, 2010b)

Autism

Behavior Therapy
Lovaas' Method (Cohen, Amerine-Dickens, & Smith, 2006; Eikeseth, Smith, Jahr, & Eldevik, 2002; Lovaas, 1987; Smith, Lovaas, & Lovaas, 2002)
Classification: Well-Established (Rogers & Vismara, 2008)

Behavior Therapy
Parent Training (Aldred, Green, & Adams, 2004; Drew et al., 2002; Jocelyn, Casiro, Beattie, Bow, & Kneisz, 1998)
Classification: Possibly Efficacious (Rogers & Vismara, 2008)

Table 7
Criteria for Classifying Evidence-Based Psychosocial Treatments
Well-Established Treatments
There must be at least two good group-design experiments, conducted in at least two
independent research settings and by independent investigatory teams, demonstrating
efficacy by showing the treatment to be:
a) statistically significantly superior to pill or psychological placebo or to another
treatment
OR
b) equivalent (or not significantly different) to an already established treatment in
experiments with statistical power being sufficient to detect moderate differences
AND
Treatment manuals or logical equivalent were used for the treatment
Conducted with a population, treated for specified problems, for whom inclusion
criteria have been delineated in a reliable, valid manner
Reliable and valid outcome assessment measures, at minimum tapping the problems
targeted for change, were used, and
Appropriate data analyses were used
Probably Efficacious Treatments
There must be at least two good experiments showing the treatment is superior
(statistically significantly so) to a wait-list control group
OR
One or more good experiments meeting the Well-Established Treatment Criteria with
the one exception of having been conducted in at least two independent research
settings and by independent investigatory teams
Possibly Efficacious Treatments
At least one good study showing the treatment to be efficacious in the absence of
conflicting evidence
Experimental Treatments
Treatment not yet tested in trials meeting task force criteria for methodology
Adapted from the Division 12 Task Force on Psychological Interventions reports
(Chambless et al., 1996, 1998), from Chambless and Hollon (1998), and from Chambless
and Ollendick (2001).

76

CBT and BT, other therapies that have been determined to be well-established include
Family Therapy, Multidimensional Family Therapy, Functional Family Therapy,
Interpersonal Therapy, Behavioral Parent Training, Behavioral Classroom
Management, and Behavioral Peer Interventions.
One reason for the high level of empirical support for CBT and BT techniques
could be related to the overlap between best practices in counseling and central tenets of
these treatments. For example, cognitive-behavioral practices are grounded in
empirically validated psychological theories, such as those related to learning and
cognition. The selection of specific treatment strategies is based on specific
characteristics of the child and on EBIs that have been shown to effectively address the
behaviors being displayed. Empiricism is the foundation for cognitive-behavioral
practices. The description and treatment of problematic behaviors is done using objective
terms and definitions, measurable goals, and a quantitative analysis of behavior. A major
component of treatments using CBT involves gathering objective data before, during, and
after an intervention. On-going evaluation of treatment goals is essential, as decisions
governing further intervention are made by examining data demonstrating the efficacy of
strategies already in use.
Section summary. School psychologists have acknowledged the current mental
health needs of students and the connection between mental health and academic success
(Haertel et al., 1983; Wang et al., 1990). Data
collected describing the current counseling practices of school psychologists detail their
efforts to address the needs of the students with whom they work. Evidence of the
commitment to meeting the behavioral needs of students can be seen through the research
focus on the development of best practices related to indirect and direct intervention,
particularly in the area of counseling, the commitment to the use of evidence-based
interventions and research in this area, and the application of the problem-solving model
to the design and implementation of behavioral interventions. If research describing best
practices, evidence-based interventions, and the application of the problem-solving model
is applied to the design and implementation of counseling interventions, school
psychologists would employ strategies proven to be effective in addressing behavioral
problems in a manner that allows them to continuously monitor whether their efforts are
impacting student behavior as intended. Although research in these areas is on-going, the
use of evidence-based interventions and the tenets of regular progress-monitoring of
student behavior and data-based decision making inherent in the problem-solving model
allow school psychologists to hold themselves accountable for meeting the mental health
needs of students when implementing counseling interventions with students who display
problem behavior.
Legislation Impacting the Field of School Psychology
Just as the field of school psychology has evolved, federal
education policy has developed and changed in response to the needs of students.
Pertinent legislation influencing the practice of school psychology has been shaped by
federal involvement regarding students with disabilities, and students who come from
socially disenfranchised or economically disadvantaged backgrounds. The federal
government has responded to reports indicating that these groups of students were not
being provided with appropriate educational opportunities by passing laws and
regulations to address these inequalities and ensure that certain standards of practice are
followed. Over time these laws have been amended, and the role of the school
psychologist has evolved in response to these changes. This section provides an
overview of how federal policy and agencies, cases, and expository reports have
influenced education in general and the practice of school psychology more specifically.
A summary of selected influential federal policies and agencies, cases, and expository
reports can be found in Tables 8, 9, 10, and 11.
This selected review of federal legislation, court decisions, expository reports, and
agencies provides an important context for examining the current climate of education in
American schools. From the 1960s through the 1980s, a series of expository
reports (e.g., A Nation at Risk, Time for Results) publicized weaknesses in the education
of students in American schools, while at the same time proposing standardized testing
and increased autonomy for schools, contingent on improved outcomes. Rulings in
several court cases have specified standards for assessing special education eligibility and
determining how best to educate students with disabilities (e.g., PARC v. Pennsylvania,
Mills v. Board of Education). In response to these decisions, the federal government has
passed a series of laws codifying these decisions, and as such, shaped the role of the
school psychologist (e.g., IDEA). Several agencies have been created in order to
disseminate information on effective instruction, monitor the allocation and use of federal
funding, and collect and organize information detailing student achievement (e.g., NAEP,
OERI, National Assessment Governing Board). Federal legislation has evolved from
providing funding for programs to benefit students from minority or economically
disadvantaged backgrounds, to making such funding contingent on improvements in
student outcomes. Increased government regulation and accountability for student

79

80

Table 8

Selected Federal Legislation Impacting School Psychology

Title and Year | Description

Elementary and Secondary Education Act (ESEA; 1965)
Provided schools financial aid to be used to benefit economically disadvantaged children

Title VII Bilingual Education Act (1967)
Title added to the ESEA to address economic disadvantage caused by limited English proficiency by funding bilingual classes

Stennis Amendment (1970)
Crespino (2006):
Applied desegregation requirements to all schools in the U.S.
Prevented parents from choosing where to send their children if doing so meant that schools would be racially imbalanced

Rehabilitation Act of 1973
Extended the civil rights of students with disabilities by prohibiting their exclusion from programs receiving federal funding

Equal Education Opportunity Act (1974)
Affirmed the entitlement of all students in public schools to equal educational opportunity, as determined by the neighborhood in which they lived
Specified procedures for dismantling the dual school system

ESEA Amendments (1974)
Increased federal aid for compensatory programs for students in poverty by 23%

Bilingual Act of 1974
Stewner-Manzanares (1988):
Targeted aid to schools with large numbers of non-English-speaking students

(table continues)

81

Table 8 continued

Title and Year | Description

Education for All Handicapped Children Act (1975)
Beyer (1989):
Mandated that all children with disabilities between the ages of 5-21 be provided with a free appropriate education through the use of special education and related services tailored to meet their individual needs
Specified due process provisions to protect the rights of parents and guardians and include them in educational decision-making
Provided increased financial assistance to States and localities to fund categorical programs for children with disabilities

Department of Education Organization Act (1979)
U.S. Department of Education (2010):
Created a cabinet-level position to regulate federal education programs

Educational Consolidation Improvement Act (1982)
Gray, Caulley, & Smith (1982):
Reduced federal education spending by 15%
Provided each state with a fixed amount of money depending on the number of students in need of special services, allowing them to spend these funds as they deemed appropriate

Education of the Handicapped Act Amendments (1986)
House Committee on Education and Labor (1986):
Required that states provide placements and services for students with disabilities according to federal mandates using either federal or state funding
Mandated special education services for students ages 3-5
Provided financial incentives to states providing services for children from birth to three years old

(table continues)

82

Table 8 continued

Title and Year | Description

Hawkins-Stafford School Improvement Amendments (1988)
House Committee on Education and Labor (1990):
Increased the amount of federal aid to schools documenting increases in student achievement using test scores or other achievement measures
Mandated increased regulation from local districts and state departments of education for schools unable to document gains
Increased funding for school-wide reforms

Individuals with Disabilities Education Act (1990)
Merrell, Ervin, & Gimpel (2006):
Reauthorized the original provisions of EHA
Required transition services for students with disabilities
Added autism and traumatic brain injury to the list of federal disability conditions for which special education services are provided
School of Public Health and Health Professions, University at Buffalo (2005):
Defined Assistive Technology Devices and Services to be included in student IEPs
Extended the Least Restrictive Environment clause requiring that, to the maximum extent, students with disabilities be educated with their non-disabled peers

Goals 2000: The Educate America Act (1994)
Provided funding to states in the planning or implementation phases of systemic reform based on a set of national education goals related to student outcomes and educational achievement
Offered grants to schools for the development of standards and assessments

Improving America's Schools Act (1994)
Provided federal funding contingent on a state's ability to align content and performance standards, instruction, testing, teacher training, curriculum, and accountability

(table continues)

83

Individuals With Disabilities Education


Act (1997)

Table 8 continued
Title and Year

83

National Center for Children and Youths with Disabilities (1998):


Required that students with disabilities participate in state and local
assessments, with accommodations if necessary
Specified that schools use a variety of assessment tools to determine educational
needs, while only gathering new assessment data when necessary
Prevented schools from classified students as having a disability based on
inadequate instruction or limited English proficiency
Stressed the role of parents as members of the IEP team (e.g., requiring their
consent for evaluations and input and participation in educational decisions)
Mandated that IEPs make more explicit connection between special education
and the general curriculum
Detailed special factors (e.g., behavioral supports, language and communication
needs, assistive technology) to be considered by teams writing IEPS
Required a uniform number of progress reports for students with and without
disabilities
Included new provisions related to planning transition services and the transfer
of legal rights for students reaching the age of majority

Description
Required that economically disadvantaged students be held to the same
standards as their peers

84

Table 9
Court Decisions Impacting the Practice of School Psychology

Mills v. Board of Education (1972)
Nelson & Weinbaum (2009):
Ruled that special classes for students with disabilities were allowed only if such environments were found to improve the quality of education for students
Mandated that schools provide adequate services to students with disabilities, in integrated or separate classes, according to recommendations reached at regular placement meetings, regardless of the financial cost to the school

Pennsylvania Association for Retarded Children (PARC) v. Pennsylvania (1972)
Nelson & Weinbaum (2009):
Ruled that students with disabilities in public schools must be educated in their least restrictive environment
Established the precedent that students with disabilities had to be mainstreamed into regular classes whenever possible as a necessary condition for providing equal educational opportunities

San Antonio Independent School District v. Rodriguez (1973)
Nelson & Weinbaum (2009):
Ruled that federal courts would uphold states' school funding systems provided that state education systems developed the basic skills necessary for students to participate in a democratic society
Made states responsible for covering the cost of educational services mandated by federal laws or federal court decisions
Specified that school funding decisions would no longer be handled by federal courts, as educational quality and resources could no longer be considered federal rights

Larry P. v. Riles (1979, 1986)
Jacob & Hartshorne (2007):
Established the legal precedent that standardized tests administered to children from diverse cultural backgrounds must have been validated for this purpose

Table 10
Expository Reports Impacting the Practice of School Psychology

Title I of the ESEA: Is it helping poor children?
Washington Research Project & NAACP Legal Defense and Educational Fund (1969):
Reported a lack of evidence connecting Title I funding and increases in academic achievement among students in poverty
Highlighted the misuse of Title I funds in several states where it had been found that money had been disproportionately allocated to suburban districts

American Institute for Research Study (1977)
Nelson & Weinbaum (2009):
Three-year study indicating that innovative programs in schools had not produced improvements in reading, and were shown to have had a negative impact on growth in arithmetic skills

A Nation at Risk: The Imperative for Educational Reform
National Commission on Education (1983):
Used standardized test score data to publicize the weak academic performance of American students
Recommended a nationwide system of standardized tests to measure educational achievement and opportunity as an alternative to financial aid, special programs, or racial desegregation
Mandated that schools demonstrate increases in test scores in order to remain eligible for federal aid

Time for Results (1986)
Alexander (1986):
Proposed that schools be released from government regulations if they produced measurable gains in student achievement
Limited the autonomy of schools failing to demonstrate improvement

Table 11
Agencies and Initiatives Impacting the Practice of School Psychology

National Assessment of Educational Progress (NAEP)
U.S. Department of Education Institute of Education Sciences (2011):
Established in 1964
National testing system providing data about strengths, weaknesses, and changes in student achievement

National Institute of Education
U.S. National Archives and Records Administration (2011):
Created in 1972
Serves as an accountability mechanism for federally funded education programs through study of the connection between federal dollars and academic achievement

Office of Educational Research and Improvement (OERI)
Nelson & Weinbaum (2009):
Created in 1979
Collects and disseminates research on effective teaching strategies, curriculum, and administration through the Educational Resources Information Center (ERIC)
Examines different cause and effect relationships among school variables

National Assessment Governing Board
Nelson & Weinbaum (2009):
Formed in 1988
Administers and regulates state-by-state reporting of NAEP results
Shifted the emphasis of federal education policy from input variables to student outcomes by changing the interpretation of test scores from showing what students were capable of to establishing achievement levels on NAEP scales showing what students should be able to do

improvement are two factors that continue to impact the field of education in present
times.
Contemporary legislation. Shortly after taking office in 2001, George W. Bush
proposed, as one of his first legislative actions, the No Child Left Behind Act (NCLB;
2002), which would serve as a reauthorization of the ESEA and IASA. The NCLB
represented a continuation of several provisions from its previous versions, including
Title I, the 21st Century Schools Act, bilingual education, Title II grants funding
innovation, and a sizeable reading program, among others. Provided in Table 12 is a
summary of the key provisions of NCLB.
The provisions of NCLB signified the intent and commitment of the federal
government to measure and monitor the educational achievement of all students in
schools. At the same time, teachers and administrators were being held increasingly
responsible for demonstrating student progress and proficiency across a variety of subject
areas. In addition, parents were empowered with information and alternatives for
remediation to ensure that their children were being provided with quality instruction.
After a lengthy series of negotiations, NCLB was passed in 2001. From
the beginning, various education interest groups, such as the National School Boards
Association, the American Association of School Administrators, the National Education
Association, and the National Conference of State Legislatures, have voiced concern over
this bill, arguing that school districts would not be able to meet the demands of
this piece of legislation given the limited federal funding it provided. In response to this
concern, the Bush administration maintained that, without federally mandated
accountability and assessment practices outlined in NCLB, states would continue the


Table 12
Key Provisions of NCLB
Accountability for Student Outcomes:
provided federal financing to bolster achievement using standards, assessments, and
accountability regulations
mandated that standards and assessments were to be applied to all students
specified accountability measures in the form of corrective actions for schools in need
of improvement
detailed a formula to determine how and when to take corrective action for schools
that failed to meet progress targets, such that:
a) by the year 2014, all students were expected to be performing at a proficient
level in reading, mathematics, and science
b) each school year, gains must have been made in students' adequate yearly
progress (AYP) such that 100% proficiency would be reached by 2014
c) the annual rate of progress would be calculated for aggregated as well as
disaggregated student groups based on income, race, gender, English language
proficiency, and special education classification, with the entire school considered
in need of improvement if any one of these groups were not meeting goals for
expected progress
Adequate Yearly Progress
A school receiving Title I funding that had not met AYP for two consecutive years
was to be referred to as a school in need of improvement:
the school was given the responsibility of writing a plan for improving students'
educational progress
the local education agency provided the school with technical resources for plan
implementation
students were given the choice of transferring to another school within the district
that was not in need of improvement
If during the following year, the school was still not able to make AYP, it retained the
status of a school in need of improvement:
Students retained the option to transfer
Students from low socioeconomic backgrounds could receive supplemental
educational services, such as tutoring or remedial classes, from either a public or
private state-approved agency
When a school did not make AYP for four consecutive years:
The district enforced corrective actions, such as replacing staff or making
curricular changes
Parents were given the opportunity to send their children to a different school
In the event that AYP was not met for a fifth consecutive year, a restructuring plan
was implemented by the school district:
The school could reopen as a charter school
Staff members could be replaced
The leadership of the school could be turned over to the state or a private agency
Effective Instruction
stressed the use of effective instructional methods, particularly in reading, by
offering grants to fund research-based instructional programs
required that teachers meet certain training standards, including the completion of
a bachelor's degree, demonstration of competency in specific areas of instruction,
and documentation that they had met their state's requirements for licensure or
certification
required that paraprofessionals meet certain training standards, including the
completion of two years of college, or demonstration of their ability to support
student learning in reading, writing, and math
mandated that schools make public the certification status and educational
attainment of the teachers and paraprofessionals employed in their buildings
required that schools begin conducting yearly testing in reading, math, and
science for students in grades 3-8 to determine whether or not students were
meeting goals for AYP, with the overall goal that all students would be proficient,
or demonstrating grade-level competency, by the 2013-2014 school year


legacy of leaving behind students with disabilities and those from minority or
economically disadvantaged backgrounds. To provide extra funding, NCLB permitted
states to reallocate funding from non-Title I federal programs into their Title I budgets.
Additionally, the State and Local Flexibility Demonstration Act allowed states to redirect
administrative and activity funds from other ESEA programs into supplemental learning
programs that were specifically designed to help students make AYP. Overall, NCLB
increased Title I funding by 20% for schools in urban areas or areas with a high
concentration of students from economically disadvantaged backgrounds.
In order to meet yearly testing requirements, many states needed only to revise the
testing programs they had previously created using Goals 2000 funding to account for the
provision that all students in grades 3-8, rather than certain samples or benchmark grades,
were tested. By 2001, 49 states had written content standards and mandatory tests for
graduation and grade-level promotion. Despite this fact, the quality of proficiency
standards varied from one state to another (Nelson & Weinbaum, 2009). As a result,
variability was also found when determining AYP, resulting in a non-uniform distribution
of schools in need of improvement between and within states.
The most current revision to IDEA occurred in 2004, when the Individuals with
Disabilities Education Improvement Act of 2004 (IDEA 2004; Wright & Wright, 2009)
was signed into law (Merrell, Ervin, & Gimpel, 2006). Jacob and Hartshorne (2007)
provided information related to the passage of this latest revision. To guide their
amendments, Congress cited several notable research findings, which are listed in Table
13. This most current authorization of IDEA placed the onus on schools for providing
students with disabilities with effective early intervention services, evidence-based


Table 13
Research Findings Guiding IDEA 2004
The effective education of children with disabilities is achieved with a foundation
based on high achievement standards [20 U.S.C. 1400 (5)(A)]
Special education is a service and not a place [20 U.S.C. 1400 (5)(C)], and
therefore students with disabilities should be provided with access to the general
education curriculum in the regular education classroom [20 U.S.C. 1400 (5)(D)]
Funding should be provided for school-wide practices, evidence-based instruction in
reading, positive behavioral supports, and early intervention services [20 U.S.C.
1400 (5)(E)]
Data on the increasing diversity of the school-aged population highlights the need for
more effective instruction for students with limited proficiency in English [20 U.S.C.
1400 (10)(A)]

instruction, and behavioral supports, driven by high achievement standards. These
provisions have had direct relevance to school psychologists as they work directly and
indirectly not only with students with disabilities, but also at a systems level, to address
the academic and behavioral needs of all of the students in the buildings they serve.
Current status and future directions. To help states continue to meet the
challenge of instruction and assessment under NCLB and improve the educational
outcomes of students in American schools, $40 billion from the American Recovery and
Reinvestment Act was appropriated to state governments to cover the cost of budget
deficits and to fund educational requirements spelled out in Title I and IDEA legislation.
States could also apply for grant money to be awarded contingent on educational reform
efforts. An example of this was the Race to the Top Fund, a competitive grant
program awarding financial resources to states that submitted plans to address education
reform goals related to the use of internationally-benchmarked standards and
assessments, the recruitment and retention of quality teachers and principals, the
implementation of data systems to monitor student progress, and the development of
under-performing schools (U.S. Department of Education, 2009). All states have also
been required to facilitate the use of information detailing student achievement to assess
teacher and administrator performance, in addition to doing away with limits to the
number of charter schools allowed in each state (U.S. Department of Education, 2009).
Currently, the Obama administration, along with Arne Duncan, the Secretary of
Education, is drafting legislation for another revision of the ESEA, though no legislation
has yet been passed. Reauthorization efforts are being guided, however, by the following
goals: to improve the effectiveness of teachers and administrators; to give parents the
information they need to evaluate school choice and contribute to school effectiveness;
provide teachers and administrators with information on delivering effective instruction;
to determine and implement standards and assessments that will ensure that American
students are college- and career-ready; and, to provide students in under-performing
schools with support and interventions that will boost their educational achievement
(U.S. Department of Education, Office of Planning, Evaluation and Policy Development,
2010). Despite these goals, at this time, it is unknown whether evidence-based practices
and interventions are being consistently implemented for non-academic purposes.
Section summary. This section has reviewed past and current efforts by various
groups to address educational inequalities experienced by students with disabilities, and
students who come from socially disenfranchised or economically disadvantaged
backgrounds. An important theme that can be drawn from this review is that, over time,
weaknesses in the educational programming provided to different student groups have
been revealed, resulting in the federal government passing laws and regulations holding
school professionals accountable for demonstrating that these specific weaknesses have
been addressed. In contemporary times, more proactive measures have been
implemented, as school professionals are being held accountable for providing students
with quality educational experiences. Consequences are enforced when schools are
unable to demonstrate regular growth in student achievement. As education
professionals, school psychologists are now responsible for meeting the needs of a range
of students, as they address the educational and behavioral needs of the entire student
body through direct and indirect prevention and intervention efforts. Although current
legislation is still being drafted, it appears as though mandates for accountability, through
the implementation of standards, and regular monitoring and assessment of student
abilities and behaviors, will continue to impact the way schools are run, and the specific
practices of school psychologists.
Survey Research
Surveys are one traditional method that researchers have used to gather data about
specific phenomena (Dillman, 2007). Over time, the survey approach has been refined
through purposeful sampling methods, improved questionnaire design, and computerized
analysis of data (Evans & Mathur, 2005). Recent
improvements to survey methodology have been spurred on by advances in technology,
such that e-mail, web-based, and internet surveys have become a popular method of
survey implementation (Dillman, 2007; Heun, 2001; Jackson, 2003). This section will
discuss the assets and limitations associated with internet surveys. The conclusion of this
section includes guidelines for the creation of internet surveys provided by Dillman
(2007) in line with the Tailored Design Method.
Researchers have enumerated a number of benefits associated with internet
survey research. When compared to paper surveys, internet surveys offer flexibility and
convenience in design and administration, and allow for administrative savings
associated with a reduced need for paper, postage, mail-out, and data entry resources
(Couper, Kapteyn, Schonlau, & Winter, 2007; Dillman, 2007; Evans & Mathur, 2005;
Granello & Wheaton, 2004; Sax, Gilmartin, & Bryant, 2003). Researchers using internet
survey methods have potential access to an international sampling pool that might not
have been possible or as accessible using mail surveys (Dillman, 2007), as the number of
people using the internet continues to increase (Evans & Mathur, 2005; Pealer & Weiler,
2003). Internet surveys may also provide researchers with access to participants who are
reluctant to provide identifying or personal information, and the perceived anonymity
associated with internet surveys may yield more accurate responses (Granello &
Wheaton, 2004) as some participants consider this survey method to be more secure
compared to mail or telephone surveys (Granello & Wheaton, 2004; Van Selm &
Jankowski, 2006).
In many cases, a reduction in survey implementation and response time for
internet surveys has been noted (Dillman, 2007; Evans & Mathur, 2005; Granello &
Wheaton, 2004), as researchers are not encumbered by the time it takes for participants to
mail in their responses (Vaux & Briggs, 2006). Less time is also required when sending
follow-up requests for participation, which may result in higher response rates within a
specified timeframe for conducting research (Evans & Mathur, 2005). Online surveys
can also be designed to require participants to provide responses for all items before
submitting their surveys (Evans & Mathur, 2005). The survey designer can program a
skip pattern, preventing respondents from having to respond to irrelevant items (Dillman,
2007). In addition, the designer can program pop-up instructions for immediate
assistance rather than referring the respondent to a separate set of instructions removed
from the actual item (Dillman, 2007). Drop-down boxes containing lengthy lists of
possible responses can be used to make coding of answers easier for the researcher
(Dillman, 2007). Responses can be used to screen respondents (Alreck & Settle, 2004;
Dillman 2007) and automatically direct them to the next most relevant set of items
(Dillman, 2007).
Some respondents perceive internet surveys to be more interesting when they are
designed to be interactive, as a variety of pictures, animation, and audio and video clips
can be incorporated into an electronic survey (Dillman, 2007; Sax et al., 2003). Internet
survey questions can also be tailored to change dependent on participant response (Evans
& Mathur, 2005). Furthermore, internet surveys can also be completed during the
respondent's leisure time, whereas this is not always the case when researchers utilize
telephone survey methods (Sax et al., 2003).
Before deciding to use surveys as a method of data collection, there are several
limitations that researchers should consider. Once a survey has been designed and
administered, it cannot be altered during the data collection process. In
addition, at the conclusion of data collection, researchers may discover that the sample
they created and surveyed did not match the population of interest (Mangione, 1995).
Survey questions must be general enough to facilitate comprehension by a large number
of respondents, and as a result, may omit questions of interest to the researcher and
certain respondents (Barribeau et al., 2005). Self-administered surveys also do not take
into account idiosyncrasies in context associated with each respondent, and cannot
always account for how this will impact the accuracy of their responses in reference to
the intentions of the researcher (Barribeau et al., 2005). Furthermore, respondents may
have difficulty aligning their views and experiences with the dichotomies or scales
presented to them as answer choices on surveys, which is a potential threat to validity
(Barribeau et al., 2005). Additional concerns related to self-administered surveys include
bias in the responding sample (i.e., purposefully falsifying responses, picking the
response that immediately comes to mind as opposed to the most accurate response, or
selecting the same response without considering the question), nonresponse to certain
items, and misinterpretation of survey content or questions by survey respondents
(Mangione, 1995).
Despite the many benefits that are associated with internet surveys, this
methodology also has some weaknesses that researchers need to be aware of when
considering using them. Although researchers may have access to a larger population,
current response rates to internet surveys are poor (Dillman, 2007; Evans & Mathur,
2005), and are generally no better than postal mail survey response rates (Sax et al.,
2003). Researchers cannot assume that people have had previous experiences with
electronic surveys the way they may have had with mail and telephone surveys (Dillman,
2007). Furthermore, it is important to note that computer skills are not uniformly
developed throughout the population (Dillman, 2007; Evans & Mathur, 2005), and that
even within households, skill and comfort with the internet vary (Dillman, 2007).
Respondents may be reluctant to put their computers at risk for some form of computer
virus by clicking on a survey invitation link, and may also hesitate to respond to a survey
due to concerns that doing so will expose them to an individual or group who will
persistently send them junk email (Dillman, 2007; Evans & Mathur, 2005; Sax et al.,
2003). The level of technical sophistication that can be employed when designing an
electronic survey can also increase the chances of survey error or incompletion (Dillman,
2007), and can potentially make it difficult for respondents to view and respond to
surveys due to issues with screen configuration and differences in computer operating
systems (Dillman, 2007; Evans & Mathur, 2005; Granello & Wheaton, 2004). Currently,
internet service provision is a private industry, and as such, access to respondents is not
assumed the way it previously was with telephone surveys (Dillman, 2007). Many
professional organizations will not allow researchers to survey their membership without
obtaining some form of permission or initiating some form of relationship (Dillman,
2007).
Recent data gathered by the Nielsen Company (2008) estimated that 80% of
homes across the United States contain a computer (desktop or laptop), and among those,
91.6% have some form of internet access. Despite these findings, however, the
distribution of internet access is not uniform (Nielsen Company, 2008), may be limited
within various areas, cultures, and countries (Van Selm & Jankowski, 2006), and appears
to vary depending on a variety of factors. For example, internet access is correlated with
education level and the combined annual income of a given household, such that
increases in these factors also increase the likelihood of internet access (Chesley, 2006;
Nielsen Company, 2008; Redpath et al., 2006). Across the country, internet access is
lowest among Hispanic and African-American households (Sax et al., 2003), as well as
those in which the head of the household has not completed high school (Chesley, 2006;
Nielsen Company, 2008). The Southeast Central region of the United States,
encompassing Alabama, Mississippi, Tennessee, and Kentucky, contains the highest
number of households without internet access (Nielsen Company, 2008). In contrast,
larger cities, such as Washington, DC, Norfolk, Salt Lake City, Boston, and Portland,
contain the highest percentage of homes with internet access (Nielsen Company, 2008).
Given this information, researchers are cautioned that, although electronic survey
methods provide less expensive access to a larger portion of the population, and more
households report computer and internet access, it is inaccurate to assume that a
nationally-representative section of the population has been sampled without collecting
and analyzing demographic data (Andrews et al., 2003; Vaux & Briggs, 2006) to avoid
sampling bias and error (Alreck & Settle, 2004).
When considering the use of survey research to gather information, researchers
are advised to study the population of interest in order to determine whether this
population has uniform internet access and has a history of using internet surveys
(Dillman, 2007). It is more likely that researchers will obtain acceptable response rates
if their survey population has regular access to the internet and e-mail (Granello &
Wheaton, 2004; Pealer & Weiler, 2003). The potential for sample bias may be reduced
when, in their sampling practices, researchers account for differences in internet access
and use among different groups in the general population (Andrews et al., 2003; Vaux &
Briggs, 2006), and decide whether an internet based survey is the most appropriate
format for gathering information (Pealer & Weiler, 2003). If the use of an internet survey
is considered appropriate, researchers are advised to design internet surveys in such a
manner that they can be clearly read and interpreted (Dillman, 2007). In addition, it is
recommended that, before formal survey implementation, researchers pilot their surveys
with a small subset of the sample population to examine brevity, reliability, and validity
(Andrews et al., 2003; Granello & Wheaton, 2004).
Application of the Tailored Design Method (TDM; Dillman, 2007) when creating
internet surveys is one way that researchers can ensure that the questionnaires used in
their research can be clearly read and interpreted by respondents. The development of
TDM has occurred over time, guided by social exchange theory, and revised by research
results exploring specific aspects of the survey process and their effect on the quality and
quantity of responses (Dillman, 2007). Contemporary research on TDM has focused on
ways that the design and layout of internet surveys impact respondents. As a summary of
this research, Table 14 provides a list of guidelines from the TDM that researchers can
use to design surveys with minimal error to achieve accuracy and high response rates.
Although internet surveys are a popular and efficient method of collecting
information on a population of interest (Dillman, 2007; Heun, 2001; Jackson, 2003),
researchers are advised to consider several limitations and factors associated with this
survey method before implementing it (Dillman, 2007). For example, because internet
familiarity, use, and access are not uniform across the country (Dillman, 2007; Evans &
Mathur, 2005), researchers are cautioned to examine demographic characteristics related
to the population of interest to determine whether an internet survey is the most efficient
way to collect data (Andrews et al., 2003; Vaux & Briggs, 2006). When designing
internet surveys, attention should be paid to the use of words and symbols in order to
facilitate ease and accuracy of responses by participants (Dillman, 2007). Piloting the
survey with a small group of the target population is one way that researchers can obtain
valuable feedback related to the effectiveness of their surveys before engaging in large-scale implementation (Andrews et al., 2003; Granello & Wheaton, 2004).
Chapter Summary
One common theme throughout the topics discussed in this chapter is the
evolution of educational practices in response to the needs of students in American public
schools. The position of the school psychologist emerged in response to the rapid
expansion of the school population in the early 1900s, and continued to evolve over time
as methods of problem-solving, training, and the daily responsibilities of the school
psychologist developed and changed to provide an optimal learning environment.


Table 14
Guidelines and Considerations for Creating Internet Surveys Following the TDM
Framework
1) Present items in a forced-choice rather than a check-all format, as respondents are more likely to endorse items presented as forced choices.
2) For scalar questions with answer choices presented in a drop-down box, display all answer choices without requiring scrolling.
3) The use of symbols can increase response rates by cueing respondents to attend to specific elements of questions and response choices without adding words (Christian & Dillman, 2004).
4) When designing items with specific instructions that deviate from the general
directions, it is recommended that those instructions be placed after the question
but before the answer choices.
5) Numbering items can prevent respondents from making response errors. Some
respondents consider unnumbered items at the beginning of a survey to be
practice items, and as such, do not always answer them (Dillman & Redline,
2004). Skip patterns can be programmed so respondents answer only relevant
items, and in these cases, numbers can cause confusion. In place of numbers,
some survey designers will signify items using symbols, such as a question mark
or asterisk. This may be confusing to respondents, as the omission of a culturally
understood guideline may require extra attention to figuring out the use and
meaning of the symbol. Survey designers are cautioned to evaluate the use of
numbers in survey construction and navigation, as they are a culturally understood
and reliable method of survey navigation.
6) Cultural expectations prompt respondents to assume that the most positive
categories will be placed at the top of vertical scales and to the left on horizontal
scales, while the most negative categories will be at the bottom or to the right
(Tourangeau, Couper, & Conrad, 2004). Respondents also tend to assume that the
middle option will represent the average or typical value (Tourangeau, Couper, &
Conrad, 2004).
7) Responding to scalar questions is more difficult when graphics or verbal descriptions of the scale are absent from the respondent's visual display, requiring the respondent to refer back to the question stem and then to the response area.
8) Provide instructions that facilitate accurate responding the first time the respondent attempts an item, as an error message containing corrective feedback can cause the respondent to experience frustration and discontinue responding.
9) To enable respondents to provide answers in the correct format without receiving an error message, designers may need to provide multiple visual cues (e.g., appropriate answer space size, use of symbols instead of words, numbers clearly associated with symbols).


Research and best practices in counseling now focus on helping school psychologists
deliver counseling interventions that are evidence-based (Silverman & Hinshaw, 2008),
using the problem-solving model (Upah, 2008). The literature on student mental health
cites not only the connection between emotional well-being and academic success
(Haertel et al., 1983; Wang et al., 1990), but also the importance of the problem-solving
model and data-based decision making when designing and implementing counseling
interventions (Miltenberger, 2005; Tilly & Flugum, 1995; Upah, 2008).
Federal education legislation has also evolved over time to address students' needs, targeting disparities in the quality of education that result from economic disadvantage (e.g., ESEA) or special needs (e.g., IDEA). It is no longer enough, however, for those working with children simply to strive to meet students' needs: professionals in both the public and private sectors are being held increasingly responsible for using evidence-based practices and documenting accountability for their actions (Kazdin, 2008). Although federal legislation currently holds school professionals accountable for students' educational outcomes (e.g., NCLB), in time such standards of accountability may also explicitly apply to school psychologists as they provide counseling interventions to improve students' social-emotional and behavioral outcomes. Research documenting evidence-based interventions, the problem-solving model, and data-based decision making practices for counseling and behavioral interventions may also provide measures of accountability for these interventions. At this time, however, the extent to which school psychologists use evidence-based interventions, the problem-solving model, and data-based decision making in their counseling interventions is unknown.


Research Questions
This study set out to survey practicing school psychologists in an attempt to
identify (a) their general counseling practices, (b) their use of best practices related to the
problem-solving model, and (c) demographic variables that might impact their counseling
practices (e.g., training, years of experience, school size, other roles and responsibilities).
General counseling practices of school psychologists. The following questions
were designed to gather information related to the current counseling practices of school
psychologists.
1) What percentage of school psychologists provide group and/or individual
counseling? Are school psychologists counseling general or special education
students, or both?
2) Approximately how many students do school psychologists recommend
declassifying from counseling each year, and what reasons are most commonly
cited when making this recommendation?
3) Are there any demographic differences (e.g., training, years of experience, other
roles and responsibilities) related to school psychologists' group and individual
counseling practices?
4) What type of training and professional development have school psychologists
received related to planning and implementing counseling as a direct social-emotional and behavioral intervention?
School psychologists' use of best practices related to the problem-solving
model. The following questions were developed to determine which aspects of the
problem-solving model are most often employed in the design and implementation of


counseling interventions. These questions also explored whether demographic variables


were related to the use of these practices.
5) Which aspects of the general problem-solving model (i.e., behavioral definition,
baseline data, problem validation, problem analysis, goal-setting, intervention
plan development, measurement strategy, decision-making plan, progress
monitoring, formative evaluation, treatment integrity, and summative evaluation)
are most commonly used when designing and implementing counseling as a direct
intervention?
6) How frequently are specific components of selected steps of the problem-solving
model (i.e., behavioral definition, baseline data, behavioral goal, intervention plan
development, measurement strategy, decision-making plan, progress monitoring,
formative evaluation, treatment integrity, and summative evaluation)
implemented by school psychologists as they design and implement counseling as
a direct intervention?
7) Are there any demographic differences (e.g., training, years of experience, other
roles and responsibilities) between school psychologists in terms of their use of
the general steps of the problem-solving model when designing and implementing
counseling as a direct intervention?


CHAPTER 3: Methodology
Overview
The purpose of this chapter is to describe the methods and procedures that were
used in the current study. A description of the participants and instrumentation that were
used to gather information regarding the counseling practices of school psychologists is
provided.
Participants
Participants for the current study were selected from a random sample of school
psychologists who were listed in the registry of Nationally Certified School Psychologists
(NCSPs) on the National Association of School Psychologists (NASP) website
(www.nasponline.org). The original survey population was composed of 1,000 school
psychologists currently employed in public schools. A second sample of 500 school psychologists was subsequently drawn using the same method as the first sample.
Instrumentation
Several instruments were used in this study to explain to potential participants the
purpose and importance of this study and to gather information related to specific
counseling practices employed by school psychologists currently practicing in school
settings. Specific information describing each instrument is described in this section.
Survey. The survey employed in this study was modeled on a previous survey of
school psychology counseling practices (Yates, 2003), and was written using PsychData (www.psychdata.com), incorporating guidelines provided by Dillman (2007) according to the Tailored Design Method (TDM; see Appendix A). The use of the TDM was meant to
maximize response rates while at the same time minimizing survey error.


Survey items were phrased as specific statements about the provision of counseling services. The questionnaire consisted of a variety of closed-ended and partially closed-ended questions. The responses were designed to gather information about: a) the
specific practices used by school psychologists in their school counseling practice,
specifically in reference to the use of the problem-solving model and data-based decision
making; b) demographic variables that may potentially impact the percentage of time that
school psychologists spend on counseling; and, c) training that school psychologists have
received that would impact their school counseling practice.
Cover letter email. Prospective respondents received an email that identified
the researcher and affiliated institution (Appendix B). In addition, this email explained
the purpose and significance of the study, and concluded with a link that respondents
could click in order to access the survey. The cover letter informed respondents that their
participation was voluntary, and that they were free to withdraw from the study at any
time without penalty. Furthermore, respondents were made aware that specific
demographic information would not be collected, and that their responses would be
anonymous. This email also identified individuals who could be contacted in the event
that the respondent had any questions about the survey. Respondents were also informed
that, at the end of the survey, they would be invited to click another link where they could
provide contact information for the purposes of entering a raffle for one of two $50 gift
certificates.
Follow-up email. As planned, after two weeks of data collection, the survey
response rate was evaluated and found to be low, and a follow-up email was sent out to
prospective respondents. This email thanked those who had already responded, and


solicited the participation of those who had not (Appendix C). This email contained the
same information as the original cover letter email, in terms of identifying the researcher
and affiliated institution, and informing participants of the purpose, significance,
anonymity, and procedural safeguards associated with the study.
Procedure
Prior to sending cover letter emails inviting potential respondents to complete the
survey, five school psychologists from the Albany, New York area previewed the
instruments. These school psychologists were asked to complete the survey and provide
feedback related to the instruments' readability, structure, and clarity (Appendix D).
Based on this feedback, survey items were evaluated for content, design, and length.
All five school psychologists who were contacted participated in the pilot study.
Analysis of the feedback that was provided revealed that survey completion time ranged
from 10 to 20 minutes. The majority of the pilot subjects indicated that the questions and
terms used were clear and easy to understand. None of the questions was deemed
unnecessary or irrelevant. Provided in Table 15 is a summary of respondent comments
and suggestions that were considered in revisions to the survey before its final
implementation. Overall, the results of the pilot study were positive and supported the
decision to move forward with the study.
This study was then conducted via an online survey. The cover letter email was sent to a random sample of school psychologists selected from the NASP NCSP directory. This cover letter included a link to the survey created using PsychData, which
is a secure website that allows for the collection of research data while preserving
respondent anonymity. In order to maintain anonymity, specific identifying information,


Table 15
Summary of Feedback From Pilot Subjects

1. Comment/Suggestion: Differentiate between school psychologists who provide crisis-based or more long-term counseling. Changed: No. Rationale/Outcome: Subjects could specify their counseling practices on Item 4 and at the end of the survey.

2. Comment/Suggestion: Provide an option for reporting the number of students discontinued from counseling in relation to the number of students seen for counseling each year. Changed: No. Rationale/Outcome: The relationship between students counseled and students seen each year was not considered pertinent to the research questions guiding this study.

3. Comment/Suggestion: Change the presentation of the psychologist:student ratio (Item 9), as this might be confusing. Changed: No. Rationale/Outcome: The design of this question is patterned on similar questions in the research literature (e.g., Bramlett et al., 2002; Fagan, 1988, 2008).

4. Comment/Suggestion: Items requesting the frequency of certain counseling practices were written such that respondents could select more than one response. Changed: Yes. Rationale/Outcome: Questions were revised to restrict responses to a single frequency for each practice.

5. Comment/Suggestion: Add an option for collecting 0 progress monitoring data points to Item 68. Changed: No. Rationale/Outcome: The survey is designed so that respondents who report not collecting any progress monitoring data points on Item 61 are not directed to Item 68.

6. Comment/Suggestion: Adjust the length of the survey, as respondents may not complete all items. Changed: No. Rationale/Outcome: The survey is tailored to the individual counseling practices of each respondent so that respondents only see items relevant to their experience; the length is necessary to answer the research questions; no items were rated as unnecessary; completion time was not indicated to be unreasonable.


such as respondents names, was not requested. At the end of the survey, all respondents
were invited to follow a second link to enter a raffle for one of two $50 gift certificates as
compensation for their participation. Illustrated in Table 16 are the mailing and response
data.
Initially, survey invitations were sent to 1,000 NCSPs. Due to a low response
rate, a second sample of 500 NCSPs was gathered and sent an email invitation to
complete the survey. Both the original sample and the second sample received initial and
reminder email invitations. In an attempt to increase the response rate, the initial sample
received a second reminder email. Each mailing had a certain number of emails that
were returned to the sender as undeliverable. Additionally, some of those contacted
indicated that they (a) were retired, (b) did not provide counseling, and/or (c) did not
provide psychological services in a school and therefore did not complete the survey.
The total number of emails sent (n=1,500) was adjusted for the emails that were returned
(n=171) and the replies declining participation in the survey for one of the reasons listed
earlier in this paragraph (n=65). Dividing the total number of responses (n=283) by the
adjusted number of valid emails (n=1264) yielded a total response rate of 22.39%.
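The response-rate arithmetic above can be reproduced in a few lines (a sketch; all figures are taken directly from the text and Table 16):

```python
# Response-rate calculation from the mailing figures reported above.
emails_sent = 1500            # total survey invitations emailed
returned_undeliverable = 171  # emails returned to sender as undeliverable
declined = 65                 # retired, not providing counseling, or not school-based

# Adjust the denominator to count only valid, eligible email addresses.
valid_emails = emails_sent - returned_undeliverable - declined
responses = 283

response_rate = responses / valid_emails
print(valid_emails)            # 1264
print(f"{response_rate:.2%}")  # 22.39%
```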

Table 16
Mailing and Response Data

Mailing (Date Sent)                          n Sent   Completed Surveys   Return Rate
Initial Sample (8/29/11)                      1,000          97              11.43%
Initial Sample Reminder Email (9/12/11)       1,000          64               6.58%
Second Sample (9/19/11)                         500          50              10.46%
Second Sample Reminder Email (10/3/11)          500          18               3.67%
Initial Sample Reminder Email (10/16/11)      1,000          54               5.53%
Total                                         1,500         283              22.39%
Note: Across all mailings, 171 emails were returned undeliverable, and 65 recipients declined to participate because they were retired (n=25), did not provide counseling (n=23), or were not employed in a school (n=17).

CHAPTER 4: Results
Overview
This study was conducted to gain information about the counseling practices of
school psychologists. This chapter summarizes the results of the survey data gathered in
response to the research questions presented in Chapter 2. A discussion of the respondent
characteristics and demographic data follows, along with the quantitative and qualitative
results of the survey.
Data Analysis Plan
Several methods of data analysis were used to address the research questions
posed in this study. Descriptive statistics (e.g., frequencies, percentages, and averages)
were calculated to report on the general counseling practices of school psychologists, and
their use of the problem-solving model and accountability. A series of chi-square
analyses were conducted to determine the presence of relationships between counseling
practices and use of the problem-solving model and accountability and the demographic
information describing the school psychologists in this sample. A table displaying the
specific data analysis procedure used for each survey item can be found in Appendix E.
Respondent Characteristics/Demographic Data
As discussed in Chapter 3, an online survey was created using PsychData and
emailed to a national sample of 1,500 randomly selected Nationally Certified School
Psychologists (NCSPs). A total of 283 survey responses were analyzed, representing a
response rate of 22.39%. Demographic characteristics of the respondents are
summarized in Table 17. It should be noted that not all participants completed all of the
demographic information items, and that the rate of non-response is listed next to each


Table 17
Demographic Characteristics of Respondents

Variable (% of Non-Response)                               n       %
Graduate Degrees Received
  MA/MS                                                   43    15.4
  Certificate/Specialist                                 172    61.6
  PhD/PsyD/EdD                                            64    22.9
Graduate Program Accreditation
  NASP                                                   247    87.3
  APA                                                     78    27.6
  NCATE                                                   50    17.7
  Your State                                             175    61.8
  Not Accredited                                           1     0.4
Course Topics Included in Graduate Coursework
  Academic Interventions                                 247    87.3
  Behavioral Interventions                               256    90.5
  Counseling and Psychotherapy With Children             244    86.2
  Counseling Children With Developmental Disabilities    127    44.9
  Group Counseling                                       212    74.9
  Multicultural Counseling                               169    59.7
Years Since Last Degree Earned
  0-5                                                    125    45.0
  6-10                                                    62    22.3
  >10                                                     91    32.7
Years of Employment in a School Setting (1.1%)
  0-5                                                    116    41.4
  6-10                                                    54    19.3
  >10                                                    110    39.3
Grade Levels Served By School Psychologists
  Elementary School Students                             220    77.7
  Middle/Junior High School Students                     136    48.1
  High School Students                                   123    43.5
Types of Schools Served (2.5%)
  Rural                                                   67    24.3
  Suburban                                               106    38.4
  Urban                                                   50    18.1
  Mixed                                                   43    15.6
  Other (e.g., reservation, department of defense
    school, small city)                                   10     3.6
Psychologist to Student Ratio (3.2%)
  1:<500                                                  33    12.0
  1:500-999                                               77    28.1
  1:1,000-1,499                                           81    29.6
  1:1,500-2,000                                           47    17.2
  1:>2,000                                                36    13.1
Region of Employment (1.8%)
  Northeast                                               70    25.2
  Midwest                                                 60    21.6
  South                                                   75    27.0
  West                                                    73    26.3
Themes of Professional Development Attended Over the Past Five Years
  No Child Left Behind                                    61    21.6
  Academic and Behavioral Accountability                 162    57.2
  Provision of Counseling                                104    36.7
  Evidence-Based Behavioral Interventions                226    79.9
  Data-Based Decision Making                             195    68.9
  Response to Intervention                               245    86.6
  Other (e.g., Autism, Crisis Response, Positive
    Behavior Support, Ethics, Legal Mandates)             49    17.3

variable displayed. Percentages were calculated from the total number of participants
that provided demographic data on each item, and not from the overall number of
participants. The majority of respondents held certificate or specialist degrees (61.6%)
from NASP-approved (87.3%) and state-accredited (61.8%) graduate programs. Their
coursework included training related to academic (87.3%) and behavioral (90.5%)
interventions, counseling and psychotherapy with children (86.2%), and group counseling
(74.9%). Over half of the respondents (59.7%) also studied multicultural counseling as
part of their training. Almost half of the respondents (45%) completed their graduate
training within the last 5 years. Nearly equal numbers of respondents indicated that they
had been employed in a school setting for 5 years or less (41.4%) or for more than 10 years (39.3%). Typical school settings were suburban (38.4%) elementary (77.7%) schools, at a psychologist-to-student ratio of 1:500-999 (28.1%) or 1:1,000-1,499 (29.6%). Respondents were employed in similar numbers across the different regions of
the United States (responses range from 21.6% to 27% depending on region). Within the
past five years, many had attended professional development related to Response to
Intervention (86.6%), Evidence-Based Behavioral Interventions (79.9%), Data-Based
Decision Making (68.9%), and Academic and Behavioral Accountability (57.2%). As
displayed in Table 18, in terms of time allocation, 42.2% of respondents spent between
25-50% of their time on assessment, and devoted 25% of their time or less to direct
interventions (80%), consultation and indirect services (70.9%), research (100%),
administration (96.4%), or systems-level activities (97.1%).
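As a concrete check on how these figures were derived, each item's percentages use only that item's respondents as the base, not all 283 participants (a sketch; counts taken from the graduate-degree rows of Table 17):

```python
# Percentages for the "Graduate Degrees Received" item are computed from
# the 279 respondents who answered that item, not from all 283 participants.
degree_counts = {"MA/MS": 43, "Certificate/Specialist": 172, "PhD/PsyD/EdD": 64}

item_n = sum(degree_counts.values())  # 279 item respondents
percentages = {k: round(100 * v / item_n, 1) for k, v in degree_counts.items()}
print(percentages)
# {'MA/MS': 15.4, 'Certificate/Specialist': 61.6, 'PhD/PsyD/EdD': 22.9}
```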
General Counseling Practices
In addition to demographic data, respondents were asked to provide general


Table 18
Estimates of Time Allocation

                                                        % Time Allocation (n)
Activity                        M (SD)           0-25         26-50        51-75      76-100
Assessment                    42.67 (23.00)    28.4 (78)   42.2 (116)   20.7 (57)   8.7 (24)
Direct Interventions          16.4 (14.67)    80.0 (220)    18.2 (50)    1.8 (5)    0.0 (0)
Consultation and
  Indirect Services           22.21 (14.07)   70.9 (195)    26.5 (73)    1.8 (5)    0.7 (2)
Research                       2.19 (9.72)   100.0 (275)     0.0 (0)     0.0 (0)    0.0 (0)
Administration                 5.93 (9.72)    96.4 (265)     3.3 (9)     0.0 (0)    0.4 (1)
Systems-level Activities       7.41 (10.14)   97.1 (267)     2.2 (6)     0.7 (2)    0.0 (0)
Other (e.g., paperwork)        3.55 (8.00)    98.2 (270)     1.1 (3)     0.4 (1)    0.4 (1)
Note: 2.8% non-response rate for each activity within this item

information related to their counseling practices. As shown in Table 19, the majority of respondents (54.8%) provide both group and individual counseling, and most serve both general and special education students (73.5%).
students recommended for discontinuation from counseling each year, with respondents
recommending discontinuation at low (e.g., 0 [15.1%], 1 and 3 [13.8%], 2 [24.6%]
students) and high frequencies (e.g., >7 [12.5%] students). The most common reason for
recommending discontinuation from counseling was that counseling goals were met
(55.9%). Nearly half of the sample (44.4%) indicated using print or online resources
when writing behavioral goals, clarifying the problem, or when determining behavioral
expectations.
Comparison of Demographic Variables and General Counseling Practices
In addition to exploring specific counseling and discontinuation practices of
current school psychologists, one of the goals of this survey was to determine whether
group and individual counseling practices varied by demographic characteristics (e.g.,
training, years of experience, other roles and responsibilities). To determine this, four
multinomial logistic regression models were created to ascertain whether respondents' graduate degree (training), years of experience, and time spent counseling could predict
the type of counseling they engaged in, the students they served, the number of students
they discontinued each year, and their reasons for discontinuation. For the purposes of
analysis, the variable graduate degree was re-categorized from three to two categories,
such that the first category included respondents with a MA/MS or Certificate/Specialist
degree, and the second category included respondents with doctoral-level training (PhD,
PsyD, EdD). The number of students discontinued from counseling each year was also


Table 19
Counseling Practices of School Psychologists

Variable (% Non-Response)                                        n       %
Type of Counseling Provided (14.5%)
  Group Counseling Only                                         19     7.9
  Individual Counseling Only                                    68    28.1
  Group and Individual Counseling                              155    54.8
Children Served in Counseling (14.5%)
  Special Education Students                                    62    25.6
  General Education Students                                     3     1.2
  Special and General Education Students                       177    73.5
Number of Students Recommended for Discontinuing
Counseling Each Year (18%)
  0                                                             35    15.1
  1                                                             32    13.8
  2                                                             57    24.6
  3                                                             32    13.8
  4                                                             21     9.1
  5                                                             22     9.5
  6                                                              2     0.9
  7                                                              2     0.9
  >7                                                            29    12.5
Reasons for Discontinuing Counseling Services
  Counseling Goals Have Been Met                               127    55.9
  No Positive Effect on Behavior                                24    10.6
  Student Leaves School/District                                26    11.5
  Parent Preference                                              5     2.2
  Other (e.g., school policy, outside referral, student need)   45    19.8
Use of Print/Online Resources for Writing Goals, Problem
Clarification, or Determining Behavioral Expectations (12.4%)
  Use Print or Online Resources                                110    44.4
  Do Not Use Print or Online Resources                         138    55.6

collapsed from 8 to 4 groups (i.e., group 1: 0 students; group 2: 1, 2, or 3 students; group 3: 4, 5, or 6 students; group 4: 7 or more students) to ensure adequate cell counts. The
variable measuring time spent on direct interventions was also renamed Time Spent
Counseling, and recoded into three groups (low counseling = 0-9% of time spent
counseling; medium counseling = 10-24% of time spent counseling; high counseling =
25-100% of time spent counseling).
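The recodings described above can be expressed as simple mapping functions (a sketch; the function names and category labels are illustrative, not the survey's actual item labels):

```python
def degree_level(degree: str) -> str:
    """Collapse the three degree categories into doctoral vs. non-doctoral."""
    return "doctoral" if degree in ("PhD", "PsyD", "EdD") else "non-doctoral"

def discontinued_group(n_students: int) -> str:
    """Collapse the number of students discontinued per year into 4 groups."""
    if n_students == 0:
        return "0"
    if n_students <= 3:
        return "1-3"
    if n_students <= 6:
        return "4-6"
    return "7 or more"

def time_spent_counseling(pct_direct: float) -> str:
    """Recode percent of time spent on direct interventions into three bands."""
    if pct_direct < 10:
        return "low"     # 0-9% of time spent counseling
    if pct_direct < 25:
        return "medium"  # 10-24% of time spent counseling
    return "high"        # 25-100% of time spent counseling
```

For example, a respondent who discontinued 5 students falls into the "4-6" group, and one spending 15% of time on direct interventions is coded "medium".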
Results of these regression analyses can be found in Tables 20, 21, 22, and 23.
Although the overall predictive utility of these demographic variables was unimpressive
(as indicated by R² values close to 0), they offered some predictive improvement over the
null model when comparing demographic variables with the type of counseling
administered, and the number of students discontinued from counseling each year (as
shown by significance levels for the Final model <0.05). Furthermore, in four instances,
predictor variables met or approached a level of significance in the final model, and as
such, chi-square analysis was used to ascertain the presence of any relationships between
each predictor variable and counseling outcomes.
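The χ² values in Tables 20 through 23 behave as likelihood-ratio statistics: each effect's χ² equals the difference between the -2 log likelihood of the model without that effect and that of the final model. A quick arithmetic check against the values reported in Table 20:

```python
# -2 log likelihood values as reported in Table 20 (type-of-counseling model).
ll_intercept = 122.991
ll_final = 96.701
reduced_models = {
    "Graduate Degree": 101.099,
    "Years of Experience": 110.014,
    "Time Spent Counseling": 104.223,
}

# Likelihood-ratio chi-square for the final model vs. the intercept-only model.
print(round(ll_intercept - ll_final, 2))  # 26.29, matching Table 20

# Likelihood-ratio chi-square for each effect (reduced model vs. final model).
for name, ll_reduced in reduced_models.items():
    print(name, round(ll_reduced - ll_final, 2))
```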
Over the course of data analysis, a total of 10 chi-square analyses were conducted.
To reduce the likelihood of Type I error caused by repeated analyses, the Sidak (1967,
1968) correction was applied, adjusting the level of significance from 0.05 to 0.005.
Over the four chi-square analyses that explored relationships between demographic
variables and counseling practices, two reached the adjusted level of significance, and are
displayed in Tables 24 and 25 (results of the two non-significant chi-squares can be found
in Appendix F). A relationship was found between counseling type and years of
experience, such that respondents with more than 10 years of experience were more


Table 20
Multinomial Regression Predicting Type of Counseling from Graduate Degree, Years of Experience, and Time Spent Counseling

Model/Effect             -2 Log Likelihood      χ²     df    Sig.
Intercept                      122.991
Final                           96.701        26.29    10    .003
Graduate Degree                101.099         4.40     2    .111
Years of Experience            110.014        13.31     4    .010
Time Spent Counseling          104.223         7.52     4    .111
Note: R² = .105 (Cox and Snell), .129 (Nagelkerke), .066 (McFadden)

Table 21
Multinomial Regression Predicting Students Served in Counseling from Graduate Degree, Years of Experience, and Time Spent Counseling

Model/Effect             -2 Log Likelihood      χ²     df    Sig.
Intercept                       76.18
Final                           62.98         13.20    10    .213
Graduate Degree                 63.07          0.09     2    .956
Years of Experience             73.82         10.84     4    .028
Time Spent Counseling           65.57          2.59     4    .629
Note: R² = .054 (Cox and Snell), .075 (Nagelkerke), .044 (McFadden)

Table 22
Multinomial Regression Predicting the Number of Students Discontinued from Counseling from Graduate Degree, Years of Experience, and Time Spent Counseling

Model/Effect             -2 Log Likelihood      χ²     df    Sig.
Intercept                      180.84
Final                          131.42         49.42    15    .000
Graduate Degree                135.87          4.46     3    .216
Years of Experience            158.64         27.22     6    .000
Time Spent Counseling          143.85         12.43     6    .053
Note: R² = .195 (Cox and Snell), .214 (Nagelkerke), .090 (McFadden)

Table 23
Multinomial Regression Predicting Reasons for Discontinuation from Counseling from Graduate Degree, Years of Experience, and Time Spent Counseling

Model/Effect             -2 Log Likelihood      χ²     df    Sig.
Intercept                      161.58
Final                          135.95         25.62    20    .178
Graduate Degree                136.77          0.82     4    .925
Years of Experience            150.30         14.35     8    .073
Time Spent Counseling          147.64         11.69     8    .166
Note: R² = .108 (Cox and Snell), .119 (Nagelkerke), .047 (McFadden)

Table 24
Chi-Square Analysis Comparing Type of Counseling and Years of Experience

                                          Years of Experience
Variable                            0-5       6-10      >10      Total
Group Counseling Only
  Observed                         10.0       1.0       8.0        19
  Expected                          8.1       3.7       7.2        19
  Std. Residual                     0.7      -1.4      -0.3
Individual Counseling Only
  Observed                         19.0      11.0      37.0        67
  Expected                         28.6      13.1      25.3        67
  Std. Residual                    -1.8      -0.6       2.3
Group and Individual Counseling
  Observed                         74.0      35.0      46.0       155
  Expected                         66.2      30.2      58.5       155
  Std. Residual                     1.0       0.9      -1.6
Total                             103.0      47.0      91.0       241
Note: χ² = 15.83, df = 4, Sig. = 0.003

Table 25
Chi-Square Analysis Comparing Number of Students Discontinued From Counseling Each Year and Years of Experience

                                Years of Experience
Number of Students          0-5       6-10      >10      Total
0
  Observed                 17.0       2.0      16.0        35
  Expected                 14.9       6.6      13.4        35
  Std. Residual             0.5      -1.8       0.7
1-3
  Observed                 60.0      29.0      32.0       121
  Expected                 51.6      22.9      46.4       121
  Std. Residual             1.2       1.3      -2.1
4-6
  Observed                 18.0      10.0      17.0        45
  Expected                 19.2       8.5      17.3        45
  Std. Residual            -0.3       0.5      -0.1
7 or more
  Observed                  4.0       3.0      24.0        31
  Expected                 13.2       5.9      11.9        31
  Std. Residual            -2.5      -1.2       3.5
Total                      99.0      44.0      89.0       232
Note: χ² = 31.96, df = 6, Sig. = 0.000

likely than expected to engage in individual counseling exclusively, while those with up to 5 years of experience were less likely than expected to see students only on an
individual basis. A second relationship was noted between the number of students
discontinued from counseling each year and years of experience. Respondents with the
least amount of experience were less likely than expected to discontinue seven or more
students from counseling, while those with more than 10 years of experience were more
likely to discontinue students at this frequency.
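As a check, the Table 24 statistic and the Šidák-adjusted alpha can be recomputed from the observed counts alone (a sketch in plain Python; the closed-form p-value used below is valid only for df = 4):

```python
import math

# Observed counts from Table 24: rows are counseling type, columns are
# years of experience (0-5, 6-10, >10).
observed = [
    [10, 1, 8],    # group counseling only
    [19, 11, 37],  # individual counseling only
    [74, 35, 46],  # group and individual counseling
]
row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Expected count for each cell: (row total * column total) / grand total.
expected = [[r * c / n for c in col_totals] for r in row_totals]

chi2 = sum((o - e) ** 2 / e
           for orow, erow in zip(observed, expected)
           for o, e in zip(orow, erow))
df = (len(observed) - 1) * (len(observed[0]) - 1)  # 4

# For df = 4 the chi-square survival function has the closed form
# exp(-x/2) * (1 + x/2).
p = math.exp(-chi2 / 2) * (1 + chi2 / 2)

# Standardized residual per cell: (observed - expected) / sqrt(expected).
std_resid = [[(o - e) / math.sqrt(e) for o, e in zip(orow, erow)]
             for orow, erow in zip(observed, expected)]

# Šidák-adjusted alpha for the 10 chi-square tests run in this study.
alpha = 1 - (1 - 0.05) ** (1 / 10)
print(round(chi2, 2), df, round(p, 3), round(alpha, 4))  # 15.83 4 0.003 0.0051
```

The recomputed χ² (15.83), p-value (0.003), and the large positive residual for individual-only counseling among respondents with more than 10 years of experience (≈ 2.3) agree with Table 24.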
Components of the Problem-Solving Model Used
This survey was designed to gather data on the use of common components of the
problem-solving model, as well as the application of selected steps involved in each
component. All subjects were asked to report on their use of common components, and
then asked to provide responses on selected steps for only those common components
that they reported using. As listed in Table 26, the most common components of the
problem-solving model used by respondents when designing and implementing
counseling as a direct intervention were constructing a behavioral definition for the
problem (87.5%), intervention plan development (85.1%), problem validation (83.8%),
progress monitoring (82.8%), and goal-setting (81.3%). Components that were least
often used were formative evaluation (55.3%), problem analysis (69.2%), summative
evaluation (69.3%), and a decision-making plan (70%). The format for the item
exploring treatment integrity included an option for sometimes implementing this during
counseling interventions. This intervention component received the lowest endorsement
for consistent use (15.2%), with 77.9% of respondents indicating that they measure
treatment integrity at least some of the time. The following paragraphs describe the


Table 26
Use of Intervention Components of the General Problem-Solving Model

                                Always Employ This   Do Not Employ This   Did Not Respond to
Intervention Component           Component % (n)       Component % (n)      This Item % (n)
Behavioral definition               87.5 (217)            12.5 (31)            12.4 (35)
Baseline data                       77.7 (188)            22.3 (51)            14.5 (41)
Problem validation                  83.8 (196)            16.2 (38)            17.3 (49)
Problem analysis                    69.2 (162)            30.8 (72)            17.3 (49)
Goal setting                        81.3 (191)            18.7 (44)            17.0 (48)
Intervention plan development       85.1 (194)            14.9 (34)            19.4 (55)
Measurement strategy                78.9 (176)            21.1 (47)            21.2 (60)
Decision-making plan                70.0 (156)            30.0 (67)            21.2 (60)
Progress monitoring                 82.8 (173)            17.2 (36)            26.1 (74)
Formative evaluation                55.3 (114)            44.7 (92)            27.2 (77)
Treatment integrity                 77.9 (159)*           22.1 (45)            27.9 (79)
Summative evaluation                69.3 (142)            30.7 (63)            27.6 (78)
Note: Number of responses and percentages vary by item as some respondents elected not to respond. *This item contained options for Sometimes (62.7% [128]) and Always (15.2% [31]) measuring Treatment Integrity; for the purposes of analysis and reporting, these categories have been combined.


frequency with which respondents reportedly apply specific aspects of the most common
components of the problem-solving model as they design and implement counseling as a
direct intervention. For data analysis purposes, the frequency categories of sometimes
and always were combined. Frequency percentages for selected components were
calculated from the total number of participants who indicated using each corresponding
common component, and not from the overall number of participants.
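The reporting conventions just described amount to a few lines of arithmetic. The counts below are hypothetical, chosen only to illustrate the calculation:

```python
# Hypothetical counts illustrating the reporting conventions described above:
# "sometimes" and "always" responses are combined, and percentages for specific
# steps use only respondents who employ the parent component as the denominator.
sometimes, always = 62, 104          # respondents endorsing a specific step
uses_parent_component = 176          # respondents who use the parent component

combined = sometimes + always
pct = 100.0 * combined / uses_parent_component
print(f"{pct:.1f}% ({combined})")    # → prints "94.3% (166)"
```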
Behavioral definition and baseline data collection. When writing behavioral
definitions, highest response rates were noted for action verbs describing student
behavior in observable terms (99.1%), and descriptions of the frequency (94.9%),
intensity (86%), and duration (80.2%) of the behavior. Lowest rates of endorsement were
provided for describing latency (52.1%) and accuracy (59.4%) of student behavior. The
most commonly used methods of baseline data collection were direct behavioral
observations (98.4%), third-party behavior ratings (99%), objective self-reports (87.1%),
and third-party interviews (90.3%). To determine a stable pattern of student behavior,
respondents reported collecting 3 to 5 baseline data points. Displayed in Tables 27, 28,
and 29 are the results for the use of these specific components.
Goals, intervention planning, measurement, and decision-making.
Determining what should be done to address the problem behavior involves goal setting,
developing an intervention plan, devising a measurement strategy, and establishing a
decision-making plan.
found in Tables 30, 31, 32, and 33. Respondents indicated that they include timeframe
(92%), condition (97.8%), behavior (98.4%), and criteria (97.7%) some or all of the time
when writing a behavioral goal. When writing counseling intervention plans,


Table 27
Use of Specific Components of Behavioral Definition Composition

                                                              Employ This      Did Not
Behavioral Definition Component                               Component        Respond
                                                              % (n)            % (n)
Action verbs describing what the student does in
  observable terms                                            99.1 (214)       23.7 (67)
Frequency (the number of times the behavior occurs during
  an observation period)                                      94.9 (203)       24.4 (69)
Latency (how much time passes between the presentation of
  a stimulus and the student's response or behavior)          52.1 (101)       31.4 (89)
Intensity (the strength or force with which the behavior
  is displayed)                                               86.0 (178)       26.9 (76)
Topography (the configuration, shape, or form of the
  behavior)                                                   61.0 (119)       31.1 (88)
Accuracy (a measure of how the student's behavior is
  correct or fits a standard)                                 59.4 (117)       30.4 (86)
Duration (how much time passes between the onset and the
  ending of a behavior)                                       80.2 (166)       26.9 (76)
Note: Number of responses and percentages vary by item as some respondents elected
not to respond.


Table 28
Use of Specific Types of Baseline Data Collection

                                                  Sometimes Use   Always Use    Did Not
Baseline Data Collection Method                   This Method     This Method   Respond
                                                  % (n)           % (n)         % (n)
Direct behavioral observation                     31.9 (61)       66.5 (127)    32.5 (92)
3rd party behavior rating (from parent,
  teacher, or related service provider)           52.4 (100)      46.6 (89)     32.5 (92)
Sociometric techniques                            60.7 (108)      5.1 (9)       37.1 (105)
3rd party interview                               60.2 (112)      30.1 (56)     34.2 (97)
Objective self-report                             68.1 (124)      22.0 (40)     35.7 (101)
Projective-expressive technique                   34.6 (63)       2.2 (4)       35.7 (101)
Note: Number of responses and percentages vary by item as some respondents elected
not to respond.
Table 29
Average Number of Baseline Data Points Collected To Establish a Stable Pattern of
Student Behavior

Number of Points Collected    % of Responses (n)
1                             3.7 (7)
2                             12.0 (23)
3                             37.2 (71)
4                             14.1 (27)
5                             18.8 (36)
6                             3.7 (7)
7 or Greater                  10.5 (20)
Note: 32.5% (92) non-response rate.


Table 30
Use of Specific Criteria When Writing Behavioral Goals

                                                  Sometimes Use    Always Use       Did Not
Behavioral Goal Criterion                         This Criterion   This Criterion   Respond
                                                  % (n)            % (n)            % (n)
Timeframe (when the expected progress will be
  made in terms of days, weeks, and months)       36.3 (66)        56.6 (103)       35.7 (101)
Condition (the specific circumstances in which
  the behavior will occur)                        37.2 (67)        60.6 (109)       36.4 (103)
Behavior (written in objective, observable,
  and measurable terms describing what the
  student will be able to do)                     19.8 (36)        78.6 (143)       35.7 (101)
Criteria (a standard for how well the behavior
  is to be performed)                             39.5 (70)        58.2 (103)       37.5 (106)
Note: Number of responses and percentages vary by item as some respondents elected
not to respond.


Table 31
Use of Specific Components of Counseling Intervention Plans

                                                     Sometimes         Always Use        Did Not
Intervention Plan Component                          Include This      This Component    Respond
                                                     Component % (n)   % (n)             % (n)
A clear description of the procedures to be used     41.6 (77)         53.5 (99)         34.6 (98)
Documentation that the strategies to be used have
  been empirically validated in the literature on
  evidence-based interventions                       58.2 (107)        21.2 (39)         35.0 (99)
A description of the specific steps and activities
  that will be engaged in during counseling
  sessions                                           52.2 (95)         39.6 (72)         35.7 (101)
A description of how each step or activity will
  be completed                                       46.2 (85)         32.1 (59)         35.0 (99)
The materials needed for each step or activity       52.7 (96)         26.9 (49)         35.7 (101)
A description of what each person engaged in the
  activity will do                                   47.3 (86)         35.7 (65)         35.7 (101)
The location where the intervention is to take
  place                                              41.4 (75)         44.8 (81)         36.0 (102)
Note: Number of responses and percentages vary by item as some respondents elected
not to respond.


Table 32
Use of Specific Components for Measuring Target Behaviors

                                                     Sometimes Use     Always Use        Did Not
Measurement Component                                This Component    This Component    Respond
                                                     % (n)             % (n)             % (n)
A behavioral definition of the target behavior       19.3 (34)         79.0 (139)        37.8 (107)
A clear description of where the behavior will
  be measured                                        37.5 (66)         57.4 (101)        37.8 (107)
A clear description of when the behavior will
  be measured                                        39.1 (68)         56.9 (99)         38.5 (109)
A clear delineation of who will measure the
  behavior                                           37.1 (65)         58.3 (102)        38.2 (108)
A description of the recording method most
  appropriate for the behavior                       38.5 (67)         55.7 (97)         38.5 (109)
A description of the most appropriate recording
  measure                                            44.2 (76)         45.3 (78)         39.2 (111)
Note: Number of responses and percentages vary by item as some respondents elected
not to respond.


Table 33
Use of Specific Decision-Making Components

                                                     Sometimes         Always            Did Not
Decision-Making Component                            Include This      Include This      Respond
                                                     Component % (n)   Component % (n)   % (n)
A determination of the frequency of behavioral
  measurements and data to be collected              37.8 (56)         60.8 (90)         47.7 (135)
A decision on how the data will be summarized
  for the purposes of intervention evaluation
  (e.g., visual presentation, written report or
  summary)                                           53.1 (78)         36.1 (53)         48.1 (136)
A determination of how many behavioral data
  points will be collected before the
  intervention data will be analyzed                 47.6 (70)         42.2 (62)         48.1 (136)
A determination of how much time will pass
  before the intervention data will be analyzed      42.6 (63)         52.7 (78)         47.7 (135)
A set of decision rules for responding to
  specific data points                               62.1 (90)         24.1 (35)         48.8 (138)
Note: Number of responses and percentages vary by item as some respondents elected
not to respond.


respondents describe the procedures to be used (95.1%), the steps and activities to be
completed during sessions (91.8%), the location where the intervention is to take place
(86.2%), and what each person's role in the session is to be (83%). Respondents
indicated that, when writing measurement plans, they include a behavioral definition of
the target behavior (98.3%), a description of where (94.3%), when (96%), and who
(95.4%) will measure the behavior, a description of the recording method (89.5%), and a
rationale for why the method is appropriate for the target behavior (94.2%). Decision-
making plans reportedly specify the frequency with which behavioral data would be
collected (98.6%), how the data would be summarized and reported (89.2%), how many
data points would be collected (89.8%) and how much time would pass before
intervention data analysis (95.3%), and decision rules for responding to specific data
points (86.2%).
Progress monitoring, formative evaluation, treatment integrity, and
summative assessment. The final intervention components of the problem-solving
model are progress monitoring, formative evaluation, treatment integrity, and summative
evaluation. Data regarding the use of these specific components can be found in Tables
34, 35, 36, 37, 38, and 39. Commonly used methods of collecting progress monitoring
data include direct behavioral observation (98.9%), third-party behavior rating scales
(97.2%), interviews (96%), and objective self-report measures (88%). Variability was
noted in the number of progress monitoring data points respondents considered necessary
to establish a stable pattern of student behavior, with some respondents indicating that
they collect a range of 3 to 8 data points. Only one-third of respondents who engage in
baseline and progress monitoring data collection consistently use the same procedure


Table 34
Use of Specific Progress Monitoring Techniques

                                                  Sometimes Use    Always Use       Did Not
Data Collection Technique                         This Technique   This Technique   Respond
                                                  % (n)            % (n)            % (n)
Direct behavioral observation                     43.0 (77)        55.9 (100)       36.7 (104)
3rd party behavior rating scales (from parent,
  teacher, or related service provider)           62.0 (111)       35.2 (63)        36.7 (104)
Sociometric techniques                            59.5 (100)       1.8 (3)          40.6 (115)
Interviews                                        57.6 (102)       38.4 (68)        37.5 (106)
Objective self-report measures                    71.3 (124)       16.7 (29)        38.5 (109)
Projective-expressive techniques                  32.7 (56)        2.9 (5)          39.6 (112)
Note: Number of responses and percentages vary by item as some respondents elected
not to respond.
Table 35
Number of Progress Monitoring Data Points Collected to Establish a Stable Pattern of
Student Behavior

Number of Data Points Collected    % of Item Responses (n)
1                                  1.7 (3)
2                                  6.7 (12)
3                                  28.7 (51)
4                                  13.5 (24)
5                                  13.5 (24)
6                                  10.1 (18)
7                                  5.1 (9)
8                                  1.1 (2)
>8                                 19.7 (35)
Note: 37.1% (105) non-response rate.


Table 36
Use of the Same Method for Baseline Data Point and Progress Monitoring Data Point
Collection

Response Type                              % of Item Responses (n)
Yes                                        33.3 (60)
Sometimes, depending on the situation      65.6 (118)
No                                         1.1 (2)
Note: 36.4% (103) non-response rate.

Table 37
Sources of Data Considered During Formative Assessment

                                                        Consider This      Did Not
Data Source                                             Data Source        Respond
                                                        % (n)              % (n)
The level of the behavior (how much the behavior is
  occurring during baseline and intervention phases
  as judged by repeated, objective measurements of
  its frequency, duration, intensity, or the
  percentage of intervals in which it occurs)           90.0 (108)         57.6 (163)
The trend of the behavior (whether the level of the
  behavior is increasing or decreasing within the
  baseline and intervention phases)                     94.1 (111)         58.3 (165)
Anecdotal information from the student, his/her
  family, teachers, or related service providers        95.0 (114)         57.6 (163)
Your own subjective assessment of the student's
  behavior                                              90.0 (108)         58.0 (164)
Data documenting the student's performance in school
  (e.g., work samples, grades, attendance records,
  behavioral referrals)                                 95.8 (114)         58.0 (164)
Note: Number of responses and percentages vary by item as some respondents elected
not to respond.


Table 38
Use of Methods to Measure Treatment Integrity

                                                        Use This           Did Not
Measurement Method                                      Measurement        Respond
                                                        Method % (n)       % (n)
Self-report                                             86.5 (135)         44.9 (127)
Logs documenting sessions                               88.0 (139)         44.2 (125)
Checklists for intervention components                  62.9 (95)          46.6 (132)
Permanent products of student work                      77.6 (121)         44.9 (127)
Direct observation by a 3rd party not directly
  involved in implementation                            53.8 (84)          44.9 (127)
Note: Number of responses and percentages vary by item as some respondents elected
not to respond.


Table 39
Sources of Data Considered During Summative Assessment

                                                        Consider This      Did Not
Data Source                                             Data Source        Respond
                                                        % (n)              % (n)
The level of the behavior (how much the behavior is
  occurring during baseline and intervention phases
  as judged by repeated, objective measurements of
  its frequency, duration, intensity, or the
  percentage of intervals in which it occurs)           89.7 (130)         48.8 (138)
The trend of the behavior (whether the level of the
  behavior is increasing or decreasing within the
  baseline and intervention phases)                     97.3 (143)         48.1 (136)
Anecdotal information from the student, his/her
  family, teachers, or related service providers        98.6 (144)         48.4 (137)
Your own subjective assessment of the student's
  behavior                                              90.7 (127)         50.5 (143)
Data documenting the student's performance in school
  (e.g., work samples, grades, attendance records,
  behavioral referrals)                                 97.9 (138)         50.2 (142)
Note: Number of responses and percentages vary by item as some respondents elected
not to respond.


when collecting each type of data (33.3%), while a majority use the same procedure only
sometimes (65.6%). About half of respondents did not respond to questions regarding
formative assessment, treatment integrity, or summative assessment. Respondents who
collect formative assessment data do so by considering the level (90%) and trend (94.1%)
of the behavior, anecdotal descriptions (95%), their own subjective assessment (90%),
and data documenting student performance (95.8%). Treatment integrity was most
commonly measured using self-reports (86.5%), session logs (88%), and permanent
products of student work (77.6%). Similarities in the collection of formative (discussed
previously) and summative data were found, with anecdotal data (98.6%), data
documenting student performance (97.9%), and the trend of student behavior (97.3%)
being the most popular methods of summative assessment, followed by subjective
assessment (90.7%) and evaluation of the level of the behavior (89.7%).
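The level and trend comparisons that respondents report using in formative and summative evaluation can be sketched in a few lines. The data points below are invented for illustration only:

```python
# Illustration of the "level" and "trend" data sources described above,
# using made-up baseline and intervention phase data points.
def mean(xs):
    return sum(xs) / len(xs)

def slope(xs):
    """Least-squares slope of data points against session index (trend line)."""
    n = len(xs)
    t_bar = (n - 1) / 2
    x_bar = mean(xs)
    num = sum((ti - t_bar) * (xi - x_bar) for ti, xi in enumerate(xs))
    den = sum((ti - t_bar) ** 2 for ti in range(n))
    return num / den

baseline = [8, 9, 7, 8, 9]        # e.g., disruptions per observation period
intervention = [7, 6, 5, 5, 4, 3]

# Level: how much the behavior occurs in each phase
print("level change:", mean(intervention) - mean(baseline))
# Trend: whether the behavior is increasing or decreasing within each phase
print("baseline trend:", round(slope(baseline), 2),
      "| intervention trend:", round(slope(intervention), 2))
```

A drop in level together with a decreasing trend during the intervention phase is the pattern practitioners would read as evidence that the intervention is working.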
Comparison of Demographic Variables and Use of the Problem-Solving Model
Respondent data on frequency of use of the major components of the problem-solving model were also compared with demographic variables (training, years of
experience, school type, other roles and responsibilities) to determine whether these
variables impacted design and implementation of counseling as a direct intervention. The
same re-coded variables (with the addition of school type and psychologist:student ratio)
as described previously were compared against frequency data on use of the major
components of the problem-solving model in 12 separate binary logistic regression
models.
Overall, these demographic variables were not found to offer predictive value for
determining use of the major components of the problem-solving model, as indicated by
nonsignificant overall statistics and low R2 values. Summary tables for these models can be
found in Appendix G. Despite these results, six instances were noted in which one or
more predictive factors made a significant contribution to the model. Therefore, six
chi-square analyses were conducted to explore any possible relationships; no significant
relationships were found between these demographic variables and use of the problem-
solving model (see Appendix F for these chi-square tables).
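As a rough illustration of the analytic approach described above — not the study's actual models or data — a binary logistic regression predicting component use from re-coded demographic predictors can be sketched in pure Python. The predictors, outcome, and effect sizes below are all simulated:

```python
# Hypothetical sketch of one binary logistic regression: predicting whether a
# respondent uses a problem-solving component (1/0) from demographic predictors.
# All data here are simulated placeholders, not survey data.
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit weights (intercept first) by gradient ascent on the log-likelihood."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            err = yi - p                      # gradient of the log-likelihood
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj + lr * gj / len(y) for wj, gj in zip(w, grad)]
    return w

random.seed(0)
# Simulated re-coded predictors: years of experience (0/1) and school type (0/1)
X = [[random.randint(0, 1), random.randint(0, 1)] for _ in range(200)]
# Simulated outcome only weakly related to the predictors, mirroring the
# pattern of low predictive value reported above
y = [1 if random.random() < 0.7 + 0.05 * xi[0] else 0 for xi in X]

w = fit_logistic(X, y)
print("intercept and coefficients:", [round(wj, 2) for wj in w])
```

With outcomes this weakly tied to the predictors, the fitted coefficients stay near zero and a pseudo-R2 computed from such a model would be low, which is consistent with the pattern of results reported for the twelve survey models.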
Comments
At the end of the survey a section was created for participants to provide any
additional comments they might have had regarding the survey or research topic being
explored. A total of 58 respondents (20.49%) provided comments. Further analysis of
the comments revealed two themes. Some respondents chose to comment in order to
specify their counseling practices (e.g., do not counsel, offer less or limited counseling
than in previous years, employed in a specialized school with specific population of
students). Furthermore, of those who do not currently offer counseling, many indicated
that they based their responses on previous counseling experiences, or behavioral and
academic interventions they administer by themselves or as a member of a school-based
team. Other respondents critiqued the length and focus of the survey (e.g., focus on a
narrower set of behaviors, offer more options for elaboration and context on specific
use of techniques).


CHAPTER 5: Discussion
Overview
The emphasis placed on accountability for educational and behavioral outcomes
in schools has increased over time as research continues to document student needs and
achievement. Current best practices in instruction and the design of behavioral and
counseling interventions specify how school psychologists can make data-based
decisions in their counseling practice (Upah, 2008), while federal education legislation
holds them accountable for their work with students (Wright & Wright, 2009). The
problem-solving model integrates research from response to intervention and single-subject design paradigms with a focus on repeated, objective, and observable
measurements of student behavior over time to demonstrate the effectiveness of an
intervention. School psychologists were surveyed regarding their counseling practices,
with a specific focus on their implementation of research and best practice guidelines
related to the use of evidence-based interventions, progress monitoring, and data-based
decision making.
This chapter focuses on the discussion of the results of this study. The discussion
begins with consideration of respondent characteristics and how these demographics
compare with those of other research studies. The chapter continues with a review of the
findings within the major areas (i.e., counseling practices, use of general and specific
components of the problem-solving model) examined in the current study. In addition,
the implications of the results of this study for the field of school psychology, limitations
of the study, and potential directions for future research are discussed.
Respondent Characteristics

When considering the results of a research study, it is important to determine
whether the results can be generalized to the larger population of school psychologists.
In terms of sampling method, Curtis, Hunley, and Grier (2004) cited the NASP registry
as a comprehensive source of current contact information for school psychologists
interested in doing research with this population. As described in Chapter 3, NCSPs were
sampled in this study to ensure uniformity in training, and to select respondents who were
currently practicing in the field. For the current study, demographic data reported in
Chapter 4 were compared with current data describing the demographics of practicing
school psychologists reported by Curtis, Lopez, Castillo, Batsche, Minch, and Smith
(2008). It is important to note differences in populations and purpose, and therefore
variables of interest between Curtis et al. (2008) and this study (i.e., to describe
demographic characteristics versus the counseling practices of current practitioners).
Direct comparisons were made between graduate degrees received, psychologist-to-student ratio, school setting served, and themes of professional development attended in
the last five years. Data on these comparisons can be found in Table 40.
Despite the fact that Curtis et al. (2008) achieved a higher response rate (59.3%)
than the current study (22.39%), both samples appear to have had similar graduate
training and comparable representation along the continuum of psychologist-to-student
ratio. Suburban schools were the most common employment settings for both samples.
Although not reported in this table, the Curtis et al. (2008) sample appears to have more
experience (M = 14 years), while a more bimodal distribution between more (32.7%, >10
years of experience) and less (45%, 0-5 years of experience) experienced practitioners
was found in the current study. Several demographic studies (e.g., Curtis, Grier, &


Table 40
Comparison of Demographic Characteristics

Variable                                              Curtis et al.    Current
                                                      (2008)           Study
Graduate Degrees Received
  Masters/Specialist                                  71.6%            77%
  Doctorate                                           24.4%            22.9%
Psychologist to Student Ratio
  1:<1,000                                            40%              40.1%
  1:<1,500                                            65%              69.7%
  1:1,500-2,000                                       17%              17.2%
  1:>2,000                                            18%              13%
Type of School Served
  Urban                                               28.4%            18.1%
  Suburban                                            50.2%            38.4%
  Rural                                               28.8%            24.3%
Themes of Professional Development Attended
Over the Past Five Years
  Behavioral Interventions                            47.1%            79.9%
  Social-Emotional Intervention/Provision of
    Counseling                                        28.7%            26.3%
  Response to Intervention                            36.7%            86.6%

Hunley, 2004; Curtis, Hunley, & Grier, 2004) document the "graying" of the field of
school psychology, and warn of projected shortages due to retiring practitioners
beginning in 2010. The split in years of experience in the current sample could be
explained by an influx of recent graduates entering the field to replace those who have
already retired.
Without over-interpretation of data provided by Curtis et al. (2008), several
commonalities were noted between topics explored through professional development.
Given the fact that opportunities for professional development face a variety of
constraints (e.g., financial resources, time availability, location), it may be more valuable
to note similarities in topics, rather than agreement in percentages of attendance.
Furthermore, data from the Curtis et al. (2008) study were gathered during the 2004-05
school year, around the same time that the Individuals With Disabilities Education Act
(2004; Wright & Wright, 2009) specified that schools could use a student's response to
intervention to determine eligibility for special education and related services. Current
legislation, such as the No Child Left Behind Act (NCLB, 2002), specifies accountability
for student outcomes and achievement as well as data-based decision-making.
Respondents in the current study continued to pursue professional development on
Response to Intervention (86.6%), along with evidence-based behavioral interventions
(79.9%), data-based decision-making (68.9%), and academic and behavioral
accountability (57.2%).
Consistency in time allocation was also found when respondent data from the
current study were compared to previous research. For example, in the current study,
42.2% of respondents reported spending between 25-50% of their time on assessment,
while allocating 25% of their time or less on direct interventions (80%), consultation and
indirect services (70.9%), research (100%), administration (96.4%), or systems-level
activities (97.1%). Demographic studies conducted over the past few decades mirror
these findings, with respondents indicating that they spend approximately 50% of their
time on assessment, 20-25% engaged in direct intervention, 20-25% on consultation, and
their remaining time involved in systems-level or research activities (Bramlett, Murphy,
Johnson, Wallingsford, & Hall, 2002; Fisher, Jenkins, & Crumbley, 1986; Goldwasser,
Meyers, Christenson, & Graden, 1983; Hartshorne & Johnson, 1985; Lacayo, Sherwood,
& Morris, 1981; Meacham & Peckham, 1978; Reschly & Wilson, 1995; Smith, 1984).
Overall, the statistics describing graduate preparation, experience, and
employment conditions (school setting, psychologist-to-student ratio, time allocation) for
this study are comparable to current and historical demographic data. One difference that
should be noted was years of experience. This should be considered when interpreting
results and generalizing to the larger population. The results of this study may apply to
the general population given that few differences were found between this sample and
current demographic research. At this time, however, the small sample size of this study
limits the ability to generalize these results to any population beyond those school
psychologists who responded to the survey.
Counseling Practices of School Psychologists
The opening questions of this study were designed to gather data on the general
counseling practices of school psychologists. Specific areas of interest included
counseling format, students served, discontinuation from counseling, and the use of print
and/or online resources. The majority of respondents (54.8%) see special and general
education students (73.5%) for counseling in group and individual formats. Respondents
with the most experience were more likely than new practitioners to provide individual
counseling exclusively. It is important to note that 24 respondents indicated in the
comments section that they do not currently counsel students. Additionally, of the 65
respondents who declined to participate, 23 did so because they do not counsel within
their school buildings. Due to the narrow focus of this study on the application of the
problem-solving model to counseling as a behavioral intervention, limited comparisons
could be made between this study and the literature on counseling practices. As stated in
Chapter 2, previous research findings indicate that a range of 53-88% of school
psychologists provide group and/or individual counseling (Hanchon & Fernald, 2011;
Yates, 2003), and surveys of time allotment indicate that 20-25% of school
psychologists' time is spent on direct interventions (Bramlett et al., 2002).
To develop an understanding of data-based decision-making, several survey items
explored the discontinuation practices of school psychologists and their use of print or
online resources. Most respondents recommended discontinuing 2 students from
counseling each year, with the most common reason being that counseling goals have
been met. Experience was found to have an impact on discontinuation practices.
Practitioners with 10 or more years of experience were more likely than expected to
discontinue 7 or more students from counseling, while those with the least amount of
experience were less likely to discontinue students at this frequency. Similar numbers of
respondents reported that they did or did not use print or online resources to assist in
writing goals, clarifying problem behavior, and when setting behavioral expectations.
Data regarding discontinuation of counseling services have not been reported in the
available literature; however, due to the small sample size in this study, these results
should be viewed as preliminary.
Use of the Problem-Solving Model
The major focus of this investigation was to determine the frequency with which
school psychologists use components of the problem-solving model. To this end,
respondents were asked to provide data on their use of general and specific components
of the problem-solving model. Provided in Table 41 is a summary of the frequency of
use of these components. Overall, general components of the problem-solving model
used to define and establish the behavior of concern (e.g., behavioral definition [87.5%],
and problem validation [83.8%]), as well as those involved in determining what should
be done about it (goal setting [81.3%], and intervention plan development [85.1%]) were
found to be used most often. In addition, the collection of progress monitoring data
(82.8%) was another practice in which a majority of respondents reported engaging.
Social-emotional behavioral (SEB) research conducted by Merrell (2010) pointed
out that while a variety of SEB assessment tools are available, their utility lies in
identifying problem behavior and the factors maintaining it; these tools provide
only limited guidance on intervention and on determining whether remediation efforts have
been successful. A majority of respondents in this study have also attended professional
development in recent years focused on evidence-based behavioral interventions,
data-based decision-making, and academic and behavioral accountability, during which time
they may have received strategies and resources for monitoring student behavior.
Agreement was also found between general and specific components used by
these respondents and legal requirements. For example, as reiterated in IDEA 2004,


Table 41
Use of General and Specific Intervention Components of the Problem-Solving Model

General and Specific PSM Component                          % Usage (n)    % Non-Response (n)
Behavioral definition                                       87.5 (217)     12.4 (35)
  Action verbs describing behavior in observable terms      99.1 (214)     23.7 (67)
  Describe frequency                                        94.9 (203)     24.4 (69)
  Describe intensity                                        86.0 (178)     26.9 (76)
  Describe duration                                         80.2 (166)     26.9 (76)
  Describe topography                                       61.0 (119)     31.1 (88)
  Describe accuracy                                         59.4 (117)     30.4 (86)
  Describe latency                                          52.1 (101)     31.4 (89)
Intervention plan development                               85.1 (194)     19.4 (55)
  Describe the procedures to be used                        95.1 (176)     35.0 (99)
  Describe activities/steps to be completed in each
    session                                                 91.8 (167)     35.7 (101)
  Specify location where intervention takes place           86.2 (156)     36.0 (102)
  Describe what each person will do                         83.0 (151)     35.7 (101)
  Specify materials needed for each activity/step           79.6 (145)     35.7 (101)
  Document that intervention is empirically valid           79.4 (146)     35.0 (99)
  Describe how each activity/step will be completed         78.3 (141)     35.0 (99)
Problem validation                                          83.8 (196)     17.3 (49)
Progress monitoring                                         82.8 (173)     26.1 (74)
  Direct behavioral observation                             98.9 (177)     36.7 (104)
  Third-party behavioral rating scales                      97.2 (174)     36.7 (104)
  Interviews                                                96.0 (170)     37.5 (106)
  Objective self-report measures                            88.0 (153)     38.5 (109)
  Sociometric techniques                                    61.3 (103)     40.6 (115)
  Projective-expressive techniques                          35.6 (61)      39.6 (112)
Goal setting                                                81.3 (191)     17.0 (48)
  Specify behavior                                          98.4 (179)     35.7 (101)
  Specify condition (circumstance in which behavior
    occurs)                                                 97.8 (176)     36.4 (103)
  Specify criteria (standard for behavioral performance)    97.7 (173)     35.7 (106)
  Specify timeframe when expected progress will be made     92.9 (169)     35.7 (101)
Measurement strategy                                        78.9 (176)     21.2 (60)
  Behavioral definition of the target behavior              98.3 (173)     37.8 (107)
  Describe when the target behavior will be measured        96.0 (167)     38.5 (109)
  Describe who will measure the target behavior             95.4 (167)     38.2 (108)
  Describe where the target behavior will be measured       94.9 (167)     37.8 (107)
  Describe appropriate recording measure                    89.5 (154)     39.2 (111)
Treatment integrity                                         77.9 (159)     27.9 (79)
  Logs documenting sessions                                 88.0 (139)     44.2 (125)
  Self-reports                                              86.5 (135)     44.9 (127)
  Permanent products of student work                        77.6 (121)     44.9 (127)
  Checklists for intervention components                    62.9 (95)      46.6 (132)
  Direct observation by non-involved 3rd party              53.8 (84)      44.9 (127)
Baseline data                                               77.7 (188)     14.5 (41)
  Third-party behavior ratings                              99.0 (189)     32.5 (92)
  Direct behavioral observation                             98.4 (188)     32.5 (92)
  Third-party interviews                                    90.3 (168)     34.2 (97)
  Objective self-report                                     90.1 (164)     35.7 (101)
  Sociometric techniques                                    65.8 (117)     37.1 (105)
  Projective-expressive techniques                          36.8 (67)      35.7 (101)
Decision-making plan                                        70.0 (156)     21.2 (60)
  Frequency with which data will be collected               98.6 (146)     47.7 (135)
  How much time will pass before intervention data
    analysis                                                95.3 (141)     47.7 (135)
  How many data points will be collected before data
    analysis                                                89.8 (132)     48.1 (136)
  How data will be summarized for intervention
    evaluation                                              89.2 (131)     48.1 (136)
  Decision rules for responding to specific data points     86.2 (125)     48.8 (138)
Summative evaluation                                        69.3 (142)     27.6 (78)
  Anecdotal information                                     98.6 (144)     48.4 (137)
  Data documenting student performance in school            97.9 (138)     50.2 (142)
  Trend (change in level from baseline to intervention
    phases)                                                 97.3 (143)     48.1 (136)
  Practitioner's subjective assessment of student
    behavior                                                90.7 (127)     50.5 (143)
  Level (comparison of behavioral occurrence in baseline
    and intervention phases)                                89.7 (130)     48.8 (138)
Problem analysis                                            69.2 (162)     17.3 (49)
Formative evaluation                                        55.3 (114)     27.2 (77)
  Data documenting student performance in school            95.8 (114)     58.0 (164)
  Anecdotal information                                     95.0 (114)     57.6 (163)
  Trend (change in level from baseline to intervention
    phases)                                                 94.1 (111)     58.3 (165)
  Practitioner's subjective assessment of student
    behavior                                                90.0 (108)     58.0 (164)
  Level (comparison of behavioral occurrence in baseline
    and intervention phases)                                90.0 (108)     57.6 (163)

individualized education plans (IEPs) must include a description of the student's present levels of academic and functional abilities, including an explanation of how his or her disability affects participation and progress in the general curriculum (Wright & Wright,
2009). This tenet of special education law fits with the literature defining problem
validation as the comparison of student behavior with a standard of appropriate behavior
based on the performance of peers, building or district norms, and/or teacher expectations
(Upah, 2008). Because the majority of respondents indicated that they work with general
and special education students, problem validation may be an activity that is
systematically engaged in when working with all students.
Furthermore, special education law requires that IEPs include measurable
academic and functional goals, a description of how those goals will be measured, and
when progress towards those goals will be evaluated (Wright & Wright, 2009). IEPs
must be evaluated "periodically" (Wright & Wright, 2009, p. 104) to determine whether
the student is making progress towards meeting goals and to revise goals when this is
considered necessary. In addition, IEPs in some states also specify the location where
related services, such as counseling, will take place. These clauses may explain the
reported frequency of goal setting, the use of action verbs, and the specific problem-solving model components that respondents may incorporate into service delivery plans.
Before discussing the least often used components of the problem-solving model, it should be noted that, as the survey continued, the percentage of non-responses per item increased. Several items that were used infrequently also had higher rates of non-response. The components used most infrequently were also those that would be used to monitor and determine the effectiveness of counseling interventions (e.g., formative [55.3%] and summative [69.3%] evaluation, and the decision-making plan [70%]).
Problem analysis (69.2%) was indicated as another infrequently used component. According to recommendations in the literature, problem analysis requires developing and testing hypotheses about the function of the student's behavior (Kern, 2005). Data
are gathered on student behavior under different conditions, and interventions are
designed based on the fit between data and hypotheses (Kern, 2005). Although
informative, the amount of time and possible teacher cooperation needed to adequately
complete problem analysis might be more than some school psychologists are able to
manage, especially given that the majority of respondents in this sample reportedly spend
at least half of their time on assessment.
Current research on behavioral interventions may help to explain some of the
results found in this study. As mentioned in Chapter 2, research on SEB disorders has
not identified a general outcome measure for behavioral interventions (Chafouleas,
Volpe, Gresham, & Cooke, 2010). Therefore, the identification of measurable tasks,
behaviors, and skills that represent sensitive indicators of SEB problems and form the
foundation for measurement tools, definitions of target behaviors, and screening and
progress monitoring plans is on-going (Chafouleas et al., 2010). At this time, designing
counseling interventions based on the problem-solving model along with response to
intervention or single-case design paradigms (in which the student's behavior is
measured over time to determine whether behavioral improvements have been made)
promotes examining effectiveness and demonstrating accountability for student
outcomes.
Research guidelines on response to intervention (Briesch, Chafouleas, & Riley-Tilman, 2010) and single-subject design (Horner et al., 2005) share some common
themes with the steps of the problem-solving model. The student is the unit of analysis,
and student behavior prior to intervention is compared to behavior during and after the
intervention is completed. One or more observable behaviors are operationally defined
and measured repeatedly, with some assessment of the consistency of measurement. The
target behavior should also have some social significance and relevance to the student.
The intervention must also be operationally defined, with specific and detailed
descriptions of what will take place and where the intervention will occur. Fidelity of
implementation is also important to maintain as the intervention is applied over time.
Student behavior is compared during baseline and intervention phases through regular,
on-going documentation, and then by visual analysis of all phases of behavior (Kazdin,
1982; Upah, 2008). The focus on the objective measurement of observable behaviors
over time, along with demonstrations of behavioral growth and intervention fidelity,
allow for data-based decision making and accountability for student outcomes.
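To make the comparison of baseline and intervention phases concrete, the following sketch (illustrative only; neither the code nor the data come from this study) computes the two quantities discussed above: level, as the difference between phase means, and trend, as the within-phase slope. The behavior counts are hypothetical.

```python
# Illustrative single-case design summary: change in "level" (phase means)
# and "trend" (within-phase slopes) between baseline and intervention.

def mean(xs):
    return sum(xs) / len(xs)

def slope(ys):
    """Least-squares slope of ys regressed on session index 0..n-1."""
    xs = list(range(len(ys)))
    x_bar, y_bar = mean(xs), mean(ys)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

def phase_comparison(baseline, intervention):
    """Summarize change in level and trend across the two phases."""
    return {
        "level_change": mean(intervention) - mean(baseline),
        "baseline_trend": slope(baseline),
        "intervention_trend": slope(intervention),
    }

# Hypothetical weekly counts of a target behavior (e.g., call-outs per class).
baseline = [9, 8, 10, 9]
intervention = [7, 6, 5, 4, 3]
summary = phase_comparison(baseline, intervention)
```

In practice, such numerical summaries would accompany, not replace, a graph of all data points for the visual analysis that Kazdin (1982) and Upah (2008) recommend.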
While school psychologists are applying many aspects of the problem-solving
model with some consistency, the results of this study suggest that they may not be doing
so in such a way that would allow for easy demonstration of accountability. Although
respondents have indicated that they are behaviorally defining target behaviors, the
infrequent use of formative and summative evaluation could be an indication that school
psychologists may not be measuring these behaviors adequately enough to use these data
to determine whether the student has successfully responded to the intervention. The
frequent use of interviews and self-reports during baseline data collection calls into
question whether school psychologists are able to establish an observable, objective baseline of student behavior before beginning an intervention. Interviews and self-reports are more global, non-standardized, and non-empirically validated measures of data collection, and as such, may not provide reliable and valid indications of student
behavior. Furthermore, the high rates of non-response for specific methods of baseline data collection invite speculation on the adequacy of these data for comparison of student
behavior across different phases (baseline to intervention). Steps that would allow for the
comparison of baseline and intervention data (e.g., decision-making plans [70%] and
measurement strategy [78.9%]) were not consistently used by respondents in this sample.
Although respondents reported using direct behavioral observation to progress
monitor their interventions, it is unclear whether these observations are being conducted
during sessions, or in an environment where the problem behavior occurs. Third-party
behavior rating scales have support in the SEB literature; however, many are not sensitive
enough to be used repeatedly to measure change over the course of an intervention
(Volpe & Gadow, 2010). In addition, less empirically valid methods of progress
monitoring, such as interviews and self-reports, were used almost as often as behavioral
observations. Limited time and resources may prevent practitioners from collecting
sufficient data to make informed decisions about student behavior (Briesch, Chafouleas,
& Riley-Tilman, 2010), which may explain the infrequent use of formative and
summative evaluation.
While data-gathering methods such as anecdotal information, self-reports, interviews, and practitioners' subjective assessments of student behavior may provide valuable information, research recommendations call for observable, objective, and empirical assessment of student behavior gathered over time and presented for visual analysis (Kazdin, 1982; Upah, 2008). The use of these non-observable and subjective methods to establish a baseline and monitor progress could compromise the value of using an analysis of level and trend as formative and summative assessment tools. Similarly,
the majority of respondents in this study reported discontinuing students because
counseling goals had been met; however, the low frequency with which students are
discontinued each year might also be related to the adequacy of the data gathering
methods used by this sample.
The high rates of non-response for treatment integrity are another deviation from
recommendations in the literature. Nearly half the sample in this study did not respond to
items involving specific methods for measuring treatment integrity. This suggests that, for this sample, measuring treatment integrity is not a regular part of counseling practice. While useful for planning, session logs, self-reports, and
permanent products of student work may not provide direct evidence that intervention
components have been implemented consistently and accurately over time the way that
session checklists and direct observations might.
In summary, it would appear that the respondents in this sample are using many
available tools to define and determine target behaviors. Many of their reported
counseling practices appear to conform to legislative guidelines for working with special
education students. These school psychologists indicated that they apply many of the
steps of the problem-solving model in their counseling practice, especially when defining
target behaviors and planning interventions. These results, however, also call into question the degree to which these practitioners engage in progress monitoring and data-based decision making, as the quality and frequency of baseline and progress monitoring data collection may not enable documentation and comparison of student behavior to determine whether behavioral improvement has been made (formative and summative evaluation).
Implications for School Psychology
Although the conclusions of the current study are tentative at this point in time, the results suggest that school psychologists might improve their ability to make data-based decisions and demonstrate accountability for student outcomes. Specific areas for
growth for current practitioners and training programs include gathering more observable
and objective measurements of behavior over time to clearly demonstrate behavioral
change across baseline and intervention phases, corresponding improvements in
formative and summative assessment, and more consistent demonstration and evaluation
of treatment integrity.
Current research on the roles and responsibilities of school psychologists should
be considered in the discussion on possible barriers to improvements in data gathering.
As mentioned in Chapter 2, the four roles of the school psychologist include assessment,
direct intervention, consultation, and systems-level intervention (Fagan, 2008; NASP,
2010). School psychologists have expressed a preference for spending less time on
assessment, and more time and resources on interventions and consultation (Hosp &
Reschly, 2002; Reschly & Wilson, 1995), with a specific focus on student mental health
needs (Agresta, 2004). Briesch, Chafouleas, and Riley-Tilman (2010) cited limited time
and resources as factors that prevent practitioners from collecting enough data to make
informed decisions about student behavior. Given the connection between emotional
well-being and successful learning experiences (Haertel, Walberg, & Weinstein, 1983; Wang, Haertel, & Walberg, 1990) and the focus on accountability for student outcomes
(Wright & Wright, 2009), researchers, practitioners, and professional organizations
should continue to advocate for role expansion and re-allocation of the responsibilities of
school psychologists. A paradigm shift, whereby school psychologists translate
behavioral research into practice and become active problem-solvers within their school
buildings, is necessary to demonstrate accountability for student outcomes and
achievement.
To increase accountability, school psychologists may need to re-allocate the
amount of time they spend engaged in research, consultation, and systems level activities,
in addition to spending more time on direct interventions. Behavioral research and best
practice guidelines exist specifying how to most effectively gather and evaluate objective,
observable, and repeated measures of student behavior (Briesch, Chafouleas, & Riley-Tilman, 2010; Upah, 2008; Volpe & Gadow, 2010). Current and historical time
allocation data suggest that consuming and applying this information is not possible
unless school psychologists are able to prioritize research as a professional function. At
the systems level, school psychologists may need to advocate for gathering more precise
behavioral data at a higher frequency than what is currently done in many school
buildings. Although questions remain regarding who collects behavioral data and in what
environment, it would appear as though consensus needs to be reached on how data will
be collected, by whom, and at what frequency. This may involve school psychologists
collecting data themselves, but may also entail other school professionals who are in a
better position to see students display problematic behaviors collecting data. If other
school professionals collect data, then school psychologists may need to have more time available for consultation in order to provide their colleagues with necessary tools and
training, to measure fidelity to established protocols, and to increase buy-in, if this
becomes a factor.
Issues related to data gathering, decision-making, and accountability should be
addressed by school psychology training programs, as well as organizations providing
continuing education and professional development. These issues highlight a necessary
paradigm shift where research is integrated into practice, with new and experienced
practitioners receiving training to become active problem-solvers. The results of this
study would suggest that the focus should be on developing and disseminating knowledge
and supervised experience gathering repeated, objective measurements of student
behavior to establish baseline, document progress during intervention, and to design and
evaluate formative and summative assessment. Building in measurement of treatment
integrity should also be an essential component of training and continuing education.
Experiences with these skills should be connected to coursework, practice, and
professional development with direct and indirect interventions, to ensure that
practitioners know how to implement standards of data-based decision making and
accountability for outcomes within the settings in which they are employed.
Limitations and Directions for Future Research
The results of this survey provide a broad overview of the implementation of
counseling interventions based on the problem-solving model, a topic that had not
previously been addressed in the literature. In the process of evaluating the results of this
study, it is necessary to address the limitations of this study, particularly with respect to
the reliability and validity of this survey in accurately capturing information on counseling practices. Consideration of these limitations allows for speculation on what could be done differently in future research.
Although some commonalities in training and work conditions were noted
between this sample and other current demographic surveys, the low response rate
achieved in this study allows for only a tentative review of its results. Following from
this, the low response rate for many items that would provide insight into the
respondents' ability to engage in data-based decision making and accountability (e.g.,
treatment integrity, formative and summative evaluation, decision-making plan) is
problematic, as these were key themes of the problem-solving model that this study was
designed to explore. In addition, several respondents indicated that they do not currently
offer counseling, or declined to participate in this study for that reason.
At this time, it can only be speculated that, although legal requirements for
designing and implementing counseling appear to be met by this sample, the observable
and objective data gathering and assessment of student behavior over time to demonstrate
change and make decisions is likely not being done in accordance with research and best
practice guidelines. In order to have more confidence in the results obtained at this time,
a higher response rate for all items on this survey from practitioners who consistently
engage in counseling would be needed. To increase response rates, future research could
apply recommendations from Dillman (2007), by sampling using a variety of different
formats, including mail and telephone invitations for participation.
Regardless of the sampling method used, survey questions must be general
enough to facilitate comprehension by a large number of respondents, and as a result,
may omit questions of interest to the researcher and certain respondents (Barribeau et al., 2005). The length of this survey was a concern from the beginning, and therefore,
question selection was limited to those items considered most likely to answer the
research questions. As such, items pertaining to specific components of problem analysis
and validation were omitted. When analyzing the results of this study, however, it would
be beneficial to have information regarding the data used to establish and learn about
behaviors of concern, how these data are gathered, and to what these data are compared.
Future research exploring data-based decision making and accountability should focus on
fewer components of the problem-solving model (e.g., baseline and progress monitoring
data collection, and/or formative and summative assessment) in greater depth, now that
preliminary research has examined these topics with a broad lens. Shorter measures may
also increase the response rate and provide a more valid picture of counseling practices
implemented in schools today.
Additional limitations for this survey match several of those listed in Chapter 2
for self-administered surveys. For example, once respondents begin completing the
survey, it cannot be altered, and some respondents may struggle to align their own
experiences with those choices presented to them on a survey (Barribeau et al., 2005). In
the case of this survey, after implementation, it was discovered that the treatment integrity item allowed respondents to select whether they "sometimes" or "always" used this component. All other items pertaining to the use of general components of the
problem-solving model forced respondents to state whether they did or did not employ
each component. When comparing the usage of general and specific components of the
problem-solving model with the rates of non-response, it would appear as though, in
some cases, respondents indicated that they engaged in a general component without being able to describe their practice by selecting any of the corresponding specific
components. Large increases in the rate of non-response between general and specific
components for steps such as treatment integrity, and summative and formative
evaluation would suggest that, on some items, there might have been a discrepancy
between the intent of the researcher and the understanding of the respondent.
To address these limitations, future surveys should include uniform response
choices, and data on counseling might be obtained using a mixed method research design.
Focus groups comprised of school psychologists who spend varying amounts of time on
assessment and direct interventions could provide insight on their counseling practices to
determine the level of agreement between what is being done and what is specified in
research and best practices. An important question to ask relates to barriers and
facilitators practitioners face, especially when gathering empirical and objective
behavioral data. Researchers could also examine de-identified examples of
documentation related to counseling practices, such as intervention, decision-making, and
measurement plans, evidence of treatment integrity, and aggregated student data used for
visual analysis as part of formative and summative evaluation. Information gathered
from these data would allow school psychologists as a field to evaluate their level of
accountability, while providing guidance for researchers, professional organizations,
practitioners, and training programs.
Analysis of the results of this study also produced additional questions related to
where baseline and progress monitoring data gathering occurs. Respondents reported
using direct behavioral observation to gather baseline and progress monitoring data.
These items, however, did not specify where such behavioral observations occur. Specifically, it is unknown whether behavioral observations are gathered during
counseling sessions, where practitioners have control over the environment and can
provide the student with support and correction, or whether observations are conducted in
an environment beyond the practitioners control, where the problem behavior may be at
its most severe. The wording of these items also presumes that the school psychologist is
the one conducting the observations; however, it is possible that some practitioners
misinterpreted this, and considered observations by others when responding, or when
evaluating these data to make decisions in their practice. Problems with validity and
reliability of behavioral observations arise, depending on the level of training,
standardization of methods, and agreement among behavioral raters.
Future research must examine the connection between what school psychologists
do with students in counseling and student behavior in the classroom. The question of
who gathers behavioral data for baseline and progress monitoring should be posed to
practitioners who offer counseling. It is important to specify who gathers behavioral data
and when and where these data are gathered. The question of barriers and facilitators to
data collection should also be addressed. Focus group research and/or brief survey
measures can be used to answer these questions.
In the event that school psychologists are relying on behavioral data gathered by
others who have more exposure to the student, such as teachers or related service
providers, then research on consultation becomes more relevant, in terms of creating
effective and unobtrusive measurement tools, and encouraging buy-in and cooperation
with others. The quality of behavioral data depends on who collects it and what tools and
resources they have available to them. The questions on this survey and the best practice literature recommend stringent observations of student behavior conducted over the
course of the intervention. Whether these data gathering measures are used directly or
indirectly, as active problem solvers, current practitioners should be invited to provide
their input and help to create these tools and resources.
Summary
The field of school psychology emerged and has evolved over time to best meet
the needs of students in American public schools. Recent documentation of student
behavioral and mental health needs and the connection between emotional well-being and
academic achievement have placed accountability for outcomes at the forefront of
educational legislation, research, and practice. Counseling is a direct intervention that
school psychologists use to address the behavioral and social-emotional difficulties
students face. Current research supports the use of the problem-solving model (PSM) to
plan interventions as one method by which school psychologists can demonstrate
accountability for implementing counseling interventions that have a positive impact on
student behavioral outcomes and mental health. Similar to response to intervention and
single-case design paradigms, the PSM focuses on repeated, objective measures of
observable behavior at baseline and intervention phases to demonstrate whether
behavioral change has occurred as a result of counseling.
This study involved a survey of school psychologists' counseling practices, with a
specific focus on their application of the PSM. The results of this survey suggest that
school psychologists are using available behavioral measurement tools in accordance
with special education law to engage in activities such as behaviorally defining the target
behavior, developing an intervention plan, validating the problem behavior, and monitoring the progress of their interventions. Areas for improvement suggested by
these results include more deliberative formative and summative assessment, problem
analysis, and devising a decision-making plan. Based on these results, it is hypothesized
that, at this time, school psychologists may struggle to gather the type of objective,
empirical behavioral observations that would allow for accurate baseline and progress
monitoring data collection. Although behavioral data are being gathered, it appears as
though these data do not enable data-based decision making and the demonstration of
accountability using formative and summative assessment. Despite the fact that
respondents in this sample reported discontinuing students in the majority of cases
because counseling goals have been met, the low number of students discontinued each
year suggests that data-based decision making and accountability are areas for growth.
Noteworthy limitations of this study include several threats to validity, such as the small sample size and increasing rates of non-response as respondents completed survey items.
Future research is needed to explore single facets of the PSM in greater depth using a
mixed methods approach to sampling and research design given that future legislative
initiatives continue to focus on accountability for student achievement. Implications for
school psychology practitioners and trainers focus on how best to train current and future
practitioners to become consumers of behavioral research and active problem solvers
who are able to collect and evaluate empirical behavioral measurements in accordance
with best practices.


References
Agresta, J. (2004). Professional role perceptions of school social workers, psychologists,
and counselors. Children and Schools, 26, 151-163.
Ahrens, J., & Rexford, L. (2002). Cognitive processing therapy for incarcerated
adolescents with PTSD. Journal of Aggression, Maltreatment & Trauma, 6, 201-216. doi:10.1300/J146v06n01_10
Alexander, L. (1986). Summary of Time for Results, report on education by governors
group. Chronicle of Higher Education, 33(1), 78-79.
Aldred, C., Green, J., & Adams, C. (2004). A new social communication intervention for
children with autism: Pilot randomized controlled treatment study suggesting
effectiveness. Journal of Child Psychology and Psychiatry, 45, 1420-1430.
doi:10.1111/j.1469-7610.2004.00338.x
Alreck, P.L., & Settle, R.B. (2004). The survey research handbook (3rd ed.). New York:
McGraw-Hill/Irwin.
American Counseling Association. (2011). Resources. Retrieved from
http://www.counseling.org/Resources/
American Psychiatric Association. (2000). Diagnostic and statistical manual of mental
disorders (4th ed.). Washington, DC: Author.
American Psychological Association (APA). (2005). Policy statement on evidence-based
practice in psychology. Retrieved from http://www.mspp.net/APA%20policy%20
On%20EBT.htm
American Psychological Association (APA). (2010). Ethical principles of psychologists
and code of conduct. Retrieved from http://www.apa.org/ethics/code/index.aspx#

American Psychological Association (APA). (2011). Division 16 goals and objectives. Retrieved from www.indiana.edu/~div16/goals.html#goals
Andrews, D., Nonnecke, B., & Preece, J. (2003). Electronic survey methodology: A case study in reaching hard-to-involve internet users. International Journal of Human-Computer Interaction, 16, 185-210. doi:10.1207/S15327590IJHC1602_04
APA Presidential Task Force on Evidence-Based Practice. (2006). Evidence-based practice in psychology. American Psychologist, 61, 271-285. doi:10.1037/0003-066X.61.4.271
Asarnow, J.R., Scott, C.V., & Mintz, J. (2002). A combined cognitive-behavioral family
education intervention for depression in children: A treatment development study.
Cognitive Therapy and Research, 26, 221-229. doi:10.1023/A:1014573803928
Association for Behavioral and Cognitive Therapies (ABCT) & Society of Clinical Child
and Adolescent Psychology (SCCAP). (2010a). What is evidence-based practice?
Retrieved from http://www.abct.org/sccap/?m=sPublic&fa=pub_WhatIsEvident
Practice
Association for Behavioral and Cognitive Therapies (ABCT) & Society of Clinical Child
and Adolescent Psychology (SCCAP). (2010b). Myths and facts about
empirically supported treatments. Retrieved from http://www.abct.org/sccap/?m=s
Pro&fa=pro_ESTsupport#aTop
Bachar, E., Latzer, Y., Kreitler, S., & Berry, E.M. (1999). Empirical comparison of two
psychological therapies: Self psychology and cognitive orientation in the
treatment of anorexia and bulimia. Journal of Psychotherapy Practice and
Research, 8, 115-128.


Baer, D. M., Wolf, M. M., & Risley, T. R. (1987). Some still-current dimensions of
applied behavior analysis. Journal of Applied Behavior Analysis, 20, 313-327.
doi:10.1901/jaba.1987.20-313
Bandura, A. (1977). Social learning theory. Englewood Cliffs, NJ: Prentice-Hall.
Bandura, A. (2004). Models of causality in social learning theory. In A. Freeman,
M.J. Mahoney, P. Devito, & D. Martin (Eds.) Cognition and psychotherapy (2nd
ed., pp. 25-44). New York: Springer.
Barkley, R.A., Shelton, T.L., Crosswait, C., Moorehouse, M., Fletcher, K., Barrett, S.
(2000). Multimethod psychoeducational intervention for preschool children
with disruptive behavior: Preliminary results at post-treatment. Journal of Child
Psychology and Psychiatry and Allied Disciplines, 41, 319-332. doi:10.1111/
1469-7610.00616
Barrett, P. M. (1998). Evaluation of cognitive-behavioral group treatments for childhood
anxiety disorders. Journal of Clinical Child Psychology, 27, 459-468.
doi:10.1207/s15374424jccp2704_10
Barrett, P. M., Dadds, M. R., & Rapee, R. M. (1996). Family treatment of childhood
anxiety: A controlled trial. Journal of Consulting and Clinical Psychology, 64,
333-342. doi:10.1037/0022-006X.64.2.333
Barrett, P. M., Farrell, L., Pina, A. A., Peris, T. S., & Piacentini, J. (2008). Evidence-based psychosocial treatments for child and adolescent obsessive-compulsive
disorder. Journal of Clinical Child & Adolescent Psychology, 37, 131-155.
doi: 10.1080/15374410701817956
Barrett, P.M., Healy-Farrell, L.J., & March, J.S. (2004). Cognitive-behavioral family treatment of childhood obsessive-compulsive disorder: A controlled trial.
Journal of the American Academy of Child & Adolescent Psychiatry, 43, 46-62.
doi:10.1097/00004583-200401000-00014
Barribeau, P., Butler, B., Corney, J., Doney, M., Gault, J., Gordon, J., Palmquist, M.
(2005). Survey research. Writing @ CSU. Colorado State University Department
of English. Retrieved from http://writing.colostate.edu/guides/research/survey/
com2d1.cfm
Batsche, G.M., Castillo, J.M., Dixon, D.N., & Forde, S. (2008). Best practices in linking
assessment to intervention. In A. Thomas & J. Grimes (Eds.), Best Practices in
School Psychology (5th ed., pp. 177-193). Bethesda, MD: National Association of
School Psychologists.
Battjes, R.J., Gordon, M.S., O'Grady, K.E., Kinlock, T.W., Katz, E.C., & Sears, E.A.
(2004). Evaluation of a group-based substance abuse treatment program for
adolescents. Journal of Substance Abuse Treatment, 27, 123-134.
doi:10.1016/j.jsat.2004.06.002
Beidel, D.C., Turner, S.M., & Morris, T.L. (2000). Behavioral treatment of childhood
social phobia. Journal of Consulting and Clinical Psychology, 68, 1072-1080.
doi:10.1037/0022-006X.68.6.1072
Benjamin, L. T., & Baker, D. B. (2003). School psychology. In From séance to science:
A history of the profession of psychology in America (pp. 81-114). Belmont, CA:
Wadsworth.
Bergan, J. R., & Kratochwill, T. R. (1990). Behavioral consultation and therapy. New
York: Plenum Press.
Bernal, M.E., Klinnert, M.D., & Schultz, L.A. (1980). Outcome evaluation of behavioral
parent training and client-centered parent counseling for children with conduct
problems. Journal of Applied Behavior Analysis, 13, 677-691. doi:10.1901/jaba.
1980.13-677
Beyer, H. (1989). Education for All Handicapped Children Act: 1975-1989. A judicial
history. Exceptional Parent, 19(6), 52-58.
Block, J. (1978). Effects of a rational-emotive mental health program on poorly
achieving, disruptive high school students. Journal of Counseling Psychology,
25, 61-65. doi:10.1037/0022-0167.25.1.61
Bogels, S.M., & Siqueland, L. (2006). Family cognitive behavioral therapy for children
and adolescents with clinical anxiety disorders. Journal of the American Academy
of Child and Adolescent Psychiatry, 45, 134-141. doi:10.1097/01.chi.
0000190467.01072.ee
Bor, W., Sanders, M.R., & Markie-Dadds, C. (2002). The effects of the Triple-P
Positive Parenting Program on preschool children with co-occurring disruptive
behavior and attentional/hyperactive difficulties. Journal of Abnormal Child
Psychology, 30, 571-587. doi:10.1023/A:1020807613155
Borduin, C., Mann, B.J., Cone, L.T., Henggeler, S.W., Fucci, B.R., Blaske, D.B.
(1995). Multisystemic treatment of serious juvenile offenders: Long-term
prevention of criminality and violence. Journal of Consulting and Clinical
Psychology, 63, 569-578. doi:10.1037/0022-006X.63.4.569
Bramlett, R. K., Murphy, J. J., Johnson, J., Wallingsford, L., & Hall, J. D. (2002).
Contemporary practices in school psychology: A national survey of roles and
referral problems. Psychology in the Schools, 39, 327-335. doi:10.1002/pits.10022
Brener, N.D., Martindale, J., & Weist, M.D. (2001). Mental health and social services:
Results from the School Health Policies and Programs Study 2000. Journal of
School Health, 71, 305-312. doi:10.1111/j.1746-1561.2001.tb03507.x
Brent, D.A., Holder, D., Kolko, D., Birmaher, B., Baugher, M., Roth, C. (1997).
A clinical psychotherapy trial for adolescent depression comparing cognitive,
family, and supportive therapy. Archives of General Psychiatry, 54, 877-885.
Briesch, A.M., Chafouleas, S.M., & Riley-Tillman, T.C. (2010). Generalizability and
dependability of behavior assessment methods to estimate academic engagement:
A comparison of systematic direct observation and direct behavior rating. School
Psychology Review, 39(3), 408-421.
Brown-Chidsey, R., & Steege, M.W. (2010). Response to Intervention: Principles and
strategies for effective practice (2nd ed.). New York: Guilford.
Burns, M.K. (2007). Using curriculum-based assessment to match instruction and skill:
Implications for response to intervention. School Psychology Quarterly, 22, 297-313. doi:10.1037/1045-3830.22.3.297
Burns, B.J., Costello, E.J., Angold, A., Tweed, D., Stangl, D., Farmer, E.M.Z.
(1995). Data watch: Children's mental health service use across service sectors.
Health Affairs, 14, 147-159. doi:10.1377/hlthaff.14.3.147
Carnegie Council on Adolescent Development. (1996). Great transitions: Preparing
adolescents for a new century. New York: Author.
Cash, R.E., & Nealis, L.K. (2004). Mental health in the schools: It's a matter of public
policy. Paper presented at the National Association of School Psychologists
Public Policy Institute, Washington, DC.
Casbarro, J. (2008). RTI: Response-To-Intervention. Port Chester, NY: National
Professional Resources.
Chafouleas, S.M., Volpe, R.J., Gresham, F.M., & Cook, C.R. (2010). School-based
behavioral assessment within problem-solving models: Current status and future
directions. School Psychology Review, 39(3), 343-349.
Chamberlain, P., & Reid, J.B. (1998). Comparison of two community alternatives to
incarceration for chronic juvenile offenders. Journal of Consulting and Clinical
Psychology, 66, 624-633. doi:10.1037/0022-006X.66.4.624
Chambless, D.L., Baker, M.J., Baucom, D.H., Beutler, L.E., Calhoun, K.S. (1998).
Update on empirically validated therapies, II. The Clinical Psychologist, 51, 3-16.
Chambless, D.L., & Hollon, S.D. (1998). Defining empirically supported therapies.
Journal of Consulting and Clinical Psychology, 66, 7-18. doi:10.1037/0022-006X.66.1.7
Chambless, D.L., & Ollendick, T.H. (2001). Empirically supported psychological
interventions: Controversies and evidence. Annual Review of Psychology, 52,
685-716. doi:10.1146/annurev.psych.52.1.685
Chambless, D.L., Sanderson, W.C., Shoham, V., Johnson, S.B., Pope, K.S., Crits-Christoph, P. (1996). An update on empirically validated therapies. The Clinical
Psychologist, 49, 5-18.
Charvat, J. L. (2005). NASP Study: How many school psychologists are there?
Communiqué, 33(6), 12-14.
Chesley, N. (2006). Families in a high-tech age: Technology usage patterns, work and
family correlates, and gender. Journal of Family Issues, 27, 587-608.
doi:10.1177/0192513X05285187
Chemtob, C., Nakashima, J., & Carlson, J. (2002). Brief treatment for elementary school
children with disaster-related posttraumatic stress disorder: A field study. Journal
of Clinical Psychology, 58, 99-112.
Christian, L.M., & Dillman, D.A. (2004). The influence of symbolic and graphical
language manipulations on answers to paper self-administered questionnaires.
Public Opinion Quarterly, 68, 57-80. doi:10.1093/poq/nfh004
Christensen, A., Johnson, S.M., Phillips, S., & Glasgow, R.E. (1980). Cost effectiveness
in behavioral family therapy. Behavior Therapy, 11, 208-226. doi:10.1016/S0005-7894(80)80021-9
Clarke, G.N., Hawkins, W., Murphy, M., Sheeber, L., Lewinsohn, P.M., & Seeley, J.
(1995). Targeted prevention of unipolar depressive disorder in an at risk sample
of high school adolescents: A randomized trial of a group cognitive intervention.
American Academy of Child and Adolescent Psychiatry, 34, 312-321. doi:10.1097
/00004583-199503000-00016
Clarke, G.N., Hornbrook, M., Lynch, F., Polen, M., Gale, J., Beardslee, W.R.
(2001). A randomized trial of a group cognitive intervention for preventing
depression in adolescent offspring of depressed parents. Archives of General
Psychiatry, 58, 1127-1134. doi:10.1001/archpsyc.58.12.1127
Clarke, G.N., Rohde, P., Lewinsohn, P.M., Hops, H., & Seeley, J. (1999). Cognitive
behavioral treatment of adolescent depression: Efficacy of acute group treatment
and booster session. Journal of the American Academy of Child and Adolescent
Psychiatry, 38, 272-279. doi:10.1097/00004583-199903000-00014
Cobham, V.E., Dadds, M.R., & Spence, S.H. (1998). The role of parental anxiety in the
treatment of childhood anxiety. Journal of Consulting and Clinical Psychology,
66, 893-905.
Coffee, G., & Ray-Subramanian, C.E. (2009, Winter). Goal attainment scaling: A
progress-monitoring tool for behavioral interventions. School Psychology Forum:
Research in Practice, 3(1), 1-12.
Cohen, H., Amerine-Dickens, M., & Smith, T. (2006). Early intensive behavioral
treatment: Replication of the UCLA model in a community setting.
Developmental and Behavioral Pediatrics, 27, S145-S155.
Cohen, J.A., Deblinger, E., Mannarino, A.P., & Steer, R.A. (2004). A multisite
randomized controlled study of sexually abused, multiply traumatized children
with PTSD: Initial treatment outcome. Journal of the American Academy of Child
and Adolescent Psychiatry, 43, 393-402.
Cohen, J.A., & Mannarino, A.P. (1996). A treatment outcome study for sexually abused
preschool children: Initial findings. Journal of the American Academy of Child
and Adolescent Psychiatry, 34, 42-50.
Cohen, J.A., & Mannarino, A.P. (1997). A treatment study for sexually abused preschool
children: Outcome during a one-year follow-up. Journal of the American
Academy of Child and Adolescent Psychiatry, 36, 1228-1235.
Cohen, J.A., Mannarino, A.P., & Knudsen, K. (2005). Treating sexually abused children:
1 year follow-up of a randomized controlled trial. Child Abuse & Neglect, 29,
135-145.
Cornwall, E., Spence, S.H., & Schotte, D. (1996). The effectiveness of emotive imagery
in the treatment of darkness phobia in children. Behaviour Change, 13, 223-229.
Counseling. (n.d.). In Merriam-Webster's online dictionary. Retrieved from
http://mw4.merriam-webster.com/dictionary/counseling?show=0&t=1296248084
Couper, M.P., Kapteyn, A., Schonlau, M., & Winter, J. (2007). Noncoverage and
nonresponse in an internet survey. Social Science Research, 36, 131-148. doi:10.
1016/j.ssresearch.2005.10.002
Crespi, T. D. (2009). Group counseling in the schools: Legal, ethical, and treatment
issues in school practice. Psychology in the Schools, 46, 273-280. doi:10.1002
/pits.20373
Crespi, T.D., & Fischetti, B.A. (1997, September). Counseling and psychotherapy in the
schools: Rationale and considerations for professional practice. NASP
Communiqué, 26, 18-20.
Crespi, T.D., & Howe, E.A. (2002). Families in crisis: Considerations for special service
providers in the school. Special Services in the Schools, 18, 43-54. doi:10.1300
/J008v18n01_03
Crespi, T.D., & Rigazio-DiGilio, S.A. (1996). Adolescent homicide and family
pathology: Implications for research and treatment with adolescents. Adolescence,
31, 353-367.
Crespino, J. (2006). The best defense is a good offense: The Stennis Amendment and the
fracturing of liberal school desegregation policy, 1964-1972. Journal of Policy
History, 18, 304-325. doi:10.1353/jph.2006.0008
Curtis, M. J., Grier, E. J. C., & Hunley, S. A. (2004). The changing face of school
psychology: Trends in data and projections for the future. School Psychology
Review, 28, 104-116.
Curtis, M. J., Hunley, S. A., & Grier, E. C. (2004). The status of school psychology:
Implications of a major personnel shortage. Psychology in the Schools, 41, 431-442. doi: 10.1002/pits.10186
Curtis, M.J., Lopez, A.D., Castillo, J.M., Batsche, G.M., Minch, D., & Smith, J.C.
(2008). The status of school psychology: Demographic characteristics,
employment conditions, professional practices, and continuing professional
development. Communiqué, 36, 27-29.
Curtis, M. J., Lopez, A. D., Batsche, G. M., & Smith, J. C. (2006, March). School
psychology 2005: A national perspective. Paper presented at the annual meeting
of the National Association of School Psychologists, Anaheim, CA.
Cutts, N. E. (1955). School psychologists at mid-century. Washington, DC: American
Psychological Association.
David-Ferdon, C., & Kaslow, N. (2008). Evidence-based psychosocial treatments for
child and adolescent depression. Journal of Clinical Child & Adolescent
Psychology, 37, 62-104. doi: 10.1080/15374410701817865
Deblinger, E., Lippman, J., & Steer, R. (1996). Sexually abused children suffering
posttraumatic stress symptoms: Initial treatment outcome findings. Child
Maltreatment, 1, 310-321.
Deblinger, E., Stauffer, L., & Steer, R. (2001). Comparative efficacies of supportive and
cognitive behavioral group therapies for young children who have been sexually
abused and their nonoffending mothers. Child Maltreatment, 6, 332-343.
Dennis, M., Godley, S.H., Diamond, G., Tims, F.M., Babor, T., Donaldson, J. (2004).
The Cannabis Youth Treatment (CYT) Study: Main findings from two
randomized trials. Journal of Substance Abuse Treatment, 27, 197-213.
Deno, S. L. (1995). School psychologist as problem solver. In A. Thomas & J. Grimes
(Eds.), Best practices in school psychology (3rd ed., pp. 471-484). Washington,
DC: National Association of School Psychologists.
Dillman, D.A. (2007). Recent developments in the design of web, mail, and mixed-mode
surveys. In Mail and internet surveys: The Tailored Design method (pp. 447-503).
Hoboken, NJ: John Wiley & Sons.
Dillman, D.A., & Redline, C.D. (2004). Testing paper self-administered questionnaires:
Cognitive interview and field test comparisons. In S. Presser et al. (Eds.),
Methods for testing and evaluating survey questionnaires (pp. 299-317). New
York: Wiley-Interscience.
Doll, B. (1996). Prevalence of psychiatric disorders in children and youth: An agenda for
advocacy by school psychology. School Psychology Quarterly, 11, 20-46.
Doll, B., & Cummings, J. A. (2008). Best practices in population-based school mental
health services. In A. Thomas & J. Grimes (Eds.), Best Practices in School
Psychology (5th ed., pp. 1333-1347). Bethesda, MD: National Association of
School Psychologists.
Drew, A., Baird, G., Baron-Cohen, S., Cox, A., Slomins, V., Wheelwright, S. (2002).
A pilot randomized control trial of a parent training intervention for pre-school
children with autism: Preliminary findings and methodological challenges.
European Child and Adolescent Psychiatry, 11, 266-272.
Eikeseth, S., Smith, T., Jahr, E., Eldevik, S. (2002). Intensive behavioral treatment at
school for 4- to 7-year-old children with autism: A 1-year comparison controlled
study. Behavioral Modification, 26, 49-68.
Elementary and Secondary Education Act of 1965, Pub. L. No. 89-10 Stat. 79 (1965).
Retrieved from http://nysl.nysed.gov/Archimages/91338.PDF
Elementary and Secondary Education Amendments of 1967, Pub. L No. 90-247, Stat. 81
(1968). Retrieved from http://nysl.nysed.gov/Archimages/91341.PDF
Elementary and Secondary Education Act Amendments of 1974, Pub. L. No. 93-380 Stat.
88. Retrieved from http://nysl.nysed.gov/Archimages/91344.PDF
Equal Education Opportunity Act of 1974, 20 U.S.C. 1701, 1702. Retrieved from
http://codes.lp.findlaw.com/uscode/20/39/I/1/1702
Evans, J.R., & Mathur, A. (2005). The value of online surveys. Internet Research, 13,
195-219.
Eyberg, S. M., Nelson, M. M., & Boggs, S. R. (2008). Evidence-based psychosocial
treatments for children and adolescents with disruptive behavior. Journal of
Clinical Child & Adolescent Psychology, 37, 215-237. doi: 10.1080/
15374410701820117
Fagan, T. K. (1986). The historical origins and growth of programs to prepare school
psychologists in the United States. Journal of School Psychology, 24, 9-22.
Fagan, T. K. (1988). The historical improvement of the school psychology service ratio:
Implications for future employment. School Psychology Review, 17, 447-458.
Fagan, T. K. (1992). Compulsory schooling, child study, clinical psychology, and special
education: Origins of school psychology. American Psychologist, 47(2), 236-243.
Fagan, T. K. (2008). Trends in the history of school psychology in the United States. In
A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology (5th ed., pp.
2069-2085). Bethesda, MD: National Association of School Psychologists.
Fantuzzo, J., Manz, P., Atkins, M., & Meyers, R. (2005). Peer-mediated treatment of
socially withdrawn maltreated preschool children: Cultivating natural community
resources. Journal of Clinical Child & Adolescent Psychology, 34, 320-325.
Fantuzzo, J., Sutton-Smith, B., Atkins, M., Meyers, R., Stevenson, H., Coolahan, K.
(1996). Community-based resilient peer treatment of withdrawn maltreated
preschool children. Journal of Consulting and Clinical Psychology, 64, 1377-1386.
Farmer, E.M., Burns, B.J., Philip, S.D., Angold, A., & Costello, E.J. (2003). Pathways
into and through mental health services for children and adolescents. Psychiatric
Services, 54, 60-67. doi:10.1176/appi.ps.54.1.60
Farrell, A.D., Guerra, N.G., & Tolan, P.H. (1996). Preventing aggression in inner city
children: Small group training to change cognitions, social skills, and behavior.
Journal of Child and Adolescent Group Therapy, 4, 229-242.
Fergusson, D.M., Horwood, L.L., & Lynskey, M. (1994). The childhoods of multiple-problem adolescents: A 15-year longitudinal study. Journal of Child Psychology
and Psychiatry, 35, 1123-1140. doi:10.1111/j.1469-7610.1994.tb01813.x
Feindler, E.L., Marriot, S.A., & Iwata, M. (1984). Group anger control training for junior
high school delinquents. Cognitive Therapy and Research, 8, 299-311. doi:10.
1007/BF01173000
Fisher, G. L., Jenkins, S. J., & Crumbley, J. D. (1986). A replication of a survey of school
psychologists: Congruence between training, practice, preferred role, and
competence. Psychology in the Schools, 23, 271-279. doi:10.1002/1520-6807(198607)23:3<271::AID-PITS2310230308>3.0.CO;2-Y
Flannery-Schroeder, E. C., & Kendall, P. C. (2000). Group and individual cognitive-behavioral treatments for youth with anxiety disorders: A randomized clinical
trial. Cognitive Therapy and Research, 24, 251-278. doi:10.1023/A:
1005500219286
Flugum, K. R., & Reschly, D. J. (1994). Prereferral interventions: Quality indices and
outcomes. Journal of School Psychology, 32, 1-14. doi:10.1016/0022-4405(94)90025-6
Forman, S. G., & Burke, C. R. (2008). Best practices in selecting and implementing
evidence-based school interventions. In A. Thomas & J. Grimes (Eds.), Best
Practices in School Psychology (5th ed., pp. 799-810). Bethesda, MD: National
Association of School Psychologists.
Foster, S., Rollefson, M., Doksum, T., Noonan, D., Robinson, G., & Teich, J. (2005).
School mental health services in the United States, 2002-2003. DHHS Pub No.
(SMA) 05-4068. Rockville, MD: Center for Mental Health Services, Substance
Abuse and Mental Health Services Administration.
French, J. L. (1984). On the conception, birth, and early development of school
psychology, with special reference to Pennsylvania. American Psychologist, 39,
976-987. doi:10.1037/0003-066X.39.9.976
Fristad, M. A., Verducci, J. S., Walters, K., & Young, M. E. (2009). The impact of
multi-family psychoeducational psychotherapy in treating children aged 8-12 with
mood disorders. Archives of General Psychiatry, 66, 1013-1021.
doi:10.1001/archgenpsychiatry.2009.112
Fuchs, L., & Fuchs, D. (1998). Treatment validity: A unifying concept for
reconceptualizing the identification of learning disabilities. Learning Disabilities
Research and Practice, 13, 204-219.
Fuchs, L., Fuchs, D., Hintze, J., & Lembke, E. (2006, July). Progress monitoring in the
context of responsiveness-to-intervention. Paper presented at the Summer
Institute on Progress Monitoring, Kansas City, MO.
Fuchs, L., Kovaleski, J.F., & Carruth, J. (2009, April 30). Data-based decision making
[Online forum content]. Retrieved from http://www.rtinetwork.org/professional
/forums/data-based-decision-making
Gallagher, H.M., Rabian, B.A., & McCloskey, M.S. (2003). A brief group cognitive
behavioral intervention for social phobia in childhood. Journal of Anxiety
Disorders, 18, 459-479. doi:10.1016/S0887-6185(03)00027-6
Gickling, E.E., & Armstrong, D.L. (1978). Levels of instructional difficulty as related to
on-task behavior, task completion, and comprehension. Journal of Learning
Disabilities, 11, 559-566. doi:10.1177/002221947801100905
Gickling, E., & Thompson, V. (1985). A personal view of curriculum-based assessment.
Exceptional Children, 52, 205-218.
Gillham, J.E., Reivich, K., Jaycox, L., & Seligman, M.E.P. (1995). Prevention of
depressive symptoms in school children: Two year follow-up. Psychological
Science, 6, 343-351. doi:10.1111/j.1467-9280.1995.tb00524.x
Goals 2000: Educate America Act, 20 U.S.C. 5811 et seq. Retrieved from
http://www2.ed.gov/legislation/GOALS2000/TheAct/intro.html
Goldwasser, E., Meyers, J., Christenson, S., & Graden, J. (1983). The impact of
PL 94-142 on the practice of school psychology: A national survey. Psychology in
the Schools, 20, 153-165. doi:10.1002/1520-6807(198304)20:2<153::AID-PITS2310200206>3.0.CO;2-W
Goldstein, T.R., Axelson, D.A., Birmaher, B., & Brent, D.A. (2007). Dialectical behavior
therapy for adolescents with bipolar disorder: A one-year open trial. Journal of
the American Academy of Child and Adolescent Psychiatry, 46, 820-830. doi:
10.1097/chi.0b013e31805c1613
Granello, D.H., & Wheaton, J.E. (2004). Online data collection: Strategies for research.
Journal of Counseling & Development, 82, 387-393.
Grant, W. V., & Eiden, L. J. (1980). Digest of educational statistics 1980. Washington,
DC: U. S. Government Printing Office.
Gray, P. J., Caulley, D. N., & Smith, N. L. (1982). A study in contrasts: Effects of the
Education Consolidation and Improvement Act of 1981 on SEA and LEA
evaluation (ERIC No ED 235 183). Northwest Regional Educational Lab.
Retrieved from http://www.eric.ed.gov/PDFS/ED235183.pdf
Gresham, F.M. (1989). Assessment of treatment integrity in school consultation and
prereferral intervention. School Psychology Review, 17, 211-226.
Gresham, F.M. (2002). Responsiveness to intervention: An alternative approach to the
identification of learning disabilities. In R. Bradley, L. Danielson, & D. Hallahan
(Eds.), Learning disabilities: Research to practice (pp. 467-519). Mahwah, NJ:
Lawrence Erlbaum.
Gresham, F.M. (2005). Response to intervention: An alternative means of identifying
students as emotionally disturbed. Education and Treatment of Children, 28,
329-344.
Gresham, F.M. (2007). Evolution of the response-to-intervention concept: Empirical
foundations and recent developments. In S. Jimmerson, M. Burns, & A.
VanDerHeyden (Eds.), Handbook of response to intervention: The science and
practice of assessment and intervention (pp. 10-24). New York: Springer.
Gresham, F.M., Cook, C.R., Collins, T., Rasetshwane, K., Dart, E., Truelson, E., & Grant,
S. (2010). Developing a change-sensitive brief behavior rating scale as a progress
monitoring tool for social behavior: An example using the Social Skills Rating
System Teacher Form. School Psychology Review, 39(3), 364-379.
Gresham, F.M., & Witt, J.C. (1997). Utility of intelligence tests for treatment planning,
classification, and placement decisions: Recent empirical findings and future
directions. School Psychology Quarterly, 12, 249-267. doi:10.1037/h0088961
Haertel, G. D., Walberg, H. J., & Weinstein, T. (1983). Psychological models of
educational performance: A theoretical synthesis of constructs. Review of
Educational Research, 53, 75-92.
Hamilton, S.B., & MacQuiddy, S.L. (1984). Self-administered behavioral parent training:
Enhancement of treatment efficacy using a time-out signal seat. Journal of
Clinical Child Psychology, 13, 61-69.
Hanchon, T., & Fernald, L. (2011, February). School psychologists' provision of
counseling services: An exploratory study. Paper presented at the meeting of the
National Association of School Psychologists, San Francisco, CA.
Hartshorne, T. S., & Johnson, M. C. (1985). The actual and preferred roles of the school
psychologist according to secondary school administrators. Journal of School
Psychology, 23, 241-246. doi:10.1016/0022-4405(85)90015-9
Hawkins, R. P., & Dobes, R. W. (1977). Behavioral definitions in applied behavior
analysis: Explicit or implicit. In B. C. Etzel, J. M. LeBlanc, & D. M. Baer (Eds.),
New developments in behavioral research: Theory, methods, and applications.
Hillsdale, NJ: Erlbaum.
Hayes, S.C., Barlow, D.H., & Nelson-Gray, R. O. (1999). The scientist-practitioner:
Research and accountability in the age of managed care. Boston: Allyn & Bacon.
Hayward, C., Varady, S., Albano, A.M., Thienamann, M., Henderson, L., & Schatzberg,
A.F. (2000). Cognitive-behavioral group therapy for social phobia in female
adolescents: Results of a pilot study. Journal of the American Academy of Child
and Adolescent Psychiatry, 39, 721-726. doi:10.1097/00004583-200006000-00010
Heartland Area Education Agency. (2007). Improving children's educational results
through data-based decision-making. Retrieved from http://www.aea11.k12.ia.us
/spedresources/ModuleFour.pdf
Henggeler, S.W., Clingempeel, W.G., Brondino, M.J., & Pickrel, S.G. (2002). Four-year
follow-up of multisystemic therapy with substance-abusing and
substance-dependent juvenile offenders. Journal of the American Academy of
Child and Adolescent Psychiatry, 41, 868-874. doi:10.1097/00004583-200207000-00021
Henggeler, S.W., Melton, G.B., Brondino, M.J., Scherer, D.G., & Hanley, J.H. (1997).
Multisystemic therapy with violent and chronic juvenile offenders and their
families: The role of treatment fidelity in successful dissemination. Journal of
Consulting and Clinical Psychology, 65, 821-833. doi:10.1037/0022-006X.65.5.821
Henggeler, S.W., Melton, G.B., & Smith, L.A. (1992). Family preservation using
multisystemic therapy: An effective alternative to incarcerating serious juvenile
offenders. Journal of Consulting and Clinical Psychology, 60, 953-961. doi:
10.1037/0022-006X.60.6.953
Henggeler, S.W., Pickrel, S.G., & Brondino, M.J. (1999). Multisystemic treatment of
substance-abusing and dependent delinquents: Outcomes, treatment fidelity,
and transportability. Mental Health Services Research, 1, 171-184. doi:10.1023
/A:1022373813261
Herschell, A. D., McNeil, C. B., & McNeil, D. W. (2004). Clinical child psychology's
progress in disseminating empirically supported treatments. Clinical Psychology:
Science and Practice, 11, 267-288. doi:10.1093/clipsy/bph082
Heun, C.T. (2001, October 15). Procter and Gamble readies online market-research
push. Information Week, p. 26.
Heyne, D., King, N., Tonge, B., Rollings, S., Young, D., Pritchard, M. (2002).
Evaluation of child therapy and caregiver training in the treatment of school
refusal. Journal of the American Academy of Child and Adolescent Psychiatry,
41, 687-695.
Hinshaw, S. P. (2002). Intervention research, theoretical mechanisms, and causal
processes related to externalizing behavior patterns. Development and
Psychopathology, 14, 789-818. doi:10.1017/S0954579402004078
Hoagwood, K., & Erwin, H. (1997). Effectiveness of school-based mental health services
for children: A 10 year research review. Journal of Child and Family Studies, 6,
435-451. doi:10.1023/A:1025045412689
Hoagwood, K., Hibbs, E., Brent, D., & Jensen, P. (1995). Introduction to the special
section: Efficacy and effectiveness in studies of child and adolescent
psychotherapy. Journal of Consulting and Clinical Psychology, 63, 683-687.
doi:10.1037//0022-006X.63.5.683
Hoagwood, K., & Johnson, J. (2003). School psychology: A public health framework: I.
From evidence-based practices to evidence-based policies. Journal of School
Psychology, 41, 3-21. doi:10.1016/S0022-4405(02)00141-3
Hoath, F.E., & Sanders, M.R. (2002). A feasibility study of enhanced group Triple P-Positive Parenting Program for parents of children with attention-deficit/
hyperactivity disorder. Behavior Change, 19, 191-206. doi:10.1375/bech.19.4.191
Hollingworth, L. S. (1922, May). Existing laws which authorize psychologists to perform
professional services. Journal of Criminal Law and Criminology, 12, 70-73.
Hollingworth, L. S. (1932, February). Psychological service for public schools. Address
delivered to the Child Study Club of Springfield, MA.
Holmbeck, G. (1997). Toward terminological, conceptual, and statistical clarity in the
study of mediators and moderators: Examples from the child-clinical and pediatric
psychology literatures. Journal of Consulting and Clinical Psychology, 65,
599-610.
Hops, H., Waldron, H.B., Davis, B., Barrera, M., Turner, C.W., Brody, J., &
Ozechowski, T.J. (2007). Ethnic influences on family processes and family
therapy outcomes for substance-abusing adolescents. Unpublished manuscript,
Oregon Research Institute.
Horner, R.H., Carr, E.G., Halle, J., McGee, G., Odom, S., & Wolery, M. (2005). The use
of single subject research to identify evidence-based practice in special education.
Exceptional Children, 71(2), 165-179.
Hosp, J. L. (2008). Best practice in aligning academic assessment with instruction. In A.
Thomas & J. Grimes (Eds.), Best practices in school psychology (5th ed., pp. 363-376). Bethesda, MD: National Association of School Psychologists.
Hosp, J. L., & Reschly, D. J. (2002). Regional differences in school psychology practice.
School Psychology Review, 31, 11-29.
House Committee on Education and Labor. (1986). Education of the Handicapped Act
Amendments of 1986. Report (House-R99-860). Retrieved from
http://www.eric.ed.gov/PDFS/ED276165.pdf
House Committee on Education and Labor (1990). Chapter 1 survey of the Hawkins-Stafford School Improvement Amendments. A report prepared for the
Subcommittee on Elementary, Secondary, and Vocational Education of the
Committee on Education and Labor. House of Representatives, One Hundred
First Congress, Second Session (ERIC No. ED 320 998). Retrieved from
http://www.eric.ed.gov/PDFS/ED320998.pdf
Howell, K., & Nolet, V. (2000). Curriculum-based evaluation: Teaching and decision
making. Belmont, CA: Wadsworth/Thompson Learning.
Huang, L., Stroul, B., Friedman, R., Mrazek, P., Friesen, B., Pires, S., & Mayberg, S.
(2005). Transforming mental health care for children and their families. American
Psychologist, 60, 615-627. doi:10.1037/0003-066X.60.6.615
Huey, W.C., & Rank, R.C. (1984). Effects of counselor and peer-led group assertive
training on black-adolescent aggression. Journal of Counseling Psychology, 31,
95-98. doi:10.1037/0022-0167.31.1.95
Hutt, R. B. W. (1923). The school psychologist. Psychological Clinic, 15, 48-51.
Improving America's Schools Act of 1994, 20 U.S.C. 10101 et seq. Retrieved from
http://www2.ed.gov/legislation/ESEA/index.html
Intervention. (n.d.). In Merriam-Webster's online dictionary. Retrieved from http://
www.merriam-webster.com/dictionary/intervention
Irwin, E. (1915). Truancy. New York: Public Education Association.
Jaberghaderi, N., Greenwald, R., Rubin, A., Zand, S.O., & Dolatabadi, S. (2004). A
comparison of CBT and EMDR for sexually abused Iranian girls. Clinical
Psychology and Psychotherapy, 11, 358-368. doi:10.1002/cpp.395
Jackson, N.B. (2003, July 3). Opinions to spare? Click here. The New York Times, p. G1.
Jaycox, L., Reivich, K., Gillham, J.E., & Seligman, M.E.P. (1994). Prevention of
depressive symptoms in school children. Behavioral Research and Therapy, 32,
801-816. doi:10.1016/0005-7967(94)90160-0
Jocelyn, L.J., Casiro, O.G., Beattie, D., Bow, J., & Kneisz, J. (1998). Treatment of
children with autism: A randomized controlled trial to evaluate a caregiver-based
intervention program in community day-care centers. Developmental and
Behavioral Pediatrics, 19, 326-334. doi:10.1097/00004703-199810000-00002
Kahn, J., Kehle, T., Jenson, W., & Clark, E. (1990). Comparison of cognitive behavioral,
relaxation, and self-modeling interventions for depression among middle-school
students. School Psychology Review, 19, 196-211.
Kaminer, Y., & Burleson, J. (1999). Psychotherapies for adolescent substance abusers:
15-month follow-up. American Journal on Addictions, 8, 114-119. doi:10.1080
/105504999305910
Kaminer, Y., Burleson, J.A., Blitz, C., Sussman, J., & Rounsaville, B.J. (1998).
Psychotherapies for adolescent substance abuse: A pilot study. The Journal of
Nervous and Mental Disease, 186, 684-690. doi:10.1097/00005053-199811000-00004
Kaminer, Y., Burleson, J.A., & Goldberger, R. (2002). Cognitive-behavioral coping skills
and psychoeducation therapies for adolescent substance abuse. The Journal of
Nervous and Mental Disease, 190, 737-745. doi:10.1097/00005053-200211000-00003
Kataoka, S.H., Stein, B.D., Jaycox, L.H., Wong, M., Escudero, P., Tu, W. (2003).
A school-based mental health program for traumatized Latino immigrant children.
Journal of the American Academy of Child and Adolescent Psychiatry, 42, 311-318. doi:10.1097/00004583-200303000-00011
Kataoka, S.H., Zhang, L., & Wells, K.B. (2002). Unmet need for mental health care
among U.S. children: Variation by ethnicity and insurance status. American
Journal of Psychiatry, 159, 1548-1555. doi:10.1176/appi.ajp.159.9.1548
Kamphaus, R.W., DiStefano, C., Dowdy, E., Eklund, K., & Dunn, A.R. (2010).
Determining the presence of a problem: Comparing two approaches for detecting
youth behavioral risk. School Psychology Review, 39(3), 395-407.
Kazdin, A.E. (1977). Assessing the clinical and applied significance of behavior change
through social validation. Behavior Modification, 1, 427-452. doi:10.1177
/014544557714001
Kazdin, A. E. (1982). Single-case research designs: Methods for clinical and applied
settings. New York: Oxford University Press.
Kazdin, A.E. (2003). Psychotherapy for children and adolescents. Annual Review of
Psychology, 54, 253- 276. doi:10.1146/annurev.psych.54.101601.145105
Kazdin, A.E. (2005). Evidence-based assessment for children and adolescents: Issues in
measurement, development, and clinical application. Journal of Clinical Child
and Adolescent Psychology, 34, 548-558. doi:10.1207/s15374424jccp3403_10
Kazdin, A. E. (2008). Evidence-based treatment and practice: New opportunities to
bridge clinical research and practice, enhance the knowledge base, and improve
patient care. American Psychologist, 63, 146-159. doi:10.1037/0003-066X.
63.3.146
Kazdin, A.E., Bass, D., Siegel, T.C., & Thomas, C. (1989). Cognitive behavior therapy
and relationship therapy in the treatment of children referred for antisocial
behavior. Journal of Consulting and Clinical Psychology, 57, 522-536. doi:
10.1037/0022-006X.57.4.522
Kazdin, A.E., Esveldt-Dawson, K., French, N.H., & Unis, A.S. (1987a). Effects of parent
management training and problem-solving skills training combined in the
treatment of antisocial child behavior. Journal of the American Academy of
Child & Adolescent Psychiatry, 26, 416-424. doi:10.1097/00004583-198705000-00024
Kazdin, A.E., Esveldt-Dawson, K., French, N.H., & Unis, A.S. (1987b). Problem-solving
skills training and relationship therapy in the treatment of antisocial behavior.
Journal of Consulting and Clinical Psychology, 55, 76-85. doi:10.1037/0022-006X.55.1.76
Kazdin, A. E., Kratochwill, T. R., & VandenBos, G. (1986). Beyond clinical trials:
Generalizing from research to practice. Professional Psychology: Research and
Practice, 17, 391-398. doi:10.1037/0735-7028.17.5.391
Kazdin, A.E., Siegel, T.C., & Bass, D. (1992). Cognitive problem-solving skills training
and parent management training in the treatment of antisocial behavior in
children. Journal of Consulting and Clinical Psychology, 60, 733-747. doi:
10.1037/0022-006X.60.5.733
Keel, P. K., & Haedt, A. (2008). Evidence-based psychosocial treatments for eating
problems and eating disorders. Journal of Clinical Child & Adolescent
Psychology, 37, 39-61. doi: 10.1080/15374410701817832
Kendall, P. C. (1994). Treating anxiety disorders in children: Results of a randomized
clinical trial. Journal of Consulting and Clinical Psychology, 62, 200-210. doi:
10.1037/0022-006X.62.1.100
Kendall, P. C., Flannery-Schroeder, E., Panichelli-Mindel, S. M., Southam-Gerow, M.,
Henin, A., & Warman, M. (1997). Therapy for youth with anxiety disorders: A
second randomized clinical trial. Journal of Consulting and Clinical Psychology,
65, 366-380. doi:10.1037/0022-006X.65.3.366
Kern, L. (2005). Developing hypothesis statements. In L.M. Bambara & L. Kern (Eds.),
Individualized supports for students with problem behaviors: Designing positive
behavior plans (pp. 165-200). New York: Guilford Press.
King, C.A., & Kirschenbaum, D.S. (1990). An experimental evaluation of a school-based
program for children at risk: Wisconsin early-intervention. Journal of Community
Psychology, 18, 167-177. doi:10.1002/1520-6629(199004)18:2<167::AID-JCOP2290180208>3.0.CO;2-Z
King, N.J., Tonge, B.J., Heyne, D., Pritchard, M., Rollings, S., Young, D. (1998).
Cognitive-behavioral treatment of school-refusing children: A controlled
evaluation. Journal of the American Academy of Child and Adolescent
Psychiatry, 37, 395-403.
King, N.J., Tonge, B.J., Mullen, P., Myerson, N., Heyne, D., Rollings, S. (2000).
Treating sexually abused children with posttraumatic stress symptoms: A
randomized clinical trial. Journal of the American Academy of Child and
Adolescent Psychiatry, 39, 1347-1355.
Kolko, D.J. (1996). Individual cognitive behavioral treatment and family treatment and
family therapy for physically abused children and their offending parents: A
comparison of clinical outcomes. Child Maltreatment, 1, 322-342.
Kowalenko, N., Rapee, R.M., Simmons, J., Wignall, A., Hoge, R., Whitefield, K.
(2005). Short-term effectiveness of a school-based entry intervention program for
adolescent depression. Clinical Child Psychology and Psychiatry, 10, 493-507.
Kraemer, H.C., Wilson, G.T., Fairburn, C.G., & Agras, W.S. (2002). Mediators and
moderators of treatment effects in randomized clinical trials. Archives of General
Psychiatry, 59, 877-883.
Kratochwill, T.R., & Bergan, J.R. (1990). Behavioral consultation in applied settings: An
individual guide. New York: Plenum.
Kratochwill, T. R., & Shernoff, E. S. (2004). Evidence-based practice: Promoting
evidence-based interventions in school psychology. School Psychology Review,
33(1), 34-48.
Lacayo, N., Sherwood, G., & Morris, J. (1981). Daily activities of school psychologists:
A national survey. Psychology in the Schools, 18, 184-190. doi:10.1002/1520-6807(198104)18:2<184::AID-PITS2310180213>3.0.CO;2-R
Last, C.G., Hansen, C., & Franco, N. (1998). Cognitive-behavioral treatment of school
phobia. Journal of the American Academy of Child and Adolescent Psychiatry,
37, 404-411. doi:10.1097/00004583-199804000-00018
Leaf, P.J., Alegria, M., Cohen, P., Goodman, S.H., Horwitz, S.M., Hoven, C.W.
(1996). Mental health service use in the community and schools: Results from the
four-community MECA Study. Journal of the American Academy of Child and
Adolescent Psychiatry, 35, 889-897. doi:10.1097/00004583-199607000-00014
Le Grange, D., Crosby, R.D., Rathouz, P.J., & Leventhal, B.L. (2007). A randomized
controlled comparison of family-based treatment and supportive psychotherapy
for adolescent bulimia nervosa. Archives of General Psychiatry, 64, 1049-1056.
doi:10.1001/archpsyc.64.9.1049
Leung, C., Sanders, M.R., Leung, S., Mak, R., & Lau, J. (2003). An outcome evaluation
of the implementation of the Triple P-Positive Parenting Program in Hong Kong.
Family Process, 42, 531-544. doi:10.1111/j.1545-5300.2003.00531.x
Leve, L.D., Chamberlain, P., & Reid, J.B. (2005). Intervention outcomes for girls
referred from juvenile justice: Effects on delinquency. Journal of Consulting and
Clinical Psychology, 73, 1181-1185. doi:10.1037/0022-006X.73.6.1181
Lewinsohn, P.M., Clarke, G., Hops, H., & Andrews, J. (1990). Cognitive-behavioral
treatment for depressed adolescents. Behavior Therapy, 21, 385-401. doi:10.
1016/S0005-7894(05)80353-3
Lewinsohn, P.M., Clarke, G., Rohde, P., Hops, H., & Seeley, J. (1996). A course in
coping: A cognitive-behavioral approach to the treatment of adolescent
depression. In E.D. Hibbs & P.S. Jensen (Eds.), Psychosocial treatments for child
and adolescent disorders: Empirically based strategies for clinical practice (pp.
109-135). Washington, DC: American Psychological Association.
Lichtenstein, R. (2008). Best practices in identification of learning disabilities. In A.
Thomas & J. Grimes (Eds.), Best practices in school psychology (5th ed., pp.
1661-1672). Bethesda, MD: National Association of School Psychologists.
Liddle, H.A., Dakof, G.A., Diamond, G.S., Parker, G.S., Barrett, K., & Tejeda, M.
(2001). Multidimensional family therapy for adolescent substance abuse: Results
of a randomized clinical trial. American Journal of Drug and Alcohol Abuse, 27,
651-687. doi:10.1081/ADA-100107661
Liddle, H.A., Rowe, C.L., Dakof, G.A., Ungaro, R.A., & Henderson, C.E. (2004). Early
intervention for adolescent substance abuse: Pretreatment to posttreatment
outcomes of a randomized clinical trial comparing multidimensional family
therapy and peer group treatment. Journal of Psychoactive Drugs, 36, 49-63.
Lieberman, A.F., Van Horn, P., & Ippen, C.G. (2005). Toward evidence-based treatment:
Child-parent psychotherapy with preschoolers exposed to marital violence.
Journal of the American Academy of Child & Adolescent Psychiatry, 44,
1241-1248. doi:10.1097/01.chi.0000181047.59702.58
Lochman, J.E., Coie, J.D., Underwood, M.K., & Terry, R. (1993). Effectiveness of a
social relations intervention program for aggressive and nonaggressive, rejected
children. Journal of Consulting and Clinical Psychology, 61, 1053-1058. doi:10.
1037/0022-006X.61.6.1053
Lonigan, C. J., Elbert, J. C., & Johnson, S. B. (1998). Empirically supported psychosocial
interventions for children: An overview. Journal of Clinical Child Psychology,
27, 138-145. doi:10.1207/s15374424jccp2702_1
Lovaas, O.I. (1987). Behavioral treatment and normal educational and intellectual
functioning in young autistic children. Journal of Consulting and Clinical
Psychology, 55, 3-9. doi:10.1037/0022-006X.55.1.3
Mangione, T. W. (1995). Mail surveys: Improving the quality. Applied Social Research
Methods Series, Vol. 40. Thousand Oaks, CA: Sage Publications.
Meacham, M., & Peckham, P. (1978). School psychologists at three-quarters century:
Congruence between training, practice, preferred role and competence. Journal of
School Psychology, 16, 195-206. doi:10.1016/0022-4405(78)90001-8
Mendlowitz, S. L., Manassis, K., Bradley, S., Scapillato, D., Miezitis, S., & Shaw, B. F.
(1999). Cognitive-behavioral group treatments in childhood anxiety disorders:
The role of parental involvement. Journal of the American Academy of Child and
Adolescent Psychiatry, 38, 1223-1229. doi:10.1097/00004583-199910000-00010
Melvin, G.A., Tonge, B.J., King, N.J., Heyne, D., Gordon, M.S., & Klimkeit, E. (2006).
A comparison of cognitive-behavioral therapy, sertraline, and their combination
for adolescent depression. Journal of the American Academy of Child and
Adolescent Psychiatry, 45, 1151-1161. doi:10.1097/01.chi.0000233157.21925.71
Merrell, K.W. (2008). Behavioral, social, and emotional assessment of children and
adolescents (3rd ed.). New York: Erlbaum/Taylor & Francis.
Merrell, K.W. (2010). Better methods, better solutions: Developments in school-based
behavioral assessment. School Psychology Review, 39(3), 422-426.
Merrell, K. W., Ervin, R. A., & Gimpel, G. A. (2006). Legal and ethical issues in school
psychology. In School psychology for the 21st century (pp. 113-138). New York:
Guilford.
Merrell, K. W., Ervin, R. A., & Gimpel, G. A. (2006). Working as a school psychologist.
In School psychology for the 21st century (pp. 94-112). New York: Guilford.
Miklowitz, D.J., Axelson, D.A., Birmaher, B., George, E.L., Taylor, D.O., Schneck,
C.D., Brent, D.A. (2008). Family-focused treatment for adolescents with
bipolar disorder: Results of a two-year randomized trial. Archives of General
Psychiatry, 65, 1053-1061. doi:10.1001/archpsyc.65.9.1053
Mills, C., Stephan, S.H., Moore, E., Weist, M.D., Daly, B.P., & Edwards, M. (2006).
The president's New Freedom Commission: Capitalizing on opportunities to
advance school-based mental health services. Clinical Child and Family
Psychology Review, 9, 146-161. doi:10.1007/s10567-006-0003-3
Mills v. Board of Education of District of Columbia, 348 F. Supp. 866 (1972); Contempt
proceedings, 551 Educ. for the Handicapped L. Rep. 643 (D. D.C. 1980).
Miltenberger, R.G. (2004). Behavior modification: Principles and procedures (3rd ed.).
Pacific Grove, CA: Wadsworth.
Miltenberger, R.G. (2005). Strategies for measuring behavioral change. In L.M.
Bambara & L. Kern (Eds.), Individualized supports for students with problem
behaviors: Designing positive behavior plans (pp. 107-128). New York:
Guilford Press.
Minton, H. L. (1987). Lewis M. Terman and mental testing: In search of the democratic
ideal. In M. M. Sokal (Ed.), Psychological testing and American society, 1890-1930 (pp. 95-112). New Brunswick, NJ: Rutgers University Press.
Miranda, A., Presentacion, M.J., & Soriano, M. (2002). Effectiveness of a school-based
multicomponent program for the treatment of children with ADHD. Journal of
Learning Disabilities, 35, 546-562.
Mirkin, P., Deno, S., Tindal, G., & Kuehnle, K. (1982). Frequency of measurement and
data utilization as factors in standardized behavioral assessment of academic
skill. Journal of Psychopathology and Behavioral Assessment, 4, 361-370.
Moore, D.W. (1994). One in seven Americans are victims of child abuse. The Gallup
Poll Monthly, 18-22.
MTA Cooperative Group. (1999). 14-month randomized clinical trial of treatment
strategies for attention deficit hyperactivity disorder. Archives of General
Psychiatry, 56, 1073-1086. doi:10.1001/archpsyc.56.12.1073
Mufson, L.H., Dorta, K.P., Wickramaratne, P., Nomura, Y., Olfson, M., & Weissman,
M.N. (2004). A randomized effectiveness trial of interpersonal psychotherapy for
depressed adolescents. Archives of General Psychiatry, 61, 577-584. doi:10.1001
/archpsyc.61.6.577
Mufson, L.H., Weissman, M.M., Moreau, D., & Garfinkel, R. (1999). Efficacy of
interpersonal psychotherapy for depressed adolescents. Archives of General
Psychiatry, 56, 573-579. doi:10.1001/archpsyc.56.6.573
Muris, P., Merckelbach, H., Holdrinet, I., & Sijsenaar, M. (1998). Treating phobic
children: Effects of EMDR versus exposure. Journal of Consulting and Clinical
Psychology, 66, 193-198. doi:10.1037/0022-006X.66.1.193
Murphy, J. J. (2008). Best practices in conducting brief counseling with students. In A.
Thomas & J. Grimes (Eds.), Best practices in school psychology (5th ed., pp.
1439-1455). Bethesda, MD: National Association of School Psychologists.
National Association of School Psychologists (NASP). (2000). NASP standards for
training and field placement programs in school psychology. Retrieved from
http://www.nasponline.org/standards/FinalStandards.pdf
National Association of School Psychologists (NASP). (2006). Social/emotional
development: School-based mental health services and school psychologists.
Retrieved from http://www.nasponline.org/about_nasp/pospaper_iac.aspx
National Association of School Psychologists (NASP). (2010). Model for comprehensive
and integrated school psychological services. Retrieved from
www.nasponline.org/standards/2010standards/2_PracticeModel.pdf
National Association of Social Workers. (2008). Code of ethics of the National
Association of Social Workers. Retrieved from http://www.naswdc.org/pubs/
code/code.asp
National Center for Children and Youths with Disabilities (1998). The IDEA
amendments of 1997. News Digest, 26, 1-40.
National Center on Response to Intervention. (2011). 2011 Screening Call for
Submissions of RTI Screening Tools. Retrieved from http://www.rti4success.org
/resourceslanding
National Commission on Excellence in Education. (1983). A nation at risk: The
imperative for educational reform. Retrieved from http://teachertenure.procon.org/sourcefiles/anation-at-risk-tenure-april-1983.pdf
National Conference of State Legislatures. (2010). Compulsory Education. Retrieved
from http://www.ncsl.org/default.aspx?tabid=12943
Nauta, M.H., Scholing, A., Emmelkamp, P.M.G., & Minderaa, R.B. (2003). Cognitive-behavioral therapy for children with anxiety disorders in a clinical setting: No
additional effect of a cognitive parent training. Journal of the American Academy
of Child and Adolescent Psychiatry, 42, 1270-1278. doi:10.1097/01.chi.
0000085752.71002.93
Nelson, E.L., Barnard, M., & Cain, S. (2003). Treating childhood depression over
videoconferencing. Telemedicine Journal and e-Health, 9, 49-55. doi:10.1089/
153056203763317648
Nelson, A., & Weinbaum, E. (2009). Federal education policy and the states, 1945-2009:
A Brief Synopsis. Retrieved from http://www.archives.nysed.gov/edpolicy/alt
formats/ed_background_overview_essay.pdf
Nemade, R., Reiss, N. S., & Dombeck, M. (2007). Psychotherapy: Evidence-based
treatments for major depression. Retrieved from http://www.mentalhelp.net
/poc/view_doc.php?type=doc&id=13023&cn=5
Newcomb, M.D., Galaif, E.R., & Locke, T.F. (2001). Substance abuse diagnosis within a
community sample of adults: Distinction, comorbidity, and progression over
time. Professional Psychology: Research and Practice, 32, 239-247. doi:10.1037
/0735-7028.32.3.239
Newton, J.S., Horner, R.H., Algozzine, R.F., Todd, A.W., & Algozzine, K.M. (2009).
Using a problem-solving model to enhance data-based decision making in
schools. In W. Sailor, G. Dunlap, G. Sugai, & R.H. Horner (Eds.), Handbook of
positive behavior support (pp. 551-580). New York: Springer.
Nielsen Company. (2008, December). An overview of home internet access in the U.S.
Retrieved from http://blog.nielsen.com/nielsenwire/wp-content/uploads
/2009/03/overview-of-home-internet-access-in-the-us-jan-6.pdf
Nixon, R.D., Sweeney, L., Erickson, D.B., & Touyz, S.W. (2003). Parent-child
interaction therapy: A comparison of standard and abbreviated treatments for
oppositional defiant preschoolers. Journal of Consulting and Clinical
Psychology, 71, 251-260. doi:10.1037/0022-006X.71.2.251
No Child Left Behind Act of 2001, Pub. L. No. 107-110, 115 Stat. 1425 (2002). Retrieved
from http://www2.ed.gov/policy/elsec/leg/esea02/107-110.pdf
Ollendick, T. H., & King, N. J. (2004). Empirically supported treatments for children and
adolescents: Advances toward evidence-based practice. In P. M. Barrett & T. H.
Ollendick (Eds.) Handbook of interventions that work with children and
adolescents: Prevention and treatment (pp. 3-25). New York: John Wiley &
Sons.
Osher, D., Dwyer, K., & Jackson, S. (2004). Safe, supportive, and successful schools:
Step by step. Longmont, CO: Sopris West.
Ost, L., Svensson, L., Hellstrom, K., & Lindwall, R. (2001). One-session treatment of
specific phobia in youth: A randomized clinical trial. Journal of Consulting and
Clinical Psychology, 69, 814-824. doi:10.1037/0022-006X.69.5.814
Patterson, G.R., Reid, J.B., Jones, R.R., & Conger, R.E. (1975). A social learning
approach to family intervention: Families with aggressive children (Vol. 1).
Eugene, OR: Castalia.
Pealer, L., & Weiler, R.M. (2003). Guidelines for designing a web-delivered college
health risk behavior survey: Lessons learned from the University of Florida
behavior survey. Health Promotion Practice, 4, 171-179. doi:10.1177/
1524839902250772
Pediatric OCD Treatment Study Team. (2004). Cognitive-behavior therapy, sertraline,
and their combination for children and adolescents with obsessive-compulsive
disorder. Journal of the American Medical Association, 292, 1969-1976.
Peed, S., Roberts, M., & Forehand, R. (1977). Evaluation of the effectiveness of a
standardized parent training program in altering the interaction of mothers and
their noncompliant children. Behavior Modification, 1, 323-350. doi:10.1177
/014544557713003
Pelham, W. E., & Fabiano, G. A. (2008). Evidence-based psychosocial treatments for
attention-deficit/hyperactivity disorder. Journal of Clinical Child & Adolescent
Psychology, 37, 184-214. doi: 10.1080/15374410701818681
Pelham, W.E., Fabiano, G.A., & Massetti, G.M. (2005). Evidence-based assessment of
attention deficit hyperactivity disorder in children and adolescents. Journal of
Clinical Child and Adolescent Psychology, 34, 449-476. doi:10.1207/
s15374424jccp3403_5
Pelham, W.E., Gnagy, E.M., Greiner, A.R., Hoza, B., Hinshaw, S.P., Swanson, J.M.
(2000). Behavioral vs. behavioral and pharmacological treatment in ADHD
children attending a summer treatment program. Journal of Abnormal Child
Psychology, 28, 507-525. doi:10.1023/A:1005127030251
Pennsylvania Association for Retarded Citizens (P.A.R.C.) v. Commonwealth of
Pennsylvania, 334 F. Supp. 1257 (D.C. E.D. Pa. 1971), 343 F. Supp. 279 (D.C.
E.D. Pa. 1972).
Perpina, C., Botella, C., Banos, R., Marco, H., Alcaniz, M., & Quero, S. (1999). Body
image and virtual reality in eating disorders: Is exposure to virtual reality more
effective than the classical body image treatment? Cyberpsychology and
Behavior, 2, 149-155. doi:10.1089/cpb.1999.2.149
Pope, H.G., & Hudson, J.I. (1992). Is childhood sexual abuse a risk factor for bulimia
nervosa? American Journal of Psychiatry, 149, 455-463.
Porter, J., & Holzberg, B. C. (1978, Fall). The changing role of the school psychologist in
the Age of PL 94-142: From conducting testing to enhancing instruction.
Education of the Visually Handicapped, 71-74.
President's Commission on Mental Health. (1978). Task panel reports. Washington, DC:
U.S. Government Printing Office.
Rapee, R. M., Abbott, M. J., & Lyneham, H. J. (2006). Bibliotherapy for children with
anxiety disorders using written materials for parents: A randomized controlled
trial. Journal of Consulting and Clinical Psychology, 74, 436-444. doi:10.1037/
0022-006X.74.3.436
Redpath, D.P., Reynolds, G.L., Jaffe, A., Fisher, D.G., Edwards, J.W., & DeAugustine,
N. (2006). Internet access and use among homeless and indigent drug users in
Long Beach, California. CyberPsychology & Behavior, 9, 548-551. doi:10.1089
/cpb.2006.9.548
Rehabilitation Act of 1973, Pub. L. No. 93-112 (1973). Retrieved from
http://www.dotcr.ost.dot.gov/documents/ycr/REHABACT.HTM
Reisman, J.M. (1966). The development of clinical psychology. New York: Appleton-Century-Crofts.
Reisner, E. H. (1915). Evolution of the common school. New York: Macmillan.
Reschly, D., Tilly, W. D., & Grimes, J. (2000). Special education in transition:
Functional assessment and noncategorical programming. Longmont, CO: Sopris
West.
Reschly, D. J., & Connolly, L. M. (1990). Comparisons of school psychologists in the
city and country: Is there a rural school psychology? School Psychology
Review, 19, 534-549.
Reschly, D. J., & Wilson, M. S. (1995). School psychology practitioners and faculty:
1986 to 1991-92 trends in demographics, roles, satisfaction, and system reform.
School Psychology Review, 24, 62-80.
Reynolds, C. R., Gutkin, T., Elliot, S. N., & Witt, J. C. (1984). School psychology:
Essentials of theory and practice. New York: John Wiley.
Reynolds, W.M., & Coats, K. (1986). A comparison of cognitive-behavioral therapy and
relaxation training for the treatment of depression in adolescents. Journal of
Consulting and Clinical Psychology, 54, 653-660. doi:10.1037/0022-006X.54.5.653
Ringel, J., & Sturm, R. (2001). National estimates of mental health utilization and
expenditure for children in 1998. Journal of Behavioral Health Services &
Research, 28, 319-332. doi:10.1007/BF02287247
Roberts, C., Kane, R., Thomson, H., Bishop, B., & Hart, B. (2003). The prevention of
depressive symptoms in rural school children: A randomized controlled trial.
Journal of Consulting and Clinical Psychology, 71, 622-628. doi:10.1037/0022-006X.71.3.622
Robinson, T., Smith, S.W., & Miller, M. (2002). Effect of a cognitive-behavioral
intervention on responses to anger by middle school students with chronic
behavior problems. Behavioral Disorders, 27, 256-271.
Rogers, S. J., & Vismara, L. A. (2008). Evidence-based comprehensive treatments for
early autism. Journal of Clinical Child & Adolescent Psychology, 37, 8-38.
doi: 10.1080/15374410701817808
Rohde, P., Clarke, G., Mace, D.E., Jorgensen, J.S., & Seeley, J.R. (2004). An efficacy/
effectiveness study of cognitive-behavioral treatment for adolescents with
comorbid major depression and conduct disorder. Journal of the American
Academy of Child and Adolescent Psychiatry, 43, 660-668. doi:10.1097/01.chi.
0000121067.29744.41
Root, R.W., & Resnick, R.J. (2003). An update on the diagnosis and treatment of
attention-deficit/hyperactivity disorder in children. Professional Psychology:
Research and Practice, 34, 34-41. doi:10.1037/0735-7028.34.1.34
Rossello, J., & Bernal, G. (1999). Adapting cognitive-behavioral and interpersonal
treatments for depressed Puerto Rican adolescents. In E.D. Hibbs & P. Jensen
(Eds.), Psychosocial treatments for child and adolescent disorders (pp. 157-185).
Washington, DC: American Psychological Association.
Russell, G.F.M., Szmukler, G.I., Dare, C., & Eisler, I. (1987). An evaluation of family
therapy in anorexia nervosa and bulimia nervosa. Archives of General
Psychiatry, 44, 1047-1056.
Sackett, D. L., Rosenberg, W. M., Gray, J. A., Haynes, R. B., & Richardson, W. S.
(1996). Evidence based medicine: What it is and what it isn't. British Medical
Journal, 312, 71-72.
Sanders, M.R., Markie-Dadds, C., Tully, L.A., & Bor, W. (2000). The Triple P- Positive
Parenting Program: A comparison of enhanced, standard, and self-directed
behavioral family intervention for parents of children with early onset conduct
problems. Journal of Consulting and Clinical Psychology, 68, 624-640. doi:10.
1037/0022-006X.68.4.624
Sandoval, J. (1993). The history of interventions in school psychology. Journal of School
Psychology, 31, 195-217.
Santisteban, D.A., Coatsworth, D.J., Perez-Vidal, A., Kurtines, W.M., Schwartz, S.J.,
LaPerriere, A. (2003). Efficacy of brief strategic family therapy in modifying
Hispanic adolescent behavior problems and substance abuse. Journal of Family
Psychology, 17, 121-133.
Sax, L.J., Gilmartin, S.K., & Bryant, A.N. (2003). Assessing response rates and
nonresponse bias in web and paper surveys. Research in Higher Education, 44,
409-432. doi:10.1023/A:1024232915870
Schmidt, U., Lee, S., Beecham, J., Perkins, S., Treasure, J., Yi, I. (2007). A
randomized controlled trial of family therapy and cognitive behavioral therapy
guided self-care for adolescents with bulimia nervosa and related disorders.
American Journal of Psychiatry, 164, 591-598. doi:10.1176/appi.ajp.164.4.591
School of Public Health and Health Professions, University at Buffalo. (2005). Special
education laws. Retrieved from http://atto.buffalo.edu/registered/ATBasics/
Foundation/Laws/specialed.php
Schuhmann, E.M., Foote, R.C., Eyberg, S.M., Boggs, S.R., & Algina, J. (1998). Efficacy
of parent-child interaction therapy: Interim report of a randomized trial with
short-term maintenance. Journal of Clinical Child Psychology, 27, 34-45. doi:
10.1207/s15374424jccp2701_4
Schwanz, K.A., & Barbour, B. (2005). Problem-solving teams: Information for educators
and parents. In A.S. Canter, C.Z. Page, A.D. Roth, I. Romero, & A. Carroll
(Eds.), Helping children at home and school: Handouts for families and
educators (2nd ed., pp. 133-136). Bethesda, MD: National Association of School
Psychologists.
Severson, H.H., Walker, H.M., Hope-Doolittle, J., Kratochwill, T.R., & Gresham, F.M.
(2007). Proactive, early screening to detect behaviorally at-risk students: Issues,
approaches, emerging innovations, and professional practices. Journal of School
Psychology, 45, 193-223. doi:10.1016/j.jsp.2006.11.003
Shinn, M.R. (1989). Curriculum-based measurement: Assessing special children. New
York: Guilford.
Shinn, M.R. (2007). Identifying students at risk, monitoring performance, and
determining eligibility within response to intervention: Research on educational
need and benefit from academic intervention. School Psychology Review, 36,
601-617.
Shinn, M.R. (2008). Best practices in curriculum-based measurement in a problem-solving model. In A. Thomas & J. Grimes (Eds.), Best practices in school
psychology (5th ed., pp. 243-261). Bethesda, MD: National Association of School
Psychologists.
Silverman, W. K., & Hinshaw, S. P. (2008). The second special issue on evidence-based
psychosocial treatments for children and adolescents: A 10-year update. Journal
of Clinical Child & Adolescent Psychology, 37, 1-7. doi:10.1080/
15374410701817725
Silverman, W.K., Kurtines, W.M., Ginsburg, G.S., Weems, C.F., Lumpkin, P.W., &
Carmichael, D.H. (1999). Treating anxiety disorders in children with group
cognitive-behavioral therapy: A randomized clinical trial. Journal of Consulting
and Clinical Psychology, 67, 995-1003. doi:10.1037/0022-006X.67.6.995
Silverman, W.K., Kurtines, W.M., Ginsburg, G.S., Weems, C.F., Rabian, B., & Serafini,
L.T. (1999). Contingency management, self-control, and education support in the
treatment of childhood phobic disorders: A randomized clinical trial. Journal of
Consulting and Clinical Psychology, 67, 675-687. doi:10.1037/0022-006X.67.5.
675
Silverman, W. K., Pina, A. A., & Viswesvaran, C. (2008). Evidence-based psychosocial
treatments for phobic and anxiety disorders in children and adolescents. Journal
of Clinical Child and Adolescent Psychology, 37, 105-130. doi: 10.1080/
15374410701817907
Smith, D. K. (1984). Practicing school psychologists: Their characteristics, activities, and
populations served. Professional Psychology: Research and Practice, 15, 798-810.
doi:10.1037/0735-7028.15.6.798
Smith, T., Lovaas, N.W., & Lovaas, O.I. (2002). Behaviors of children with high-functioning autism when paired with typically developing versus delayed peers:
A preliminary study. Behavioral Interventions, 17, 129-143. doi:10.1002/bin.114
Sonuga-Barke, E.J.S., Daley, D., Thompson, M., Laver-Bradbury, C., & Weeks, A.
(2001). Parent-based therapies for preschool attention-deficit/hyperactivity
disorder: A randomized, controlled trial with a community sample. Journal of
the American Academy of Child and Adolescent Psychiatry, 40, 402-408. doi:
10.1097/00004583-200104000-00008
Spence, S.H., Donovan, C., & Brechman-Toussaint, M. (2000). The treatment of
childhood social phobia: The effectiveness of a social skills training-based,
cognitive-behavioral intervention, with and without parental involvement.
Journal of Child Psychology and Psychiatry, 41, 713-726. doi:10.1111/1469-7610.00659
Spence, S.H., Holmes, J.M., March, S., & Lipp, O.V. (2006). The feasibility and outcome
of clinic plus internet delivery of cognitive-behavior therapy for childhood
anxiety. Journal of Consulting and Clinical Psychology, 74, 614-621. doi:
10.1037/0022-006X.74.3.614
Stark, K.D., Reynolds, W.M., & Kaslow, N.J. (1987). A comparison of the relative
efficacy of self-control therapy and behavior problem-solving therapy for
depression in children. Journal of Abnormal Child Psychology, 15, 91-113.
doi:10.1007/BF00916468
Stark, K.D., Rouse, L., & Livingston, R. (1991). Treatment of depression during
childhood and adolescence: Cognitive behavioral procedures for the individual
and family. In P. Kendall (Ed.), Child and adolescent therapy (pp. 165-206).
New York: Guilford.
Steege, M. W., & Wacker, D. P. (1995). Best practices in evaluating the effectiveness of
applied interventions. In A. Thomas & J. Grimes (Eds.), Best practices in school
psychology (3rd ed., pp. 625-636). Washington, DC: National Association of
School Psychologists.
Stein, B.D., Jaycox, L.H., Kataoka, S.H., Wong, M., Tu, W., Elliot, M.N. (2003).
A mental health intervention for school children exposed to violence. Journal of
the American Medical Association, 290, 603-611. doi:10.1001/jama.290.5.603
Stewner-Manzanares, G. (1988). The Bilingual Education Act: Twenty years later. New
Focus: The National Clearinghouse for Bilingual Education, 6, 1-10. Retrieved
from http://www.ncela.gwu.edu/files/rcd/BE021037/Fall88_6.pdf
Suldo, S., Friedrich, A., & Michalowski, J. (2010). Personal and systems-level factors
that limit and facilitate school psychologists' involvement in school-based mental
health services. Psychology in the Schools, 47(4), 354-373.
Sulzer-Azaroff, B., & Mayer, G.R. (1991). Behavior analysis for lasting change.
Chicago: Holt, Rinehart, & Winston.
Talley, R. C., & Short, R. J. (1996). Social reforms and the future of school practice:
Implications for American psychology. Professional Psychology: Research and
Practice, 27(1), 5-13.
Telzrow, C.F. (1995). Best practices in facilitating intervention adherence. In A. Thomas
& J. Grimes (Eds.), Best practices in school psychology (3rd ed., pp. 501-518).
Washington, DC: National Association of School Psychologists.
Thienemann, M., Moore, P., & Tompkins, K. (2006). A parent-only group intervention
for children with anxiety disorders: Pilot study. Journal of the American
Academy of Child and Adolescent Psychiatry, 45, 37-46. doi:10.1097
/01.chi.0000186404.90217.02
Tilly, W.D. (2008). The evolution of school psychology to science-based practice:
Problem solving and the three tiered model. In A. Thomas & J. Grimes (Eds.),
Best practices in school psychology (5th ed., pp. 17-35). Bethesda, MD: National
Association of School Psychologists.
Tilly, W.D., III, & Flugum, K.R. (1995). Best practices in ensuring quality interventions.
In A. Thomas & J. Grimes (Eds.), Best practices in school psychology (3rd ed.,
pp. 485-500). Washington, DC: National Association of School Psychologists.
Tolan, P.H., & Dodge, K.A. (2005). Children's mental health as a primary care and
concern: A system for comprehensive support and service. American
Psychologist, 60, 601-614. doi:10.1037/0003-066X.60.6.601
Tourangeau, R., Couper, M., & Conrad, F. (2004). Spacing, position, and order:
Interpretive heuristics for visual features of survey questions. Public Opinion
Quarterly, 68, 368-393. doi:10.1093/poq/nfh035
Treatment for Adolescents with Depression Study (TADS) Team. (2004). Fluoxetine,
cognitive-behavioral therapy, and their combination for adolescents with
depression: Treatment for Adolescents with Depression Study (TADS)
randomized controlled trial. Journal of the American Medical Association, 292,
807-820. doi:10.1001/jama.292.7.807
Treptow, M.A., Burns, M.K., & McComas, J.J. (2006). Reading at the frustration,
instructional, and independent levels: Effects on student time on task and
comprehension. School Psychology Review, 36, 159-166.
Upah, K. R. F. (2008). Best practices in designing, implementing, and evaluating quality
interventions. In A. Thomas & J. Grimes (Eds.), Best practices in school
psychology (5th ed., pp.209-220). Bethesda, MD: National Association of School
Psychologists.
U.S. Department of Education. (2003). Proven methods: Questions and answers on No
Child Left Behind. Retrieved from http://www2.ed.gov/nclb/methods/whatworks/
doing.html
U.S. Department of Education. (2009). Race to the Top program executive summary.
Retrieved from http://www2.ed.gov/programs/racetothetop/executivesummary.pdf
U. S. Department of Education (2010). An overview of the U. S. Department of
Education. Retrieved from http://www2.ed.gov/about/overview/focus/what.html
U.S. Department of Education, Institute of Education Sciences. (2011). National
Assessment of Educational Progress (NAEP). Retrieved from http://nces.ed.gov/
nationsreportcard/about/
U.S. Department of Education, Office of Planning, Evaluation and Policy Development.
(2010). ESEA Blueprint for reform. Retrieved from http://www2.ed.gov/policy/
Elsec/leg/blueprint/blueprint.pdf
U. S. Department of Health and Human Services. (1999). Mental health: A report of the
Surgeon General. Rockville, MD: Author.
U.S. National Archives and Records Administration. (2011). Records of the National
Institute of Education [NIE]. Retrieved from http://www.archives.gov/research/
guide-fed-records/groups/419.html
University of the State of New York. Regulations concerning qualifications for
certification of school psychologists. Albany, NY: Author.
VanDerHeyden, A.M., & Burns, M.K. (2010). Essentials of response to intervention.
Hoboken, NJ: John Wiley & Sons.
Van Selm, M., & Jankowski, N.W. (2006). Conducting online surveys. Quality and
Quantity, 40, 435-456. doi:10.1007/s11135-005-8081-8
Van Sickle, J. H., Witmer, L., & Ayers, L. P. (1911). Provisions for exceptional children
in public schools. Washington, DC: U. S. Government Printing Office.
Vaux, A., & Briggs, C.S. (2006). Conducting mail and internet surveys. In F.T.L. Leong
& J.T. Austin (Eds.), The psychology research handbook (2nd ed., pp. 186-209).
Thousand Oaks, CA: Sage Publications.
Velleman, R., & Aris, A. (2010). Counseling and helping (2nd ed.). West Sussex, UK:
BPS Blackwell.
Volpe, R.J., & Gadow, K.D. (2010). Creating abbreviated rating scales to monitor
classroom inattention-overactivity, aggression, and peer conflict: Reliability,
validity, and treatment sensitivity. School Psychology Review, 39(3), 350-363.
Waldron, H.B., Hops, H., Brody, J., Turner, C.W., Davis, B., & Barrera, M. (2007).
Treatments for Hispanic and Anglo drug-abusing youth. Unpublished manuscript,
Oregon Research Institute.
Waldron, H.B., Ozechowski, T.J., Turner, C.W., & Brody, J. (2005, March). Treatment
outcomes for youth with problem alcohol use. Paper presented at the 2005 Joint
Meeting on Adolescent Effectiveness, Washington, DC.
Waldron, H.B., Slesnick, N., Brody, J.L., Turner, C.W., & Peterson, T.R. (2001).
Treatment outcomes for adolescent substance abuse at 4- and 7-month
assessments. Journal of Consulting and Clinical Psychology, 69, 802-813.
doi:10.1037/0022-006X.69.5.802
Waldron, H. B., & Turner, C. W. (2008). Evidence-based psychosocial treatments for
adolescent substance abuse. Journal of Clinical Child and Adolescent Psychology,
37, 238-261. doi:10.1080/15374410701820133
Walker, H. M., Horner, R. H., Sugai, G., Bullis, M., Sprague, J., & Bricker, D.
(1996). Integrated approaches to preventing antisocial behavior patterns among
school-age children and youth. Journal of Emotional and Behavioral Disorders,
4, 194-209. doi:10.1177/106342669600400401
Walker, H.M., Kavanagh, K., Stiller, B., Golly, A., Seversen, H.H., & Feil, E.G. (1998).
First step to success: An early intervention approach for preventing school
antisocial behavior. Journal of Emotional and Behavioral Disorders, 6, 66-80.
doi:10.1177/106342669800600201
Wallin, J. E. W. (1914). The mental health of the school child. New Haven, CT: Yale
University Press.
Wallin, J. E. W., & Ferguson, D.G. (1967). The development of school psychological
services. In J. F. Magary (Ed.), School psychological services in theory and
practice: A handbook (pp. 1-29). Englewood Cliffs, NJ: Prentice-Hall.
Wang, M. C., Haertel, G. D., & Walberg, H. J. (1990). What influences learning? A
content analysis of review literature. Journal of Educational Research, 84, 30-43.
Washington Research Project & NAACP Legal Defense and Educational Fund. (1969).
Title I of ESEA: Is It Helping Poor Children? (ERIC No. EDO 36600). Retrieved
from http://www.eric.ed.gov/PDFS/ED036600.pdf
Webster-Stratton, C., & Hammond, M. (1997). Treating children with early onset
conduct problems: A comparison of child and parent training interventions.
Journal of Consulting and Clinical Psychology, 65, 93-109. doi:10.1037/
0022-006X.65.1.93
Webster-Stratton, C., Reid, M., & Hammond, M. (2001). Social skills and
problem-solving training for children with early-onset conduct problems: Who
benefits? Journal of Child Psychology and Psychiatry, 42, 943-952.
doi:10.1111/1469-7610.00790
Webster-Stratton, C., Reid, M., & Hammond, M. (2004). Treating children with
early-onset conduct problems: Intervention outcomes for parent, child, and teacher
training. Journal of Clinical Child & Adolescent Psychology, 33, 105-124.
doi:10.1207/S15374424JCCP3301_11
Weiss, B., Harris, V., Catron, T., & Han, S. (2003). Efficacy of RECAP intervention
program for children with concurrent internalizing and externalizing problems.
Journal of Consulting and Clinical Psychology, 71, 364-374.
doi:10.1037/0022-006X.71.2.364
Weisz, J., Donenberg, G., Han, S., & Weiss, B. (1995). Bridging the gap between lab and
clinic in child and adolescent psychotherapy. Journal of Consulting and Clinical
Psychology, 63, 688-701. doi:10.1037//0022-006X.63.5.688
Weisz, J.R., Thurber, C., Sweeney, L., Profitt, V., & LeGagnoux, G. (1997). Brief
treatment of mild to moderate child depression using primary and secondary
control enhancement training. Journal of Consulting and Clinical Psychology,
65, 703-707. doi:10.1037/0022-006X.65.4.703
Wells, K.C., & Egan, J. (1988). Social learning and systems family therapy for childhood
oppositional disorder: Comparative treatment outcome. Comprehensive
Psychiatry, 29, 138-146. doi:10.1016/0010-440X(88)90006-5
West, A.E., Jacobs, R.H., Westerholm, R., Lee, A., Carbray, J., Heidenreich, J., & Pavuluri,
M.N. (2009). Child and family-focused cognitive-behavioral therapy for pediatric
bipolar disorder: Pilot study of group treatment format. Journal of the Canadian
Academy of Child and Adolescent Psychiatry, 18(3), 239-246.
Williams, S. (2010). RTI: Practical strategies for school psychologists (grades K-12).
Medina, WA: Institute for Educational Development.
Witmer, L. (1897). The organization of practical work in psychology. Psychological
Review, 4, 116-117.
Wolf, M.M. (1978). Social validity: The case for subjective measurement, or how applied
behavior analysis is finding its heart. Journal of Applied Behavior Analysis, 11,
203-214. doi:10.1901/jaba.1978.11-203
Wood, A., Harrington, R., & Moore, A. (1996). Controlled trial of a brief
cognitive-behavioral intervention in adolescent patients with depressive disorders. Journal
of Child Psychology and Psychiatry, 37, 737-746. doi:10.1111/j.1469-7610.1996
.tb01466.x
Wood, J.J., Piacentini, J.C., Southam-Gerow, M., Chu, B., & Sigman, M. (2006). Family
cognitive behavioral therapy for child anxiety disorders. Journal of the
American Academy of Child and Adolescent Psychiatry, 45, 314-321. doi:10.1097
/01.chi.0000196425.88341.b0
Wright, P. W. D., & Wright, P. D. (2009). Special education law (2nd ed.). Hartfield, VA:
Harbor House Law Press.
Yates, M. A. (2003). A survey of the counseling practices of school psychologists.
(Unpublished doctoral dissertation). University at Albany, State University of
New York, Albany, NY.
Yeaton, W.H., & Sechrest, L. (1981). Critical dimensions in the choice and maintenance
of successful treatments: Strength, integrity, and effectiveness. Journal of
Consulting and Clinical Psychology, 49, 156-167. doi:10.1037/0022-006X.49.2.156
Young, M. E., & Fristad, M. A. (2007). Evidence-based treatments for bipolar disorder in
children and adolescents. Journal of Contemporary Psychotherapy, 37, 157-164.
doi:10.1007/s10879-007-9050-4
Yu, D.L., & Seligman, M.E.P. (2002). Preventing depressive symptoms in Chinese
children. Prevention and Treatment, 5, Article 9.


APPENDIX A: SCHOOL PSYCHOLOGIST SURVEY


Survey of School Psychologists' Counseling Training and Practice
(Reformatted to Align With APA Style)
Instructions: Please complete the following questions to the best of your ability.
1) Do you provide group and/or individual counseling services?
o I provide only group counseling.
o I provide only individual counseling.
o I provide both group and individual counseling.
2) What group(s) of children do you serve in your counseling practice?
o Special education students
o General education students
o Both special and general education students
3) On average, how many students do you recommend discontinuing counseling
services for each year?
o 0
o 1
o 2
o 3
o 4
o 5
o 6
o 7
o >7
4) What is the most common reason for you to recommend discontinuing counseling
services for a student?
o Individual counseling goals have been met
o Counseling does not appear to have a positive effect on the student's
behavior
o Student leaves the school or district
o Parents prefer that counseling be discontinued
o Other (please specify)
_____________________________
5) How many years have you been a school psychologist employed in a school
setting?
o 0-5
o 6-10
o >10
6) What are the grade levels of the students you predominantly work with (please
check all that apply)?
o Elementary school students
o Middle/Junior high school students
o High school students
7) Please estimate the percentage of time you spend each year on the following
activities.

__________ Assessment
__________ Direct Interventions
__________ Consultation and Indirect services
__________ Research
__________ Administration
__________ Systems-level activities
__________ Other
8) In what type of school do you primarily work?
o Rural
o Suburban
o Urban
o Mixed
o Other (please specify)
______________________________
9) What is the psychologist:student ratio at your school district?
o 1:<500
o 1:500-999
o 1:1000-1499
o 1:1500-2000
o 1:>2000
10) What region do you work in?
o Northeast (Connecticut, Maine, Massachusetts, New Hampshire, Rhode
Island, Vermont, New Jersey, New York, Pennsylvania)
o Midwest (Indiana, Illinois, Michigan, Ohio, Wisconsin, Iowa, Nebraska,
Kansas, North Dakota, Minnesota, South Dakota, Missouri)
o South (Delaware, District of Columbia, Florida, Georgia, Maryland, North
Carolina, South Carolina, Virginia, West Virginia, Alabama, Kentucky,
Mississippi, Tennessee, Arkansas, Louisiana, Oklahoma, Texas)
o West (Arizona, Colorado, Idaho, New Mexico, Montana, Utah, Nevada,
Wyoming, Alaska, California, Hawaii, Oregon, Washington)
11) Have you attended any continuing education programs over the past 5 years that
were specifically focused on (check all that apply):
o The No Child Left Behind Act
o Accountability for student academic and behavioral outcomes
o The provision of counseling services
o Evidence-based behavioral interventions
o Data-based decision making
o Response to Intervention
o Other (please specify)
______________________________
12) How many years has it been since you received your last degree?
o 0-5
o 6-10
o >10
13) Was your graduate program accredited by (check all that apply):
o NASP

o APA
o NCATE
o Your state
o Not accredited
14) What is the highest degree that you have earned?
o MA/MS
o Certificate/Specialist
o PhD/PsyD/EdD
o Other (please specify)
______________________________
15) Did your graduate academic training include specific coursework in the following
areas (please check all that apply):
o Academic interventions
o Behavioral interventions
o Counseling and Psychotherapy with children
o Counseling children with developmental disabilities
o Group counseling
o Multicultural counseling
16) When planning, implementing, or updating a counseling intervention, do you use
any print or online resources (e.g., IEP Pro, IEP Direct) to help you write goals,
clarify the problem, or determine expectations for the student?
o Yes
o No
17) When planning for a counseling intervention, do you come up with a behavioral
definition of the problem?
o Yes
o No
When coming up with a definition of the behavior to be addressed in counseling,
what factors do you include? (Response options for each item: Yes / No)
18) Action verbs describing what the student does in observable terms
19) Frequency (the number of times the behavior occurs during an observation
period)
20) Latency (how much time passes between the presentation of a stimulus and
the student's response or behavior)
21) Intensity (the strength or force with which the behavior is displayed)
22) Topography (the configuration, shape, or form of the behavior)
23) Accuracy (a measure of how the student's behavior is correct or fits a
standard)
24) Duration (how much time passes between the onset and the ending of a
behavior)

25) When planning for a counseling intervention, do you collect baseline data before
beginning the intervention?
o Yes
o No
When you collect baseline data, how often do you use each of the following
techniques? (Response options for each item: Never / Sometimes / Always)
26) Direct behavioral observation
27) 3rd party behavior rating (from parent, teacher, or related service provider)
28) Sociometric techniques
29) 3rd party interview
30) Objective self-report
31) Projective-expressive technique
32) On average, how many baseline data points do you collect in order to establish a
stable pattern of the student's behavior?
o 1
o 2
o 3
o 4
o 5
o 6
o 7 or greater
33) When planning for a counseling intervention, do you validate the problem
behavior by comparing the identified student with a peer or a standard of
performance?
o Yes
o No
34) When planning for a counseling intervention, do you analyze the problem
behavior by developing and testing hypotheses related to its function?
o Yes
o No
35) When planning for a counseling intervention, do you set one or more behavioral
goals using clear and measurable criteria, defining what the student will be able to
do if the intervention is effective?
o Yes
o No
When setting a behavioral goal for a counseling intervention, how often do you
use each of the following components? (Response options for each item: Never /
Sometimes / Always)
36) Timeframe (when the expected progress will be made in terms of days, weeks,
and months)
37) Condition (the specific circumstances in which the behavior will occur)
38) Behavior (written in objective, observable, and measurable terms describing
what the student will be able to do)
39) Criteria (a standard for how well the behavior is to be performed)
40) When planning for a counseling intervention, do you come up with an
intervention plan?
o Yes
o No
When writing a counseling intervention plan, how often do you use each of the
following components? (Response options for each item: Never / Sometimes /
Always)
41) A clear description of the procedures to be used
42) Documentation that the strategies to be used have been empirically validated
in the literature on evidence-based interventions
43) A description of the specific steps and activities that will be engaged in
during counseling sessions
44) A description of how each step or activity will be completed
45) The materials needed for each step or activity
46) A description of what each person engaged in the activity will do
47) The location where the intervention is to take place

48) When planning for a counseling intervention, do you come up with a plan to
measure the problem behavior?
o Yes
o No
When coming up with a plan for measuring the target behavior, how often do you
include each of the following components? (Response options for each item:
Never / Sometimes / Always)
49) A behavioral definition of the target behavior
50) A clear description of where the behavior will be measured
51) A clear description of when the behavior will be measured
52) A clear delineation of who will measure the behavior
53) A description of the recording method most appropriate for the behavior
54) A description of the most appropriate recording measure

55) When planning for an intervention, do you come up with a decision-making plan
for determining how behavioral data on the student will be collected and
interpreted?
o Yes
o No
When developing a decision-making plan, how often do you use each of the
following components? (Response options for each item: Never / Sometimes /
Always)
56) A determination of the frequency of behavioral measurements and data to be
collected
57) A decision on how the data will be summarized for the purposes of
intervention evaluation (e.g., visual presentation, written report or summary)
58) A determination of how many behavioral data points will be collected before
the intervention data will be analyzed
59) A determination of how much time will pass before the intervention data will
be analyzed
60) A set of decision rules for responding to specific data points

61) During implementation of a counseling intervention, do you collect progress
monitoring data on the student's behavior?
o Yes
o No
When you collect progress monitoring data, how often do you use each of the
following techniques? (Response options for each item: Never / Sometimes /
Always)
62) Direct behavioral observation
63) 3rd party behavior rating scales (from parent, teacher, or related service
provider)
64) Sociometric techniques
65) Interviews
66) Objective self-report measures
67) Projective-expressive techniques
68) On average, how many progress monitoring data points do you collect to
establish a stable pattern of the student's behavior?
o 1
o 2
o 3
o 4
o 5
o 6
o 7
o 8
o >8
69) Do you use the same method for collecting baseline data points as you do for
collecting progress monitoring data points?
o Yes
o Sometimes, depending on the situation
o No
70) During the implementation of a counseling intervention, do you engage in any
formative assessment of the student's behavior?
o Yes
o No
When you engage in formative assessment of the student's behavior, what sources
of data do you consider? (Response options for each item: Yes / No)
71) The level of the behavior (how much the behavior is occurring during
baseline and intervention phases as judged by repeated, objective
measurements of its frequency, duration, intensity, or the percentage of
intervals in which it occurs)
72) The trend of the behavior (whether the level of the behavior is increasing or
decreasing within the baseline and intervention phases)
73) Anecdotal information from the student, his/her family, teachers, or related
service providers
74) Your own subjective assessment of the student's behavior
75) Data documenting the student's performance in school (e.g., work samples,
grades, attendance records, behavioral referrals)
76) Do you measure the treatment integrity with which you implement counseling
interventions?
o Yes
o Sometimes, depending on the situation
o No
What methods do you use to measure the treatment integrity of the counseling
interventions you implement? (Response options for each item: Yes / No)
77) Self-report
78) Logs documenting sessions
79) Checklists for intervention components
80) Permanent products of student work
81) Direct observation by a 3rd party not directly involved in implementation


82) At the end of a counseling intervention, do you engage in any summative
assessment activities of the student's behavior and the effectiveness of the
intervention?
o Yes
o No
When you engage in summative assessment of the student's behavior, what
sources of data do you consider? (Response options for each item: Yes / No)
83) The level of the behavior (how much the behavior is occurring during
baseline and intervention phases as judged by repeated, objective
measurements of its frequency, duration, intensity, or the percentage of
intervals in which it occurs)
84) The trend of the behavior (whether the level of the behavior is increasing or
decreasing within the baseline and intervention phases)
85) Anecdotal information from the student, his/her family, teachers, or related
service providers
86) Your own subjective assessment of the student's behavior
87) Data documenting the student's performance in school (e.g., work samples,
grades, attendance records, behavioral referrals)
88) Thank you for taking the time to complete this survey! If you have any feedback
or comments related to this experience that you would like to share with the
researchers, please feel free to enter them here.

Thank you!
If you would like to have your name entered into a raffle for one of two $50.00 gift
certificates, please click this link: https://www.psychdata.com/s.asp?SID=143237. You
will then be prompted to provide your name and email address. The information you
provided on the survey will in no way be connected with your contact information.
Winners will be chosen at random and notified once all data have been collected.
If you have any further questions, please feel free to contact me or my research advisor.
Rebecca Cole, M.S.
Doctoral Candidate
School Psychology
rmcole@albany.edu
Deborah Kundert
Dissertation Chair
School Psychology
dkundert@albany.edu


APPENDIX B: COVER LETTER EMAIL


Dear School Psychologist Colleagues:
Meeting the academic needs and the behavioral needs of students in schools are two
traditional roles that school psychologists of the past and present have addressed in their
practice.
Counseling as a direct intervention is one method that school psychologists have used to
address emotional and behavioral challenges faced by students. Current school
professionals find themselves in a climate where accountability for positive student
outcomes has been made a priority. Many school psychologists are familiar with the
problem-solving model, the Response to Intervention (RTI) movement, and with more
specific aspects of these paradigms related to repeated measures of student performance,
and data-based decision making.
At the end of this email, you will find a link to a survey that has been designed to gather
information on the counseling practices of school psychologists. In addition to
describing current counseling practices, it will also provide information on the
availability of counseling services based on current research and best practices.
This survey has been emailed to a random sample of school psychologists in the US who
are listed in the directory of Nationally Certified School Psychologists. Participation in
this study is strictly voluntary, and entails no known risks or discomforts. You are free to
skip any questions you do not wish to answer, and you may withdraw your responses at
any time. Completing this survey indicates your consent to participate in this study. All
responses are anonymous, and specific identifying information will not be collected.
Only group responses will be reported (e.g., type of school, rural, suburban, urban). As a
token of our appreciation, at the end of the survey, you will be directed to a separate link,
where you can provide identifying information for the purpose of entering a raffle for one
of two $50.00 gift certificates. If you have any questions concerning your rights as a
subject, you may contact the Office of Regulatory Research Compliance at (518) 442-9050,
or at orrc@uamail.albany.edu.
It is estimated that completion of this survey will require approximately 20 minutes of
your time. You can access the survey by clicking here. Thank you in advance for your
time and cooperation.
Sincerely,
Rebecca Cole
Doctoral Student
School Psychology
rc2398@albany.edu

Deborah Kundert, PhD


Dissertation Chairperson
School Psychology
Dkundert@albany.edu


APPENDIX C: FOLLOW-UP EMAIL


Date
Dear School Psychologist Colleague,
You recently received an email requesting your participation in a survey related to the
current counseling practices of school psychologists. If you have already responded to
this survey, we appreciate your time and input.
If you have not yet completed this survey, please do so as soon as possible. As
researchers, we are interested in obtaining a complete picture of the counseling practices
of school psychologists, and this is not possible without your responses!
You can access the survey by clicking here.
Thank you again for your time and cooperation!
Sincerely,
Rebecca Cole
Doctoral Student
School Psychology
rc2398@albany.edu

Deborah Kundert, PhD


Dissertation Chairperson
School Psychology
Dkundert@albany.edu


APPENDIX D: PILOT SURVEY FEEDBACK QUESTIONS


Dear School Psychologist,
Thank you for taking the time to complete a pilot version of my dissertation survey. Here
are several questions related to this survey. Your responses will help me to make any
necessary revisions and improvements. If you could provide your responses to me in an
email, I would appreciate it.
1) How long did it take you to complete this survey?
2) Did you find any questions to be confusing, or difficult to understand? If so,
which ones?
3) Did you understand all of the terms used?
4) Were there any questions that you thought were unnecessary or irrelevant? If so,
which ones?
5) Do you have any recommendations for making this survey a more positive
experience to complete?
Thank you again for your time and assistance.
Sincerely,
Rebecca Cole
Doctoral Candidate
School Psychology

227

8)

228

2) Approximately how
many students do
school psychologists
recommend
declassifying from
counseling each year,
and what reasons are
most commonly cited
when making this
recommendation?

228

3) On average, how many students do you recommend discontinuing


counseling services for each year?
o 0
o 1
o 2
o 3
o 4
o 5
o 6
o 7
o >7
4) What is the most common reason for you to recommend discontinuing
counseling services for a student?
o Individual counseling goals have been met
o Counseling does not appear to have a positive effect on the
students behavior
o Student leaves the school or district

General Counseling Practices of School Psychologists


Research Question
Corresponding Survey Question(s)
1) What percentage of
1) Do you provide group and/or individual counseling services?
school psychologists
o I provide only group counseling.
provide group and/or
o I provide only individual counseling
individual
o I provide both group and individual counseling.
counseling? Are
2) What group(s) of children do you serve in your counseling practice?
school psychologists
o Special education students
counseling general or
o General education students
special education
o Both special and general education students
students, or both?

APPENDIX E: DATA ANALYSIS

Percentages

Frequencies

Data Analyses
Percentages

229
9)

8)

3) Are there any


5)
demographic
differences (e.g.,
training, professional
development, years of
experience, other
6)
roles and
responsibilities)
related to school
psychologists group
and individual
7)
counseling practices?

Research Question

229

Corresponding Survey Question(s)


o Parents prefer that counseling be discontinued
o Other (please specify)
How many years have you been a school psychologist employed in a
school setting?
o 0-5
o 6-10
o >10
What are the grade levels of the students you serve (please check all
that apply)?
o Elementary school students
o Middle/Junior high school students
o High school students
Please estimate the percentage of time you spend each year on the
following activities.
__________ Assessment
__________ Direct Interventions
__________ Consultation and Indirect services
__________ Research
__________ Administration
__________ Systems-level activities
__________ Other
In what type of school do you primarily work?
o Rural
o Suburban
o Urban
o Mixed
o Other (please specify)
What is the psychologist:student ratio at your school district?
o 1:<500
o 1:500-999
Percentages,
Chi Square
Analyses

Data Analyses

230

Research Question

230

Corresponding Survey Question(s)


Data Analyses
o 1:1000-1499
o 1:1500-2000
o 1:>2000
10) What region do you work in?
o Northeast (Connecticut, Maine, Massachusetts, New
Hampshire, Rhode Island, Vermont, New Jersey, New York,
Pennsylvania)
o Midwest (Indiana, Illinois, Michigan, Ohio, Wisconsin, Iowa,
Nebraska, Kansas, North Dakota, Minnesota, South Dakota,
Missouri)
o South (Delaware, District of Columbia, Florida, Georgia,
Maryland, North Carolina, South Carolina, Virginia, West
Virginia, Alabama, Kentucky, Mississippi, Tennessee,
Arkansas, Louisiana, Oklahoma, Texas)
o West (Arizona, Colorado, Idaho, New Mexico, Montana, Utah,
Nevada, Wyoming, Alaska, California, Hawaii, Oregon,
Washington)
11) Have you attended any continuing education programs over the past 5
years that were specifically focused on (check all that apply):
o The No Child Left Behind Act
o Accountability for student academic and behavioral outcomes
o The provision of counseling services
o Evidence-based behavioral interventions
o Data-based decision making
o Response to Intervention
o Other (please specify)
12) How many years has it been since you received your last degree?
o 0-5
o 6-10
o >10

Research Question 4: What type of training and professional development have
school psychologists received related to planning and implementing counseling
as a direct social emotional behavioral intervention?

Corresponding Survey Question(s):

11) Have you attended any continuing education programs over the past 5 years
that were specifically focused on (check all that apply):
o The No Child Left Behind Act
o Accountability for student academic and behavioral outcomes
o The provision of counseling services
o Evidence-based behavioral interventions
o Data-based decision making
o Response to Intervention
o Other (please specify)
13) Was your graduate program accredited by (check all that apply):
o NASP
o APA
o NCATE
o Your state
o Not accredited
14) What is the highest degree that you have earned?
o MA/MS
o Certificate/Specialist
o PhD/PsyD/EdD
o Other (please specify)
15) Did your graduate academic training include specific coursework in the
following areas (please check all that apply):
o Academic interventions
o Behavioral interventions
o Counseling and Psychotherapy with children
o Counseling children with developmental disabilities
o Group counseling
o Multicultural counseling

Data Analyses: Percentages, Chi Square Analyses
School Psychologists' Use of Best Practices Related to the Problem-Solving Model

Research Question 5: Which aspects of the general problem-solving model (i.e.,
behavioral definition, baseline data, problem validation, problem analysis,
goal setting, intervention plan development, measurement strategy,
decision-making plan, progress monitoring, formative evaluation, treatment
integrity, and summative evaluation) are most commonly used when designing and
implementing counseling as a direct intervention?

Corresponding Survey Question(s):

16) When planning, implementing, or updating a counseling intervention, do you
use any print or online resources (e.g., IEP Pro, IEP Direct) to help you write
goals, clarify the problem behavior, or determine expectations for the student?
o Yes
o No
17) When planning for a counseling intervention, do you come up with a
behavioral definition of the problem?
o Yes
o No
25) When planning for a counseling intervention, do you collect baseline data
before beginning the intervention?
o Yes
o No
33) When planning for a counseling intervention, do you validate the problem
behavior by comparing the identified student with a peer or a standard of
performance?
o Yes
o No
34) When planning for a counseling intervention, do you analyze the problem
behavior by developing and testing hypotheses related to its function?
o Yes
o No
35) When planning for a counseling intervention, do you set one or more
behavioral goals using clear and measurable criteria, defining what the
student will be able to do if the intervention is effective?
o Yes
o No
40) When planning for a counseling intervention, do you come up with an
intervention plan?
o Yes
o No
48) When planning for a counseling intervention, do you come up with a plan to
measure the problem behavior?
o Yes
o No
55) When planning for an intervention, do you come up with a decision-making
plan for determining how behavioral data on the student will be collected and
interpreted?
o Yes
o No
61) During implementation of a counseling intervention, do you collect
progress monitoring data on the student's behavior?
o Yes
o No
70) During the implementation of a counseling intervention, do you engage in
any formative assessment of the student's behavior?
o Yes
o No
76) Do you measure the treatment integrity with which you implement counseling
interventions?
o Yes
o Sometimes, depending on the situation
o No
82) At the end of a counseling intervention, do you engage in any summative
assessment activities of the student's behavior and the effectiveness of the
intervention?
o Yes
o No

Data Analyses: Percentages
Research Question 6: How frequently are specific components of selected steps
of the problem-solving model (i.e., behavioral definition, baseline data,
behavioral goal, intervention plan development, measurement strategy,
decision-making plan, progress monitoring, formative evaluation, treatment
integrity, and summative evaluation) implemented by school psychologists as
they design and implement counseling as a direct intervention?

Corresponding Survey Question(s):

When coming up with a definition of the behavior to be addressed in
counseling, what factors do you include? (Yes, No)
18) Action verbs describing what the student does in observable terms
19) Frequency (the number of times the behavior occurs during an observation
period)
20) Latency (how much time passes between the presentation of a stimulus and
the student's response or behavior)
21) Intensity (the strength or force with which the behavior is displayed)
22) Topography (the configuration, shape, or form of the behavior)
23) Accuracy (a measure of how well the student's behavior is correct or fits
a standard)
24) Duration (how much time passes between the onset and the ending of a
behavior)

When you collect baseline data, how often do you use each of the following
techniques? (Never, Sometimes, Always)
26) Direct behavioral observation
27) 3rd party behavior rating (from parent, teacher, or related service
provider)
28) Sociometric techniques
29) 3rd party interview
30) Objective self-report
31) Projective-expressive technique

32) On average, how many baseline data points do you collect in order to
establish a stable pattern of the student's behavior?
o 1
o 2
o 3
o 4
o 5
o 6
o 7 or greater

When setting a behavioral goal for a counseling intervention, how often do you
use each of the following components? (Never, Sometimes, Always)
36) Timeframe (when the expected progress will be made in terms of days,
weeks, and months)
37) Condition (the specific circumstances in which the behavior will occur)
38) Behavior (written in objective, observable, and measurable terms
describing what the student will be able to do)
39) Criteria (a standard for how well the behavior is to be performed)

When writing a counseling intervention plan, how often do you use each of the
following components? (Never, Sometimes, Always)
41) A clear description of the procedures to be used
42) Documentation that the strategies to be used have been empirically
validated in the literature on evidence-based interventions
43) A description of the specific steps and activities that will be engaged in
during counseling sessions
44) A description of how each step or activity will be completed
45) The materials needed for each step or activity
46) A description of what each person engaged in the activity will do
47) The location where the intervention is to take place

When coming up with a plan for measuring the target behavior, how often do you
include each of the following components? (Never, Sometimes, Always)
49) A behavioral definition of the target behavior
50) A clear description of where the behavior will be measured
51) A clear description of when the behavior will be measured
52) A clear delineation of who will measure the behavior
53) A description of the recording method most appropriate for the behavior
54) A description of the most appropriate recording measure

When developing a decision-making plan, how often do you use each of the
following components? (Never, Sometimes, Always)
56) A determination of the frequency of behavioral measurements and data to be
collected
57) A decision on how the data will be summarized for the purposes of
intervention evaluation (e.g., visual presentation, written report or summary)
58) A determination of how many behavioral data points will be collected
before the intervention data will be analyzed
59) A determination of how much time will pass before the intervention data
will be analyzed
60) A set of decision rules for responding to specific data points

When you collect progress monitoring data, how often do you use each of the
following techniques? (Never, Sometimes, Always)
62) Direct behavioral observation
63) 3rd party behavior rating scales (from parent, teacher, or related service
provider)
64) Sociometric techniques
65) Interviews
66) Objective self-report measures
67) Projective-expressive techniques

68) On average, how many progress monitoring data points do you collect to
establish a stable pattern of the student's behavior?
o 1
o 2
o 3
o 4
o 5
o 6
o 7
o 8
o >8

69) Do you use the same method for collecting baseline data points as you do
for collecting progress monitoring data points?
o Yes
o Sometimes, depending on the situation
o No

When you engage in formative assessment of the student's behavior, what
sources of data do you consider? (Yes, No)
71) The level of the behavior (how much the behavior is occurring during
baseline and intervention phases as judged by repeated, objective measurements
of its frequency, duration, intensity, or the percentage of intervals in which
it occurs)
72) The trend of the behavior (whether the level of the behavior is increasing
or decreasing within the baseline and intervention phases)
73) Anecdotal information from the student, his/her family, teachers, or
related service providers
74) Your own subjective assessment of the student's behavior
75) Data documenting the student's performance in school (e.g., work samples,
grades, attendance records, behavioral referrals)

What methods do you use to measure the treatment integrity of the counseling
interventions you implement? (Yes, No)
77) Self-Report
78) Logs documenting sessions
79) Checklists for intervention components
80) Permanent products of student work
81) Direct observation by a 3rd party not directly involved in implementation

When you engage in summative assessment of the student's behavior, what
sources of data do you consider? (Yes, No)
83) The level of the behavior (how much the behavior is occurring during
baseline and intervention phases as judged by repeated, objective measurements
of its frequency, duration, intensity, or the percentage of intervals in which
it occurs)
84) The trend of the behavior (whether the level of the behavior is increasing
or decreasing within the baseline and intervention phases)
85) Anecdotal information from the student, his/her family, teachers, or
related service providers
86) Your own subjective assessment of the student's behavior
87) Data documenting the student's performance in school (e.g., work samples,
grades, attendance records, behavioral referrals)

Data Analyses: Percentages
Research Question 7: Are there any demographic differences (e.g., training,
professional development, years of experience, other roles and
responsibilities) between school psychologists in terms of their use of the
general steps of the problem-solving model when designing and implementing
counseling as a direct intervention?

Corresponding Survey Question(s):

5) How many years have you been a school psychologist employed in a school
setting?
o 0-5
o 6-10
o >10
6) What are the grade levels of the students you serve (please check all that
apply)?
o Elementary school students
o Middle/Junior high school students
o High school students
7) Please estimate the percentage of time you spend each year on the following
activities.
__________ Assessment
__________ Direct Interventions
__________ Consultation and Indirect services
__________ Research
__________ Administration
__________ Systems-level activities
__________ Other
8) In what type of school do you primarily work?
o Rural
o Suburban
o Urban
o Mixed
o Other (please specify)
9) What is the psychologist:student ratio in your school district?
o 1:<500
o 1:500-999
o 1:1000-1499
o 1:1500-2000
o 1:>2000
10) What region do you work in?
o Northeast (Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island,
Vermont, New Jersey, New York, Pennsylvania)
o Midwest (Indiana, Illinois, Michigan, Ohio, Wisconsin, Iowa, Nebraska,
Kansas, North Dakota, Minnesota, South Dakota, Missouri)
o South (Delaware, District of Columbia, Florida, Georgia, Maryland, North
Carolina, South Carolina, Virginia, West Virginia, Alabama, Kentucky,
Mississippi, Tennessee, Arkansas, Louisiana, Oklahoma, Texas)
o West (Arizona, Colorado, Idaho, New Mexico, Montana, Utah, Nevada, Wyoming,
Alaska, California, Hawaii, Oregon, Washington)
11) Have you attended any continuing education programs over the past 5 years
that were specifically focused on (check all that apply):
o The No Child Left Behind Act
o Accountability for student academic and behavioral outcomes
o The provision of counseling services
o Evidence-based behavioral interventions
o Data-based decision making
o Response to Intervention
o Other (please specify)
12) How many years has it been since you received your last degree?
o 0-5
o 6-10
o >10
13) Was your graduate program accredited by (check all that apply):
o NASP
o APA
o NCATE
o Your state
o Not accredited
14) What is the highest degree that you have earned?
o MA/MS
o Certificate/Specialist
o PhD/PsyD/EdD
o Other (please specify)
15) Did your graduate academic training include specific coursework in the
following areas (please check all that apply):
o Academic interventions
o Behavioral interventions
o Counseling and Psychotherapy with children
o Counseling children with developmental disabilities
o Group counseling
o Multicultural counseling

Data Analyses: Percentages, Chi Square Analyses
APPENDIX F: NON-SIGNIFICANT CHI-SQUARE RESULTS


Chi Square Analysis Comparing Use of Problem Validation and Psychologist:Student
Ratio

                                    Psychologist:Student Ratio
                             1:<500  1:500-999  1:1000-1499  1:1500-2000  1:>2000  Total
Validate the Problem
  Observed                     27.0       55.0         51.0         39.0     21.0    193
  Expected                     27.7       55.4         54.5         33.6     21.8    193
  Std. Residual                -0.1       -0.1         -0.5          0.9     -0.2
Do Not Validate the Problem
  Observed                      6.0       11.0         14.0          1.0      5.0     37
  Expected                      5.3       10.6         10.5          6.4      4.2     37
  Std. Residual                 0.3        0.1          1.1         -2.1      0.4
Total
  Observed                     33.0       66.0         65.0         40.0     26.0    230
  Expected                     33.0       66.0         65.0         40.0     26.0    230

Note: χ² = 7.22, df = 4, Sig. = 0.125
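Every cell in these tables follows from the observed counts alone. As a minimal sketch of the arithmetic (expected counts from row and column margins, the Pearson χ² statistic, its degrees of freedom, and the standardized residuals), the table above can be recomputed in pure Python:

```python
# Observed counts from the problem-validation x psychologist:student-ratio table.
observed = [
    [27, 55, 51, 39, 21],  # validate the problem
    [6, 11, 14, 1, 5],     # do not validate the problem
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Expected count for each cell: (row total * column total) / grand total.
expected = [[r * c / n for c in col_totals] for r in row_totals]

# Pearson chi-square statistic: sum of (O - E)^2 / E over all cells.
chi2 = sum((o - e) ** 2 / e
           for o_row, e_row in zip(observed, expected)
           for o, e in zip(o_row, e_row))

# Standardized residuals: (O - E) / sqrt(E), rounded as in the table.
residuals = [[round((o - e) / e ** 0.5, 1)
              for o, e in zip(o_row, e_row)]
             for o_row, e_row in zip(observed, expected)]

df = (len(observed) - 1) * (len(col_totals) - 1)
print(round(chi2, 2), df)   # 7.22 4
print(residuals[1])         # [0.3, 0.1, 1.1, -2.1, 0.4]
```

The one cell with a standardized residual beyond ±2.0 (the 1:1500-2000 "do not validate" cell, -2.1) is what drives most of the χ² value, even though the overall test is not significant.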

Chi Square Analysis Comparing Use of Behavioral Goals and Psychologist:Student
Ratio

                                    Psychologist:Student Ratio
                             1:<500  1:500-999  1:1000-1499  1:1500-2000  1:>2000  Total
Set Behavioral Goal(s)
  Observed                     27.0       59.0         52.0         29.0     22.0    189
  Expected                     27.0       54.0         54.0         32.7     21.3    189
  Std. Residual                 0.0        0.7         -0.3         -0.7      0.2
Do Not Set Behavioral Goal(s)
  Observed                      6.0        7.0         14.0         11.0      4.0     42
  Expected                      6.0       12.0         12.0          7.3      4.7     42
  Std. Residual                 0.0       -1.4          0.6          1.4     -0.3
Total
  Observed                     33.0       66.0         66.0         40.0     26.0    231
  Expected                     33.0       66.0         66.0         40.0     26.0    231

Note: χ² = 5.43, df = 4, Sig. = 0.246

Chi Square Analysis Comparing Use of Intervention Plans and Years of Experience

                                         Years of Experience
                                      0-5      6-10      >10    Total
Use Behavioral Intervention Plan
  Observed                           76.0      37.0     81.0      194
  Expected                           81.7      37.4     74.9      194
  Std. Residual                      -0.6      -0.1      0.7
Do Not Use Behavioral Intervention Plan
  Observed                           20.0       7.0      7.0       34
  Expected                           14.3       6.6     13.1       34
  Std. Residual                       1.5       0.2     -1.7
Total
  Observed                           96.0      44.0     88.0      228
  Expected                           96.0      44.0     88.0      228

Note: χ² = 6.04, df = 2, Sig. = 0.049

Chi Square Analysis Comparing Use of Decision-Making Plans and Time Spent
Counseling

                                       % of Time Spent Counseling
                                  Low        Medium       High
                                (0-9%)     (10-24%)   (25-100%)    Total
Use Decision-Making Plans
  Observed                        41.0         76.0        37.0      154
  Expected                        45.3         67.6        41.1      154
  Std. Residual                   -0.6          1.0        -0.6
Do Not Use Decision-Making Plans
  Observed                        24.0         21.0        22.0       67
  Expected                        19.7         29.4        17.9       67
  Std. Residual                    1.0         -1.6         1.0
Total
  Observed                        65.0         97.0        59.0      221
  Expected                        65.0         97.0        59.0      221

Note: χ² = 6.15, df = 2, Sig. = 0.046

Chi Square Analysis Comparing Use of Problem Analysis and Graduate Degree Earned

                                             Degree Level
                               MA, MS, Specialist,
                                      Certificate   PhD/PsyD/EdD    Total
Use Problem Analysis
  Observed                              122.0             40.0        162
  Expected                              122.5             39.5        162
  Std. Residual                           0.0              0.1
Do Not Use Problem Analysis
  Observed                               55.0             17.0         72
  Expected                               54.5             17.5         72
  Std. Residual                           0.1             -0.1
Total
  Observed                              177.0             57.0        234
  Expected                              177.0             57.0        234

Note: χ² = 0.03, df = 1, Sig. = 0.859

Chi Square Analysis Comparing Use of Problem Analysis and Years of Experience

                                         Years of Experience
                                      0-5      6-10      >10    Total
Use Problem Analysis
  Observed                           68.0      30.0     64.0      162
  Expected                           69.2      30.5     62.3      162
  Std. Residual                      -0.1      -0.1      0.2
Do Not Use Problem Analysis
  Observed                           32.0      14.0     26.0       72
  Expected                           30.8      13.5     27.7       72
  Std. Residual                       0.2       0.1     -0.3
Total
  Observed                          100.0      44.0     90.0      234
  Expected                          100.0      44.0     90.0      234

Note: χ² = 0.24, df = 2, Sig. = 0.885

Chi Square Analysis Comparing Students Served in Counseling and Years of
Experience

                                         Years of Experience
Students                              0-5      6-10      >10    Total
Special Education
  Observed                           33.0      12.0     16.0       61
  Expected                           26.1      11.9     23.0       61
  Std. Residual                       1.4       0.0     -1.5
General Education
  Observed                            0.0       0.0      3.0        3
  Expected                            1.3       0.6      1.1        3
  Std. Residual                      -1.1      -0.8      1.8
Special and General Education
  Observed                           70.0      35.0     72.0      177
  Expected                           75.6      34.5     66.8      177
  Std. Residual                      -0.6       0.1      0.6
Total
  Observed                          103.0      47.0     91.0      241
  Expected                          103.0      47.0     91.0      241

Note: χ² = 9.76, df = 4, Sig. = 0.045

Chi Square Analysis Comparing Number of Students Discontinued From Counseling
and Time Spent Counseling

                                       % of Time Spent Counseling
                                  Low        Medium       High
Number of Students              (0-9%)     (10-24%)   (25-100%)    Total
0
  Observed                         8.0         15.0        11.0       34
  Expected                         8.3         16.2         9.5       34
  Std. Residual                   -0.1         -0.3         0.5
1-3
  Observed                        37.0         57.0        26.0      120
  Expected                        29.3         57.1        33.5      120
  Std. Residual                    1.4          0.0        -1.3
4-6
  Observed                         9.0         18.0        17.0       44
  Expected                        10.8         20.9        12.3       44
  Std. Residual                   -0.5         -0.6         1.3
7
  Observed                         2.0         19.0        10.0       31
  Expected                         7.6         14.8         8.7       31
  Std. Residual                   -2.0          1.1         0.5
Total
  Observed                        56.0        109.0        64.0      229
  Expected                        56.0        109.0        64.0      229

Note: χ² = 12.06, df = 6, Sig. = 0.061

APPENDIX G: LOGISTIC REGRESSION MODELS


Stepwise Logistic Regression Predicting Use of a Behavioral Definition From
Graduate Degree Earned, Time Spent Counseling, Years of Experience, Grade
Levels Served, and Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -1.95 (.195)                   .000
Variables Not Included in the Model
  Graduate Degree                                      0.83     1    .363
  Low Counseling                                       3.24     1    .072
  Medium Counseling                                    0.50     1    .480
  High Counseling                                      3.40     2    .183
  0-5 yrs of Experience                                0.01     1    .905
  6-10 yrs of Experience                               0.02     1    .892
  >10 yrs of Experience                                0.06     2    .972
  Elementary School                                    2.35     1    .125
  Middle/Junior High School                            0.37     1    .541
  High School                                          0.07     1    .789
  1:<500                                               1.43     1    .232
  1:500-999                                            0.31     1    .580
  1:1000-1499                                          0.40     1    .525
  1:1500-2000                                          0.49     1    .484
  1:>2000                                              2.43     4    .657
  Overall Statistics                                  11.29    12    .505

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       169.129       .049        .092          6.949      .542
2       169.136       .048        .092          6.632      .577
3       169.274       .048        .091          6.373      .606
4       169.625       .047        .088          6.033      .644
5       174.016       .029        .055          2.833      .829
6       175.434       .023        .044          2.359      .670
7       178.945       .009        .017           .000
8       181.116       .000        .000           .000
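The quantities in these tables are internally consistent and can be checked by hand. In Block 0 the constant of an intercept-only logistic model equals the log-odds of the outcome, and in Block 1 Nagelkerke's R² rescales the Cox and Snell R² by its maximum attainable value. A minimal sketch against the first model above, using its null −2 log likelihood (181.116, the final step, where both R² values are .000) and the step-1 value (169.129); the sample size n = 238 is not reported in the table and is assumed here for illustration:

```python
import math

# Block 0: the constant of an intercept-only logistic model is the log-odds of
# the outcome, so B = -1.95 implies the proportion coded 1 is about .125.
b0 = -1.95
p = 1 / (1 + math.exp(-b0))
print(round(p, 3))  # 0.125

# Block 1: Cox and Snell R² = 1 - exp(-G / n), where G is the drop in -2LL from
# the constant-only model; Nagelkerke R² divides it by its maximum possible
# value, 1 - exp(-(-2LL_null) / n).  n = 238 is an assumption, not a reported value.
n = 238
ll_null, ll_step1 = 181.116, 169.129   # -2 log likelihoods from the table
cox_snell = 1 - math.exp(-(ll_null - ll_step1) / n)
nagelkerke = cox_snell / (1 - math.exp(-ll_null / n))
print(round(cox_snell, 3), round(nagelkerke, 3))  # 0.049 0.092
```

With that assumed n, both pseudo-R² values reproduce the step-1 entries (.049 and .092) to three decimals, which is a useful sanity check when transcribing SPSS output.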

Stepwise Logistic Regression Predicting Collection of Baseline Data Points From
Graduate Degree Earned, Time Spent Counseling, Years of Experience, Grade
Levels Served, and Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -1.24 (.156)                   .000
Variables Not Included in the Model
  Graduate Degree                                      0.37     1    .543
  Low Counseling                                       0.02     1    .896
  Medium Counseling                                    3.18     1    .075
  High Counseling                                      5.13     2    .077
  0-5 yrs of Experience                                0.31     1    .576
  6-10 yrs of Experience                               1.52     1    .217
  >10 yrs of Experience                                1.52     2    .467
  Elementary School                                    3.63     1    .057
  Middle/Junior High School                            3.57     1    .059
  High School                                          0.01     1    .937
  1:<500                                               0.03     1    .853
  1:500-999                                            0.06     1    .802
  1:1000-1499                                          0.00     1    .987
  1:1500-2000                                          0.41     1    .523
  1:>2000                                              1.16     4    .885
  Overall Statistics                                  13.89    12    .308

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       237.406       .058        .088          10.059     .261
2       238.327       .054        .082           2.057     .957
3       238.546       .053        .081           2.067     .979
4       238.960       .051        .078           3.230     .919
5       240.588       .045        .068           5.744     .570
6       244.334       .030        .045           2.895     .235
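The Sig. column of the Hosmer and Lemeshow test follows from its χ² statistic and degrees of freedom (conventionally 8, i.e., ten risk groups minus two, though fewer at later steps when only a few predictors remain). Because the degrees of freedom are even, the chi-square survival function has a closed form, so the p-values can be verified without a statistics package; `chi2_sf` below is an illustrative helper, not part of any library:

```python
import math

def chi2_sf(x, df):
    """P(X >= x) for a chi-square variable with even df, via the closed form
    exp(-x/2) * sum_{k < df/2} (x/2)^k / k!."""
    assert df % 2 == 0 and df > 0
    half = x / 2.0
    return math.exp(-half) * sum(half ** k / math.factorial(k)
                                 for k in range(df // 2))

# Step 1 of the first model in this appendix: chi² = 6.949 with 8 df -> Sig. = .542
print(round(chi2_sf(6.949, 8), 3))  # 0.542
```

The same check reproduces the other full-model rows (e.g., χ² = 11.255 with 8 df gives .188 in the treatment-integrity table), which supports reading the unlabeled column as the Hosmer and Lemeshow χ².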

Stepwise Logistic Regression Predicting Problem Validation From Graduate Degree
Earned, Time Spent Counseling, Years of Experience, Grade Levels Served, and
Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -1.641 (.180)                  .000
Variables Not Included in the Model
  Graduate Degree                                      0.20     1    .652
  Low Counseling                                       0.20     1    .655
  Medium Counseling                                    0.94     1    .334
  High Counseling                                      0.95     2    .621
  0-5 yrs of Experience                                0.40     1    .527
  6-10 yrs of Experience                               0.30     1    .583
  >10 yrs of Experience                                0.50     2    .788
  Elementary School                                    0.00     1    .983
  Middle/Junior High School                            0.13     1    .716
  High School                                          0.16     1    .686
  1:<500                                               0.11     1    .742
  1:500-999                                            0.01     1    .909
  1:1000-1499                                          1.89     1    .170
  1:1500-2000                                          6.46     1    .011
  1:>2000                                              7.07     4    .132
  Overall Statistics                                  10.07    12    .610

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       189.785       .053        .090          5.938      .654
2       190.096       .052        .088          4.113      .847
3       190.239       .051        .087          3.076      .930
4       191.422       .046        .079          3.305      .914
5       191.768       .045        .076          4.234      .835
6       192.169       .043        .073          2.594      .858
7       192.819       .040        .069           .000     1.000

Stepwise Logistic Regression Predicting Problem Analysis From Graduate Degree
Earned, Time Spent Counseling, Years of Experience, Grade Levels Served, and
Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -.800 (.143)                   .000
Variables Not Included in the Model
  Graduate Degree                                      0.12     1    .725
  Low Counseling                                       0.46     1    .500
  Medium Counseling                                    1.18     1    .278
  High Counseling                                      1.18     2    .554
  0-5 yrs of Experience                                0.03     1    .859
  6-10 yrs of Experience                               0.13     1    .718
  >10 yrs of Experience                                0.26     2    .879
  Elementary School                                    0.40     1    .529
  Middle/Junior High School                            2.68     1    .102
  High School                                          0.13     1    .721
  1:<500                                               0.25     1    .616
  1:500-999                                            0.24     1    .628
  1:1000-1499                                          0.02     1    .884
  1:1500-2000                                          0.00     1    .972
  1:>2000                                              0.41     4    .982
  Overall Statistics                                   5.24    12    .950

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       278.216       .023        .033          9.904      .272
2       278.217       .023        .033          7.155      .520
3       278.854       .020        .029          5.095      .747
4       278.993       .020        .028          6.365      .606
5       279.028       .020        .028          2.276      .893
6       279.326       .018        .026          0.188      .998
7       280.874       .012        .016          0.000
8       283.564       .000        .000          0.000

Stepwise Logistic Regression Predicting Setting Behavioral Goals From Graduate
Degree Earned, Time Spent Counseling, Years of Experience, Grade Levels Served,
and Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -1.493 (.171)                  .000
Variables Not Included in the Model
  Graduate Degree                                      1.36     1    .244
  Low Counseling                                       0.10     1    .754
  Medium Counseling                                    0.12     1    .735
  High Counseling                                      0.14     2    .934
  0-5 yrs of Experience                                0.13     1    .723
  6-10 yrs of Experience                               0.57     1    .452
  >10 yrs of Experience                                0.57     2    .753
  Elementary School                                    0.68     1    .408
  Middle/Junior High School                            0.28     1    .599
  High School                                          0.00     1    .980
  1:<500                                               0.00     1    .980
  1:500-999                                            3.70     1    .054
  1:1000-1499                                          0.51     1    .475
  1:1500-2000                                          3.05     1    .081
  1:>2000                                              5.63     4    .229
  Overall Statistics                                   9.02    12    .701

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       208.998       .040        .064          7.951      .438
2       209.272       .038        .063          2.504      .927
3       209.308       .038        .062          3.101      .928
4       209.835       .036        .059          2.627      .917
5       210.390       .034        .055          3.125      .926
6       211.354       .030        .048           .634      .966
7       212.531       .025        .040           .000     1.000
8       218.246       .000        .000           .000

Stepwise Logistic Regression Predicting Use of an Intervention Plan From
Graduate Degree Earned, Time Spent Counseling, Years of Experience, Grade
Levels Served, and Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -1.745 (.189)                  .000
Variables Not Included in the Model
  Graduate Degree                                      1.48     1    .224
  Low Counseling                                       0.20     1    .652
  Medium Counseling                                    0.25     1    .620
  High Counseling                                      1.05     2    .592
  0-5 yrs of Experience                                5.02     1    .025
  6-10 yrs of Experience                               0.01     1    .907
  >10 yrs of Experience                                5.84     2    .054
  Elementary School                                    1.53     1    .217
  Middle/Junior High School                            0.16     1    .691
  High School                                          0.50     1    .479
  1:<500                                               0.23     1    .631
  1:500-999                                            2.53     1    .112
  1:1000-1499                                          1.10     1    .295
  1:1500-2000                                          0.27     1    .605
  1:>2000                                              3.71     4    .446
  Overall Statistics                                  14.03    12    .299

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       172.042       .064        .112          12.237     .141
2       172.059       .064        .112           6.218     .623
3       173.155       .059        .104           4.505     .809
4       176.717       .044        .077           2.321     .940
5       177.392       .041        .072           1.727     .943
6       178.388       .036        .064            .094     .993
7       180.595       .027        .047            .000    1.000

Stepwise Logistic Regression Predicting Use of a Measurement Plan From Graduate
Degree Earned, Time Spent Counseling, Years of Experience, Grade Levels Served,
and Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -1.280 (.165)                  .000
Variables Not Included in the Model
  Graduate Degree                                      0.02     1    .154
  Low Counseling                                       0.26     1    .608
  Medium Counseling                                    3.36     1    .067
  High Counseling                                      3.74     2    .154
  0-5 yrs of Experience                                0.72     1    .397
  6-10 yrs of Experience                               0.30     1    .582
  >10 yrs of Experience                                1.71     2    .425
  Elementary School                                    0.02     1    .894
  Middle/Junior High School                            1.61     1    .204
  High School                                          2.26     1    .133
  1:<500                                               0.20     1    .655
  1:500-999                                            0.12     1    .728
  1:1000-1499                                          0.07     1    .790
  1:1500-2000                                          0.21     1    .650
  1:>2000                                              1.33     4    .857
  Overall Statistics                                  10.33    12    .587

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       215.504       .049        .075          7.937      .440
2       215.560       .049        .075          7.285      .506
3       215.645       .048        .074          7.975      .436
4       217.615       .039        .061          6.484      .593
5       218.773       .034        .053          1.350      .995
6       219.825       .030        .045           .590      .964
7       224.011       .011        .016           .000
8       226.301       .000        .000           .000

Stepwise Logistic Regression Predicting Use of a Decision-Making Plan From
Graduate Degree Earned, Time Spent Counseling, Years of Experience, Grade
Levels Served, and Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -.821 (.148)                   .000
Variables Not Included in the Model
  Graduate Degree                                      0.81     1    .369
  Low Counseling                                       2.38     1    .123
  Medium Counseling                                    6.58     1    .010
  High Counseling                                      6.59     2    .037
  0-5 yrs of Experience                                0.95     1    .329
  6-10 yrs of Experience                               0.63     1    .429
  >10 yrs of Experience                                1.12     2    .572
  Elementary School                                    0.13     1    .717
  Middle/Junior High School                            0.07     1    .786
  High School                                          0.61     1    .433
  1:<500                                               0.01     1    .943
  1:500-999                                            0.01     1    .912
  1:1000-1499                                          1.55     1    .214
  1:1500-2000                                          0.46     1    .499
  1:>2000                                              2.96     4    .564
  Overall Statistics                                  13.01    12    .368

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       252.367       .061        .086          3.826      .872
2       252.546       .060        .085          3.964      .860
3       253.469       .056        .079          7.805      .350
4       253.735       .055        .077          3.337      .911
5       257.253       .039        .055          2.218      .899
6       258.111       .035        .050           .375      .945
7       259.178       .031        .043           .000     1.000

Stepwise Logistic Regression Predicting Collection of Progress Monitoring Data
From Graduate Degree Earned, Time Spent Counseling, Years of Experience, Grade
Levels Served, and Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -1.569 (.186)                  .000
Variables Not Included in the Model
  Graduate Degree                                      1.59     1    .207
  Low Counseling                                       0.51     1    .477
  Medium Counseling                                    0.45     1    .504
  High Counseling                                      2.16     2    .340
  0-5 yrs of Experience                                0.04     1    .847
  6-10 yrs of Experience                               0.55     1    .460
  >10 yrs of Experience                                0.87     2    .649
  Elementary School                                    0.41     1    .520
  Middle/Junior High School                            1.61     1    .205
  High School                                          0.58     1    .448
  1:<500                                               2.32     1    .128
  1:500-999                                            0.05     1    .829
  1:1000-1499                                          0.02     1    .888
  1:1500-2000                                          0.27     1    .606
  1:>2000                                              3.11     4    .540
  Overall Statistics                                   9.59    12    .652

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       176.323       .050        .082          10.425     .236
2       176.364       .049        .082          11.635     .168
3       179.016       .037        .061           4.335     .826
4       179.350       .035        .059           6.920     .545
5       181.450       .025        .042           7.453     .489
6       182.281       .021        .035            .899     .925
7       185.016       .008        .013            .000
8       186.635       .000        .000            .000

Stepwise Logistic Regression Predicting Problem Analysis From Graduate Degree
Earned, Time Spent Counseling, Years of Experience, Grade Levels Served, and
Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -.221 (.142)                   .121
Variables Not Included in the Model
  Graduate Degree                                      5.68     1    .017
  Low Counseling                                       0.11     1    .741
  Medium Counseling                                    0.03     1    .861
  High Counseling                                      0.11     2    .947
  0-5 yrs of Experience                                8.86     1    .003
  6-10 yrs of Experience                               3.17     1    .075
  >10 yrs of Experience                                9.15     2    .010
  Elementary School                                    2.24     1    .135
  Middle/Junior High School                            1.33     1    .249
  High School                                          1.00     1    .318
  1:<500                                               0.00     1    .969
  1:500-999                                            0.02     1    .880
  1:1000-1499                                          1.54     1    .214
  1:1500-2000                                          0.90     1    .342
  1:>2000                                              2.38     4    .666
  Overall Statistics                                  19.36    12    .080

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       254.468       .097        .130           7.948     .439
2       255.408       .093        .124           5.826     .667
3       255.864       .090        .121           8.885     .352
4       256.531       .087        .117          10.141     .255
5       260.551       .069        .092           2.498     .869
6       261.442       .065        .087            .589     .964

Stepwise Logistic Regression Predicting Measurement of Treatment Integrity From
Graduate Degree Earned, Time Spent Counseling, Years of Experience, Grade
Levels Served, and Psychologist:Student Ratio

Block 0
Variable                              B (SE)          Score    df    Sig.
Variables Included in the Model
  Constant                            -1.295 (.172)                  .000
Variables Not Included in the Model
  Graduate Degree                                      0.15     1    .703
  Low Counseling                                       0.06     1    .813
  Medium Counseling                                    0.38     1    .539
  High Counseling                                      0.39     2    .821
  0-5 yrs of Experience                                3.10     1    .078
  6-10 yrs of Experience                               0.75     1    .386
  >10 yrs of Experience                                3.12     2    .210
  Elementary School                                    0.03     1    .863
  Middle/Junior High School                            2.18     1    .140
  High School                                          0.92     1    .337
  1:<500                                               0.26     1    .613
  1:500-999                                            0.87     1    .354
  1:1000-1499                                          1.93     1    .164
  1:1500-2000                                          0.13     1    .716
  1:>2000                                              2.94     4    .568
  Overall Statistics                                  10.67    12    .558

Block 1
Step    -2 Log        Cox and     Nagelkerke    Hosmer and Lemeshow Test
        Likelihood    Snell R²    R²            χ²         Sig.
1       197.222       .053        .083          3.024      .933
2       197.236       .053        .082          2.845      .944
3       197.527       .052        .080         11.255      .188
4       198.253       .049        .075         12.713      .122
5       202.547       .028        .043          4.808      .683
6       202.850       .026        .041           .456      .978
7       206.013       .011        .017           .000
8       208.203       .000        .000           .000