Special Topics Seminar: Research Methods


PhD Seminar (16:545:620:01)
Rutgers University, School of Management and Labor Relations
Paula M. Caligiuri, Ph.D.
Fall 2007

Required Readings:
Textbook:
Quasi-Experimentation: Design and Analysis Issues for Field Settings, by
Thomas D. Cook and Donald T. Campbell, 1979. ISBN-10: 0395307902;
ISBN-13: 978-0395307908.

Additional Readings:
These articles are listed in the Course Outline section of the syllabus and
are available on-line through the Rutgers Library Reserves. To access
these articles you will need to do the following:
- Go to the URL: http://www.libraries.rutgers.edu
- Log in at the upper left of the page
- Roll your cursor over "Find Reserves" on the left side of the page
- Click the word "Reserves" where it says "Connect to: Reserves"
- Type "caligiuri" and then click "instructor"

Class Time and Location:
Class meets in 106 Levin on Wednesdays
Class time is from 1:00 p.m. to 3:40 p.m.

Contact Information:
Office #: 732-445-5228
E-mail: caligiuri@smlr.rutgers.edu
Office Hours: By appointment

Course Summary:
This course is a PhD-level course in Research Methods. Topics will include study
designs (experimentation, quasi-experimentation, etc.), developing and measuring
constructs (reliability, validity, etc.), sampling strategies, generalizability of findings,
and quantitative and qualitative methodologies. While the research skills
discussed in this course are transferable across a variety of academic disciplines,
many examples will be drawn from the human resource management and labor
relations fields.
This course will be interactive. I fully expect a high level of class participation
in the form of your asking and answering questions and contributing fully to the
class discussions. I will do my best to provide an environment where you feel
comfortable asking questions and offering your opinions. This method of
instruction puts the responsibility on you to be prepared for class.


Course Outline:

September 5

Topics Covered: Course Introduction
o Theory
o Research
o Hypothesis Testing

Readings:
Cook & Campbell Chapter 1 (for your future reference)

Sutton, R., & Staw, B. (1995). What a theory is not. Administrative Science Quarterly,
40: 371-384.

Pfeffer, J. (1993). Barriers to the advancement of organizational science: Paradigm
development as a dependent variable. Academy of Management Review, 18: 599-620.

Weick, K. (1995). What theory is not, theorizing is. Administrative Science Quarterly, 40:
385-390.

Daft, R. L. (1983). Learning the craft of organizational research. Academy of
Management Review, 8(4): 539-546.

Whetten, D. A. (1989). What constitutes a theoretical contribution? Academy of
Management Review, 14: 490-495.


September 12

Topics Covered: Measures
o Operationalization
o Measuring Constructs
o Level(s) of Analysis

Assignment Due: Measurement Assignment (Identify Measures for a Selected Construct)

Readings:

Klein, K., Dansereau, F., & Hall, R. (1994). Levels issues in theory development, data
collection, and analysis. Academy of Management Review, 19: 195-229.

Edwards, J. R., & Bagozzi, R. P. (2000). On the nature and direction of relationships
between constructs and measures. Psychological Methods, 5, 155-174.

Vandenberg, R.J. (2006). Statistical and methodological myths and urban legends:
Where, pray tell, did they get this idea? Organizational Research Methods, 9, 194-201.





September 19

Topics Covered: Design
o Experimentation
o Quasi-Experimentation
o Field Research
o Exploratory Research

Assignment Due: Design Assignment (Testing Causal Hypotheses)

Readings:
Cook & Campbell Chapters 2 & 3

Williams, L.J., & Podsakoff, P.M. (1989). Longitudinal field methods for studying
reciprocal relationships in OB research: Toward improved causal analysis. Research in
Organizational Behavior, 11, 247-293.

Meyer, B. (1995). Natural and quasi-experiments in economics. Journal of Business and
Economic Statistics, 13(2): 151-161.

Bartunek, J.M., Bobko, P., & Venkatraman, N. (1993). Toward innovation and diversity in
management research methods. Academy of Management Journal, 36: 1362-1373.

Schmitt, N. (1994). Method bias: The importance of theory and measurement. Journal of
Organizational Behavior, 15, 393-398.

James, L.R., Mulaik, S.A., & Brett, J.M. (2006). A tale of two methods. Organizational
Research Methods, 9, 233-244.

Greenberg, J., & Tomlinson, E.C. (2004). Situated experiments in organizations:
Transplanting the lab to the field. Journal of Management, 30(5), 703-724.


September 26

Topics Covered: Sample
o Statistical Power & Effect Size
o Sampling Strategies
o Randomization
o Generalizability
o Response Rates

Assignment Due: Sample Assignment (Inferring Generalizability across Samples)

Readings:

Cohen, J. (1992). A power primer. Psychological Bulletin, 112, 155-159.

Cowles, M., & Davis, C. (1982). On the origins of the .05 level of statistical significance.
American Psychologist, 37, 553-558.

Carlson, K.D., & Schmidt, F.L. (1999). Impact of experimental design on effect size:
Findings from the research literature on training. Journal of Applied Psychology, 84:
851-862.

Tomaskovic-Devey, D., Leiter, J., & Thompson, S. (1994). Organizational survey
nonresponse. Administrative Science Quarterly, 39: 439-457.

Berk, R. (1983). An introduction to sample selection bias in sociological data. American
Sociological Review, 48: 386-399.




October 3

Topics Covered: Survey Research
o Scale Development
o Reliability & Validity
o MTMM (Multitrait-Multimethod Matrix)

Assignment Due: Scale Measurement Assignment (Developing a Scale to Test a Construct)

Readings:

Hinkin, T. R. (1998). A brief tutorial on the development of measures for use in survey
questionnaires. Organizational Research Methods, 1: 104-121.

Peter, J.P. (1979). Reliability: A review of psychometric basics and recent marketing
practices. Journal of Marketing Research, 16, 6-17.

Cortina, J. (1993). What is coefficient alpha? An examination of theory and
applications. Journal of Applied Psychology, 78: 98-104.

Peter, J. P. (1981). Construct validity: A review of basic issues and marketing practices.
Journal of Marketing Research, 18, 133-145.

Campion, M.A. (1993). Article review checklist: A criterion checklist for reviewing research
articles in applied psychology. Personnel Psychology, 46, 705-718.


October 10

Topics Covered: Survey Research (continued); Archival Data and Secondary Data
o Qualtrics Demo
o Electronic Surveys
o Common Method Bias
o Social Desirability

Assignment Due: Critique #1 (Experimental Design)

Readings:

Podsakoff, P., & Organ, D. (1986). Self-reports in organizational research: Problems
and prospects. Journal of Management, 12: 531-544.

Feldman, J.M., & Lynch, J.G., Jr. (1988). Self-generated validity and other effects of
measurement on belief, attitude, intention, and behavior. Journal of Applied Psychology,
73: 421-435.

Podsakoff, P.M., MacKenzie, S.B., Lee, J.Y., & Podsakoff, N.P. (2003). Common
method biases in behavioral research: A critical review of the literature and
recommended remedies. Journal of Applied Psychology, 88(5): 879-903.

Doty, D.H., and Glick, W.H. (1998). Common Methods Bias: Does Common Methods
Variance Really Bias Results? Organizational Research Methods, 1(4): 374-406.


October 17


Mid-Term Exam




October 24

Topics Covered: Practical Challenges for Data Collection
o Gaining Access
o Participant Recruitment
o Missing Data & Outliers

Readings:
Cook & Campbell Chapter 8

Hosseini, J.C., & Armacost, R.L. (1993). Gathering sensitive data in organizations. American
Behavioral Scientist, 36(4): 443-471.

To prepare for this class, please interview your advisor or one of your professors using the following
interview protocol*:
1. Given your academic discipline and area of expertise, what practical advice do you have for new
PhDs who will be conducting research in IR and/or HR?
2. What have been some of your greatest challenges in gaining access to research or data collection
sites for the types of studies you do? How did you overcome those challenges? What are the key
lessons I should share with my classmates from your experiences in gaining access to research sites?
3. Would you share a critical incident or two that happened while conducting the research that resulted in
one of your own favorite articles? What are the key lessons I can share with my classmates from that
experience? (After the interview, for context, read the professor's article that he or she mentioned.)
*Of course, you are welcome to write your own questions, but I want you to have the conversation around
these themes: the realities and practical challenges of research. Please be prepared to talk about each of
these questions in an open discussion.

October 31

Topics Covered: Qualitative Methods
Guest Lecturer: Professor Sharon Ryan, Graduate School of Education, Rutgers

Assignment Due: TBA

Readings:

Eisenhardt, K. M., & Graebner, M.E. (2007). Theory building from cases: Opportunities
and challenges. Academy of Management Journal, 50: 25-32.

Siggelkow, N. (2007). Persuasion with case studies. Academy of Management Journal,
50: 20-24.

Lee, A.S. (1991). Integrating positivist and interpretive approaches to organizational
research. Organization Science, 2: 342-365.

November 7

Topics Covered: Qualitative Methods (continued)
Guest Lecturer: Professor Sharon Ryan, Graduate School of Education
Guest Speaker: Dr. Mary Gatta, Center for Women and Work at Rutgers

Assignment Due: TBA



November 14

Topics Covered: Professional Issues of Academic Researchers
o Being a Peer Reviewer
o Working with Collaborators
o Ethical Issues in Research

Assignment Due: Critique #2 (Survey Research Design)

Readings:


Daft, R. L. (1985). Why I recommended that your manuscript be rejected and what you
can do about it. In L. L. Cummings & P. J. Frost (Eds.), Publishing in the Organizational
Sciences. Richard D. Irwin.

Fiske, D.W., & Campbell, D.T. (1992). Citations do not solve problems. Psychological
Bulletin, 112, 393-395.

Rosenthal, R. (1994). Science and ethics in conducting, analyzing, and reporting
psychological research. Psychological Science, 5: 127-134.

Academy of Management. The Academy of Management code of ethical conduct.
Available on the Academy of Management website.

Berardo, F. (1989). Scientific norms and research publication issues and professional
ethics. Sociological Inquiry, 59(3): 249-266.

Kerr, N.L. (1998). HARKing: Hypothesizing after the results are known. Personality and
Social Psychology Review, 2(3): 196-217.

November 21

No Class (Happy Thanksgiving)


November 27 and 28

Topics Covered: From Theory to Research (Strategic HRM Joint Project and
Presentations)

These sessions will be conducted jointly with SMLR's Strategic HRM Theory course (taught by Mark Huselid).
Pairs of students (one from each course) will work together to develop an integrated research study proposal
that draws on the materials from the Strategic HRM Theory course and our Research Methods course. The
deliverable for this exercise will be a 30-minute presentation to both classes. More information, including
group assignments and specific presentation dates, will be provided in class.


December 5

Topics Covered: Presenting Research in an Academic Context

Research Presentations

December 12


December 19


Final Exam


Grading:
15% Participation
10% Two Research Critiques
10% Four Assignments
25% Midterm Exam
15% Research Presentation
25% Final Exam

Participation: To get the maximum points for participation, I expect you to:
o Fully participate in class discussion (i.e., read before you come to
class so you are able to contribute to the discussion).
o Complete all assignments and be prepared to discuss/present them.
o Attend class.
o Fully participate in the combined Strategic HRM/Research Methods
classes (and the preparation for those classes) on November 27 and 28.

Research Critiques:
Each critique will be a 1-2 page (single-spaced) review of a pre-
selected manuscript (to be distributed in class). Critiques should be
written as though you were peer reviewing the manuscript for a
journal, but with a focus limited to the methodological issues
(design, sample, procedures, and measures).

Assignments:
These assignments will be due on the date indicated on the syllabus
and will be handed out one week before they are due.

Midterm: The midterm will be an application of the concepts and skills you have
learned in the first half of the class.

Research Presentation:
You will give a fifteen-minute research presentation, similar to the way
you would present at an academic conference. Details on this
research presentation will be given in class.

Final: This will be a comprehensive final exam incorporating concepts from
the entire course. The final will test your deeper understanding of
research methods and will test how well you are able to integrate the
material covered in the class.


Additional Optional Readings

Science, Theories, and Research
Davis, Murray S. 1971. That's interesting! Towards a phenomenology of sociology
and a sociology of phenomenology. Philosophy of the Social Sciences, 1: 309-344.
Pfeffer, Jeffrey. 1993. Barriers to the advance of organizational science: Paradigm
development as a dependent variable. Academy of Management Review, 18: 599-
620.
Wagner, David G., and Joseph Berger. 1985. Do sociological theories grow?
American Journal of Sociology, 90(4): 697-728.
Van Maanen, John. 1995. Style as theory. Organization Science, 6:133-143.
Pfeffer, Jeffrey. 1995. Mortality, reproducibility and the persistence of styles of
theory. Organization Science, 6: 681-686.
Van Maanen, John. 1995. Fear and loathing in organization studies. Organization
Science, 6: 687-692.
Weick, K. E. 1989. Theory construction as disciplined imagination. Academy of
Management Review, 14: 516-531.
Bacharach, S. B. 1989. Organizational theories: Some criteria for evaluation.
Academy of Management Review, 14(4): 496-515.


Measures
House, Robert, Denise Rousseau and Melissa Thomas-Hunt. 1995. The meso
paradigm: A framework for the integration of micro and macro organizational
behavior. Research in Organizational Behavior, 17: 71-114.
Cappelli, Peter and Peter Sherer. 1991. The missing role of context in OB: The
need for a meso-level approach. Research in Organizational Behavior, 13: 55-110.
Schmidt, F. L., & Hunter, J. E. 1996. Measurement error in psychological research:
Lessons from 26 research scenarios. Psychological Methods, 1: 199-223.
Ganster, D.C., Hennessey, H.W., & Luthans, F. 1983. Social desirability response
effects: Three alternative models. Academy of Management Journal, 26: 321-331.
Simsek, Z., and Veiga, J.F. 2000. The Electronic Survey Technique: An Integration
and Assessment. Organizational Research Methods, 3(1): 92-114
McWilliams, A., & Siegel, D. 1997. Event studies in management research:
Theoretical and empirical issues. Academy of Management Journal, 40: 626-657.
Avolio, B.J., and Bass, B.M. 1991. Identifying common methods variance with data
collected from a single source: An unresolved sticky issue. Journal of Management,
17(3): 571-587.
Aiken, L. R. (1994). Some observations and recommendations concerning research
methodology in the behavioral sciences. Educational and Psychological
Measurement, 54, 848-860.

Design and Methods (see also Additional Optional Books):
Ilgen, Daniel. 1986. Laboratory Research: A Question of When, Not If. In E. Locke
(ed.), Generalizing from Laboratory to Field Settings: Research Findings from
Industrial-Organization Psychology, Organizational Behavior, and Human Resource


PhD Seminar - Research Methods - 9
Management. Lexington, MA: Lexington Press.
Roth, P.L., and BeVier, C.A. 1998. Response Rates in HRM/OB Survey Research:
Norms and Correlates, 1990-1994. Journal of Management, 24(1): 97-117.
Abelson, R. P. (1985). A variance explanation paradox: When a little is a lot.
Psychological Bulletin, 97, 129-133.
Mazen, A.M., Graf, L.A., Kellog, C.E., and Hemmasi, M. 1987. Statistical Power in
Contemporary Management Research. Academy of Management Journal, 30(2),
369-380.
Mitchell, T. R. (1985). An evaluation of the validity of correlational research
conducted in organizations. Academy of Management Review, 10, 192-205.
Stone-Romero, E.F., Weaver, A.E., & Glenar, J.L. (1995). Trends in Research
Design and Data Analytic Strategies in Organization Research. Journal of
Management, 21(1):141-157.


Survey Research (see also Additional Optional Books):
Hinkin, Timothy R. 1995. A Review of Scale Development Practices in the Study of
Organizations. Journal of Management, 21(5): 967-988.
Campbell, Donald and Donald Fiske. 1959. Convergent and discriminant
validation by the multitrait-multimethod matrix. Psychological Bulletin, 56: 81-105.
Crampton, S. M., & Wagner, J. A., III (1994). Percept-percept inflation in
microorganizational research: An investigation of prevalence and effect. Journal of
Applied Psychology, 79, 67-76.
Spector, P. E. (1994). Using self-report questionnaires in OB research: A comment
on the use of a controversial method. Journal of Organizational Behavior, 15, 385-
392.
Schmitt, N. (1994). Method bias: The importance of theory and measurement.
Journal of Organizational Behavior, 15, 393-398.
Howard, G. S. (1994). Why do people say nasty things about self-reports? Journal
of Organizational Behavior, 15, 399-404.
Schmitt, N., & Stults, D. M. (1986). Methodology review: Analysis of multitrait-
multimethod matrices. Applied Psychological Measurement, 10: 1-22.

Qualitative Research
Salancik, Gerald. 1979. Field stimulation for organizational behavior research.
Administrative Science Quarterly, 24: 638-649.
Lieberson, Stanley. 1991. Small N's and big conclusions: An examination of the
reasoning in comparative studies based on a small number of cases. Social
Forces, 70: 307-320.
Miles, Matthew B. 1979. Qualitative data as an attractive nuisance: The problem of
analysis. Administrative Science Quarterly, 24: 590-601.
Jick, Todd. 1979. Mixing Qualitative and Quantitative Methods: Triangulation in
Practice. Administrative Science Quarterly, 24: 602-611.

Selected Topics in Research Methods
Mann, C. (1990). Meta-analysis in the breech. Science, 249, 476-480.


Guzzo, R.A., Jackson, S.E., & Katzell, R.A. (1987). Meta-analysis. Research in
Organizational Behavior, 9, 407-442.
Wanous, J. P., Sullivan, S. E., & Malinak, J. (1989). The role of judgment calls in
meta-analysis. Journal of Applied Psychology, 74, 259-264.
Williams, L.J., Edwards, J.R., & Vandenberg, R.J. (2003). Recent advances in
causal modeling methods for organizational and management research. Journal of
Management, 29(6), 903-936.
Olkin, I. (1992). Reconcilable differences: Gleaning insight from conflicting scientific
studies. The Sciences, 32(4), 30-36.
Oxman, A.D., & Guyatt, G.H. (1993). The science of reviewing research. Annals of
the New York Academy of Sciences, 703, 125-134.
Schmidt, F.L. (1992). What do data really mean? Research findings, meta-analysis,
and cumulative knowledge in psychology. American Psychologist, 47, 1173-1181.
Ostroff, C. & Harrison, D.A. (1999). Meta-analysis, level of analysis, and best
estimates of population correlations: Cautions for interpreting meta-analytic results
in organizational behavior. Journal of Applied Psychology, 84, 260-270.

Professional Issues as Researchers
Fiske, Donald W., and L. Fogg. 1990. But the reviewers are making different
criticisms of my paper! Diversity and uniqueness in reviewer comments. American
Psychologist, 45: 591-598.
Rosenthal, R. (1994). Science and ethics in conducting, analyzing, and reporting
psychological research. Psychological Science, 5, 127-134.
Kurtines, W.M., Alvarez, M., & Azmitia, M. (1990). Science and morality: The role of
values in science and the scientific study of moral phenomena. Psychological
Bulletin, 107, 283-295.



Additional Optional Books

Miles, M.B., & Huberman, A.M. (1994). Qualitative Data Analysis: An Expanded
Sourcebook. Thousand Oaks, CA: Sage.
Babbie, E. (2007). The Basics of Social Research (4th edition). Wadsworth Publishing.
American Psychological Association (2001). Publication Manual of the American
Psychological Association (5th edition). Washington, D.C.: Author.
Schwab, D. P. (1999). Research methods for organizational studies. Mahwah, NJ:
Erlbaum.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd
edition). Hillsdale, NJ: Lawrence Erlbaum Associates, Inc.
Kerlinger, F. N., & Lee, H. B. (2000). Foundations of behavioral research (4th
edition). Wadsworth.
Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd edition). New
York, NY: McGraw-Hill, Inc.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-
experimental designs for generalized causal inference. Houghton Mifflin.
Cahuc, Pierre and Andre Zylberberg (2004), Labor Economics, Cambridge, MA: MIT
Press.
Schmitt, N.W., & Klimoski, R.J. (1991). Research methods in human resources
management. Cincinnati: South-Western Publishing.
Sekaran, U. (1992). Research methods for business (2nd ed.). New York: John Wiley
& Sons.
Selltiz, C., Wrightsman, L.S., & Cook, S.W. (1976). Research methods in social
relations. N.Y.: Holt, Rinehart, and Winston.
Zeller, R., & Carmines, E. (1980). Measurement in the social sciences. Cambridge:
Cambridge University Press.
Smithson, M. (2003). Confidence Intervals. Sage QASS Series, No. 140.
Fowler, F.J. (2002). Survey Research Methods. (3rd Ed.) Sage (Applied Social
Research Methods Series, Vol. 1) (165 pp.)
Kalton, G. (1983). Introduction to Survey Sampling. Sage QASS Series, no. 35. (96
pp.)
Kiecolt, K.J., & Nathan, L.E. (1985). Secondary Analysis of Survey Data. Sage
QASS Series, no. 53. (87 pp.)
Spector, P.E. (1981). Research Designs. Sage QASS Series, no. 23 (80 pp.)
Stewart, D.W., & Kamins, M.A. (1993). Secondary Research (2nd Ed.). Sage
(Applied Social Research Series, Vol. 4). (138 pp.)
Campbell, J.P., Daft, R.L., & Hulin, C.L. (1982). What to study: Generating and
developing research questions. Beverly Hills: Sage. (167 pp.)
Dunnette, M.D., & Hough, L. (Eds.) (1990). Handbook of I/O psychology, vol. 1. Palo
Alto, CA: Consulting Psychologists Press.



Critiquing and Reviewing Empirical Research [1]


I. Theory and Hypothesis Formulation
1. Was a significant problem or research question examined?
2. To what extent were the aims or purpose of the study delineated?
3. How well was the literature reviewed? integrated?
4. Were the concepts defined well?
5. How well were the hypotheses specified? Did they follow from 2, 3, and 4
above? Are they falsifiable?
6. Were boundaries and levels of analysis specified? included in
hypotheses?
7. Are alternative hypotheses/theories specified?

II. Method
1. Was the study designed to answer the problem or research question?
2. Did the method follow from the theory/hypotheses?
3. Measurement
A. What were the independent, dependent, and control
(moderator, etc.) variables?
B. For each variable: was it operationalized? justified?
C. For each variable: what evidence for validity was given?
D. For each variable: what evidence for reliability was given?
E. Were the measures quantifiable?
F. Was the unit or level of analysis specified and defined? Measures
at appropriate level of analysis?
4. Sample
A. Are the population and sample described? justified?
B. What sampling procedure was used?
C. To what extent was the population sampled appropriate for the
research question?
D. To what extent was the sample adequate to answer the research
question?
E. Were different/multiple units and levels of analysis specified?
sampled?
5. Study Design
A. What specific design was used?
B. Can it be diagramed?
C. How were subjects/objects assigned to conditions (treatments)?
D. How was comparability of groups established?
E. To what extent were threats to internal validity controlled?
F. To what extent were threats to external validity controlled?
G. To what extent were threats to statistical conclusion validity
controlled?
H. To what extent were threats to construct validity controlled?
I. Can the study be replicated?
J. Are the manipulations adequate?
K. To what extent is correlation distinguished from causation?
L. Are various controls used?
6. Data Collection
A. How were the data collected? at multiple/single level(s)?
B. Was the technique appropriate?
C. Was the data coding explained adequately?
D. Were data sources specified? multiple or single source(s)?

III. Analyses and Interpretation
1. How were the data analyzed? Was the technique appropriate?
2. What were the empirical findings? Descriptive statistics?
3. To what extent were the findings statistically and practically significant?
Effect sizes reported?
4. To what extent and how were extraneous variables controlled?
5. Was a "treatment" effect or relationship among variables found?
6. What was the magnitude of the effect or relationship?
7. Were the "treatments" manipulated or delivered as intended?
8. Were multiple levels of analysis examined? How?

IV. Discussion
1. Interpretation
A. Did the results provide support for a hypothesis/theory relative to
other hypotheses/theories which were tested?
B. Were the conclusions (verbal statements) consistent with the
empirical findings (data)?
C. Were there other extraneous variables which could lead to an
alternative explanation? What were the alternative explanations?
D. What is a way to control for the extraneous variable(s) involved?
E. To what extent can the findings be generalized from the sample to the
population? from the sample to theory?
F. To what extent can the findings be generalized to other
populations, "treatments", time periods, methods of
measurement, or theory?
G. Was statistical significance confused with practical and/or
substantive significance?
H. Were boundary conditions and levels of analysis accounted for?
2. Conclusions and Practical Implications
A. Were the results integrated with a stream of scientific knowledge?
B. Were the implications for research and practice stated?
C. To what extent did the study contribute to scientific knowledge?
Was there a value-added contribution to the literature?
D. Was it well-organized, well-presented, well-written, etc.?
E. Does it reach the target audience?

[1] These questions are based, in part, on Professor F. Yammarino's syllabus for a Management Course
in Research Methodology offered at SUNY-Binghamton. Used with permission.
