
Introduction to Instructional Design

Instructional Design is the systematic development of instructional specifications using learning and instructional theory to ensure the quality of instruction. It is the entire process of analysis of learning needs and goals and the development of a delivery system to meet those needs. It includes the development of instructional materials and activities, and the tryout and evaluation of all instruction and learner activities.

An instructional system is an arrangement of resources and procedures to promote learning.

Instructional design is the systematic process of developing instructional systems, and instructional development is the process of implementing the system or plan.

Instructional Design is a field that prescribes specific instructional actions to achieve desired instructional outcomes; the process decides the best methods of instruction for enacting desired changes in knowledge and skills for a specific course content and learner population. Instructional design is usually the initial stage of systematic instruction, for which there are dozens of models. For example, Instructional Systems Design (ISD) includes instructional development, delivery, and evaluation.

Instructional Design as a Process:
Instructional Design is the systematic development of instructional specifications using learning and instructional theory to ensure the quality of instruction. It is the entire process of analysis of learning needs and goals and the development of a delivery system to meet those needs. It includes the development of instructional materials and activities, and the tryout and evaluation of all instruction and learner activities.
Instructional Design as a Discipline:
Instructional Design is that branch of knowledge concerned with research and theory
about instructional strategies and the process for developing and implementing those
strategies.
Instructional Design as a Science:
Instructional Design is the science of creating detailed specifications for the development,
implementation, evaluation, and maintenance of situations that facilitate the learning of
both large and small units of subject matter at all levels of complexity.
Instructional Design as Reality:
Instructional Design can start at any point in the design process. Often a glimmer of an
idea is developed to give the core of an instructional situation. By the time the entire
process is done, the designer looks back and checks to see that all parts of the
"science" have been taken into account. Then the entire process is written up as if it
occurred in a systematic fashion.

Glossary: Instructional Design

Case method. The presentation of real or fictional situations or problems to learners to analyze,
to discuss, and to recommend actions to be taken.

Coaching. A technique of cognitive apprenticeship whereby the instructor observes students as they try to complete tasks and provides hints, help and feedback as needed.

Cognitive apprenticeship. An instructional model that seeks to emulate the opportunities for
extended practice on authentic tasks that apprentices have while working under a master
craftsman.
Cognitive flexibility theory. A theory of learning for advanced knowledge. Advanced knowledge
is seen as less rule-based and rigid than introductory knowledge. The theory recommends
approaching content from multiple perspectives through multiple analogies and the use of
hypertext instruction.

Cognitive psychology. "The scientific analysis of human mental processes and memory
structures in order to understand human behavior"

Education. Instruction which emphasizes far-transfer learning objectives; traditionally knowledge-based instruction which is not tied to a specific job, as opposed to training.

Effectiveness. A measure of whether a procedure or action achieves its purpose.

Efficiency. A measure of the timeliness and affordability of an action.

Environment analysis. Analysis of the context of any instructional system: both where the instruction will occur and how the instructional materials will be used.

Fading. A technique of cognitive apprenticeship whereby the instructor gradually withdraws support and transfers full control of a performance to the learner.

Formative evaluation. On-going evaluation of instruction with the purpose of improvement.

Functional context training. A model of instruction that works from simple, familiar tasks and
proceeds to more complex tasks with ample opportunities for practice.

Heuristic. A rule of thumb or guideline (as opposed to an invariant procedure). Heuristics may
not always achieve the desired outcome, but are extremely valuable to problem-solving
processes.

Hypertext. Non-linear text. Imagine a computer screen with a word in bold. You click on the word
and it "zooms in" to greater detail. Hypertext allows you to zoom in and zoom out of subjects and
make connections between topics. Hypertext programs are useful for instruction and for
information access.

Inert knowledge. Knowledge a learner has acquired but fails to activate in appropriate situations.

Instructional design. The activity of planning and designing for instruction. Also, a discipline
associated with the activity.

Layers of necessity model. A model of instructional design and development which prioritizes
the needs of a project into layers; "each layer being a self-contained model." Additional layers are
developed as time and resources allow.

Microworld. A computer-based simulation with opportunities for manipulation of content and practice of skills.

Minimalist training. An instructional approach which seeks to provide the minimum amount of
instruction needed to help the learner master a skill. It emphasizes active learning and meaningful
learning tasks.
Performance analysis. A specific, performance-based needs assessment technique that
precedes any design or development activities by analyzing the performance problems of a work
organization.

Performance support systems. Computer program that aids the user in doing a task. Examples
include help systems, job aids, and expert system advisors.

Problem solving. The creative application of "various rules, procedures, techniques, or principles to solve complex problems where there is no single correct. . . answer".

Rapid prototyping. In a design process, early development of a small-scale prototype used to test out certain key features of the design. Most useful for large-scale projects.

Scaffolding. A technique of cognitive apprenticeship whereby the instructor performs parts of a task that the learner is not yet able to perform.

Simulation. "A simulation is a simulated real life scenario displayed on the computer, which the
student has to act upon".

Spoon-feeding problem. The dilemma in training between (1) how much to simplify and control
the learning situation and (2) how much to provide for exploration and exposure to real-world
complexity.

Training. Instruction which emphasizes job-specific, near-transfer learning objectives; traditionally skills-based instruction, as opposed to education.

Instructional Design Process

STEP 1: ANALYZE.

1. Goal - One of the keys to successful instructional design is beginning with a clear picture
of your desired end result. In other words, you have to know exactly where you want to
go!

Begin by reviewing the overall goal of your technology project. Consider the following
questions before formulating and writing your goal statement on the planning form:

o Why are you doing this project?
o How do you hope this project will enhance learning for your students?
o What learning challenge(s) is this project expected to conquer?
2. Audience - Another key to successful instructional planning is having at least a general
idea of the learning characteristics and needs of the students.

Continue your analysis by listing the probable characteristics of students who will be the
target audience for your project. Consider the following questions to help guide your
thinking as you develop your learner profile:

o What classification of students generally take this course?
o Are most of them majors or non-majors in the discipline?
o What have they struggled with most in the past?
o Why do most of them take the course (general education, major requirement,
elective, etc.)?
o How much background knowledge do they typically have on the subject?
o Generally speaking, what are their attitudes toward the course content?
o What is the extent of prior experience with the content for most students who
take the course?

STEP 2: DESIGN AND DEVELOP

1. General topics - The first step in designing your specific learning outcomes is to define
the scope of the project. You began thinking about the scope when you stated the overall
goal. Continue by listing the major topics of information and/or knowledge you expect
students to study.

Before listing the general topics that will define the scope of your project, consider the
following questions:

o What is the big picture?
o What are the major topics studied in this class?
o What topics are listed on the syllabus?
o What are the general chapter headings in the textbook?
2. "Performance-Based" Learning Outcomes - The terms listed below are essentially
synonymous. They refer to course goals that:
1. specify the information and/or skills to be mastered AND
2. specify what students will do to demonstrate mastery.

• learning outcomes
• performance-based outcomes
• learning objectives
• performance-based learning outcomes
• course objectives
• performance-based objectives
• performance outcomes
• performance-based learning objectives

Once developed, these learning outcomes are included in the course syllabus for two reasons.
First, they clarify for students exactly what they will be expected to learn. Second, they tell
students exactly what they will have to do to earn grades reflecting various levels of mastery.
When developing performance-based learning outcomes, it is important to keep the following
distinction in mind:

• activities designed to help students master information and skills ARE DIFFERENT
FROM
• activities designed to allow students to demonstrate the extent to which they have
mastered the information and skills

Instructional Design Taxonomies

• Bloom's Cognitive Taxonomy
o evaluation - judge value of ideas, appraise, predict, assess, select, rate, choose
o synthesis - put together parts, compose, construct, formulate, manage, prepare,
design, plan
o analysis - dissect parts, detect relationships, diagram, compare, differentiate,
criticize, debate
o application - use methods, concepts, principles, apply, practice, demonstrate,
illustrate, operate
o comprehension - understand information, discuss, explain, restate, report, tell,
locate, express, recognize
o knowledge - recall information, define, repeat, list, name, label, memorize
• Krathwohl's Affective Taxonomy
o characterizing - incorporate ideas completely into practice, recognized by the use
of them
o organizing - commits to using ideas, incorporates them into activity
o valuing - thinks about how to take advantage of ideas, able to explain them well
o responding - answers questions about ideas
o receiving - listens to ideas

VERBS that reflect the various levels of cognitive thinking in BLOOM'S TAXONOMY
KNOWLEDGE (literal-level thinking)
cite label name define list quote pronounce reproduce
identify match recite state tell recall remember repeat
recognize describe memorize locate draw write select
COMPREHENSION (literal-level thinking)
alter discover manage relate summarize explain rephrase restate
give convert substitute match represent change depict translate
vary describe illustrate reword distinguish interpret paraphrase transform
infer review generalize extend examples express predict compare
APPLICATION (critical thinking)
apply make manage relate classify employ predict show
use evidence prepare solve demonstrate manifest present utilize
direct practice compute report illustrate change choose interpret
draw model modify sketch dramatize paint collect produce
ANALYSIS (critical thinking)
ascertain diagram outline diagnose reduce contrast survey differentiate
associate conclude examine designate determine organize research investigate
separate compare point out dissect categorize infer subdivide distinguish
construct classify separate analyze divide find discriminate
SYNTHESIS (critical thinking)
combine devise integrate revise conceive propose generalize originate
compose extend construct organize design add to rearrange synthesize
create pose project produce expand rewrite develop role play
modify plan imagine compile invent theorize formulate hypothesize
EVALUATION (critical thinking)
judge conclude appraise evaluate select defend weigh recommend
decide criticize compare consider contrast deduce verify summarize
relate solve critique assess justify debate

Objectives of Instructional Design

Learning is an active process in which learners construct new ideas or concepts based upon their
current/past knowledge. The learner selects and transforms information, constructs hypotheses,
and makes decisions, relying on a cognitive structure to do so. Cognitive structure (i.e., schema,
mental models) provides meaning and organization to experiences and allows the individual to
"go beyond the information given".

Thus,

1. Instruction must be concerned with the experiences and contexts that make the student
willing and able to learn (readiness).
2. Instruction must be structured so that it can be easily grasped by the student (spiral
organization).
3. Instruction should be designed to facilitate extrapolation and/or fill in the gaps (going beyond the information given).

Foster a learning culture

1. Offer training, within an overall culture that encourages cooperation, risk-taking, and growth.

2. Get learners' buy-in and commitment in achieving training goals.

Motivate learners.

3. Demonstrate the value of the training to the learners and cultivate their sense of confidence in their ability to master the objectives.

Make training problem-centered.

4. Draw on authentic needs and contexts; make requirements of learning tasks similar to
important requirements of job tasks.

5. Encourage learners' active construction of meaning, drawing on their existing knowledge (Resnick, 1983).

6. Teach multiple learning outcomes together (Gagne & Merrill, 1990).

7. Sequence instruction so that learners can immediately benefit from what they learn by applying
it to real-world tasks.

Help learners assume control of their learning.


8. Provide coaching.

9. Provide scaffolding and support in performing complex tasks.

a. Adjust tools (equipment), task, and environment.

b. Provide timely access to information and expertise.

c. Provide timely access to performance feedback.

d. Utilize group problem-solving methods.

e. Provide help only when the learner is at an impasse and only enough help for the learner to
complete the task.

10. Fade support.

11. Minimize mean time to help (i.e., provide "just-in-time" training).

12. Encourage learners to reflect on their actions.

13. Encourage exploration.

14. Encourage learners to detect and learn from their errors.

Provide meaningful "practice."

15. Provide opportunities for learners to apply what they've learned in authentic contexts. If it is
not feasible to practice on real tasks, provide cases or simulations.

16. Personalize practice (Ross & Morrison, 1988).

Designing for Instructional Events

There are nine instructional events and corresponding cognitive processes:

1. Gaining attention (reception) - show a variety of examples related to the issue to be covered ...
2. Informing learners of the objective (expectancy) - pose questions and outline the objectives ...
3. Stimulating recall of prior learning (retrieval) - review summaries, introductions and
issues covered ...
4. Presenting the stimulus (selective perception) - adopt a definition and framework for
learning/understanding
5. Providing learning guidance (semantic encoding) - show case studies and best
practices ...
6. Eliciting performance (responding) - get user-students to create outputs based on
issues learnt ...
7. Providing feedback (reinforcement) - check all examples as correct/incorrect
8. Assessing performance (retrieval) - provide scores and remediation
9. Enhancing retention and transfer (generalization) - show examples and statements
and ask students to identify issues learnt ...
These events should satisfy or provide the necessary conditions for learning and serve as the basis for designing instruction and selecting appropriate media.
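
To make these events concrete, here is a minimal sketch in Python. It is purely illustrative: the event names follow the list above, but the lesson plan, its activities, and the unaddressed_events helper are hypothetical, showing only one way a designer might record an activity against each event and check that none has been overlooked.

    # Illustrative sketch: pair each of the nine instructional events with a
    # planned activity for a hypothetical lesson, then check for gaps.
    NINE_EVENTS = [
        "Gaining attention",
        "Informing learners of the objective",
        "Stimulating recall of prior learning",
        "Presenting the stimulus",
        "Providing learning guidance",
        "Eliciting performance",
        "Providing feedback",
        "Assessing performance",
        "Enhancing retention and transfer",
    ]

    # Hypothetical lesson plan; the activities are examples, not prescriptions.
    lesson_plan = {
        "Gaining attention": "Show a variety of examples of the issue to be covered",
        "Informing learners of the objective": "Pose questions and outline the objectives",
        "Stimulating recall of prior learning": "Review summaries of issues covered earlier",
        "Presenting the stimulus": "Adopt a definition and framework for the topic",
        "Providing learning guidance": "Walk through case studies and best practices",
        "Eliciting performance": "Have learners create outputs based on the issues learnt",
        "Providing feedback": "Mark each output correct/incorrect with comments",
        "Assessing performance": "Score a short quiz and assign remediation",
        "Enhancing retention and transfer": "Ask learners to identify the issues in a new example",
    }

    def unaddressed_events(plan):
        """Return any of the nine events that the plan does not yet cover."""
        return [event for event in NINE_EVENTS if not plan.get(event)]

    missing = unaddressed_events(lesson_plan)
    print("All nine events covered." if not missing else "Still to plan: " + ", ".join(missing))

A designer could extend each entry with the media chosen for that event, in line with the note above about selecting appropriate media.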

Examples of technology-based learning activities and interface design

Each learning activity below is paired with its corresponding interface design (a brief code sketch of the first two activities follows the list):

1. guided multiple choice exercises - helpful hints that require the learner to analyze their answer
2. stand-alone multiple choice - incremental feedback that increases with each step or answer
3. simple fill in the blank responses - alternative choices that are based on their response
4. move screen objects from one place to another - responses based on location of object(s) moved
5. "hypertext" references - definitions or new pieces of content displayed
6. clickable images - responses dependent on clicking correct images
7. computer-guided learning sequence - responses appear through guided steps of a process
8. simple presentation of examples and non-examples - based on correct identification of elements in a concept
9. presentation of smaller questions of a larger problem - text-based responses based on answers to simple questions
10. problem presented with text, image, and audio narration - based on correct identification of elements in a concept
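
As a sketch of the first two activities above (guided multiple choice with analytical hints, and incremental feedback that grows with each attempt), the Python fragment below is hypothetical: the question, options, and hints are invented, and the flow is only one of many ways such an exercise could behave.

    # Hypothetical guided multiple-choice item: each wrong attempt releases a
    # progressively more detailed hint that asks the learner to analyze the answer.
    QUESTION = "Which evaluation is carried out during instruction in order to improve it?"
    OPTIONS = {"a": "Summative evaluation", "b": "Formative evaluation", "c": "Needs assessment"}
    CORRECT = "b"
    HINTS = [
        "Think about when the evaluation happens, not only what it measures.",
        "One option describes analysis done before any design work; rule it out.",
        "The glossary above defines it as on-going evaluation with the purpose of improvement.",
    ]

    def run_item():
        """Ask until correct or the hints run out; feedback increases with each attempt."""
        print(QUESTION)
        for key, text in OPTIONS.items():
            print("  " + key + ") " + text)
        for attempt, hint in enumerate(HINTS, start=1):
            answer = input("Your answer: ").strip().lower()
            if answer == CORRECT:
                print("Correct - formative evaluation happens during instruction.")
                return True
            print("Not quite (attempt " + str(attempt) + "). Hint: " + hint)
        print("The answer was '" + CORRECT + "': " + OPTIONS[CORRECT] + ".")
        return False

    run_item()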

Key Aspects of an E-Learning System

Hari Srinivas
Target Identification
• Who is the eventual end-user?
• At what level does the end-user operate?
• What is the scope of activities of the end-user?
• What networks bring the end-users together?
• What local institutions/organizations work with the end-user?
• What do mandates and other guiding documents say of the end-user?

Needs Assessment
Besides the above points,
• What are the end-users' targets?
• What are the end-user's objectives and goals?
• What will be included in a needs-assessment questionnaire (keeping in mind that these assessments change depending on the end-users' targets, on the time scale, and on the scope/level of operation)?

Packaging Information
• How can existing projects be 'redesigned' to generate information for e-learning products?
• How will the above needs-assessment exercise help in identifying the format of the e-learning package?
• How can different information packages be developed from the same block of data/information of a project?
• How can the information format be matched with the differing needs of the end-user?
• How can the subsidiarity of the information used in e-learning be maintained – delivering the right information at the right time to the right level and the right end-user?

Marketing
• How can information on e-learning products be delivered to the end-user?
• What events can be used to disseminate e-learning meta-information, both online and offline?
• How can effective networking serve the timely delivery of e-learning products?

Delivery Modalities
• What appropriate information infrastructure needs to be developed for e-learning, online and offline?
• What components of the e-learning products need to be online and what components offline (electronic and hardcopy)?
• Depending on the target end-user, what will be the frequency, format and mode of delivery of the e-learning products?

Support Systems
• Besides the main e-learning product, what kinds of support systems need to be put in place: ongoing – during the course of e-learning; and follow-up – as a continuous learning exercise?
• What kinds of value-added resources can be delivered with the main e-learning product in order to make it more relevant to different end-users - 'individualized' or 'regionalized'?
• How will queries, comments and suggestions from users be processed? How will they be used to enhance the quality of e-learning products/processes?

Monitoring and Evaluation
• What will be the components of an M&E system?
o Monitoring the end-user's use of the e-learning product itself
o Monitoring the end-user's use of knowledge gained
o Assessment of the quality/quantity of information/knowledge provided in the e-learning package
Factors affecting Learning

Instructional Design is largely affected by how a user learns:

Meaningfulness effect Highly meaningful words are easier to learn and remember than less meaningful words. This is true whether meaningfulness is measured by 1) the number of associations the learner has for the word, 2) the frequency of the word, 3) familiarity with the sequential order of letters, or 4) the tendency of the word to elicit clear images.
An implication is that retention will be improved to the extent the user can make meaning of the material.

Serial position effects Serial position effects result from the particular placement of an item within a list. Memory is better for items placed at the beginning or end of a list rather than in the middle. An exception to these serial positions is the distinctiveness effect: an item that is distinctively different from the others will be remembered better, regardless of serial position.

Practice effects Active practice or rehearsal improves retention, and distributed practice is usually more effective than massed practice. The advantage of distributed practice is especially noticeable for lists, fast presentation rates, or unfamiliar stimulus material. The advantage apparently occurs because massed practice allows the learner to associate a word with only a single context, whereas distributed practice allows association with many different contexts.
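
As a small illustration of how distributed practice might be operationalized in a technology-based lesson, the sketch below schedules reviews at expanding intervals; the specific interval values are an assumption for the example, not figures drawn from the research summarized here.

    from datetime import date, timedelta

    # Illustrative only: space reviews of an item over time (distributed practice)
    # instead of repeating them all in one sitting (massed practice).
    REVIEW_INTERVALS_DAYS = [1, 3, 7, 14, 30]  # assumed spacing, not an empirical prescription

    def distributed_schedule(first_study, intervals=REVIEW_INTERVALS_DAYS):
        """Return the dates on which an item should be reviewed after first study."""
        review_dates = []
        current = first_study
        for gap in intervals:
            current = current + timedelta(days=gap)
            review_dates.append(current)
        return review_dates

    for review_day in distributed_schedule(date(2024, 1, 8)):
        print(review_day.isoformat())

Because each review falls on a different day, the learner encounters the material in several different contexts, which is the mechanism described in the paragraph above.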

Transfer effects Transfer effects are effects of prior learning on the learning of new material. Positive transfer occurs when previous learning makes new learning easier. Negative transfer occurs when it makes the new learning more difficult. The more that two tasks have in common, the more likely that transfer effects occur.

Interference effects. Interference effects occur when memory of particular material is hurt by previous or subsequent learning. Interference effects occur when trying to remember material that has previously been learned. Interference effects are always negative.

Organization effects Organization effects occur when learners chunk or categorize the input. Free
recall of lists is better when learners organize the items into categories rather than attempt to
memorize the list in serial order.

Levels-of-Processing effects The more deeply a word is processed, the better it will be remembered. Semantic encoding of content is likely to lead to better memory. Elaborative encoding improves memory by making sentences more meaningful.

State-Dependent effects State- or context-dependent effects occur because learning takes place within a specific context, and the material is most accessible later, at least initially, within the same context. For example, lists are more easily remembered when the test situation more closely resembles the learning situation, apparently due to contextual cues available to aid in information retrieval.

Mnemonic effects Mnemonics are strategies for elaborating on relatively meaningless input by associating the input with more meaningful images or semantic context. Four well-known mnemonic methods are the place method, the link method, the peg method and the keyword method.

Abstraction effects Abstraction is the tendency of learners to pay attention to and remember the gist of a passage rather than the specific words of a sentence. In general, to the extent that learners assume the goal is understanding rather than verbatim memory, and to the extent that the material can be analyzed into main ideas and supportive detail, learners will tend to concentrate on the main ideas and to retain these in semantic forms that are more abstract and generalized than the verbatim sentences included in the passage.

Levels effect This effect occurs when the learner perceives that some parts of the passage are
more important than others. Parts that occupy higher levels in the organization of the passage will
be learned better than parts occupying low levels.

Prior Knowledge effects Prior knowledge effects will occur to the extent that the learner can use
existing knowledge to establish a context or construct a schema into which the new information
can be assimilated.

Inference effects Inference effects occur when learners use schemas or other prior knowledge to
make inferences about intended meanings that go beyond what is explicitly stated in the text.
Three kinds of inferences are case grammar pre-suppositions, conceptual dependency
inferences and logical deductions.

Student misconception effects. Prior knowledge can lead to misconceptions. Misconceptions may be difficult to correct because the learner may not be aware that the knowledge is a misconception. Misconceptions occur when input is filtered through schemas that are oversimplified, distorted or incorrect.

Text Organization Effects Text organization refers to the effects that the degree and type of organization built into a passage have on the degree and type of information that learners encode and remember. Structural elements such as advance organizers, previews, logical sequencing, outline formats, highlighting of main ideas and summaries assist learners in retaining information. These organization effects facilitate chunking, subsumption of material into schemas and related processes that enable encoding as an organized body of meaningful knowledge. In addition, text organization elements cue learners to which aspects of the material are most important.

Mathemagenic Effects
Mathemagenic effects, a term coined by Rothkopf (1970), refer to the various things that learners do to prepare and assist their own learning; they reflect learners' active information processing. Mathemagenic activities include answering adjunct questions or taking notes, and can enhance learning.

Tools to Enable Instructional Strategies

If you selected one of the following strategies, then the following technology tools can help enable it:

A. Conversing, Discussing - e-mail, listservs, discussion boards, chat
B. Mentoring, Questioning, Supporting a Partner - e-mail; live, synchronous camera(s) for mentor/mentee to discuss; a chat room with white board; digital drop boxes for file sharing and written critiques
C. Debating - e-mail, discussion boards, web sites that showcase controversies or experts with opinions and theories; use resources as the basis for discussion, such as www.ideachannel.com
D. Impersonating, Role Playing - asynchronous tools (i.e., e-mail, discussion boards, chat) or synchronous tools (i.e., Symposium, CU-SeeMe, live net-cams)
E. Sharing Data, Analyzing - e-mail, listservs, spreadsheets, data analysis software
F. Developing a New Product or Artifact - web page editors for students, e-mail and other communication tools, digital drop boxes for file sharing, server space to post projects online, tools that allow for voting on or attaching comments to students' work for the purpose of recognizing best or improving weak artifacts
G. Traveling Virtually, Situating Curriculum in the Context of Expeditions - a significant grant budget may be required to create live expeditions, consisting of technology to upload live broadcasts to satellites and back down to Internet servers with live audio/video streams; alternatively, quests could be videotaped and delivered at a later time via standard Internet video streaming
H. Seeking, Collecting, Organizing, Synthesizing Online Information (Research) - web resources, either individual pages related to a course, or entire archives from which students conduct research to identify topics of interest or relevance to assignments
I. Exploring Real World Cases and Problems - web-page editors (e.g., Dreamweaver), photo editors (e.g., Photoshop), perhaps video editors (e.g., Premiere) and knowledge of video streaming for the Internet (e.g., Real Producer)
J. Accessing Tutorials with Exercises, Quizzes, Questions, Online Drill-and-Practice - for creating virtual exercises, knowledge of multimedia development programs (e.g., Director, Flash) and/or mechanisms for placing them on the Internet (e.g., Shockwave, Java)

Assess Instructional Outcomes

Assessing the competencies developed as a result of learning is critical. This table shows competencies and ways to measure them:

evaluation - rubrics, critical thinking scales; rate quality of student arguments, predictions, conclusions
synthesis - products or artifacts synthesized by students (web pages, reports); rate according to desired criteria: originality, organizational scheme, appropriate use of evidence versus conjecture
analysis - debates, critiques, discussions, case analyses; assess student ability to extract relevant variables underlying a problem, issue, or situation
application - word problems, experiments; assess student ability to apply principles and theories to solve novel problems
comprehension - short answer questions
knowledge - multiple choice, true-false, matching
characterizing - practical experiences; interview, observe student beyond class, in real settings
organizing - projects, cases
valuing - discussions
responding - problems, questions
receiving - problems, questions
