
1. Fixed Ratio Schedule Description: A reinforcer is given after a specified number of correct responses.

This schedule is used to increase and maintain a steady rate of specific responses. It is the best schedule for learning a new behavior. Produces:

A high, steady rate of responding until reinforcement

A post-reinforcement pause

Ratio strain with large increases in the response requirement

High resistance to extinction (RTE)

Examples in natural environments:

Jobs that pay based on units delivered. Employees often find this schedule undesirable because it produces a rate of response that leaves them nervous and exhausted at the end of the day. They may feel pressured not to slow down or take rest breaks, since they feel that doing so will cost them money. This is an example of how a schedule can produce a high rate of response even though the response rate is aversive to the subject.

Examples in video games

Collecting tokens. Many games require the player to collect a fixed number of tokens to advance to the next level, obtain a new life point, or receive some other reinforcer.

Attaining a new level in an RPG. Some RPGs clearly indicate how much experience is required to achieve the next level. A high degree of certainty about the amount of work required to reach the next level puts the player on a fixed ratio schedule.
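The token and experience examples above can be sketched as a simple counter: the game delivers the reinforcer every time the response count reaches the fixed requirement. The class and numbers below are illustrative, not taken from any particular game.

```python
class FixedRatioSchedule:
    """Deliver a reinforcer after every `ratio` responses (an FR-n schedule)."""

    def __init__(self, ratio):
        self.ratio = ratio
        self.responses = 0

    def respond(self):
        """Record one response; return True if it earns the reinforcer."""
        self.responses += 1
        if self.responses >= self.ratio:
            self.responses = 0  # reset the count after each reinforcement
            return True
        return False

# FR-5: every fifth token collected pays off
schedule = FixedRatioSchedule(5)
outcomes = [schedule.respond() for _ in range(10)]
# responses 5 and 10 are reinforced; all others are not
```

Resetting the counter after each payoff is what creates the predictable stretch of unreinforced work in which the post-reinforcement pause occurs.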

Fixed Interval Schedule Description: A reinforcer is given for the first response after a fixed time interval has elapsed.

In the world of psychology, fixed interval refers to a schedule of reinforcement used within operant conditioning. You might remember that operant conditioning is a type of associative learning in which a person's behavior changes according to that behavior's consequences.

Fixed interval schedules of reinforcement are but one of four traditional ways in which this type of associative learning occurs. So, how do fixed interval schedules of reinforcement work? Let's take a closer look at the two words that comprise the concept itself. First, what is meant by fixed? In this context, it means that the amount of time between reinforcement opportunities is set and does not change; if the interval varies from one reinforcement to the next, then it is not fixed. The second component, interval, refers to the passage of time. So, a fixed interval schedule of reinforcement happens when reinforcement becomes available after a set amount of time and impacts behavior.

Everyday Examples

It doesn't matter what your age is, where you live, or what you do for a living - fixed interval schedules affect us all every day. Let's look at an example that involves perhaps the most common reinforcer: money. Say you're a salaried employee, and you receive your paycheck biweekly. The behavior of showing up for work when you're scheduled is reinforced with a paycheck every two weeks. Knowing when your paycheck is coming makes it much more likely that you will continue to show up to work.

Let's look at another example. Say you get a new pet fish. You get home from school every day at 4:00 p.m., and the first thing you do is feed your new fish. After a few days of this, you might notice your fish swimming toward the top of his tank every day around 4:00. Does the fish know that it's 4:00? Not exactly, but thanks to operant conditioning and fixed intervals of reinforcement, he learns that, after a set amount of time, if he hangs out around the top of the tank, he gets to eat.

Here is one more example. If you have children, or have ever been around children, you are well aware that there are times when children don't always want to do what you think they should be doing.
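The paycheck and fish-feeding examples above share one rule: only the first response after the fixed time has elapsed is reinforced, and responses before that point earn nothing. A minimal sketch, with the class name and time units as my own assumptions:

```python
class FixedIntervalSchedule:
    """Reinforce the first response after `interval` time units have elapsed."""

    def __init__(self, interval):
        self.interval = interval
        self.last_reinforcement = 0.0

    def respond(self, now):
        """Record a response at time `now`; return True if it is reinforced."""
        if now - self.last_reinforcement >= self.interval:
            self.last_reinforcement = now
            return True
        return False

# FI-14: a biweekly paycheck, with time measured in days
paycheck = FixedIntervalSchedule(14)
results = [paycheck.respond(day) for day in (1, 7, 14, 15, 27, 28)]
# only the responses on days 14 and 28 are reinforced
```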
Let's say that you have a 3-year-old who is fond of using her crayons on the wall instead of the pages of her coloring book. Most parents would be interested in learning how to alter this little one's coloring behavior, and using a fixed interval schedule of reinforcement would be one way to accomplish this. Here's one way to do it: after you show her how to color on the pages, offer her praise for coloring in the coloring book on a fixed schedule of time (say, once every 60 seconds). As long as your praise is consistent, it won't take long before she comes to expect the praise every 60 seconds and associates it with coloring in her coloring book.

2. Variable Ratio Schedule Description: A reinforcer is given after an unpredictable number of correct responses that varies around an average.

This schedule is used to increase and maintain a steady rate of specific responses. It is the best schedule for maintaining a behavior.

Produces:

A high, steady rate of responding until reinforcement

No (or only a very small) post-reinforcement pause

The value of a variable ratio schedule can be increased at a greater rate than that of a fixed ratio schedule without producing ratio strain

High resistance to extinction (RTE); the variable ratio values that can maintain responding are somewhat higher than those of a fixed ratio

Examples in natural environments:

Slot machines. Slot machines are programmed on a VR schedule. The gambler has no way of predicting how many times he must put a coin in the slot and pull the lever to hit a payoff, but the more times a coin is inserted, the greater the chance of a payout. People who play slot machines are often reluctant to leave them, especially when they have had a large number of unreinforced responses. They are concerned that someone else will win the moment they leave.

Playing golf. It only takes a few good shots to encourage the player to keep playing or play again. The player is uncertain how good each shot will be, but the more often they play, the more likely they are to get a good shot.

Door-to-door sales. It is uncertain how many houses a salesperson will have to visit to make a sale, but the more houses they try, the more likely they are to succeed.

Examples in video games

Collecting tokens. Some games require the player to collect tokens to achieve a life point or reward but vary the number of tokens required.

Achieving a new level in an RPG. Some RPGs give no clear indication of how much experience is required to achieve the next level. This puts the player on a variable ratio schedule.

Obtaining frags in an FPS. Many of your shots will fail and it is not certain what the outcome of any particular shot will be. However, the more targets you shoot at, the more likely you are to get a frag.

Crafting in an RPG. It may take multiple attempts to succeed, or to gain a new level, but the more you try, the more likely your behavior is to be reinforced.
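All of the variable ratio examples above, from slot machines to crafting, can be approximated in code by a "random ratio": each response is reinforced with probability 1/n, so the number of responses between reinforcers varies around an average of n. The class, seed, and numbers below are illustrative assumptions.

```python
import random

class VariableRatioSchedule:
    """Approximate a VR-n schedule: each response pays off with probability
    1/n, so responses between reinforcers vary around an average of n."""

    def __init__(self, mean_ratio, seed=None):
        self.p = 1.0 / mean_ratio
        self.rng = random.Random(seed)  # seeded for reproducibility

    def respond(self):
        """Record one response; return True if it is reinforced."""
        return self.rng.random() < self.p

# A slot machine on roughly a VR-10 schedule
slot_machine = VariableRatioSchedule(10, seed=42)
payouts = sum(slot_machine.respond() for _ in range(10_000))
# over 10,000 pulls, payouts lands near 1,000 (one in ten on average)
```

Because the next payoff is always possible on the very next response, there is no point at which quitting feels "safe" - which is the mechanism behind the reluctance to leave a slot machine described above.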

4. Variable Interval Schedule Description: The first response after a variable time interval is reinforced.

Produces:

A moderately steady rate of responding

No (or only a small) post-reinforcement pause

High resistance to extinction (RTE)

Examples in natural environments:

A parent attending to the cries of a child. Parents will not typically attend to the child each time it cries, but will leave him or her to fuss for a period before attending.

Checking voicemail. Calls can arrive at any time so there is a variable interval between each voicemail received.

Fishing. If you just cast your hook and wait, fishing operates on a variable interval basis. However, if you have to cast several times to catch a fish, this is more like a duration schedule (see below).

Examples in video games:

Waiting for monsters to re-spawn in a game where respawning occurs at variable intervals. Note: In multiplayer games other players may be waiting for the monster to respawn as well, in which case there is a variable interval, limited hold schedule (see below).

Checking for mail messages from other players.
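The respawn and voicemail examples above can be sketched like the fixed interval case, except that a hidden timer of random length must expire before the next response pays off, and the timer is redrawn after each reinforcement. The class and interval distribution below are assumptions for illustration.

```python
import random

class VariableIntervalSchedule:
    """Reinforce the first response after a randomly drawn interval elapses;
    a new interval is drawn after each reinforcement."""

    def __init__(self, mean_interval, seed=None):
        self.mean = mean_interval
        self.rng = random.Random(seed)
        self.last_reinforcement = 0.0
        self.current_interval = self._draw()

    def _draw(self):
        # uniform between 0.5x and 1.5x the mean keeps the average at the mean
        return self.rng.uniform(0.5 * self.mean, 1.5 * self.mean)

    def respond(self, now):
        """Record a response at time `now`; return True if it is reinforced."""
        if now - self.last_reinforcement >= self.current_interval:
            self.last_reinforcement = now
            self.current_interval = self._draw()
            return True
        return False

# Checking voicemail once an hour; messages arrive on roughly a
# VI-3 (hours) schedule, so most checks find nothing
voicemail = VariableIntervalSchedule(3, seed=1)
checks = [voicemail.respond(hour) for hour in range(1, 25)]
```

Because the player cannot tell how close the timer is to expiring, steady checking is the best strategy - which is why this schedule produces the moderate, persistent response rate described above.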

Fixed Duration Schedules Description: To be reinforced, the behavior must occur continuously throughout a fixed time interval. This is used to increase a behavior when it is desirable that the behavior persist throughout a period of time.

Produces:

Long periods of continuous behavior until reinforcement

A post-reinforcement pause

Strain with a large increase in duration

Moderate resistance to extinction (RTE)

Examples in natural environments:

A worker paid by the hour. To receive pay, the worker needs to work steadily throughout the work period.

A student practicing piano. It is desired that the student continue practicing throughout the practice period, so reinforcement will only be given if they practice continuously.

Examples in video games

Game tutorials. The player may be required to perform a behavior for a fixed period of time to complete a tutorial and advance. Fixed duration schedules are particularly suited for teaching players the mechanics of a game.

Games with a fixed time limit for a level. In order to advance, the player must continuously perform an activity throughout the period; e.g., shooting alien ships or clearing all items from the level.

Variable Duration Schedules Description: To be reinforced, the behavior must occur continuously throughout a variable time interval. This schedule is used to increase behaviors that should persist throughout a period of time.

Produces:

Long periods of continuous behavior until reinforcement

No post-reinforcement pause

Less strain with increases in average duration

High resistance to extinction (RTE)

Examples in natural environments:

Stalking prey. The interval between catches is variable, and the animal must continue to stalk its prey throughout the period. Note that there will be a pause in stalking after the animal has caught its prey, but this is due to satiation, not a typical post-reinforcement pause.

Rubbing two sticks together to produce fire. The time it takes to produce fire varies, but the activity must continue throughout the whole period.

Examples in video games

Hunting simulations. Where the player is required to stalk prey, they are on a variable duration schedule: a variable period is required to succeed, and the player must continue stalking throughout the whole period.

Flight simulators. The player must fly the plane throughout the period to successfully land the plane, or complete a mission.

Racing games. The player must drive the vehicle for the entire race to win.
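Both duration schedules above can be sketched as a timer that resets whenever the behavior is interrupted: the reinforcer is delivered only once the behavior has continued unbroken for the required time (a constant for fixed duration; redrawn at random for variable duration). The class and tick-based timing below are illustrative.

```python
class DurationSchedule:
    """Reinforce behavior that continues uninterrupted for `required` ticks.
    Any break resets the timer (fixed duration if `required` never changes)."""

    def __init__(self, required):
        self.required = required
        self.elapsed = 0

    def tick(self, behaving):
        """Advance one time step; return True if the duration is completed."""
        if not behaving:
            self.elapsed = 0  # any interruption resets progress to zero
            return False
        self.elapsed += 1
        if self.elapsed >= self.required:
            self.elapsed = 0
            return True
        return False

# A race that must be driven for 5 ticks without stopping
race = DurationSchedule(5)
log = [race.tick(b) for b in (True, True, True, False,        # stop: reset
                              True, True, True, True, True)]  # completed
# only the final tick is reinforced; the early stop wiped out all progress
```

The reset on interruption is the defining feature: it is what makes continuous performance, rather than total accumulated effort, the reinforced behavior.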
