Study 8: Concurrent Schedules of Reinforcement

15 March 1999


Abstract

Rats will be given a choice between two variable-interval schedules of reinforcement. A changeover analysis will be used to determine whether the amount of time spent on one side differs from the amount spent on the other.


Introduction

Schedules of reinforcement have become standard tools for the study of choice. One paradigm in particular has dominated operant research on choice: concurrent schedules of reinforcement.

In a concurrent procedure, an animal can choose between two (or more) independent schedules of reinforcement. Typically, each schedule is associated with a different response. Of interest is the amount and pattern of responding on each schedule. A good review is provided by de Villiers (1977). Interestingly, however, there are few studies involving rats as subjects. Norman and McSweeney (1978) reported one such study; it provides a good overview of how a concurrent schedule can be established.

In our study, we will examine our rats' performance on a concurrent variable-interval variable-interval schedule. That is, our rats will be able to choose between two schedules, each of which delivers reinforcement on a variable interval schedule. One of these will be a VI-100 sec schedule and the other will be a VI-30 sec schedule. For this week, we will concentrate on the pattern of switches, or changeovers, between the different responses. Next week, we'll examine how choice depends upon the relative rate of reinforcement earned from each schedule.
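Although the interval lists we will actually use are fixed in advance (see the tables handed out in class), it may help to see how such lists are often constructed. Below is a minimal Python sketch, assuming intervals are sampled from an exponential distribution with the schedule's mean value; the function name and the printed lists are illustrations only.

    # Illustrative only: sample n interreinforcement intervals with a given mean.
    # Our actual VI-30 and VI-100 intervals come from the tables handed out in class.
    import random

    def vi_intervals(mean_seconds, n):
        return [random.expovariate(1.0 / mean_seconds) for _ in range(n)]

    print([round(t) for t in vi_intervals(30, 10)])    # something like a VI-30 sec list
    print([round(t) for t in vi_intervals(100, 10)])   # something like a VI-100 sec list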


Methods

Subjects:

Our Sprague-Dawley rats will serve as subjects.

Apparatus:

We will be using the six custom-constructed chambers to test our animals. Each of these chambers will be fitted with two response levers that the animal can press. In addition, we will use electric clocks to measure time intervals during training. Reinforcers will consist of chocolate sprinkles delivered to the food cup in each box after a tap on the chamber wall.

Procedure:

Our lab will be 60 minutes long. During that time we will reinforce the rats according to a VI-30 sec schedule on the right lever of each chamber, and a VI-100 sec schedule on the left lever.

The interreinforcement times for each schedule are given in Tables 1 and 2 (to be passed out in class). As in last week's study, reinforcement is delivered for the first response after the time interval has elapsed since the previous reinforcement. In this case, however, we have two responses, each with its own schedule of reinforcement. This means that a time interval could elapse on, say, the left response, while the rat is pressing the bar on the right. Your team will have to keep track of both schedules, and of the times at which a reinforcement "hold" is in effect. In addition, to analyze changeovers, you should record the time that an animal starts responding on a particular lever and the time at which he switches to the other.
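To make the bookkeeping concrete, here is a minimal Python sketch of the logic each team needs to follow for the two schedules. The class name, interval lists, and times are our own illustrations, not part of the lab protocol; the real intervals come from Tables 1 and 2.

    # Minimal sketch of the two-schedule bookkeeping, assuming one session clock
    # in seconds. The interval lists below are made up; use Tables 1 and 2.
    class LeverSchedule:
        """Tracks one lever's VI schedule and whether a reinforcer is on 'hold'."""
        def __init__(self, intervals):
            self.intervals = list(intervals)              # interreinforcement times, in order
            self.next_available = self.intervals.pop(0)   # clock time when a reinforcer becomes available
            self.holding = False

        def update(self, now):
            # Once its interval has elapsed, the reinforcer is held until collected.
            if not self.holding and now >= self.next_available:
                self.holding = True

        def response(self, now):
            """Call when the rat presses this lever; returns True if the press is reinforced."""
            self.update(now)
            if self.holding:
                self.holding = False
                if self.intervals:
                    # The next interval is timed from this reinforcement.
                    self.next_available = now + self.intervals.pop(0)
                else:
                    self.next_available = float('inf')    # no intervals left in the list
                return True
            return False

    left = LeverSchedule([100, 80, 120])   # VI-100 sec, left lever (illustrative intervals)
    right = LeverSchedule([30, 25, 40])    # VI-30 sec, right lever (illustrative intervals)
    print(right.response(31))              # True: 30 s have elapsed on the right schedule
    print(left.response(31))               # False: the left interval (100 s) has not elapsed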

Probably the best way to do this is to have each member of a team watch one response lever. He or she can keep track of behavior and reinforcement availability on that lever. As was the case last week, the digital timers will be available to save eyestrain. However, since only one timer is available, you won't be able to reset it after each reinforcement. You'll have to note down the time at which a reinforcement occurred, add the next interval to that time, and then reinforce the animal when he next responds after the new time. For example, suppose you're in charge of the right response. If the timer reads 154 seconds when a rat is reinforced for a right response, and the next interval is 17 seconds, then reinforce the rat for the next right response after the timer reads 171 seconds. Remember also to record the time that the animal switches from the left response to the right, and the time that he switches from the right to the left.
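In code, the arithmetic for one observer amounts to nothing more than the following sketch (the function name is ours):

    # The running-timer arithmetic for one lever: the next response is reinforced
    # once the timer passes last_reading + next_interval.
    def next_eligible_reading(last_reading, next_interval):
        return last_reading + next_interval

    print(next_eligible_reading(154, 17))  # 171, as in the example above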

After the session, weigh and feed your rat.


Results

We will be looking at changeover performance for this lab. Although the distribution of changeover times is important to many theoretical analyses of operant behavior, few studies have looked at the pattern of changeovers. Heyman's study with pigeons (Heyman, 1979) is one of these; he found that changeovers were random in time, which suggests that the times between changeovers should follow an exponential distribution. To my knowledge, no studies have examined this question with respect to operant behavior in rats.
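For reference, an exponential distribution of times between changeovers with mean m has the density (a standard result, not something we will derive in lab):

    f(t) = (1/m) * exp(-t/m),   t >= 0

so short stays on a lever should be common and long stays progressively rarer.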

As a simple graphical test of this hypothesis, plot a frequency histogram of the times between changeovers, that is, of how long the animal stays on one lever before switching, separately for the left and right levers. Because there are training factors involved in learning the concurrent schedule, you may want to plot only changeovers from the latter part of the session. Decide what part of the session you want to plot, and comment on what you think is a good basis for that choice.
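If you prefer to do this by computer rather than by hand, a sketch along the following lines would work in Python with the matplotlib package. The numbers in stay_times are made up; type in your own recorded values.

    # Histogram of times between changeovers, with an exponential curve scaled to
    # the same sample size overlaid for comparison. All data below are made up.
    import math
    import matplotlib.pyplot as plt

    stay_times = [12, 5, 40, 8, 22, 3, 17, 55, 9, 14]   # seconds on a lever before switching

    counts, edges, _ = plt.hist(stay_times, bins=10)
    bin_width = edges[1] - edges[0]
    m = sum(stay_times) / len(stay_times)                # sample mean of the stay times

    # Expected bin counts if the stay times were exponential with mean m.
    xs = [edges[0] + i * (edges[-1] - edges[0]) / 100 for i in range(101)]
    expected = [len(stay_times) * bin_width * (1.0 / m) * math.exp(-x / m) for x in xs]
    plt.plot(xs, expected)

    plt.xlabel("Time between changeovers (s)")
    plt.ylabel("Frequency")
    plt.title("Changeover times, latter part of the session")
    plt.show()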


References

de Villiers, P.A. (1977) Choice in concurrent schedules and a quantitative formulation of the law of effect. In: W.K. Honig and J.E.R. Staddon (eds.) Handbook of operant behavior. Englewood Cliffs, NJ: Prentice-Hall, pp. 233-287.

Heyman, G.M. (1979) A Markov model description of changeover probabilities on concurrent variable-interval schedules. Journal of the Experimental Analysis of Behavior, 31: 41-51.

Norman, W.D. and McSweeney, F.K. (1978) Matching, contrast, and equalizing in the concurrent lever-press responding of rats. Journal of the Experimental Analysis of Behavior, 29: 453-462.

