Opportunities to Respond
Study: Lambert et al. (2006)
Summary
Opportunities to Respond is an intervention that involves providing all students in a group or classroom with the means (e.g., dry erase board, response cards) to respond to all questions posed by the teacher. The intent is to increase engagement by giving students the opportunity to respond to academic questions at a higher rate than the traditional form of hand raising provides.
- Target Grades:
- K, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
- Target Populations:
-
- Students with disabilities only
- Students with learning disabilities
- Students with emotional or behavioral disabilities
- English language learners
- Any student at risk for emotional and/or behavioral difficulties
- Area(s) of Focus:
-
- High Levels of Disengagement
- Disruptive Behavior
- Anxiety
- Where to Obtain:
- N/A
- Initial Cost:
- Free
- Replacement Cost:
- Contact vendor for pricing details.
-
Only supplies are needed for responding (e.g., cards, white boards and markers).
- Staff Qualified to Administer Include:
-
- Special Education Teacher
- General Education Teacher
- Reading Specialist
- Math Specialist
- EL Specialist
- Interventionist
- Student Support Services Personnel (e.g., counselor, social worker, school psychologist, etc.)
- Applied Behavior Analysis (ABA) Therapist or Board Certified Behavior Analyst (BCBA)
- Paraprofessional
- Other: Usually implemented by teachers or researchers, but could be implemented by a paraprofessional.
- Training Requirements:
- Training not required
-
In published research, training was generally not described. When training occurred, it consisted of a brief (e.g., 1 hour or less) didactic description and sometimes modeling.
No; however, the intervention is clearly described in published research.
- Access to Technical Support:
- Not available
- Recommended Administration Formats Include:
-
- Small group of students
- BI ONLY: A classroom of students
- Minimum Number of Minutes Per Session:
- Minimum Number of Sessions Per Week:
- Minimum Number of Weeks:
- Detailed Implementation Manual or Instructions Available:
- Yes
- Is Technology Required?
- No technology is required.
Program Information
Descriptive Information
Please provide a description of program, including intended use:
Opportunities to Respond is an intervention that involves providing all students in a group or classroom with the means (e.g., dry erase board, response cards) to respond to all questions posed by the teacher. The intent is to increase engagement by giving students the opportunity to respond to academic questions at a higher rate than the traditional form of hand raising provides.
The program is intended for use in the following age(s) and/or grade(s).
Age 3-5
Kindergarten
First grade
Second grade
Third grade
Fourth grade
Fifth grade
Sixth grade
Seventh grade
Eighth grade
Ninth grade
Tenth grade
Eleventh grade
Twelfth grade
The program is intended for use with the following groups.
Students with learning disabilities
Students with intellectual disabilities
Students with emotional or behavioral disabilities
English language learners
Any student at risk for academic failure
Any student at risk for emotional and/or behavioral difficulties
Other
If other, please describe:
ACADEMIC INTERVENTION: Please indicate the academic area of focus.
Early Literacy
Alphabet knowledge
Phonological awareness
Early writing
Early decoding abilities
Other
If other, please describe:
Language
Grammar
Syntax
Listening comprehension
Other
If other, please describe:
Reading
Phonics/word study
Comprehension
Fluency
Vocabulary
Spelling
Other
If other, please describe:
Mathematics
Concepts and/or word problems
Whole number arithmetic
Comprehensive: Includes computation/procedures, problem solving, and mathematical concepts
Algebra
Fractions, decimals (rational number)
Geometry and measurement
Other
If other, please describe:
Writing
Spelling
Sentence construction
Planning and revising
Other
If other, please describe:
BEHAVIORAL INTERVENTION: Please indicate the behavior area of focus.
Externalizing Behavior
Verbal Threats
Property Destruction
Noncompliance
High Levels of Disengagement
Disruptive Behavior
Social Behavior (e.g., Peer interactions, Adult interactions)
Other
If other, please describe:
Internalizing Behavior
Anxiety
Social Difficulties (e.g., withdrawal)
School Phobia
Other
If other, please describe:
Acquisition and cost information
Where to obtain:
- Address
- Phone Number
- Website
Initial cost for implementing program:
- Cost
- $0.00
- Unit of cost
Replacement cost per unit for subsequent use:
- Cost
- Unit of cost
- Duration of license
Additional cost information:
Describe basic pricing plan and structure of the program. Also, provide information on what is included in the published program, as well as what is not included but required for implementation (e.g., computer and/or internet access)
Only supplies are needed for responding (e.g., cards, white boards and markers).
Program Specifications
Setting for which the program is designed.
Small group of students
BI ONLY: A classroom of students
If group-delivered, how many students compose a small group?
3-30
Program administration time
- Minimum number of minutes per session
- Minimum number of sessions per week
- Minimum number of weeks
- If the intervention program is intended to occur less frequently than 60 minutes a week for approximately 8 weeks, justify the level of intensity:
Does the program include highly specified teacher manuals or step-by-step instructions for implementation?
- Yes
BEHAVIORAL INTERVENTION: Is the program affiliated with a broad school- or class-wide management program?
- No
If yes, please identify and describe the broader school- or class-wide management program:
Does the program require technology?
- No
If yes, what technology is required to implement your program?
Computer or tablet
Internet connection
Other technology (please specify)
If your program requires additional technology not listed above, please describe the required technology and the extent to which it is combined with teacher small-group instruction/intervention:
Training
- How many people are needed to implement the program?
- 1
Is training for the instructor or interventionist required?
- No
- If yes, is the necessary training free or at-cost?
- Free
Describe the time required for instructor or interventionist training:
- Less than 1 hour of training. Training is not required.
Describe the format and content of the instructor or interventionist training:
- In published research, training was generally not described. When training occurred, it consisted of a brief (e.g., 1 hour or less) didactic description and sometimes modeling.
What types of professionals are qualified to administer your program?
General Education Teacher
Reading Specialist
Math Specialist
EL Specialist
Interventionist
Student Support Services Personnel (e.g., counselor, social worker, school psychologist, etc.)
Applied Behavior Analysis (ABA) Therapist or Board Certified Behavior Analyst (BCBA)
Paraprofessional
Other
If other, please describe:
Usually implemented by teachers or researchers, but could be implemented by a paraprofessional.
- Does the program assume that the instructor or interventionist has expertise in a given area?
- Yes
If yes, please describe:
Classroom instruction
Are training manuals and materials available?
- No
Describe how the training manuals or materials were field-tested with the target population of instructors or interventionists and students:
- No; however, the intervention is clearly described in published research.
Do you provide fidelity of implementation guidance such as a checklist for implementation in your manual?
- Yes
Can practitioners obtain ongoing professional and technical support?
- No
If yes, please specify where/how practitioners can obtain support:
Summary of Evidence Base
- Please identify, to the best of your knowledge, all the research studies that have been conducted to date supporting the efficacy of your program, including studies currently or previously submitted to NCII for review. Please provide citations only (in APA format); do not include any descriptive information on these studies. NCII staff will also conduct a search to confirm that the list you provide is accurate.
-
Armendariz, F., & Umbreit, J. (1999). Using active responding to reduce disruptive behavior in a general education classroom. Journal of Positive Behavior Interventions, 1, 152-158.
Christle, C. A., & Schuster, J. W. (2003). The effects of using response cards on student participation, academic achievement, and on-task behavior during whole-class, math instruction. Journal of Behavioral Education, 12(3), 147-165.
Davis, L. L., & O’Neill, R. E. (2004). Use of response cards with a group of students with learning disabilities including those for whom English is a second language. Journal of Applied Behavior Analysis, 37(2), 219-222.
George, C. L. (2010). The effects of response cards on performance and participation in social studies for middle school students with emotional and behavioral disorder. Behavioral Disorders, 35(3), 200-213.
Lambert, M. C., Cartledge, G., Heward, W. L., & Lo, Y. Y. (2006). Effects of response cards on disruptive behavior and academic responding during math lessons by fourth-grade urban students. Journal of Positive Behavior Interventions, 8(2), 88-99.
Maheady, L., Michielli-Pendl, J., Mallette, B., & Harper, G. F. (2002). A collaborative research project to improve the academic performance of a diverse sixth grade science class. Teacher Education and Special Education: The Journal of the Teacher Education Division of the Council for Exceptional Children, 25(1), 55-70.
Munro, D. W., & Stephenson, J. (2009). The effects of response cards on student and teacher behavior during vocabulary instruction. Journal of Applied Behavior Analysis, 42(4), 795-800.
Narayan, J. S., Heward, W. L., Gardner, R., Courson, F. H., & Omness, C. K. (1990). Using response cards to increase student participation in an elementary classroom. Journal of Applied Behavior Analysis, 23(4), 483-490.
Study Information
Study Citations
Lambert, M. C., Cartledge, G., Heward, W. L., & Lo, Y. Y. (2006). Effects of response cards on disruptive behavior and academic responding during math lessons by fourth-grade urban students. Journal of Positive Behavior Interventions, 8(2), 88-99.
Participants
- Describe how students were selected to participate in the study:
- Nine students were targeted from two fourth-grade general education classrooms during math lessons in a Midwestern urban elementary school. Classroom teachers nominated the students they believed were the most disruptive, least attentive, and lowest performing in math, resulting in 9 target students (4 from classroom A and 5 from classroom B).
-
Describe how students were identified as being at risk for academic failure (AI) or as having emotional/behavioral difficulties (BI):
- Students are not described as having special education labels.
-
ACADEMIC INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:
- below the 30th percentile on local or national norm, or
- identified disability related to the focus of the intervention?
- %
-
BEHAVIORAL INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:
- emotional disability label,
- placed in an alternative school/classroom,
- non-responsive to Tiers 1 and 2, or
- designation of severe problem behaviors on a validated scale or through observation?
- %
Provide a description of the demographic and other relevant characteristics of the case used in your study (e.g., student(s), classroom(s)).
Case (Name or number) | Age/Grade | Gender | Race / Ethnicity | Socioeconomic Status | Disability Status | ELL status | Other Relevant Descriptive Characteristics |
---|---|---|---|---|---|---|---|
test | test | test | test | test | test | test | test |
Design
- Please describe the study design:
- A reversal (ABAB) design was used to demonstrate the differential effects of the two teacher-presented conditions.
Clarify and provide a detailed description of the treatment in the submitted program/intervention:
- Each student was given a laminated board, a dry erase marker, and a tissue or piece of paper towel. The teacher lectured and modeled the math skill being taught, then asked questions or posed problems related to the day’s lesson. When the teacher asked a question, students were expected to respond by writing their answers on their response cards. After sufficient wait time, the teacher said “cards up,” and the students held their response cards over their heads for the teacher to see. The teacher then scanned the answers. If all students answered correctly, the teacher praised the class and instructed them to wipe off their boards. When more than ¼ of the class was incorrect, the teacher explained the steps and directed students to correct their answers; after the correction, the teacher repeated the same question for the students to practice answering correctly. If only 2 or 3 students answered incorrectly, the teacher gave the correct response, instructed the students to check their answers, and moved on to the next question without having the students present their response cards again for the same question.
Clarify what procedures occurred during the control/baseline condition (third, competing conditions are not considered; if you have a third, competing condition [e.g., multi-element single subject design with a third comparison condition], in addition to your control condition, identify what the competing condition is [data from this competing condition will not be used]):
- Lessons followed the typical procedures already being used in the classrooms, including lecture, skill practice using a question-and-answer format, and independent worksheets. Teachers recorded which target students raised their hand for a particular question and which target students, if any, were called on to make a response. During instruction, the teacher presented a question either orally or visually (written on the chalkboard or overhead) and students volunteered to answer by raising their hand. The teacher then randomly selected a single student who had volunteered to answer the question. If it was answered correctly, the teacher provided praise (e.g., “Very good. The area of the square is 16 square feet”) and then moved on to the next question. If the student answered incorrectly, the teacher called on another student to answer. If two incorrect answers were given, the teacher gave the correct answer and explained the math problem to the whole class.
Please describe how replication of treatment effect was demonstrated (e.g., reversal or withdrawal of intervention, across participants, across settings):
- Replication of treatment was demonstrated across participants and settings.
-
Please indicate whether (and how) the design contains at least three demonstrations of experimental control (e.g., ABAB design, multiple baseline across three or more participants).
- At least three demonstrations of experimental control were provided across the 9 participants in two classrooms.
If the study is a multiple baseline, is it concurrent or non-concurrent?
- N/A
Fidelity of Implementation
- How was the program delivered?
-
Individually
Small Group
Classroom
If small group, answer the following:
- Average group size
- 15
- Minimum group size
- Maximum group size
What was the duration of the intervention (If duration differed across participants, settings, or behaviors, describe for each.)?
- Weeks
- Sessions per week
- Duration of sessions in minutes
- Weeks
- Sessions per week
- Duration of sessions in minutes
- Weeks
- Sessions per week
- Duration of sessions in minutes
- What were the background, experience, training, and ongoing support of the instructors or interventionists?
- Both classroom teachers were Caucasian, certified in elementary education, and had two years of teaching experience in their current classrooms. Teacher A was a male in his first teaching assignment, and Teacher B was a female who had previously taught for 1.5 years in a middle school. Both had previously participated in in-service training provided by the authors as part of a larger project on academic and behavioral interventions, but neither had used response cards.
Describe when and how fidelity of treatment information was obtained.
- Integrity checks were conducted for 30% of sessions within each condition using a checklist for each condition.
What were the results on the fidelity-of-treatment implementation measure?
- Across both classrooms, the teachers averaged 95.7% (range 91%-99%) compliance with steps during single-student responding and 97.5% (range 96%-98%) compliance during response cards.
Was the fidelity measure also used in baseline or comparison conditions?
- Yes, two different fidelity checklists were developed (one for each condition) and were checked throughout the study.
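For illustration only, here is a minimal sketch (Python, using hypothetical checklist data rather than the study's records) of how per-session checklist compliance of the kind reported above can be summarized as a mean and range:

```python
# Hypothetical fidelity data: one list per observed session, one boolean per
# checklist step (True = the step was implemented as written).
sessions = [
    [True, True, True, True, False],  # 4/5 steps implemented
    [True, True, True, True, True],   # 5/5 steps implemented
    [True, False, True, True, True],  # 4/5 steps implemented
]

# Percent of checklist steps implemented in each observed session.
per_session = [100 * sum(steps) / len(steps) for steps in sessions]

mean_compliance = sum(per_session) / len(per_session)
print(f"Mean compliance: {mean_compliance:.1f}% "
      f"(range {min(per_session):.0f}%-{max(per_session):.0f}%)")
```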
Measures and Results
Measures
Study measures are classified as targeted, broader, or administrative data according to the following definitions:
-
Targeted measures
Assess outcomes, such as competencies or skills, that the program was directly targeted to improve.
- In the academic domain, targeted measures typically are not the very items taught but rather novel items structured similarly to the content addressed in the program. For example, if a program taught word-attack skills, a targeted measure would be decoding of pseudo words. If a program taught comprehension of cause-effect passages, a targeted measure would be answering questions about cause-effect passages structured similarly to those used during intervention, but not including the very passages used for intervention.
- In the behavioral domain, targeted measures evaluate aspects of external or internal behavior the program was directly targeted to improve and are operationally defined.
-
Broader measures
Assess outcomes that are related to the competencies or skills targeted by the program but not directly taught in the program.
- In the academic domain, if a program taught word-level reading skill, a broader measure would be answering questions about passages the student reads. If a program taught calculation skill, a broader measure would be solving word problems that require the same kinds of calculation skill taught in the program.
- In the behavioral domain, if a program taught a specific skill like on-task behavior in one classroom, a broader measure would be on-task behavior in another setting.
- Administrative data measures apply only to behavioral intervention tools and are measures such as office discipline referrals (ODRs) and graduation rates, which do not have psychometric properties as do other, more traditional targeted or broader measures.
Targeted Measure | Reverse Coded? | Evidence | Relevance |
---|---|---|---|
Targeted Measure 1 | Yes | A1 | A2 |
Broader Measure | Reverse Coded? | Evidence | Relevance |
---|---|---|---|
Broader Measure 1 | Yes | A1 | A2 |
Administrative Data Measure | Reverse Coded? | Relevance |
---|---|---|
Admin Measure 1 | Yes | A2 |
- If you have excluded a variable or data that are reported in the study being submitted, explain the rationale for exclusion:
Results
- Describe the method of analyses you used to determine whether the intervention condition improved relative to baseline phase (e.g., visual inspection, computation of change score, mean difference):
- Visual analysis and mean difference.
Please present results in terms of within and between phase patterns. Data on the following data characteristics must be included: level, trend, variability, immediacy of the effect, overlap, and consistency of data patterns across similar conditions. Submitting only means and standard deviations for phases is not sufficient. Data must be included for each outcome measure (targeted, broader, and administrative if applicable) that was described above.
- For disruptive behavior, comparing the single-student responding (SSR) and response card (RC) conditions in classroom A showed an immediate effect from SSR 1 to RC 1, with only 1 overlapping data point across all 4 participants. There was an immediate increase in disruptive behavior from RC 1 to SSR 2, again with only 1 overlapping data point. In the last phase change, from SSR 2 to RC 2, there was also an immediate reduction in disruptive behavior, but a higher rate of overlap (5 data points). Results for classroom B were similar in terms of immediacy; only 1 of 5 students had overlapping data points from SSR 1 to RC 1 (4 of 5 points for that student). When returning to SSR 2, however, 4 of 5 students had overlapping data points (2/7 for student B1, 0/6 for B2, 1/7 for B3, 6/7 for B4, and 4/6 for B5). When returning to RC 2, effects were immediate, with 3 of 5 students having overlapping data points (3/11 for B1, 0 for B2, 0 for B3, 8/8 for B4, and 10/10 for B5). Mean disruptive behaviors were lower during the RC conditions (classroom A: 1.1 for RC 1 and 2.1 for RC 2; classroom B: 1.0 and 1.0) than during the SSR conditions (classroom A: 7.4 for SSR 1 and 8.2 for SSR 2; classroom B: 6.7 and 5.3).
- For academic responding, a visual analysis of responses per minute showed an immediate increase when going from each SSR condition to each RC condition, with no overlapping data points in any phase for any student in either classroom. Mean academic responses per minute and mean percentage of responding accuracy for each student across conditions were also analyzed. In both classrooms, students’ academic responses during the SSR conditions were lower than during the RC conditions: classroom A (M = 0.7 for SSR 1, 0.7 for SSR 2, 1.03 for RC 1, and 0.90 for RC 2) and classroom B (M = 0.13 for SSR 1, 0.08 for SSR 2, 0.92 for RC 1, and 0.91 for RC 2). There did not appear to be much difference in academic accuracy: targeted students in classroom A were accurate an average of 89.7% during SSR 1, 88.3% during RC 1, 100% during SSR 2, and 90% during RC 2; in classroom B, targeted students were accurate an average of 87.4% during SSR 1, 90.3% during RC 1, 97.4% during SSR 2, and 92.0% during RC 2.
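As a minimal illustration of the phase comparisons described above (hypothetical ABAB data for a single student, not the study's raw data), the sketch below computes phase means, mean differences, and overlapping data points between each SSR phase and the RC phase that follows it:

```python
# Hypothetical ABAB data for one student: disruptive behaviors per session.
# SSR = single-student responding (baseline), RC = response cards (intervention).
phases = {
    "SSR 1": [8, 7, 9, 6, 7],
    "RC 1":  [2, 1, 1, 2, 0],
    "SSR 2": [7, 8, 6, 9],
    "RC 2":  [1, 2, 1, 1, 2],
}

def mean(xs):
    return sum(xs) / len(xs)

def overlap(baseline, intervention):
    """Count intervention points at or above the lowest baseline point
    (an overlap rule for a behavior that is expected to decrease)."""
    floor = min(baseline)
    return sum(1 for x in intervention if x >= floor)

for ssr, rc in [("SSR 1", "RC 1"), ("SSR 2", "RC 2")]:
    diff = mean(phases[rc]) - mean(phases[ssr])
    n_overlap = overlap(phases[ssr], phases[rc])
    print(f"{ssr} -> {rc}: mean change = {diff:+.2f}, "
          f"overlapping points = {n_overlap}/{len(phases[rc])}")
```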
Additional Research
- Is the program reviewed by WWC or E-ESSA?
- No
- Summary of WWC / E-ESSA Findings:
What Works Clearinghouse Review
This program was not reviewed by What Works Clearinghouse.
- How many additional research studies are potentially eligible for NCII review?
- 0
- Citations for Additional Research Studies :
Data Collection Practices
Most tools and programs evaluated by the NCII are branded products which have been submitted by the companies, organizations, or individuals that disseminate these products. These entities supply the textual information shown above, but not the ratings accompanying the text. NCII administrators and members of our Technical Review Committees have reviewed the content on this page, but NCII cannot guarantee that this information is free from error or reflective of recent changes to the product. Tools and programs have the opportunity to be updated annually or upon request.