Opportunities to Respond
Study: Armendariz & Umbreit (1999)
Summary
Opportunities to Respond is an intervention that involves providing all students in a group or classroom with the means (e.g., dry erase board, response cards) to respond to all questions posed by the teacher. The intent is to increase engagement by giving students the opportunity to respond to academic questions at a higher rate than the traditional form of hand raising provides.
- Target Grades:
- K, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12
- Target Populations:
-
- Students with disabilities only
- Students with learning disabilities
- Students with emotional or behavioral disabilities
- English language learners
- Any student at risk for emotional and/or behavioral difficulties
- Area(s) of Focus:
-
- High Levels of Disengagement
- Disruptive Behavior
- Anxiety
- Where to Obtain:
- N/A
- Initial Cost:
- Free
- Replacement Cost:
- Contact vendor for pricing details.
-
Only supplies are needed for responding (e.g., cards, white boards and markers).
- Staff Qualified to Administer Include:
-
- Special Education Teacher
- General Education Teacher
- Reading Specialist
- Math Specialist
- EL Specialist
- Interventionist
- Student Support Services Personnel (e.g., counselor, social worker, school psychologist, etc.)
- Applied Behavior Analysis (ABA) Therapist or Board Certified Behavior Analyst (BCBA)
- Paraprofessional
- Other: Usually implemented by a teacher or researchers, but could be implemented by a paraprofessional.
- Training Requirements:
- Training not required
-
In published research, training was generally not described. When training occurred, it consisted of brief (e.g., 1 hour or less) didactic description and sometimes modeling.
Training manuals are not available; however, the intervention is clearly described in published research.
- Access to Technical Support:
- Not available
- Recommended Administration Formats Include:
-
- Small group of students
- BI ONLY: A classroom of students
- Minimum Number of Minutes Per Session:
- Minimum Number of Sessions Per Week:
- Minimum Number of Weeks:
- Detailed Implementation Manual or Instructions Available:
- Yes
- Is Technology Required?
- No technology is required.
Program Information
Descriptive Information
Please provide a description of program, including intended use:
Opportunities to Respond is an intervention that involves providing all students in a group or classroom with the means (e.g., dry erase board, response cards) to respond to all questions posed by the teacher. The intent is to increase engagement by giving students the opportunity to respond to academic questions at a higher rate than the traditional form of hand raising provides.
The program is intended for use in the following age(s) and/or grade(s).
Age 3-5
Kindergarten
First grade
Second grade
Third grade
Fourth grade
Fifth grade
Sixth grade
Seventh grade
Eighth grade
Ninth grade
Tenth grade
Eleventh grade
Twelfth grade
The program is intended for use with the following groups.
Students with learning disabilities
Students with intellectual disabilities
Students with emotional or behavioral disabilities
English language learners
Any student at risk for academic failure
Any student at risk for emotional and/or behavioral difficulties
Other
If other, please describe:
ACADEMIC INTERVENTION: Please indicate the academic area of focus.
Early Literacy
Alphabet knowledge
Phonological awareness
Early writing
Early decoding abilities
Other
If other, please describe:
Language
Grammar
Syntax
Listening comprehension
Other
If other, please describe:
Reading
Phonics/word study
Comprehension
Fluency
Vocabulary
Spelling
Other
If other, please describe:
Mathematics
Concepts and/or word problems
Whole number arithmetic
Comprehensive: Includes computation/procedures, problem solving, and mathematical concepts
Algebra
Fractions, decimals (rational number)
Geometry and measurement
Other
If other, please describe:
Writing
Spelling
Sentence construction
Planning and revising
Other
If other, please describe:
BEHAVIORAL INTERVENTION: Please indicate the behavior area of focus.
Externalizing Behavior
Verbal Threats
Property Destruction
Noncompliance
High Levels of Disengagement
Disruptive Behavior
Social Behavior (e.g., Peer interactions, Adult interactions)
Other
If other, please describe:
Internalizing Behavior
Anxiety
Social Difficulties (e.g., withdrawal)
School Phobia
Other
If other, please describe:
Acquisition and cost information
Where to obtain:
- Address
- Phone Number
- Website
Initial cost for implementing program:
- Cost
- $0.00
- Unit of cost
Replacement cost per unit for subsequent use:
- Cost
- Unit of cost
- Duration of license
Additional cost information:
Describe basic pricing plan and structure of the program. Also, provide information on what is included in the published program, as well as what is not included but required for implementation (e.g., computer and/or internet access)
Only supplies are needed for responding (e.g., cards, white boards and markers).
Program Specifications
Setting for which the program is designed.
Small group of students
BI ONLY: A classroom of students
If group-delivered, how many students compose a small group?
3-30
Program administration time
- Minimum number of minutes per session
- Minimum number of sessions per week
- Minimum number of weeks
- If the intervention program is intended to occur less frequently than 60 minutes a week for approximately 8 weeks, justify the level of intensity:
Does the program include highly specified teacher manuals or step-by-step instructions for implementation?- Yes
BEHAVIORAL INTERVENTION: Is the program affiliated with a broad school- or class-wide management program?- No
-
If yes, please identify and describe the broader school- or class-wide management program: -
Does the program require technology? - No
-
If yes, what technology is required to implement your program? -
Computer or tablet
Internet connection
Other technology (please specify)
If your program requires additional technology not listed above, please describe the required technology and the extent to which it is combined with teacher small-group instruction/intervention:
Training
- How many people are needed to implement the program?
- 1
Is training for the instructor or interventionist required?- No
- If yes, is the necessary training free or at-cost?
- Free
Describe the time required for instructor or interventionist training:- Training is not required; when provided in published research, it took less than 1 hour.
Describe the format and content of the instructor or interventionist training:- In published research, training was generally not described. When training occurred, it consisted of brief (e.g., 1 hour or less) didactic description and sometimes modeling.
What types of professionals are qualified to administer your program?
General Education Teacher
Reading Specialist
Math Specialist
EL Specialist
Interventionist
Student Support Services Personnel (e.g., counselor, social worker, school psychologist, etc.)
Applied Behavior Analysis (ABA) Therapist or Board Certified Behavior Analyst (BCBA)
Paraprofessional
Other
If other, please describe:
Usually implemented by a teacher or researchers, but could be implemented by a paraprofessional.
- Does the program assume that the instructor or interventionist has expertise in a given area?
-
Yes
If yes, please describe:
Classroom instruction
Are training manuals and materials available?- No
-
Describe how the training manuals or materials were field-tested with the target population of instructors or interventionists and students: - Not applicable; no training manuals are available, but the intervention is clearly described in published research.
Do you provide fidelity of implementation guidance such as a checklist for implementation in your manual?- Yes
-
Can practitioners obtain ongoing professional and technical support? -
No
If yes, please specify where/how practitioners can obtain support:
Summary of Evidence Base
- Please identify, to the best of your knowledge, all the research studies that have been conducted to date supporting the efficacy of your program, including studies currently or previously submitted to NCII for review. Please provide citations only (in APA format); do not include any descriptive information on these studies. NCII staff will also conduct a search to confirm that the list you provide is accurate.
-
Armendariz, F., & Umbreit, J. (1999). Using active responding to reduce disruptive behavior in a general education classroom. Journal of Positive Behavior Interventions, 1, 152-158.
Christle, C. A., & Schuster, J. W. (2003). The effects of using response cards on student participation, academic achievement, and on-task behavior during whole-class math instruction. Journal of Behavioral Education, 12(3), 147-165.
Davis, L. L., & O’Neill, R. E. (2004). Use of response cards with a group of students with learning disabilities including those for whom English is a second language. Journal of Applied Behavior Analysis, 37(2), 219-222.
George, C. L. (2010). The effects of response cards on performance and participation in social studies for middle school students with emotional and behavioral disorder. Behavioral Disorders, 35(3), 200-213.
Lambert, M. C., Cartledge, G., Heward, W. L., & Lo, Y. Y. (2006). Effects of response cards on disruptive behavior and academic responding during math lessons by fourth-grade urban students. Journal of Positive Behavior Interventions, 8(2), 88-99.
Maheady, L., Michielli-Pendl, J., Mallette, B., & Harper, G. F. (2002). A collaborative research project to improve the academic performance of a diverse sixth grade science class. Teacher Education and Special Education: The Journal of the Teacher Education Division of the Council for Exceptional Children, 25(1), 55-70.
Munro, D. W., & Stephenson, J. (2009). The effects of response cards on student and teacher behavior during vocabulary instruction. Journal of Applied Behavior Analysis, 42(4), 795-800.
Narayan, J. S., Heward, W. L., Garner, R., Courson, F. H., & Omness, C. K. (1990). Using response cards to increase student participation in an elementary classroom. Journal of Applied Behavior Analysis, 23(4), 483-490.
Study Information
Study Citations
Armendariz, F., & Umbreit, J. (1999). Using active responding to reduce disruptive behavior in a general education classroom. Journal of Positive Behavior Interventions, 1, 152-158.
Participants
- Describe how students were selected to participate in the study:
- Students were selected from a third-grade class in an urban public elementary school and included all 21 students in the class.
-
Describe how students were identified as being at risk for academic failure (AI) or as having emotional/behavioral difficulties (BI): - Not discussed.
-
ACADEMIC INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:- below the 30th percentile on local or national norm, or
- identified disability related to the focus of the intervention?
- %
-
BEHAVIORAL INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:- emotional disability label,
- placed in an alternative school/classroom,
- non-responsive to Tiers 1 and 2, or
- designation of severe problem behaviors on a validated scale or through observation?
- %
Provide a description of the demographic and other relevant characteristics of the case used in your study (e.g., student(s), classroom(s)).
Case (Name or number) | Age/Grade | Gender | Race / Ethnicity | Socioeconomic Status | Disability Status | ELL status | Other Relevant Descriptive Characteristics |
---|---|---|---|---|---|---|---|
Design
- Please describe the study design:
- ABA reversal design with follow-up.
Clarify and provide a detailed description of the treatment in the submitted program/intervention:- Active responding in which students used response cards (laminated boards with dry-erase markers) to answer the teacher’s questions.
Clarify what procedures occurred during the control/baseline condition (third, competing conditions are not considered; if you have a third, competing condition [e.g., multi-element single subject design with a third comparison condition], in addition to your control condition, identify what the competing condition is [data from this competing condition will not be used]):- Traditional hand raising, in which the teacher posed questions to the class and called on individual students who raised their hands to answer.
Please describe how replication of treatment effect was demonstrated (e.g., reversal or withdrawal of intervention, across participants, across settings)- Following baseline, the intervention was implemented and then withdrawn.
-
Please indicate whether (and how) the design contains at least three demonstrations of experimental control (e.g., ABAB design, multiple baseline across three or more participants). - The design did not contain at least three demonstrations of experimental control.
If the study is a multiple baseline, is it concurrent or non-concurrent?- N/A
Fidelity of Implementation
- How was the program delivered?
-
Individually
Small Group
Classroom
If small group, answer the following:
- Average group size
- 21
- Minimum group size
- Maximum group size
What was the duration of the intervention (If duration differed across participants, settings, or behaviors, describe for each.)?
- Weeks
- 2.00
- Sessions per week
- 3.00
- Duration of sessions in minutes
- 20.00
- Weeks
- Sessions per week
- Duration of sessions in minutes
- Weeks
- Sessions per week
- Duration of sessions in minutes
- What were the background, experience, training, and ongoing support of the instructors or interventionists?
- The teacher was a 42-year-old woman with 15 years of teaching experience. Training was conducted across four class periods. During the first two class periods, the researcher taught the lesson and modeled intervention procedures. The teacher then practiced for two sessions and received feedback from the researcher. The teacher was also given a written sample of the sequence to be followed: (a) keep the pace brisk, (b) cue the students to write their answers on the card, (c) cue students to show the board, (d) provide the correct answer, (e) choose a student who did not get the correct answer to help the teacher work out the problem, and (f) present the next problem.
Describe when and how fidelity of treatment information was obtained.- Not discussed.
What were the results on the fidelity-of-treatment implementation measure?- Not discussed.
Was the fidelity measure also used in baseline or comparison conditions?- N/A.
Measures and Results
Measures
Study measures are classified as targeted, broader, or administrative data according to the following definitions:
- Targeted measures assess outcomes, such as competencies or skills, that the program was directly targeted to improve.
- In the academic domain, targeted measures typically are not the very items taught but rather novel items structured similarly to the content addressed in the program. For example, if a program taught word-attack skills, a targeted measure would be decoding of pseudo words. If a program taught comprehension of cause-effect passages, a targeted measure would be answering questions about cause-effect passages structured similarly to those used during intervention, but not including the very passages used for intervention.
- In the behavioral domain, targeted measures evaluate aspects of external or internal behavior the program was directly targeted to improve and are operationally defined.
- Broader measures assess outcomes that are related to the competencies or skills targeted by the program but not directly taught in the program.
- In the academic domain, if a program taught word-level reading skill, a broader measure would be answering questions about passages the student reads. If a program taught calculation skill, a broader measure would be solving word problems that require the same kinds of calculation skill taught in the program.
- In the behavioral domain, if a program taught a specific skill like on-task behavior in one classroom, a broader measure would be on-task behavior in another setting.
- Administrative data measures apply only to behavioral intervention tools and are measures such as office discipline referrals (ODRs) and graduation rates, which do not have psychometric properties as do other, more traditional targeted or broader measures.
Targeted Measure | Reverse Coded? | Evidence | Relevance |
---|---|---|---|
Targeted Measure 1 | Yes | A1 | A2 |
Broader Measure | Reverse Coded? | Evidence | Relevance |
---|---|---|---|
Broader Measure 1 | Yes | A1 | A2 |
Administrative Data Measure | Reverse Coded? | Relevance |
---|---|---|
Admin Measure 1 | Yes | A2 |
- If you have excluded a variable or data that are reported in the study being submitted, explain the rationale for exclusion:
Results
- Describe the method of analyses you used to determine whether the intervention condition improved relative to baseline phase (e.g., visual inspection, computation of change score, mean difference):
- The mean percentage of intervals with disruptive behavior was computed for each student in each phase and compared across phases.
Please present results in terms of within and between phase patterns. Data on the following data characteristics must be included: level, trend, variability, immediacy of the effect, overlap, and consistency of data patterns across similar conditions. Submitting only means and standard deviations for phases is not sufficient. Data must be included for each outcome measure (targeted, broader, and administrative if applicable) that was described above.- Mean condition data are reported for each student. Each student had a lower mean percentage of disruptive behavior during the response-card condition than during the hand-raising conditions; percentage decreases in problem behavior ranged from 59% to 100%. Graphed data for disruptive behavior are reported for the class as a whole and indicate an immediate level change (with no overlapping data points) from baseline to intervention, within-phase consistency, and an upward trend in disruptive behavior during the return to baseline. Mean class disruption was 43.3% during initial baseline, 8.3% during intervention, 15.3% during the return to baseline, and 34.8% at follow-up (baseline condition). During baseline, individual levels of disruptive behavior varied considerably. Individual variability per session decreased when response cards were implemented and gradually increased with the return to the baseline/hand-raising condition.
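The phase-mean and percent-decrease figures reported above are simple to reproduce. The sketch below is a minimal illustration of that calculation, not the authors' analysis code; the per-session values and function names are hypothetical placeholders, not data from the study.

```python
# Minimal sketch of the phase-mean / percent-decrease calculation described above.
# The session values below are made-up placeholders, NOT data from the study.

def phase_mean(sessions):
    """Mean percentage of intervals with disruptive behavior for one phase."""
    return sum(sessions) / len(sessions)

def percent_decrease(baseline_mean, intervention_mean):
    """Percent reduction from baseline to intervention (positive = improvement)."""
    return (baseline_mean - intervention_mean) / baseline_mean * 100

# Hypothetical per-session percentages for one student across the ABA phases.
baseline = [45.0, 40.0, 48.0]        # hand raising
intervention = [10.0, 7.0, 8.0]      # response cards
return_to_baseline = [14.0, 16.0]    # hand raising again

b_mean = phase_mean(baseline)
i_mean = phase_mean(intervention)
print(f"Baseline mean: {b_mean:.1f}%")
print(f"Intervention mean: {i_mean:.1f}%")
print(f"Percent decrease: {percent_decrease(b_mean, i_mean):.1f}%")
```

A full single-case analysis would also examine level, trend, variability, immediacy, and overlap from the graphed data, as the prompt above requires; the computation shown here covers only the mean-difference portion.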
Additional Research
- Is the program reviewed by WWC or E-ESSA?
- No
- Summary of WWC / E-ESSA Findings:
What Works Clearinghouse Review
This program was not reviewed by What Works Clearinghouse.
- How many additional research studies are potentially eligible for NCII review?
- 0
- Citations for Additional Research Studies :
Data Collection Practices
Most tools and programs evaluated by the NCII are branded products which have been submitted by the companies, organizations, or individuals that disseminate these products. These entities supply the textual information shown above, but not the ratings accompanying the text. NCII administrators and members of our Technical Review Committees have reviewed the content on this page, but NCII cannot guarantee that this information is free from error or reflective of recent changes to the product. Tools and programs have the opportunity to be updated annually or upon request.