Taped Problems
Study: Bliss et al. (2010)

Summary

Taped Problems is a fluency-building intervention typically used to increase knowledge of math facts (addition, subtraction, multiplication, or division) or numerals. With Taped Problems, the teacher first decides on a set of math facts for the student to practice and makes a worksheet listing the facts with a blank space for each answer. The teacher then creates an audio recording in which each fact is read aloud, followed by a brief delay (e.g., 1-5 seconds), and then the answer to the fact. After the recording is created, the student listens to it and writes the answer to each fact during the pause. The student then checks that answer as the teacher on the recording states the correct answer. With Taped Problems, the brief delay on the recording may be altered to encourage more rapid or automatic responding to the math facts.

Target Grades:
Target Populations:
Area(s) of Focus:
  • Computation
  • Concepts and/or word problems
Where to Obtain:
Initial Cost:
Free
Replacement Cost:
Free

$0, but a recording device may need to be purchased

Staff Qualified to Administer Include:
  • Paraprofessional
  • Other:
Training Requirements:
Training not required


Access to Technical Support:
Recommended Administration Formats Include:
  • Individual students
Minimum Number of Minutes Per Session:
8
Minimum Number of Sessions Per Week:
5
Minimum Number of Weeks:
3
Detailed Implementation Manual or Instructions Available:
Is Technology Required?

Program Information

Descriptive Information

Please provide a description of program, including intended use:

Taped Problems is a fluency-building intervention typically used to increase knowledge of math facts (addition, subtraction, multiplication, or division) or numerals. With Taped Problems, the teacher first decides on a set of math facts for the student to practice and makes a worksheet listing the facts with a blank space for each answer. The teacher then creates an audio recording in which each fact is read aloud, followed by a brief delay (e.g., 1-5 seconds), and then the answer to the fact. After the recording is created, the student listens to it and writes the answer to each fact during the pause. The student then checks that answer as the teacher on the recording states the correct answer. With Taped Problems, the brief delay on the recording may be altered to encourage more rapid or automatic responding to the math facts.
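
For illustration only (this sketch is not part of the program materials), the Python snippet below shows one way a teacher might lay out the timing script for such a recording before taping it. The sample facts, the 3-second pause before the answer, and the 2-second gap between items are assumed values chosen for the example, not values prescribed by the program.

# Hypothetical sketch: lay out a read-aloud timing script for a Taped Problems
# recording. Facts, pause length, and inter-item gap are illustrative only.

facts = [("3 x 4", "12"), ("6 x 7", "42"), ("8 x 5", "40")]  # sample fact set
pause_seconds = 3   # brief delay before the answer is read (e.g., 1-5 s)
gap_seconds = 2     # assumed short gap before the next problem

elapsed = 0
for prompt, answer in facts:
    print(f"{elapsed:>3}s  Read the problem aloud: '{prompt}'")
    elapsed += pause_seconds  # student writes the answer during this pause
    print(f"{elapsed:>3}s  Read the answer aloud: '{answer}'")
    elapsed += gap_seconds

Shortening pause_seconds in a script like this corresponds to the program's option of reducing the delay to encourage more rapid or automatic responding.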

The program is intended for use in the following age(s) and/or grade(s).

not selected Age 0-3
not selected Age 3-5
not selected Kindergarten
not selected First grade
not selected Second grade
not selected Third grade
not selected Fourth grade
not selected Fifth grade
not selected Sixth grade
not selected Seventh grade
not selected Eighth grade
not selected Ninth grade
not selected Tenth grade
not selected Eleventh grade
not selected Twelfth grade


The program is intended for use with the following groups.

not selected Students with disabilities only
not selected Students with learning disabilities
not selected Students with intellectual disabilities
not selected Students with emotional or behavioral disabilities
not selected English language learners
not selected Any student at risk for academic failure
not selected Any student at risk for emotional and/or behavioral difficulties
not selected Other
If other, please describe:

ACADEMIC INTERVENTION: Please indicate the academic area of focus.

Early Literacy

not selected Print knowledge/awareness
not selected Alphabet knowledge
not selected Phonological awareness
not selected Early writing
not selected Early decoding abilities
not selected Other

If other, please describe:

Language

not selected Expressive and receptive vocabulary
not selected Grammar
not selected Syntax
not selected Listening comprehension
not selected Other
If other, please describe:

Reading

not selected Phonological awareness
not selected Phonics/word study
not selected Comprehension
not selected Fluency
not selected Vocabulary
not selected Spelling
not selected Other
If other, please describe:

Mathematics

selected Computation
selected Concepts and/or word problems
not selected Whole number arithmetic
not selected Comprehensive: Includes computation/procedures, problem solving, and mathematical concepts
not selected Algebra
not selected Fractions, decimals (rational number)
not selected Geometry and measurement
not selected Other
If other, please describe:

Writing

not selected Handwriting
not selected Spelling
not selected Sentence construction
not selected Planning and revising
not selected Other
If other, please describe:

BEHAVIORAL INTERVENTION: Please indicate the behavior area of focus.

Externalizing Behavior

not selected Physical Aggression
not selected Verbal Threats
not selected Property Destruction
not selected Noncompliance
not selected High Levels of Disengagement
not selected Disruptive Behavior
not selected Social Behavior (e.g., Peer interactions, Adult interactions)
not selected Other
If other, please describe:

Internalizing Behavior

not selected Depression
not selected Anxiety
not selected Social Difficulties (e.g., withdrawal)
not selected School Phobia
not selected Other
If other, please describe:

Acquisition and cost information

Where to obtain:

Address
Phone Number
Website

Initial cost for implementing program:

Cost
$0.00
Unit of cost

Replacement cost per unit for subsequent use:

Cost
$0.00
Unit of cost
Duration of license

Additional cost information:

Describe basic pricing plan and structure of the program. Also, provide information on what is included in the published program, as well as what is not included but required for implementation (e.g., computer and/or internet access)

$0, but a recording device may need to be purchased

Program Specifications

Setting for which the program is designed.

selected Individual students
not selected Small group of students
not selected BI ONLY: A classroom of students

If group-delivered, how many students compose a small group?

  

Program administration time

Minimum number of minutes per session
8
Minimum number of sessions per week
5
Minimum number of weeks
3
not selected N/A (implemented until effective)

If the intervention program is intended to occur less frequently than 60 minutes a week for approximately 8 weeks, justify the level of intensity:

Does the program include highly specified teacher manuals or step by step instructions for implementation?

BEHAVIORAL INTERVENTION: Is the program affiliated with a broad school- or class-wide management program?

If yes, please identify and describe the broader school- or class-wide management program:

Does the program require technology?

If yes, what technology is required to implement your program?
not selected Computer or tablet
not selected Internet connection
not selected Other technology (please specify)

If your program requires additional technology not listed above, please describe the required technology and the extent to which it is combined with teacher small-group instruction/intervention:

Training

How many people are needed to implement the program?

Is training for the instructor or interventionist required?
No
If yes, is the necessary training free or at-cost?

Describe the time required for instructor or interventionist training:
Training not required

Describe the format and content of the instructor or interventionist training:

What types of professionals are qualified to administer your program?

not selected Special Education Teacher
not selected General Education Teacher
not selected Reading Specialist
not selected Math Specialist
not selected EL Specialist
not selected Interventionist
not selected Student Support Services Personnel (e.g., counselor, social worker, school psychologist, etc.)
not selected Applied Behavior Analysis (ABA) Therapist or Board Certified Behavior Analyst (BCBA)
selected Paraprofessional
not selected Other

If other, please describe:

Does the program assume that the instructor or interventionist has expertise in a given area?
No   

If yes, please describe: 


Are training manuals and materials available?

Describe how the training manuals or materials were field-tested with the target population of instructors or interventionist and students:

Do you provide fidelity of implementation guidance such as a checklist for implementation in your manual?

Can practitioners obtain ongoing professional and technical support?

If yes, please specify where/how practitioners can obtain support:

Summary of Evidence Base

Please identify, to the best of your knowledge, all the research studies that have been conducted to date supporting the efficacy of your program, including studies currently or previously submitted to NCII for review. Please provide citations only (in APA format); do not include any descriptive information on these studies. NCII staff will also conduct a search to confirm that the list you provide is accurate.

Bliss, S. L., Skinner, C. H., McCallum, E., Saecker, L. B., Rowland-Bryant, E., & Brown, K. S. (2010). A comparison of taped problems with and without a brief post-treatment assessment of multiplication fluency. Journal of Behavioral Education, 19, 156-168. doi:10.1007/s10864-010-906-5

 

Cressey, J., & Ezbicki, K. (2008). Improving automaticity with basic addition facts: Do taped problems work faster than cover, copy, compare? NERA Conference Proceedings 2008. Paper 12. http://digitalcommons.uconn.edu/nera_2008/12            

 

Krohn, K. R., Skinner, C. H., Fuller, E. J., & Greear, C. (2012). Using a taped intervention to improve kindergarten students’ number identification. Journal of Applied Behavior Analysis, 45, 437-441.

 

McCallum, E., & Schmitt, A. J. (2011). The taped problems intervention: Increasing the math fact fluency of a student with an intellectual disability. International Journal of Special Education, 26, 276-284.

 

McCallum, E., Skinner, C. H., & Hutchins, H. (2004). The taped-problems intervention. Journal of Applied School Psychology, 20, 129-147. doi:10.1300/J370v20n02_08  

          

Poncy, B. C., Skinner, C. H., & Jaspers, K. E. (2007). Evaluating and comparing interventions designed to enhance math fact accuracy and fluency: Cover, copy, and compare versus taped problems. Journal of Behavioral Education, 16, 27-37. doi:10.1007/s10864-006-9025-7                                                                       

Study Information

Study Citations

Bliss, S. L., Skinner, C. H., McCallum, E., Saecker, L. B., Rowland-Bryant, E., & Brown, K. S. (2010). A comparison of taped problems with and without a brief post-treatment assessment of multiplication fluency. Journal of Behavioral Education, 19, 156-168.

Participants Empty Bobble

Describe how students were selected to participate in the study:
This study included six participants (three males, three females) from a fifth-grade mathematics classroom. All participants were receiving mathematics instruction in a class for the lowest-performing fifth-grade mathematics students; achievement test data placed students in this remedial class.

Describe how students were identified as being at risk for academic failure (AI) or as having emotional/behavioral difficulties (BI):
All participants were receiving mathematics instruction together in a class for the lowest-performing fifth-grade mathematics students. Achievement test data and teacher referrals were used to place students in this remedial class.

ACADEMIC INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:
  • below the 30th percentile on local or national norm, or
  • identified disability related to the focus of the intervention?
%

BEHAVIORAL INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:
  • emotional disability label,
  • placed in an alternative school/classroom,
  • non-responsive to Tiers 1 and 2, or
  • designation of severe problem behaviors on a validated scale or through observation?
%

Provide a description of the demographic and other relevant characteristics of the case used in your study (e.g., student(s), classroom(s)).

Case (Name or number) | Age/Grade | Gender | Race / Ethnicity | Socioeconomic Status | Disability Status | ELL status | Other Relevant Descriptive Characteristics
test | test | test | test | test | test | test | test

Design Full Bobble

Please describe the study design:
An adapted alternating treatments design was used to compare Taped Problems to a version of Taped Problems with an additional immediate assessment. Students worked on one-digit by one-digit multiplication problems.

Clarify and provide a detailed description of the treatment in the submitted program/intervention:
Each Taped Problems session involved 36 problems. The delay before the taped answer was 0 seconds for the first 12 problems, 2 seconds for the second 12 problems, and 1 second for the final 12 problems.
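
A minimal sketch of that delay schedule, for clarity only; the problem content is omitted, and only the timing pattern described above is reproduced.

# Sketch of the delay schedule described above: 36 problems, with a 0-second
# delay for items 1-12, 2 seconds for items 13-24, and 1 second for items 25-36.
delays = [0] * 12 + [2] * 12 + [1] * 12  # seconds of pause before each taped answer

for item, delay in enumerate(delays, start=1):
    print(f"Problem {item:>2}: {delay}-second delay before the answer is read")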

Clarify what procedures occurred during the control/baseline condition (third, competing conditions are not considered; if you have a third, competing condition [e.g., multi-element single subject design with a third comparison condition], in addition to your control condition, identify what the competing condition is [data from this competing condition will not be used]):
During baseline, assessments were administered to the students. Each assessment included 36 one-digit by one-digit multiplication problems.

Please describe how replication of treatment effect was demonstrated (e.g., reversal or withdrawal of intervention, across participants, across settings)
Baseline data were collected for the first four sessions. Intervention began on the fourth day, after that day's baseline assessment data were collected. During the intervention phase, students first completed an assessment and then participated in both Taped Problems and Taped Problems plus Immediate Assessment (the order of the two conditions was counterbalanced across days). In the Taped Problems condition, students completed a Taped Problems worksheet with 36 problems. In the Taped Problems plus Immediate Assessment condition, students completed a Taped Problems worksheet with 36 problems, which was immediately followed by an assessment of 36 problems.

Please indicate whether (and how) the design contains at least three demonstrations of experimental control (e.g., ABAB design, multiple baseline across three or more participants).
The study has 6 participants. A control assessment was administered at the beginning of each treatment session.

If the study is a multiple baseline, is it concurrent or non-concurrent?

Fidelity of Implementation Half Bobble

How was the program delivered?
selected Individually
not selected Small Group
not selected Classroom

If small group, answer the following:

Average group size
Minimum group size
Maximum group size

What was the duration of the intervention (If duration differed across participants, settings, or behaviors, describe for each.)?

Condition A
Weeks
3.00
Sessions per week
5.00
Duration of sessions in minutes
9.00
Condition B
Weeks
Sessions per week
Duration of sessions in minutes
Condition C
Weeks
Sessions per week
Duration of sessions in minutes
What were the background, experience, training, and ongoing support of the instructors or interventionists?
The background of the experimenter is not described.

Describe when and how fidelity of treatment information was obtained.
A teaching assistant collected treatment integrity data for 30% of the treatment sessions using a checklist of the intervention steps.

What were the results on the fidelity-of-treatment implementation measure?
Treatment integrity ranged from 94% to 100%, with an average of 98%.

Was the fidelity measure also used in baseline or comparison conditions?
The fidelity checklist was used during baseline and intervention.

Measures and Results

Measures Targeted : Empty Bobble
Measures Broader : Dash

Study measures are classified as targeted, broader, or administrative data according to the following definitions:

  • Targeted measures
    Assess outcomes, such as competencies or skills, that the program was directly targeted to improve.
    • In the academic domain, targeted measures typically are not the very items taught but rather novel items structured similarly to the content addressed in the program. For example, if a program taught word-attack skills, a targeted measure would be decoding of pseudo words. If a program taught comprehension of cause-effect passages, a targeted measure would be answering questions about cause-effect passages structured similarly to those used during intervention, but not including the very passages used for intervention.
    • In the behavioral domain, targeted measures evaluate aspects of external or internal behavior the program was directly targeted to improve and are operationally defined.
  • Broader measures
    Assess outcomes that are related to the competencies or skills targeted by the program but not directly taught in the program.
    • In the academic domain, if a program taught word-level reading skill, a broader measure would be answering questions about passages the student reads. If a program taught calculation skill, a broader measure would be solving word problems that require the same kinds of calculation skill taught in the program.
    • In the behavioral domain, if a program taught a specific skill like on-task behavior in one classroom, a broader measure would be on-task behavior in another setting.
  • Administrative data measures apply only to behavioral intervention tools and are measures such as office discipline referrals (ODRs) and graduation rates, which do not have psychometric properties as do other, more traditional targeted or broader measures.
Targeted Measure | Reverse Coded? | Evidence | Relevance
Targeted Measure 1 | Yes | A1 | A2

Broader Measure | Reverse Coded? | Evidence | Relevance
Broader Measure 1 | Yes | A1 | A2

Administrative Data Measure | Reverse Coded? | Relevance
Admin Measure 1 | Yes | A2
If you have excluded a variable or data that are reported in the study being submitted, explain the rationale for exclusion:

Results Half Bobble

Describe the method of analyses you used to determine whether the intervention condition improved relative to baseline phase (e.g., visual inspection, computation of change score, mean difference):
The authors note inconsistent results across students. Students 1, 2, 3, and 5 all showed increases greater than 4.5 digits correct per minute (DCM) favoring Taped Problems. However, Student 4 showed increases of less than 1.5 DCM, and Student 6 increased his DCM more (i.e., by 5.5 DCM) on the control set of problems than on the Taped Problems set. When Taped Problems plus Immediate Assessment is compared to the control set, five of the six students had increases of 8.5 DCM or greater under Taped Problems plus Immediate Assessment; again, Student 4 showed little increase over the control set. When Taped Problems is compared to Taped Problems plus Immediate Assessment, average growth from baseline to intervention indicates greater increases for Students 2 and 6 under Taped Problems plus Immediate Assessment, whereas Student 5 showed greater increases under Taped Problems alone. Students 1 and 3 demonstrated much smaller gains across the two Taped Problems conditions, and Student 4 demonstrated little change.

Please present results in terms of within and between phase patterns. Data on the following data characteristics must be included: level, trend, variability, immediacy of the effect, overlap, and consistency of data patterns across similar conditions. Submitting only means and standard deviations for phases is not sufficient. Data must be included for each outcome measure (targeted, broader, and administrative if applicable) that was described above.

Additional Research

Is the program reviewed by WWC or E-ESSA?
No
Summary of WWC / E-ESSA Findings :

What Works Clearinghouse Review

This program was not reviewed by What Works Clearinghouse.

 

Evidence for ESSA

This program was not reviewed by Evidence for ESSA.

How many additional research studies are potentially eligible for NCII review?
0
Citations for Additional Research Studies :

Data Collection Practices

Most tools and programs evaluated by the NCII are branded products which have been submitted by the companies, organizations, or individuals that disseminate these products. These entities supply the textual information shown above, but not the ratings accompanying the text. NCII administrators and members of our Technical Review Committees have reviewed the content on this page, but NCII cannot guarantee that this information is free from error or reflective of recent changes to the product. Tools and programs have the opportunity to be updated annually or upon request.