Number Rockets

Study: Fuchs, Compton, Fuchs, Paulsen, Bryant, et al. (2005)

Fuchs, L.S., Compton, D.L., Fuchs, D., Paulsen, K., Bryant, J.D., & Hamlett, C.L. (2005). The prevention, identification, and cognitive determinants of math difficulty. Journal of Educational Psychology, 97, 493-513.

Number Rockets is a small-group tutoring program based on the concrete-representational-abstract model, which relies on concrete objects to promote conceptual learning. Lessons follow a sequence of 17 scripted topics, and each topic includes worksheet and manipulative activities (e.g., base-10 blocks for place value instruction). The sequence of topics is: identifying and writing numbers to 99; identifying more, less, and equal with objects; sequencing numbers; using <, >, and = symbols; skip counting by 10s, 5s, and 2s; understanding place value (introduction); identifying operations; place value (0-50); writing number sentences; place value (0-99); addition facts (sums to 18); subtraction facts (minuends to 18); review of addition and subtraction facts; place value; 2-digit addition (no regrouping); 2-digit subtraction (no regrouping); and missing addends. Topics 1-4 are reviewed after winter break.

Mastery of the topic is assessed each day. For the mastery assessment, students complete worksheets independently, with the percentage of correct answers determining mastery (for most topics, 90% accuracy). After the last day on a topic, the group progresses to the next topic regardless of mastery status. On the first day of each topic, students complete a cumulative review worksheet covering previous topics. Mastery assessment and cumulative review take approximately 10 minutes and provide additional practice. Tutors follow scripts to ensure consistency but do not read or memorize them verbatim. During the final 10 minutes of each intervention session, students complete drill and practice activities designed to develop automatic retrieval of math facts, and they are taught efficient counting strategies as backups to automatic retrieval.
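The daily mastery decision is a simple percentage check against the topic's criterion. The sketch below (Python) is illustrative only; the function and variable names are assumptions, and the 90% threshold is the criterion reported for most topics.

# Illustrative sketch of the daily mastery check (hypothetical names).
# For most topics, mastery requires at least 90% correct on the worksheet
# the student completes independently.
def mastery_reached(num_correct: int, num_items: int, threshold: float = 0.90) -> bool:
    if num_items == 0:
        return False
    return (num_correct / num_items) >= threshold

# Example: 18 of 20 items correct is 90%, so the mastery criterion is met.
print(mastery_reached(18, 20))  # True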

 

Number Rockets is intended for use in first grade. It is designed for use with students with disabilities (including learning disabilities, intellectual disabilities, and behavioral disabilities) and any student at risk of academic failure. The academic area of focus is math (including computation, concepts, and word problems).

Number Rockets has been used in more than 100 schools across the country.

 

Where to obtain:
Lynn Davies
228 Peabody
Vanderbilt University
Nashville, TN 37220
Phone: 615-343-4782

Lynn.a.davies@vanderbilt.edu

Web Site: http://vkc.mc.vanderbilt.edu/numberrockets/

Cost:

Initial cost for implementing program:

$64 per tutor (for the two manuals), plus $10 licensing fee to photocopy supporting materials

~$25 per student for photocopying supporting materials

~$30 per tutor for supplemental concrete reinforcers

Replacement cost for implementing program:

~$25 per student for photocopying supporting materials

$10 annual licensing fee to photocopy manuals/materials

Included: the two manuals, Book 1 (Scripts, $29) and Book 2 (Master Copy of Supporting Materials, $35)
Not included: concrete reinforcers (available online or in school supply store), individual student copies of materials

The manual provides all information necessary for implementation and includes master copies of all materials. Schools need to make copies of materials (lamination for posters and reusable materials is recommended) and provide concrete reinforcers and manipulatives involved in the program.

Order form for tutoring manuals: http://vkc.mc.vanderbilt.edu/numberrockets/wp-content/uploads/2014/03/NUMBER-ROCKETS-order-form-revised-with-questions.pdf

Number Rockets is designed for use with individual students or small groups of two to three students.

Number Rockets takes 40 minutes per session with a recommended three sessions per week for 16 weeks.

The program includes a highly specified teacher’s manual.

There are two options for delivering math fact practice, one with and one without computers. Thus, computer use is optional; if the computer option is selected, iBook or other Mac computers are needed.

 

One full day of training, plus follow-up by school or district staff, with weekly supervision of tutors.

In a one-day training workshop for tutors, (a) an overview of the tutoring program, goals, and topics is presented, and (b) tutoring procedures are modeled and practiced for each activity in four program topics. Following demonstration by the trainer, tutors practice techniques and activities in pairs and receive feedback. Additional consultation with the trainer is available by email or phone following training. Tutors attend weekly meetings to learn about and practice upcoming program topics and to discuss challenges. These weekly meetings are supervised by a building or district instructional support person.

Instructors may be certified teachers or paraprofessionals. The training manuals have been used widely, and users report high levels of satisfaction.

To schedule Number Rockets tutor training, contact Lynn.A.Davies@vanderbilt.edu

 

Participants: Convincing Evidence

Sample size: 127 first-grade students in 10 schools (139 students were initially pretested; 63 students in the treatment group and 64 students in the control group)

Risk Status: In a whole-class format, we tested the 667 students (89%) for whom parent consent was received. The measures were Curriculum-Based Measurement (CBM) Computation, Addition Fact Fluency, Subtraction Fact Fluency, and CBM Concepts/Applications. Based on a factor score computed across these measures, we identified the 308 lowest-scoring students for individual testing, all of whom failed the local benchmark for designating risk status in math on the CBM Computation measure. We shared the names of these students with teachers, who nominated 11 additional students as potentially at risk (AR). We administered an individual battery to these 319 children. Then, based on the Week 4 CBM score, we identified the 139 lowest-performing students as AR (i.e., 21% of the consented students); scores for all 139 students fell below the CBM benchmark for risk in math. These 139 students were randomly assigned to control or tutoring conditions, blocking by classroom to ensure a comparable distribution of AR students across the control and tutoring conditions within classrooms.
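The two-stage screening just described (whole-class composite, then Week 4 CBM) can be summarized with the rough sketch below. It is illustrative only: the column names, the use of a simple standardized composite as the factor score, and the omission of the 11 teacher-nominated students are assumptions made for brevity, not the study's actual analysis.

# Rough sketch of the two-stage risk screening (illustrative assumptions only).
import pandas as pd

def screen_for_risk(scores: pd.DataFrame, n_stage1: int = 308, n_at_risk: int = 139) -> pd.DataFrame:
    measures = ["cbm_computation", "addition_fluency", "subtraction_fluency", "cbm_concepts"]
    # Stand-in "factor score": a simple standardized composite across the four screening measures.
    z = (scores[measures] - scores[measures].mean()) / scores[measures].std()
    scores = scores.assign(composite=z.mean(axis=1))
    # Stage 1: the lowest-scoring students on the whole-class composite receive individual testing.
    stage1 = scores.nsmallest(n_stage1, "composite")
    # Stage 2: the lowest performers on the Week 4 CBM score (assumed column name) are designated at risk.
    return stage1.nsmallest(n_at_risk, "cbm_week4")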
 
On the pretest Curriculum-Based Measurement-Calculations (a reliable assessment of overall math competence at the beginning of first grade), the at-risk sample scored at the 21st percentile of a representative sample; the Week 4 pretest occurred before intervention but gave students sufficient time to acclimate to the assessment. The term "representative sample" is used in the research-design sense, i.e., representing the full range of performance (not a sample selected for low or high performance). Students in this study were drawn from a metropolitan area with a high proportion of students receiving subsidized lunch, so it is reasonable to assume the sample falls below the 25th percentile of a nationally representative sample in the demographic sense.

Demographics:

                                    Program             Control
                                    Number  Percentage  Number  Percentage  p of chi square

Grade level
  Grade 1                           63      100%        64      100%        1.00 NS

Race-ethnicity
  African-American                  32      50%         31      49%         NS (chi-square = 0.19)
  American Indian                   0       0%          0       0%
  Asian/Pacific Islander            0       0%          0       0%
  Hispanic                          3       5%          4       6%
  White                             29      45%         28      44%
  Other                             0       0%          0       0%

Socioeconomic status
  Subsidized lunch                  36      56%         32      51%         NS (chi-square = 0.41)
  No subsidized lunch               23      36%         26      41%

Disability status
  Not identified with a disability  63      100%        64      100%        1.00 NS

ELL status
  English language learner          0       0%          0       0%          1.00 NS
  Not English language learner     63      100%        64      100%

Gender
  Female                            33      53%         31      48%         NS (chi-square = 1.77)
  Male                              30      47%         33      52%

Training of Instructors: None of the tutors was a certified teacher; only one tutor had previous tutoring experience. Training occurred as follows. In a one-day training session for tutors, (a) an overview of the tutoring program, goals, and topics was presented, and (b) the tutoring procedures were explained for each activity in the first four tutoring topics. After the presentation of each activity, tutors practiced it with a partner, with additional practice completed over the next two weeks. One week later, in a second session, tutors learned to use the drill and practice math fact activities, and a review session was held at the end of that week. Tutoring began one week later. Tutors also attended weekly meetings to learn about and practice upcoming tutoring topics and to discuss difficulties they faced. Supervisors facilitated these weekly meetings and helped tutors problem solve.

Design: Convincing Evidence

Did the study use random assignment?: Yes.

If not, was it a tenable quasi-experiment?: Not applicable.

If the study used random assignment, at pretreatment, were the program and control groups not statistically significantly different and had a mean standardized difference that fell within 0.25 SD on measures used as covariates or on pretest measures also used as outcomes?: Yes.

If not, at pretreatment, were the program and control groups not statistically significantly different and had a mean standardized difference that fell within 0.25 SD on measures central to the study (i.e., pretest measures also used as outcomes), and outcomes were analyzed to adjust for pretreatment differences?: Not applicable.

Were the program and control groups demographically comparable at pretreatment?: Yes.

Was there attrition bias1?: No.

Did the unit of analysis match the unit for random assignment (for randomized studies) or the assignment strategy (for quasi-experiments)?: Yes.

1 NCII follows guidance from the What Works Clearinghouse (WWC) in determining attrition bias. The WWC model for determining bias based on a combination of differential and overall attrition rates can be found on pages 13-14 of this document: http://ies.ed.gov/ncee/wwc/pdf/reference_resources/wwc_procedures_v2_1_standards_handbook.pdf

 

Fidelity of Implementation: Convincing Evidence

Describe when and how fidelity of treatment information was obtained: All tutoring sessions were audiotaped. Tutors did not know which audiotapes would be checked for fidelity. We checked tapes for all 27 tutoring groups for Topic 4 (Day 1 or 2) and Topic 16 (Day 1) using a checklist that corresponded to the steps included in the lesson's script, with 9-19 items (mean: 12) per checklist. Each checklist item was marked as observed, not observed, or not applicable. Fidelity was indexed as the percentage of items implemented (observed divided by the sum of observed and not observed). A second coder re-checked fidelity for a random sample of 25% of the audiotapes. Agreement between coders was 88.3%.
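The fidelity index is the share of applicable checklist items actually observed. A minimal sketch (Python) with illustrative names, assuming each checklist item is coded as observed, not observed, or not applicable:

# Minimal sketch of the fidelity index: observed items divided by
# (observed + not observed), with "not applicable" items excluded.
def fidelity_percentage(item_codes: list[str]) -> float:
    observed = item_codes.count("observed")
    not_observed = item_codes.count("not_observed")
    applicable = observed + not_observed
    return 100.0 * observed / applicable if applicable else 0.0

# Example: 11 of 12 applicable items observed (2 items not applicable).
codes = ["observed"] * 11 + ["not_observed"] + ["not_applicable"] * 2
print(round(fidelity_percentage(codes), 1))  # 91.7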

Provide documentation (i.e., in terms of numbers) of fidelity of treatment implementation: Across tutors and sessions, fidelity was 95.6% for the first check and 93.5% for the second check.

Measures Targeted: Convincing Evidence

Measures Broader: Convincing Evidence

Targeted Measures

Curriculum-Based Measurement (CBM) Computation
  Score type & range of measure: Number correct (0-25)
  Reliability statistics: Coefficient alpha on this sample = 0.93
  Relevance to program instructional content: Sampled computation problem types addressed in typical first-grade curricula; computation was addressed in Number Rockets tutoring.

Woodcock-Johnson Calculation
  Score type & range of measure: Number correct (0-25)
  Reliability statistics: Coefficient alpha on this sample = 0.93
  Relevance to program instructional content: Sampled computation problem types addressed across K-12 curricula; only first-grade computation skills were addressed in Number Rockets tutoring.

Fact Retrieval Addition
  Score type & range of measure: Number correct (0-25)
  Reliability statistics: Coefficient alpha on this sample = 0.91
  Relevance to program instructional content: Answers ranged from 0-12; Number Rockets tutoring addressed this content.

First-grade Concepts/Applications
  Score type & range of measure: Number correct (0-25)
  Reliability statistics: Coefficient alpha on this sample = 0.92
  Relevance to program instructional content: Sampled concepts/applications items, most of which were not addressed in Number Rockets tutoring (none of the actual problems was addressed).

Story Problems
  Score type & range of measure: Number correct (0-14)
  Reliability statistics: Coefficient alpha on this sample = 0.86
  Relevance to program instructional content: Sampled items; one-third of the problem types were addressed in Number Rockets tutoring (none of the actual problems was addressed).

Broader Measures

Fact Retrieval Subtraction
  Score type & range of measure: Number correct (0-25)
  Reliability statistics: Coefficient alpha on this sample = 0.77
  Relevance to program instructional content: Answers ranged from 0-12; Number Rockets tutoring addressed this content.

Woodcock-Johnson Applied Problems
  Score type & range of measure: Number correct (0-33)
  Reliability statistics: Coefficient alpha on this sample = 0.91
  Relevance to program instructional content: Sampled applications items, only a handful of which were addressed in Number Rockets tutoring (none of the actual problems was addressed).

 

Number of Outcome Measures: 7 Math

Mean ES - Targeted: 0.45*

Mean ES - Broader: 0.10

Effect Size:

Targeted Measures

Construct Measure Effect Size
Math CBM Computation 0.32
Math Fact Retrieval Addition 0.28
Math Woodcock-Johnson Calculation 0.60***
Math First-grade Concepts/Applications 0.51**
Math Story Problems 0.53**

Broader Measures

Construct Measure Effect Size
Math Fact Retrieval Subtraction 0.13
Math Woodcock-Johnson Applied Problems 0.08
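
As a quick arithmetic check, the reported means of 0.45 (targeted) and 0.10 (broader) are consistent with simple unweighted averages of the effect sizes listed above (the averaging method is assumed here, not stated):

# Unweighted averages of the listed effect sizes (assumed averaging method).
targeted = [0.32, 0.28, 0.60, 0.51, 0.53]
broader = [0.13, 0.08]
print(round(sum(targeted) / len(targeted), 2))  # 0.45
print(round(sum(broader) / len(broader), 2))    # 0.1 (i.e., ~0.10)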

 

Key
*        p ≤ 0.05
**      p ≤ 0.01
***    p ≤ 0.001
–      Developer was unable to provide necessary data for NCII to calculate effect sizes
u      Effect size is based on unadjusted means
†      Effect size based on unadjusted means not reported due to lack of pretest group equivalency, and effect size based on adjusted means is not available

 

Visual Analysis (Single Subject Design): N/A

Disaggregated Data for Demographic Subgroups: No

Disaggregated Data for <20th Percentile: No

Administration Group Size: Small group (n = 2-3)

Duration of Intervention: 40 minutes, 3 times a week, 16 weeks

Minimum Interventionist Requirements: Paraprofessional; 8 hours of training plus weekly follow-up

Reviewed by WWC or E-ESSA: E-ESSA

What Works Clearinghouse Review

This program was not reviewed by What Works Clearinghouse.

 

Evidence for ESSA

Program Outcomes: One qualifying study evaluated Number Rockets with students below the 38th percentile in math. Students in the control group did not receive any tutoring or organized remediation. On TEMA-3 tests, students in Number Rockets scored significantly higher than controls, with an effect size of +0.34. This qualifies the program for the ESSA “Strong” category. An earlier study of Number Rockets also showed positive effects, but the tutors were the authors’ graduate students, so that study did not meet inclusion standards.

Number of Studies: 1

Average Effect Size: 0.34

Full Report

 

Other Research: Potentially Eligible for NCII Review: 2 studies

Fuchs, L. S., Geary, D. C., Compton, D. L., Fuchs, D., Schatschneider, C., Hamlett, C. L., & Changas, P. (2013). Effects of first-grade number knowledge tutoring with contrasting forms of practice. Journal of Educational Psychology, 105, 58-77.

Rolfhus, E., Gersten, R., Clarke, B., Decker, L., Wilkins, C., & Dimino, J. (2012). An evaluation of Number Rockets: A Tier 2 intervention for grade 1 students at risk for difficulties in mathematics (NCEE 2012-4007). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.