Spring Math
Study: Codding et al. (2016)

Summary

Spring Math is a web-based MTSS/RTI system for mathematics. Please note: As an RTI system, Spring Math includes screening, progress monitoring, and intervention; however, NCII has only reviewed the intervention component for the purposes of the Academic Intervention Tools Chart.

Target Grades:
K, 1, 2, 3, 4, 5, 6, 7, 8
Target Populations:
  • Any student at risk for academic failure
Area(s) of Focus:
  • Computation
  • Concepts and/or word problems
  • Whole number arithmetic
  • Comprehensive: Includes computation/procedures, problem solving, and mathematical concepts
  • Algebra
  • Fractions, decimals (rational number)
Where to Obtain:
Amanda VanDerHeyden / Sourcewell Technology
2340 Energy Park Dr. St. Paul, MN 55108
(651) 999-6000 Option 2
www.springmath.com
Initial Cost:
$10.00 per student
Replacement Cost:
$10.00 per student per year

We require an initial "onboarding training," customized to your district and conducted directly with your staff. We call this our Onboarding Advantage; it is a one-time setup, training, and consultation cost of $795. This training completes all roster set-up and trains your teachers in the basics of navigating the tool. Teachers and administrators interact with Spring Math online in a password-protected account, similar to other student assessment and intervention systems. When teachers log in to their accounts, the teacher dashboard provides everything needed to conduct math MTSS each day, including all assessments, intervention protocols, automated decision making, and summary reports. A full coach dashboard characterizes intervention use and effects within the school to facilitate problem-solving team meetings and to direct in-class coaching support where it's needed for better results. An extensive support portal provides games, word problems, materials for supplementing core instruction, and instructional calendars for all grade levels.
Sites must have access to one computer per teacher, an internet connection, and the ability to print in black and white. Spring Math provides extensive implementation support at no additional cost through a support portal to which all users have access. Support materials include how-to videos, brief how-to documents, access to all assessments and acquisition lesson plans for 130 skills, and live and archived webinars. In addition to the support portal, sites that wish to purchase additional coaching support can do so through our OnGoing Advantage or Coaching Advantage services. Our network of trained coaches has expertise in RtI/MTSS leadership and specific training in Spring Math. OnGoing support includes examining your system's data and conducting virtual systems-level problem-solving meetings to improve results.
These packages range from $1,850 to $3,500 for up to 6 hours of virtual coaching assistance. In-person, face-to-face consultation can be arranged for sites that desire such assistance. We also offer our coach cohort program free to all interested users; it includes a web-based community and monthly live coaching sessions conducted by our leadership team, with Q&A, shared note-taking, and archived recordings for later viewing or review with your team members. https://www.sourcewelltech.org/math-intervention-spring-math/see-the-difference

Staff Qualified to Administer Include:
  • Special Education Teacher
  • General Education Teacher
  • Math Specialist
  • Interventionist
  • Paraprofessional
  • Other:
Training Requirements:
1-3 hours for onboarding training directly with school teams

Training is provided by implementation specialists. The training orients users to the software and helps the site complete the onboarding process so they are ready to begin screening. The basics of MTSS in mathematics are covered, as well as specifics on how to conduct the screening and how to conduct classwide and individual intervention. The research basis for the assessments and interventions is detailed in documents and videos provided in the support portal. An alignment study is provided detailing the alignment of the skills covered with the Common Core State Standards. Users can access a full list of assessments and supplemental readings. An FAQ section addresses questions such as: Why are the assessments timed? How were the screening measures selected? How does Spring Math determine whether a student is at risk? What does the “weeks with scores” metric mean? What are “tool skills” in math? Why are assessments given as part of the intervention? Why do the risk criteria differ across grades for the same skill? How do the assessments in Spring Math differ from other math assessments? Why do the screening measures seem so hard for my students? What research evidence supports the use of Spring Math? What research was used as the basis for developing the assessments?


The training instructions and materials were originally field-tested in a district-wide trial of RtI that included classwide math intervention in all classes in grades 1-8 in the district (VanDerHeyden & Burns, 2005; VanDerHeyden, Witt, & Gilbertson, 2007). These materials have been used in multiple research studies and implementation projects since 2002. A previous version of Spring Math, called Intervention Advisor, was pilot-tested in the Boston public schools using the training materials and protocols that are now part of Spring Math.

Access to Technical Support:
Online support is provided on the site with short, embedded video tutorials explaining all aspects of implementation, from screening to intervention selection and management. Support is embedded in the tool through automated data interpretation, summary reports and prompts for the teacher to take the next action, intervention scripts, and consequence supports in the form of student growth reports. A coach dashboard organizes implementation and effect data at the school level and directs coaches to check in with teachers whose classes may need troubleshooting. When interventions are not producing the anticipated effect on student learning, the system provides an acquisition lesson to re-teach the skill within the teacher's dashboard. For systems that desire more training, we offer a range of virtual and in-person support, some of which is free and some of which is provided at cost.
Recommended Administration Formats Include:
  • Individual students
  • Small group of students
Minimum Number of Minutes Per Session:
15
Minimum Number of Sessions Per Week:
5
Minimum Number of Weeks:
15
Detailed Implementation Manual or Instructions Available:
Yes
Is Technology Required?
  • Computer or tablet
  • Internet connection

Program Information

Descriptive Information

Please provide a description of program, including intended use:

Spring Math is a web-based MTSS/RTI system for mathematics. Please note: As an RTI system, Spring Math includes screening, progress monitoring, and intervention; however, NCII has only reviewed the intervention component for the purposes of the Academic Intervention Tools Chart.

The program is intended for use in the following age(s) and/or grade(s).

not selected Age 0-3
not selected Age 3-5
selected Kindergarten
selected First grade
selected Second grade
selected Third grade
selected Fourth grade
selected Fifth grade
selected Sixth grade
selected Seventh grade
selected Eighth grade
not selected Ninth grade
not selected Tenth grade
not selected Eleventh grade
not selected Twelfth grade


The program is intended for use with the following groups.

not selected Students with disabilities only
not selected Students with learning disabilities
not selected Students with intellectual disabilities
not selected Students with emotional or behavioral disabilities
not selected English language learners
selected Any student at risk for academic failure
not selected Any student at risk for emotional and/or behavioral difficulties
not selected Other
If other, please describe:

ACADEMIC INTERVENTION: Please indicate the academic area of focus.

Early Literacy

not selected Print knowledge/awareness
not selected Alphabet knowledge
not selected Phonological awareness
not selected Early writing
not selected Early decoding abilities
not selected Other

If other, please describe:

Language

not selected Expressive and receptive vocabulary
not selected Grammar
not selected Syntax
not selected Listening comprehension
not selected Other
If other, please describe:

Reading

not selected Phonological awareness
not selected Phonics/word study
not selected Comprehension
not selected Fluency
not selected Vocabulary
not selected Spelling
not selected Other
If other, please describe:

Mathematics

selected Computation
selected Concepts and/or word problems
selected Whole number arithmetic
selected Comprehensive: Includes computation/procedures, problem solving, and mathematical concepts
selected Algebra
selected Fractions, decimals (rational number)
not selected Geometry and measurement
not selected Other
If other, please describe:

Writing

not selected Handwriting
not selected Spelling
not selected Sentence construction
not selected Planning and revising
not selected Other
If other, please describe:

BEHAVIORAL INTERVENTION: Please indicate the behavior area of focus.

Externalizing Behavior

not selected Physical Aggression
not selected Verbal Threats
not selected Property Destruction
not selected Noncompliance
not selected High Levels of Disengagement
not selected Disruptive Behavior
not selected Social Behavior (e.g., Peer interactions, Adult interactions)
not selected Other
If other, please describe:

Internalizing Behavior

not selected Depression
not selected Anxiety
not selected Social Difficulties (e.g., withdrawal)
not selected School Phobia
not selected Other
If other, please describe:

Acquisition and cost information

Where to obtain:

Address
2340 Energy Park Dr. St. Paul, MN 55108
Phone Number
(651) 999-6000 Option 2
Website
www.springmath.com

Initial cost for implementing program:

Cost
$10.00
Unit of cost
student

Replacement cost per unit for subsequent use:

Cost
$10.00
Unit of cost
student
Duration of license
year

Additional cost information:

Describe the basic pricing plan and structure of the program. Also provide information on what is included in the published program, as well as what is not included but required for implementation (e.g., computer and/or internet access).

We require an initial "onboarding training," customized to your district and conducted directly with your staff. We call this our Onboarding Advantage; it is a one-time setup, training, and consultation cost of $795. This training completes all roster set-up and trains your teachers in the basics of navigating the tool. Teachers and administrators interact with Spring Math online in a password-protected account, similar to other student assessment and intervention systems. When teachers log in to their accounts, the teacher dashboard provides everything needed to conduct math MTSS each day, including all assessments, intervention protocols, automated decision making, and summary reports. A full coach dashboard characterizes intervention use and effects within the school to facilitate problem-solving team meetings and to direct in-class coaching support where it's needed for better results. An extensive support portal provides games, word problems, materials for supplementing core instruction, and instructional calendars for all grade levels.
Sites must have access to one computer per teacher, an internet connection, and the ability to print in black and white. Spring Math provides extensive implementation support at no additional cost through a support portal to which all users have access. Support materials include how-to videos, brief how-to documents, access to all assessments and acquisition lesson plans for 130 skills, and live and archived webinars. In addition to the support portal, sites that wish to purchase additional coaching support can do so through our OnGoing Advantage or Coaching Advantage services. Our network of trained coaches has expertise in RtI/MTSS leadership and specific training in Spring Math. OnGoing support includes examining your system's data and conducting virtual systems-level problem-solving meetings to improve results.
These packages range from $1,850 to $3,500 for up to 6 hours of virtual coaching assistance. In-person, face-to-face consultation can be arranged for sites that desire such assistance. We also offer our coach cohort program free to all interested users; it includes a web-based community and monthly live coaching sessions conducted by our leadership team, with Q&A, shared note-taking, and archived recordings for later viewing or review with your team members. https://www.sourcewelltech.org/math-intervention-spring-math/see-the-difference

Program Specifications

Setting for which the program is designed.

selected Individual students
selected Small group of students
not selected BI ONLY: A classroom of students

If group-delivered, how many students compose a small group?

  

Program administration time

Minimum number of minutes per session
15
Minimum number of sessions per week
5
Minimum number of weeks
15
not selected N/A (implemented until effective)

If the intervention program is intended to occur less frequently than 60 minutes a week for approximately 8 weeks, justify the level of intensity:

Does the program include highly specified teacher manuals or step by step instructions for implementation?
Yes

BEHAVIORAL INTERVENTION: Is the program affiliated with a broad school- or class-wide management program?

If yes, please identify and describe the broader school- or class-wide management program:

Does the program require technology?
Yes

If yes, what technology is required to implement your program?
selected Computer or tablet
selected Internet connection
not selected Other technology (please specify)

If your program requires additional technology not listed above, please describe the required technology and the extent to which it is combined with teacher small-group instruction/intervention:
Although all materials are provided to the teacher via an online interface, the actual administration of the assessments and interventions within Spring Math does not require technology because they are printed and delivered via paper and pencil. The Spring Math classwide math intervention can be delivered in small groups, as can the individual interventions (so long as there is a small group of students who require the same individual intervention, which would adjust weekly).

Training

How many people are needed to implement the program?

Is training for the instructor or interventionist required?
Yes
If yes, is the necessary training free or at-cost?
At-cost

Describe the time required for instructor or interventionist training:
1-3 hours for onboarding training directly with school teams

Describe the format and content of the instructor or interventionist training:
Training is provided by implementation specialists. The training orients users to the software and helps the site complete the onboarding process so they are ready to begin screening. The basics of MTSS in mathematics are covered, as well as specifics on how to conduct the screening and how to conduct classwide and individual intervention. The research basis for the assessments and interventions is detailed in documents and videos provided in the support portal. An alignment study is provided detailing the alignment of the skills covered with the Common Core State Standards. Users can access a full list of assessments and supplemental readings. An FAQ section addresses questions such as: Why are the assessments timed? How were the screening measures selected? How does Spring Math determine whether a student is at risk? What does the “weeks with scores” metric mean? What are “tool skills” in math? Why are assessments given as part of the intervention? Why do the risk criteria differ across grades for the same skill? How do the assessments in Spring Math differ from other math assessments? Why do the screening measures seem so hard for my students? What research evidence supports the use of Spring Math? What research was used as the basis for developing the assessments?

What types of professionals are qualified to administer your program?

selected Special Education Teacher
selected General Education Teacher
not selected Reading Specialist
selected Math Specialist
not selected EL Specialist
selected Interventionist
not selected Student Support Services Personnel (e.g., counselor, social worker, school psychologist, etc.)
not selected Applied Behavior Analysis (ABA) Therapist or Board Certified Behavior Analyst (BCBA)
selected Paraprofessional
not selected Other

If other, please describe:

Does the program assume that the instructor or interventionist has expertise in a given area?
No   

If yes, please describe: 


Are training manuals and materials available?
Yes

Describe how the training manuals or materials were field-tested with the target population of instructors or interventionist and students:
The training instructions and materials were originally field-tested in a district-wide trial of RtI that included classwide math intervention in all classes in grades 1-8 in the district (VanDerHeyden & Burns, 2005; VanDerHeyden, Witt, & Gilbertson, 2007). These materials have been used in multiple research studies and implementation projects since 2002. A previous version of Spring Math, called Intervention Advisor, was pilot-tested in the Boston public schools using the training materials and protocols that are now part of Spring Math.

Do you provide fidelity of implementation guidance such as a checklist for implementation in your manual?
Yes

Can practitioners obtain ongoing professional and technical support?
Yes

If yes, please specify where/how practitioners can obtain support:

Online support is provided on the site with short, embedded video tutorials explaining all aspects of implementation, from screening to intervention selection and management. Support is embedded in the tool through automated data interpretation, summary reports and prompts for the teacher to take the next action, intervention scripts, and consequence supports in the form of student growth reports. A coach dashboard organizes implementation and effect data at the school level and directs coaches to check in with teachers whose classes may need troubleshooting. When interventions are not producing the anticipated effect on student learning, the system provides an acquisition lesson to re-teach the skill within the teacher's dashboard. For systems that desire more training, we offer a range of virtual and in-person support, some of which is free and some of which is provided at cost.

Summary of Evidence Base

Please identify, to the best of your knowledge, all the research studies that have been conducted to date supporting the efficacy of your program, including studies currently or previously submitted to NCII for review. Please provide citations only (in APA format); do not include any descriptive information on these studies. NCII staff will also conduct a search to confirm that the list you provide is accurate.

VanDerHeyden, A. M., McLaughlin, T., Algina, J., & Snyder, P. (2012). Randomized evaluation of a supplemental grade-wide mathematics intervention. American Educational Research Journal, 49, 1251-1284. http://aer.sagepub.com/cgi/reprint/49/6/1251?ijkey=CHbWMLJp8/kRc&keytype=ref&siteid=spaer

 

VanDerHeyden, A. M., & Codding, R. (2015). Practical effects of classwide mathematics intervention. School Psychology Review, 44, 169-190. doi: http://dx.doi.org/10.17105/spr-13-0087.1

 

Codding, R., VanDerHeyden, A. M., Martin, R. J., & Perrault, L. (2016). Manipulating treatment dose: Evaluating the frequency of a small group intervention targeting whole number operations. Learning Disabilities Research & Practice, 31(4), 208-220.

Study Information

Study Citations

Codding, R., VanDerHeyden, A. M., Martin, R. J., & Perrault, L. (2016). Manipulating treatment dose: Evaluating the frequency of a small group intervention targeting whole number operations. Learning Disabilities Research & Practice, 31(4), 208-220.

Participants: Full Bobble

Describe how students were selected to participate in the study:
All students in grades 2, 3, and 4 in the participating school were eligible (N = 236). Eligible students were screened for mathematics difficulties using standard CBM screening procedures for mathematics and 141 of the screened students met at-risk criteria (less than 40 digits correct per two minutes for grade 2 and 3 students and less than 80 digits correct per two minutes for students in grade 4) for inclusion in study procedures. Because the study used a fluency-building intervention to examine dosage effects, follow-up assessments were conducted with all 141 students to verify that a fluency-building intervention would be a good match with their learning needs. Specifically, students were excluded if (a) they scored in the frustrational range (less than 20 digits correct per two minutes for grades 2 and 3 and less than 40 digits correct for grade 4) on all grade-level probes (this sequence is described below); (b) they could not represent a computation problem using drawings; or (c) they were receiving special education services. The final sample was 101 students in 10 classrooms from grade 2 (n = 39), grade 3 (n = 46), and grade 4 (n = 16).

Describe how students were identified as being at risk for academic failure (AI) or as having emotional or behavioral difficulties (BI):
All students in grades 2, 3, and 4 were screened using standard CBM procedures. The Grade 2 screening was addition and subtraction fact families 0-20; the Grade 3 screening was mixed multi-digit addition and subtraction with and without regrouping; the Grade 4 screening was multiplication and division fact families 0-12. Any student who scored below the mastery criterion for these measures at their grade level participated in a follow-up survey-level assessment that was administered individually by researchers. In Grade 2, this involved administering subtraction with minuends to 20, subtraction with minuends to 9, subtraction with minuends to 5, addition with sums to 20, addition with sums to 12, and addition with sums to 6, in that order. Grade 3 skills assessed were subtraction of 2-digit by 2-digit numbers with and without regrouping, subtraction with minuends to 20, subtraction with minuends to 9, subtraction with minuends to 5, addition of 2-digit by 2-digit numbers with and without regrouping, sums to 20, sums to 12, and sums to 6. Grade 4 skills assessed were division 0-12, division 0-5, multiplication 0-12, multiplication 0-5, addition & subtraction fact families 0-20, subtraction 0-20, and sums to 20. Assessment progressed until the child scored in the instructional range (grades 2-3: 20-39 digits correct in two minutes; grade 4: 40-79 digits correct in two minutes). For the instructional-level skill, the student was asked to draw a solution to the problem to assess the child's mathematical thinking and reasoning and to verify conceptual understanding of the task (e.g., understanding of multiplication as repeated addition can be demonstrated by the child drawing factor sets, such as 5 x 3 drawn as three sets of five or five sets of three).
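The cut scores above amount to a simple three-way decision rule. A minimal sketch (illustrative only; the thresholds are those reported in the study, while the function and label names are ours):

```python
def classify_probe(grade: int, digits_correct: int) -> str:
    """Classify a two-minute CBM probe score using the study's cut scores.

    Grades 2-3: frustrational < 20, instructional 20-39, mastery >= 40.
    Grade 4:    frustrational < 40, instructional 40-79, mastery >= 80.
    (The study enrolled grades 2-4 only.)
    """
    frustrational_cut, mastery_cut = (20, 40) if grade in (2, 3) else (40, 80)
    if digits_correct < frustrational_cut:
        return "frustrational"
    if digits_correct < mastery_cut:
        return "instructional"
    return "mastery"
```

Under this rule, any student below the mastery cut at screening counted as at risk, survey-level assessment continued down the skill sequence until a score fell in the instructional range, and students in the frustrational range on all grade-level probes were excluded as poor matches for a fluency-building intervention.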

ACADEMIC INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:
  • below the 30th percentile on local or national norm, or
  • identified disability related to the focus of the intervention?
100.0%

BEHAVIORAL INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:
  • emotional disability label,
  • placed in an alternative school/classroom,
  • non-responsive to Tiers 1 and 2, or
  • designation of severe problem behaviors on a validated scale or through observation?
%

Specify which condition is the submitted intervention:
4 days per week intervention using the classwide math intervention protocol with small groups.

Specify which condition is the control condition:
Weekly progress monitoring only, which we refer to as the control condition in the LDRP article.

If you have a third, competing condition, in addition to your control and intervention condition, identify what the competing condition is (data from this competing condition will not be used):
We had two additional conditions: once weekly intervention and twice weekly intervention.

Using the tables that follow, provide data demonstrating comparability of the program group and control group in terms of demographics.

Grade Level

Demographic | Program Number | Control Number | Effect Size: Cox Index for Binary Differences
Age less than 1
Age 1
Age 2
Age 3
Age 4
Age 5
Kindergarten
Grade 1
Grade 2 | 42.3% | 38.5% | 0.10
Grade 3 | 46.2% | 42.3% | 0.10
Grade 4 | 11.5% | 19.2% | 0.33
Grade 5
Grade 6
Grade 7
Grade 8
Grade 9
Grade 10
Grade 11
Grade 12

Race–Ethnicity

Demographic | Program Number | Control Number | Effect Size: Cox Index for Binary Differences
African American
American Indian
Asian/Pacific Islander
Hispanic
White
Other

Socioeconomic Status

Demographic | Program Number | Control Number | Effect Size: Cox Index for Binary Differences
Subsidized Lunch
No Subsidized Lunch

Disability Status

Demographic | Program Number | Control Number | Effect Size: Cox Index for Binary Differences
Speech-Language Impairments
Learning Disabilities
Behavior Disorders
Emotional Disturbance
Intellectual Disabilities
Other
Not Identified With a Disability | 100.0% | 100.0% | 0.00

ELL Status

Demographic | Program Number | Control Number | Effect Size: Cox Index for Binary Differences
English Language Learner
Not English Language Learner

Gender

Demographic | Program Number | Control Number | Effect Size: Cox Index for Binary Differences
Female
Male

Mean Effect Size

0.13
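The "Cox Index for Binary Differences" reported in these tables is a standard conversion of two proportions into an approximate standardized mean difference: each proportion is converted to a logit, and the difference is divided by 1.65. A sketch under that assumption (illustrative; not the authors' code):

```python
import math

def cox_index(p_program: float, p_control: float) -> float:
    """Cox index for a binary difference: the logit difference scaled by 1.65.

    Converts each group's proportion to log-odds and rescales so the result
    approximates a standardized mean difference (Cohen's d).
    """
    def logit(p: float) -> float:
        return math.log(p / (1.0 - p))

    return abs(logit(p_program) - logit(p_control)) / 1.65

# Grade 2 row from the table above (42.3% program vs. 38.5% control):
d_grade2 = cox_index(0.423, 0.385)  # about 0.096, matching the reported 0.10
```

The 1.65 scaling assumes logistic-distributed latent outcomes; reported values may differ slightly from this sketch because of rounding or small-sample corrections.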

For any substantively (e.g., effect size ≥ 0.25 for pretest or demographic differences) or statistically significant (e.g., p < 0.05) pretest differences between groups in the descriptions below, please describe the extent to which these differences are related to the impact of the treatment. For example, if analyses were conducted to determine that outcomes from this study are due to the intervention and not demographic characteristics, please describe the results of those analyses here.

In our multi-level models, grade was included as a predictor. Note: We reported demographic data for the full eligible sample. We were not able to report demographic data in our final sample by treatment conditions because the school did not provide demographic data by student. The eligible sample was 51% male, 48% receiving free/reduced price lunch, 15% ELL, and 53% Caucasian, 20% African American, 16% Hispanic, 7% Asian, and 4% Multiracial. Following screening and survey-level assessment, students were sorted into groups by grade level and starting skill for intervention. Stratified random sampling was used to assign students to study conditions so that each condition consisted of approximately equal numbers of students from each grade level and each starting skill level (starting skill level is related to student performance proficiency).

Design: Half Bobble

What method was used to determine students' placement in treatment/control groups?
Random
Please describe the assignment method or the process for defining treatment/comparison groups.
Participants were assigned to one of four conditions (i.e., four times weekly, twice weekly, once weekly, control) using stratified random sampling, controlling for skill level and grade so that each condition consisted of approximately equivalent numbers of students from each grade and skill level. Six groups of 3–5 students were arranged per condition (i.e., 24 total groups).

What was the unit of assignment?
Students
If other, please specify:

Please describe the unit of assignment:

What unit(s) were used for primary data analysis?
not selected Schools
not selected Teachers
selected Students
not selected Classes
not selected Other
If other, please specify:

Please describe the unit(s) used for primary data analysis:

Fidelity of Implementation: Full Bobble

How was the program delivered?
not selected Individually
selected Small Group
not selected Classroom

If small group, answer the following:

Average group size
4
Minimum group size
3
Maximum group size
5

What was the duration of the intervention (If duration differed across participants, settings, or behaviors, describe for each.)?

Weeks
4.00
Sessions per week
4.00
Duration of sessions in minutes
12.00
What were the background, experience, training, and ongoing support of the instructors or interventionists?
Eight school psychology graduate students (3 doctoral, 5 specialist-level) collected screening and progress monitoring data. Six of these students (3 doctoral, 3 specialist-level) also led the intervention groups for each condition. Each graduate student was trained to use curriculum-based assessment and implement academic interventions as part of their program coursework. Graduate students participated in two training sessions that were specific to the procedures employed in the current study. First, graduate students viewed a 20-min video in which the first author modeled the treatment protocol with three graduate students (not otherwise serving in the study) acting as participants. Graduate students observed the video in pairs and practiced implementing the treatment protocol with one another. During the second session, the first author reviewed all screening materials and observed test administration to correct errors and provide feedback. Scripted protocols guided all assessment and intervention activities.

Describe when and how fidelity of treatment information was obtained.
Procedural fidelity of intervention steps was evaluated using a checklist that accounted for 18 distinct interventionist behaviors representing one 12-min treatment session. All intervention sessions for each group were recorded using handheld audiotape recorders. Two specialist-level school psychology graduate students were trained in the intervention procedures and acted as independent reviewers. These graduate students were not otherwise involved with the study. The observers listened to 54 audio recordings (i.e., 12-min sessions) representing between 2 and 6 days of treatment for each of the six interventionists (57% of treatment sessions administered). We also observed and quantified child engagement during 12-18% of intervention sessions to ensure equivalence across experimental conditions.

What were the results on the fidelity-of-treatment implementation measure?
Mean treatment adherence was 93% (range: 60%-100%) across all six interventionists. There were no commission or omission errors; the errors that did occur reflected wording alterations when intervention activities were explained. Mean treatment adherence was (a) 100% and 86% (66%-100%) for the four-times-weekly interventionists, (b) 97% (89%-100%) and 87% (60%-100%) for the twice-weekly interventionists, and (c) 100% for both once-weekly interventionists. Mean observed student engagement was 98% (94%-100%) on the Behavioral Observation of Students in Schools (BOSS; Shapiro, 2004).

Was the fidelity measure also used in control classrooms?
No. The fidelity measure was not used for the intervention procedures because control students were exposed only to weekly progress monitoring assessment. Interscorer agreement data were collected on assessments for students in both control and treatment conditions.

Measures and Results

Measures Targeted: Full Bobble
Measures Broader: Full Bobble

Study measures are classified as targeted, broader, or administrative data according to the following definitions:

  • Targeted measures
    Assess outcomes, such as competencies or skills that the program was directly targeted to improve.
    • In the academic domain, targeted measures typically are not the very items taught but rather novel items structured similarly to the content addressed in the program. For example, if a program taught word-attack skills, a targeted measure would be decoding of pseudo words. If a program taught comprehension of cause-effect passages, a targeted measure would be answering questions about cause-effect passages structured similarly to those used during intervention, but not including the very passages used for intervention.
    • In the behavioral domain, targeted measures evaluate aspects of external or internal behavior the program was directly targeted to improve and are operationally defined.
  • Broader measures
    Assess outcomes that are related to the competencies or skills targeted by the program but not directly taught in the program.
    • In the academic domain, if a program taught word-level reading skill, a broader measure would be answering questions about passages the student reads. If a program taught calculation skill, a broader measure would be solving word problems that require the same kinds of calculation skill taught in the program.
    • In the behavioral domain, if a program taught a specific skill like on-task behavior in one classroom, a broader measure would be academic performance in that setting or on-task behavior in another setting.
  • Administrative data measures apply only to behavioral intervention tools and are measures such as office discipline referrals (ODRs) and graduation rates which do not have psychometric properties as do other, more traditional targeted or broader measures.



What populations are you submitting outcome data for?
  • Full sample (selected)
  • Students at or below the 20th percentile (selected)
  • English language learners (not selected)
  • Racial/ethnic subgroups (not selected)
  • Economically disadvantaged students (low socioeconomic status) (not selected)
Targeted Measure | Reverse Coded? | Reliability | Relevance | Exposure
Broader Measure | Reverse Coded? | Reliability | Relevance | Exposure
Administrative Data Measure | Reverse Coded? | Relevance

Posttest Data

Targeted Measures (Full Sample)
Measure | Sample | Type | Effect Size | P

Broader Measures (Full Sample)
Measure | Sample | Type | Effect Size | P

Administrative Measures (Full Sample)
Measure | Sample | Type | Effect Size | P

Targeted Measures (Subgroups)
Measure | Sample | Type | Effect Size | P

Broader Measures (Subgroups)
Measure | Sample | Type | Effect Size | P

Administrative Measures (Subgroups)
Measure | Sample | Type | Effect Size | P
For any substantively (e.g., effect size ≥ 0.25 for pretest or demographic differences) or statistically significant (e.g., p < 0.05) pretest differences, please describe the extent to which these differences are related to the impact of the treatment. For example, if analyses were conducted to determine that outcomes from this study are due to the intervention and not pretest characteristics, please describe the results of those analyses here.
Please explain any missing data or instances of measures with incomplete pre- or post-test data.
If you have excluded a variable or data that are reported in the study being submitted, explain the rationale for exclusion:
Describe the analyses used to determine whether the intervention produced changes in student outcomes:
Multilevel modeling was used to examine predictors of slope and final score across all measures. SAS PROC MIXED was used for the analyses due to the longitudinal nature of the data and our interest in monitoring individual progress of each student. Two-level models were analyzed in which students served as the level-2 unit and repeated observations (1 pretest screening + 4 weeks of consecutive progress monitoring) served as the level-1 unit. Across all models, the restricted maximum likelihood (REML) estimation method and unstructured covariance were employed.

Predictors for the models were grade and treatment assignment (i.e., four times weekly, twice weekly, once weekly, or control). Treatment assignment was coded (1 to 4) so that variables representing group membership could be used to predict differences in students' final scores and growth rates over time. The control group and grade 4 students were used as the reference groups. For parsimony, the level-1 equation of students' progress over time was combined with the level-2 equation of differences in students' performance as a function of treatment assignment and grade (Singer & Willett, 2003). Progress monitoring scores were centered at the end of treatment (final session at end of five weeks), resulting in negative numbers when growth was positive.

Table 3 (p. 9) displays the unconditional and final models across measures. The unconditional model was fit in order to determine whether students' performances varied over time across each measure. The random effects parameters demonstrated that across measures, students' scores significantly varied around the mean, and there were significant differences between each student's observed and predicted scores over time. Final models were determined to be a better fit than the unconditional models because AIC and BIC values were lower (Singer & Willett, 2003).
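The study ran these models in SAS PROC MIXED. As a rough illustration of the same two-level growth-model structure, here is a sketch using Python's statsmodels as a stand-in; the data are simulated, and the variable names (score, week, group, student) are assumptions for illustration, not the study's dataset:

```python
# Sketch of a two-level growth model: repeated observations (level 1)
# nested within students (level 2), random intercept and slope per
# student, fixed effects for time, treatment group, and their
# interaction, fit by REML. Data below are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
groups = ["control", "1x", "2x", "4x"]
rows = []
for s in range(40):
    g = groups[s % 4]
    intercept = 20 + rng.normal(0, 3)
    slope = 1.0 + 0.5 * groups.index(g) + rng.normal(0, 0.2)
    # 1 pretest screening + 4 weekly probes, centered at the final
    # session (week 0), so earlier sessions carry negative time values
    for week in (-4, -3, -2, -1, 0):
        rows.append({"student": s, "group": g, "week": week,
                     "score": intercept + slope * week + rng.normal(0, 1)})
df = pd.DataFrame(rows)

# Control group as the reference category, mirroring the study's coding.
model = smf.mixedlm("score ~ week * C(group, Treatment('control'))",
                    df, groups=df["student"], re_formula="~week")
result = model.fit(reml=True)
print(result.summary())
```

With time centered at the final session, the intercept terms estimate group differences in final score, while the week terms estimate group differences in growth rate, matching the logic described above.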

Additional Research

Is the program reviewed by WWC or E-ESSA?
No
Summary of WWC / E-ESSA Findings:
This program was not reviewed by What Works Clearinghouse.
This program was not reviewed by Evidence for ESSA.
How many additional research studies are potentially eligible for NCII review?
0
Citations for Additional Research Studies:

Data Collection Practices

Most tools and programs evaluated by the NCII are branded products which have been submitted by the companies, organizations, or individuals that disseminate these products. These entities supply the textual information shown above, but not the ratings accompanying the text. NCII administrators and members of our Technical Review Committees have reviewed the content on this page, but NCII cannot guarantee that this information is free from error or reflective of recent changes to the product. Tools and programs have the opportunity to be updated annually or upon request.