Headsprout
Study: Tyler et al. (2015)

Summary

Headsprout is a research-based, online supplemental reading program that teaches reading fundamentals in grades PreK-2 and reading comprehension strategies in grades 3-5. The highly engaging, interactive program uses patented technology that enables adaptive online instruction. Headsprout was designed according to strict standards centered on the results of usability testing and externally validated reading outcomes. The program tailors its instruction to the needs and learning pace of every student, and its patented, research-based teaching methods have been tested and verified in real classroom environments to improve students' reading skills. What sets Headsprout apart from other online reading programs is its scaffolded teaching approach, which automatically adapts so that every student receives the individualized practice and instruction they need.
With the Headsprout Early Reading component, students learn to read and read to learn. Early readers interact with engaging online episodes and read printable eBooks designed to instill key reading fundamentals: phonemic awareness, phonics, fluency, vocabulary, and beginning comprehension. The Early Reading component incorporates hundreds of instructional routines that automatically adapt to the specific needs and learning pace of each student; all have been tested for effectiveness in the lab and in actual classrooms. Once readers have demonstrated a solid grasp of the basics, they move on to the Headsprout Reading Comprehension episodes. These episodes teach the four primary components of reading comprehension: finding facts, making inferences, identifying themes, and learning vocabulary in context.
The Headsprout Reading Comprehension component provides students with instructional strategies to increase their ability to comprehend what they read, to demonstrate their understanding across subject areas, and to apply those skills on standardized tests. Headsprout reading programs are built on evidence-based practices, data-based decision making, and efficient placement and progress monitoring to ensure reading success for all students. The built-in reports support close collaboration among teachers, interventionists, school psychologists, and administrators.

Target Grades:
K, 1, 2, 3, 4, 5
Target Populations:
  • Students with learning disabilities
  • Students with intellectual disabilities
  • Students with emotional or behavioral disabilities
  • English language learners
  • Any student at risk for academic failure
Area(s) of Focus:
  • Phonological awareness
  • Phonics/word study
  • Comprehension
  • Fluency
  • Vocabulary
  • Spelling
Where to Obtain:
Learning A-Z
1840 East River Rd, #320 Tucson, AZ 85718
866-889-3729
https://www.headsprout.com
Initial Cost:
$199.95 per classroom
Replacement Cost:
$199.95 per classroom per year

Headsprout's basic pricing plan includes one classroom license that is valid for one year. A license grants access to the Headsprout website as well as permission to use its copyrighted resources as part of the classroom curriculum. Each educator using the resources must have a license in order to obtain the necessary permission, and each Headsprout license is valid for one family or classroom only (with up to 36 students). Licenses must be maintained for continued permission to use downloaded, copyrighted materials, and each license must be registered in the name of the classroom teacher using the resources.

Learning A-Z believes that ongoing Professional Development is critical to the success of any implementation. To that end, Professional Development is included with every license purchased. The level of Professional Development provided is based on the amount of each individual purchase, as outlined below:
  • $0 to $1,999: On-demand videos accessed via the Headsprout website
  • $2,000 to $4,999: 2 customized webinars; 1 standard e-learning module
  • $5,000 to $9,999: 6 customized webinars; 1 standard e-learning module
  • $10,000 to $24,999: 2 days of customized, onsite Professional Development; 12 customized webinars; 1 customized e-learning module
  • $25,000 to $49,999: 8 days of customized, onsite Professional Development; 20 customized webinars; 4 customized e-learning modules
  • $50,000+: 20 days of customized, onsite Professional Development; 30 customized webinars; 6 customized e-learning modules

Staff Qualified to Administer Include:
  • Special Education Teacher
  • General Education Teacher
  • Reading Specialist
  • Math Specialist
  • EL Specialist
  • Interventionist
  • Student Support Services Personnel (e.g., counselor, social worker, school psychologist, etc.)
  • Paraprofessional
  • Other:
Training Requirements:
Training not required

a. Free Public Webinars: Written, designed, and presented by educators for educators, these webinars teach skills, tips, and strategies for making Learning A-Z tools support your classroom and students' needs. Convenient webinar dates and times are available throughout each month.

b. Customized Webinars: These webinars are tailored to the specific needs of your teaching staff, with agenda topics and discussion chosen to ensure that everything your staff needs to know is covered. They are also scheduled at a time that works for your group.

c. On-Site Training and Support: With the support of a Learning A-Z Professional Development Specialist, teachers discover how to use the resources more effectively through personalized, ongoing, hands-on instruction. On-site training can take numerous formats and is designed to scaffold teachers in using Learning A-Z's products in a manner that is fully integrated into a district's existing curriculum and philosophies of learning. Teachers also receive tips and ideas for customizing resources to meet the needs of all learners.


Headsprout: The Headsprout instructional resources were thoroughly tested using single-subject control analyses and between-groups experimental designs. Over 250 learners participated in user testing, and an instructional segment was deemed ready for use only once all learners responded correctly on at least 90% of opportunities. Over 10,000 changes were made to the program as a result of user-testing feedback before the program was made available for purchase.

Implementation: Learning A-Z collects performance data on all eLearning modules; when users demonstrate high error rates on particular segments, the segment is adjusted and retested, and the iterative development cycle continues. All trainings also include hands-on activities, and user success is measured at the time of the session. Additionally, we continuously evaluate program usage following training events to identify whether a training sequence needs to be adjusted.

Access to Technical Support:
Learning A-Z provides ongoing technical support for all licensed customers. In addition, free and fee-based professional development services are available based on the purchase levels outlined in the previous sections.
Recommended Administration Formats Include:
  • Individual students
Minimum Number of Minutes Per Session:
20
Minimum Number of Sessions Per Week:
3
Minimum Number of Weeks:
25
Detailed Implementation Manual or Instructions Available:
Yes
Is Technology Required?
  • Computer or tablet
  • Internet connection

Program Information

Descriptive Information

Please provide a description of program, including intended use:

Headsprout is a research-based, online supplemental reading program that teaches reading fundamentals in grades PreK-2 and reading comprehension strategies in grades 3-5. The highly engaging, interactive program uses patented technology that enables adaptive online instruction. Headsprout was designed according to strict standards centered on the results of usability testing and externally validated reading outcomes. The program tailors its instruction to the needs and learning pace of every student, and its patented, research-based teaching methods have been tested and verified in real classroom environments to improve students' reading skills. What sets Headsprout apart from other online reading programs is its scaffolded teaching approach, which automatically adapts so that every student receives the individualized practice and instruction they need.
With the Headsprout Early Reading component, students learn to read and read to learn. Early readers interact with engaging online episodes and read printable eBooks designed to instill key reading fundamentals: phonemic awareness, phonics, fluency, vocabulary, and beginning comprehension. The Early Reading component incorporates hundreds of instructional routines that automatically adapt to the specific needs and learning pace of each student; all have been tested for effectiveness in the lab and in actual classrooms. Once readers have demonstrated a solid grasp of the basics, they move on to the Headsprout Reading Comprehension episodes. These episodes teach the four primary components of reading comprehension: finding facts, making inferences, identifying themes, and learning vocabulary in context.
The Headsprout Reading Comprehension component provides students with instructional strategies to increase their ability to comprehend what they read, to demonstrate their understanding across subject areas, and to apply those skills on standardized tests. Headsprout reading programs are built on evidence-based practices, data-based decision making, and efficient placement and progress monitoring to ensure reading success for all students. The built-in reports support close collaboration among teachers, interventionists, school psychologists, and administrators.

The program is intended for use in the following age(s) and/or grade(s).

not selected Age 0-3
not selected Age 3-5
selected Kindergarten
selected First grade
selected Second grade
selected Third grade
selected Fourth grade
selected Fifth grade
not selected Sixth grade
not selected Seventh grade
not selected Eighth grade
not selected Ninth grade
not selected Tenth grade
not selected Eleventh grade
not selected Twelfth grade


The program is intended for use with the following groups.

not selected Students with disabilities only
selected Students with learning disabilities
selected Students with intellectual disabilities
selected Students with emotional or behavioral disabilities
selected English language learners
selected Any student at risk for academic failure
not selected Any student at risk for emotional and/or behavioral difficulties
not selected Other
If other, please describe:

ACADEMIC INTERVENTION: Please indicate the academic area of focus.

Early Literacy

not selected Print knowledge/awareness
not selected Alphabet knowledge
not selected Phonological awareness
not selected Early writing
not selected Early decoding abilities
not selected Other

If other, please describe:

Language

not selected Expressive and receptive vocabulary
not selected Grammar
not selected Syntax
not selected Listening comprehension
not selected Other
If other, please describe:

Reading

selected Phonological awareness
selected Phonics/word study
selected Comprehension
selected Fluency
selected Vocabulary
selected Spelling
not selected Other
If other, please describe:

Mathematics

not selected Computation
not selected Concepts and/or word problems
not selected Whole number arithmetic
not selected Comprehensive: Includes computation/procedures, problem solving, and mathematical concepts
not selected Algebra
not selected Fractions, decimals (rational number)
not selected Geometry and measurement
not selected Other
If other, please describe:

Writing

not selected Handwriting
not selected Spelling
not selected Sentence construction
not selected Planning and revising
not selected Other
If other, please describe:

BEHAVIORAL INTERVENTION: Please indicate the behavior area of focus.

Externalizing Behavior

not selected Physical Aggression
not selected Verbal Threats
not selected Property Destruction
not selected Noncompliance
not selected High Levels of Disengagement
not selected Disruptive Behavior
not selected Social Behavior (e.g., Peer interactions, Adult interactions)
not selected Other
If other, please describe:

Internalizing Behavior

not selected Depression
not selected Anxiety
not selected Social Difficulties (e.g., withdrawal)
not selected School Phobia
not selected Other
If other, please describe:

Acquisition and cost information

Where to obtain:

Address
1840 East River Rd, #320 Tucson, AZ 85718
Phone Number
866-889-3729
Website
https://www.headsprout.com

Initial cost for implementing program:

Cost
$199.95
Unit of cost
classroom

Replacement cost per unit for subsequent use:

Cost
$199.95
Unit of cost
classroom
Duration of license
year

Additional cost information:

Describe the basic pricing plan and structure of the program. Also, provide information on what is included in the published program, as well as what is not included but required for implementation (e.g., computer and/or internet access).

Headsprout's basic pricing plan includes one classroom license that is valid for one year. A license grants access to the Headsprout website as well as permission to use its copyrighted resources as part of the classroom curriculum. Each educator using the resources must have a license in order to obtain the necessary permission, and each Headsprout license is valid for one family or classroom only (with up to 36 students). Licenses must be maintained for continued permission to use downloaded, copyrighted materials, and each license must be registered in the name of the classroom teacher using the resources.

Learning A-Z believes that ongoing Professional Development is critical to the success of any implementation. To that end, Professional Development is included with every license purchased. The level of Professional Development provided is based on the amount of each individual purchase, as outlined below:
  • $0 to $1,999: On-demand videos accessed via the Headsprout website
  • $2,000 to $4,999: 2 customized webinars; 1 standard e-learning module
  • $5,000 to $9,999: 6 customized webinars; 1 standard e-learning module
  • $10,000 to $24,999: 2 days of customized, onsite Professional Development; 12 customized webinars; 1 customized e-learning module
  • $25,000 to $49,999: 8 days of customized, onsite Professional Development; 20 customized webinars; 4 customized e-learning modules
  • $50,000+: 20 days of customized, onsite Professional Development; 30 customized webinars; 6 customized e-learning modules

Program Specifications

Setting for which the program is designed.

selected Individual students
not selected Small group of students
not selected BI ONLY: A classroom of students

If group-delivered, how many students compose a small group?

  

Program administration time

Minimum number of minutes per session
20
Minimum number of sessions per week
3
Minimum number of weeks
25
not selected N/A (implemented until effective)

If the intervention program is intended to occur less frequently than 60 minutes a week for approximately 8 weeks, justify the level of intensity:

Does the program include highly specified teacher manuals or step by step instructions for implementation?
Yes

BEHAVIORAL INTERVENTION: Is the program affiliated with a broad school- or class-wide management program?

If yes, please identify and describe the broader school- or class-wide management program:

Does the program require technology?
Yes

If yes, what technology is required to implement your program?
selected Computer or tablet
selected Internet connection
not selected Other technology (please specify)

If your program requires additional technology not listed above, please describe the required technology and the extent to which it is combined with teacher small-group instruction/intervention:
All Learning A-Z resources are browser- and server-based and are compatible with Chrome and Internet Explorer on Windows 7, 8, 8.1, or newer. Schools do not need to install programs locally. Learning A-Z system requirements are available on our website at http://help.learninga-z.com/customer/portal/articles/1649242-system-requirements. Additionally, users can perform a system check at https://www.learninga-z.com/help/browsercheck.htm.

Training

How many people are needed to implement the program?

Is training for the instructor or interventionist required?
No
If yes, is the necessary training free or at-cost?

Describe the time required for instructor or interventionist training:
1-4 hours of training

Describe the format and content of the instructor or interventionist training:
a. Free Public Webinars: Written, designed, and presented by educators for educators, these webinars teach skills, tips, and strategies for making Learning A-Z tools support your classroom and students' needs. Convenient webinar dates and times are available throughout each month.

b. Customized Webinars: These webinars are tailored to the specific needs of your teaching staff, with agenda topics and discussion chosen to ensure that everything your staff needs to know is covered. They are also scheduled at a time that works for your group.

c. On-Site Training and Support: With the support of a Learning A-Z Professional Development Specialist, teachers discover how to use the resources more effectively through personalized, ongoing, hands-on instruction. On-site training can take numerous formats and is designed to scaffold teachers in using Learning A-Z's products in a manner that is fully integrated into a district's existing curriculum and philosophies of learning. Teachers also receive tips and ideas for customizing resources to meet the needs of all learners.

What types of professionals are qualified to administer your program?

selected Special Education Teacher
selected General Education Teacher
selected Reading Specialist
selected Math Specialist
selected EL Specialist
selected Interventionist
selected Student Support Services Personnel (e.g., counselor, social worker, school psychologist, etc.)
not selected Applied Behavior Analysis (ABA) Therapist or Board Certified Behavior Analyst (BCBA)
selected Paraprofessional
not selected Other

If other, please describe:

Does the program assume that the instructor or interventionist has expertise in a given area?
No   

If yes, please describe: 


Are training manuals and materials available?
Yes

Describe how the training manuals or materials were field-tested with the target population of instructors or interventionist and students:
Headsprout: The Headsprout instructional resources were thoroughly tested using single-subject control analyses and between-groups experimental designs. Over 250 learners participated in user testing, and an instructional segment was deemed ready for use only once all learners responded correctly on at least 90% of opportunities. Over 10,000 changes were made to the program as a result of user-testing feedback before the program was made available for purchase.

Implementation: Learning A-Z collects performance data on all eLearning modules; when users demonstrate high error rates on particular segments, the segment is adjusted and retested, and the iterative development cycle continues. All trainings also include hands-on activities, and user success is measured at the time of the session. Additionally, we continuously evaluate program usage following training events to identify whether a training sequence needs to be adjusted.

Do you provide fidelity of implementation guidance such as a checklist for implementation in your manual?
Yes

Can practitioners obtain ongoing professional and technical support?
Yes

If yes, please specify where/how practitioners can obtain support:

Learning A-Z provides ongoing technical support for all licensed customers. In addition, free and fee-based professional development services are available based on the purchase levels outlined in the previous sections.

Summary of Evidence Base

Please identify, to the best of your knowledge, all the research studies that have been conducted to date supporting the efficacy of your program, including studies currently or previously submitted to NCII for review. Please provide citations only (in APA format); do not include any descriptive information on these studies. NCII staff will also conduct a search to confirm that the list you provide is accurate.

Headsprout Early Reading Research Base (2009): http://issuu.com/headsprout/docs/her_the_research_base

Huffstetter, M., King, J. R., Onwuegbuzie, A. J., Schneider, J. J., & Powell-Smith, K. A. (2010). Effects of a computer-based early reading program on the early reading and oral language skills of at-risk preschool children. Journal of Education for Students Placed at Risk, 15, 279–298.

Grindle, C. F., Hughes, C. J., Saville, M., Huxley, K., & Hastings, R. P. (2013). Teaching early reading skills to children with autism using MimioSprout Early Reading. Behavioral Interventions, 28, 203–224.

Twyman, J. S., Layng, T. V. J., & Layng, Z. (2011). The likelihood of instructionally beneficial, trivial, or negative results for kindergarten and first grade learners who complete at least half of Headsprout Early Reading. Behavioral Technology Today, 6, 1–19.

Layng, T. V. J., Sota, M., & Leon, M. (2011). Thinking through text comprehension I: Foundation and guiding relations. Behavior Analyst Today, 12, 3–11.

Sota, M., Leon, M., & Layng, T. V. J. (2011). Thinking through text comprehension II: Analysis of verbal and investigative repertoires. Behavior Analyst Today, 12, 12–22.

Leon, M., Layng, T. V. J., & Sota, M. (2011). Thinking through text comprehension III: The programing of verbal and investigative repertoires. Behavior Analyst Today, 12, 21–30.

Leon, M., Ford, V., Shimizu, H., Heimlich, A., Thompson, J., Sota, M., Twyman, J. S., & Layng, T. V. J. (2011). Comprehension by design: Teaching young learners how to comprehend what they read. Performance Improvement, 50, 40–46.

Layng, Z. R., & Layng, T. V. J. (2012). Building the case for large-scale behavioral education adoptions. Behavior Analyst Today, 13(1).

Layng, T. V. J., Twyman, J. S., & Stikeleather, G. (2004). Engineering discovery learning: The contingency adduction of some precursors of textual responding in a beginning reading program. The Analysis of Verbal Behavior, 20, 99–109.

Layng, J., Twyman, J. S., & Stikeleather, G. (2003). Headsprout Early Reading: Reliably teaching children to read. Behavioral Technology Today, 3, 7–20.

Twyman, J. S., Layng, T. V. J., Stikeleather, G., & Hobbins, K. A. (2004). A nonlinear approach to curriculum design: The role of behavior analysis in building an effective reading program. In W. L. Heward et al. (Eds.), Focus on Behavior Analysis in Education (Vol. 3, pp. 55–68). Upper Saddle River, NJ: Merrill/Prentice Hall.

Layng, T. V. J., Twyman, J. S., & Stikeleather, G. (2004). Selected for success: How Headsprout Reading Basics teaches children to read. In D. J. Moran & R. Malott (Eds.), Evidence-Based Education Methods (pp. 171–197). St. Louis, MO: Elsevier Science/Academic Press.

Layng, T. V. J., Stikeleather, G., & Twyman, J. S. (2006). Scientific formative evaluation: The role of individual learners in generating and predicting successful educational outcomes. In R. Subotnik & H. Walberg (Eds.), The Scientific Basis of Educational Productivity (pp. 29–44). Greenwich, CT: Information Age Publishing.

SECTION II: TECHNICAL INFORMATION: STUDY FORM A (GROUP DESIGNS)

Study Information

Study Citations

Tyler, E., Hughes, J., Beverley, M., & Hastings, R. (2015). Improving early reading skills for beginning readers using an online programme as supplementary instruction. European Journal of Psychology of Education, 212–226.

Participants Empty Bobble

Describe how students were selected to participate in the study:
Pupils in Year 2 (aged 6–7 years) from two mainstream primary schools in North Wales participated in the study. In the first part of the school year, 51 children were randomly allocated to either the Headsprout Early Reading (HER) group or a waiting list control group (C). Twenty-five were allocated to the HER group (Female = 12, Male = 13) and 26 to the control group (Female = 5, Male = 21). In the pre-test reading assessments, a number of participants demonstrated reading ages beyond a beginning reading level for which the HER programme is designed and were therefore excluded from the study. Thus, at the beginning of the intervention period, there were 24 children in the HER group (Female = 11, Male = 13) and 17 children in the control group (Female = 3, Male = 14). Eight participants were learning English as an additional language (HER = 5, C=3).

Describe how students were identified as being at risk for academic failure (AI) or as having emotional or behavioral difficulties (BI):
Based on pre-test reading assessments, the study excluded children already reading beyond the beginning-reading level of skills taught by HER.

ACADEMIC INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:
  • below the 30th percentile on local or national norm, or
  • identified disability related to the focus of the intervention?
%

BEHAVIORAL INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:
  • emotional disability label,
  • placed in an alternative school/classroom,
  • non-responsive to Tiers 1 and 2, or
  • designation of severe problem behaviors on a validated scale or through observation?
%

Specify which condition is the submitted intervention:
Headsprout Early Reading

Specify which condition is the control condition:
Single treatment condition

If you have a third, competing condition, in addition to your control and intervention condition, identify what the competing condition is (data from this competing condition will not be used):

Using the tables that follow, provide data demonstrating comparability of the program group and control group in terms of demographics.

Grade Level

Demographic    Program Number    Control Number    Effect Size: Cox Index for Binary Differences
Age less than 1
Age 1
Age 2 24 17 0.00
Age 3
Age 4
Age 5
Kindergarten
Grade 1
Grade 2
Grade 3
Grade 4
Grade 5
Grade 6
Grade 7
Grade 8
Grade 9
Grade 10
Grade 11
Grade 12

Race–Ethnicity

Demographic    Program Number    Control Number    Effect Size: Cox Index for Binary Differences
African American
American Indian
Asian/Pacific Islander
Hispanic
White
Other

Socioeconomic Status

Demographic    Program Number    Control Number    Effect Size: Cox Index for Binary Differences
Subsidized Lunch
No Subsidized Lunch

Disability Status

Demographic    Program Number    Control Number    Effect Size: Cox Index for Binary Differences
Speech-Language Impairments
Learning Disabilities
Behavior Disorders
Emotional Disturbance
Intellectual Disabilities
Other
Not Identified With a Disability

ELL Status

Demographic    Program Number    Control Number    Effect Size: Cox Index for Binary Differences
English Language Learner 5 3 0.12
Not English Language Learner 19 17 3.38

Gender

Demographic    Program Number    Control Number    Effect Size: Cox Index for Binary Differences
Female 12 15 1.21
Male 19 16 0.86

Mean Effect Size

1.11
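The Cox index values reported in the demographic tables above follow the usual convention for binary differences: the difference between the two groups' log odds, divided by 1.65. As a minimal sketch (the function name is our own; the counts are taken from the ELL row above):

```python
import math


def cox_index(k_program: int, n_program: int, k_control: int, n_control: int) -> float:
    """Cox index for a binary difference: the program group's log odds
    minus the control group's log odds, divided by 1.65."""
    def log_odds(k: int, n: int) -> float:
        p = k / n
        return math.log(p / (1 - p))
    return (log_odds(k_program, n_program) - log_odds(k_control, n_control)) / 1.65


# ELL row above: 5 of 24 program students vs. 3 of 17 control students
print(round(cox_index(5, 24, 3, 17), 2))  # -> 0.12
```

Note that the formula is undefined when a group's proportion is exactly 0 or 1 (the log odds diverge), so rows in which every student, or no student, falls into a category cannot be computed directly this way.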

For any substantively (e.g., effect size ≥ 0.25 for pretest or demographic differences) or statistically significant (e.g., p < 0.05) pretest differences between groups in the descriptions below, please describe the extent to which these differences are related to the impact of the treatment. For example, if analyses were conducted to determine that outcomes from this study are due to the intervention and not demographic characteristics, please describe the results of those analyses here.

Design Empty Bobble

What method was used to determine students' placement in treatment/control groups?
Random
Please describe the assignment method or the process for defining treatment/comparison groups.
Pre programme: We assessed all participants on all measures before beginning the programme. Additionally, prior to episode one, Mousing Around was completed. This is a short introductory online episode that familiarises the child with the instructional language of the programme and provides practice of appropriate responding prior to introducing the reading episodes. In each school, one training session was conducted so that a teaching assistant and the undergraduate students could implement the programme. Researchers were present for the initial session, after which we monitored online episode data to ensure fidelity of implementation. An implementation checklist to guide the running of the sessions was adapted from Huffstetter et al. (2010) for use during the training session and thereafter in all sessions conducted in both schools. This included items such as: ‘Have you checked every child is responding audibly to the speak-out-loud activi-ties?’, ‘Have you responded to any requests for help by redirecting the child back to the programme?’, ‘Have you read the HER stories and scored performance on the appropriate sheets?’ and ‘Have you checked each child has achieved 90 % accuracy immediately after episode completion?’. Online episodes: Episodes were conducted according to implementation guidelines provided Online episodes Episodes were conducted according to implementation guidelines provided by Headsprout. Daily sessions of approximately 45 min were conducted, in which the childrenin the HER group participated in the programme. During the HER sessions, the control group participated in their usual ‘free choice’ session, which included a variety of semi-structured activities for the children to choose from. Some of these activities involved literacy skills, some mathematics skills and some general problem solving activities. HER participants did not therefore miss any specific classroom instruction by enrolling in the programme. 
Participants engaged in episodes at a computer set-up ready to access their individual profile. Two student researchers or school staff members were present in each class session. However, they did not interact with the child other than to offer encouragement to stay on task. This was to ensure there was no interference with the sophisticated correction procedure built into the programme, and that the responses made provided accurate feedback of the child’s current ability and progress. When each child finished an episode, online data were examined to ensure they had attained the required accuracy level 90 % in each episode. Following episode completion, each child chose a sticker to place on their progress map that indicated which episode they had completed. Stories: In accordance with implementation guidelines, children were also required to read stories provided by the programme after specified episodes. If the child struggled, the instructors reminded them to sound out the word and implemented a Model-Lead-Test error correction procedure. This procedure is consistent with the Direct Instruction approach developed by Engelmann and Carnine (1982), which is the instructional approach on which HER is based. If a child misread or omitted a word, they were first asked to try again. If the word was then read correctly, they were praised and then asked to read the whole sentence. If they misread the word again, the staff member/researcher would model sounding out the word and saying it fast (Model), then do this with the student (Lead) and then ask the student to do this on their own (Test). They would then be asked to read the whole sentence. Additional support In addition to the online episodes, frequency-building exercises accompany the HER programme. There are two tiers of this additional support—Targeted Practice and Intensive Practice—both comprising frequency-building activities to help develop fluency in the elements and strategies introduced in the episodes. 
In accordance with implementation protocol, all children began the programme with only the online episodes and Sprout stories.

Benchmark reading assessments: Twelve of the 80 stories are considered Benchmark Reading Assessments, to be conducted after specified episodes. For the Benchmark readers, data on reading accuracy were taken (i.e., number of words read correctly), along with a rating of reading proficiency of either Independent (read with few errors), Satisfactory (read with some errors and slight hesitation) or Needs Practice (read with frequent errors). Those involved in implementing the programme were instructed to record these data either electronically through the HER site or on printed sheets available to download. These data were then used, alongside the programme data, to guide decisions on whether additional frequency-building activities were required. At the end of the school year (after 8 months of HER intervention), we repeated assessments with all children, regardless of whether they had finished the programme earlier and regardless of whether they had completed all episodes of the programme.

What was the unit of assignment?
Students
If other, please specify:

Please describe the unit of assignment:

What unit(s) were used for primary data analysis?
not selected Schools
not selected Teachers
selected Students
not selected Classes
not selected Other
If other, please specify:

Please describe the unit(s) used for primary data analysis:

Fidelity of Implementation Empty Bobble

How was the program delivered?
selected Individually
not selected Small Group
not selected Classroom

If small group, answer the following:

Average group size
Minimum group size
Maximum group size

What was the duration of the intervention (If duration differed across participants, settings, or behaviors, describe for each.)?

Weeks
16.00
Sessions per week
5.00
Duration of sessions in minutes
45.00
What were the background, experience, training, and ongoing support of the instructors or interventionists?
See Table 3 on Page 285

Describe when and how fidelity of treatment information was obtained.
In each school, one training session was conducted so that a teaching assistant and the undergraduate students could implement the programme. Researchers were present for the initial session, after which we monitored online episode data to ensure fidelity of implementation. An implementation checklist to guide the running of the sessions was adapted from Huffstetter et al. (2010) for use during the training session and thereafter in all sessions conducted in both schools. This included items such as: ‘Have you checked every child is responding audibly to the speak-out-loud activities?’, ‘Have you responded to any requests for help by redirecting the child back to the programme?’, ‘Have you read the HER stories and scored performance on the appropriate sheets?’ and ‘Have you checked each child has achieved 90% accuracy immediately after episode completion?’.

What were the results on the fidelity-of-treatment implementation measure?
Seven children in the HER group were excluded from the analysis because they did not complete the full 80-episode programme within the school year. Progress through the episodes and reasons for these children not completing the programme varied. Four children reached the second half of the programme (ranging from episodes 41–71). Three of these children did not complete due to many school absences (either long periods of absence or absence during the HER sessions). Two children completed a significant proportion of the programme (39 and 47 episodes) but required additional input in later episodes which slowed progress. This additional input was in the form of the Targeted Practice tier of support and was to be delivered during the usual HER sessions for those children. As such, they did not have as many opportunities to complete the online episodes.

Was the fidelity measure also used in control classrooms?
Yes

Measures and Results

Measures Targeted : Empty Bobble
Measures Broader : Dash

Study measures are classified as targeted, broader, or administrative data according to the following definitions:

  • Targeted measures
    Assess outcomes, such as competencies or skills that the program was directly targeted to improve.
    • In the academic domain, targeted measures typically are not the very items taught but rather novel items structured similarly to the content addressed in the program. For example, if a program taught word-attack skills, a targeted measure would be decoding of pseudo words. If a program taught comprehension of cause-effect passages, a targeted measure would be answering questions about cause-effect passages structured similarly to those used during intervention, but not including the very passages used for intervention.
    • In the behavioral domain, targeted measures evaluate aspects of external or internal behavior the program was directly targeted to improve and are operationally defined.
  • Broader measures
    Assess outcomes that are related to the competencies or skills targeted by the program but not directly taught in the program.
    • In the academic domain, if a program taught word-level reading skill, a broader measure would be answering questions about passages the student reads. If a program taught calculation skill, a broader measure would be solving word problems that require the same kinds of calculation skill taught in the program.
    • In the behavioral domain, if a program taught a specific skill like on-task behavior in one classroom, a broader measure would be academic performance in that setting or on-task behavior in another setting.
  • Administrative data measures apply only to behavioral intervention tools and are measures such as office discipline referrals (ODRs) and graduation rates which do not have psychometric properties as do other, more traditional targeted or broader measures.



What populations are you submitting outcome data for?
selected Full sample
not selected Students at or below the 20th percentile
not selected English language learners
not selected Racial/ethnic subgroups
not selected Economically disadvantaged students (low socioeconomic status)
Targeted Measure Reverse Coded? Reliability Relevance Exposure
Broader Measure Reverse Coded? Reliability Relevance Exposure
Administrative Data Measure Reverse Coded? Relevance

Posttest Data

Targeted Measures (Full Sample)

Measure Sample Type Effect Size P

Broader Measures (Full Sample)

Measure Sample Type Effect Size P

Administrative Measures (Full Sample)

Measure Sample Type Effect Size P

Targeted Measures (Subgroups)

Measure Sample Type Effect Size P

Broader Measures (Subgroups)

Measure Sample Type Effect Size P

Administrative Measures (Subgroups)

Measure Sample Type Effect Size P
For any substantively (e.g., effect size ≥ 0.25 for pretest or demographic differences) or statistically significant (e.g., p < 0.05) pretest differences, please describe the extent to which these differences are related to the impact of the treatment. For example, if analyses were conducted to determine that outcomes from this study are due to the intervention and not pretest characteristics, please describe the results of those analyses here.
Please explain any missing data or instances of measures with incomplete pre- or post-test data.
If you have excluded a variable or data that are reported in the study being submitted, explain the rationale for exclusion:
Describe the analyses used to determine whether the intervention produced changes in student outcomes:

Additional Research

Is the program reviewed by WWC or E-ESSA?
WWC & E-ESSA
Summary of WWC / E-ESSA Findings :

What Works Clearinghouse Review

Early Childhood Education Protocol

Effectiveness: Headsprout® Early Reading was found to have potentially positive effects on oral language and print knowledge.

Studies Reviewed: 1 study meets standards out of 2 studies total

Full Report

Evidence for ESSA

No studies met inclusion requirements.

How many additional research studies are potentially eligible for NCII review?
3
Citations for Additional Research Studies :

Grindle, C. F., Hughes, C. J., Saville, M., Huxley, K., & Hastings, R. P. (2013). Teaching early reading skills to children with autism using MimioSprout Early Reading. Behavioral Interventions, 28, 203-224.

Layng, T. V. J., Twyman, J. S., & Stikeleather, G. (2004). Engineering discovery learning: The contingency adduction of some precursors of textual responding in a beginning reading program. Analysis of Verbal Behavior, 20, 99–109.

Twyman, J. S., Layng, T. V. J., & Layng, Z. (2011). The likelihood of instructionally beneficial, trivial, or negative results for kindergarten and first grade learners who complete at least half of Headsprout Early Reading. Behavioral Technology Today, 6, 1-19.

Disclaimer

Most tools and programs evaluated by the NCII are branded products which have been submitted by the companies, organizations, or individuals that disseminate these products. These entities supply the textual information shown above, but not the ratings accompanying the text. NCII administrators and members of our Technical Review Committees have reviewed the content on this page, but NCII cannot guarantee that this information is free from error or reflective of recent changes to the product. Tools and programs have the opportunity to be updated annually or upon request.