Tutoring Buddy
Study: DuBois et al. (2016)
Summary
This evidence-based intervention aims to improve early literacy skills.
- Target Grades:
- Age 3-5, K, 1
- Target Populations:
-
- Any student at risk for academic failure
- Area(s) of Focus:
-
- Print knowledge/awareness
- Alphabet knowledge
- Phonological awareness
- Where to Obtain:
- Twin Lights Education, LLC
- 6 Oakland Ave., Rockport, MA 01966
- 617-602-5626
- www.twinlightsed.com
- Initial Cost:
- $39.00 per student
- Replacement Cost:
- Contact vendor for pricing details.
-
Use of the system, including the iOS app, web app, and online training materials and support, is $399 per year per grade level in the school. An entire school can use the program for $499 per year; the number of students and number of sessions is unlimited.
- Staff Qualified to Administer Include:
-
- Special Education Teacher
- General Education Teacher
- Reading Specialist
- Math Specialist
- EL Specialist
- Interventionist
- Student Support Services Personnel (e.g., counselor, social worker, school psychologist, etc.)
- Paraprofessional
- Other: Tutoring Buddy was designed to be completed by laypeople.
- Training Requirements:
- 1-4 hours of training
-
Training can be accomplished via our web-based or in-app training materials. It involves reading the procedures and practicing the pronunciation of each letter sound. Correct pronunciation is modeled by the software.
The training materials now available are polished versions of our original training materials, which were used with both graduate research assistants and parents.
- Access to Technical Support:
- Practitioners receive an email address and calls are answered within 24 hours by one of the two developers.
- Recommended Administration Formats Include:
-
- Individual students
- Minimum Number of Minutes Per Session:
- 5
- Minimum Number of Sessions Per Week:
- 3
- Minimum Number of Weeks:
- 6
- Detailed Implementation Manual or Instructions Available:
- Yes
- Is Technology Required?
- Yes
Program Information
Descriptive Information
Please provide a description of program, including intended use:
This evidence-based intervention aims to improve early literacy skills.
The program is intended for use in the following age(s) and/or grade(s).
Age 3-5
Kindergarten
First grade
Second grade
Third grade
Fourth grade
Fifth grade
Sixth grade
Seventh grade
Eighth grade
Ninth grade
Tenth grade
Eleventh grade
Twelfth grade
The program is intended for use with the following groups.
Students with learning disabilities
Students with intellectual disabilities
Students with emotional or behavioral disabilities
English language learners
Any student at risk for academic failure
Any student at risk for emotional and/or behavioral difficulties
Other
If other, please describe:
ACADEMIC INTERVENTION: Please indicate the academic area of focus.
Early Literacy
Alphabet knowledge
Phonological awareness
Early writing
Early decoding abilities
Other
If other, please describe:
Language
Grammar
Syntax
Listening comprehension
Other
If other, please describe:
Reading
Phonics/word study
Comprehension
Fluency
Vocabulary
Spelling
Other
If other, please describe:
Mathematics
Concepts and/or word problems
Whole number arithmetic
Comprehensive: Includes computation/procedures, problem solving, and mathematical concepts
Algebra
Fractions, decimals (rational number)
Geometry and measurement
Other
If other, please describe:
Writing
Spelling
Sentence construction
Planning and revising
Other
If other, please describe:
BEHAVIORAL INTERVENTION: Please indicate the behavior area of focus.
Externalizing Behavior
Verbal Threats
Property Destruction
Noncompliance
High Levels of Disengagement
Disruptive Behavior
Social Behavior (e.g., Peer interactions, Adult interactions)
Other
If other, please describe:
Internalizing Behavior
Anxiety
Social Difficulties (e.g., withdrawal)
School Phobia
Other
If other, please describe:
Acquisition and cost information
Where to obtain:
- Address
- 6 Oakland Ave., Rockport, MA 01966
- Phone Number
- 617-602-5626
- Website
- www.twinlightsed.com
Initial cost for implementing program:
- Cost
- $39.00
- Unit of cost
- student
Replacement cost per unit for subsequent use:
- Cost
- Unit of cost
- Duration of license
Additional cost information:
Describe basic pricing plan and structure of the program. Also, provide information on what is included in the published program, as well as what is not included but required for implementation (e.g., computer and/or internet access)
Use of the system, including the iOS app, web app, and online training materials and support, is $399 per year per grade level in the school. An entire school can use the program for $499 per year; the number of students and number of sessions is unlimited.
Program Specifications
Setting for which the program is designed.
Small group of students
BI ONLY: A classroom of students
If group-delivered, how many students compose a small group?
Program administration time
- Minimum number of minutes per session
- 5
- Minimum number of sessions per week
- 3
- Minimum number of weeks
- 6
- If the intervention program is intended to occur less frequently than 60 minutes a week for approximately 8 weeks, justify the level of intensity:
Does the program include highly specified teacher manuals or step-by-step instructions for implementation?- Yes
BEHAVIORAL INTERVENTION: Is the program affiliated with a broad school- or class-wide management program?-
If yes, please identify and describe the broader school- or class-wide management program: -
Does the program require technology? - Yes
-
If yes, what technology is required to implement your program? -
Computer or tablet
Internet connection
Other technology (please specify)
If your program requires additional technology not listed above, please describe the required technology and the extent to which it is combined with teacher small-group instruction/intervention:
Tutoring Buddy runs natively on iOS (iPad, iPhone, iPod Touch), but it also runs as a mobile-enabled web app on any computer or tablet.
Training
- How many people are needed to implement the program?
Is training for the instructor or interventionist required?- Yes
- If yes, is the necessary training free or at-cost?
Describe the time required for instructor or interventionist training:- 1-4 hours of training
Describe the format and content of the instructor or interventionist training:- Training can be accomplished via our web-based or in-app training materials. It involves reading the procedures and practicing the pronunciation of each letter sound. Correct pronunciation is modeled by the software.
What types of professionals are qualified to administer your program?
General Education Teacher
Reading Specialist
Math Specialist
EL Specialist
Interventionist
Student Support Services Personnel (e.g., counselor, social worker, school psychologist, etc.)
Applied Behavior Analysis (ABA) Therapist or Board Certified Behavior Analyst (BCBA)
Paraprofessional
Other
If other, please describe:
Tutoring Buddy was designed to be completed by laypeople.
Does the program assume that the instructor or interventionist has expertise in a given area?
-
No
If yes, please describe:
Are training manuals and materials available?- Yes
-
Describe how the training manuals or materials were field-tested with the target population of instructors or interventionists and students: - The training materials now available are polished versions of our original training materials, which were used with both graduate research assistants and parents.
Do you provide fidelity of implementation guidance such as a checklist for implementation in your manual?- Yes
-
Can practitioners obtain ongoing professional and technical support? -
Yes
If yes, please specify where/how practitioners can obtain support:
Practitioners receive an email address and calls are answered within 24 hours by one of the two developers.
Summary of Evidence Base
- Please identify, to the best of your knowledge, all the research studies that have been conducted to date supporting the efficacy of your program, including studies currently or previously submitted to NCII for review. Please provide citations only (in APA format); do not include any descriptive information on these studies. NCII staff will also conduct a search to confirm that the list you provide is accurate.
-
DuBois, M. R., Volpe, R. J., & Hemphill, E. M. (2014). A randomized trial of a computer-assisted tutoring program targeting letter sound expression via incremental rehearsal. School Psychology Review, 43, 210-221.
Study Information
Study Citations
DuBois, M., Volpe, R. J., Burns, M. K., & Hoffman, J. A. Parent-administered computer-assisted tutoring targeting letter-sound knowledge: Evaluation via multiple-baseline across three preschool students (Doctoral dissertation). Northeastern University, Boston, MA.
Participants
- Describe how students were selected to participate in the study:
- An advertisement describing the study was posted in the program’s weekly newsletter, and interested caregivers were instructed to contact the principal investigator. There were no participation restrictions regarding gender, ethnicity, race, socio-economic status, health, or disability status. Only children who were able to see letters in 68-point font, who were able to orally articulate letter sounds, and who knew six or fewer letter sounds were eligible to participate.
-
Describe how students were identified as being at risk for academic failure (AI) or as having emotional/behavioral difficulties (BI): - Two of the three students were receiving early intervention due to some kind of disability.
-
ACADEMIC INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:- below the 30th percentile on local or national norm, or
- identified disability related to the focus of the intervention?
- %
-
BEHAVIORAL INTERVENTION: What percentage of participants were at risk, as measured by one or more of the following criteria:- emotional disability label,
- placed in an alternative school/classroom,
- non-responsive to Tiers 1 and 2, or
- designation of severe problem behaviors on a validated scale or through observation?
- %
Provide a description of the demographic and other relevant characteristics of the case used in your study (e.g., student(s), classroom(s)).
Case (Name or number) | Age/Grade | Gender | Race / Ethnicity | Socioeconomic Status | Disability Status | ELL status | Other Relevant Descriptive Characteristics |
---|---|---|---|---|---|---|---|
test | test | test | test | test | test | test | test |
Design
- Please describe the study design:
- The present study utilized a non-concurrent multiple-baseline-across-participants design to assess the efficacy of the Tutoring Buddy intervention.
Clarify and provide a detailed description of the treatment in the submitted program/intervention:- Caregivers administered Tutoring Buddy to their child at home 3 times a week for 6 weeks. Using Tutoring Buddy, two known and four unknown letter sounds, as determined by performance during the pre-intervention assessment (i.e., unknown letter sounds were those that students articulated incorrectly, and known letter sounds were those that students articulated correctly), were selected for intervention, regardless of whether they had been rehearsed during prior sessions. Continuous sounds (e.g., e and v) were targeted for intervention first, and letters with similar characteristics (e.g., b and d) were rehearsed during different sessions whenever possible. During each intervention session, students were presented with the first targeted unknown sound and their parent modeled the sound corresponding to the letter (e.g., This letter makes the sound /m/, like the first sound in ‘mmmmmop…mop.’ What sound?). Once the student made the correct sound, the parent advanced to the incremental rehearsal screen, wherein a string of 14 letters was presented in an incremental rehearsal sequence (i.e., first unknown, first known, first unknown, first known, second known, first unknown, first known, second known, third known, first unknown, first known, second known, third known, fourth known). Parents provided immediate error corrections for any errors made by the student. Once the first sequence was completed, a second unknown letter sound was introduced just as the first. However, the first unknown became the first known and the fourth known was removed.
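The interleaving pattern described above is regular enough to express compactly. The following is a minimal Python sketch of how the 14-item incremental rehearsal string could be generated from one unknown letter and four known letters; the function name and the example letters are illustrative assumptions, not part of the published program.

```python
def incremental_rehearsal_sequence(unknown, knowns):
    """Build the 14-item rehearsal string: the unknown letter is
    interleaved with a progressively longer run of known letters
    (U, K1, U, K1, K2, U, K1, K2, K3, U, K1, K2, K3, K4)."""
    sequence = []
    for k in range(1, len(knowns) + 1):
        sequence.append(unknown)        # present the unknown sound again
        sequence.extend(knowns[:k])     # followed by the first k known sounds
    return sequence

# Hypothetical example: rehearsing the unknown sound /m/ against four known sounds.
print(incremental_rehearsal_sequence("m", ["a", "s", "t", "o"]))
# ['m', 'a', 'm', 'a', 's', 'm', 'a', 's', 't', 'm', 'a', 's', 't', 'o']
```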
Clarify what procedures occurred during the control/baseline condition (third, competing conditions are not considered; if you have a third, competing condition [e.g., multi-element single subject design with a third comparison condition], in addition to your control condition, identify what the competing condition is [data from this competing condition will not be used]):- During baseline, all three children were assessed by the 1st author of the paper and a research assistant on the dependent measures (LSK, LSF, and NWF) in their homes. These assessments were administered three times consecutively in each session and the median score for each represented their score for that session. The Tutoring Buddy intervention required that children know at least one letter sound prior to treatment initiation. However, Jane did not know any letter sounds following her first baseline assessment. Accordingly, Jane’s caregiver was instructed to teach Jane the grapheme-phoneme correspondence of the first letter of her name (i.e., /j/). This training consisted of Jane’s mother making a flashcard with the letter on it, Jane’s mother demonstrating the sound (e.g., “This letter makes the sound /j/, like Jane”), and Jane’s mother asking her to repeat the sound (e.g., “What sound does this make?”). Jane’s mother continued this procedure nightly until she could articulate the letter’s sound within three seconds of presentation (in the absence of training). The training resulted in Jane learning one letter sound during baseline. Other than this activity, caregivers were instructed not to change their literacy activities during baseline.
Please describe how replication of treatment effect was demonstrated (e.g., reversal or withdrawal of intervention, across participants, across settings)- Intervention effects observed for the first student (Jane) were replicated for both Andy and Mary in a multiple baseline design across participants.
-
Please indicate whether (and how) the design contains at least three demonstrations of experimental control (e.g., ABAB design, multiple baseline across three or more participants). - The study demonstrates replication via a multiple baseline design across three participants.
If the study is a multiple baseline, is it concurrent or non-concurrent?- Non-concurrent
Fidelity of Implementation
- How was the program delivered?
-
Individually
Small Group
Classroom
If small group, answer the following:
- Average group size
- Minimum group size
- Maximum group size
What was the duration of the intervention (If duration differed across participants, settings, or behaviors, describe for each.)?
- Weeks
- Sessions per week
- Duration of sessions in minutes
- Weeks
- Sessions per week
- Duration of sessions in minutes
- Weeks
- Sessions per week
- Duration of sessions in minutes
- What were the background, experience, training, and ongoing support of the instructors or interventionists?
- Prior to implementing the intervention, caregivers were required to meet with the researcher for a one-hour training session in their home. This session occurred immediately prior to the first intervention session. To begin, the researcher trained caregivers how to pronounce the 24 targeted letter sounds. For each letter sound, the researcher articulated the sound, provided an example of a word that began with that sound, and asked the caregiver to repeat the sound. Caregivers were then presented with each letter in random order and were asked to orally articulate its sound. Caregivers were required to achieve 100% accuracy on this assessment before administering the intervention. Next, the researcher downloaded the Tutoring Buddy application onto the caregiver’s iPad. Using role-play, the researcher then demonstrated how to implement the intervention, with the caregiver acting as the child. The caregiver then practiced implementing the intervention with the researcher serving as the child. Role-play continued until the caregiver demonstrated that she was able to implement each component of the intervention accurately (the researcher used the 22-item procedural checklist to measure accuracy).
Describe when and how fidelity of treatment information was obtained.- Measures of treatment fidelity were obtained via direct observation and through the tutoring program. Specifically, the researcher observed caregivers engaging in the intervention once or twice a week and completed the 22-item procedural checklist. These data were used to calculate session integrity (or the percentage of steps that were administered correctly during each intervention session). Treatment integrity data were collected for 44% of intervention sessions for Jane and Andy and 53% of intervention sessions for Mary, which exceeds the 20% of intervention sessions criterion recommended for monitoring treatment integrity (Perepletchikova & Kazdin, 2005). In addition to treatment integrity, the Tutoring Buddy program was used to measure weekly dosage (or the number of intervention sessions delivered each week) and total dosage (or the total number of intervention sessions that were delivered during the intervention), as the program recorded the date of each intervention session.
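As a concrete illustration of the integrity bookkeeping described above, the sketch below shows how session integrity and observation coverage could be computed from the 22-item checklist. The numbers in the example are assumptions chosen only to mirror the reported percentages; they are not values taken from the study.

```python
def session_integrity(checklist):
    """Percent of procedural steps administered correctly in one observed
    session; `checklist` is a list of 22 True/False marks."""
    return 100.0 * sum(checklist) / len(checklist)

def observation_coverage(sessions_observed, total_sessions):
    """Percent of intervention sessions for which integrity data were collected,
    to be compared against the 20% criterion (Perepletchikova & Kazdin, 2005)."""
    return 100.0 * sessions_observed / total_sessions

# Hypothetical example: 8 of 18 sessions observed (~44%), and one session
# with 20 of 22 checklist steps completed correctly (~91%).
print(round(observation_coverage(8, 18)))                    # 44
print(round(session_integrity([True] * 20 + [False] * 2)))   # 91
```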
What were the results on the fidelity-of-treatment implementation measure?- Treatment integrity scores across observed intervention sessions averaged 88% for Jane (range: 68%-100%), 97% for Andy (range: 86%-100%), and 91% for Mary (range: 82%-100%). One observed intervention session for Jane fell below 80%. In regard to dosage, Jane’s mother and Andy’s mother implemented the intervention three times each week, for a total dosage of 18 intervention sessions; they therefore adhered perfectly to the intervention schedule outlined prior to treatment initiation. Mary’s mother implemented the intervention 15 times during the 6 weeks, completing 15 of the desired 18 intervention sessions.
Was the fidelity measure also used in baseline or comparison conditions?- N/A
Measures and Results
Measures:
Study measures are classified as targeted, broader, or administrative data according to the following definitions:
-
Targeted measures
Assess outcomes, such as competencies or skills, that the program was directly targeted to improve.
- In the academic domain, targeted measures typically are not the very items taught but rather novel items structured similarly to the content addressed in the program. For example, if a program taught word-attack skills, a targeted measure would be decoding of pseudo words. If a program taught comprehension of cause-effect passages, a targeted measure would be answering questions about cause-effect passages structured similarly to those used during intervention, but not including the very passages used for intervention.
- In the behavioral domain, targeted measures evaluate aspects of external or internal behavior the program was directly targeted to improve and are operationally defined.
-
Broader measures
Assess outcomes that are related to the competencies or skills targeted by the program but not directly taught in the program.
- In the academic domain, if a program taught word-level reading skill, a broader measure would be answering questions about passages the student reads. If a program taught calculation skill, a broader measure would be solving word problems that require the same kinds of calculation skill taught in the program.
- In the behavioral domain, if a program taught a specific skill like on-task behavior in one classroom, a broader measure would be on-task behavior in another setting.
- Administrative data measures apply only to behavioral intervention tools and are measures such as office discipline referrals (ODRs) and graduation rates, which do not have psychometric properties as do other, more traditional targeted or broader measures.
Targeted Measure | Reverse Coded? | Evidence | Relevance |
---|---|---|---|
Targeted Measure 1 | Yes | A1 | A2 |
Broader Measure | Reverse Coded? | Evidence | Relevance |
---|---|---|---|
Broader Measure 1 | Yes | A1 | A2 |
Administrative Data Measure | Reverse Coded? | Relevance |
---|---|---|
Admin Measure 1 | Yes | A2 |
- If you have excluded a variable or data that are reported in the study being submitted, explain the rationale for exclusion:
- Letter sound fluency is the dependent variable for the study. Baseline data were not collected for LSE, but scores from this measure are charted for the intervention phase. We administered the Wechsler Preschool and Primary Scale of Intelligence – Fourth Edition (WPPSI-IV) for descriptive purposes. On the WPPSI-IV, Jane’s Verbal Comprehension Index (VCI) score fell at the 55th percentile (Standard Score = 102). Similarly, Andy’s VCI score fell at the 82nd percentile (Standard Score = 114) and Mary’s VCI score fell at the 75th percentile (Standard Score = 110). Likewise, we also conducted observations with the Child/Home Early Language and Literacy Observation (CHELLO; Neuman, Dwyer, & Koh, 2007). The physical environment of Jane’s home was rated as ‘Excellent’ and Jane’s mother’s interactions with her were rated as ‘Above Average.’ The physical environment of Andy’s home was rated as ‘Fair’ and Andy’s mother’s interactions with him were rated as ‘Basic’ (these nominal descriptors are used instead of the descriptor ‘Average’). The physical environment of Mary’s home was rated as ‘Excellent’ and Mary’s mother’s interactions with her were rated as ‘Above Average.’ Therefore, the literacy environments of all three children were average to above average.
Results
- Describe the method of analyses you used to determine whether the intervention condition improved relative to baseline phase (e.g., visual inspection, computation of change score, mean difference):
- We used visual inspection and calculated PAND and PND.
Please present results in terms of within and between phase patterns. Data on the following data characteristics must be included: level, trend, variability, immediacy of the effect, overlap, and consistency of data patterns across similar conditions. Submitting only means and standard deviations for phases is not sufficient. Data must be included for each outcome measure (targeted, broader, and administrative if applicable) that was described above.
- Baseline data were low and stable for all three students. However, because the IR procedure requires knowledge of at least one known letter sound, Jane was taught one letter sound during the baseline phase. Baseline intercepts for the three children ranged from 1 to 6 for LSK and 0 to 4 for LSF and NWF. The average daily gains for LSK during baseline, as indicated by slope values, were .13, .00, and .00, respectively, for Jane, Andy, and Mary. For LSF, baseline slopes were .00 for all children. For NWF, baseline slopes were .00, -.07, and .00 for Jane, Andy, and Mary, respectively.
As demonstrated in Figure 1, positive and steady growth in LSK was observed for all three children immediately following introduction of the Tutoring Buddy intervention. Notable changes in level and trend were demonstrated for each child during intervention. The slopes for Jane, Andy, and Mary improved to .39, .41, and .23, respectively. In regard to average weekly growth, Jane, Andy, and Mary learned 2.67 (range: 1-5), 3.00 (range: 0-5), and 1.83 (range: 1-3) letter sounds each week, respectively. Importantly, only letter sounds that were targeted during intervention, or were known at baseline, were articulated correctly during the final LSK measurement for all three children. This finding provides support that the observed gains in LSK were attributable to the intervention.
For Andy and Mary, consistent positive growth was observed for LSF and NWF immediately following implementation of the Tutoring Buddy intervention. Notable changes in level and trend were also demonstrated. For LSF, the slopes for Andy and Mary improved to .30 and .27, respectively. For NWF, the slopes for Andy and Mary improved to .33 and .21, respectively. During the NWF task, neither Andy nor Mary engaged in recoding or blending (this was expected given that there was no instruction in blending during the intervention). In regard to average weekly growth, Andy’s LSF and NWF scores increased by 2.67 (range: 1-4) and 2.00 (range: 0-5) units each week, respectively. On average, Mary’s LSF and NWF scores increased by 1.67 (range: 0-3) and 1.50 (range: 0-2) units each week, respectively.
Jane’s trends in LSF and NWF were more variable, however. Jane made no gains in LSF and NWF following the first week of treatment. Although steady gains in these dependent measures were observed in weeks 2 and 3 of treatment, declines in performance were observed during week 4 (i.e., NWF returned to baseline, and LSF was 5 correct letter sounds per minute lower in week 4 than in week 3). Following this decline in performance, Jane made positive and steady gains in LSF and NWF during weeks 5 and 6. Despite this variability, Jane’s slopes for LSF and NWF improved during intervention to .14 and .07, respectively. During the NWF task, Jane did not engage in recoding or blending.
A summary of means for LSK, LSF, and NWF during the baseline and intervention phases is presented in Table 1.
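The slope and weekly-growth figures above follow from routine calculations on the session-level data. A minimal sketch is shown below; it assumes the reported slopes are ordinary least-squares fits of each measure on session day and that the weekly figures are mean week-to-week changes, which is a common convention but is not stated explicitly in this chart. The data values in the example are placeholders, not study data.

```python
import numpy as np

def phase_slope(days, scores):
    """OLS slope of a measure on session day within one phase,
    i.e., an estimate of the average daily gain."""
    return float(np.polyfit(days, scores, 1)[0])

def weekly_growth(weekly_scores):
    """Mean and range of week-to-week change during intervention."""
    diffs = np.diff(weekly_scores)
    return float(diffs.mean()), (int(diffs.min()), int(diffs.max()))

# Placeholder data: six weekly LSK scores for one child.
print(weekly_growth([2, 4, 7, 9, 12, 15]))  # (2.6, (2, 3))
```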
In addition to visual analyses and slope parameters, effect sizes were derived from Percentage of All Non-Overlapping Data (PAND; Parker et al., 2007). PAND statistics were then translated into a Pearson phi coefficient to determine effect sizes. For Andy and Mary, PAND was 100% for LSK, LSF, and NWF. The corresponding effect sizes for Andy and Mary were 1.00 for LSK, LSF, and NWF. For Jane, PAND was 100% for LSK and LSF, and 89% for NWF. The resultant effect sizes were 1.00, 1.00, and 0.83 for LSK, LSF, and NWF, respectively. Across the three children, PAND was 100%, 100%, and 96% for LSK, LSF, and NWF, respectively. The resultant effect sizes were 1.00, 1.00, and 0.91 for LSK, LSF, and NWF, respectively.
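For readers unfamiliar with PAND, the sketch below shows one common way it can be computed for a two-phase case and converted to a phi coefficient, following the general approach described by Parker et al. (2007). It is illustrative only: the threshold search and the 2x2 table construction are assumptions about the standard procedure rather than the exact computations used in this study, and the example data are placeholders.

```python
from math import sqrt

def pand_and_phi(baseline, treatment):
    """PAND for an expected increase from baseline to treatment, plus a phi
    coefficient derived from the resulting 2x2 (phase x below/above split) table."""
    n = len(baseline) + len(treatment)
    best = None
    # Try every candidate split value; keep the one requiring the fewest removals
    # to eliminate all between-phase overlap.
    for v in sorted(set(baseline + treatment + [min(baseline + treatment) - 1])):
        removals = sum(b > v for b in baseline) + sum(t <= v for t in treatment)
        if best is None or removals < best[0]:
            best = (removals, v)
    removals, v = best
    pand = 100.0 * (n - removals) / n
    # 2x2 table: rows = phase, columns = (<= split, > split).
    a = sum(b <= v for b in baseline)    # baseline points kept
    b_ = sum(b > v for b in baseline)    # baseline points overlapping
    c = sum(t <= v for t in treatment)   # treatment points overlapping
    d = sum(t > v for t in treatment)    # treatment points kept
    denom = sqrt((a + b_) * (c + d) * (a + c) * (b_ + d))
    phi = (a * d - b_ * c) / denom if denom else 0.0
    return pand, phi

# Placeholder example with no overlap at all -> PAND = 100%, phi = 1.0.
print(pand_and_phi([0, 1, 1, 2], [5, 6, 8, 9, 11, 12]))
```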
Additional Research
- Is the program reviewed by WWC or E-ESSA?
- E-ESSA
- Summary of WWC / E-ESSA Findings :
What Works Clearinghouse Review
This program was not reviewed by What Works Clearinghouse.
Evidence for ESSA
No studies considered met Evidence for ESSA's inclusion requirements.
- How many additional research studies are potentially eligible for NCII review?
- 0
- Citations for Additional Research Studies :
Data Collection Practices
Most tools and programs evaluated by the NCII are branded products which have been submitted by the companies, organizations, or individuals that disseminate these products. These entities supply the textual information shown above, but not the ratings accompanying the text. NCII administrators and members of our Technical Review Committees have reviewed the content on this page, but NCII cannot guarantee that this information is free from error or reflective of recent changes to the product. Tools and programs have the opportunity to be updated annually or upon request.