Lexia Reading Core5 (formerly Lexia Reading)

Study: Macaruso & Walker (2008)

Macaruso, P., & Walker, A. (2008). The efficacy of computer-assisted instruction for advancing literacy skills in kindergarten children. Reading Psychology, 29, 266-287.

Descriptive Information

Usage

Acquisition and Cost

Program Specifications and Requirements

Training

Overview: Lexia Reading Core5 (Core5) is designed as a user-centered, interactive, and collaborative model of personalized learning and is appropriate for accelerating reading skills development for students of all abilities in Pre-K to Grade 5. Students begin by taking an Auto Placement assessment, which assigns them to the appropriate start level in the program’s scope and sequence. Students then progress through the program levels at their own pace. Teachers and school staff monitor the implementation through dashboards on the myLexia website.

Alignment to Standards: Core5 is closely aligned to the most rigorous state and national standards, including the Common Core State Standards for Reading (Foundational Skills, Reading Literature, and Reading Informational Text), as well as many Writing, Language, and Speaking and Listening standards.

Scope & Sequence: Core5’s scope and sequence provides balanced skill development for all five strands of the “Essential Elements of Scientific Reading Instruction” as identified by the National Reading Panel (2000) — Phonemic Awareness, Phonics, Vocabulary, Fluency, and Comprehension. In addition, a sixth strand in Core5 targeting “Structural Word Analysis” helps form the bridge from decoding skills to advanced vocabulary and comprehension. 

Core5 is used at nearly 8,000 individual sites across all 50 states, Washington, DC, and more than 30 additional countries, including Canada, Australia, Great Britain, South Korea, and New Zealand. As of November 2015, over 1.5 million unique students used Core5.

The Core5 activities are accessible online through an internet browser or through the Core5 app for iPad or Android tablets. Students can work on Core5 in school, at home, in extended-day programs, or in libraries and other community centers — anywhere there is internet access and a browser.

Students use the online program for 20–30 minutes per session, 1–5 times per week, for 25–30 weeks.

For struggling students, the online prescription is a minimum of 60 minutes/week for K-3 students and 80 minutes/week for grades 4–5. Students spend an additional 40–80 minutes a week engaging with offline program components (teacher-directed Lessons and Instructional Connections, as well as independent/partner Skill Builders).

Access to assessment data is available to teachers and administrators in real time through an internet browser or through the myLexia app for the iPhone, iPad, iPod Touch, and Apple Watch. Teachers are notified by web-based reports or email when students require support or intervention.

Where to Obtain:
Lexia Learning Systems

300 Baker Avenue Suite 320

Concord, MA 01742

Phone #: 978-405-6200

Web Site: www.lexialearning.com

Cost: There are two ways to buy Core5: individual student licenses or a site license (an unlimited number of students at that site). Individual licenses cost between $30 and $40 per student per year, depending on the number of licenses purchased. A site license for a school with 500 students would be $17 per student for a single-year license. As a subscription service, a one-year renewal is at the base rate, and multi-year renewals reflect a discount. Although purchasing training is not required, a launch training and two follow-up trainings per year are recommended. These are available in person (price may vary based on the needs of the school) or via webinar at different price points. E-learning modules are also available; nearly all of the training videos are available for free through the program’s admin portal, myLexia.com.

The Core5 program requires a web-enabled device, such as a desktop computer, laptop, or tablet (7 inches or larger, iOS or Android). The online component is completed by each student independently on a single device. Implementation monitoring through myLexia can be accessed through a web browser on any device or through the myLexia iOS app (versions for iPad, iPhone, and Apple Watch).

Instructional and supplemental materials require printing. Offline instructional experiences may require pedagogical materials commonly found in elementary school classrooms.

Core5 includes an extensive online resource library of interactive professional development videos, documentation, Lexia Lessons, Lexia Instructional Connections, and Lexia Skill Builders embedded into the administrative component of the program.

Lexia offers Implementation Support Services that include training in person, via webinar, and through e-learning course modules. A full support package includes an Implementation Manager who consults with and assists district and school leadership throughout the year. Activities may include creating an implementation plan, professional learning events, reviewing implementation milestones, data coaching and analysis, assistance in developing sustainable models and staff expertise, and assistance with seasonal account maintenance activities.

Additionally, teachers can access Training On Demand, a robust series of training modules that are available anytime and anywhere. These interactive modules cover a wide range of topics, such as the Core5 Scope and Sequence, Navigating within a Core5 Activity, and Student Reports in myLexia. Designed for teachers and administrators with no prior experience using Core5, the modules let users explore at their own pace, interact and engage with the content, and test their knowledge with interactive quizzes at the end of each module.

Customer support offers online resources that are available 24/7, as well as live support via a toll-free number Monday through Friday, 8 a.m.–6 p.m. EST, except holidays.

 

Participants: Unconvincing Evidence

Sample size: 71 students (26 program, 45 control)

Risk Status: Students were identified as at risk for academic failure due to their low performance on DIBELS, the Dynamic Indicators of Basic Early Literacy Skills.

Demographics:

                                     Program            Control            p of
                                     Number Percentage  Number Percentage  chi square
Grade level
  Kindergarten                       26     100%        45     100%
Race-ethnicity
  Asian/Pacific Islander            0      0%          2      4%
  Hispanic                          1      4%          12     27%
  White                             22     85%         26     58%
  Other                             3      11%         5      11%
Socioeconomic status
  Subsidized lunch                  0      0%          13     29%
  No subsidized lunch               26     100%        32     71%
Disability status
  Not identified with a disability  26     100%        45     100%
ELL status
  Not English language learner      26     100%        45     100%
Gender
  Female                            14     54%         25     56%
  Male                              12     46%         20     44%

Training of Instructors: The kindergarten teachers and laboratory staff members took part in orientation and training sessions for software implementation.

Design: Partially Convincing Evidence

Did the study use random assignment?: Yes.

If not, was it a tenable quasi-experiment?: Not applicable.

If the study used random assignment, at pretreatment, were the program and control groups not statistically significantly different and had a mean standardized difference that fell within 0.25 SD on measures used as covariates or on pretest measures also used as outcomes?: Yes.

If not, at pretreatment, were the program and control groups not statistically significantly different and had a mean standardized difference that fell within 0.25 SD on measures central to the study (i.e., pretest measures also used as outcomes), and outcomes were analyzed to adjust for pretreatment differences? Not applicable.

Were the program and control groups demographically comparable at pretreatment?: No.

Was there attrition bias1? Yes.

Did the unit of analysis match the unit for random assignment (for randomized studies) or the assignment strategy (for quasi-experiments)?: No.

1 NCII follows guidance from the What Works Clearinghouse (WWC) in determining attrition bias. The WWC model for determining bias based on a combination of differential and overall attrition rates can be found on pages 13-14 of this document: http://ies.ed.gov/ncee/wwc/pdf/reference_resources/wwc_procedures_v2_1_standards_handbook.pdf

 

Fidelity of Implementation: Unconvincing Evidence

Describe when and how fidelity of treatment information was obtained: The kindergarten teachers and laboratory staff members took part in orientation and training sessions for software implementation.

Provide documentation (i.e., in terms of numbers) of fidelity of treatment implementation: During periodic check-ins, each teacher reported following the same scope and sequence of reading instruction for her treatment and control classes. The software also tracks sessions completed for each student (number of sessions and the length of sessions). The failure of some students to meet required use patterns was due to illness, truancy, and the general issue of transience typical of students in a low SES urban school district.

Measures Targeted: Convincing Evidence

Measures Broader: Convincing Evidence

Targeted Measure: DIBELS – Letter Naming Fluency
Score type and range of measure: Scores are based on the number of letters children name correctly in one minute.
Reliability statistics: Alternate-form reliability; 1 probe: 0.93; 3 probes: 0.98 (Bakerson & Gothberg, Western Michigan University).
Relevance to program instructional content: The program teaches students letter names.

Targeted Measure: DIBELS – Phoneme Segmentation Fluency
Score type and range of measure: Scores are based on the number of phonemes produced correctly in one minute.
Reliability statistics: Alternate-form reliability; 1 probe: 0.88; 3 probes: 0.96 (Bakerson & Gothberg, Western Michigan University).
Relevance to program instructional content: The program teaches and provides practice for students to segment sounds and combinations of sounds.


 

Broader Measure: Gates-MacGinitie Reading Test, Level PRE (Pre-Reading)
Score type and range of measure: Raw scores for each subtest and a normal curve equivalent (NCE) score based on the total raw score. (Note: NCE scores are on a 100-point scale with a mean of 50 and a standard deviation of 21.1.)
Reliability statistics: Kuder-Richardson 20 reliabilities for the PRE and R levels are in the 0.90s (http://dese.mo.gov/divimprove/curriculum/commarts/readassess.pdf).
Relevance to program instructional content: The program addresses skills in the subtests.

Subtest – literacy concepts: The program presents beginning literacy concepts.
Subtest – oral language concepts: The program teaches and practices phonological awareness skills.
Subtest – letters and letter–sound correspondences: The program teaches and allows practice for students to match letters to the sounds that correspond to them.
Subtest – listening (story) comprehension: The program teaches strategies and provides practice for students to understand the meaning of a passage read to them.
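The NCE scale noted above (mean 50, SD 21.1) maps one-to-one onto z-scores. A minimal sketch of the conversion, assuming the stated parameters and the conventional 1–99 NCE range; all names here are ours, not Lexia's or the GMRT's:

```python
# Illustrative NCE <-> z-score conversion. The mean (50) and SD (21.1)
# come from the GMRT note above; the 1-99 clamp is the conventional NCE
# range and is our assumption.

NCE_MEAN = 50.0
NCE_SD = 21.1

def z_to_nce(z: float) -> float:
    """Convert a z-score to an NCE score, clamped to the 1-99 range."""
    return max(1.0, min(99.0, NCE_MEAN + NCE_SD * z))

def nce_to_z(nce: float) -> float:
    """Convert an NCE score back to a z-score."""
    return (nce - NCE_MEAN) / NCE_SD

print(z_to_nce(0.0))   # a student exactly at the mean scores 50.0
print(z_to_nce(1.0))   # one standard deviation above the mean
```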

 

 

Number of Outcome Measures: 4 Prereading, 2 Reading

Mean ES - Targeted: -0.11

Mean ES - Broader: 0.31*

Effect Size:

Targeted Measures

Construct Measure Effect Size
Prereading DIBELS – LNF  -0.12
Prereading DIBELS – PSF -0.09

Broader Measures

Construct Measure Effect Size
Prereading GMRT – Letters and Letter-sound Correspondences  0.14
Prereading GMRT – Oral Language Concepts  0.51*
Reading GMRT – Literacy Concepts  0.30
Reading GMRT – Listening Comprehension  0.27
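The two "Mean ES" values reported above can be reproduced as simple unweighted averages of the per-measure effect sizes in these tables. A minimal sketch, assuming two-decimal rounding with ties going away from zero; the dictionary and function names are ours:

```python
# Unweighted averaging of the per-measure effect sizes listed in the
# tables above. Decimal arithmetic avoids binary floating-point ties
# (e.g. -0.105) rounding the wrong way.
from decimal import Decimal, ROUND_HALF_UP

targeted = {
    "DIBELS - LNF": "-0.12",
    "DIBELS - PSF": "-0.09",
}
broader = {
    "GMRT - Letters and Letter-sound Correspondences": "0.14",
    "GMRT - Oral Language Concepts": "0.51",
    "GMRT - Literacy Concepts": "0.30",
    "GMRT - Listening Comprehension": "0.27",
}

def mean_es(effects: dict) -> Decimal:
    """Unweighted mean of effect sizes, rounded half away from zero."""
    vals = [Decimal(v) for v in effects.values()]
    mean = sum(vals) / len(vals)
    return mean.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

print(mean_es(targeted))  # -0.11, matching "Mean ES - Targeted"
print(mean_es(broader))   # 0.31, matching "Mean ES - Broader"
```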

 

Key
*      p ≤ 0.05
**    p ≤ 0.01
***  p ≤ 0.001
–      Developer was unable to provide necessary data for NCII to calculate effect sizes
u      Effect size is based on unadjusted means
†      Effect size based on unadjusted means not reported due to lack of pretest group equivalency, and effect size based on adjusted means is not available

 

Visual Analysis (Single Subject Design): N/A

Disaggregated Data for Demographic Subgroups: Yes

Targeted Measures

Construct Measure Effect Size
Prereading DIBELS – LNF – Low Performers 0.06
Prereading DIBELS – PSF – Low Performers -0.01

Broader Measures

Construct Measure Effect Size
Prereading GMRT – Letters and Letter-sound Correspondences – Low Performers 0.71
Prereading GMRT – Oral Language Concepts – Low Performers 1.17**
Reading GMRT – Literacy Concepts – Low Performers 0.61
Reading GMRT – Listening Comprehension – Low Performers 0.53

 

Key
*      p ≤ 0.05
**    p ≤ 0.01
***  p ≤ 0.001
–      Developer was unable to provide necessary data for NCII to calculate effect sizes
u      Effect size is based on unadjusted means
†      Effect size based on unadjusted means not reported due to lack of pretest group equivalency, and effect size based on adjusted means is not available

 

Disaggregated Data for <20th Percentile: No

Administration Group Size: Individual

Duration of Intervention: 15–20 minutes, 2–3 times per week, for 23–24 weeks

Minimum Interventionist Requirements: Paraprofessional, 1-8 hours of training

Reviewed by WWC or E-ESSA: WWC & E-ESSA

What Works Clearinghouse Review

Beginning Readers Protocol

Effectiveness: Lexia Reading was found to have potentially positive effects on alphabetics, no discernible effects on fluency, potentially positive effects on comprehension, and no discernible effects on general reading achievement.

Studies Reviewed: 3 studies meet standards out of 4 studies total

Full Report

 

Evidence for ESSA

Program Outcomes: Two studies, both in urban Massachusetts districts, evaluated Lexia in comparison to control groups. Outcomes were positive, but not significant at the school level. There were significant effects at the student level, however, qualifying Lexia for the ESSA “Promising” category.

Number of Studies: 2

Average Effect Size: 0.31

Full Report

Other Research: Potentially Eligible for NCII Review: 3 studies

Gale, D. (2006). The effect of computer-delivered phonological awareness training on the early literacy skills of students identified as at-risk for reading failure. Retrieved May 2008, from the University of South Florida website: http://purl.fcla.edu/usf/dc/et/SFE0001531.
 

Macaruso, P., & Rodman, A. (2011). Benefits of computer-assisted instruction to support reading acquisition in English Language Learners. Bilingual Research Journal, 34, 301-315.
 

McMurray, S. (2013). An evaluation of the use of Lexia Reading software with children in Year 3, Northern Ireland (6- to 7-year-olds). Journal of Research in Special Educational Needs, 13(1), 15-25.