
Technical Report # 2008-5

This report examines data from two schools within the same Oregon school district. One school adopted the Reading Mastery Direct Instruction (DI) program as the core reading curriculum for all primary children, while the other used a “three-tiered” model, occasionally employing DI for students who teachers felt would benefit from the instruction.

Full Report: Reading Achievement in a Direct Instruction School and a "Three Tier" Curriculum School. Technical Report # 2008-5, Eugene, Oregon: National Institute for Direct Instruction, November, 2008.

 

Technical Report # 2008-7

The National Reading Panel recently concluded that pre-literacy and early literacy instruction is appropriate for kindergarten students and is an important element in promoting higher achievement in later grades. This paper examines the impact of receiving the Direct Instruction (DI) kindergarten curriculum, Reading Mastery, on students' oral reading fluency in first and second grade.

Full Report: Academic Kindergarten and Later Academic Success: The Impact of Direct Instruction. Technical Report # 2008-7, Eugene, Oregon: National Institute for Direct Instruction, December, 2008.

 

Technical Report # 2008-1

This study compares students’ achievement in Baltimore City Public School System (BCPSS) schools that 1) implemented Direct Instruction (DI) with support from the National Institute for Direct Instruction (NIFDI), 2) implemented DI without NIFDI support, and 3) used a traditional curriculum (Open Court) from 1998 through 2003. Students in the NIFDI-supported schools had significantly higher levels of achievement than students in the other schools. Achievement scores of all first grade students in the BCPSS were higher in 2003 than in 1998, but the increases in the NIFDI-supported schools were more than twice as great as in the other schools.

Full Report: Improving First Grade Reading Achievement in a Large Urban District: The Effects of NIFDI-Supported Implementation of Direct Instruction in the Baltimore City Public School System. Technical Report # 2008-1, Eugene, Oregon: National Institute for Direct Instruction, September, 2008.

Technical Report # 2008-2

This report examines the impact of receiving Direct Instruction in first grade on reading achievement in fifth grade. Results indicate that students who received Direct Instruction had significantly higher reading scores in fifth grade than other students. On average, students in NIFDI-supported schools had a 25 percent gain in composite reading achievement scores from first grade to fifth grade compared to a gain of only 5 percent for students in control schools.

Full Report: The Long-Term Impact of NIFDI-Supported Implementation of Direct Instruction on Reading Achievement: An Analysis of Fifth Graders in the Baltimore City Public School System. Technical Report # 2008-2, Eugene, Oregon: National Institute for Direct Instruction, September, 2008.

Technical Report # 2008-3

This report examines the impact of receiving Direct Instruction on mathematics achievement. Results indicate that students who received Direct Instruction had significantly higher mathematics achievement than other students. The differences in achievement between DI schools and other schools became larger over time. In addition, students who had DI in first grade had significantly greater change in their achievement scores from first grade to fifth grade than students in the control schools.

Full Report: Improving Elementary Level Mathematics Achievement in a Large Urban District: The Effects of NIFDI-Supported Implementation of Direct Instruction in the Baltimore City Public School System. Technical Report # 2008-3, Eugene, Oregon: National Institute for Direct Instruction, September, 2008.

*(Bolded words signify specific keywords used in reviews)

Design:

There are eight types of designs that serve to classify studies:

  • Randomized Experiment: Pretest and posttest measures are collected, a control group is included, and participants are randomly assigned to treatment and control groups.
  • Comparison:

o Matched Comparison (Demographics): Pretest and posttest measures are collected, a control group is included, and participants are divided into treatment and control groups such that demographic variables (gender, age, grade, ethnicity, disability, socioeconomic status) are similar for both groups.

o Matched Comparison (Pretest): Pretest and posttest measures are collected, a control group is included, and pretest scores are used to divide participants into groups that are similar in ability.

o Non-matched Comparison: Pretest and posttest measures are collected, a control group is included, but groups are not manipulated to account for demographic or pretest differences.

  • One Group Pretest-Posttest: Pretest and posttest measures are collected, but there is no control group.
  • Static Group Comparison: Only posttest measures are collected, and a control group is included.
  • Single Subject: Any study that allows the participant to serve as their own control; no control group is utilized.
  • Meta-analysis: A method that combines the results of several studies that address comparable research questions.
  • Case Study: Only posttest measures are collected; no control group is utilized.
  • Longitudinal: Measures are collected multiple times over a span of more than two years.

 

Students Included:

Characteristics of students included in studies are classified into:

  • Grade (Kindergarten Students, First Grade Students, etc.)
  • Age (Elementary Students, Adolescents, etc.)
  • Disability (Students w/ Disabilities, Students w/ Autism, etc.)
  • Placement (Special Education, General Education)
  • Remedial (Students who are below grade level)
  • At-risk (Low SES, At-risk for academic failure)
  • Specific Status (Incarcerated Students, English Language Learners, etc.)
  • Ethnicity

      o Students are classified based on 2006 NAEP statistics. Nationally, minority enrollment in public schools is as follows:

          § 20.2% Hispanic
          § 15.6% African American
          § 3.8% Asian
          § 2.7% Multiracial
          § 0.7% Alaskan/Native American
          § 0.2% Pacific Islander

      o If the proportion of minority representation in a study meets or exceeds these national enrollment proportions, then the study is classified as including the specific demographic.
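The threshold rule above can be sketched as a short script. This is an illustrative sketch only: the percentages are the 2006 NAEP figures listed above, while the function and variable names are hypothetical.

```python
# 2006 NAEP national minority enrollment proportions (percent of public
# school enrollment), as listed in the classification criteria above.
NAEP_2006 = {
    "Hispanic": 20.2,
    "African American": 15.6,
    "Asian": 3.8,
    "Multiracial": 2.7,
    "Alaskan/Native American": 0.7,
    "Pacific Islander": 0.2,
}

def demographic_tags(study_percentages):
    """Return the demographic groups a study would be tagged as including:
    those whose representation in the study sample meets or exceeds the
    national enrollment proportion."""
    return [
        group
        for group, national in NAEP_2006.items()
        if study_percentages.get(group, 0.0) >= national
    ]

# Example: a sample that is 30% Hispanic and 10% African American would be
# tagged as including Hispanic students only, since 10% < 15.6%.
print(demographic_tags({"Hispanic": 30.0, "African American": 10.0}))
```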

Location:

Location refers to the geographical region of the study.  Regions are classified as follows:

  • Northeast:

o New England: Maine, New Hampshire, Vermont, Massachusetts, Rhode Island, Connecticut

o Middle Atlantic: New York, Pennsylvania, New Jersey

  • Midwest:

o East North Central: Wisconsin, Michigan, Illinois, Indiana, Ohio

o West North Central: North Dakota, South Dakota, Nebraska, Kansas, Minnesota, Iowa, Missouri

  • South:

o South Atlantic: Delaware, Maryland, Washington D.C., Virginia, West Virginia, North Carolina, South Carolina, Georgia, Florida

o East South Central: Kentucky, Tennessee, Mississippi, Alabama

o West South Central: Oklahoma, Texas, Arkansas, Louisiana

  • West:

o Mountain: Idaho, Montana, Wyoming, Nevada, Utah, Colorado, Arizona, New Mexico

o Pacific: Alaska, Washington, Oregon, California, Hawaii

Setting:

Setting refers to the type of school in which the study takes place:

  • Elementary School, Middle School, High School, Preschool
  • Charter School, Private School, Alternative School, Juvenile Corrections Facility

Fidelity Measured:

If a study indicates that fidelity data were collected, then this identifier will be marked Yes.  If there is no indication of fidelity monitoring in the study, then it will be marked No.

Other Tags:

Program/Intervention: The curricula, programs, instruction methods, or interventions included in the study will be indicated (Reading Mastery, Houghton-Mifflin, etc.). Both experimental and comparison programs will be listed.  Also, the subject of interest (reading, math, etc.) will be indicated.

Dependent Measures: Measures that were used to determine efficacy will be tagged (Woodcock-Johnson Achievement Test, DIBELS, etc.).

 

Interested in obtaining a copy of an article? Submit your request here.

 

The What Works Clearinghouse (WWC) was established in 2002 with funding from the U.S. Department of Education. The Clearinghouse was charged with producing user-friendly guides for educators that identify which instructional programs have been shown to be effective. Unfortunately, the WWC has failed to live up to its promise.

The WWC's reports promote curricula that the scientific community has found to be ineffective and inefficient and denigrate those that the scientific community has found to be highly effective. Here are some of the major problems documented by NIFDI staff:

  • The WWC ignores large elements of the research base in searching the research literature.
  • The WWC uses inconsistent and flawed criteria to choose studies to examine in depth.
  • The WWC’s interpretations of the research are often inaccurate and misrepresent the conclusions of the studies.
  • The WWC’s procedures and methods are very different from those used by most social scientists and widely accepted in the scientific community.

NIFDI staff have documented numerous problems with the procedures and reports of the What Works Clearinghouse. Learn more about some of these issues in the articles and reports below.

Does the What Works Clearinghouse Really Work?: Investigations into Issues of Policy, Practice, and Transparency (January, 2017)
The Threshold and Inclusive Approaches to Determining "Best Available Evidence": An Empirical Analysis (2016)
Examining the Inaccuracies and Mystifying Policies and Standards of the What Works Clearinghouse: Findings from a FOIA Request (October, 2014)
What is a Valid Scientific Study? An Analysis of Selection Criteria Used by the What Works Clearinghouse (August, 2014)
Reading Mastery for Beginning Readers: An Analysis of Errors in a What Works Clearinghouse Report (August, 2014)
Does the What Works Clearinghouse Work? (September, 2013)
The What Works Clearinghouse Review Process: An Analysis of Errors in Two Recent Reports (July, 2013)
Examining the What Works Clearinghouse and its Reviews of Direct Instruction Programs (Spring, 2013)
A summary of concerns regarding the What Works Clearinghouse (September, 2012)
Reading Mastery and students with learning disabilities: A comment on the What Works Clearinghouse review (July, 2012)
Merging the Accountability and Scientific Research Requirements of the No Child Left Behind Act: Using Cohort Control Groups (December, 2011)
An Analysis of the Fidelity Implementation Policies of the What Works Clearinghouse (Fall, 2010)
The What Works Clearinghouse Beginning Reading Reports and Rating of Reading Mastery: An Evaluation and Comment (September, 2008)
 

To learn more about concerns others have with the WWC's work, visit the links below.

Machinations of What Works Clearinghouse by Siegfried Engelmann
What Doesn't Work Clearinghouse by Jay Greene (Oct 2010)
Perspectives on Evidence-Based Research in Education by Robert Slavin (Educational Researcher, Jan/Feb 2008)
Does What Works Clearinghouse Work? by Genevieve McArthur (Australasian Journal of Special Ed., Apr 2008)

Subcategories

Implementing Direct Instruction Successfully

When implemented fully, Direct Instruction (DI) is unparalleled in its ability to improve student performance and enhance students’ self-esteem. In order to implement DI effectively, much more is required than simply purchasing instructional materials. The following two-part tutorial guides administrators, teachers, and coaches through the key features of a successful DI implementation. Part I provides an overview of the steps schools need to take in preparation for a DI implementation before school starts, while Part II provides an overview of the steps schools need to take after school has started.

IMPORTANT: This tutorial is an intensive video series comprising 18 segments, each followed by a series of questions. Users should allow approximately three hours to watch the videos and complete the questions. NIFDI recognizes the high demands placed on school officials' time and, for this reason, has structured the tutorial so users may stop at any time and later resume where they left off.

Enroll in the tutorial here


New to Direct Instruction? Watch the Introduction to Direct Instruction Video Series before taking the online tutorial.

Let Us Help Close the Student Achievement Gap With Direct Instruction!

Click for Literacy Solutions | Click for Numeracy Solutions
Call 877-485-1973 or Email Info@NIFDI.org