The National Institute for Direct Instruction maintains an Office of Research and Evaluation that compiles research on Direct Instruction, conducts original studies of DI, and responds to requests for assistance in issues related to research regarding Direct Instruction.
The Research Office has developed an extensive bibliography of writings related to DI (6), maintains a searchable database of articles and books [add link], and has written narrative summaries and meta-analyses of this literature (3, 5).
A substantial body of NIFDI research has examined the effectiveness of the DI curricula. These studies have confirmed the accumulated findings of decades of other studies showing that students studying with DI have higher achievement scores and stronger growth rates than students studying with other curricula. These results have appeared with reading (1, 2, 8, 9, 10, 13, 15) and math (7); in urban (1, 2, 7), rural (2, 8), and suburban (8, 13, 15) settings; with middle-class, high-achieving students (13); with high-risk students (17), general education students (1, 2, 7, 8, 9, 10, 13, 15, 17), and special education students (15); with schools that are predominantly African American (1, 7, 9), those with substantial numbers of Hispanic students (2, 8, 15), and those with large numbers of non-Hispanic whites (8, 13, 15); and with children from preschool age (10) through middle school (4). The strong positive results appear in studies examining state test scores (4), curriculum-based measures (2, 4, 8, 10), and norm-referenced tests (1, 4, 7, 9, 10); in the United States as well as in other countries (11); and with randomized control trials (10, 13, 14) as well as quasi-experimental designs (1, 2, 4, 7, 8, 9, 11, 15).
The NIFDI research office has also examined how the NIFDI model can help teachers and students have the greatest success possible. This research has documented the ways in which schools that adhere to the NIFDI model, in all of its components, have the greatest growth in student achievement over time.
A number of organizations provide reviews of educational curricula. The results of their work have disappointed many, and their procedures have received substantial criticism from the research community. NIFDI’s analyses of the What Works Clearinghouse are typical of these critiques and have described errors in specific reviews as well as problems in the procedures used to assess fidelity of implementation (12). Other analyses have described more general problems with the review criteria, along with alternative approaches that provide both greater internal and greater external validity (4).
The NIFDI office works with schools and individual researchers to help broaden the base of DI-related research. Research support is available for participating schools and for graduate students and post-doctoral scholars. [add links]
NIFDI’s Institutional Review Board (IRB) is registered with the U.S. Department of Health and Human Services and complies fully with the federally established guidelines for the protection of human subjects. Details on NIFDI’s IRB policy are available upon request.
1) “Direct Instruction and First Grade Reading Achievement: The Role of Technical Support and Time of Implementation,” Jean Stockard, Journal of Direct Instruction, 11 (1, 2011), 31-50.
2) “Increasing Reading Skills in Rural Districts: A Case Study of Three Schools,” Jean Stockard, Journal of Research in Rural Education, 26 (8, 2011), 1-19.
3) “Research on the Effectiveness of Direct Instruction Programs: An Updated Meta-Analysis,” Cristy Coughlin, paper presented at the Annual Meeting of the Association for Behavior Analysis International, May 2011.
4) “Merging the Accountability and Scientific Research Requirements of the No Child Left Behind Act: Using Cohort Control Groups,” Quality and Quantity: International Journal of Methodology, published online December 11, 2011.
5) “Research Syntheses of Direct Instruction Outcomes: A ‘Tertiary’ Review,” Cristy Coughlin, forthcoming in Lloyd, J., Carnine, D., Slocum, T., & Watkins, C. (Eds.), Does Direct Instruction Deserve Status as an Evidence-Based Practice? ADI Press, 2011.
6) A Bibliography of the Direct Instruction Curriculum and Studies Examining its Efficacy, National Institute for Direct Instruction, October 2011.
7) “Improving Elementary Level Mathematics Achievement in a Large Urban District: The Effects of Direct Instruction,” Jean Stockard, Journal of Direct Instruction, 10 (Winter, 2010), 1-16.
8) “The Development of Early Academic Success: The Impact of Direct Instruction’s Reading Mastery,” Jean Stockard and Kurt Engelmann, Journal of Behavioral Assessment and Intervention for Children, 1 (1, 2010), 2-24.
9) “Promoting Reading Achievement and Countering the ‘Fourth-Grade Slump’: The Impact of Direct Instruction on Reading Achievement in Fifth Grade,” Jean Stockard, Journal of Education for Students Placed at Risk, 15 (August, 2010), 218-240.
10) “Promoting Early Literacy of Preschool Children: A Study of the Effectiveness of Funnix Beginning Reading,” Jean Stockard, Journal of Direct Instruction, 10 (Winter, 2010), 29-48.
11) “Direct Instruction in Africa,” Tamara and Rob Bressi, Kurt Engelmann, Amy Johnston, Jerry Silbert, and Jean Stockard, DI News, Summer 2010.
12) “An Analysis of the Fidelity Implementation Policies of the What Works Clearinghouse,” Jean Stockard, Current Issues in Education, 13 (4, 2010). http://cie.asu.edu/ojs/index.php/cieatasu/article/view/398
13) “Improving Reading Skills in Lake Woebegone: A Pretest-Posttest Randomized Control Study of High-Achieving Fourth Grade Students,” submitted to Journal of School Psychology (revise and resubmit requested).
14) “Changes in Reading Achievement at Rimes Elementary: A Randomized Control Study of Reading Mastery,” NIFDI Technical Report, October 2011.
15) “Reading Achievement in a Direct Instruction School and a ‘Three Tier’ Curriculum School,” NIFDI Technical Report 2008-5.
16) Tech report on WWC and RM
Several systematic literature reviews have documented the efficacy of Direct Instruction programs:
Kinder and associates (2005) summarized the results of 37 studies that used Direct Instruction materials with students with disabilities. Over 90 percent of these studies found positive effects for the Direct Instruction programs.
Przychodzin-Havis and colleagues (2005) reviewed 28 published studies of the Direct Instruction Corrective Reading program. Over 90 percent of the studies found positive results and only one study found greater gains with another intervention. Similar results appeared with different types of assessments (e.g. standardized tests or curriculum-based measures), in different settings, with different types of instructors (e.g. certified teachers, peers, aides), and with different research designs.
Schieffer and colleagues (2002) reviewed 21 studies of Reading Mastery that compared its use to that of another program. Results in fourteen of these studies (67%) favored RM, other programs were favored in three (14%), and there were no differences in the remaining four.
Several meta-analyses have examined Direct Instruction programs. All of these analyses have concluded that DI programs have highly positive effects on student achievement and that the programs are more effective than other curricular approaches.
John Hattie examined meta-analyses of over 300 research studies relating to student achievement and concluded that Direct Instruction is highly effective. No other curricular program showed such consistently strong effects with students of different ability levels, of different ages, and with different subject matters.
Borman and associates examined studies of 29 comprehensive school reform models. They found that much more evidence was available for the Direct Instruction model than for other interventions. Direct Instruction was found to produce the strongest effects of all models examined.
In “Research on Direct Instruction: 25 Years Beyond DISTAR,” Adams and Engelmann responded to unfounded negative views of Direct Instruction and to the lack, at the time, of a thorough scientific review of its effectiveness. Engelmann provided a detailed description and history of Direct Instruction, and Adams conducted a meta-analysis of 34 highly controlled studies of the effectiveness of Direct Instruction programs, finding very strong, positive results.
Coughlin’s meta-analysis focused on 20 studies of Direct Instruction that employed a randomized control group design. Strong positive effects were found with reading, language, mathematics, and other areas. Similar results appeared with general education and special education students.
Stockard used meta-analytic techniques to examine data from scores on state assessment tests from 18 different sites. Again, strong effect sizes were found. Results were similar across different grades, schools with different SES and racial-ethnic composition, and in different areas of the country.
While decades of well-designed, scientific research show that Direct Instruction programs are highly effective, the programs have faced criticism. Some of these criticisms concern how DI affects students. For instance, some suggest that DI is less effective than other types of instruction, such as the “constructivist” or “discovery” approaches, or that it has no long-lasting impact on students’ achievement. Others suggest that it is only appropriate for disadvantaged students or those with learning difficulties. Some even claim that exposure to Direct Instruction results in poor self-image, behavior problems, or other difficulties for students. The accumulated evidence counters each of these claims. The research conclusively shows that Direct Instruction is more effective than other curricular programs and that the positive effects persist through high school. The positive effects occur with students of all ability levels and social backgrounds. Students exposed to Direct Instruction also have greater self-esteem and self-confidence than students in other programs, primarily because they are learning more material and understand that they can be successful students.
Other criticisms focus on the Direct Instruction programs and their use by teachers. Some suggest that Direct Instruction is only “rote and drill” and that teachers don’t like it because it hampers their creativity. Again, the research evidence counters these claims. Rather than involving a “rote and drill” approach, DI programs are designed to accelerate students’ learning and allow them to learn more material in a shorter amount of time. The programs are technical and prescriptive, designed to make teachers more effective, much as prescribed techniques for surgeons and pilots ensure optimal results for their patients and passengers. Yet, just as surgeons’ and pilots’ personalities create the atmosphere of an operating room or plane, the individual personalities and creativity of DI teachers permeate their classrooms and interactions with students. The research shows that teachers like DI programs because the programs help their students learn more and make them more effective teachers.
Others suggest that using separate elements of the programs will result in outcomes that are just as good as using the full DI programs. Yet, the research shows that using only some of the elements of DI programs, what is sometimes called “direct instruction” or “little di,” is far less effective than using the true Direct Instruction programs developed by Engelmann and associates.
The claim that Direct Instruction programs are not effective has been promulgated in recent years by the What Works Clearinghouse (WWC), which is funded by the United States Department of Education to provide reviews of curricular programs. Careful analyses of the WWC reports show that they can be very misleading and provide inaccurate summaries of the research. As a result, some WWC reports give positive ratings to programs that researchers have found to be ineffective and negative ratings to programs that the research has found to be highly effective.
For more, see the work of: