
NIFDI recently released two new technical reports on the procedures used by the What Works Clearinghouse (WWC) and the ways in which they differ from those typically used within the social sciences, as well as errors in the actual process used in review.

What is a Valid Scientific Study? An Analysis of Selection Criteria Used by the What Works Clearinghouse (Technical Report 2014-3)


Meta-analyses and reviews of the educational research literature have identified hundreds of efficacy studies. Yet the What Works Clearinghouse reports that very few of these analyses meet its selection criteria and standards of evidence. This report examines why these differences occur. It finds that WWC procedures differ markedly from standard practices within the social sciences, and that the WWC gives no academic or scholarly justification for its policies. Moreover, an empirical, quantitative analysis of the utility of the WWC approach indicates that it provides no “value added” to estimates of a curriculum’s impact. The costs of applying the WWC standards are far from minimal and result in highly selective and potentially biased summaries of the literature. It is suggested that the public would be better served if the WWC adopted the standard methodological practices of the social sciences.


Reading Mastery for Beginning Readers: An Analysis of Errors in a What Works Clearinghouse Report (Technical Report 2014-4)


A November 2013 report of the What Works Clearinghouse stated that it could find “no studies of Reading Mastery that fall within the scope of the Beginning Reading review protocol [and] meet What Works Clearinghouse evidence standards” (WWC, 2013b, p. 1). This technical report documents substantial errors in the WWC’s compilation of studies to examine and in the interpretations of individual studies. Effect sizes (Cohen’s d) were computed for results of more than three dozen studies identified by the WWC but rejected for analysis. All of these studies conformed to standard methodological criteria regarding valid research designs and would have been accepted in scholarly reviews of the literature. The average effect size associated with Reading Mastery was .57. This value is more than twice the .25 level traditionally used to denote educational significance. The results replicate meta-analyses that have found strong evidence of the efficacy of Reading Mastery. Given the high rate of error in this and other WWC reports, consumers are advised to consult reviews of studies in the standard research literature rather than WWC summaries.
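For readers unfamiliar with the statistic, Cohen's d is the standardized mean difference between two groups, conventionally computed with a pooled standard deviation. The following is a minimal illustrative sketch; the function name and the example numbers are hypothetical and are not drawn from the studies reviewed in the report:

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Cohen's d: (treatment mean - control mean) / pooled standard deviation."""
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    )
    return (mean_t - mean_c) / pooled_sd

# Hypothetical groups of 30 students: treatment mean 105, control mean 100,
# both with SD 10, give d = 0.5 -- above the .25 benchmark for
# educational significance cited in the report.
print(cohens_d(105, 100, 10, 10, 30, 30))
```

Under this convention, the report's average effect size of .57 means the Reading Mastery groups scored a little more than half a (pooled) standard deviation above the comparison groups.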


Examining the Inaccuracies and Mystifying Policies and Standards of the What Works Clearinghouse: Findings from a FOIA Request (Technical Report 2014-5) - Added 10/14/14


Reviewing documentation related to 62 Quality Reviews of What Works Clearinghouse (WWC) publications, this report summarizes the reasons for the reviews, the revisions made and not made, and the inconsistent application of WWC policies both during the publication of these reports and during the Quality Reviews themselves. The documentation was obtained in the fall of 2013 via the Freedom of Information Act (FOIA). The Quality Reviews were performed in response to the concerns of 54 organizations, study authors, program developers, teachers, and education researchers. Forty-one different instructional programs and study reviews were examined in these Quality Reviews. The documentation revealed a wide range of concerns, particularly the misinterpretation of study findings. This issue received specific attention, especially in relation to how the WWC accounts for fidelity of implementation when determining its rating of effectiveness for specific programs. From the information provided through the FOIA request and the publicly available information, three conclusions appear clear: 1) the WWC suffers from a lack of transparency in its policies and guidelines; 2) the conclusions drawn in its reports can be misleading; and 3) the reports are potentially damaging to program developers and, ultimately, to the success of students.


See also:  
NIFDI's Webpage on the What Works Clearinghouse


Anyone with questions regarding these reports should contact Dr. Jean Stockard at NIFDI's Office of Research. Dr. Stockard can be reached at 877.485.1973 or via email at research@nifdi.org.

Implementing Direct Instruction Successfully

When implemented fully, Direct Instruction (DI) is unparalleled in its ability to improve student performance and enhance students’ self-esteem. In order to implement DI effectively, much more is required than simply purchasing instructional materials. The following two-part tutorial guides administrators, teachers, and coaches through the key features of a successful DI implementation. Part I provides an overview of the steps schools need to take in preparation for a DI implementation before school starts, while Part II provides an overview of the steps schools need to take after school has started.

IMPORTANT: This tutorial is an intensive video series comprising 18 segments, each followed by a series of questions. Users should allow approximately three hours to watch the videos and complete the questions. NIFDI recognizes the high demands placed on school officials' time and, for this reason, has structured the tutorial so users may stop at any time and later resume where they left off.

Enroll in the tutorial here


New to Direct Instruction? Watch the Introduction to Direct Instruction Video Series before taking the online tutorial.

