School Psychology Review
Volume 46, Issue 1 (March 2017)

Relative Value of Common Screening Measures in Mathematics

Amanda M. VanDerHeyden

Education Research & Consulting

Robin S. Codding

University of Minnesota

Ryan Martin

University of Massachusetts at Boston

Correspondence concerning this article should be directed to Amanda M. VanDerHeyden, Education Research & Consulting, Fairhope, AL 36532; e-mail:

Amanda M. VanDerHeyden is a frequent contributor to the literature on the use of data-based decision making to improve the efficiency, accuracy, and intensity of instruction and to raise school-wide achievement. She is coauthor of the Evidence-Based Mathematics Innovation Configuration for the National Comprehensive Center for Teacher Quality at Vanderbilt University and now the Collaboration for Effective Education Development, Accountability, and Reform at the University of Florida. Her most recent effort has been completing a Web-based mathematics intervention system that provides universal screening, progress monitoring, and multitier intervention aligned with student need, covering numeracy through algebra. This system (SpringMath, www.springmath.com) became available September 1, 2016.

Robin S. Codding is an associate professor of school psychology at the University of Minnesota. She earned her PhD in school psychology from Syracuse University. Dr. Codding's research focuses on the intersection of intervention and implementation: developing and evaluating the effectiveness of school-based interventions, identifying the factors that contribute to student responsiveness to those interventions, and examining strategies to support intervention implementation. Dr. Codding's work has emphasized academic interventions and associated assessment for data-based decision making, particularly in the area of mathematics. Dr. Codding has served as associate editor for Journal of Behavioral Education, Journal of School Psychology, and School Psychology Review.

Ryan Martin is a postdoctoral fellow and behavioral consultant with May Institute and the National Autism Center, where he currently studies school-based interventions for children with autism spectrum disorder. Ryan's broader research interests include mathematics intervention, home and school consultation, and methods of assessing and improving treatment fidelity.

Editor: Amy Reschly

Abstract

Schools need evidence-based guidance on which measures in mathematics, administered under what particular set of conditions (e.g., time of year), provide the most useful prediction. The purpose of this study was to examine decision accuracy among commonly used screening measures with a priority toward identifying the least costly screening measures for predicting year-end mathematics failure. Predictors included existing demographic characteristics, the preceding year-end mathematics test score, and multiple measures administered during the study year, including multiskill computation and concepts/applications measures; addition and subtraction for third grade; multiplication and division for fourth grade; and multidigit multiplication for fifth grade. Results supported the use of a single measure for screening. The preceding year's test score was superior or comparable in accuracy to current-year screening measures and was the lowest cost option (i.e., required no additional assessment time). Results cautioned against the use of multiskill computation and concepts/applications measures at all grade levels because of a high number of false-negative errors. The single-skill computation measures performed comparably to the preceding year-end test in overall accuracy. However, the single-skill probes outperformed all other measures in detecting students who would fail the year-end test, which is the most important function of a screening device. For most measures, the winter screening occasion offered the best predictive accuracy.
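The accuracy language in the abstract (overall decision accuracy, false-negative errors, detection of students who would fail) maps onto a standard 2 x 2 cross-tabulation of screening decisions against year-end outcomes. The sketch below is illustrative only and is not drawn from the article; the scores, cutoff, and function name are hypothetical, and Python is used purely to show how such a summary is typically computed.

# Illustrative sketch (not from the article): summarizing screening
# "decision accuracy." All scores, cutoffs, and names here are hypothetical.

def classification_summary(screen_scores, failed_year_end, cutoff):
    """Cross-tabulate a screening cutoff against year-end failure.

    A student is flagged "at risk" when the screening score falls at or
    below `cutoff`. Sensitivity is the share of eventual failures the
    screen catches; a false negative is a failing student the screen missed.
    """
    tp = fp = tn = fn = 0
    for score, failed in zip(screen_scores, failed_year_end):
        at_risk = score <= cutoff
        if at_risk and failed:
            tp += 1          # correctly flagged
        elif at_risk and not failed:
            fp += 1          # flagged but passed (false positive)
        elif not at_risk and failed:
            fn += 1          # missed failure (false negative)
        else:
            tn += 1          # correctly not flagged
    total = tp + fp + tn + fn
    return {
        "sensitivity": tp / (tp + fn) if (tp + fn) else None,
        "specificity": tn / (tn + fp) if (tn + fp) else None,
        "false_negatives": fn,
        "overall_accuracy": (tp + tn) / total if total else None,
    }

# Hypothetical winter screening data for six students.
scores = [12, 35, 20, 8, 40, 18]          # e.g., digits correct per two minutes
failed = [True, False, False, True, False, True]
print(classification_summary(scores, failed, cutoff=19))

Under this framing, the abstract's emphasis on false-negative errors corresponds to weighting sensitivity (catching eventual failures) more heavily than overall accuracy when comparing measures.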

Received: October 9, 2016; Accepted: February 20, 2017

Copyright 2017 by the National Association of School Psychologists

Cited by

Amanda M. VanDerHeyden, Matthew K. Burns and Wesley Bonifay. (2018) Is More Screening Better? The Relationship Between Frequent Screening, Accurate Decisions, and Reading Proficiency. School Psychology Review 47:1, 62-82.
Online publication date: 27-Mar-2018.
Ethan R. Van Norman, David A. Klingbeil and Peter M. Nelson. (2017) Posttest Probabilities: An Empirical Demonstration of Their Use in Evaluating the Performance of Universal Screening Measures Across Settings. School Psychology Review 46:4, 349-362.
Online publication date: 28-Dec-2017.