Evidence-centered design, cognitive psychology and assessment

Professor Mislevy joined the University of Maryland in January 2001 as a Professor of Measurement, Statistics, and Evaluation (EDMS). He also holds affiliate appointments in Second Language Acquisition and the Joint Program in Survey Methodology. He was previously a Distinguished Research Scientist at ETS and a Research Associate at the National Opinion Research Center. He earned his Ph.D. in Methodology of Behavioral Research at the University of Chicago in 1981.

Dr. Mislevy's research applies developments in statistics, technology, and cognitive research to practical problems in educational assessment. His work has included a multiple-imputation approach for integrating sampling and test-theoretic models in the National Assessment of Educational Progress (NAEP), a Bayesian inference network for updating the student model in an intelligent tutoring system, and an evidence-centered design framework for assessment. His current projects include the NSF-supported PADI project, which is developing an assessment design system with a focus on science inquiry, and work with Cisco Systems on simulation-based assessments of designing and troubleshooting computer networks.
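As a minimal illustration of the probability-based inference mentioned above (a sketch only, not drawn from Dr. Mislevy's systems; the function name, prior, and conditional probabilities are hypothetical placeholders), the Python fragment below applies Bayes' rule to update a single binary student-model variable after one scored item response.

# Illustrative sketch: Bayesian update of a binary student-model variable
# ("mastery") given one scored item response. All numbers are made-up
# placeholders, not values from any operational assessment.

def posterior_mastery(prior_mastery: float,
                      p_correct_given_mastery: float,
                      p_correct_given_nonmastery: float,
                      observed_correct: bool) -> float:
    """Return P(mastery | observed response) via Bayes' rule."""
    if observed_correct:
        like_m = p_correct_given_mastery
        like_n = p_correct_given_nonmastery
    else:
        like_m = 1.0 - p_correct_given_mastery
        like_n = 1.0 - p_correct_given_nonmastery
    numerator = prior_mastery * like_m
    denominator = numerator + (1.0 - prior_mastery) * like_n
    return numerator / denominator

if __name__ == "__main__":
    # Start from a 50/50 prior and observe a correct response.
    print(posterior_mastery(0.5, 0.8, 0.3, observed_correct=True))  # ~0.727

In an operational system this single update would be one node in a larger Bayesian inference network relating many proficiency and observable variables; the two-variable case is shown only to make the updating step concrete.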

Among his honors and awards are the American Educational Research Association's Raymond B. Cattell Early Career Award for Programmatic Research, the National Council on Measurement in Education's Award for Technical Contributions to Educational Measurement (three times), the National Council on Measurement in Education's Award for Career Contributions to Educational Measurement, the American Educational Research Association's Lindquist Award for contributions to educational measurement, the ETS Senior Research Scientist Award, and the International Language Testing Association's Samuel J. Messick Memorial Lecture Award. In 1994 he was elected President of the Psychometric Society, and in 2007 he was elected to the National Academy of Education. He has served on the Defense Language Testing Advisory Board, the Board on Testing and Assessment, and National Research Council committees on issues concerning assessment, instruction, and cognitive psychology, and was a primary author of the final report of the National Assessment Governing Board's Design Feasibility Team.

How cognitive science challenges the educational measurement tradition. Measurement: Interdisciplinary Research and Perspectives, in press.

Cognitive psychology and educational assessment. In R.L. Brennan (Ed.), Educational Measurement (Fourth Edition). Phoenix, AZ: Greenwood, 2006.

Automated scoring. D.M. Williamson, R.J. Mislevy, & I.I. Bejar (Eds.). Mahwah, NJ: Erlbaum, 2006.

Intuitive test theory. (with H.I. Braun). Phi Delta Kappan, 2005, 86, 488-497.

On the structure of educational assessments. (with L. Steinberg & R. Almond). Measurement: Interdisciplinary Research and Perspectives, 2003, 1, 3-67.

Substance and structure in assessment arguments. Law, Probability, and Risk, 2003, 2, 237-258.

A cognitive task analysis, with implications for designing a simulation-based assessment. (with L. Steinberg, J. Breyer, L. Johnson, & R. Almond). Computers in Human Behavior, 1999, 15, 335-374.

Graphical models and computerized adaptive testing. (with R. Almond). Applied Psychological Measurement, 1999, 23, 223-237.

The role of probability-based inference in an intelligent tutoring system (with D. H. Gitomer). User Modeling and User-Adapted Interaction, 1996, 5, 253-282 (special issue on numerical methods of handling uncertainty).

Test theory reconceived. Journal of Educational Measurement, 1996, 33, 379-416.

Evidence and inference in educational assessment. Psychometrika, 1994, 59, 439-483.

Test theory for a new generation of tests (Edited volume, with N. Frederiksen & I. I. Bejar). Hillsdale, NJ: Erlbaum, 1993.

Linking educational assessments: Concepts, issues, methods, and prospects. (foreword by R. L. Linn). Princeton, NJ: Policy Information Center, Educational Testing Service, 1992.

Estimation of latent group effects. Journal of the American Statistical Association, 1985, 80, 993-997.


Collaborative research with Cisco

NetPass: Principled assessment design for proficiency in computer networks. A collaboration among the Cisco Learning Institute, Educational Testing Service, and the University of Maryland on applying the principles of evidence-centered assessment design to proficiencies in designing, implementing, and troubleshooting computer networks. Co-PIs: John Behrens (Cisco Systems, Inc.); Robert Mislevy (University of Maryland); and David Williamson and Malcolm Bauer (Educational Testing Service).

AERA 2005. Abstract, overheads, and papers from the session "Knowledge Representation in Assessment."

AERA 2009. Video of presentations from the session "An Integrated Research Program for E-Learning and Assessment in a Complex Domain."