National Academy of Education (2025) · AERA Fellow (2023) · NCME President (2021–22)
Last updated April 2026
Research Expertise
My research examines how measurement works—and how it can mislead—in educational contexts. I work at the intersection of psychometrics, causal inference, and assessment policy, with sustained attention to vertical scaling and growth modeling in K–12 assessment, the theoretical and historical foundations of measurement in the human sciences, and the design of large-scale assessment systems that can genuinely inform teaching and learning. Recent work develops learning progressions as an alternative framework for scale construction, content-referenced approaches to growth reporting, and the philosophical commitments that distinguish measurement in the social sciences from its counterparts in the physical sciences.
Keywords: assessment policy; causal inference in education; educational measurement; growth modeling; history and philosophy of measurement; item response theory; learning progressions; psychometrics; Rasch model; vertical scaling.
Professional Experience
2026
Associate Dean of Faculty, School of Education, University of Colorado Boulder
Education
Ph.D. in Education (Quantitative Methods and Evaluation), University of California, Berkeley. Dissertation: SAT Coaching, Bias and Causal Inference (winner, AERA Division D Outstanding Dissertation Award, 2004). Advisor: Mark Wilson.
Award for Significant Contribution to Educational Measurement and Research Methodology, American Educational Research Association Division D, for the book Historical and Conceptual Foundations of Measurement in the Human Sciences (Briggs, 2021)
2023
American Educational Research Association Fellow
2021–2022
Elected President, National Council on Measurement in Education
2020–2022
Honorary Research Fellow, University of Oxford, Department of Education
2016–2019
Elected at-large member of Board of Directors, National Council on Measurement in Education
2013–2016
Editor, Educational Measurement: Issues and Practice
2013
Outstanding Reviewer Award, Journal of Educational and Behavioral Statistics
2012
University of Colorado Provost’s Award for Faculty Achievement
2012
Annual Award for Contributions to Theory and Practice, National Council on Measurement in Education
Teaching
EDUC 7396: Latent Variable and Structural Equation Modeling
PhD Students Graduated
Name · Year · Current position
Kyla McClure · 2025 · Postdoctoral Fellow, Center for Assessment (NCIEA)
Sanford Student · 2023 · Assistant Professor, School of Education, University of Delaware
Rajendra Chattergoon · 2020 · Director of Efficacy Research, Lexia Learning
Amy Burkhardt · 2020 · Managing Lead Scientist, Cambium Assessment
Michael Turner · 2019 · Global VP of Services and Support, Upland Software
Jessica Alzen · 2016 · Executive Director of Accountability and Evaluation, Boulder Valley School District
Ruhan Circi · 2015 · Principal Data Scientist, American Institutes for Research
Nathan Dadey · 2015 · Associate, Center for Assessment (NCIEA)
Benjamin Domingue · 2012 · Associate Professor, Stanford Graduate School of Education
Jonathan Weeks · 2011 · Principal Psychometrician, Stanford University
Kimberly Geil · 2011 · Independent Research Consultant
Matthew Gaertner · 2011 · Director of Research, WestEd (in memoriam, d. 2021)
Elena Diaz-Bilello · 2011 · Associate Director, CADRE, University of Colorado Boulder
Robert Talbot · 2010 · Associate Professor, School of Education and Human Development, University of Colorado Denver
Eric Snow · 2008 · Independent Research Consultant
Publications
h-index: 34 · Google Scholar, April 2026
Selected Publications
Briggs, D. C., McClure, K., Student, S., Wellberg, S., Minchen, N., Cox, O., Whitfield, E., Buchbinder, N., & Davis, L. (2025). Visualizing and reporting content-referenced growth on a learning progression. Educational Assessment, 1-23. https://doi.org/10.1080/10627197.2025.2503288
Briggs, D. C. (2022). NCME presidential address 2022: Turning the page to the next chapter of educational measurement. Journal of Educational Measurement, 59(4), 398-417. https://doi.org/10.1111/jedm.12350
Briggs, D. C. (2021). Historical and conceptual foundations of measurement in the human sciences: Credos and controversies. New York, NY: Routledge.
Briggs, D. C., & Peck, F. A. (2015). Using learning progressions to design vertical scales that support coherent inferences about student growth. Measurement: Interdisciplinary Research & Perspectives, 13, 75-99. https://doi.org/10.1080/15366367.2015.1042814
Briggs, D. C. (2013). Measuring growth with vertical scales. Journal of Educational Measurement, 50(2), 204-226. https://doi.org/10.1111/jedm.12011
Briggs, D. C. (2008). Using explanatory item response models to analyze group differences in science achievement. Applied Measurement in Education, 21(2), 89-118. https://doi.org/10.1080/08957340801926086
Refereed Journal Articles (38)
Briggs, D. C., McClure, K., Student, S., Wellberg, S., Minchen, N., Cox, O., Whitfield, E., Buchbinder, N., & Davis, L. (2025). Visualizing and reporting content-referenced growth on a learning progression. Educational Assessment, 1-23. https://doi.org/10.1080/10627197.2025.2503288
Student, S. R., Briggs, D. C., & Davis, L. (2025). Growth across grades and common item grade alignment in vertical scaling using the Rasch model. Educational Measurement: Issues and Practice. https://doi.org/10.1111/emip.12639
Briggs, D. C. (2024). The past, present, and future of large-scale assessment consortia. Educational Measurement: Issues and Practice, 43, 62-72. https://doi.org/10.1111/emip.12634
Ackerman, T. A., Bandalos, D. L., Briggs, D. C., Everson, H. T., Ho, A. D., Lottridge, S. M., Madison, M. J., Sinharay, S., Rodriguez, M. C., Russell, M., von Davier, A. A., & Wind, S. A. (2023). Foundational competencies in educational measurement. Educational Measurement: Issues and Practice. https://doi.org/10.1111/emip.12581
Briggs, D. C. (2022). NCME presidential address 2022: Turning the page to the next chapter of educational measurement. Journal of Educational Measurement, 59(4), 398-417. https://doi.org/10.1111/jedm.12350
Peck, F., Johnson, R., Briggs, D. C., & Alzen, J. (2021). Toward learning trajectory-based instruction: A framework of conceptions of learning and assessment. School Science and Mathematics, 121, 357-368. https://doi.org/10.1111/ssm.12489
Briggs, D. C., Chattergoon, R., & Burkhardt, A. (2019). Examining the dual purpose use of student learning objectives for classroom assessment and teacher evaluation. Journal of Educational Measurement. https://doi.org/10.1111/jedm.12233
Briggs, D. C., & Alzen, J. L. (2019). Making inferences about teacher observation scores over time. Educational and Psychological Measurement. https://doi.org/10.1177/0013164419826237
Briggs, D. C., & Kizil, R. C. (2017). Challenges to the use of artificial neural networks for diagnostic classifications with student test data. International Journal of Testing. https://doi.org/10.1080/15305058.2017.1297816
Penuel, W. R., Briggs, D. C., Davidson, K. L., Herlihy, C., Sherer, D., Hill, H. C., Farrell, C., & Allen, A. (2017). How school and district leaders access, perceive, and use research. AERA Open, 3(2), 1-17. https://doi.org/10.1177/2332858417705370
Briggs, D. C., & Dadey, N. (2016). Principal holistic judgments and high-stakes evaluations of teachers. Educational Assessment, Evaluation and Accountability, 29, 155-178. https://doi.org/10.1007/s11092-016-9256-7
Briggs, D. C., & Peck, F. A. (2015). Rejoinder to commentaries on using learning progressions to design vertical scales. Measurement: Interdisciplinary Research and Perspectives, 13(3-4), 206-218. https://doi.org/10.1080/15366367.2015.1104113
Briggs, D. C., & Peck, F. A. (2015). Using learning progressions to design vertical scales that support coherent inferences about student growth. Measurement: Interdisciplinary Research & Perspectives, 13, 75-99. https://doi.org/10.1080/15366367.2015.1042814
Briggs, D. C., & Dadey, N. (2015). Making sense of common test items that do not get easier over time: Implications for vertical scale designs. Educational Assessment, 20(1), 1-22.
Briggs, D. C., & Domingue, B. (2013). The gains from vertical scaling. Journal of Educational and Behavioral Statistics, 38(6), 551-576. https://doi.org/10.3102/1076998613508317
Briggs, D. C. (2013). Measuring growth with vertical scales. Journal of Educational Measurement, 50(2), 204-226. https://doi.org/10.1111/jedm.12011
Safran, R. J., Flaxman, S. M., Kopp, M., Irwin, D. E., Briggs, D., Evans, M., Funk, W., Gray, Hebets, E., Seddon, N., Scordato, E., Symes, L., Tobias, J., Toews, D., & Uy, J. (2013). A robust new metric of phenotypic distance to estimate and compare multiple trait differences among populations. Current Zoology, 58(3), 426-439.
Briggs, D. C., Ruiz-Primo, M. A., Furtak, E., Shepard, L., & Yin, Y. (2012). Meta-analytic methodology and conclusions about the efficacy of formative assessment. Educational Measurement: Issues and Practice, 13-17. https://doi.org/10.1111/j.1745-3992.2012.00251.x
Dadey, N., & Briggs, D. C. (2012). A meta-analysis of growth trends from vertically scaled assessments. Practical Assessment, Research & Evaluation, 17(14). http://pareonline.net/getvn.asp?v=17&n=14
Furtak, E. M., Seidel, T., Iverson, H., & Briggs, D. C. (2012). Experimental and quasi-experimental studies of inquiry-based science teaching: A meta-analysis. Review of Educational Research, 300-329. https://doi.org/10.3102/0034654312457206
Briggs, D. C., & Weeks, J. P. (2011). The persistence of value-added school effects. Journal of Educational and Behavioral Statistics, 36(5), 616-637.
Ruiz-Primo, M., Briggs, D. C., Iverson, H., Talbot, R., & Shepard, L. (2011). Impact of undergraduate science course innovations on learning. Science, 331, 1269-1270. https://doi.org/10.1126/science.1198976
Briggs, D. C., & Weeks, J. P. (2009). The sensitivity of value-added modeling to the creation of a vertical scale. Education Finance & Policy, 4(4), 384-414. https://doi.org/10.1162/edfp.2009.4.4.384
Briggs, D. C., & Weeks, J. P. (2009). The impact of vertical scaling decisions on growth interpretations. Educational Measurement: Issues & Practice, 28(4), 3-14. https://doi.org/10.1111/j.1745-3992.2009.00158.x
Domingue, B. W., & Briggs, D. C. (2009). Using linear regression and propensity score matching to estimate the effect of coaching on the SAT. Multiple Linear Regression Viewpoints, 35(1), 12-29.
Briggs, D. C. (2008). Using explanatory item response models to analyze group differences in science achievement. Applied Measurement in Education, 21(2), 89-118. https://doi.org/10.1080/08957340801926086
Briggs, D. C. (2008). Synthesizing causal inferences. Educational Researcher, 37(1), 15-22.
Briggs, D. C. (2004). Causal inference and the Heckman model. Journal of Educational and Behavioral Statistics, 29(4), 397-420.
Briggs, D. C., & Wilson, M. (2003). An introduction to multidimensional measurement using Rasch models. Journal of Applied Measurement, 4(1), 87-100.
Briggs, D. C. (2002). SAT coaching, bias and causal inference. Dissertation Abstracts International. DAI-A 64/12, p. 4433 (UMI No. 3115515).
Briggs, D. C. (2001). The effect of admissions test preparation: Evidence from NELS-88. Chance, 14(1), 10-18.
Stern, D., & Briggs, D. (2001). Does paid employment help or hinder performance in secondary school? Insights from US high school students. Journal of Education and Work, 14(3), 355-372.
Stern, D., & Briggs, D. (2001). Changing admissions policies: Mounting pressures, new developments, more questions. Change, 33(1), 34-41.
Books and Book Chapters (19)
Briggs, D. C. (2026). Psychometrics. In Teo, T. (Ed.), The Palgrave Encyclopedia of Theoretical and Philosophical Psychology. Cham: Palgrave Macmillan. https://doi.org/10.1007/978-3-031-70581-6_115-1
Briggs, D. C. (2021). Historical and conceptual foundations of measurement in the human sciences: Credos and controversies. New York, NY: Routledge.
Briggs, D. C. (2021). A history of scaling and its relationship to measurement. In Clauser, B. (Ed.), A History of Educational Measurement. New York, NY: Routledge.
Briggs, D. C., & Furtak, E. M. (2019). Learning progressions and embedded assessment. In Brookhart, S., & McMillan, J. (Eds.), Classroom Assessment and Educational Measurement. Routledge. NCME Book Series. https://doi.org/10.4324/9780429507533-9
Briggs, D., & Domingue, B. (2014). Value-added to what? The paradox of multidimensionality. In Lissitz, R. (Ed.), Value-added Modeling and Growth Modeling with Particular Application to Teacher and School Effectiveness. Charlotte, NC: Information Age Publishing.
Camilli, G., Briggs, D. C., Sloane, F., & Chiu, T.-W. (2013). Psychometric perspectives on test fairness: Shrinkage estimation. In APA Handbook of Testing and Assessment in Psychology, Volume 3: Testing and Assessment in School Psychology and Education.
Briggs, D. C. (2012). Making value-added inferences from large-scale assessments. In Simon, M., Ercikan, K., & Rousseau, M. (Eds.), Improving Large-Scale Assessment in Education: Theory, Issues and Practice. London: Routledge. https://doi.org/10.4324/9780203154519
Briggs, D. C. (2012). Making progress in the modeling of learning progressions. In Alonzo, A., & Gotwals, A. (Eds.), Learning Progressions in Science (pp. 293-316). Sense Publishers. https://doi.org/10.1007/978-94-6091-824-7_15
Briggs, D. C., & Alonzo, A. C. (2012). The psychometric modeling of ordered multiple-choice item responses for diagnostic assessment with a learning progression. In Alonzo, A., & Gotwals, A. (Eds.), Learning Progressions in Science (pp. 345-355). Sense Publishers.
Briggs, D. C. (2011). Cause or effect? Validating the use of tests for high-stakes inferences in education. In Dorans, N. J., & Sinharay, S. (Eds.), Looking Back: Proceedings of a Conference in Honor of Paul W. Holland. New York, NY: Springer. https://doi.org/10.1007/978-1-4419-9389-2_8
Briggs, D. C. (2010). Two Philadelphia reports. In Welner, K., Hinchey, P., Molnar, A., & Weizman, D. (Eds.), Think Tank Research Quality: Lessons for Policymakers, the Media, and the Public. Information Age Publishing.
Briggs, D. C. (2010). Schools in eight states: Effects on achievement, attainment, integration, and competition. In Welner, K., Hinchey, P., Molnar, A., & Weizman, D. (Eds.), Think Tank Research Quality: Lessons for Policymakers, the Media, and the Public. Information Age Publishing.
Briggs, D. C., & Wiley, E. (2008). Causes and effects. In Ryan, K., & Shepard, L. (Eds.), The Future of Test-Based Educational Accountability. Routledge. https://doi.org/10.4324/9780203895092
Ruiz-Primo, M. A., Briggs, D., Shepard, L., Iverson, H., & Huchton, M. (2008). Evaluating the impact of instructional innovations in engineering education. In Duque, M. (Ed.), Engineering Education for the XXI Century: Foundations, Strategies and Cases (pp. 241-274). Bogotá, Colombia: ACOFI Publications.
Rijmen, F., & Briggs, D. C. (2004). Multiple person dimensions and latent item predictors. In De Boeck, P., & Wilson, M. (Eds.), Explanatory Item Response Models: A Generalized Linear and Nonlinear Approach. Springer. https://doi.org/10.1007/978-1-4757-3990-9_8
Tuerlinckx, F., Rijmen, F., Molenberghs, G., Verbeke, G., Briggs, D., Van den Noorgate, W., Meulders, M., & De Boeck, P. (2004). Estimation and software. In De Boeck, P., & Wilson, M. (Eds.), Explanatory Item Response Models: A Generalized Linear and Nonlinear Approach. Springer.
Briggs, D. C. (2004). Evaluating SAT coaching: Gains, effects and self-selection. In Zwick, R. (Ed.), Rethinking the SAT: The Future of Standardized Testing in University Admissions. RoutledgeFalmer. https://doi.org/10.4324/9780203463932
Briggs, D. C. (2002). Test preparation programs: Impact. In Encyclopedia of Education (2nd ed.).
Reports & Working Papers (25)
Briggs, D. C., Carrasco, D., Martinez, S., Hopfenbeck, T., & Sandoval-Hernandez, A. (2025). Psychometric issues related to the PAES reporting scales. International Scientific Committee. 2nd Report.
Briggs, D. C., Cox, O., Student, S., & Whitfield, E. (2023). Teacher perspectives on the content-reference growth reporting prototype: Findings from interviews. Boulder, CO: Center for Assessment, Design, Research and Evaluation (CADRE), University of Colorado Boulder.
Cox, O., & Briggs, D. C. (2023). Development of a reading foundational skills learning progression. Boulder, CO: Center for Assessment, Design, Research and Evaluation (CADRE), University of Colorado Boulder.
Wellberg, S., Briggs, D. C., & Student, S. (2023). Big ideas in the understanding of fractions: A learning progression. Boulder, CO: Center for Assessment, Design, Research and Evaluation (CADRE), University of Colorado Boulder.
Hill, H. C., & Briggs, D. C. (2020). Education leaders’ knowledge of causal research design: A measurement challenge. Annenberg Institute at Brown University. EdWorkingPaper 20-298. https://doi.org/10.26300/vxt5-ws91
Atteberry, A., Briggs, D. C., LaCour, S., & Bibilos, C. (2015). Year 2 Denver ProComp evaluation report: Teacher retention and variability in bonus pay, 2001-02 through 2013-14. Center for Assessment, Design, Research and Evaluation (CADRE). Report for Denver Public Schools.
Briggs, D. C., Diaz-Bilello, E., Peck, F., Alzen, J., Chattergoon, R., & Johnson, R. (2015). Using a learning progression framework to assess and evaluate growth. Center for Assessment, Design, Research and Evaluation (CADRE). Working Paper.
Briggs, D. C., Dadey, N., & Kizil, R. C. (2015). Comparing student growth and teacher observation to principal judgments in the evaluation of teacher effectiveness. Center for Assessment, Design, Research and Evaluation (CADRE). Report for the Georgia Department of Education.
Briggs, D. C., Kizil, R. C., & Dadey, N. (2015). Adjusting mean growth percentiles for classroom composition. Center for Assessment, Design, Research and Evaluation (CADRE). Report for the Georgia Department of Education.
Briggs, D. C., Diaz-Bilello, E., Peck, F., Alzen, J., Chattergoon, R., & McClelland, A. (2014). Tier 3 student learning objective pilot: Documentation of pilot work and lessons learned in the 2013-2014 school year. Center for Assessment, Design, Research and Evaluation (CADRE). Report for Denver Public Schools.
Briggs, D. C., Diaz-Bilello, E., Maul, A., Turner, M., & Bibilos, C. (2014). Denver ProComp evaluation report: 2010-2012. Center for Assessment, Design, Research and Evaluation (CADRE) and the National Center for the Improvement of Educational Assessment.
Diaz-Bilello, E. K., & Briggs, D. C. (2014). Using student growth percentiles for educator evaluations at the teacher level: Key issues and technical considerations for school districts in Colorado. Center for Assessment and the Center for Assessment, Design, Research and Evaluation (CADRE).
Briggs, D., & Alzen, J. (2013). Does taking an online version of a course have a negative effect on student learning? An evaluation study. Commissioned by the University of Colorado’s Department of Continuing Education.
Alzen, J., Briggs, D., Whitcomb, J., Haug, C., Paterson, W., & Klopfenstein, K. (2012). An initial exploration of Colorado-trained teachers: Providing context for outcome-based teacher preparation program evaluation. Report Commissioned by the Colorado Department of Higher Education.
Alzen, J., Briggs, D., Whitcomb, J., Haug, C., Paterson, W., & Klopfenstein, K. (2012). Enhancing Colorado data systems: Linking teachers to preparation programs. Report Commissioned by the Colorado Department of Higher Education.
Briggs, D. C. (2011). Making inferences about growth and value-added: Design issues for the PARCC consortium. White Paper Commissioned by the PARCC Large-Scale Assessment Consortium.
Briggs, D. C., & Domingue, B. D. (2011). Due diligence and the evaluation of teachers: A review of the value-added analysis underlying the effectiveness rankings of Los Angeles Unified School District teachers by the Los Angeles Times. National Education Policy Center. http://nepc.colorado.edu/publication/due-diligence
Briggs, D. C., & Domingue, B. D. (2011). Hawaii school improvement growth model analysis: 2010 results. Report Commissioned by the Hawaii Department of Education.
Briggs, D. C., & Domingue, B. D. (2010). Hawaii school improvement growth model analysis: 2009 results and sensitivity analysis. Report Commissioned by the Hawaii Department of Education.
Gaertner, M., & Briggs, D. C. (2009). Detecting and addressing item parameter drift in IRT test equating contexts. Report Commissioned by the National Center for the Improvement of Educational Assessment.
Briggs, D. C. (2009). Preparation for college admissions exams. Report Commissioned by the National Association of College Admissions Counselors.
Briggs, D. C., & Weeks, J. P. (2009). Hawaii school improvement: Growth model analysis. Report Commissioned by the Hawaii Department of Education.
Briggs, D. C. (2008). The goals and uses of value-added models. Paper prepared for a workshop held by the Committee on Value-Added Methodology for Instructional Improvement, Program Evaluation and Educational Accountability, sponsored by the National Research Council and the National Academy of Education, Washington DC, November 13-14, 2008.
Commentaries & Reviews (16)
Briggs, D. C. (2024). Strive for measurement, set new standards, and try not to be evil. Commentary on Duolingo English Test responsible AI standards. Journal of Educational and Behavioral Statistics. https://doi.org/10.3102/10769986241238479
Briggs, D. C. (2021). Book review: A pragmatic perspective of measurement by David Torres Irribarra. Integrative Psychological and Behavioral Science. Online First. https://doi.org/10.1007/s12124-021-09635-7
Briggs, D. C. (2021). Commentary: Comment on college admissions tests and social responsibility. Educational Measurement: Issues and Practice. https://doi.org/10.1111/emip.12455
Briggs, D. C. (2017). Learning theory and psychometrics: Room for growth. Assessment in Education: Principles, Policy & Practice, 24(3), 351-358. https://doi.org/10.1080/0969594X.2017.1336987
Briggs, D. C. (2016). Can Campbell’s law be mitigated? In Braun, H. (Ed.), Meeting the Challenges to Measurement in an Era of Accountability. Routledge. NCME Book Series.
Briggs, D. C. (2013). Teacher evaluation as Trojan horse: The case for teacher-developed assessments. Measurement: Interdisciplinary Research and Perspectives, 11(1-2), 24-29. https://doi.org/10.1080/15366367.2013.784153
Briggs, D. C. (2010). Validate high stakes inferences by designing good experiments, not audit items. Measurement: Interdisciplinary Research and Perspectives, 8(4), 185-190.
Briggs, D. C. (2009). Review of “Charter schools in eight states: Effects on achievement, attainment, integration and competition” by Ron Zimmer, Brian Gill, Kevin Booker, Stephanie Lavertu, Tim Sass and John Witte. Education Policy Studies Laboratory. http://www.epicpolicy.org/thinktank/review-charter-schools-eight-states
Talbot, R., & Briggs, D. C. (2007). Does theory drive the items or do items drive the theory? Measurement: Interdisciplinary Research and Perspectives, 5(2-3), 205-208. https://doi.org/10.1080/15366360701492906
Briggs, D. C. (2007). Review of “State takeover, school restructuring, private management, and student achievement in Philadelphia” by Gill, Zimmer, Christman, and Blanc, and “School reform in Philadelphia” by Peterson. Education Policy Studies Laboratory. http://epsl.asu.edu/epru/epru_2007_thinktankreview.htm
Wiley, E., & Briggs, D. C. (2007). Can value-added assessment improve accountability? Education Views. University of Colorado at Boulder, School of Education, Winter 2007.
Briggs, D. C. (2007). Assessing what students know or how they know it? Measurement: Interdisciplinary Research and Perspectives, 5(1), 62-65. https://doi.org/10.1080/15366360701293618
Briggs, D. C. (2006). Review of “Getting farther ahead by staying behind: A second-year evaluation of Florida’s policy to end social promotion” by Jay Greene and Marcus Winters. Education Policy Studies Laboratory. http://epsl.asu.edu/epru/epru_2006_thinktankreview.htm
Briggs, D. C. (2006). Book review: The SAGE handbook of quantitative methods in the social sciences. Applied Psychological Methods, 30(5), 447-451.
Briggs, D. C. (2004). Comment: Making an argument for design validity before interpretive validity. Measurement: Interdisciplinary Research and Perspectives, 2(3), 171-174.
Briggs, D. C. (2002). Comment: Jack Kaplan’s new study of SAT coaching. Chance, 15(1), 7-8.
Marion, S., & Briggs, D. C. (2022). Just give us a little: Please make one small change in federal testing law to yield big improvements. National Center for the Improvement of Educational Assessment. July 13, 2022. https://www.nciea.org/blog/just-give-us-a-little
Briggs, D. C., & Marion, S. (2022). NEPC talks education: An interview with Derek Briggs and Scott Marion. National Education Policy Center (podcast). Interview, January 2022. https://open.spotify.com/episode/4OKNlwrl2fDizsAuI62XEo
Briggs, D. C. (2024). Strive for measurement, set new standards, and try not to be evil. Commentary on Duolingo English Test responsible AI standards. Journal of Educational and Behavioral Statistics. https://doi.org/10.3102/10769986241238479
Marion, S., & Briggs, D. C. (2022). Just give us a little: Please make one small change in federal testing law to yield big improvements. National Center for the Improvement of Educational Assessment. July 13, 2022. https://www.nciea.org/blog/just-give-us-a-little
Briggs, D. C., & Marion, S. (2022). NEPC talks education: An interview with Derek Briggs and Scott Marion. National Education Policy Center (podcast). Interview, January 2022. https://open.spotify.com/episode/4OKNlwrl2fDizsAuI62XEo
Briggs, D. C. (2021). Commentary: Comment on college admissions tests and social responsibility. Educational Measurement: Issues and Practice. https://doi.org/10.1111/emip.12455
Penuel, W. R., Briggs, D. C., Davidson, K. L., Herlihy, C., Sherer, D., Hill, H. C., Farrell, C., & Allen, A. (2017). How school and district leaders access, perceive, and use research. AERA Open, 3(2), 1-17. https://doi.org/10.1177/2332858417705370
Briggs, D. C., & Dadey, N. (2016). Principal holistic judgments and high-stakes evaluations of teachers. Educational Assessment, Evaluation and Accountability, 29, 155-178. https://doi.org/10.1007/s11092-016-9256-7
Briggs, D. C. (2016). Can Campbell’s law be mitigated? In Braun, H. (Ed.), Meeting the Challenges to Measurement in an Era of Accountability. Routledge. NCME Book Series.
Atteberry, A., Briggs, D. C., LaCour, S., & Bibilos, C. (2015). Year 2 Denver ProComp evaluation report: Teacher retention and variability in bonus pay, 2001-02 through 2013-14. Center for Assessment, Design, Research and Evaluation (CADRE). Report for Denver Public Schools.
Briggs, D. C., Diaz-Bilello, E., Peck, F., Alzen, J., Chattergoon, R., & McClelland, A. (2014). Tier 3 student learning objective pilot: Documentation of pilot work and lessons learned in the 2013-2014 school year. Center for Assessment, Design, Research and Evaluation (CADRE). Report for Denver Public Schools.
Diaz-Bilello, E. K., & Briggs, D. C. (2014). Using student growth percentiles for educator evaluations at the teacher level: Key issues and technical considerations for school districts in Colorado. Center for Assessment and the Center for Assessment, Design, Research and Evaluation (CADRE).
Alzen, J., Briggs, D., Whitcomb, J., Haug, C., Paterson, W., & Klopfenstein, K. (2012). Enhancing Colorado data systems: Linking teachers to preparation programs. Report Commissioned by the Colorado Department of Higher Education.
Briggs, D. C. (2011). Cause or effect? Validating the use of tests for high-stakes inferences in education. In Dorans, N. J., & Sinharay, S. (Eds.), Looking Back: Proceedings of a Conference in Honor of Paul W. Holland. New York, NY: Springer. https://doi.org/10.1007/978-1-4419-9389-2_8
Briggs, D. C., & Domingue, B. D. (2011). Due diligence and the evaluation of teachers: A review of the value-added analysis underlying the effectiveness rankings of Los Angeles Unified School District teachers by the Los Angeles Times. National Education Policy Center. http://nepc.colorado.edu/publication/due-diligence
Briggs, D. C., & Domingue, B. D. (2011). Hawaii school improvement growth model analysis: 2010 results. Report Commissioned by the Hawaii Department of Education.
Briggs, D. C. (2010). Two Philadelphia reports. In Welner, K., Hinchey, P., Molnar, A., & Weizman, D. (Eds.), Think Tank Research Quality: Lessons for Policymakers, the Media, and the Public. Information Age Publishing.
Briggs, D. C. (2010). Schools in eight states: Effects on achievement, attainment, integration, and competition. In Welner, K., Hinchey, P., Molnar, A., & Weizman, D. (Eds.), Think Tank Research Quality: Lessons for Policymakers, the Media, and the Public. Information Age Publishing.
Briggs, D. C., & Domingue, B. D. (2010). Hawaii school improvement growth model analysis: 2009 results and sensitivity analysis. Report Commissioned by the Hawaii Department of Education.
Briggs, D. C. (2010). Validate high stakes inferences by designing good experiments, not audit items. Measurement: Interdisciplinary Research and Perspectives, 8(4), 185-190.
Briggs, D. C. (2009). Preparation for college admissions exams. Report Commissioned by the National Association of College Admissions Counselors.
Briggs, D. C., & Weeks, J. P. (2009). Hawaii school improvement: Growth model analysis. Report Commissioned by the Hawaii Department of Education.
Briggs, D. C. (2009). Review of “Charter schools in eight states: Effects on achievement, attainment, integration and competition” by Ron Zimmer, Brian Gill, Kevin Booker, Stephanie Lavertu, Tim Sass and John Witte. Education Policy Studies Laboratory. http://www.epicpolicy.org/thinktank/review-charter-schools-eight-states
Briggs, D. C. (2008). The goals and uses of value-added models. Paper prepared for a workshop held by the Committee on Value-Added Methodology for Instructional Improvement, Program Evaluation and Educational Accountability, sponsored by the National Research Council and the National Academy of Education, Washington DC, November 13-14, 2008. .
Briggs, D. C. (2007). Review of “State takeover, school restructuring, private management, and student achievement in Philadelphia” by Gill, Zimmer, Christman, and Blanc, and “School reform in Philadelphia” by Peterson. Education Policy Studies Laboratory. http://epsl.asu.edu/epru/epru_2007_thinktankreview.htm
Wiley, E., & Briggs, D. C. (2007). Can value-added assessment improve accountability? Education Views. University of Colorado at Boulder, School of Education, Winter 2007.
Stern, D., & Briggs, D. (2001). Does paid employment help or hinder performance in secondary school? Insights from US high school students. Journal of Education and Work, 14(3), 355-372.
Stern, D., & Briggs, D. (2001). Changing admissions policies: Mounting pressures, new developments, more questions. Change, 33(1), 34-41.
Hill, H. C., & Briggs, D. C. (2020). Education leaders’ knowledge of causal research design: A measurement challenge. Annenberg Institute at Brown University. EdWorkingPaper 20-298. https://doi.org/10.26300/vxt5-ws91
Briggs, D. C. (2011). Cause or effect? Validating the use of tests for high-stakes inferences in education. In Dorans, N. J., & Sinharay, S. (Eds.), Looking Back: Proceedings of a Conference in Honor of Paul W. Holland. New York, NY: Springer. https://doi.org/10.1007/978-1-4419-9389-2_8
Domingue, B. W., & Briggs, D. C. (2009). Using linear regression and propensity score matching to estimate the effect of coaching on the SAT. Multiple Linear Regression Viewpoints, 35(1), 12-29.
Briggs, D. C. (2008). Synthesizing causal inferences. Educational Researcher, 37(1), 15-22.
Briggs, D. C., & Wiley, E. (2008). Causes and effects. In Ryan, K., & Shepard, L. (Eds.), The Future of Test-Based Educational Accountability. Routledge. https://doi.org/10.4324/9780203895092
Briggs, D. C. (2004). Causal inference and the Heckman model. Journal of Educational and Behavioral Statistics, 29(4), 397-420.
Briggs, D. C. (2002). SAT coaching, bias and causal inference. Dissertation Abstracts International. DAI-A 64/12, p. 4433 (UMI No. 3115515).
Diagnostic Assessment (3)
Briggs, D. C., & Kizil, R. C. (2017). Challenges to the use of artificial neural networks for diagnostic classifications with student test data. International Journal of Testing. https://doi.org/10.1080/15305058.2017.1297816
Briggs, D. C., & Alonzo, A. C. (2012). The psychometric modeling of ordered multiple-choice item responses for diagnostic assessment with a learning progression. In Alonzo, A., & Gotwals, A. (Eds.), Learning Progressions in Science (pp. 345-355). Sense Publishers.
Briggs, D., Alonzo, A., Schwab, C., & Wilson, M. (2006). Diagnostic assessment with ordered multiple-choice items. Educational Assessment, 11(1), 33-64. https://doi.org/10.1207/s15326977ea1101_2
Educational Measurement (11)
Briggs, D. C. (2026). Psychometrics. In Teo, T. (Ed.), The Palgrave Encyclopedia of Theoretical and Philosophical Psychology. Cham: Palgrave Macmillan. https://doi.org/10.1007/978-3-031-70581-6_115-1
Briggs, D. C. (2024). Strive for measurement, set new standards, and try not to be evil. Commentary on Duolingo English Test responsible AI standards. Journal of Educational and Behavioral Statistics. https://doi.org/10.3102/10769986241238479
Ackerman, T. A., Bandalos, D. L., Briggs, D. C., Everson, H. T., Ho, A. D., Lottridge, S. M., Madison, M. J., Sinharay, S., Rodriguez, M. C., Russell, M., von Davier, A. A., & Wind, S. A. (2023). Foundational competencies in educational measurement. Educational Measurement: Issues and Practice. https://doi.org/10.1111/emip.12581
Briggs, D. C. (2022). NCME presidential address 2022: Turning the page to the next chapter of educational measurement. Journal of Educational Measurement, 59(4), 398-417. https://doi.org/10.1111/jedm.12350
Briggs, D. C., & Marion, S. (2022). NEPC talks education: An interview with Derek Briggs and Scott Marion. National Education Policy Center podcast, January 2022. https://open.spotify.com/episode/4OKNlwrl2fDizsAuI62XEo
Briggs, D. C. (2021). Historical and conceptual foundations of measurement in the human sciences: Credos and controversies. New York, NY: Routledge.
Briggs, D. C. (2021). Book review: A pragmatic perspective of measurement by David Torres Irribarra. Integrative Psychological and Behavioral Science. Online First. https://doi.org/10.1007/s12124-021-09635-7
Briggs, D. C. (2017). Learning theory and psychometrics: Room for growth. Assessment in Education: Principles, Policy & Practice, 24(3), 351-358. https://doi.org/10.1080/0969594X.2017.1336987
Talbot, R., & Briggs, D. C. (2007). Does theory drive the items or do items drive the theory? Measurement: Interdisciplinary Research and Perspectives, 5(2-3), 205-208. https://doi.org/10.1080/15366360701492906
Briggs, D. C. (2007). Assessing what students know or how they know it? Measurement: Interdisciplinary Research and Perspectives, 5(1), 62-65. https://doi.org/10.1080/15366360701293618
Briggs, D. C. (2006). Book review: The SAGE handbook of quantitative methods in the social sciences. Applied Psychological Measurement, 30(5), 447-451.
Fairness (1)
Camilli, G., Briggs, D. C., Sloane, F., & Chiu, T.-W. (2013). Psychometric perspectives on test fairness: Shrinkage estimation. In APA Handbook of Testing and Assessment in Psychology, Volume 3: Testing and Assessment in School Psychology and Education.
Formative Assessment (5)
Briggs, D. C., McClure, K., Student, S., Wellberg, S., Minchen, N., Cox, O., Whitfield, E., Buchbinder, N., & Davis, L. (2025). Visualizing and reporting content-referenced growth on a learning progression. Educational Assessment, 1-23. https://doi.org/10.1080/10627197.2025.2503288
Briggs, D. C., Cox, O., Student, S., & Whitfield, E. (2023). Teacher perspectives on the content-referenced growth reporting prototype: Findings from interviews. Boulder, CO: Center for Assessment, Design, Research and Evaluation (CADRE), University of Colorado Boulder.
Briggs, D. C., Chattergoon, R., & Burkhardt, A. (2019). Examining the dual purpose use of student learning objectives for classroom assessment and teacher evaluation. Journal of Educational Measurement. https://doi.org/10.1111/jedm.12233
Briggs, D. C., Ruiz-Primo, M. A., Furtak, E., Shepard, L., & Yin, Y. (2012). Meta-analytic methodology and conclusions about the efficacy of formative assessment. Educational Measurement: Issues and Practice, 13-17. https://doi.org/10.1111/j.1745-3992.2012.00251.x
Growth Modeling (9)
Briggs, D. C., & Peck, F. A. (2015). Using learning progressions to design vertical scales that support coherent inferences about student growth. Measurement: Interdisciplinary Research & Perspectives, 13, 75-99. https://doi.org/10.1080/15366367.2015.1042814
Briggs, D. C., Dadey, N., & Kizil, R. C. (2015). Comparing student growth and teacher observation to principal judgments in the evaluation of teacher effectiveness. Center for Assessment, Design, Research and Evaluation (CADRE). Report for the Georgia Department of Education.
Briggs, D. C., Kizil, R. C., & Dadey, N. (2015). Adjusting mean growth percentiles for classroom composition. Center for Assessment, Design, Research and Evaluation (CADRE). Report for the Georgia Department of Education.
Diaz-Bilello, E. K., & Briggs, D. C. (2014). Using student growth percentiles for educator evaluations at the teacher level: Key issues and technical considerations for school districts in Colorado. Center for Assessment and the Center for Assessment, Design, Research and Evaluation (CADRE).
Briggs, D. C., & Domingue, B. (2013). The gains from vertical scaling. Journal of Educational and Behavioral Statistics, 38(6), 551-576. https://doi.org/10.3102/1076998613508317
Briggs, D. C. (2013). Measuring growth with vertical scales. Journal of Educational Measurement, 50(2), 204-226. https://doi.org/10.1111/jedm.12011
Briggs, D. C., & Domingue, B. W. (2011). Hawaii school improvement growth model analysis: 2010 results. Report Commissioned by the Hawaii Department of Education.
Briggs, D. C., & Domingue, B. W. (2010). Hawaii school improvement growth model analysis: 2009 results and sensitivity analysis. Report Commissioned by the Hawaii Department of Education.
Briggs, D. C., & Weeks, J. P. (2009). Hawaii school improvement: Growth model analysis. Report Commissioned by the Hawaii Department of Education.
History and Philosophy of Measurement (3)
Briggs, D. C. (2021). Historical and conceptual foundations of measurement in the human sciences: Credos and controversies. New York, NY: Routledge.
Briggs, D. C. (2021). A history of scaling and its relationship to measurement. In Clauser, B. (Ed.), A History of Educational Measurement. New York, NY: Routledge.
Briggs, D. C. (2021). Book review: A pragmatic perspective of measurement by David Torres Irribarra. Integrative Psychological and Behavioral Science. Online First. https://doi.org/10.1007/s12124-021-09635-7
Item Response Theory (6)
Briggs, D. C., & Alonzo, A. C. (2012). The psychometric modeling of ordered multiple-choice item responses for diagnostic assessment with a learning progression. In Alonzo, A., & Gotwals, A. (Eds.), Learning Progressions in Science (pp. 345-355). Sense Publishers.
Gaertner, M., & Briggs, D. C. (2009). Detecting and addressing item parameter drift in IRT test equating contexts. Report Commissioned by the National Center for the Improvement of Educational Assessment.
Briggs, D. C. (2008). Using explanatory item response models to analyze group differences in science achievement. Applied Measurement in Education, 21(2), 89-118. https://doi.org/10.1080/08957340801926086
Rijmen, F., & Briggs, D. C. (2004). Multiple person dimensions and latent item predictors. In De Boeck, P., & Wilson, M. (Eds.), Explanatory Item Response Models: A Generalized Linear and Nonlinear Approach. Springer. https://doi.org/10.1007/978-1-4757-3990-9_8
Tuerlinckx, F., Rijmen, F., Molenberghs, G., Verbeke, G., Briggs, D., Van den Noortgate, W., Meulders, M., & De Boeck, P. (2004). Estimation and software. In De Boeck, P., & Wilson, M. (Eds.), Explanatory Item Response Models: A Generalized Linear and Nonlinear Approach. Springer.
Briggs, D. C., & Wilson, M. (2003). An introduction to multidimensional measurement using Rasch models. Journal of Applied Measurement, 4(1), 87-100.
Large-scale Assessment (5)
Briggs, D. C., Carrasco, D., Martinez, S., Hopfenbeck, T., & Sandoval-Hernandez, A. (2025). Psychometric issues related to the PAES reporting scales. International Scientific Committee. 2nd Report.
Briggs, D. C. (2024). The past, present, and future of large-scale assessment consortia. Educational Measurement: Issues and Practice, 43, 62-72. https://doi.org/10.1111/emip.12634
Marion, S., & Briggs, D. C. (2022). Just give us a little: Please make one small change in federal testing law to yield big improvements. National Center for the Improvement of Educational Assessment. July 13, 2022. https://www.nciea.org/blog/just-give-us-a-little
Briggs, D. C. (2012). Making value-added inferences from large-scale assessments. In Simon, M., Ercikan, K., & Rousseau, M. (Eds.), Improving Large-Scale Assessment in Education: Theory, Issues and Practice. London: Routledge. https://doi.org/10.4324/9780203154519
Briggs, D. C. (2011). Making inferences about growth and value-added: Design issues for the PARCC consortium. White Paper Commissioned by the PARCC Large-Scale Assessment Consortium.
Learning Progressions (11)
Briggs, D. C., McClure, K., Student, S., Wellberg, S., Minchen, N., Cox, O., Whitfield, E., Buchbinder, N., & Davis, L. (2025). Visualizing and reporting content-referenced growth on a learning progression. Educational Assessment, 1-23. https://doi.org/10.1080/10627197.2025.2503288
Cox, O., & Briggs, D. C. (2023). Development of a reading foundational skills learning progression. Boulder, CO: Center for Assessment, Design, Research and Evaluation (CADRE), University of Colorado Boulder.
Wellberg, S., Briggs, D. C., & Student, S. (2023). Big ideas in the understanding of fractions: A learning progression. Boulder, CO: Center for Assessment, Design, Research and Evaluation (CADRE), University of Colorado Boulder.
Peck, F., Johnson, R., Briggs, D. C., & Alzen, J. (2021). Toward learning trajectory-based instruction: A framework of conceptions of learning and assessment. School Science and Mathematics, 121, 357-368. https://doi.org/10.1111/ssm.12489
Briggs, D. C., & Furtak, E. M. (2019). Learning progressions and embedded assessment. In Brookhart, S., & McMillan, J. (Eds.), Classroom Assessment and Educational Measurement. Routledge. NCME Book Series. https://doi.org/10.4324/9780429507533-9
Briggs, D. C., & Peck, F. A. (2015). Rejoinder to commentaries on using learning progressions to design vertical scales. Measurement: Interdisciplinary Research and Perspectives, 13(3-4), 206-218. https://doi.org/10.1080/15366367.2015.1104113
Briggs, D. C., & Peck, F. A. (2015). Using learning progressions to design vertical scales that support coherent inferences about student growth. Measurement: Interdisciplinary Research & Perspectives, 13, 75-99. https://doi.org/10.1080/15366367.2015.1042814
Briggs, D. C., Diaz-Bilello, E., Peck, F., Alzen, J., Chattergoon, R., & Johnson, R. (2015). Using a learning progression framework to assess and evaluate growth. Center for Assessment, Design, Research and Evaluation (CADRE). Working Paper.
Briggs, D. C. (2012). Making progress in the modeling of learning progressions. In Alonzo, A., & Gotwals, A. (Eds.), Learning Progressions in Science (pp. 293-316). Sense Publishers. https://doi.org/10.1007/978-94-6091-824-7_15
Briggs, D. C., & Alonzo, A. C. (2012). The psychometric modeling of ordered multiple-choice item responses for diagnostic assessment with a learning progression. In Alonzo, A., & Gotwals, A. (Eds.), Learning Progressions in Science (pp. 345-355). Sense Publishers.
Meta-analysis (4)
Briggs, D. C., Ruiz-Primo, M. A., Furtak, E., Shepard, L., & Yin, Y. (2012). Meta-analytic methodology and conclusions about the efficacy of formative assessment. Educational Measurement: Issues and Practice, 13-17. https://doi.org/10.1111/j.1745-3992.2012.00251.x
Dadey, N., & Briggs, D. C. (2012). A meta-analysis of growth trends from vertically scaled assessments. Practical Assessment, Research & Evaluation, 17(14). http://pareonline.net/getvn.asp?v=17&n=14
Furtak, E. M., Seidel, T., Iverson, H., & Briggs, D. C. (2012). Experimental and quasi-experimental studies of inquiry-based science teaching: A meta-analysis. Review of Educational Research, 300-329. https://doi.org/10.3102/0034654312457206
Rasch Model (2)
Student, S. R., Briggs, D. C., & Davis, L. (2025). Growth across grades and common item grade alignment in vertical scaling using the Rasch model. Educational Measurement: Issues and Practice. https://doi.org/10.1111/emip.12639
Briggs, D. C., & Wilson, M. (2003). An introduction to multidimensional measurement using Rasch models. Journal of Applied Measurement, 4(1), 87-100.
Science Education (3)
Furtak, E. M., Seidel, T., Iverson, H., & Briggs, D. C. (2012). Experimental and quasi-experimental studies of inquiry-based science teaching: A meta-analysis. Review of Educational Research, 300-329. https://doi.org/10.3102/0034654312457206
Ruiz-Primo, M., Briggs, D. C., Iverson, H., Talbot, R., & Shepard, L. (2011). Impact of undergraduate science course innovations on learning. Science, 331, 1269-1270. https://doi.org/10.1126/science.1198976
Ruiz-Primo, M. A., Briggs, D., Shepard, L., Iverson, H., & Huchton, M. (2008). Evaluating the impact of instructional innovations in engineering education. In Duque, M. (Ed.), Engineering Education for the XXI Century: Foundations, Strategies and Cases (pp. 241-274). Bogotá, Colombia: ACOFI Publications.
Teacher Evaluation (12)
Briggs, D. C., Chattergoon, R., & Burkhardt, A. (2019). Examining the dual purpose use of student learning objectives for classroom assessment and teacher evaluation. Journal of Educational Measurement. https://doi.org/10.1111/jedm.12233
Briggs, D. C., & Alzen, J. L. (2019). Making inferences about teacher observation scores over time. Educational and Psychological Measurement. https://doi.org/10.1177/0013164419826237
Briggs, D. C., & Dadey, N. (2016). Principal holistic judgments and high-stakes evaluations of teachers. Educational Assessment, Evaluation and Accountability, 29, 155-178. https://doi.org/10.1007/s11092-016-9256-7
Atteberry, A., Briggs, D. C., LaCour, S., & Bibilos, C. (2015). Year 2 Denver ProComp evaluation report: Teacher retention and variability in bonus pay, 2001-02 through 2013-14. Center for Assessment, Design, Research and Evaluation (CADRE). Report for Denver Public Schools.
Briggs, D. C., Dadey, N., & Kizil, R. C. (2015). Comparing student growth and teacher observation to principal judgments in the evaluation of teacher effectiveness. Center for Assessment, Design, Research and Evaluation (CADRE). Report for the Georgia Department of Education.
Briggs, D. C., Diaz-Bilello, E., Peck, F., Alzen, J., Chattergoon, R., & McClelland, A. (2014). Tier 3 student learning objective pilot: Documentation of pilot work and lessons learned in the 2013-2014 school year. Center for Assessment, Design, Research and Evaluation (CADRE). Report for Denver Public Schools.
Briggs, D. C., Diaz-Bilello, E., Maul, A., Turner, M., & Bibilos, C. (2014). Denver ProComp evaluation report: 2010-2012. Center for Assessment, Design, Research and Evaluation (CADRE) and the National Center for the Improvement of Educational Assessment.
Briggs, D., & Alzen, J. (2013). Does taking an online version of a course have a negative effect on student learning? An evaluation study. Commissioned by the University of Colorado’s Department of Continuing Education.
Briggs, D. C. (2013). Teacher evaluation as Trojan horse: The case for teacher-developed assessments. Measurement: Interdisciplinary Research and Perspectives, 11(1-2), 24-29. https://doi.org/10.1080/15366367.2013.784153
Alzen, J., Briggs, D., Whitcomb, J., Haug, C., Paterson, W., & Klopfenstein, K. (2012). An initial exploration of Colorado-trained teachers: Providing context for outcome-based teacher preparation program evaluation. Report Commissioned by the Colorado Department of Higher Education.
Alzen, J., Briggs, D., Whitcomb, J., Haug, C., Paterson, W., & Klopfenstein, K. (2012). Enhancing Colorado data systems: Linking teachers to preparation programs. Report Commissioned by the Colorado Department of Higher Education.
Briggs, D. C., & Domingue, B. W. (2011). Due diligence and the evaluation of teachers: A review of the value-added analysis underlying the effectiveness rankings of Los Angeles Unified School District teachers by the Los Angeles Times. National Education Policy Center. http://nepc.colorado.edu/publication/due-diligence
Test Preparation (8)
Briggs, D. C. (2021). Commentary: Comment on college admissions tests and social responsibility. Educational Measurement: Issues and Practice. https://doi.org/10.1111/emip.12455
Domingue, B. W., & Briggs, D. C. (2009). Using linear regression and propensity score matching to estimate the effect of coaching on the SAT. Multiple Linear Regression Viewpoints, 35(1), 12-29.
Briggs, D. C. (2004). Evaluating SAT coaching: Gains, effects and self-selection. In Zwick, R. (Ed.), Rethinking the SAT: The Future of Standardized Testing in University Admissions. RoutledgeFalmer. https://doi.org/10.4324/9780203463932
Briggs, D. C. (2002). SAT coaching, bias and causal inference. Dissertation Abstracts International. DAI-A 64/12, p. 4433 (UMI No. 3115515).
Briggs, D. C. (2002). Test preparation programs: Impact. In Encyclopedia of Education (2nd ed.).
Briggs, D. C. (2002). Comment: Jack Kaplan’s new study of SAT coaching. Chance, 15(1), 7-8.
Briggs, D. C. (2001). The effect of admissions test preparation: Evidence from NELS-88. Chance, 14(1), 10-18.
Stern, D., & Briggs, D. (2001). Changing admissions policies: Mounting pressures, new developments, more questions. Change, 33(1), 34-41.
Validity (3)
Briggs, D. C. (2011). Cause or effect? Validating the use of tests for high-stakes inferences in education. In Dorans, N. J., & Sinharay, S. (Eds.), Looking Back: Proceedings of a Conference in Honor of Paul W. Holland. New York, NY: Springer. https://doi.org/10.1007/978-1-4419-9389-2_8
Briggs, D. C. (2010). Validate high stakes inferences by designing good experiments, not audit items. Measurement: Interdisciplinary Research and Perspectives, 8(4), 185-190.
Briggs, D. C. (2004). Comment: Making an argument for design validity before interpretive validity. Measurement: Interdisciplinary Research and Perspectives, 2(3), 171-174.
Value-added Modeling (8)
Briggs, D., & Domingue, B. (2014). Value-added to what? The paradox of multidimensionality. In Lissitz, R. (Ed.), Value-added Modeling and Growth Modeling with Particular Application to Teacher and School Effectiveness. Charlotte, NC: Information Age Publishing.
Briggs, D. C. (2012). Making value-added inferences from large-scale assessments. In Simon, M., Ercikan, K., & Rousseau, M. (Eds.), Improving Large-Scale Assessment in Education: Theory, Issues and Practice. London: Routledge. https://doi.org/10.4324/9780203154519
Briggs, D. C., & Weeks, J. P. (2011). The persistence of value-added school effects. Journal of Educational and Behavioral Statistics, 36(5), 616-637.
Briggs, D. C. (2011). Making inferences about growth and value-added: Design issues for the PARCC consortium. White Paper Commissioned by the PARCC Large-Scale Assessment Consortium.
Briggs, D. C., & Domingue, B. W. (2011). Due diligence and the evaluation of teachers: A review of the value-added analysis underlying the effectiveness rankings of Los Angeles Unified School District teachers by the Los Angeles Times. National Education Policy Center. http://nepc.colorado.edu/publication/due-diligence
Briggs, D. C., & Weeks, J. P. (2009). The sensitivity of value-added modeling to the creation of a vertical scale. Education Finance & Policy, 4(4), 384-414. https://doi.org/10.1162/edfp.2009.4.4.384
Briggs, D. C. (2008). The goals and uses of value-added models. Paper prepared for a workshop held by the Committee on Value-Added Methodology for Instructional Improvement, Program Evaluation and Educational Accountability, sponsored by the National Research Council and the National Academy of Education, Washington, DC, November 13-14, 2008.
Wiley, E., & Briggs, D. C. (2007). Can value-added assessment improve accountability? Education Views. University of Colorado at Boulder, School of Education, Winter 2007.
Vertical Scaling (8)
Student, S. R., Briggs, D. C., & Davis, L. (2025). Growth across grades and common item grade alignment in vertical scaling using the Rasch model. Educational Measurement: Issues and Practice. https://doi.org/10.1111/emip.12639
Briggs, D. C., & Peck, F. A. (2015). Rejoinder to commentaries on using learning progressions to design vertical scales. Measurement: Interdisciplinary Research and Perspectives, 13(3-4), 206-218. https://doi.org/10.1080/15366367.2015.1104113
Briggs, D. C., & Peck, F. A. (2015). Using learning progressions to design vertical scales that support coherent inferences about student growth. Measurement: Interdisciplinary Research & Perspectives, 13, 75-99. https://doi.org/10.1080/15366367.2015.1042814
Briggs, D. C., & Dadey, N. (2015). Making sense of common test items that do not get easier over time: Implications for vertical scale designs. Educational Assessment, 20(1), 1-22.
Briggs, D. C., & Domingue, B. (2013). The gains from vertical scaling. Journal of Educational and Behavioral Statistics, 38(6), 551-576. https://doi.org/10.3102/1076998613508317
Briggs, D. C. (2013). Measuring growth with vertical scales. Journal of Educational Measurement, 50(2), 204-226. https://doi.org/10.1111/jedm.12011
Briggs, D. C., & Weeks, J. P. (2009). The sensitivity of value-added modeling to the creation of a vertical scale. Education Finance & Policy, 4(4), 384-414. https://doi.org/10.1162/edfp.2009.4.4.384
Briggs, D. C., & Weeks, J. P. (2009). The impact of vertical scaling decisions on growth interpretations. Educational Measurement: Issues & Practice, 28(4), 3-14. https://doi.org/10.1111/j.1745-3992.2009.00158.x
Presentations
Invited Talks & Keynotes (26)
Briggs, D. C. (2025). On the Nature of Measurement. Presentation at the annual meeting of the National Council on Measurement in Education, April 25, 2025.
Briggs, D. C. (2025). Is Unaccounted-for Multidimensionality Really That Problematic? Organized Discussion at the annual meeting of the National Council on Measurement in Education, April 24, 2025.
Briggs, D. C. (2024). Scaling and its Relationship with Measurement: Past, Present and Future. Invited Keynote Presentation, Society for the Study of Measurement, Berkeley, CA, August 7, 2024.
Briggs, D. C. (2024). Content-Referenced Growth. Keynote Address for Michigan State Testing Conference, February 15, 2024.
Briggs, D. C. (2024). Reflections on educational accountability and assessment. Invited Womer Lecture, University of Michigan, February 14, 2024.
Briggs, D. C. (2023). Some lessons about measurement and measurers. Keynote Address for the International Objective Measurement Workshop, Chicago, IL, April 12, 2023.
Briggs, D. C. (2022). Content-referenced growth. Keynote address for the annual meeting of the Association for Educational Assessment-Europe, Dublin, November 12, 2022.
Briggs, D. C. (2022). Historical Foundations and New Frontiers: Units of Measurement. Keynote address for Division 5 of the American Psychological Association, Minneapolis, MN, August 4, 2022.
Briggs, D. C. (2022). Turning the page to the next chapter of educational measurement. Presidential Address at the annual meeting of the National Council on Measurement in Education, April 23, 2022. https://www.youtube.com/watch?v=1jgUiZ2LUv8
Briggs, D. C. (2021). The use of content-referencing to evaluate the magnitude of student growth. Keynote address at the 17th conference of the Israeli Psychometric Association, January 27, 2021 (virtual).
Briggs, D. C. (2020). A general factor, neural bonds, and the next generation of science standards. Meeting of Testing Issues in Large-Scale Assessment (TILSA) group during meeting of the Council of Chief State School Officers, New Orleans, LA, February 18, 2020.
Briggs, D. C. (2019). The role of measurement in evaluating the practical significance of learning outcomes. Invited presentation, Stanford University Graduate School of Education, May 23, 2019.
Briggs, D. C. (2018). Visualizing location and growth: Design principles for person-item maps. Invited keynote presentation, Rasch Measurement Conference, University of Western Australia, Perth, Australia, January 18, 2018.
Briggs, D. C. (2017). Longitudinal growth models and classroom assessment. Invited presentation at North Carolina State University, School of Education, November 9, 2017.
Briggs, D. C. (2016). Psychometrics, measurement and obtainable goals. Invited presentation at the International Meeting of the Psychometric Society, Asheville, NC, July 13, 2016.
Briggs, D. C. (2015). Measuring Student Learning: Assessment 101. Invited presentation at The Aspen Institute, Aspen, CO, September 26, 2015.
Briggs, D. C. (2015). Psychometrics, Testing and Obtainable Goals. Invited debate at the National Council on Measurement in Education, April 19, 2015.
Briggs, D. C. (2015). Standardized Testing and Special Needs. Presentation at Chautauqua Education Series, Boulder, CO, March 18, 2015.
Briggs, D. C. (2013). An Economist, a Psychometrician and a Father of a Special Needs Child Walk into a School. Invited Womer Lecture, University of Michigan, February 19, 2013.
Briggs, D. C. (2013). Comparability Challenges Facing PARCC and SBAC. Invited presentation, annual meeting of the National Council on Measurement in Education, San Francisco, CA, April 30, 2013.
Briggs, D. C. (2013). NCME Hot Topics: Growth and Value-Added Modeling. Invited workshop presentation, annual meeting of the National Council on Measurement in Education, San Francisco, CA, April 27, 2013.
Briggs, D. C. (2013). The Impact of Coaching on College Admissions (sometimes even a small effect can matter!). Invited presentation at Seminario Internacional de Investigación sobre Calidad de la Educación, Bogotá, Colombia, November 8, 2013.
Briggs, D. C. (2010). Can We Use Large-Scale Assessments for both Summative and Formative Purposes? Invited presentation at the Reidy Interactive Lectures Series, Cambridge, MA, October 22, 2010.
Briggs, D. C. (2010). Rationales for measuring growth in student achievement: choosing between orthodoxy and pragmatism. Invited presentation at the BEAR Seminar, University of California, Berkeley, September 14, 2010.
Briggs, D. C. (2008). The Measurement of Learning Progressions. Invited talk at the University of Arizona, February 21, 2008.
Conference Presentations since 2015 (22)
Briggs, D. C. (2025). From Interval Scales to Scales with Intervals. Presentation at the International Meeting of the Psychometric Society, Minneapolis, MN, July 15, 2025.
Student, S., & Briggs, D. C. (2025). A self-contained empirical Bayes approach to weekly scoring. Paper presented at the annual meeting of the National Council on Measurement in Education, Denver, CO, April 26, 2025.
Briggs, D. C. (2023). Technicians, Curators or Guides on the Assessment Reform Journey? Preparing the Next Generation of Educational Measurement Professionals. Symposium presentation at the AEA-Europe Conference, Malta, November 3, 2023.
Briggs, D. C., Marion, S., & Sireci, S. (2023). Improving large-scale standardized testing. Coordinated symposium at the annual meeting of the American Educational Research Association, Chicago, IL, April 13, 2023.
Cox, O., & Briggs, D. C. (2023). Development of a reading foundational skills learning progression. Paper presented at the annual meeting of the National Council on Measurement in Education, Chicago, IL, April 13, 2023.
Briggs, D. C. (2021). Historical and conceptual foundation of measurement in the human sciences: credos and controversies. Presentation for the Institute of Behavioral Sciences, University of Colorado Boulder, November 1, 2021 (virtual).
Briggs, D. C. (2021). Some lessons learned from research on vertical scaling. Invited presentation for Associação Brasileira de Avaliação Educacional (ABAVE), September 27, 2021 (virtual).
Briggs, D. C. (2021). Historical lessons from the modeling and measuring of human abilities. Invited presentation for the 2021 annual conference of the National Council on Measurement in Education, May 18, 2021 (virtual).
Briggs, D. C. (2021). State plans for administering large-scale K-12 assessments in 2021. Education Congressional Staff Network Call coordinated by The Aspen Institute, March 19, 2021 (virtual).
Briggs, D. C. (2021). Discussion: Psychometrics and looming causal inferences. TILSA Panel coordinated by the Center for Assessment, February 23, 2021 (virtual).
Briggs, D. C., Maul, A., & McGrane, J. (2021). On the nature of measurement. Spotlight presentation at the International Objective Measurement Workshop, February 5, 2021 (virtual).
Briggs, D. C. (2020). Teaching and learning ‘educational measurement’: defining the discipline? Presentation for the annual meeting of the National Council on Measurement in Education, 2020 (virtual).
Briggs, D. C., & Furtak, E. M. (2019). Learning Progressions and NGSS. Presentation at the Special Conference for Classroom Assessment, Boulder, CO, September 18, 2019.
Briggs, D. C. (2019). Some thoughts on the origins of classical test theory. Presentation at the annual meeting of the National Council on Measurement in Education, Toronto, April 8, 2019.
Briggs, D. C., & Furtak, E. (2019). Learning progressions and embedded assessment. Presentation at the annual meeting of the National Council on Measurement in Education, Toronto, April 6, 2019.
Briggs, D. C. (2018). You can’t know where you’re going unless you find out where you’ve been. Invited presentation as part of a panel on ‘measurement problems’ inspired by Howard Wainer, annual meeting of the National Council on Measurement in Education, New York, NY, April 16, 2018.
Furtak, E. M., Briggs, D. C., & Chattergoon, R. (2018). Toward a system of classroom assessments for three-dimensional secondary science learning: the case of the Aspire study. Paper presented as part of the symposium ‘The Challenge of Assessing Knowledge in Use’ at the annual meeting of the International Conference of the Learning Sciences, London, UK, 2018.
Chattergoon, R., Briggs, D. C., Mahr, B., & Furtak, E. M. (2018). Developing a learning progression for the crosscutting concept of energy. Paper presented at the annual meeting of the National Council on Measurement in Education, New York, NY, April 17, 2018.
Briggs, D. C., & Alzen, J. L. (2018). The hidden facets of teacher growth. Paper presented at the annual meeting of the National Council on Measurement in Education, New York, NY, April 15, 2018.
Burkhardt, A., & Briggs, D. C. (2018). The state of district level interim assessments. Paper presented at the annual meeting of the National Council on Measurement in Education, New York, NY, April 15, 2018.
Briggs, D. C., Chattergoon, R., & Burkhardt, A. (2017). Examining the use of Student Learning Objectives to evaluate teachers. Paper presented at the annual meeting of the National Council on Measurement in Education, San Antonio, TX, April 29, 2017.
Briggs, D. C., Diaz-Bilello, E., Peck, F., Alzen, J., Chattergoon, R., & Johnson, R. (2015). Using a Learning Progression Framework to Assess and Evaluate Student Growth. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL, March 23, 2015.
Professional Service — Advisory Panels and Committees
State of Montana Assessment Technical Advisory Committee (2006–2017)
Gates Foundation, Measures of Effective Teaching Technical Advisory Panel (2010)
National Council on Measurement in Education, Brenda Loyd Dissertation Award Committee (2008–2011)
Expert Panel, Evaluating the Validity of English Language Proficiency Assessments (2009–2011)
Expert Panel, GSEG Consortia: Validity Evaluation, National Alternate Assessment Center (2008–2009)
Expert Panel, Growth Model Task Force, National Center for Learning Disabilities (2008)
Institutional Service — University of Colorado Boulder
Associate Dean of Faculty, School of Education, 2026–
Faculty Salary Equity Fellow, Office of the Provost, 2025–2026
Vice-Chancellor’s Advisory Committee (Promotion and Tenure Review), 2024
School of Education, Faculty Merit Review Committee, 2023–2025
School of Education, Research and Evaluation Methodology Program Chair, 2008–2019
Chair, Research and Evaluation Methodology Search Committees, 2008–2016 (multiple cycles)
Member, Dean Search Committee, 2014–2016
Coordinator of Bi-Weekly Research & Evaluation Methodology Seminar, 2003–2014
Graduate Student Recruitment Taskforce, 2006–2008
Science Education Search Committee, 2005–2006
Doctoral Curriculum Taskforce, 2003–2005
Manuscript Reviewing
AERA Open
American Educational Research Journal
Applied Psychological Measurement
Assessment in Education
Behavior Research Methods
British Journal of Mathematical and Statistical Psychology
Cognition and Instruction
Educational Evaluation and Policy Analysis
Educational Measurement: Issues and Practice
Educational Policy
Educational Researcher
Evaluation Review
International Journal of Testing
Journal of Educational and Behavioral Statistics
Journal of Educational Measurement
Journal of Experimental Child Psychology
Journal of Teacher Education
Multivariate Behavioral Research
Physical Review
Psychological Methods
Psychometrika
Review of Educational Research
Theory & Psychology
Routledge Publications
SAGE Publications
Professional Affiliations
National Academy of Education; American Educational Research Association; National Council on Measurement in Education; The Psychometric Society; Society for the Study of Measurement; National Education Policy Center.