Priya Kannan

Senior Researcher, Assessment Research and Innovation

Overview

Priya Kannan is a renowned expert with over 15 years of experience leading foundational research and operational work in score reporting and standard setting. Her research has produced methodological innovations in standard setting and informed the development of an iterative framework for the design and evaluation of score reports. She has disseminated her work through numerous conference presentations, peer-reviewed journal articles, and book chapters.

In her current role at WestEd, Kannan leads score report design and evaluation work across multiple contracts and provides psychometric guidance, including leading instrument and survey design and development and conducting focus groups and cognitive laboratories. She also provides intellectual leadership in identifying areas for research innovation and supporting business development.

Kannan received her PhD in research methodology from the University of Pittsburgh. Prior to joining WestEd, she was a research scientist at ETS for 12 years, where her work informed the iterative design of wireframes and prototypes of operational score reports for ETS testing programs (e.g., GRE, TOEFL, Praxis). She has served on several committees for national organizations such as AERA and NCME, including as chair of NCME's Committee on Diversity Issues in Testing (CODIT), the Bradley Hanson Award Committee, the AERA Div-D Significant Contributions Committee, and the AERA Div-D Reception Committee. She has served on the editorial board of Practical Assessment, Research, and Evaluation and as an associate editor for the ETS Research Report Series. She received the 2023 Bradley Hanson Award for Contributions to Educational Measurement.

Education

  • PhD in research methodology, University of Pittsburgh
  • MA in industrial/organizational (I/O) psychology, Minnesota State University

Select Publications

Ferrara, S., Davis-Becker, S., Kannan, P., & Reynolds, K. (in press). Standard setting: A taxonomy of cognitive-judgmental tasks and implications for research and practice. In L. L. Cook & M. J. Pitoniak (Eds.), Educational measurement (5th ed.). Oxford University Press.  

Brown, G. T. L., Kannan, P., Sinharay, S., Zapata-Rivera, D., & Zenisky, A. L. (2023). Challenges and opportunities in score reporting: A panel of personal perspectives. Frontiers in Education, 8, 1211580. https://doi.org/10.3389/feduc.2023.1211580

Kannan, P. (2023). Score reporting: Design and evaluation methods informed by research. In R. J. Tierney, F. Rizvi, & K. Ercikan (Eds.), International encyclopedia of education (4th ed., pp. 217–229). Elsevier. https://doi.org/10.1016/B978-0-12-818630-5.10031-4

Kannan, P., & Zapata-Rivera, D. (2022). Facilitating the use of data from multiple sources for formative learning in the context of digital assessments: Informing the design and development of learning analytic dashboards. Frontiers in Education, 7, 913594. https://doi.org/10.3389/feduc.2022.913594

Kannan, P., Zapata-Rivera, D., & Bryant, A. D. (2021). Evaluating parent comprehension of measurement error information presented in score reports. Practical Assessment, Research, and Evaluation, 26(12). https://files.eric.ed.gov/fulltext/EJ1311163.pdf

Kannan, P., Zapata-Rivera, D., & Leibowitz, E. A. (2018). Interpretation of score reports by diverse subgroups of parents. Educational Assessment, 23(3), 173–194. https://doi.org/10.1080/10627197.2018.1477584

Kannan, P., Sgammato, A., & Tannenbaum, R. J. (2015). Evaluating the operational feasibility of using subsets of items to recommend minimal competency cut-scores. Applied Measurement in Education, 28(4), 292–307.

Kannan, P., Sgammato, A., Tannenbaum, R. J., & Katz, I. R. (2015). Evaluating the consistency of Angoff-based cut-scores using subsets of items within a generalizability theory framework. Applied Measurement in Education, 28(3), 169–186.

Tannenbaum, R. J., & Kannan, P. (2015). Consistency of Angoff-based standard-setting judgments: Are item judgments and passing scores replicable across different panels of experts? Educational Assessment, 20, 66–78.

Honors, Awards, and Affiliations

Bradley Hanson Award for Contributions to Educational Measurement, 2023

Associate Editor, ETS Research Report Series, 2019–2022

Chair, Div-D Planning and Reception Committee, AERA, 2022–2023

Editorial Board, Practical Assessment, Research, and Evaluation, 2020–Present

Chair, Div-D Significant Contributions Committee, AERA, 2020–2021

Chair, Bradley Hanson Award Committee, NCME, 2017–2018

Chair, Committee on Diversity Issues in Testing (CODIT), NCME, 2016–2017

ETS Spot Award for Operational Standard Setting Excellence, 2014
