The interest in the TPACK framework has led to an upsurge in ways of measuring TPACK development. Matt, Tae Shin, and I recently published a survey paper on different ways of measuring TPACK; the title and abstract are given below.
I was particularly pleased with the title we came up with for the chapter. How often do you get a chance to reference one of your favorite poems in an academic article? [See here for the poem by Elizabeth Barrett Browning that served as an inspiration for our title.] Incidentally, there is another inside joke buried somewhere in the chapter – but I will leave that for you to discover 🙂
Title: Koehler, M. J., Shin, T. S., & Mishra, P. (2011). How do we measure TPACK? Let me count the ways. In R. N. Ronau, C. R. Rakes, & M. L. Niess (Eds.), Educational Technology, Teacher Knowledge, and Classroom Impact: A Research Handbook on Frameworks and Approaches. Hershey, PA: Information Science Reference.
Abstract: In this chapter we reviewed a wide range of approaches to measure Technological Pedagogical Content Knowledge (TPACK). We identified recent empirical studies that utilized TPACK assessments and determined whether they should be included in our analysis using a set of criteria. We then conducted a study-level analysis focusing on empirical studies that met our initial search criteria. In addition, we conducted a measurement-level analysis focusing on individual measures. Based on our measurement-level analysis, we categorized a total of 141 instruments into five types (i.e., self-report measures, open-ended questionnaires, performance assessments, interviews, and observations) and investigated how each measure addressed the issues of validity and reliability. We concluded our review by discussing limitations and implications of our study.
Incidentally, this handbook has a bunch of really interesting pieces – see here for more details about the book and the chapters.