James S. Kim
In a randomized trial that collects text as an outcome, traditional approaches for assessing treatment impact require that each document first be manually coded for constructs of interest by human raters. An impact analysis can then be conducted to compare treatment and control groups, using the hand-coded scores as a measured outcome. This process is both time- and labor-intensive, creating a persistent barrier to large-scale assessments of text. Furthermore, enriching one's understanding of a found impact on text outcomes via secondary analyses can be difficult without additional scoring efforts. Machine-based text analytic and data mining tools offer one potential avenue for facilitating research in this domain. For instance, we could augment a traditional impact analysis that examines a single human-coded outcome with a suite of automatically generated secondary outcomes. By analyzing impacts across a wide array of text-based features, we can then explore what an overall change signifies in terms of how the text has evolved due to treatment. In this paper, we propose several methods for supplementary analysis in this spirit. We then present a case study in which these methods enrich an evaluation of a classroom intervention on young children’s writing. We argue that our rich array of findings moves us from “it worked” to “it worked because” by revealing how observed improvements in writing were likely due, in part, to the students having learned to marshal evidence and speak with more authority. Relying exclusively on human scoring, by contrast, is a lost opportunity.
Parental text-messaging interventions are growing in popularity as a way to encourage at-home reading, school attendance, and other educational behaviors. These interventions, which often combine multiple components, frequently demonstrate varying effectiveness, and researchers often cannot determine how individual components work alone or in combination with one another. Using a 2x2x3 factorial experiment, we investigate the effects of individual and interacted components drawn from three behavioral levers to support summer reading: providing updated, personalized information; emphasizing different views of reading; and goal setting. We find that students in the personalized-information condition scored on average 0.03 SD higher on fall reading assessments. Texting effects on test scores were enhanced by messages that emphasized reading as useful for both entertainment and building skills, compared with skill building alone or entertainment alone. These results build our understanding that while text messages can be an effective tool for parent engagement, the specific content of the messages can lead to meaningful differences in the magnitude of the effects.