- Katharine Meyer
Interactive, text message-based advising programs have become an increasingly common strategy to support college access and success for underrepresented student populations. Despite the proliferation of these programs, we know relatively little about how students engage with these text-based advising opportunities and whether that engagement relates to stronger student outcomes, factors that could help explain the relatively mixed evidence about their efficacy to date. In this paper, we use data from a large-scale, two-way text advising experiment focused on improving college completion to explore variation in student engagement using nuanced interaction metrics and automated text analysis techniques (i.e., natural language processing). We then explore whether student engagement patterns are associated with key outcomes including persistence, GPA, credit accumulation, and degree completion. Our results reveal substantial variation in engagement measures across students, indicating the importance of analyzing engagement as a multi-dimensional construct. Moreover, we find that many of these nuanced engagement measures have strong correlations with student outcomes, even after controlling for student baseline characteristics and academic performance. Especially as virtual advising interventions proliferate across higher education institutions, we show the value of applying a more codified, comprehensive lens for examining student engagement in these programs and chart a path to potentially improving their efficacy in the future.
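To make the idea of multi-dimensional engagement measures concrete, the sketch below computes a few simple interaction metrics (reply count, reply length, response latency) from a two-way message log. This is a minimal illustration only: the log schema, field names, and metric definitions here are hypothetical, not the study's actual data or measures.

```python
from datetime import datetime

# Hypothetical message log: (timestamp, sender, text).
# The schema and example messages are illustrative, not real study data.
messages = [
    ("2021-09-01 10:00", "advisor", "Have you registered for spring classes yet?"),
    ("2021-09-01 13:30", "student", "Not yet, I had a hold on my account."),
    ("2021-09-02 09:15", "advisor", "The registrar can clear that hold for you."),
    ("2021-09-02 09:40", "student", "Thanks! I will call them today."),
]

def engagement_metrics(log):
    """Compute simple, multi-dimensional engagement measures for one student."""
    fmt = "%Y-%m-%d %H:%M"
    student_msgs = [(datetime.strptime(t, fmt), txt)
                    for t, sender, txt in log if sender == "student"]
    advisor_times = [datetime.strptime(t, fmt)
                     for t, sender, _ in log if sender == "advisor"]

    # Response latency: hours from each advisor message to the next student reply.
    latencies = []
    for sent_at in advisor_times:
        replies = [reply_at for reply_at, _ in student_msgs if reply_at > sent_at]
        if replies:
            latencies.append((min(replies) - sent_at).total_seconds() / 3600)

    words = [w for _, txt in student_msgs for w in txt.split()]
    return {
        "n_replies": len(student_msgs),
        "avg_words_per_reply": len(words) / len(student_msgs) if student_msgs else 0.0,
        "avg_latency_hours": sum(latencies) / len(latencies) if latencies else None,
    }

print(engagement_metrics(messages))
```

Treating each of these quantities as a separate dimension, rather than collapsing them into a single "engaged/not engaged" flag, is what allows the kind of variation-in-engagement analysis the abstract describes; richer text-analysis features (e.g., topic or sentiment measures) would extend the same per-student dictionary.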
Despite documented benefits to college completion, more than a third of students who initially enroll in college do not ultimately earn a credential. Completing college requires students to navigate both institutional administrative tasks (e.g., registering for classes) and academic tasks within courses (e.g., completing homework). In postsecondary education, several promising interventions have shown that text-based outreach and communication can be a low-cost, easy-to-implement, and effective strategy for supporting administrative task navigation. In this paper, we report on a randomized controlled trial testing the effect of a text-based chatbot with artificial intelligence (AI) capability on students' academic task navigation. We find the academic chatbot significantly shifted students' final grades, increasing the likelihood that students received a course grade of B or higher by eight percentage points. We find large and significant treatment effects for first-generation students, estimating that the intervention increased their final course grades by about 11 points on a 100-point scale (and produced a 16 percentage point increase in earning a B or higher), as well as their completion of and performance on individual course deliverables (e.g., readings, activities, exams).