Projects
Researchers on this project will develop and validate an automated assessment of students' analytic writing in response to text. In prior work, the researchers studied an assessment of students' analytic writing to understand progress toward outcomes in the English Language Arts Common Core State Standards and to understand effective writing instruction by teachers. The researchers focused on response-to-text assessment because it is an essential skill for secondary and post-secondary success, because current assessments typically examine writing outside the context of responding to text, and because increased attention to analytic writing in schools can lead to improved instruction and interventions. Recent advances in artificial intelligence offer a potential way forward: automated scoring of students' analytic writing at scale, paired with automated feedback to improve both students' writing and teachers' instruction.
Homepage: eRevise
Funding: IES Award (with Rip Correnti and Lindsay Clare Matsumura)
Writing and revising are essential parts of learning, yet many college students graduate without demonstrating improvement or mastery
of academic writing. This project explores the feasibility of improving students' academic writing through a revision environment
that integrates natural language processing methods, best practices in data visualization and user interfaces, and current pedagogical theories.
First, a series of experiments is conducted to study interactions between students and variations of the revision writing environment, yielding data on students' revision behaviors.
Second, the collected data forms the gold standard for developing an end-to-end system that automatically extracts revisions
between student drafts and identifies the goal of each revision. The writing environment is iteratively refined, with interface prototypes evaluated through frequent user studies. Third, a complete end-to-end system that integrates the most successful component models is deployed in college-level writing classes.
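As a rough illustration of the revision-extraction step described above, the sketch below aligns sentences across two drafts with Python's difflib and assigns each change a coarse purpose label. The function names and the rule-based labels are illustrative assumptions, not the project's actual models, which are trained on the collected gold-standard data.

```python
# Hypothetical sketch: extract revisions between two essay drafts and
# assign each a coarse purpose label. Not the project's actual pipeline.
import difflib

def extract_revisions(draft1: str, draft2: str):
    """Align sentences from two drafts and return changed spans."""
    s1 = [s.strip() for s in draft1.split(".") if s.strip()]
    s2 = [s.strip() for s in draft2.split(".") if s.strip()]
    matcher = difflib.SequenceMatcher(a=s1, b=s2)
    revisions = []
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op == "equal":
            continue
        revisions.append({
            "op": op,                    # 'replace', 'delete', or 'insert'
            "old": " ".join(s1[i1:i2]),  # text present only in the first draft
            "new": " ".join(s2[j1:j2]),  # text present only in the second draft
        })
    return revisions

def label_revision_purpose(revision: dict) -> str:
    """Toy rule-based stand-in for a trained revision-purpose classifier."""
    old, new = revision["old"], revision["new"]
    if not old:
        return "content addition"
    if not new:
        return "content deletion"
    if abs(len(new) - len(old)) < 10:
        return "surface edit (wording/grammar)"
    return "content revision"

if __name__ == "__main__":
    d1 = "Pollution harms cities. We should act."
    d2 = "Pollution harms cities and damages public health. We should act. Studies support this claim."
    for rev in extract_revisions(d1, d2):
        print(rev["op"], "->", label_revision_purpose(rev))
```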
Homepage: ArgRewrite
Funding: NSF Award (with Amanda Godley and Rebecca Hwa)
In this project, the researchers will refine an existing mobile application, CourseMIRROR, for use in postsecondary STEM lecture courses. The application aims to deepen learning by encouraging students to reflect on course content and by giving them immediate feedback on their reflections. In large lecture courses, students' opportunities to reflect on course content and receive feedback on those reflections are often limited by class size and instructor availability. At the same time, instructors often lack access to students' reflections, so they cannot correct misunderstandings or build on class knowledge. By leveraging natural language processing and mobile learning technologies, CourseMIRROR aims to overcome these barriers and help students and instructors gain insight into what was and was not learned.
Homepage: CourseMIRROR
Funding: IES Award (with Muhsin Menekse and Ala Samarapungavan (Purdue))
Homepage: DiscussionTracker
Funding: NSF Award (with Amanda Godley)
In this project, we are interested in positioning a robot as a tutee that one or more human learners teach about a subject domain through spoken dialogue. The project builds on existing learning sciences work in teachable agents, human-robot interaction, spoken dialogue interaction, and collaborative learning by examining a more complex scenario in which multiple students collaborate with a robot through spoken interaction.
Funding: Internal LRDC grant (with Erin Walker and Tim Nokes-Malach)