Abstract
This study presents a computational tool for evaluating student concentration and learning. Mixed findings on fidgeting's impact on learning raise questions about whether fidgeting accurately reflects concentration and learning outcomes. We compared humans and a newly developed algorithm (hereafter Youdao Algorithm, YA) on their judgments of fidgeting and concentration. Importantly, we wanted to know whether these judgments of concentration actually predict learning outcomes. We recorded 35 nine-year-olds completing a paper-based math test and the Attention Network Test (ANT) for 45 minutes. In Study 1, 35 adults watched the videos and rated each child's concentration, fidgeting, and math test performance on a scale of 1 to 5. Results revealed a consensus among adults regarding concentration and fidgeting, yet these ratings did not align with actual math test scores or performance on the ANT. In Study 2, we presented the same 35 videos to the YA. The YA's assessments diverged from those of a new group of adults (N=615); interestingly, the YA demonstrated superior accuracy in predicting actual academic performance. To test further, we had adults (N=685) evaluate the videos against altered backgrounds (classrooms and cafes). Despite the context changes, adults consistently agreed on concentration and learning assessments, replicating Studies 1 and 2. Overall, our studies show that humans share a consensus on what counts as fidgeting and concentration, but this perception does not accurately predict academic performance. Since teachers' perceptions impact students' outcomes, a non-human algorithm that more accurately captures fidgeting and concentration can be an immensely useful tool for educators.
Details
Presentation Type
Paper Presentation in a Themed Session
Theme
Keywords
Education, Fidget, Concentration, AI Algorithm, Academic Performance, Cognitive Control