Luke Gusukuma, Ph.D. - VCU College of Engineering. Richmond, VA, US

Luke Gusukuma, Ph.D.

Associate Professor | VCU College of Engineering


Luke teaches Computer Science (CS) and conducts research in CS education, focusing on the design and delivery of feedback and instruction.

Areas of Expertise (1)

Computer Science Education

Selected Articles (2)

Pedal: an infrastructure for automated feedback systems

Proceedings of the 51st ACM Technical Symposium on Computer Science Education

Luke Gusukuma, Austin Cory Bart, and Dennis Kafura


This paper describes Pedal, an innovative approach to the automated creation of feedback given to students in programming classes. Pedal is so named because it supports the PEDAgogical goals of instructors and is an expandable library of components motivated by these goals. Pedal currently comes with components for type inferencing, flow analysis, pattern matching, and unit testing to provide an instructor with a rich set of resources to use in authoring and prioritizing feedback. The larger vision is a loosely coupled architecture whose components can be readily expanded or replaced. The Pedal library components are motivated by a study of contemporary automated feedback systems and our own experience. Pedal's components are described, and examples are given of Pedal-based feedback from three different introductory classes at two different universities. The integration of Pedal into several programming and autograding environments is briefly described.
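The component-based design the abstract describes can be illustrated with a minimal, hypothetical sketch. This is not Pedal's actual API; the names `pattern_checker`, `unit_test_checker`, and `give_feedback` are invented for illustration. The idea shown is composing independent analysis components over a student submission and reporting the highest-priority finding:

```python
import ast

# Hypothetical sketch of a component-based feedback pipeline in the
# spirit of Pedal -- NOT Pedal's real API. Each checker analyzes the
# student's code and may return (priority, message); the runner
# reports the highest-priority message, mirroring how an instructor
# prioritizes feedback.

def pattern_checker(source):
    """Pattern matching: flag calls to eval(), a commonly discouraged construct."""
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and getattr(node.func, "id", "") == "eval":
            return (1, "Avoid eval(); parse the input explicitly instead.")
    return None

def unit_test_checker(source):
    """Unit testing: run the student's add() on a sample input."""
    namespace = {}
    exec(source, namespace)
    if namespace["add"](2, 3) != 5:
        return (2, "add(2, 3) should return 5 -- check your operator.")
    return None

def give_feedback(source, checkers):
    """Collect findings from every component; report the top-priority one."""
    findings = [r for c in checkers if (r := c(source)) is not None]
    if not findings:
        return "Great work -- all checks passed!"
    return min(findings)[1]  # lowest number = highest priority

student_code = "def add(a, b):\n    return a - b\n"
print(give_feedback(student_code, [pattern_checker, unit_test_checker]))
# prints "add(2, 3) should return 5 -- check your operator."
```

Because each checker is an independent function with a shared return contract, new analyses (e.g., flow analysis) can be added or swapped without touching the runner, which is the loosely coupled, expandable structure the abstract emphasizes.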


Misconception-driven feedback: Results from an experimental study

ICER '18: Proceedings of the 2018 ACM Conference on International Computing Education Research

Luke Gusukuma, Austin Cory Bart, Dennis Kafura, and Jeremy Ernst


The feedback given to novice programmers can be substantially improved by delivering advice focused on learners' cognitive misconceptions contextualized to the instruction. Building on this idea, we present Misconception-Driven Feedback (MDF); MDF uses a cognitive student model and program analysis to detect mistakes and uncover underlying misconceptions. To evaluate the impact of MDF on student learning, we performed a quasi-experimental study of novice programmers that compares conventional run-time and output check feedback against MDF over three semesters. Inferential statistics indicates MDF supports significantly accelerated acquisition of conceptual knowledge and practical programming skills. Additionally, we present descriptive analysis from the study indicating the MDF student model allows for complex analysis of student mistakes and misconceptions that can suggest improvements to the feedback, the instruction, and to specific students.
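The core MDF idea, mapping observed mistakes to the misconceptions likely behind them and responding with concept-level advice rather than raw error output, can be sketched as follows. This is a hypothetical illustration, not the authors' implementation; the mistake labels and the misconception table are invented:

```python
# Hypothetical sketch of misconception-driven feedback -- not the
# authors' implementation. A simple "student model" maps detected
# surface mistakes to the misconception likely behind them, so the
# feedback targets the underlying concept rather than the symptom.

MISCONCEPTIONS = {
    "equality_vs_assignment": (
        "Assignment vs. equality",
        "It looks like you used '=' where you meant '=='. In Python, "
        "'=' assigns a value, while '==' compares two values.",
    ),
    "loop_variable_reused": (
        "Accumulation in loops",
        "The loop variable is overwritten each iteration; use a separate "
        "accumulator variable to combine values across iterations.",
    ),
}

def misconception_feedback(detected_mistakes, model=MISCONCEPTIONS):
    """Return concept-level feedback for each detected mistake.

    detected_mistakes: mistake labels produced by some program
    analysis (e.g., an AST pass); unknown labels fall back to a
    generic hint rather than failing.
    """
    messages = []
    for mistake in detected_mistakes:
        concept, advice = model.get(
            mistake,
            ("General", "Re-read the problem statement and test your code."),
        )
        messages.append(f"[{concept}] {advice}")
    return messages

for line in misconception_feedback(["equality_vs_assignment"]):
    print(line)
```

The table doubles as a record of which misconceptions students trigger and how often, which is the kind of descriptive analysis the abstract says can suggest improvements to feedback and instruction.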
