Gaze-Based Elevator Interaction

As part of his Seminar in Pervasive Computing course, Carson played a significant role in authoring a paper on the development and prototyping of gaze-based selection as an input modality for the elevator interface, a fixture of buildings around the world.

In this paper, Carson's contributions included the relevant background research, the design of the interfaces, and the engineering behind the eye-tracking and GUI system. The system also made use of computer vision fundamentals, namely optical character recognition (OCR), and shared data over the local network via TCP. Additionally, Carson guided the design of the user study completed by other members of the team.
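The local-network data sharing mentioned above can be sketched as a minimal TCP exchange of gaze samples. This is an illustrative sketch only: the endpoint, port, and newline-delimited JSON message format are assumptions for demonstration, not the protocol used in the actual system.

```python
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 5555  # hypothetical local-network endpoint

received = []  # gaze samples collected by the receiving side

def receive_samples(server_sock):
    # Accept one connection and read newline-delimited JSON gaze
    # samples until the sender closes the connection.
    conn, _ = server_sock.accept()
    with conn, conn.makefile("r") as stream:
        for line in stream:
            received.append(json.loads(line))

# Receiver: a plain TCP server listening on the loopback interface.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind((HOST, PORT))
server.listen(1)
worker = threading.Thread(target=receive_samples, args=(server,))
worker.start()

# Sender: stream a few normalized (0..1) gaze coordinates as JSON lines.
client = socket.create_connection((HOST, PORT))
for x, y in [(0.42, 0.77), (0.45, 0.75)]:
    client.sendall((json.dumps({"x": x, "y": y}) + "\n").encode())
client.close()

worker.join()
server.close()
print(received)
```

Newline-delimited JSON keeps message framing trivial over TCP's byte stream; a real deployment would add reconnection handling and a schema for the gaze payload.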

Significant learning outcomes included understanding the background of gaze-based interaction research, gaze-based design concepts, the Pupil Labs Core platform, user-study development, and academic paper writing.

Below are the final paper for the project and a video demo of the system, which was built entirely by Carson.

PervasiveSeminar_Report_FinalSubmission.docx