
🎓 Bachelor Project

Technologies used include Python, TensorFlow, and Google MediaPipe. Read more about the project below.

Hand-Gesture-Based Interaction in Hybrid Meetings

My bachelor project, conducted at the IT University of Copenhagen, explored the potential of hand gestures to enhance the user experience in hybrid meetings. Traditional meeting controls require physical interaction with a device, which disrupts the flow of a meeting; this project aimed to create a more intuitive and efficient meeting environment.

The project began with a gesture elicitation study involving 102 university students, followed by A/B testing with 104 participants to refine a set of gestures for common meeting tasks. While the research revealed no universally recognized gestures, it identified popular choices for actions like volume control and camera toggling.

To demonstrate the concept's feasibility, a Minimum Viable Product (MVP) was developed using the MediaPipe framework and OpenCV. The MVP integrated the selected gestures with Zoom, letting users adjust meeting settings such as volume (with palm-up/palm-down gestures) and end the meeting with a wave (though it was a somewhat awkward wave). The project involved collecting just under 14 hours of video recordings, resulting in a dataset of 43.52 gigabytes.
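To give a flavour of how such a pipeline fits together, here is a minimal sketch using the MediaPipe Hands solution and OpenCV. It is not the thesis implementation: the `classify_palm` heuristic and the printed actions are hypothetical stand-ins, and the real MVP's gesture classification and Zoom integration may work quite differently.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands


def classify_palm(hand_landmarks):
    """Hypothetical heuristic: decide 'volume_up' vs 'volume_down' from
    whether the middle fingertip sits above or below the wrist in the image."""
    wrist = hand_landmarks.landmark[mp_hands.HandLandmark.WRIST]
    tip = hand_landmarks.landmark[mp_hands.HandLandmark.MIDDLE_FINGER_TIP]
    # Image y grows downwards, so a fingertip above the wrist has a smaller y.
    return "volume_up" if tip.y < wrist.y else "volume_down"


cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            action = classify_palm(results.multi_hand_landmarks[0])
            print(action)  # In the MVP this step would trigger a Zoom control instead.
        cv2.imshow("gesture demo", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```

The sketch only handles static poses; recognizing a dynamic gesture like the end-meeting wave would require tracking landmark motion across frames.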

The MVP's usability testing yielded positive feedback, particularly regarding the intuitiveness of the chosen gestures. Participants also provided valuable suggestions for improvement, especially around visual feedback during gesture execution.

Future work could focus on implementing dynamic gesture recognition, enhancing visual feedback, and expanding the gesture set to encompass a wider range of meeting controls. Ultimately, this project demonstrated the potential of hand-gesture-based controls to create smoother, more intuitive hybrid meeting experiences by minimizing the need for physical interaction with devices.

Interested in reading more? You can download the bachelor thesis here.