Self-Moving Mouse
New Interaction test | 2024 | Berkeley, CA
Individual Work
The Self-Moving Mouse project explores a new approach to human-computer interaction: a remotely controllable mouse for users with mobility impairments or for situations where physically touching a mouse is impractical. The device uses Mecanum wheels for omnidirectional movement and integrates camera-based gesture recognition: an "OK" gesture steers the mouse, while a "fist" gesture stops it to prevent accidental activation. The project grew out of an interest in robotics and interactive design and served as a learning opportunity in Mecanum wheel drive and visual recognition. Although the application scenario is still being explored, the project demonstrates the potential of smart input devices and suggests new possibilities for assistive technology and remote control.

Original object and its intent:
The object I chose is a wireless mouse. Every time we use a mouse, we have to reach out and physically touch it; this is how almost all mice work today. However, a conventional mouse can be inconvenient or unusable for people who cannot grasp one, or on rough and uneven surfaces. This led me to consider whether a self-moving mouse could instead be controlled remotely or through camera recognition. The target users could be people who find a conventional mouse difficult to use (e.g., individuals with disabilities who could steer it by moving an arm or foot) or people in special situations, such as controlling a mouse remotely while giving a standing presentation.

Design exploration of the new object:
The initial idea was a remotely controlled, self-moving mouse. While considering how to make the small vehicle move, I thought of adding powered wheels, and after some research I found that Mecanum wheels can achieve omnidirectional movement on a flat surface without requiring the whole vehicle to rotate. During prototyping, time constraints forced me to purchase off-the-shelf Mecanum wheels, which turned out slightly larger than expected. I also designed a frame to hold the mouse, the Mecanum wheels, the battery, and the circuit boards. With this prototype, the device can be placed at a distance and controlled directly through the computer's built-in camera: moving the mouse this way moves the cursor on the screen. The camera's image-recognition code can later be modified to recognize other objects, making the system adaptable to different use cases. In the current demonstration, only the "OK" and "fist" gestures are recognized; all other gestures are ignored to prevent accidental activation. The fist gesture tells the system to stop, while moving the "OK" gesture up, down, left, or right in front of the camera moves the vehicle accordingly on the designated surface.
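The control flow described above can be sketched in a few lines of Python. This is an illustrative sketch, not the project's actual code: the gesture label is assumed to come from an upstream recognizer (for example, a hand-tracking model), the wheel-speed formula is the standard inverse kinematics for a four-Mecanum-wheel base, and the function names, axis conventions, and dimensions (`gesture_to_velocity`, `mecanum_wheel_speeds`, wheel radius, half-wheelbase) are my own assumptions.

```python
import math

def gesture_to_velocity(gesture, dx, dy, speed=0.2, dead_zone=20):
    """Map a recognized gesture and hand displacement (pixels) to a
    body-frame velocity command (vx forward, vy left), in m/s.

    'fist' always stops; any gesture other than 'ok' is also treated
    as stop, so stray hand shapes cannot trigger motion.
    """
    if gesture != "ok":
        return 0.0, 0.0
    if math.hypot(dx, dy) < dead_zone:  # ignore small jitter near the start point
        return 0.0, 0.0
    # Follow the dominant axis of hand motion only.
    if abs(dx) >= abs(dy):
        # Hand moved right in the image -> strafe right (assumed convention).
        return (0.0, -speed) if dx > 0 else (0.0, speed)
    # Image y grows downward, so dy < 0 means the hand moved up -> forward.
    return (speed, 0.0) if dy < 0 else (-speed, 0.0)

def mecanum_wheel_speeds(vx, vy, wz, r=0.03, lx=0.06, ly=0.05):
    """Inverse kinematics for a four-Mecanum-wheel base.

    Returns (front_left, front_right, rear_left, rear_right) wheel
    angular speeds in rad/s. r is the wheel radius; lx and ly are half
    the wheelbase and half the track width (example values, in metres).
    """
    k = lx + ly
    fl = (vx - vy - k * wz) / r
    fr = (vx + vy + k * wz) / r
    rl = (vx + vy - k * wz) / r
    rr = (vx - vy + k * wz) / r
    return fl, fr, rl, rr
```

One property worth noting: for pure forward motion all four wheels spin at the same speed, while pure sideways motion spins the two diagonal pairs in opposite directions, which is what lets a Mecanum base translate without rotating.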

Why you chose your final direction:
At the beginning, I had three ideas: a foldable, connectable paper cup holder; an AI-integrated aromatherapy diffuser; and a self-moving mouse. Since I have always been drawn to physical objects and robotics, I was most interested in making a physical object interact with a digital interface. The self-moving mouse was also the idea that would teach me the most, since Mecanum wheels and camera-based gesture recognition were both new to me.


Gesture Testing:
Demo Video