This project responds to user hand gestures by dispatching mouse actions. It uses the MediaPipe gesture recognizer model to classify 7 different gestures and pyautogui to perform mouse movements and clicks.
To run this code, you need to install the following libraries: mediapipe, opencv-python, matplotlib, and pyautogui. You can install them using pip:
pip install mediapipe opencv-python matplotlib pyautogui
Then use Python to run either frame_by_frame_analysis.py or live_stream_analysis.py, for example:
python3 live_stream_analysis.py
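The actual scripts live in the repository; the following is only a minimal sketch of how a live-stream script of this kind typically works, assuming the stock MediaPipe gesture_recognizer.task model file and the default webcam (both assumptions, not taken from this project's code):

import cv2
import mediapipe as mp
from mediapipe.tasks import python as mp_python
from mediapipe.tasks.python import vision

# Load the pretrained gesture recognizer. The model file path is an
# assumption: download gesture_recognizer.task from the MediaPipe site
# if it is not already in the working directory.
base_options = mp_python.BaseOptions(model_asset_path="gesture_recognizer.task")
recognizer = vision.GestureRecognizer.create_from_options(
    vision.GestureRecognizerOptions(base_options=base_options)
)

cap = cv2.VideoCapture(0)  # default webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB images; OpenCV captures in BGR.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    result = recognizer.recognize(mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb))
    if result.gestures:
        # Top gesture for the first detected hand, e.g. "Thumb_Up".
        print(result.gestures[0][0].category_name)
    cv2.imshow("preview", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()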
Gesture controls (see the sketch below):
Closed fist: start dragging the mouse.
Open hand: stop dragging / mouse up.
Thumbs up: click.
Pointing up: mouse down.
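As a rough illustration of how these gestures can be translated into pyautogui calls, here is a hedged sketch. The label names are the stock MediaPipe gesture categories; the dispatch function and the x, y coordinates are hypothetical and not the project's actual code:

import pyautogui

# Hypothetical dispatcher: maps a recognized gesture label (stock MediaPipe
# category names) to a mouse action. x and y would come from a tracked hand
# landmark scaled to screen coordinates.
def dispatch(label, x, y):
    pyautogui.moveTo(x, y)      # mouse follows the hand
    if label == "Closed_Fist":
        pyautogui.mouseDown()   # hold the button down to drag
    elif label == "Open_Palm":
        pyautogui.mouseUp()     # release: stop dragging / mouse up
    elif label == "Thumb_Up":
        pyautogui.click()
    elif label == "Pointing_Up":
        pyautogui.mouseDown()   # press the button without moving

In practice a script like this would likely debounce repeated frames so that mouseDown or click is only issued once per gesture change rather than on every frame.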