A library for prototyping realtime hand detection (bounding box), directly in the browser.
A Flutter package that uses the camera to accurately track and recognize the motion paths/trajectories and gestures of all ten fingers, and outputs 22 hand keypoints to support additional custom gestures. On top of this package you can write business logic that converts gesture information into commands in real time: one, two, three, four, five, rock, spiderman... You can also attach different effects to different gestures. It can be used for short-video and live-streaming effects, smart hardware, and other fields, bringing a more natural and richer human-computer interaction experience.
A pre-trained YOLO-based hand detection network.
VRM hand tracking using MediaPipe.
Packages Google MediaPipe hand and holistic tracking into a dynamic link library.
HolisticMotionCapture is an application and package that captures a person's motion with only a monocular color camera and drives a VRM avatar's pose, face, and hands.
Oculus Quest hand tracking directly in Unity Editor for fast iteration.
HolisticBarracuda is a Unity package that simultaneously estimates 33 pose landmarks, 21 landmarks per hand, and 468 facial landmarks on Unity Barracuda with GPU acceleration.
A collection of Hand Interaction Experiments using Unity and Ultraleap hand tracking
A Python wrapper for MediaPipe's Multi-Hand Tracking; a minimal sketch of the underlying MediaPipe API follows below.
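As a rough illustration of what such wrappers build on, here is a minimal sketch of detecting hand landmarks from a single webcam frame with the MediaPipe Hands Python API. This is not the wrapper's own interface; parameter values and the use of OpenCV for capture are illustrative assumptions.

```python
# Minimal sketch: hand landmark detection on one webcam frame with MediaPipe Hands.
# Assumes the `mediapipe` and `opencv-python` packages are installed.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.5)

cap = cv2.VideoCapture(0)      # open the default webcam
ok, frame = cap.read()         # grab a single BGR frame
if ok:
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB input
    results = hands.process(rgb)
    if results.multi_hand_landmarks:
        # Each detected hand yields 21 normalized (x, y, z) landmarks.
        for hand in results.multi_hand_landmarks:
            tip = hand.landmark[8]                # index fingertip
            print(f"index fingertip at ({tip.x:.2f}, {tip.y:.2f})")
cap.release()
hands.close()
```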
Add touch-like hand gestures to your app; based on Handtrack.js + Hammer.js.
Uses the live webcam feed to build a fully functional virtual mouse; a sketch of the cursor-mapping idea follows below.
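A minimal sketch of one way a virtual mouse can map a tracked fingertip to the screen cursor, assuming normalized (0..1) landmark coordinates from a hand tracker (e.g. the MediaPipe sketch above) and the pyautogui library; the smoothing factor is an illustrative choice, not the project's actual implementation.

```python
# Minimal sketch: mapping a normalized fingertip position to the screen cursor.
# Assumes pyautogui is installed; tip_x/tip_y come from a hand tracker in 0..1 range.
import pyautogui

screen_w, screen_h = pyautogui.size()
_prev = None  # last cursor position, kept for smoothing

def move_cursor(tip_x: float, tip_y: float, smoothing: float = 0.3):
    """Map a normalized fingertip position to screen pixels and move the cursor."""
    global _prev
    x, y = tip_x * screen_w, tip_y * screen_h
    if _prev is not None:
        px, py = _prev
        # Simple exponential smoothing to reduce jitter from the tracker.
        x, y = px + (x - px) * smoothing, py + (y - py) * smoothing
    _prev = (x, y)
    pyautogui.moveTo(x, y)

move_cursor(0.5, 0.5)  # centers the cursor
```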
A collection of demos of AI applied in front-end development.
A simple hand tracking module with gesture-based volume control.
Hand tracking MediaPipe sample with bitmap RGB input.
Hand tracking using MediaPipe HandPose. Runs as an Electron app and outputs OSC.
This project merges computer vision with 3D modeling to create a lifelike virtual hand in Unity. Hand movements are tracked using OpenCV, enabling real-time interaction and applications in virtual reality, gaming, and simulations.
Implementations of assorted computer vision problems.