TL;DR: I built ParticleSync – a real-time gesture-controlled 3D particle system using Three.js and MediaPipe. It responds to hand gestures through your webcam, allowing natural interaction with 15,000+ particles. Try the live demo!
A few days ago, I asked myself a simple question:
What if we could interact with digital visuals the same way we interact with the real world — using our hands?
That curiosity led me to build ParticleSync, a gesture-controlled 3D particle system that reacts to real-time hand movements using computer vision. The entire experience runs inside the browser and responds instantly to natural gestures like opening your hand, pinching your fingers, or switching poses.
This project helped me explore the overlap between AI, computer vision, and interactive 3D graphics, and it turned out to be one of the most fun builds I've worked on so far.
What Is ParticleSync?
ParticleSync is a real-time, browser-based 3D particle system that uses AI-powered hand tracking to control particle behavior.
Instead of using buttons or sliders, users interact with thousands of particles using simple hand gestures captured through a webcam. The particles react dynamically — spreading out, pulling inward, or morphing into different shapes based on the detected gesture.
Everything happens live, without any backend processing.
Tech Stack Used
This project was built entirely with modern web technologies:
- Three.js for 3D particle rendering
- MediaPipe Hands for AI-powered hand tracking
- WebGL for GPU-accelerated graphics (via Three.js)
- The browser's webcam API for live video input
No frameworks, no servers — just the browser, GPU, and camera working together.
Performance Insight: The entire system runs client-side, with MediaPipe's hand detection happening at 60+ FPS and Three.js rendering 15,000+ particles smoothly.
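To give a sense of how the pieces fit together, here's a stripped-down sketch of the client-side pipeline. It assumes the classic `@mediapipe/hands` and `@mediapipe/camera_utils` packages and a `<video>` element for the webcam stream; names like `onHandResults` and values like the particle size are illustrative, not the exact production code.

```javascript
import * as THREE from 'three';
import { Hands } from '@mediapipe/hands';
import { Camera } from '@mediapipe/camera_utils';

// Three.js scene with a single Points object holding every particle.
const scene = new THREE.Scene();
const camera3d = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 100);
camera3d.position.z = 8;
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

const PARTICLE_COUNT = 15000;
const positions = new Float32Array(PARTICLE_COUNT * 3);
const geometry = new THREE.BufferGeometry();
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
scene.add(new THREE.Points(geometry, new THREE.PointsMaterial({ size: 0.03 })));

// MediaPipe Hands runs on webcam frames and reports 21 landmarks per hand.
const video = document.querySelector('video');
const hands = new Hands({
  locateFile: (f) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${f}`,
});
hands.setOptions({ maxNumHands: 1, minDetectionConfidence: 0.7, minTrackingConfidence: 0.7 });
hands.onResults(onHandResults); // all gesture logic hangs off this callback

new Camera(video, {
  onFrame: async () => hands.send({ image: video }),
  width: 640,
  height: 480,
}).start();

function onHandResults(results) {
  // results.multiHandLandmarks[0] is an array of 21 {x, y, z} points in [0, 1].
}

renderer.setAnimationLoop(() => renderer.render(scene, camera3d));
```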
How Gesture Recognition Works
MediaPipe Hands detects 21 landmarks per hand, including fingertips and joints. From these points, I derive meaningful gestures by measuring distances between specific landmarks and tracking their relative positions.
The index finger acts as a live interaction cursor inside the 3D space, while the relationship between other fingers determines the gesture type.
This approach allows the system to feel responsive while remaining lightweight and efficient.
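In practice, the gesture features boil down to a few distance checks on the normalized landmark array. Here's a rough sketch: the landmark indices (4 = thumb tip, 8 = index fingertip) follow MediaPipe's standard numbering, but the cursor mapping is a simplified stand-in for the project's actual projection.

```javascript
import * as THREE from 'three';

// Straight-line distance between two normalized landmarks.
const dist = (a, b) => Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);

// Turn the index fingertip into a rough 3D cursor in world space.
function indexCursor(landmarks, camera3d) {
  const tip = landmarks[8]; // index fingertip, coords in [0, 1]
  // Mirror x so moving your hand right moves the cursor right, then map to NDC.
  const ndc = new THREE.Vector3((1 - tip.x) * 2 - 1, -(tip.y * 2 - 1), 0.5);
  return ndc.unproject(camera3d);
}

// Thumb tip to index tip distance; a small value means a strong pinch.
function pinchStrength(landmarks) {
  return dist(landmarks[4], landmarks[8]);
}
```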
Supported Hand Gestures
The system currently recognizes three core gestures:
- Open Hand - Particle Repulsion: opening your hand pushes nearby particles away, creating a smooth scattering effect.
- Pinch - Particle Attraction: pinching your thumb and index finger pulls particles inward, simulating a gravity-like force.
- Victory Sign - Shape Switching: making a victory sign cycles through the particle templates, with a built-in cooldown to avoid accidental triggers.
Each gesture is detected and applied in real time, making the interaction feel natural and intuitive.
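To show how the three gestures might be told apart, here's a simplified classifier. It checks which fingers are extended (tip farther from the wrist than the corresponding middle joint) plus the thumb-index distance; the thresholds and the cooldown length are illustrative assumptions, not the tuned values from the project.

```javascript
// MediaPipe hand landmarks: 0 = wrist, fingertips = 8/12/16/20, PIP joints = 6/10/14/18.
const TIPS = [8, 12, 16, 20];
const PIPS = [6, 10, 14, 18];
const dist = (a, b) => Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);

function classifyGesture(lm) {
  const wrist = lm[0];
  const extended = TIPS.map((tip, i) => dist(lm[tip], wrist) > dist(lm[PIPS[i]], wrist));
  const pinch = dist(lm[4], lm[8]) < 0.05; // thumb tip close to index tip

  if (pinch) return 'pinch';                                  // attraction
  if (extended.every(Boolean)) return 'open';                 // repulsion
  if (extended[0] && extended[1] && !extended[2] && !extended[3]) return 'victory';
  return 'none';
}

// Shape switching only fires if enough time has passed since the last switch.
let lastSwitch = 0;
const SWITCH_COOLDOWN_MS = 1500; // assumed cooldown length

function maybeSwitchShape(gesture, now = performance.now()) {
  if (gesture === 'victory' && now - lastSwitch > SWITCH_COOLDOWN_MS) {
    lastSwitch = now;
    return true; // caller advances to the next particle template
  }
  return false;
}
```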
Particle Shapes and Visual Effects
Particles can morph smoothly between multiple 3D templates:
- Sphere - Uniform distribution
- Cube - Geometric arrangement
- Galaxy Spiral - Logarithmic spiral pattern
- DNA Helix - Double helix structure
- Torus - Donut-shaped formation
Instead of instantly teleporting particles, each shape defines a target position for every particle. Smooth interpolation ensures fluid transitions that feel organic rather than mechanical.
Subtle noise and motion are added to prevent the system from looking static or artificial.
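Concretely, each template is just an array of target positions, and the animation loop nudges every particle a fraction of the way toward its target each frame. The spiral constants, lerp factor, and noise amplitude below are placeholder values rather than the numbers actually used in ParticleSync.

```javascript
// Build target positions for one template, e.g. a galaxy-style logarithmic spiral.
function galaxyTargets(count) {
  const targets = new Float32Array(count * 3);
  for (let i = 0; i < count; i++) {
    const t = (i / count) * Math.PI * 8;        // angle along the spiral
    const r = 0.2 * Math.exp(0.15 * t);         // logarithmic radius growth
    targets[i * 3 + 0] = r * Math.cos(t);
    targets[i * 3 + 1] = (Math.random() - 0.5) * 0.3; // slight vertical scatter
    targets[i * 3 + 2] = r * Math.sin(t);
  }
  return targets;
}

// Called once per frame: interpolate toward the target and add a little noise
// so the formation never looks frozen.
function updateParticles(positions, targets, lerp = 0.05, noise = 0.002) {
  for (let i = 0; i < positions.length; i++) {
    positions[i] += (targets[i] - positions[i]) * lerp;
    positions[i] += (Math.random() - 0.5) * noise;
  }
}
```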
Performance & Optimization
Rendering over 15,000 particles in real time required careful performance tuning:
- GPU-accelerated rendering via WebGL
- Efficient buffer updates
- Controlled interpolation speeds
- Minimal per-frame calculations
The result is a smooth experience across most modern browsers without sacrificing visual quality.
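The core of the "efficient buffer updates" point is mutating one preallocated Float32Array in place and flagging it dirty, instead of rebuilding geometry or allocating anything per frame. A minimal sketch, assuming the scene objects and the `updateParticles` / `galaxyTargets` helpers from the earlier snippets:

```javascript
const positionAttr = geometry.getAttribute('position'); // backed by one Float32Array
const currentTargets = galaxyTargets(PARTICLE_COUNT);   // active shape template

renderer.setAnimationLoop(() => {
  // Mutate the existing typed array in place; no per-frame allocations.
  updateParticles(positionAttr.array, currentTargets);

  // Tell Three.js to re-upload the buffer to the GPU this frame.
  positionAttr.needsUpdate = true;

  renderer.render(scene, camera3d);
});
```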
Challenges Along the Way
This project wasn't without its challenges:
- Preventing false gesture detection
- Mapping 2D camera input into meaningful 3D space
- Maintaining stable performance under heavy particle loads
- Designing gestures that feel intuitive rather than forced
Each obstacle pushed me to better understand how real-time AI systems behave in the wild, not just in controlled demos.
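For the false-detection problem in particular, a common fix is to smooth the landmarks and require a gesture to persist for several frames before acting on it. A sketch of that idea (the smoothing factor and frame threshold are assumptions, not the project's exact values):

```javascript
// Exponential moving average over landmark positions to damp per-frame jitter.
function smoothLandmarks(prev, next, alpha = 0.4) {
  if (!prev) return next;
  return next.map((p, i) => ({
    x: prev[i].x + (p.x - prev[i].x) * alpha,
    y: prev[i].y + (p.y - prev[i].y) * alpha,
    z: prev[i].z + (p.z - prev[i].z) * alpha,
  }));
}

// Only commit to a gesture after it has been seen for several consecutive frames.
let candidate = 'none';
let streak = 0;
const REQUIRED_FRAMES = 5; // assumed debounce length

function stableGesture(raw) {
  streak = raw === candidate ? streak + 1 : 1;
  candidate = raw;
  return streak >= REQUIRED_FRAMES ? candidate : 'none';
}
```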
Why I Built This Project
I'm currently exploring AI-driven interaction systems and how computer vision can redefine the way humans interact with digital environments.
ParticleSync started as an experiment, but it quickly became a hands-on lesson in:
- Real-time AI inference
- Interactive graphics systems
- Human-centered interface design
Projects like this help bridge the gap between theory and real-world application.
What I Want to Build Next
Some future ideas for ParticleSync include:
- Multi-hand interaction
- Advanced gesture recognition
- Mobile and touch optimization
- Audio-reactive particle behavior
- Custom shape imports
This project is still evolving, and I'm excited to continue pushing it further.
Try ParticleSync Live
Experience gesture-controlled particles in your browser. Works best in Chrome with a webcam.
Camera access required. All processing happens locally in your browser.
Final Thoughts
ParticleSync reminded me why I enjoy building interactive systems — they blur the line between creativity and engineering.
If you're interested in computer vision, Three.js, or creative coding, I highly recommend experimenting with gesture-based projects. They're challenging, rewarding, and teach you far more than tutorials alone.
The convergence of AI and real-time graphics opens up incredible possibilities for immersive web experiences. What will you build?
Got questions or ideas? I'd love to hear from you! Use the feedback form below or reach out on GitHub.