WebRTC Mirror Demo
Browser-native real-time video processing using WebRTC getUserMedia and Canvas 2D pixel manipulation — no libraries, no frameworks
Overview
An interactive demo that captures a live camera feed directly in the browser using the WebRTC getUserMedia API and applies real-time visual transformations frame-by-frame via the Canvas 2D API. Built entirely with native browser APIs to explore the limits of what the platform can do without any dependencies.
Problem
I wanted to deeply understand how browsers handle real-time media streams — how getUserMedia negotiates device access, how video frames flow into a canvas context, and how pixel-level manipulation can be done at 60fps without a dedicated processing library.
Constraints
- Browser-native only — no external libraries or frameworks
- Must work across Chromium and WebKit browsers
- Camera access requires HTTPS (self-signed cert for local dev)
- Performance must stay above 30fps on mid-range hardware
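For the local-dev HTTPS constraint, one throwaway way to generate a self-signed certificate is via openssl (assuming it is installed; the filenames here are illustrative, not the ones used in the project):

```shell
# Generate a self-signed cert/key pair for localhost, valid one year.
# -nodes skips key encryption; fine for a local dev cert, never for production.
openssl req -x509 -newkey rsa:2048 -sha256 -days 365 -nodes \
  -keyout localhost-key.pem -out localhost-cert.pem \
  -subj "/CN=localhost"
```

Point the local dev server at the resulting key/cert pair and the browser will serve the page over HTTPS (after a one-time warning for the self-signed cert), which unlocks getUserMedia.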
Approach
Captured the media stream via getUserMedia and piped it into a hidden video element. On each requestAnimationFrame tick, drew the current video frame onto an offscreen canvas, read the pixel buffer with getImageData, applied the transformation, and wrote it back with putImageData. Kept the processing loop tight to avoid frame drops.
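The pipeline above can be sketched as follows. This is a minimal illustration, not the demo's actual source; the names (startMirror, transform) are hypothetical:

```javascript
// Sketch: camera -> hidden <video> -> offscreen buffer -> pixel transform -> visible canvas.
async function startMirror(visibleCanvas, transform) {
  // 1. Ask the browser for a camera stream (requires a secure context / HTTPS).
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });

  // 2. Pipe the stream into a hidden video element so frames can be drawn.
  const video = document.createElement("video");
  video.srcObject = stream;
  await video.play();

  // 3. Offscreen buffer canvas sized to the camera feed.
  const buffer = document.createElement("canvas");
  buffer.width = video.videoWidth;
  buffer.height = video.videoHeight;
  const bufCtx = buffer.getContext("2d");
  const outCtx = visibleCanvas.getContext("2d");

  // 4. Per-frame loop: draw, read pixels, transform in place, write back, blit.
  function tick() {
    bufCtx.drawImage(video, 0, 0);
    const frame = bufCtx.getImageData(0, 0, buffer.width, buffer.height);
    transform(frame.data, frame.width, frame.height); // mutates the RGBA buffer
    bufCtx.putImageData(frame, 0, 0);
    outCtx.drawImage(buffer, 0, 0); // blit the completed frame to the screen
    requestAnimationFrame(tick);
  }
  requestAnimationFrame(tick);
}
```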
Key Decisions
Use requestAnimationFrame instead of setInterval for the render loop
rAF syncs to the display refresh rate and pauses when the tab is hidden, preventing wasted CPU. setInterval runs regardless, causing drift and battery drain on mobile.
- setInterval at 33ms (~30fps)
- MediaStreamTrackProcessor (too experimental, low browser support)
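The rAF-based loop boils down to a small skeleton like the one below (a sketch with illustrative names, not the demo's code). Because requestAnimationFrame only fires when the browser is ready to paint, the loop inherits display-rate pacing and tab-hidden throttling for free:

```javascript
// Minimal start/stop render loop built on requestAnimationFrame.
function makeLoop(drawFrame) {
  let rafId = null;

  function tick(timestamp) {
    drawFrame(timestamp);                // one frame's worth of work
    rafId = requestAnimationFrame(tick); // re-schedule for the next paint
  }

  return {
    start() { rafId = requestAnimationFrame(tick); },
    stop()  { cancelAnimationFrame(rafId); rafId = null; },
  };
}
```

With setInterval this skeleton would need its own visibility handling and would still drift, since the timer fires independently of the compositor.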
Offscreen canvas for pixel manipulation
Drawing to an offscreen canvas and then blitting to the visible one prevents flickering mid-frame and keeps the visible canvas always showing a complete frame.
- Single canvas (visible artifacts mid-draw)
- OffscreenCanvas with Worker (overkill for this demo)
Tech Stack
- WebRTC
- getUserMedia API
- Canvas 2D API
- JavaScript
- HTML5
Result & Impact
The demo runs smoothly in-browser with no build step or dependencies. It became the centrepiece of my portfolio's interactive projects section and served as the foundation for exploring more advanced media processing patterns.
Learnings
- Browser-native APIs are far more capable than most developers give them credit for
- Pixel manipulation via getImageData/putImageData is CPU-bound — keep the buffer small
- HTTPS is non-negotiable the moment you touch camera or microphone APIs
- requestAnimationFrame gives you browser-managed pacing for free
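The CPU-bound part of the learnings above is just byte arithmetic over the flat RGBA array that getImageData returns. As a sketch, a horizontal mirror (a hypothetical helper, not the demo's source) looks like this:

```javascript
// Mirror an RGBA pixel buffer horizontally, in place.
// `data` is the flat Uint8ClampedArray from getImageData: 4 bytes per pixel.
function mirrorHorizontal(data, width, height) {
  for (let y = 0; y < height; y++) {
    const row = y * width * 4;
    for (let x = 0; x < Math.floor(width / 2); x++) {
      const left = row + x * 4;
      const right = row + (width - 1 - x) * 4;
      // Swap all four channels (R, G, B, A) of the mirrored pixel pair.
      for (let c = 0; c < 4; c++) {
        const tmp = data[left + c];
        data[left + c] = data[right + c];
        data[right + c] = tmp;
      }
    }
  }
}
```

Every pixel is visited once per frame, which is why buffer size dominates the frame budget: halving the canvas dimensions quarters the work.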
Try It
The live demo is embedded directly on the portfolio. Open it, allow camera access, and the mirror effect activates instantly — no install, no sign-in, no external service.