To stand out in my job hunt, I turned a medical device I'd written firmware for into an interactive 3D demo using React-Three-Fiber and Blender.
Here's the result. Feel free to interact with the model:
Desktop: Left-click and drag to rotate, right-click and drag to pan, and scroll to zoom. Mobile: Use one finger to rotate, and two fingers to pan or pinch-to-zoom.
About Me & Project Motivation
Hey, I'm Mike, a full-stack software engineer. Following a layoff in April, I took the opportunity to refresh my portfolio by diving into interactive 3D visuals—something I'd always wanted to explore but initially found daunting.
Why 3D visuals? In a competitive job market, I wanted to showcase not just my existing skills, but also my ability to quickly learn and implement new technologies. Interactive 3D demos demonstrate both technical capability and attention to user experience in a way that static portfolio pieces can't match.
I'm writing this blog post because it's exactly the kind of guide I wish I'd had when starting this journey. There's a common misconception that creating interactive 3D experiences for the web is significantly more complicated than traditional 2D sites. Fortunately, with today's tools like React-Three-Fiber, that's no longer the case.
I hope this breakdown helps demystify the process and encourages you to experiment with interactive 3D in your own projects!
Tech Stack at a Glance
- Frontend: React, React-Three-Fiber, drei
- 3D Modeling: Blender 4.4
Learning Resources
I found two standout resources incredibly valuable:
- ThreeJS Journey: Essential for anyone starting out with ThreeJS.
- PolygonRunway: A fantastic entry into the world of 3D modeling.
Exporting for the Web
Getting your 3D model out of your modeling software and onto the web efficiently is a critical first step. For modern web 3D, the undisputed king of formats is glTF (GL Transmission Format), and specifically its binary version, .glb.
Think of .glb as the JPEG of 3D. It was designed by the Khronos Group (the same consortium behind WebGL) specifically for delivering 3D assets efficiently. Blender supports exporting models directly as .glb files, so this is the approach I used.
I also used Draco compression during export from Blender. This is a library from Google that dramatically shrinks the geometry data within the .glb file, leading to much faster download times for the end-user.
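On the React side, drei's useGLTF hook can load Draco-compressed files without extra setup. Here's a minimal sketch (the model path and component name are hypothetical); passing `true` as the second argument tells drei to fetch a Draco decoder (from a CDN by default) so the compressed geometry can be unpacked in the browser:

```tsx
import { useGLTF } from '@react-three/drei';

// Hypothetical path and component name for illustration.
export function Stylus(props: JSX.IntrinsicElements['group']) {
  // `true` enables Draco decoding via drei's default decoder
  const { scene } = useGLTF('/models/stylus.glb', true);
  return <primitive object={scene} {...props} />;
}

// Optionally start downloading the model before the component mounts
useGLTF.preload('/models/stylus.glb', true);
```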
Bringing It to React-Three-Fiber
While React-Three-Fiber provides the core connection between React and Three.js, the drei library is the real star when it comes to productivity. It's an indispensable toolkit of helpers, abstractions, and components that handle the most common tasks in 3D development.
For my scene, I composed several drei components inside my <Canvas /> to build the experience quickly and robustly.
Here's a breakdown of the helpers I used and why they were so valuable:
- <OrbitControls />: Enables mouse/touch interaction to rotate, pan, and zoom the camera around the scene.
- <Stage />: Creates a professional lighting setup with environment mapping and ground shadows.
- <Float />: Adds a gentle floating animation to child objects, creating a more dynamic presentation.
- <Grid />: Renders a customizable grid helper to provide visual reference for scale and positioning.
- <Loader />: Displays a loading indicator while 3D assets are being loaded into the scene.
With these helpers in place, the scene looks something like this:
```tsx
import { Suspense } from 'react';
import { Canvas } from '@react-three/fiber';
import { Stage, OrbitControls, Float, Grid, Loader } from '@react-three/drei';
import { Stylus } from './Stylus'; // My 3D model component

function Scene() {
  return (
    <Canvas camera={{ position: [0, 2, 5], fov: 60 }}>
      {/* While assets load, the <Loader /> outside the Canvas shows progress,
          so no in-scene fallback is needed here */}
      <Suspense fallback={null}>
        {/* Stage sets up a professional, centered lighting environment */}
        <Stage environment="city" intensity={0.6} adjustCamera>
          {/* Float adds a gentle hover animation to its children */}
          <Float speed={1.5} rotationIntensity={1} floatIntensity={1}>
            <Stylus />
          </Float>
        </Stage>
      </Suspense>
      {/* Grid provides a visual reference for scale and positioning */}
      <Grid infiniteGrid followCamera position={[0, -1.3, 0]} />
      {/* OrbitControls enables intuitive camera interaction */}
      <OrbitControls makeDefault autoRotate autoRotateSpeed={0.5} />
    </Canvas>
  );
}

// The Loader component is rendered outside the Canvas
function App() {
  return (
    <>
      <Scene />
      <Loader />
    </>
  );
}
```
Interactive Data Visualization
To demonstrate how the device collects motion data, including acceleration and rotation, I added a visualization that monitors the model's movement and graphs it on the screen.

The code for this looks something like this:
```tsx
import { useRef } from 'react';
import { useFrame } from '@react-three/fiber';
import * as THREE from 'three';

// How often (in seconds) to emit a sample to the graph
const SAMPLE_INTERVAL = 0.1;

type Vec3 = { x: number; y: number; z: number };

// This component tracks the motion of the camera in a Three.js scene
function MotionTracker({ onSample }: { onSample: (data: { acc: Vec3; rot: Vec3 }) => void }) {
  // Store previous position and velocity for calculations
  const prevPos = useRef(new THREE.Vector3());
  const prevVel = useRef(new THREE.Vector3());
  const timer = useRef(0);

  // Use the useFrame hook to run calculations every frame
  useFrame(({ camera }, dt) => {
    if (dt <= 0) return; // Guard against a zero delta (division by zero)
    // Initialize prevPos on first run
    if (prevPos.current.lengthSq() === 0) prevPos.current.copy(camera.position);
    // Calculate velocity and acceleration via finite differences
    const vel = camera.position.clone().sub(prevPos.current).divideScalar(dt);
    const acc = vel.clone().sub(prevVel.current).divideScalar(dt);
    // Sample data at regular intervals
    timer.current += dt;
    if (timer.current >= SAMPLE_INTERVAL) {
      timer.current = 0;
      onSample({
        acc: { x: acc.x, y: acc.y, z: acc.z },
        rot: {
          // Convert rotation from radians to degrees
          x: THREE.MathUtils.radToDeg(camera.rotation.x),
          y: THREE.MathUtils.radToDeg(camera.rotation.y),
          z: THREE.MathUtils.radToDeg(camera.rotation.z),
        },
      });
    }
    // Update previous position and velocity for next frame
    prevPos.current.copy(camera.position);
    prevVel.current.copy(vel);
  });

  // This component doesn't render anything
  return null;
}
```
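The math at the heart of MotionTracker is a backward finite difference: velocity is the change in position over the frame time, and acceleration is the change in velocity. That core can be pulled out into a pure helper (a hypothetical name; the component inlines it with THREE.Vector3) and exercised without a 3D scene:

```typescript
type Vec3 = { x: number; y: number; z: number };

// Backward finite difference: per-axis rate of change between two samples,
// dt seconds apart. Used twice: positions -> velocity, velocities -> acceleration.
function finiteDifference(prev: Vec3, curr: Vec3, dt: number): Vec3 {
  return {
    x: (curr.x - prev.x) / dt,
    y: (curr.y - prev.y) / dt,
    z: (curr.z - prev.z) / dt,
  };
}

// Example: the camera moved 1 unit along x in half a second
const p0: Vec3 = { x: 0, y: 0, z: 0 };
const p1: Vec3 = { x: 1, y: 0, z: 0 };
const vel = finiteDifference(p0, p1, 0.5); // → { x: 2, y: 0, z: 0 }
```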
Calibration Visualization
To connect this visual back to a real-world engineering challenge, I also demonstrated the device's sensor calibration process. The visualization highlights the dramatic difference between the raw, scattered sensor data before calibration and the aligned, precise data after.
You can see the effect in the interactive demo below: try dragging, zooming, and clicking 'Calibrate'.
You can find all of the code for this visualization here.
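To make the before/after contrast concrete, here's a sketch of one common calibration scheme: subtract a per-axis bias (estimated while the sensor is at rest) and apply a scale correction. All names are illustrative; a real device calibration may also use a misalignment matrix:

```typescript
type Vec3 = { x: number; y: number; z: number };

// Simple per-axis calibration model (illustrative only)
interface Calibration {
  bias: Vec3;  // constant offset measured at rest
  scale: Vec3; // per-axis gain correction
}

function applyCalibration(raw: Vec3, cal: Calibration): Vec3 {
  return {
    x: (raw.x - cal.bias.x) * cal.scale.x,
    y: (raw.y - cal.bias.y) * cal.scale.y,
    z: (raw.z - cal.bias.z) * cal.scale.z,
  };
}

// Estimate the bias as the mean of samples taken while the device is at rest
function estimateBias(samples: Vec3[]): Vec3 {
  const sum = samples.reduce(
    (a, s) => ({ x: a.x + s.x, y: a.y + s.y, z: a.z + s.z }),
    { x: 0, y: 0, z: 0 }
  );
  const n = samples.length || 1;
  return { x: sum.x / n, y: sum.y / n, z: sum.z / n };
}
```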
Performance Considerations
Performance is paramount in web experiences. My goal was a stable 60 FPS on a wide range of devices.
- Reducing Draw Calls: A "draw call" is a command from the CPU to the GPU to draw something on the screen. They are computationally expensive. By combining separate geometries into a single mesh in Blender, I dramatically reduced draw calls, which I verified using the Spector.js browser extension. I learned this tip from @bruno_simon.
- Lighting: I opted for dynamic lighting with simple textures instead of baking lighting into complex textures. This approach provided a good balance of visual quality and performance across devices.
- Results: The final experience maintains a stable 60 FPS on both my M1 Mac and a mid-range Android phone. (If you're seeing any choppiness on your device, let me know!)
Challenges and Solutions
My biggest initial hurdle was conceptual: how to blend a 2D user interface with a 3D scene. My first instinct was to place standard HTML inside the <Canvas /> component, which isn't supported because the canvas's children must be Three.js objects.
I discovered there are two primary solutions to this, each suited for different use cases:
- Layering HTML with CSS (The HUD Approach): For UI that sits on top of the scene, like a main menu or a static dashboard, the best approach is to render the HTML outside the <Canvas />. I layered my UI components on top of the canvas using CSS position: absolute. This seems to be the most performant method for static overlays.
- Using the <Html> Component (The In-Scene Approach): For UI that needs to exist in 3D space, like a tooltip on a specific part of a model or an interactive button that hovers next to an object, the drei library's <Html> component is the answer. It lets you embed DOM elements within your scene, so they correctly scale, rotate, and are hidden by other 3D objects.
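The HUD approach can be sketched like this (component and style values are hypothetical); the overlay is a sibling of the canvas, and pointer-events is used so clicks pass through to the scene except on the panel itself:

```tsx
// Sketch of the HUD approach, assuming a <Scene /> component that renders the Canvas.
function AppWithHud() {
  return (
    <div style={{ position: 'relative', width: '100%', height: '100vh' }}>
      <Scene />
      <div
        style={{
          position: 'absolute',
          top: 16,
          left: 16,
          // Let clicks fall through to the canvas by default...
          pointerEvents: 'none',
        }}
      >
        {/* ...but re-enable them on the interactive panel */}
        <div style={{ pointerEvents: 'auto' }}>My dashboard UI</div>
      </div>
    </div>
  );
}
```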
Next Steps
Future improvements:
- Introducing more immersive, interactive 3D items similar to those on ThreeJS Journey's showcase.
- Further enhancing responsiveness for mobile devices.
Final Thoughts
This project was a fantastic learning experience, and I'd love to hear your thoughts. Please feel free to email me with any feedback on performance, functionality, or the code itself!