Week 1: Preparing Hardware Components and Software Framework for Motion Tracking
March 12, 2025
Hey everybody,
Welcome back to my blog! This week, I focused on finalizing the hardware components for my first prototype and started building the software framework in Unity for 3D visualization. I made good progress, but I'm currently held up by a couple of technical challenges and shipping times.
Hardware: Selecting and Ordering Components
To accurately track finger positions and movements, I need a system that combines IMU sensors for motion capture with wireless communication to transmit data to Unity in real time. After researching different options, I finalized my component selection:
- IMU Sensors: The MPU-9250 (or MPU-6050, depending on availability) is my sensor of choice due to its built-in 9-DOF (degrees of freedom) tracking, which combines a 3-axis gyroscope, accelerometer, and magnetometer (the fallback MPU-6050 omits the magnetometer, leaving 6 DOF). These are necessary for precise orientation tracking. However, like all IMUs, it suffers from gyroscope drift, since small rate errors accumulate when integrated over time, so sensor fusion techniques or periodic relocalization are required for correction.
- Microcontroller with Bluetooth Communication: I selected the ESP32 because it provides native Bluetooth Low Energy (BLE) support, removing the need for an external Bluetooth module. The ESP32 will handle:
- Reading IMU sensor data via I2C communication
- Basic preprocessing to reduce data size
- Transmitting data wirelessly via BLE to Unity
- Managing power consumption to maintain efficiency for battery-powered operation
- Power System: Since the gloves must be fully wireless, I’m using 3.7V LiPo batteries with a TP4056 charging module to ensure safe charging and low power draw.
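The exact BLE packet format isn't decided yet, but here's a minimal Python sketch of the kind of preprocessing I have in mind for the ESP32 side: packing each sensor's gyro and accel readings into a compact fixed-point binary payload rather than sending text. The field layout, scale factor, and function names are all hypothetical placeholders, not a finalized protocol.

```python
import struct

# Hypothetical layout: 1 byte sensor id + six int16 fields (gyro xyz, accel
# xyz), little-endian, fixed-point with three decimal places. Not final.
SCALE = 1000

def pack_sample(sensor_id, gyro, accel):
    """Pack one IMU sample into a 13-byte little-endian payload."""
    fields = [round(v * SCALE) for v in (*gyro, *accel)]
    return struct.pack("<B6h", sensor_id, *fields)

def unpack_sample(payload):
    """Inverse of pack_sample: recover sensor id and scaled readings."""
    sensor_id, *fields = struct.unpack("<B6h", payload)
    gyro = tuple(f / SCALE for f in fields[:3])
    accel = tuple(f / SCALE for f in fields[3:])
    return sensor_id, gyro, accel
```

At 13 bytes per sample, one reading should fit in a single notification even at the default BLE ATT payload size of 20 bytes, which is part of why I want the preprocessing on the ESP32 rather than streaming raw text.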
All components have been ordered, and my next step is to individually assemble and test each module before combining them into the glove system. By testing each component individually, I can identify faults early on and, if necessary, research and implement alternative solutions without disrupting the entire system.
Software: Setting Up Unity for 3D Visualization
Since my project requires real-time motion visualization, I'm using Unity for development and Blender for 3D modeling. This week, I focused on structuring the software framework and preparing for sensor integration once the hardware is assembled.
Refining the Unity Environment
- Set up Unity with URP (Universal Render Pipeline) for optimized rendering performance.
- Began designing a custom rigged hand model in Blender, ensuring that joint rotations will align correctly with incoming IMU data.
- Explored Unity’s Inverse Kinematics (IK) system to determine how I can interpolate motion between tracked points for smoother visualization.
Planning IMU Data Processing and Bluetooth Integration
- Outlined a data pipeline that will receive BLE sensor input from the ESP32 and parse it within Unity.
- Researched sensor fusion techniques, including Madgwick and Mahony filters, to stabilize noisy IMU readings when data becomes available.
- Developed an initial approach for real-time quaternion rotation mapping, which will allow accurate representation of finger movements once the hardware is operational.
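To make the sensor-fusion plan above concrete, here's a small Python sketch of the gyroscope-integration step that sits at the core of both the Madgwick and Mahony filters: the current orientation quaternion is advanced by the measured angular rate each timestep, then renormalized. This is only the drift-prone half of those filters; the accelerometer/magnetometer correction they add on top is what I still need to implement, and the function names here are my own placeholders.

```python
import math

def quat_multiply(q, r):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    )

def quat_normalize(q):
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def integrate_gyro(q, gyro, dt):
    """One Euler step of q' = 0.5 * q (x) (0, wx, wy, wz), renormalized.

    gyro is the body-frame angular rate in rad/s; without an absolute
    reference to correct it, small rate errors accumulate as drift.
    """
    wx, wy, wz = gyro
    dq = quat_multiply(q, (0.0, wx, wy, wz))
    q_new = tuple(qi + 0.5 * dt * dqi for qi, dqi in zip(q, dq))
    return quat_normalize(q_new)
```

The same update maps directly onto Unity's Quaternion type once the BLE data pipeline is feeding it real gyro readings.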
Next week, I plan to begin testing basic Unity input handling with simulated data and prepare for real-world sensor integration once the hardware components arrive.
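As a first stand-in for real sensor input, I'll probably drive the hand model with something like the sketch below: a sinusoidal flexion angle per finger joint, phase-offset so the joints curl in sequence. The function and its parameters are invented purely for testing, not part of the real pipeline.

```python
import math

def simulated_finger_angles(t, n_joints=3, flex_hz=0.5, phase_lag=0.4):
    """Return one synthetic flexion angle (radians) per joint at time t.

    Each joint lags the previous by phase_lag rad so the finger appears
    to curl from the knuckle outward. Angles stay in [0, 1] rad.
    """
    return [
        0.5 * (1.0 - math.cos(2.0 * math.pi * flex_hz * t - phase_lag * j))
        for j in range(n_joints)
    ]
```

Feeding values like these into the rigged hand should let me debug joint mapping and rendering in Unity before any hardware arrives.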
See you next week!