CoJam is an intelligent stompbox designed to turn solo jamming into a full-band experience. Unlike static loopers, CoJam uses generative AI to listen to a musician's live playing and generate dynamic, synchronized backing tracks in real time.
Embedded Controller: Selected the Daisy Seed (ARM Cortex-M7), chosen for its hardware Floating-Point Unit (FPU), which allows on-device audio processing with minimal overhead.
Physical Interface: Currently prototyping a stompbox design compatible with standard 9V pedal power supplies and 6.35mm audio jacks. The final unit will feature a custom user interface with real-time visual feedback for BPM and key status.
Architecture Design: Designed a firmware architecture that leverages the ARM CMSIS DSP library to perform critical feature extraction on the device.
Active Development: Currently developing C++ algorithms implementing the Fast Fourier Transform (FFT) for key detection and autocorrelation for tempo tracking.
Performance Targets: Optimizing the computation to achieve a target audio-synchronization latency of under 10 ms, ensuring the generated backing track stays on beat with the live musician.
Hybrid Edge-Cloud Pipeline: Architected a split-processing model where immediate audio analysis happens on-device, while complex MIDI track generation is offloaded to a GPU-accelerated backend.
Data Flow: Designing the communication protocol to transmit extracted features (key, tempo, style) to the cloud and receive lightweight MIDI files in return, prioritizing speed over the bandwidth-heavy transfer of raw audio.
Stack: C++ (Embedded Firmware), Python (Backend Logic), Git.
Status: Moving from architectural validation to the Hardware Assembly and Firmware Integration sprints.