Behind the Scenes of Guitar Learning Game: Bringing 3D Teacher Kate to Life

Aug 8, 2023

by Ahmet Said Olgun from Deplike

Hello and welcome to the first post on the Deplike team blog!

As curious music tech enthusiasts, we’re diving deep into innovative solutions we’ve been developing that redefine the way we experience and create music. Whether you’ve been with us for a while or you’re just getting to know us, we’re glad you’re here. Our goal is to keep our community updated and connected through these articles.

The focus of this piece is to share the software development and music technology processes that enable our guitar learning app, Guitar Learning Game, to animate a 3D character named Kate, who plays songs in Augmented Reality (AR) and in gameplay, synchronized precisely with the selected track.


We begin with song integration, which is handled through MIDI (Musical Instrument Digital Interface) files. First, we extract the guitar channel from the original MIDI file, which is paired with the song's MP3, into a standalone MIDI file. This separation is critical because it gives us a granular view of every musical event in the track, such as which notes are being played at a particular timestamp. We then convert this MIDI file into JSON (JavaScript Object Notation) using the library at https://tonejs.github.io/Midi/, which makes the MIDI data far easier to work with.
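To illustrate why the JSON form is convenient, here is a minimal TypeScript sketch. The `NoteEvent` shape below is a simplified stand-in for the note objects that the Tone.js Midi conversion produces (the real output carries more fields), and the sample guitar track is invented for illustration:

```typescript
// Simplified note event, loosely modeled on @tonejs/midi's JSON output.
interface NoteEvent {
  name: string;     // pitch name, e.g. "E2"
  time: number;     // onset, in seconds
  duration: number; // length, in seconds
}

// Return the names of all notes sounding at a given timestamp --
// exactly the "what is played at this moment" query described above.
function notesAt(notes: NoteEvent[], t: number): string[] {
  return notes
    .filter(n => t >= n.time && t < n.time + n.duration)
    .map(n => n.name);
}

// Hypothetical guitar channel data for demonstration.
const guitarTrack: NoteEvent[] = [
  { name: "E2", time: 0.0, duration: 2.0 },
  { name: "B2", time: 0.0, duration: 2.0 },
  { name: "E3", time: 0.0, duration: 2.0 },
  { name: "A2", time: 2.0, duration: 2.0 },
];

console.log(notesAt(guitarTrack, 1.0)); // notes sounding one second in
```

Once the data is in this form, timestamp queries like the one above are plain array operations rather than binary MIDI parsing.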

The transformed JSON data is used in two primary ways. First, we build a model in C#, chosen for its compatibility with our Unity-based platform, that indicates the exact moment the 3D character, Kate, needs to strike a specific chord. The data for this model is assembled into a DOTween sequence tailored to the song. Because the sequence is prepared as an animation before the song starts playing, it avoids any potential lag during app usage.
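The core of that pre-built sequence is an ordered list of chord events derived from the note data. As a hedged sketch (in TypeScript rather than the C#/DOTween used in the app, and with invented type names), grouping notes that share an onset into chord events might look like this:

```typescript
interface NoteEvent { name: string; time: number; duration: number; }
interface ChordEvent { time: number; notes: string[]; }

// Group note events that share an onset time into single chord events,
// sorted chronologically. An ordered timeline like this is what a
// pre-built animation sequence can be driven from.
function buildChordTimeline(notes: NoteEvent[]): ChordEvent[] {
  const byOnset = new Map<number, string[]>();
  for (const n of notes) {
    const group = byOnset.get(n.time) ?? [];
    group.push(n.name);
    byOnset.set(n.time, group);
  }
  return [...byOnset.entries()]
    .sort((a, b) => a[0] - b[0])
    .map(([time, ns]) => ({ time, notes: ns }));
}
```

Precomputing the whole timeline up front, rather than deciding each strum at runtime, is what keeps playback free of per-frame decision cost.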

Our users can adjust the tempo of both the song and the animation, allowing them to experience the song at their preferred pace. This lets Kate strum the guitar in time with the track at any speed, and users who follow her hand movements accurately can mirror the rhythm of the song.
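Tempo adjustment amounts to rescaling every event time by the same factor. A minimal sketch, assuming the chord-timeline shape from above (names and the exact scaling convention are illustrative, not the app's actual API):

```typescript
interface ChordEvent { time: number; notes: string[]; }

// Rescale event times for a user-selected tempo multiplier.
// tempoFactor = 1.0 is original speed; 0.5 plays the song at half speed,
// spacing events twice as far apart.
function applyTempo(timeline: ChordEvent[], tempoFactor: number): ChordEvent[] {
  if (tempoFactor <= 0) throw new RangeError("tempoFactor must be positive");
  return timeline.map(e => ({ ...e, time: e.time / tempoFactor }));
}
```

Because both the audio playback rate and the animation timeline are scaled by the same factor, Kate's strums stay aligned with the track at any tempo.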

The second use of this JSON data is to build a 'learning path'. Using the data harvested from the guitar channel, we identify the variety of chords and chord switches present in the song.

This understanding lets us create a new JSON file containing key information such as images of the skills being developed, audio paths, identifiers, and sub-skills. This structure also tells us when the same chords recur across different songs, enabling us to offer an enhanced user experience.
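The skill extraction described above can be sketched as a pass over the song's ordered chord sequence. This is an assumption-laden illustration (chord names and the `->` switch notation are invented for the example), not the app's actual data model:

```typescript
// Derive the distinct chords and the chord-to-chord switches that occur
// in a song, given its chord sequence in playback order.
function extractSkills(chords: string[]): { chords: string[]; switches: string[] } {
  const uniqueChords = [...new Set(chords)]; // Set preserves first-seen order
  const switches = new Set<string>();
  for (let i = 1; i < chords.length; i++) {
    if (chords[i] !== chords[i - 1]) {
      switches.add(`${chords[i - 1]}->${chords[i]}`);
    }
  }
  return { chords: uniqueChords, switches: [...switches] };
}
```

Keying skills by chord and switch identifiers like these is also what makes it possible to recognize when a chord already learned in one song reappears in another.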

To simplify the user's journey of playing their chosen song, we first split the song into checkpoints. These checkpoints are broken down into smaller units, called 'chord switches', which are in turn divided into individual chords. This information is presented on the user's learning path in an order that moves from specific to broad, so users can work up to their selected song incrementally without feeling overwhelmed.
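The checkpoint decomposition above can be sketched as follows. The fixed checkpoint size and the type names here are illustrative assumptions; in practice checkpoints would likely follow the song's sections rather than a fixed count:

```typescript
interface Checkpoint {
  chords: string[];              // individual chords in this segment
  switches: [string, string][];  // chord-switch pairs within the segment
}

// Split a song's chord sequence into checkpoints of `size` chords each,
// and list each checkpoint's chord switches and individual chords --
// the units a learning path can present from specific to broad.
function toCheckpoints(chords: string[], size: number): Checkpoint[] {
  const checkpoints: Checkpoint[] = [];
  for (let i = 0; i < chords.length; i += size) {
    const slice = chords.slice(i, i + size);
    const switches: [string, string][] = [];
    for (let j = 1; j < slice.length; j++) {
      if (slice[j] !== slice[j - 1]) switches.push([slice[j - 1], slice[j]]);
    }
    checkpoints.push({ chords: slice, switches });
  }
  return checkpoints;
}
```

A learner would then master each checkpoint's single chords first, then its switches, and finally the checkpoint as a whole before moving on.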

Feel free to try Guitar Learning Game on Android & iOS by clicking here and let us know what you think.