I've got everything I need for a DSP guitar tuner project.
- a poorly tuned 12-string guitar that has sat unplayed under the bed for a few years
- A Sipeed MAIX board (64-bit RISC-V with floating point, plenty of RAM and Flash, a 2" LCD, and an I2S microphone)
- A working RISC-V SDK on my laptop
I've had a few attempts at various solutions, but nothing that will actually allow me to reliably tune a string.
I've got raw samples from the mic at 48,000 S/s, and the mic has AGC, so levels are fine.
Most of my attempts have been based on correlating a few thousand samples against the desired frequency and narrow bands either side of it, then showing the result on the LCD. It sort of works: you tune for a symmetric curve around the centre target frequency.
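To make that concrete, here is a boiled-down model of the idea (Python for clarity, not what runs on the board; the 440 Hz target, 5 Hz sideband spacing, and 0.1 s window are just illustrative test numbers):

```python
import math

def corr_mag(x, f, fs):
    """Magnitude of the correlation of x with a quadrature tone at f (Goertzel-style)."""
    i = sum(s * math.cos(2 * math.pi * f * n / fs) for n, s in enumerate(x))
    q = sum(s * math.sin(2 * math.pi * f * n / fs) for n, s in enumerate(x))
    return math.hypot(i, q)

fs = 48000
f0 = 440.0      # hypothetical target (A4)
delta = 5.0     # sideband spacing, Hz
# test signal: a string exactly in tune, 0.1 s worth of samples
x = [math.sin(2 * math.pi * 440.0 * n / fs) for n in range(4800)]

lo, mid, hi = (corr_mag(x, f, fs) for f in (f0 - delta, f0, f0 + delta))
# In tune: the centre magnitude dominates and lo ~ hi (the symmetric curve)
print(lo, mid, hi)
```

With more sidebands either side this becomes the curve I plot on the LCD.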
Last night I tried correlating only against the target frequency, comparing the phase of that result with the phase of the same correlation taken a fixed number of samples later (about 1/48th of a second), hoping to measure how far the phase drifts because the string is detuned.
That doesn't seem to be working.
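For reference, the math I'm trying to exploit: between two windows D samples apart, the correlation phase advances by 2π·f·D/fs, so after subtracting the expected 2π·f0·D/fs advance of the target tone and wrapping to ±π, the residual should be proportional to the detune. A sketch of that arithmetic (Python for clarity; the 0.5 Hz offset is just a test value):

```python
import math

def corr_phase(x, start, win, f0, fs):
    """Phase of the correlation of x[start:start+win] with exp(-j*2*pi*f0*m/fs)."""
    i = q = 0.0
    for m in range(win):
        s = x[start + m]
        i += s * math.cos(2 * math.pi * f0 * m / fs)
        q -= s * math.sin(2 * math.pi * f0 * m / fs)
    return math.atan2(q, i)

def wrap(phi):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(phi), math.cos(phi))

fs, f0 = 48000, 440.0
win, d = 4096, 1000          # window length and window-to-window offset (~1/48 s)
f_true = 440.5               # pretend the string is 0.5 Hz sharp
x = [math.sin(2 * math.pi * f_true * n / fs) for n in range(win + d)]

p1 = corr_phase(x, 0, win, f0, fs)
p2 = corr_phase(x, d, win, f0, fs)
# residual drift after removing the expected advance of the target tone itself
dphi = wrap(p2 - p1 - 2 * math.pi * f0 * d / fs)
delta_f = dphi * fs / (2 * math.pi * d)
print(delta_f)   # ~0.5 with these numbers
```

Note the wrap only resolves detunes up to ±fs/(2·D), i.e. ±24 Hz here, which covers my ±10% range.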
I assume that everybody will say "FFT!", but with 8192 samples @ 48,000 S/s you get ~6 Hz wide bins (48000/8192 ≈ 5.9 Hz), which is audibly out of tune.
The fundamental frequencies of the strings lie between about 200 Hz and 800 Hz, so I do have the option of sample-rate conversion down to 3 kS/s or so, but CPU cycles are supposed to be cheap here.
The indication range most likely needs to be ±10% of the target frequency, as adjacent strings can be about 1/3 of an octave apart.
Any hint/ideas?
Maybe a relatively narrow bandpass around the target frequency, then count the period of the result?
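Something along these lines, perhaps (Python sketch only; the biquad coefficients are the constant-peak-gain band-pass form from the RBJ Audio EQ Cookbook, and Q = 5 and the test signal are arbitrary choices):

```python
import math

def bandpass_coeffs(f0, fs, q_factor):
    """RBJ cookbook band-pass biquad (constant 0 dB peak gain), normalised by a0."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q_factor)
    a0 = 1 + alpha
    return (alpha / a0, 0.0, -alpha / a0,              # b0, b1, b2
            -2 * math.cos(w0) / a0, (1 - alpha) / a0)  # a1, a2

def biquad(x, coeffs):
    """Direct form I filter over the whole sample list."""
    b0, b1, b2, a1, a2 = coeffs
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for s in x:
        out = b0 * s + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        y.append(out)
        x2, x1, y2, y1 = x1, s, y1, out
    return y

def freq_from_zero_crossings(y, fs, skip):
    """Average period between interpolated rising zero crossings, after the transient."""
    crossings = []
    for n in range(skip + 1, len(y)):
        if y[n - 1] < 0.0 <= y[n]:
            # linear interpolation of the crossing instant, in samples
            crossings.append(n - 1 + (-y[n - 1]) / (y[n] - y[n - 1]))
    return (len(crossings) - 1) * fs / (crossings[-1] - crossings[0])

fs, f0 = 48000, 440.0
f_true = 441.0   # a slightly sharp string
# test signal: fundamental plus a 3rd harmonic the filter should suppress
x = [math.sin(2 * math.pi * f_true * n / fs)
     + 0.3 * math.sin(2 * math.pi * 3 * f_true * n / fs) for n in range(fs)]
y = biquad(x, bandpass_coeffs(f0, fs, 5.0))
f_est = freq_from_zero_crossings(y, fs, fs // 5)
print(f_est)   # ~441 with these numbers
```

Averaging over many crossings should get the resolution well under the ~6 Hz an FFT bin would give, at the cost of needing the filter to kill the harmonics first.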
Any idea of keywords for the underlying math I can read up on?