Why not, you sure can!
Even made one myself:
Basic design is a discrete analog phase-interleaved CC/CV control, with 0-5V range inputs driven from a pair of MCP4922s; outputs are measured with an MCP3208. The header has 5V and SPI on it. (Well, one channel is phase-interleaved, the other is single phase -- hence the two input and two output terminal blocks, and the pair of dual DACs.)
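For reference, driving those setpoint DACs is just one 16-bit SPI word per channel. Here's a minimal sketch of the word packing, going from memory of the MCP4922 datasheet; the actual CS/SPI transfer is platform-specific and omitted, and the Vref = 5V / 1x gain mapping is an assumption to match the 0-5V range:

```c
#include <stdint.h>

/* Build the 16-bit command word for an MCP4922 dual 12-bit DAC.
 * Format (per datasheet): bit15 = A/B channel select, bit14 = BUF,
 * bit13 = gain (1 = 1x), bit12 = shutdown (1 = output active),
 * bits 11..0 = data. */
static uint16_t mcp4922_word(int channel_b, uint16_t code)
{
    uint16_t w = 0;
    if (channel_b) w |= 1u << 15;   /* select channel B */
    w |= 1u << 13;                  /* gain = 1x: Vout = code/4096 * Vref */
    w |= 1u << 12;                  /* output active, not shutdown */
    w |= code & 0x0FFFu;            /* 12-bit data */
    return w;
}

/* Convert a 0-5V setpoint to a DAC code, assuming Vref = 5.0V, 1x gain */
static uint16_t volts_to_code(double v)
{
    if (v < 0.0) v = 0.0;
    if (v > 5.0) v = 5.0;
    return (uint16_t)(v / 5.0 * 4095.0 + 0.5);
}
```

With those assumptions the 0-5V setpoint maps linearly onto the 12-bit code, and the MCU just clocks one such word out per channel update.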
With an MCU instead, there are many options:
- Many have an onboard DAC that can be wired directly to a circuit like the one above.
- Timers are ubiquitous; generating PWM into a filter makes a simple if slow DAC.
- PWM can be passed direct to the inverter (open loop or slow closed loop) -- probably dangerous, not very responsive, but who knows?
- Same, but with a protective latch (e.g. peak current mode control) plus per-cycle interrupt doing PID control -- basic protections in place, safer, but still subject to software error.
- If elaborate configurable logic is available, the better part of a hardware digital control could be built (more likely a full FPGA though).
- There are also some more specialized SMPS MCUs out there; I recall MCP has a PIC integrated with a peak-current-mode latch and gate driver, something like that.
These would be hard to tell apart at a glance; the additional analog bits (filter), presence of SMPS controller, etc. would be the tip-off, but tracing much of the circuit may be necessary to identify the scheme.
Digital controls are tricky to get right. The last one I made was rather noisy; granted, it was a frequency-shift resonant control, and the dithering of that can contribute a lot more noise than other types might, but it may also have had to do with my sample timing, how I smoothed samples out to reduce ripple, etc., or just the natural tendency towards chaos that these systems can have. It only needed to be stable over, say, 10s of ms, so the modest cycle-to-cycle variations average out, and I didn't have reason to investigate further.
A basic buck converter with peak current control should be pretty easy to get within a few LSBs of correct (of whatever part of the control loop has the least ENOB; perhaps the current-setting DAC?), and then regulating output voltage and current to implement the CC/CV operating region is just a slower PID loop.
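As a sketch of that slower loop -- PI rather than full PID, and the gains, limits and min() selection are my assumptions, not a tested design:

```c
/* Hypothetical outer CC/CV loop: two PI controllers (voltage and current);
 * whichever demands the lower peak current wins, and its output drives the
 * peak-current-setting DAC of the inner current-mode loop.
 * Gains and limits are placeholders, not tuned values. */
typedef struct {
    double kp, ki;
    double integ;       /* integrator state */
} pi_t;

static double pi_step(pi_t *c, double err, double dt, double out_max)
{
    c->integ += c->ki * err * dt;
    if (c->integ > out_max) c->integ = out_max;   /* crude anti-windup clamp */
    if (c->integ < 0.0)     c->integ = 0.0;
    double out = c->kp * err + c->integ;
    if (out > out_max) out = out_max;
    if (out < 0.0)     out = 0.0;
    return out;
}

/* One slow-loop update: returns peak-current command, 0..1 of full scale */
static double ccv_step(pi_t *vloop, pi_t *iloop,
                       double v_set, double v_meas,
                       double i_set, double i_meas, double dt)
{
    double dv = pi_step(vloop, v_set - v_meas, dt, 1.0);
    double di = pi_step(iloop, i_set - i_meas, dt, 1.0);
    return (dv < di) ? dv : di;   /* min() selects the CV or CC region */
}
```

Whichever controller demands less peak current is in command, which is exactly the CC/CV region boundary; the clamps keep the idle loop's integrator from winding up too far.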
But I would actively discourage developing digital controls until one knows exactly what one is doing. Software has combinatorially more ways of failing than analog circuits do. Using ICs as building blocks (SMPS controllers, etc.), you at least have some confidence that the manufacturer tested that chip -- if not exhaustively, then well enough to commit to the expense of tapeout and masks. For my part, I don't consider my above-mentioned control particularly reliable: it seems to work, but I have no way to truly test it. I have no illusions that it is infallible, that there aren't weird logic latchups, arithmetic errors, etc. in it, or that the platform itself is at all reliable (it seems to behave according to the datasheet/manual -- but how do you prove that?).
"Hardware eventually fails; software eventually works" - Michael Hartung
That is, hardware can be correct once, at design/manufacture time, and go until its very atoms cease to cooperate (which can be sooner rather than later, but "later" can be very long indeed). Software, after infinite updates, may eventually accomplish the thing it was originally intended to; but I would go one further and say that software need never "work" at all, because in many projects the priority is on more features, not fewer bugs.
Others may not share my caution about software, especially among, say, cheap power supplies, where the pressure is to knock out something quickly, tested only to work in the average case -- who knows. It's probably a good idea not to connect anything to such a power supply that is worth more than the supply itself, and not easily repaired in case of a gross supply-voltage mismatch.
Tim