At last, I've succeeded. Thrice I descended into FPGA hell. Twice I was rebuffed, turned away in defeat.
But this time, I've cracked it. All it took was another entire overhaul.
The new++ uGPU design

This time the uGPU is built around a rudimentary 16-bit RISC-style processor I created specifically for this application.
The processor uses a Harvard architecture, which makes it easy to implement as a single-cycle design.
The core pillar of the system is that the processor sits in a HALT state, with all its internal registers held in reset, until an IRQ comes in.
These IRQs prompt the processor to do something, like load the next character data into the renderer, or transfer data from the I/O into memory.
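The halt-until-IRQ idea can be sketched in Verilog roughly like this. This is a minimal illustration with names I've invented (halt_core, running, irq), not the actual uGPU source:

```verilog
// Hypothetical sketch of halt-until-IRQ, not the real uGPU design.
// While halted, state is held in reset; an IRQ wakes the core, which
// runs its handler and eventually drops back into HALT.
module halt_core (
    input  wire        clk,
    input  wire        irq,    // any pending interrupt request
    output reg  [15:0] pc
);
    reg running;

    always @(posedge clk) begin
        if (!running) begin
            pc <= 16'h0000;          // registers held reset while halted
            if (irq)
                running <= 1'b1;     // IRQ wakes the core
        end else begin
            pc <= pc + 16'd1;        // single-cycle fetch/execute
            // ... handler code eventually clears 'running' again ...
        end
    end

    initial begin
        running = 1'b0;
        pc = 16'h0000;
    end
endmodule
```

Because the core does nothing between requests, each IRQ handler effectively becomes a small program that runs to completion and then parks the machine again.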
The implementation

This being my third redesign, I designed the processor over Christmas (the 'rona made a few near passes at me, so I had some spare time while avoiding it) and then gave myself three weeks to get it all implemented.
If it wasn't done by this Friday just passed, I was going to can the whole thing and switch over to using an RPi Zero as my video out.
So the deadline was set, and off I went. I ran into many of the same old FPGA hell speedbumps (sims working, FPGA not), and caught many of the same silly mistakes I had already learned from (using combinational logic where I really shouldn't have been).
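For anyone who hasn't hit that particular speedbump: the classic version of the mistake is driving a strobe or enable combinationally, so it glitches while its inputs settle. A small illustrative sketch (my own invented names, not from the uGPU source) showing the same enable built both ways:

```verilog
// Illustrative only, not from the uGPU source: the same enable signal
// built two ways. The combinational version can pulse spuriously as
// sel/strobe settle; the registered version only changes on clock edges.
module enable_styles (
    input  wire clk,
    input  wire sel,
    input  wire strobe,
    output reg  wr_en_comb,   // combinational: glitch-prone
    output reg  wr_en_reg     // registered: safe as a write strobe
);
    // Risky: any intermediate value of sel/strobe glitches wr_en_comb
    always @(*)
        wr_en_comb = sel & strobe;

    // Safer: sampled once per cycle, no mid-cycle pulses
    always @(posedge clk)
        wr_en_reg <= sel & strobe;
endmodule
```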
This time it was make or break. I had to get this working. Tuesday/Wednesday I had glimmers of hope. The internal systems were behaving themselves; the processor's I/O routine was responding to the IRQ and transferring hardcoded data across to the video memory.
Come Thursday I implemented my first version of the I/O block. A mix of messy logic combining combinational and clocked elements in all the worst ways, but it kind of works. Data is bouncing around, not always being written to the right location, but there's no corruption, which is good.
For the first time since I started this project I have data being written into the uGPU (a Z80 got that grace) and updating characters on the screen. That felt really good!
By Friday I have made no more progress. At work I scribble out a schematic and simulate it here: http://digitaljs.tilk.eu/ (a very helpful online Verilog visualizer; synthesis is done using yosys).
The logic seems nice and clean, but I won't get to implement it until Sunday, as I'm traveling over the weekend. I know it's beyond my Friday deadline, but it was technically designed on Friday, and I couldn't give up when I was so close.
So I get in on Sunday and implement and test it in hardware. Now the I/O is completely stable, with no characters being written to the wrong location. But some of the characters just don't seem to be getting written at all.
With some more serious head scratching I realize that I have set my I/O IRQ to clear any time the processor's RD signal goes high. That means the processor could be doing something completely unrelated and accidentally clear the interrupt.
One quick processor RD AND'ed with the IO_CS line later, and I have a working uGPU with no missed character writes.
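The bug and its fix can be sketched roughly like this (a fragment with signal names I've guessed at, not the real source):

```verilog
// Sketch of the IRQ-clear fix, with invented names.
// Before, the pending IRQ cleared on ANY processor read:
//     else if (proc_rd) io_irq <= 1'b0;
// so an unrelated read could silently drop a pending character write.
// The fix gates the clear on the I/O block actually being selected.
always @(posedge clk) begin
    if (io_data_ready)
        io_irq <= 1'b1;            // incoming data raises the interrupt
    else if (proc_rd && io_cs)     // RD AND'ed with the IO_CS line
        io_irq <= 1'b0;            // only a read of the I/O block clears it
end
```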
Success... what next?

Relax for a while. But not for too long. Just the right amount. Then come back and see what functionality I can extend and add to the uGPU. I now have a processor that I can easily write any sort of program for.
Add cursor functionality? Sure. Add fancy commands like screen clearing? Absolutely. See what I can do about adding hardware scrolling to the renderer and some kind of sprite support? I'll put that on the long finger for now but it is in the pipeline.
I also had to drop the video output resolution back down to 800x600 to minimize timing issues while implementing the design, so one thing I want to try is bumping it back up to 720p.
If people have an interest in the source code, uGPU board design, or whatever let me know. I will put together documentation for myself, but if there's interest in this project I will tidy it up and make it releasable.
No promises on when I'll get around to this; I want to write an assembler and a few other items for the uGPU's processor for the sake of my own sanity, but I will do it if it's wanted.
I've attached an image I captured of the Z80 writing characters to the uGPU.
I'll upload a video of it to YouTube and link it here when I get a chance.