Author Topic: FPGA VGA Controller for 8-bit computer  (Read 510766 times)


Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #400 on: November 16, 2019, 06:31:42 pm »
Oops, a typo in the GPUtalk - I wasn't setting the read/write address correctly.

Here you go:
I've included a 115200 version in the zip just in case.  We'll sort it out when we TeamView.

Hmmm... now it doesn't seem to be reading the GPU memory - has it gone back to reading the demo files?  ???
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #401 on: November 17, 2019, 04:20:55 pm »
Ok @nockieboy, now that the GPU_talk RS232 memory sniffer/debugger is perfectly functional, your work on the controls and features for the GPU begins now.  I'm sending an enhanced version of GPU_talk with some added smart features later tonight.

The following simple additions are required for your project before we make the address generator:

1.  Modify the core of your sync_gen.v so that the position of the active video image is shifted 16 pixels to the right and 16 lines down.  Remember, you will also need to move the sync signals as well so that the picture still remains centered with reference to the sync timings.  Make these 2 new parameters called "IMAGE_X_OFFSET" and "IMAGE_Y_OFFSET".

2. To vid_osd_generator.v, add and pass through these parameters:
     A) 'GPU_powerup.mif' file
     B) Memory Address Bits  (take care as we want total bits here.  This means when defining the top bit in a register, it would be this value -1)
     C) Memory Words           (If this field is left empty, its value should be calculated from (B)'s parameter)

3.  Make a new verilog module called 'GPU_HW_Control_Regs.v'.  This module should be a mass of registers which are all available in parallel to any other part of the GPU, with an 8 bit data & address[19:0] input write port, a reset input and reset parameters for the first 32 bytes.  This one huge output register should be called GPU_HW_Control_regs[HW_REGS_SIZE*8-1:0].  Module specs:

     a)  All inputs, 8 bit data in, write data enable, 20 bit write address, and reset should all be register latched (yes, latch the reset input, then use that register's output as the module's reset).  This latching offers better FMAX timing fitting if we need to run this module on the alternate slower clock frequency (i.e. a 50 MHz clock).
     b)  A parameter, HW_REGS_SIZE, for the quantity/size of the bulk of 8 bit registers.  (Note, for your top hierarchy, block diagram design entry, you will place a 'param' which this module will use as its setting, see attached image as an example.)
     c)  Base write address for the write input.  The module will always take in all 20 address bits, but when writing data to the 8 bit data input port, the upper address wires [19:8] should equal this number for the write to be successful.  If parameter (b) is >256, then the bottom of the upper base write address [8] will be ignored as you will be opening a 512 byte window, and so on.  (Set this default to the base of the last 256 bytes in your system memory.  Yes, it will occupy the same last 256 bytes as your system GPU ram.)
     d)  Reset parameters.  Have 32 8 bit parameter settings which will be forced into the first 32 bytes if the reset signal is sent.  Make the rest of the registers reset to 0.

4.  Place an input for the 'bulk GPU_HW_Control_regs[HW_REGS_SIZE*8-1:0] registers' into the following verilog modules: sync_generator.v, vid_osd_generator.v, vid_out_stencil.v.  Remember, you will need to receive the block diagram parameter into each module to get the right number of wires for the inputs.

5.  Add a new 48 register output to the 'sync_gen.v' called 'raster_HV_trigger[47:0]'.  Make the even 'raster_HV_trigger[47:0]' wires pulse at a horizontal pixel position and the odd wires pulse on a vertical line according to the new 'GPU_HW_Control_regs[HW_REGS_SIZE*8-1:0]' input wire, organized as 16 bit words, using the first 10 of 16 bits of each 2x2 bytes coming from the new input "GPU_HW_Control_regs[HW_REGS_SIZE*8-1:0]".  You should add a new parameter input to the sync_gen.v which will allow you to shift the beginning byte base address of GPU_HW_Controls where the 'raster_HV_trigger' get their 48 settings from, IE 48x2 bytes = 96 bytes total.  (A rough sketch of these comparators is at the end of this post.)  Once done, on your top block diagram, 'or' these 48 HV trigger outputs together and tie them to the upper ro[],go[],bo[] bits of the OSD_img output.  Test with the GPUtalk app.  You should be able to move 24 vertical lines and 24 horizontal lines around the screen, with the first 0-15 coordinates being non-visible as those coordinates are too far to the left and above the top of the picture since we moved the picture window by 16 pixels to the right and down in step (1).  (Once working, mix and combine / change around the 'or' driven ro[]&go[]&bo[] colors VS raster_HV_trigger[47:0] even and odd pairs so each set of coordinate generated lines may generate a different color on the display.)

     Show us your progress.  Do one item at a time.  For the new modules and functions, perform a functional simulation of each one to make sure their controls work before inserting it into your design, otherwise you are guessing at what functions and what doesn't.  Even show us the simulations as proof of functionality.  (Do this step, and you will have a full GPU working in no time at all.)

     This work is the preparation required for the image raster address generators, as they require parallel hardware control registers and some adjustable trigger events at adjustable locations during the display, including a setup time just before the active display begins.
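
     For item (5), here is a rough sketch of what those even/odd comparators could look like.  The module name, port widths and the byte ordering inside GPU_HW_Control_regs are only assumptions for illustration, not the final code - treat it as a starting point:

Code: [Select]
// Hypothetical sketch of the item-5 comparators - names, widths and byte ordering are guesses.
module raster_trigger_sketch #(
    parameter HW_REGS_SIZE = 256,
    parameter BASE_OFFSET  = 0             // byte address inside GPU_HW_Control_regs of the 48 x 16-bit settings
)(
    input  wire                      pclk,
    input  wire [9:0]                h_count,              // horizontal pixel counter from sync_gen
    input  wire [9:0]                v_count,              // vertical line counter from sync_gen
    input  wire [HW_REGS_SIZE*8-1:0] GPU_HW_Control_regs,
    output reg  [47:0]               raster_HV_trigger
);

integer i;
always @(posedge pclk) begin
    for (i = 0; i < 24; i = i + 1) begin
        // even wires pulse on a horizontal pixel position, odd wires on a vertical line;
        // each pair takes the low 10 bits of two consecutive 16-bit words (2x2 bytes per pair)
        raster_HV_trigger[i*2]   <= (h_count == GPU_HW_Control_regs[(BASE_OFFSET + i*4    )*8 +: 10]);
        raster_HV_trigger[i*2+1] <= (v_count == GPU_HW_Control_regs[(BASE_OFFSET + i*4 + 2)*8 +: 10]);
    end
end

endmodule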
« Last Edit: November 17, 2019, 07:59:41 pm by BrianHG »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #402 on: November 17, 2019, 05:22:54 pm »
Simulation tips:

Make your main GPU folder with a complete final project sub-folder + alternate projects for each verilog .v simulation.

  In each of those alternate folders for the simulation project, make a Quartus 9 project with a Cyclone III FPGA and make a block diagram entry with clk and reset inputs as well as any other input pins to drive your verilog module.  Add output pins you want to see from the verilog module.

  Add the verilog module by 'Adding' its file to your Quartus 9 project in Quartus 9's 'Project/Add remove files in project' from your master Quartus Prime project's folder and add any other required verilog dependencies.  This way you are not copying xxx.v files back and forth.  Quartus 9's simulation will use only the verilog code in your master Quartus Prime project folder.

  Open the xxx.v file and generate the symbol in Quartus 9, insert and wire the symbol into your simulation design.  Compile.  Create a new .vwf (vector waveform file) and add the input and output pins, buses and regs you want to see in the simulation.  Go into simulator settings, select 'Simulation mode' "Functional" and Simulator Input "xxxx.vwf", and simulate away.
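
  (If you would rather drive the module from a plain verilog testbench instead of a .vwf, something as small as the sketch below also works in most simulators.  The port names here are guessed from your posted code - pclk, reset, pc_ena, hsync, vsync, hde, vde - so adjust them to whatever your sync_gen.v actually exposes.)

Code: [Select]
// Hypothetical testbench sketch - port names are guesses, match them to the real sync_gen.v.
`timescale 1ns/1ns
module sync_gen_tb;
    reg        clk    = 1'b0;
    reg        reset  = 1'b1;
    reg  [3:0] pc_ena = 4'd0;       // held at 0 so the 'once per pixel' branch runs every clock
    wire       hsync, vsync, hde, vde;

    sync_gen dut (
        .pclk(clk), .reset(reset), .pc_ena(pc_ena),
        .hsync(hsync), .vsync(vsync), .hde(hde), .vde(vde)
    );

    always #5 clk = ~clk;           // any period will do for a functional simulation

    initial begin
        #100 reset = 1'b0;          // release reset after a few clocks
        #20_000_000 $stop;          // run a little more than one frame, then stop
    end
endmodule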

I can walk you through simulating your 'sync_gen.v' if you are having trouble.
« Last Edit: November 17, 2019, 08:02:05 pm by BrianHG »
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #403 on: November 18, 2019, 09:37:13 am »
1.  Modify the core of your sync_gen.v so that the position of the active video image is shifted 16 pixels to the right and 16 lines down.  Remember, you will also need to move the sync signals as well so that the picture still remains centered with reference to the sync timings.  Make these 2 new parameters called "IMAGE_X_OFFSET" and "IMAGE_Y_OFFSET".

Is this really as simple as this?

Code: [Select]
// image offset parameters
parameter IMAGE_OFFSET_X = 16;    <----- added parameter
parameter IMAGE_OFFSET_Y = 16;    <----- added parameter

// no-draw area definitions
// defined as parameters so you can edit these on Quartus' block diagram editor
parameter H_FRONT_PORCH = 16;
parameter HSYNC_WIDTH   = 96;
parameter H_BACK_PORCH  = 48;
parameter V_FRONT_PORCH = 10;
parameter VSYNC_HEIGHT = 2;
parameter V_BACK_PORCH  = 33;
parameter PIX_CLK_DIVIED = 4;

// re-calculate back porch values, taking image offset into account
localparam H_ADJ_B_PORCH = H_BACK_PORCH - IMAGE_OFFSET_X;    <----- reduce back porch accordingly
localparam V_ADJ_B_PORCH = V_BACK_PORCH - IMAGE_OFFSET_Y;    <----- reduce back porch accordingly

// total screen resolution
localparam LINE = IMAGE_OFFSET_X + H_RES + H_FRONT_PORCH + HSYNC_WIDTH + H_ADJ_B_PORCH; // complete line (inc. horizontal blanking area)
localparam SCANLINES = IMAGE_OFFSET_Y + V_RES + V_FRONT_PORCH + VSYNC_HEIGHT + V_ADJ_B_PORCH; // total scan lines (inc. vertical blanking area)

// useful trigger points
localparam HS_STA = IMAGE_OFFSET_X + H_RES + H_FRONT_PORCH - 1; // horizontal sync ON (the minus 1 is because hsync is a REG, and thus one clock behind)
localparam HS_END = IMAGE_OFFSET_X + H_RES + H_FRONT_PORCH + HSYNC_WIDTH - 1; // horizontal sync OFF (the minus 1 is because hsync is a REG, and thus one clock behind)
localparam VS_STA = IMAGE_OFFSET_Y + V_RES + V_FRONT_PORCH; // vertical sync ON
localparam VS_END = IMAGE_OFFSET_Y + V_RES + V_FRONT_PORCH + VSYNC_HEIGHT; // vertical sync OFF

 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #404 on: November 18, 2019, 09:57:46 am »
2. To vid_osd_generator.v, add and pass through these parameters:
     A) 'GPU_powerup.mif' file
     B) Memory Address Bits  (take care as we want total bits here.  This means when defining the top bit in a register, it would be this value -1)
     C) Memory Words           (If this field is left empty, its value should be calculated from (B)'s parameter)

Okay, just need some clarification here for A).  Do you want me to pass a reference to the init file that gpu_RAM uses, so the filename can easily be changed in the design view, or do you want me to pass a new file?

Code including changes for B) and C) is attached.
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #405 on: November 18, 2019, 10:57:22 am »
3.  Make a new verilog module called 'GPU_HW_Control_Regs.v'.  This module should be a mass of registers which are all available in parallel to any other part of the GPU, with an 8 bit data & address[19:0] input write port, a reset input and reset parameters for the first 32 bytes.  This one huge output register should be called GPU_HW_Control_regs[HW_REGS_SIZE*8-1:0].  Module specs:

     a)  All inputs, 8 bit data in, write data enable, 20 bit write address, and reset should all be register latched (yes, latch the reset input, then use that register's output as the module's reset).  This latching offers better FMAX timing fitting if we need to run this module on the alternate slower clock frequency (i.e. a 50 MHz clock).
     b)  A parameter, HW_REGS_SIZE, for the quantity/size of the bulk of 8 bit registers.  (Note, for your top hierarchy, block diagram design entry, you will place a 'param' which this module will use as its setting, see attached image as an example.)
     c)  Base write address for the write input.  The module will always take in all 20 address bits, but when writing data to the 8 bit data input port, the upper address wires [19:8] should equal this number for the write to be successful...

Okay, still with you so far...  ???

...If parameter (b) is >256, then the bottom of the upper base write address [8] will be ignored as you will be opening a 512 byte window, and so on.  (Set this default to the base of the last 256 bytes in your system memory.  Yes, it will occupy the same last 256 bytes as your system GPU ram.)

Nope - lost me now. :o

So I'm supposed to set the 9th bit [8] to zero if [19:8] is greater than 256?  The bit in parentheses is totally lost on me.

So we've got 32 hardware control registers that can be written to and are all output from the module.  These write addresses we're discussing here are to address the individual bits of the registers, but something else is going on which I don't understand if the upper 12 bits are above 256?  These registers are separate from the GPU RAM, so I guess the reference to the base address in GPU RAM is to set the HW registers from the values in a 32-byte block in GPU RAM that the host can write to?  Going to need a little more guidance on this one.  Wouldn't it make sense to have the 32 HW registers shadow the first 32 bytes in GPU RAM?

« Last Edit: November 18, 2019, 11:00:42 am by nockieboy »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #406 on: November 18, 2019, 02:30:49 pm »
I'll get to your 2 last comments in the next response.

Ok, 2 items:

1) The latest GPUtalk.

-I've added Load and Save dialogs, where you can now choose file names and start memory address and length of memory.  Entering nothing for the start and length automatically uses all the ram buffer.
-I've added 'm' which generates and saves a Quartus .mif Memory Initialization File.
-You can now click on the binary bits to toggle them when editing the data.
-You can now also click on the data and 16bit decimal data to enter a number.
-Using the + and - keys to increment and decrement the selected memory data does it for 2 bytes as a 16 bit word.  Using the wheel mouse to inc and dec the data value is still only 8 bit.
-When not in writing data mode, you can now click on the address to enter an address to directly jump to.
-Now, if you do not enter anything when prompted for the 'COM' number, the program bypasses all RS232 routines making it just as fast as if you were connected.  Pretty much makes it just a hex editor and .mif file generator.

2) 256 character, high res version of the GPU Quartus project.
-It now has a 256 character font and should be in high res 64x32 characters.
-The font graphics now begin at address 0.
-There is also a Parameter in the OSD to switch to a 8x16 font.
-The .mif still has the Atari 128 character font (in case the attached VGA rom .bin fonts aren't compatible), so you will need to use GPUtalk to load the right font (8x8 or 8x16, depending on the parameter setting) into address 0.  Then you can generate a new .mif file with the best looking font for you.  (Send photos.)

I've included 2 VGA fonts I got from this site:
https://github.com/spacerace/romfont/tree/master/font-bin -> Binary files
https://github.com/spacerace/romfont/tree/master/font-images -> Photos of the fonts
If they work, the others should work too.

See attachments:
« Last Edit: November 18, 2019, 02:38:33 pm by BrianHG »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #407 on: November 18, 2019, 02:59:26 pm »
1.  Modify the core of your sync_gen.v so that the position of the active video image is shifted 16 pixels to the right and 16 lines down.  Remember, you will also need to move the sync signals as well so that the picture still remains centered with reference to the sync timings.  Make these 2 new parameters called "IMAGE_X_OFFSET" and "IMAGE_Y_OFFSET".

Is this really as simple as this?

Code: [Select]
// image offset parameters
parameter IMAGE_OFFSET_X = 16;    <----- added parameter
parameter IMAGE_OFFSET_Y = 16;    <----- added parameter

// no-draw area definitions
// defined as parameters so you can edit these on Quartus' block diagram editor
parameter H_FRONT_PORCH = 16;
parameter HSYNC_WIDTH   = 96;
parameter H_BACK_PORCH  = 48;
parameter V_FRONT_PORCH = 10;
parameter VSYNC_HEIGHT = 2;
parameter V_BACK_PORCH  = 33;
parameter PIX_CLK_DIVIED = 4;

// re-calculate back porch values, taking image offset into account
localparam H_ADJ_B_PORCH = H_BACK_PORCH - IMAGE_OFFSET_X;    <----- reduce back porch accordingly
localparam V_ADJ_B_PORCH = V_BACK_PORCH - IMAGE_OFFSET_Y;    <----- reduce back porch accordingly

// total screen resolution
localparam LINE = IMAGE_OFFSET_X + H_RES + H_FRONT_PORCH + HSYNC_WIDTH + H_ADJ_B_PORCH; // complete line (inc. horizontal blanking area)
localparam SCANLINES = IMAGE_OFFSET_Y + V_RES + V_FRONT_PORCH + VSYNC_HEIGHT + V_ADJ_B_PORCH; // total scan lines (inc. vertical blanking area)

// useful trigger points
localparam HS_STA = IMAGE_OFFSET_X + H_RES + H_FRONT_PORCH - 1; // horizontal sync ON (the minus 1 is because hsync is a REG, and thus one clock behind)
localparam HS_END = IMAGE_OFFSET_X + H_RES + H_FRONT_PORCH + HSYNC_WIDTH - 1; // horizontal sync OFF (the minus 1 is because hsync is a REG, and thus one clock behind)
localparam VS_STA = IMAGE_OFFSET_Y + V_RES + V_FRONT_PORCH; // vertical sync ON
localparam VS_END = IMAGE_OFFSET_Y + V_RES + V_FRONT_PORCH + VSYNC_HEIGHT; // vertical sync OFF
Oh, ok, here is how I would have done it:

Code: [Select]
// image offset parameters
parameter IMAGE_OFFSET_X = 16;    <----- added parameter
parameter IMAGE_OFFSET_Y = 16;    <----- added parameter

// no-draw area definitions
// defined as parameters so you can edit these on Quartus' block diagram editor
parameter H_FRONT_PORCH = 16;
parameter HSYNC_WIDTH   = 96;
parameter H_BACK_PORCH  = 48;
parameter V_FRONT_PORCH = 10;
parameter VSYNC_HEIGHT = 2;
parameter V_BACK_PORCH  = 33;
parameter PIX_CLK_DIVIED = 4;

// total screen resolution
localparam LINE = H_RES + H_FRONT_PORCH + HSYNC_WIDTH + H_BACK_PORCH; // complete line (inc. horizontal blanking area)
localparam SCANLINES = V_RES + V_FRONT_PORCH + VSYNC_HEIGHT + V_BACK_PORCH; // total scan lines (inc. vertical blanking area)

// useful trigger points
localparam HS_STA = IMAGE_OFFSET_X + H_RES + H_FRONT_PORCH - 1; // horizontal sync ON (the minus 1 is because hsync is a REG, and thus one clock behind)
localparam HS_END = IMAGE_OFFSET_X + H_RES + H_FRONT_PORCH + HSYNC_WIDTH - 1; // horizontal sync OFF (the minus 1 is because hsync is a REG, and thus one clock behind)
localparam VS_STA = IMAGE_OFFSET_Y + V_RES + V_FRONT_PORCH; // vertical sync ON
localparam VS_END = IMAGE_OFFSET_Y + V_RES + V_FRONT_PORCH + VSYNC_HEIGHT; // vertical sync OFF


and don't forget, you need to push the HDE and VDE screen coordinates in the middle of the code.
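
Something along these lines, perhaps (a sketch only, reusing the names from your excerpt; the exact -1 adjustments for the registered outputs are something you will need to confirm in simulation):

Code: [Select]
// Sketch only - names borrowed from the excerpt above, edge cases deliberately unverified.
module hde_vde_offset_sketch #(
    parameter IMAGE_OFFSET_X = 16, IMAGE_OFFSET_Y = 16,
    parameter H_RES = 640,         V_RES = 480
)(
    input  wire       pclk,
    input  wire [9:0] h_count,   // horizontal pixel counter
    input  wire [9:0] v_count,   // vertical line counter
    output reg        hde,
    output reg        vde
);

always @(posedge pclk) begin
    if (h_count == IMAGE_OFFSET_X - 1)          hde <= 1'b1;  // active video now begins OFFSET_X pixels into the line
    if (h_count == IMAGE_OFFSET_X + H_RES - 1)  hde <= 1'b0;  // ...and ends H_RES pixels after that
    if (v_count == IMAGE_OFFSET_Y)              vde <= 1'b1;  // the same shift applies to the vertical enable
    if (v_count == IMAGE_OFFSET_Y + V_RES)      vde <= 1'b0;
end

endmodule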

If you do not move the image, with your 8bit cpu, if you want to position an action off the left edge of the screen, you would have to know about the raster width and place that controlled action on the right side of the image X&Y counter.  If you'd rather do things this way, then you do not need to change the sync_gen at all.  However, any code for your MCU which uses sprites which can be scrolled off the left side of the screen will require you to place the sprite off the right side of the screen and subtract a value.

Basically, though usually fixed, there are a number of settings which will need changing if you change the horizontal and vertical screen resolution, whereas if the screen is shifted down and to the right according to the reference H&V counter, these settings will always stay the same.

You have a judgement call to make here.  I decided to make the numbers parameters so you may turn the feature off and on later on depending on the complexity of the horizontal line address generator features.

I'm also forcing you to set up a simulation of the sync_gen.v so you may check if the feature works correctly.

You're going to need this step.  Compile and check will no longer suffice when you need to see if each signal happens on the right pixel exactly.
« Last Edit: November 18, 2019, 03:24:06 pm by BrianHG »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #408 on: November 18, 2019, 03:16:59 pm »
3.  Make a new verilog module called 'GPU_HW_Control_Regs.v'.  This module should be a mass of registers which are all available in parallel to any other part of the GPU, with an 8 bit data & address[19:0] input write port, a reset input and reset parameters for the first 32 bytes.  This one huge output register should be called GPU_HW_Control_regs[HW_REGS_SIZE*8-1:0].  Module specs:

     a)  All inputs, 8 bit data in, write data enable, 20 bit write address, and reset should all be register latched (yes, latch the reset input, then use that register's output as the module's reset).  This latching offers better FMAX timing fitting if we need to run this module on the alternate slower clock frequency (i.e. a 50 MHz clock).
     b)  A parameter, HW_REGS_SIZE, for the quantity/size of the bulk of 8 bit registers.  (Note, for your top hierarchy, block diagram design entry, you will place a 'param' which this module will use as its setting, see attached image as an example.)
     c)  Base write address for the write input.  The module will always take in all 20 address bits, but when writing data to the 8 bit data input port, the upper address wires [19:8] should equal this number for the write to be successful...

Okay, still with you so far...  ???

...If parameter (b) is >256, then the bottom of the upper base write address [8] will be ignored as you will be opening a 512 byte window, and so on.  (Set this default to the base of the last 256 bytes in your system memory.  Yes, it will occupy the same last 256 bytes as your system GPU ram.)

Nope - lost me now. :o

So I'm supposed to set the 9th bit [8] to zero if [19:8] is greater than 256?  The bit in parentheses is totally lost on me.

So we've got 32 hardware control registers that can be written to and are all output from the module.  These write addresses we're discussing here are to address the individual bits of the registers, but something else is going on which I don't understand if the upper 12 bits are above 256?  These registers are separate from the GPU RAM, so I guess the reference to the base address in GPU RAM is to set the HW registers from the values in a 32-byte block in GPU RAM that the host can write to?  Going to need a little more guidance on this one.  Wouldn't it make sense to have the 32 HW registers shadow the first 32 bytes in GPU RAM?

Note, the HW_REGS_SIZE should be at least 256.  Now, you need a parameter 'WRITE_BASE_ADDRESS'.
Now, for example, if we set the base address to $003F00 (16128), when the input address[19:0] = $003F00 through $003FFF and a write enable comes in, the correct 8 bits will be filled into the right 1 of 256 GPU_HW_Control_regs.  It's basically a ram, but in regs.  Now, if you have HW_REGS_SIZE set to 512 and the 'WRITE_BASE_ADDRESS' set to $003E00, then writes from input address[19:0] = $003E00 through $003FFF would access the 512 x 8bit GPU_HW_Control_regs.  The 'WRITE_BASE_ADDRESS' can be set to anything within the 20 bit address range.  (Don't think about addr > or addr <, think about just the number of address wires: the 'WRITE_BASE_ADDRESS' and the input address[19:0] just need to be equal while the least significant address wires are not used in the addressA==addressB test.  These low address bits are only used to point to the correct GPU_HW_Control_regs for writing the 8 bit data coming in.)
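
To put that in code form, here is a minimal sketch of the write decode (the port names, the reset-to-zero loop and the reg-latched inputs are only an illustration; the 32 reset parameters from item 3(d) are hinted at in a comment rather than written out):

Code: [Select]
// Hypothetical sketch of GPU_HW_Control_Regs.v - port names and reset details are placeholders.
module GPU_HW_Control_Regs_sketch #(
    parameter HW_REGS_SIZE       = 256,
    parameter ADDR_BITS          = 8,          // log2(HW_REGS_SIZE): how many low address wires select a register
    parameter WRITE_BASE_ADDRESS = 20'h3F00    // base of the last 256 bytes of the 16K system memory
)(
    input  wire        clk,
    input  wire        reset,
    input  wire        we,
    input  wire [19:0] addr,
    input  wire [7:0]  data_in,
    output reg  [HW_REGS_SIZE*8-1:0] GPU_HW_Control_regs
);

// latch every input first, as item 3(a) asks, and work from the latched copies
reg        we_r, reset_r;
reg [19:0] addr_r;
reg [7:0]  data_r;
integer i;

always @(posedge clk) begin
    we_r    <= we;
    reset_r <= reset;
    addr_r  <= addr;
    data_r  <= data_in;

    if (reset_r) begin
        // item 3(d): the 32 reset parameters would be loaded here; everything else clears to 0
        for (i = 0; i < HW_REGS_SIZE; i = i + 1) GPU_HW_Control_regs[i*8 +: 8] <= 8'h00;
    end else if (we_r && (addr_r[19:ADDR_BITS] == (WRITE_BASE_ADDRESS >> ADDR_BITS))) begin
        // only the upper address wires are compared; the low ADDR_BITS wires pick which register is written
        GPU_HW_Control_regs[addr_r[ADDR_BITS-1:0]*8 +: 8] <= data_r;
    end
end

endmodule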

Begin with this much and simulate.

As for the reset, it's just like any of the other resets you have already done.  No separate handling needed.
« Last Edit: November 18, 2019, 03:19:31 pm by BrianHG »
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #409 on: November 18, 2019, 04:36:34 pm »
1) The latest GPUtalk.

Marvellous - works really well, thanks!

-The .mif still has the Atari 128 character font (in case the attached VGA rom .bin fonts aren't compatible), so you will need to use GPUtalk to load the right font (8x8 or 8x16, depending on the parameter setting) into address 0.  Then you can generate a new .mif file with the best looking font for you.  (Send photos.)

Okay, so I've changed the 8x8 font using GPUtalk without any issues:

Before:

[attachment 874646-0: screenshot]

After changing to 8x8 font:

[attachment 874650-1: screenshot]

But there's a problem - if I recompile the project with the parameter FONT_8x16 set to 1 in vid_osd_generator.v, I get a slightly corrupted display - which is expected as initially the GPU is powering up with the default 8x8 .mif in memory, but when I start up GPUtalk, it is constantly listing errors and takes a good 10-15 seconds to list the first page of data:

[attachment 874654-2: screenshot]

This only happens when FONT_8x16 is set to 1 in vid_osd_generator...  :-//

If they work, the others should work too.

Well, it looked like it might be working, but the errors in GPUtalk are preventing a clean bin transfer to the GPU.  I can't imagine what the problem could be, but you might have an idea?
« Last Edit: November 18, 2019, 04:44:52 pm by nockieboy »
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #410 on: November 18, 2019, 07:30:09 pm »
...and don't forget, you need to push the HDE and VDE screen coordinates in the middle of the code.

Not really sure what you mean here - like this?

Code: [Select]
// handle signal generation
always @(posedge pclk)
begin
    if (reset) // reset to start of frame
    begin
        h_count <= (H_RES - 2);
        v_count <= (SCANLINES - 2);
        //z_count <= 1'b0;
        hsync   <= 1'b0;
        vsync   <= 1'b0;
        vde     <= 1'b0;
        hde     <= 1'b0;
    end
    else
    begin
        if (pc_ena[3:0] == 0) // once per pixel
        begin

            // Horizontal blanking area - set HDE LOW
            if (h_count == IMAGE_OFFSET_X + H_RES - 1)    *******  Added IMAGE_OFFSET_X in here
            begin
                hde <= 1'b0;
            end

            // check for generation of HSYNC pulse
            if (h_count == HS_STA)
            begin
                hsync <= 1'b1; // turn on HSYNC pulse
            end
            else if (h_count == HS_END)
                hsync <= 1'b0; // turn off HSYNC pulse

            // check for generation of VSYNC pulse
            if (v_count == VS_STA)
            begin
                vsync <= 1'b1; // turn on VSYNC pulse
            end
            else if (v_count == VS_END)
                vsync <= 1'b0; // turn off VSYNC pulse

            // reset h_count & increment v_count at end of scanline
            if (h_count == LINE - 1) // end of line
            begin
                h_count <= 1'b0;
                hde <= 1'b1;  // Turn on horizontal video data enable

                // Now h_count has been zeroed, check if the V-count should be cleared at end of SCANLINES
                if (v_count == SCANLINES - 1)
                begin
                    v_count <= 1'b0;
                    vde <= 1'b1; // Turn on vertical video data enable
                end
                else
                begin // If v_count isn't being cleared, increment v_count
                    v_count <= v_count + 1'b1; // increment v_count to next scanline
                    if (v_count == V_RES - 1)
                        vde <= 1'b0 ; // Turn off vertical video data enable - reached bottom of display area
                end
            end
            else // not at end of scanline, so just increment horizontal counter
                h_count <= h_count + 1'b1;
            if (h_count == H_RES - 1)
                hde <= 1'b0 ;  // Turn off vertical video data enable

        end // if (pc_ena)

    end // else !reset

end // always @clk

...but not sure about pushing vde - it seems to be fine as it uses triggers that incorporate IMAGE_OFFSET_Y?

I'm also forcing you to set up a simulation of the sync_gen.v so you may check if the feature works correctly.

You're going to need this step.  Compile and check will no longer suffice when you need to see if each signal happens on the right pixel exactly.

This is going to take me a little time to set up and interpret correctly, but I understand that it's an important step.
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #411 on: November 18, 2019, 09:26:50 pm »
Now, for example, if we set the base address to $003F00 (16128), when the input address[19:0] = $003F00 through $003FFF, and a write enable comes in, the correct 8 bits will be filled into the right 1 of 256 GPU_HW_Control_regs.

Right, I'm with you.  So I can set WRITE_BASE_ADDRESS based off of HW_REGS_SIZE:

Code: [Select]
parameter HW_REGS_SIZE = 256;
parameter [HW_REGS_SIZE - 1:0] RESET_BYTES;
parameter WRITE_BASE_ADDRESS = 16384 - HW_REGS_SIZE;

Assuming I'm not out by 1 in those calculations - I seem to have real difficulty with when and where I should be subtracting 1 from the figures.  :o  But the above figures will give a WRITE_BASE_ADDRESS of $3F00, according to my trusty Windows calculator..  :-DMM

I guess the above calculation for WRITE_BASE_ADDRESS is fine as long as we always want it at the end of the GPU's RAM space.

Don't think about addr > or addr <, think about just the number of address wires: the 'WRITE_BASE_ADDRESS' and the input address[19:0] just need to be equal while the least significant address wires are not used in the addressA==addressB test.  These low address bits are only used to point to the correct GPU_HW_Control_regs for writing the 8 bit data coming in.

So we're not comparing the write address against a range of suitable addresses, but instead checking for WRITE_ADDRESS[19:8] == WRITE_BASE_ADDRESS[19:8]?
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #412 on: November 18, 2019, 09:42:18 pm »

This only happens when FONT_8x16 is set to 1 in vid_osd_generator...  :-//

If they work, the others should work too.

Well, it looked like it might be working, but the errors in GPUtalk are preventing a clean bin transfer to the GPU.  I can't imagine what the problem could be, but you might have an idea?
There shouldn't be a problem with the GPUtalk and the 8x16 setting as it has nothing to do with the com at all.
GPU talk says 'received 0 bytes of a read expecting 256 bytes.'.  Now, GPU talk doesn't check if the com port it opened is actually working.  So it may be transmitting data into a 'NULL' port.  This could happen if another app or GPUtalk is open in the background, or, if you remember how Quartus' USB BlasterII wouldn't work when GPUtalk was left running, since USB Blaster also uses a FTDI chip like your USB-TTL RS232 cable, maybe you just need to close the USB Blaster control window, or close and re-open GPUtalk.   It might be useful to check if your USB-RS232 cable has a RXD-TXD LED.  Or, I can add TXD/RXD LED output signals to my RS232debugger.v and you can tie those to 2 LEDs on your dev board.  Then you can see com activity regardless of the data coming in and out.  (I'm prepping a new .v file with 2 led outputs, RXD and TXD led signals.)
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #413 on: November 18, 2019, 10:38:40 pm »
I've added a LED_txd and LED_rxd to the "rs232_DEBUGGER.v".  These outputs are positive logic, so, if your dev board uses active-low LEDs, then just add the 'exp' gate between the module outputs and the IO pins.

(Note I found a bug with the low baud rate in the rs232_debugger.v, so 1 last fix will be coming after I get a day's rest.)
« Last Edit: November 18, 2019, 10:40:54 pm by BrianHG »
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #414 on: November 19, 2019, 02:49:12 pm »
There shouldn't be a problem with the GPUtalk and the 8x16 setting as it has nothing to do with the com at all.

I know - I don't think the issue is with GPUtalk, though.  :o  Nothing changes with GPUtalk, whether it's talking to the 8x8-build GPU or 8x16-build version.  Is something getting changed somehow in the RS232 module in the 8x16 build?  I know it's just a single parameter that's changing, and it's not even in the RS232 module, but I don't see how GPUtalk can work fine with the 8x8 build but not the 8x16 one.  :-//

GPU talk says 'received 0 bytes of a read expecting 256 bytes.'.  Now, GPU talk doesn't check if the com port it opened is actually working.  So it may be transmitting data into a 'NULL' port.

But some data is getting through, because after 10 seconds or so the screen updates with data, and I'm able to write to the GPU, albeit not very predictably.  I happen to share the Tx/Rx lines with two of the LEDs on the dev board, so I can see activity on the serial connection at the GPU - I can also see the Tx/Rx LEDs flickering on the USB-TTL converter.

Also, immediately after testing the 8x16 build, I compile and upload the 8x8 version to the GPU and restart GPUtalk and everything works fine.  I'm careful to make sure the programmer isn't still open when I run GPUtalk, and vice-versa.

Though to me, it can't be an issue with GPUtalk as it works fine with the GPU in 8x8 mode - it must be something going wrong with the GPU build somehow?
« Last Edit: November 19, 2019, 03:50:44 pm by nockieboy »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #415 on: November 20, 2019, 01:20:05 pm »
I've revamped the com and GPUtalk into a generic debugging utility.  Update coming later today.
Meanwhile, create a simulation project for just the sync_gen.v module.  You gotta learn, cause the next steps will get deep...
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #416 on: November 20, 2019, 03:38:03 pm »
Meanwhile, create a simulation project for just the sync_gen.v module.

Installing 9.1sp2 as I write this.

You gotta learn, cause the next steps will get deep...

Were you just trying to keep me motivated earlier when you said we'd hit the hardest part of the project then?  ;)
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #417 on: November 20, 2019, 08:43:25 pm »
Meanwhile, create a simulation project for just the sync_gen.v module.

Installing 9.1sp2 as I write this.

You gotta learn, cause the next steps will get deep...

Were you just trying to keep me motivated earlier when you said we'd hit the hardest part of the project then?  ;)

It's not that it's hard, it's that you will not succeed without simulation.  You need to see what your code and internal registers are doing when you write code.  You need to test things which you cannot generate in the real world feeding into your FPGA.  You need to test each verilog module, feeding it specific controlled inputs to see how it behaves.

    I've renamed GPU_talk to RS232_Debugger.  I've cleaned up the verilog code and will soon clean up the HEX Editor / Debugger and post it on a separate thread, as it is becoming a viable product all on its own.  I've removed all the Altera dependencies in RS232_Debugger.v so it may now work with any FPGA vendor / verilog compiler.

     I've also added an additional 8 real-time parallel debug ports: 4x8bit input ports and 4x8bit output ports.  In the new RS232_Debugger.exe, the 4 green ports displayed at the bottom left are a real-time, always-refreshed display.  I currently connected the in0[7:0] & in1[7:0] to a new 16bit frame counter in the sync_gen.v module, which you can see counting while the RS232_Debugger is running.  As for the output ports, I tied ports out1[7:0] through out3[7:0] as a color palette selection for your OSD font.  (The foreground text color is set with negative values while the background is set with positive.)  You can manipulate the colors by clicking on the out1,2,3 bits in the debugger, or hover the mouse over the out's binary bits and use the mouse wheel to increment/decrement the number.

See attached image for illustration, plus get the new GPU project and RS232_Debugger hex editor.
And if you can't get the 8x16 font working, there is something wrong with your setup, compiler install, corrupt files, or something else.
« Last Edit: November 21, 2019, 07:25:27 am by BrianHG »
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #418 on: November 21, 2019, 10:19:12 am »
I've renamed GPU_talk to RS232_Debugger.  I've cleaned up the verilog code and will soon clean up the HEX Editor / Debugger and post it on a separate thread, as it is becoming a viable product all on its own.  I've removed all the Altera dependencies in RS232_Debugger.v so it may now work with any FPGA vendor / verilog compiler.

[attachment 876182-0: screenshot]

It certainly is a viable product.  It's working perfectly now - not sure what's changed between this project and the last, or this version of GPU_talk/RS232_Debugger and the last, but I'm not getting any problems communicating with the GPU while it's running in 8x16 mode now:

[attachment 876186-1: photo]

Sorry about the picture quality in that second one - the sun decided to come out whilst I was trying to take the picture, but you get the idea.  ;D

     I've also added an additional 8 real-time parallel debug ports: 4x8bit input ports and 4x8bit output ports.  In the new RS232_Debugger.exe, the 4 green ports displayed at the bottom left are a real-time, always-refreshed display.  I currently connected the in0[7:0] & in1[7:0] to a new 16bit frame counter in the sync_gen.v module, which you can see counting while the RS232_Debugger is running.  As for the output ports, I tied ports out1[7:0] through out3[7:0] as a color palette selection for your OSD font.  (The foreground text color is set with negative values while the background is set with positive.)  You can manipulate the colors by clicking on the out1,2,3 bits in the debugger, or hover the mouse over the out's binary bits and use the mouse wheel to increment/decrement the number.

Yes, I've had some fun playing with the colours before realising I had work to do!  :-\

 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #419 on: November 21, 2019, 12:00:43 pm »
See what you can learn about setting up a simulation in QIIv9.1.  I can make myself available on the weekend to walk you through a complete simulation from scratch.  You'll probably need this as existing documentation sucks.

:( Sadly, when Altera decided to ditch their inbuilt simulator and re-wrote their user interface to the new slower one, we lost a quick code diagnostic tool for those who didn't want to bother with ModelSim, where you could just drop and draw IO pins and registers from your project and see immediate results.

It's funny, even if you wanted to use third party simulators like ModelSim, Quartus' original Vector Waveform editor did allow the generation of vector stimulus files to be used by ModelSim and other third party simulators.  However, the new Quartus Prime's vector file viewer/editor just sucks royal shit in speed, performance and waveform editing/entering/generation features compared to the old QIIv9.1 version, which functions like a combined word-processor and Photoshop paint software.
« Last Edit: November 21, 2019, 12:07:15 pm by BrianHG »
 

Offline avogadro

  • Contributor
  • Posts: 34
  • Country: hr
Re: FPGA VGA Controller for 8-bit computer
« Reply #420 on: November 21, 2019, 12:10:34 pm »
I don't know if anyone has mentioned it, but you can use the Wayback Machine to access the site you're looking for.

https://web.archive.org/web/20190414023118/http://searle.hostei.com/grant/z80/
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #421 on: November 21, 2019, 12:53:49 pm »
I don't know if anyone has mentioned it, but you can use the Wayback Machine to access the site you're looking for.

https://web.archive.org/web/20190414023118/http://searle.hostei.com/grant/z80/

Thanks avogadro, that's all sorted now - there's a few mirror sites, as well as the wayback machine, but Grant Searle has opened another website at http://searle.wales as well.

See what you can learn about setting up a simulation in QIIv9.1.  I can make myself available on the weekend to walk you through a complete simulation from scratch.  You'll probably need this as existing documentation sucks.

Absolutely - I'll see what I can do with the limited spare time I have before the weekend, but a walkthrough would be much appreciated.  :-+
 

Offline nockieboyTopic starter

  • Super Contributor
  • ***
  • Posts: 1812
  • Country: england
Re: FPGA VGA Controller for 8-bit computer
« Reply #422 on: November 21, 2019, 03:15:49 pm »
Okay, so while I'm waiting for the weekend to learn about simulation, I thought I'd have a go at writing a host handler to monitor the Z80's address, data and control buses and accept anything written to the 'GPU window' in memory and pass it to the GPU RAM.

I thought I'd pop my code up to be torn apart as I don't really know if I'm going down the right lines with this and would appreciate the feedback.  ;)

Basically, it should monitor the Z80's address bus and memory signals to see if a byte is being written to the window in RAM that the GPU will sit within - this should be the third 512 KB, so 0x180000 to 0x1FFFFF.  When it sees the Z80's MREQ and WR lines go low (I haven't bothered with inverting signals yet, this is purely a code exercise at the moment) and M1 is high (i.e. the Z80 is writing to memory), it checks to see if the Z80 is addressing a byte within its 512 KB window - by checking if host_addr[21:19] == 3'b011.  If it does, then it passes host_addr[18:0] to the vid_osd_generator, along with the data being written and setting h_wr_ena high so that vid_osd_generator will write the data to memory.

I've just thrown this code together really, but the more I think about it the more I worry that there could be timing problems etc.  Should I run the code off of the Z80's 8 MHz clock or the much faster GPU clock?  Are there issues with how I'm setting/resetting h_wr_ena? etc..  :-//

Code: [Select]
module host_handler (

    // input
    input wire        reset,         // GPU reset signal
    input wire        host_CLK,      // Microcom clock signal (8 MHz)
    input wire        host_M1,       // Z80 M1 - active LOW
    input wire        host_MREQ,     // Z80 MREQ - active LOW
    input wire        host_WR,       // Z80 WR - active LOW
    input wire        host_RD,       // Z80 RD - active LOW
    input wire [21:0] host_addr,     // Microcom 22-bit address bus
    input wire [7:0]  host_wr_data,  // Z80 DATA bus to pass incoming data to GPU RAM

    // output
    output wire [7:0]  h_rd_data,    // Z80 DATA bus to return data from GPU RAM to Z80
    output reg         h_rd_req,     //
    output reg         h_wr_ena,     // flag HIGH when writing to GPU RAM
    output reg  [19:0] h_addr,       // connect to host_addr in vid_osd_generator to address GPU RAM
    output reg  [7:0]  h_wdata       // 8-bit data bus to GPU RAM in vid_osd_generator

);

always @ (posedge host_CLK)
begin

    // is a WR memory cycle being executed?
    if (!host_M1 && host_MREQ && host_WR) begin

        // is the GPU being addressed?
        if (host_addr[21:19] == 3'b011) begin // host_addr[21:19] == 3'b011 targets the 512KB 'window' at 0x180000-0x1FFFFF

            h_wdata  <= host_wr_data;
            h_addr   <= host_addr[18:0];
            h_wr_ena <= 1;

        end // gpu addressed
        else begin
            h_wr_ena <= 0;
        end // gpu not addressed

    end // memory cycle
    else begin

        h_wr_ena <= 0;

    end

end

endmodule
« Last Edit: November 21, 2019, 03:17:34 pm by nockieboy »
 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #423 on: November 21, 2019, 04:33:44 pm »
You're doing OK, and for now, maybe you should keep working like this.
But you are going to need to begin thinking about operating in 2 different clock domains.

1) You have a Z80 operating at 8 MHz.  It is sending you impossibly slow read and write requests.
2) Your static ram in an FPGA can operate at 125 MHz.
3) How can I keep it running that fast and still run the Z80's 8 MHz requests?


This is not difficult.  Working/thinking like this will allow you to keep running the ram's host port at 125MHz.

If you are not sure what you are writing is functional, it is time to simulate...

Your other choice is to not change anything, but create a separate xxxx.v module bridge to take these slow 8 MHz requests and strobe them out at the system 125 MHz speed.  This would be wasteful and may introduce additional clock delays.  Remember, if you clock the Intel FPGA ram at 8 MHz, though the writes may be taken immediately, if you do a read request you get a response 2 clocks later unless you change the setting on the dual port ram, which may kill your FMAX to a point where the GPU can't work.  Also, what happens if you do a write on one clock and a read of that same address on the next clock?  In the FPGA running at 8 MHz, an additional clock cycle is required to get the new value.  The same goes for 125 MHz, however, since at 125 MHz the write would go through and complete before the next Z80 read, I don't think this could be a problem unless the Z80 ran above 75 MHz.
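
Just to make the 125 MHz idea concrete, here is a minimal sketch of sampling an already-decoded, active-high Z80 write strobe in the GPU clock domain and firing a single-clock write enable for the ram's host port.  Every name in it is made up for illustration, none of it is from your project:

Code: [Select]
// Hypothetical sketch - all module/port names are placeholders.
module z80_write_bridge (
    input  wire        clk_125m,       // GPU / ram clock domain
    input  wire        z80_wr_strobe,  // decoded Z80 write (MREQ & WR qualified, made active high)
    input  wire [19:0] z80_addr,
    input  wire [7:0]  z80_data,
    output reg         h_wr_ena,       // one clk_125m cycle wide
    output reg  [19:0] h_addr,
    output reg  [7:0]  h_wdata
);

reg [2:0] wr_sync;                     // 2-stage synchronizer plus an edge-detect register

always @(posedge clk_125m) begin
    wr_sync  <= {wr_sync[1:0], z80_wr_strobe};
    h_wr_ena <= (wr_sync[1] && !wr_sync[2]);    // rising edge of the synchronized strobe
    if (wr_sync[1] && !wr_sync[2]) begin
        h_addr  <= z80_addr;                    // the Z80 bus is stable long before the edge is seen at 125 MHz
        h_wdata <= z80_data;
    end
end

endmodule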

 

Online BrianHG

  • Super Contributor
  • ***
  • Posts: 8143
  • Country: ca
    • LinkedIn
Re: FPGA VGA Controller for 8-bit computer
« Reply #424 on: November 21, 2019, 04:40:08 pm »
This is the final RS232_Debugger update before I give it its own dedicated thread.  I've added page read bursts into the communications protocol.  Reads and writes to the GPU now operate at full speed, unhindered.  This means with 128 kilobytes in the GPU, reading or writing the full memory contents should take under 2 seconds.

Let me know if it loads and sends the full 16k any faster...

I've also made the VGA 8x16 font default.
« Last Edit: November 21, 2019, 04:53:55 pm by BrianHG »
 
The following users thanked this post: nockieboy

