Is the debug IP core's aux. UART output particularly important for debugging the remaining problems at this time?
If so, would I be correct to assume that you don't happen to have any variant build, with the cores included, that can tunnel the various debug UART information out over the USB / Ethernet / internal USB Blaster channels somehow?
If not, I'll have to find some devkit or programmer with compatible USB/UART bridging, which is probably possible if not exactly handy.
Though I use my RS232 debugger to peek at the RAM contents, and it has a few handy buttons to send a reset and view 4 regs in my design, passing a reg called rdcal to the blue LEDs gives you enough info on the status of the DDR3 controller. The RS232 debugger only requires 2 wires, RXD & TXD, through a 3.3V TTL to RS232 converter. If you have code to bridge data to one of the DECA's onboard ports and a hex editor/viewer, it should be easy peasy to implement on one of the 16 fully flexible read & write ports.
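For reference, something along these lines is all it takes to get that status onto the LEDs; the port names, width, and LED polarity here are assumptions for illustration, not taken from the actual core:

```verilog
// Minimal sketch (assumed names/widths): mirror the read-calibration
// status reg onto the DECA's user LEDs so the DDR3 controller's state
// is visible without any UART at all.
module rdcal_to_leds (
    input  wire       clk,     // any free-running clock
    input  wire [7:0] rdcal,   // read-calibration status from the DDR3 controller
    output reg  [7:0] led_n    // DECA user LEDs (assumed active-low here)
);
    always @(posedge clk)
        led_n <= ~rdcal;       // invert so a set rdcal bit lights its LED
endmodule
```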
If the RAM isn't acting quite as expected across all modes / PCBAs, I would generally wonder whether they substituted a different DRAM IC than the one listed in the BOM for some lots, or whether that particular chip has known or undocumented errata versus the way it is hooked up in this unit. I wouldn't even be shocked if there were mode-dependent SI problems caused by the circuit / layout / power supply which nobody knows about, since it's just a devkit and most people wouldn't be doing rigorous DRAM channel validation testing; and if they did, they'd probably be using mostly the Altera-based projects / core.
But I'm sure you'd have looked into any compatibility concerns that occurred to you when porting the core.
The RAM is acting fine. The initial problems stem from power-up controls originating on the CLK_IN 50MHz oscillator input feeding core logic which operates on the PLL 300MHz DDR3 clock domain. A number of D flip-flop regs in series, bridging the commands from the CLK_IN domain to the DDR_CLK domain, fixed all the issues except for one. Even though such logic shouldn't have been necessary, since I did code for the crossing of the 2 different clock domain speeds, I guess you cannot predict how the compiler or the FPGA itself tries to do the wiring of this logic, and that was freezing up my code. With a 50MHz source, I can only reliably boot the RAM at 250MHz, 300MHz, 350MHz, 400MHz, and sometimes 450MHz. When setting frequencies like 320MHz, 310MHz, or 270MHz, each build returns random results. (I know 270MHz is underclocking the RAM, but 250MHz works fine.) Once the RAM has been properly booted and everything is operating on the 300MHz and 150MHz clocks coming from the PLL, I have not seen the DDR3 controller logic fail, even when I seriously overclocked the FPGA.
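For anyone following along, the fix is the standard multi-stage synchronizer pattern; a rough sketch of the kind of thing I mean is below. The signal names and 3-stage depth are just for illustration, not the core's actual code:

```verilog
// Rough sketch: a chain of D flip-flops re-registering a control/command bit
// from the 50MHz CLK_IN domain into the PLL's DDR_CLK domain.
module cmd_cdc #(
    parameter STAGES = 3                  // number of flip-flops in series
) (
    input  wire ddr_clk,                  // destination clock (PLL 300MHz domain)
    input  wire cmd_in,                   // command level generated in the CLK_IN domain
    output wire cmd_out                   // same command, now safe in the DDR_CLK domain
);
    reg [STAGES-1:0] sync_r = {STAGES{1'b0}};

    always @(posedge ddr_clk)
        sync_r <= {sync_r[STAGES-2:0], cmd_in};   // shift through the FF chain

    assign cmd_out = sync_r[STAGES-1];
endmodule
```

You would normally also cut the timing path into the first flip-flop (a false-path constraint) so the fitter doesn't fight the crossing.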
At power-up, my code does do a read calibration inspection of the RAM, and it does tune the PLL's read phase to the optimum position. My code is to DDR3 spec, done by the book. Different RAM should not be a problem, especially at 300MHz.
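Conceptually, the calibration boils down to something like the sketch below: step the PLL read phase across its range, note which steps read a known pattern back correctly, then step back to the middle of the passing window. All names, widths, and handshakes here are simplified for illustration and are not the core's actual code:

```verilog
// Simplified illustration of the power-up read-calibration idea: sweep the
// PLL read phase, record the window of steps where a test read passes, then
// step back to the centre of that window.  A real design must also wait on
// the PLL's phase-done strobe and retry/flag failure if no step passes.
module rdcal_phase_sweep #(
    parameter STEPS = 32                       // PLL fine-phase steps per sweep
) (
    input  wire clk,                           // slow control clock
    input  wire start,                         // begin after DDR3 init completes
    input  wire test_done,                     // one test read/compare finished
    input  wire test_pass,                     // that test read matched the pattern
    output reg  run_test   = 1'b0,             // request one test read at this phase
    output reg  phase_step = 1'b0,             // pulse: move the PLL phase one step
    output reg  phase_updn = 1'b1,             // 1 = advance, 0 = step back
    output reg  cal_done   = 1'b0
);
    localparam IDLE=0, TEST=1, WAIT=2, NEXT=3, CENTER=4, DONE=5;
    reg [2:0] st = IDLE;
    reg [5:0] step = 0, first_ok = 0, last_ok = 0, back = 0;
    reg       seen_ok = 1'b0;

    always @(posedge clk) begin
        run_test   <= 1'b0;
        phase_step <= 1'b0;
        case (st)
          IDLE:   if (start) begin
                     step <= 0; seen_ok <= 1'b0; cal_done <= 1'b0;
                     phase_updn <= 1'b1; st <= TEST;
                  end
          TEST:   begin run_test <= 1'b1; st <= WAIT; end
          WAIT:   if (test_done) begin                 // log pass/fail for this phase
                     if (test_pass) begin
                        if (!seen_ok) first_ok <= step;
                        last_ok <= step; seen_ok <= 1'b1;
                     end
                     st <= NEXT;
                  end
          NEXT:   if (step == STEPS-1) begin           // sweep finished at the last step
                     back <= (STEPS-1) - ((first_ok + last_ok) >> 1);
                     phase_updn <= 1'b0;
                     st <= CENTER;
                  end else begin
                     phase_step <= 1'b1;               // advance one step, test again
                     step <= step + 1'b1;
                     st <= TEST;
                  end
          CENTER: if (back == 0) st <= DONE;           // walk back to the window centre
                  else begin phase_step <= 1'b1; back <= back - 1'b1; end
          DONE:   cal_done <= 1'b1;
          default: st <= IDLE;
        endcase
    end
endmodule
```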
Currently, I'm cleaning up my source code, and I need to create some additional documentation so I can post the code with a few different example implementations this weekend.