It also depends heavily on the accuracy of the timing model. A toolchain that doesn't get this right could easily produce designs that are marginal and risk failing over temperature or part-to-part variation.
This is true of a particular design, not of the FPGA itself. A poor design may fail for any number of reasons, but the FPGA doesn't care if it does; it simply processes signals according to whatever configuration it has been given.
This is not what Mike said.
The toolchain (especially the synthesis and timing analysis parts) has to meet whatever timing constraints you have set. It does this through logic, placement and routing choices and optimizations that obviously depend on 1) the FPGA architecture and 2) the physical characterization data the tools use for timing/power analysis.
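To make that concrete, here is a minimal sketch of the kind of check a static timing analyzer performs. All the delay numbers and the derating factor are invented for illustration; real tools pull them from the vendor's silicon characterization database:

```python
# Toy static timing check. Every number below is made up; a real STA
# engine reads per-cell and per-route delays from characterization data.

TARGET_MHZ = 156.25                 # e.g. a 10Gb Ethernet XGMII clock
PERIOD_NS = 1000.0 / TARGET_MHZ     # 6.4 ns clock period

# Hypothetical critical path, delays in ns at the nominal corner.
path = [
    ("FF clock-to-out", 0.45),
    ("LUT level 1",     0.90),
    ("routing",         1.80),
    ("LUT level 2",     0.90),
    ("routing",         1.60),
]
SETUP_NS = 0.35                     # destination FF setup requirement

# Made-up derating for the slow corner (hot die, slow process part).
SLOW_CORNER_FACTOR = 1.25

for corner, factor in (("nominal", 1.0), ("slow", SLOW_CORNER_FACTOR)):
    arrival = sum(d for _, d in path) * factor
    slack = PERIOD_NS - (arrival + SETUP_NS)
    status = "MET" if slack >= 0 else "VIOLATED"
    print(f"{corner:8s} arrival={arrival:.2f} ns  slack={slack:+.2f} ns  {status}")
```

With these invented numbers the path meets timing at the nominal corner and violates it at the slow corner. If the derating data is optimistic, the tool reports only the passing line and calls timing met, and you get exactly the marginal design that fails in a hot enclosure or on a slow part.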
For the overall process to be successful, several requirements must be met:
- the synthesis is successful, i.e. your synthesis tool managed to translate some higher-level design representation (RTL, a block-level schematic, the output of High-Level Synthesis (HLS), whatever) into the exact LUT/CLB/VersaTile primitives your particular FPGA provides. This is not trivial (a toy illustration of the mapping step follows after this list)
- the placement of the cells and the routing of the signals are successful
- if you have a timing-constrained design (i.e. it needs to work at a given frequency), you have to trust that the tools in the design flow actually fulfilled your constraints. Failing that, you need to read the reports from the timing analysis tools and work with the actual performance you get. This is the part that Mike discussed. If the tools are not good, or if the underlying silicon characterization data they use is not accurate, their analysis may be wrong: your design may work, almost work at the 156.25 MHz required for your 10Gb Ethernet, or fail miserably when some particular condition is met, causing a lot of headaches.
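On the "not trivial" in the first bullet: a real mapper works on whole netlists together with optimization, but even the smallest unit of the job, turning one boolean expression into the init vector of a single 4-input LUT, gives a feel for it. The function below is made up, and the a-as-LSB address convention is an assumption (actual INIT bit ordering varies by vendor):

```python
# Toy technology-mapping step: compute the INIT mask of a 4-input LUT
# implementing f(a, b, c, d) = (a & b) | (c ^ d), a made-up function.
# Assumed convention: input a is the LSB of the LUT address.
def f(a, b, c, d):
    return (a & b) | (c ^ d)

init = 0
for addr in range(16):
    a, b, c, d = addr & 1, (addr >> 1) & 1, (addr >> 2) & 1, (addr >> 3) & 1
    if f(a, b, c, d):
        init |= 1 << addr

print(f"INIT = 16'h{init:04X}")   # the 16-bit truth table the LUT stores
```

Now multiply that by thousands of cells, add the requirement that the result still meets timing after placement and routing, and you see why the quality of the tools matters as much as the silicon itself.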