The reference shunt likely has a subtle, varying tempco difference from the DUT, due to the construction of the sensing resistance element and the resulting thermal conduction to the ends versus to ambient.
Normally, shunts like the reference are turned on their side so that the resistance element has a lower Rth to ambient. You could also include some airflow to further suppress any tempco contribution.
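As a rough feel for how much a tempco mismatch can matter, here's a minimal sketch. The tempcos, element temperatures, and nominal value are all assumptions for illustration, not figures for the shunts in question:

```python
# Rough sketch of the error contributed by a tempco mismatch between a
# reference shunt and a DUT shunt. All numbers are illustrative assumptions,
# not measured values for the shunts discussed above.

def r_at_temp(r_nom, tempco_ppm_per_c, t_element_c, t_cal_c=23.0):
    """Resistance at the element temperature, simple linear tempco model."""
    return r_nom * (1.0 + tempco_ppm_per_c * 1e-6 * (t_element_c - t_cal_c))

r_nom  = 0.020   # 20 mOhm nominal (100 mV / 5 A), assumed
tc_ref = 15.0    # ppm/degC, assumed reference shunt tempco
tc_dut = 40.0    # ppm/degC, assumed DUT shunt tempco
t_ref  = 35.0    # degC, assumed reference element temperature (better Rth to ambient)
t_dut  = 55.0    # degC, assumed DUT element runs hotter

r_ref = r_at_temp(r_nom, tc_ref, t_ref)
r_dut = r_at_temp(r_nom, tc_dut, t_dut)

# Ratio error seen when comparing DUT against reference at these temperatures
ratio_error_ppm = (r_dut / r_ref - 1.0) * 1e6
print(f"ratio error: {ratio_error_ppm:.0f} ppm")   # ~1100 ppm with the guesses above
```

The point is just that the error scales with both the tempco difference and the element temperature difference, which is why orientation and airflow help.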
For most power applications, commercial shunts like those two that I've come across typically carry a 1% tolerance from the manufacturer.
I recently tested a batch of ten 5A 100mV shunts marked "class 0.2" from A.J.Williams (Australian) and was able to 'confirm' that the shunt resistance varied across all units by less than 0.17%. Those shunts were circa 40 years old and of unknown use history, but given the class rating, the serial numbering, and the absence of any outlier measurements, it does appear that they were originally made to a 0.2% tolerance. I could also conjecture that the three shunts closest to the median value are at worst within 0.1% of the original manufacturer's reference, and likely better than that, but haven't yet been able to substantiate that view.
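For what it's worth, the spread and median-proximity figures are straightforward to work out; here's a minimal sketch, where the readings are purely hypothetical placeholders rather than my actual measurements of those shunts:

```python
# Sketch of the batch statistics described above. The readings below are
# hypothetical placeholders, NOT measured values for the A.J.Williams shunts.
from statistics import median

readings_mohm = [19.986, 19.992, 19.995, 19.998, 20.000,
                 20.001, 20.004, 20.008, 20.011, 20.015]   # assumed 20 mOhm nominal

med = median(readings_mohm)
spread_pct = (max(readings_mohm) - min(readings_mohm)) / med * 100
print(f"max spread across the batch: {spread_pct:.3f} %")

# Deviation of each unit from the batch median, as a crude proxy for
# closeness to the original factory reference value
for i, r in enumerate(readings_mohm, start=1):
    print(f"unit {i:2d}: {(r - med) / med * 100:+.3f} % from median")
```

Of course the median of ten aged units is only a proxy for the factory reference, which is why I'd call the <0.1% figure a conjecture rather than a result.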
PS: thanks for posting a copy of the shunt calibration report.