First, just do the math. An ohmmeter measures resistance by supplying a known current, often 1 mA on the lower ranges, and measuring the voltage across the resistance with that current flowing through it. As the resistances get lower, the meter has to either supply a higher current or accurately measure smaller and smaller voltages. Down at those voltage levels, effects like thermal EMFs and noise start to make an accurate reading difficult.
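To put numbers on that, here's a quick sketch (illustrative values only, not any particular meter's specs) of the voltage a meter has to resolve at a few test currents:

```python
# Voltage the meter must resolve is V = I * R. Illustrative numbers only.
test_currents = [1e-3, 10e-3, 100e-3]    # 1 mA, 10 mA, 100 mA test current
resistances = [1.0, 0.1, 0.01, 0.001]    # 1 ohm down to 1 milliohm

for i in test_currents:
    for r in resistances:
        v = i * r  # voltage across the unknown resistance
        # Copper/junction thermal EMFs are on the order of microvolts, so
        # once v gets down near that level the reading becomes unreliable.
        flag = "  <-- near the thermal-EMF floor" if v < 10e-6 else ""
        print(f"I = {i*1e3:5.0f} mA, R = {r*1e3:7.1f} mOhm -> "
              f"V = {v*1e6:9.1f} uV{flag}")
```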
Second, consider the effect of the leads and connections. Those have resistance too, which gets added to the measurement. Worse, those resistances may not be completely stable, making them hard to compensate for. The solution is the 4-wire (Kelvin) method, and that requires a DMM (typically a bench unit) with a 4-wire ohms connection.
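As a rough illustration of how much the leads matter, here's a toy model with made-up values:

```python
# In a 2-wire measurement the test current flows through the leads, so the
# meter reads R_unknown plus both lead resistances. In a 4-wire (Kelvin)
# measurement the sense leads carry essentially no current, so their drop
# is negligible. Values below are made up for illustration.
r_unknown = 0.050   # 50 milliohm resistance under test
r_lead = 0.150      # 150 milliohm per test lead (and it drifts with flexing)

r_2wire = r_unknown + 2 * r_lead   # leads add directly to the reading
r_4wire = r_unknown                # sense leads see only the DUT drop

print(f"2-wire reading: {r_2wire*1e3:.1f} mOhm "
      f"(error {(r_2wire - r_unknown)*1e3:.1f} mOhm)")
print(f"4-wire reading: {r_4wire*1e3:.1f} mOhm")
```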
To give a sense of how much accuracy you can realistically achieve, here are two examples.
My Fluke 289 has a dedicated Lo-Ohms 50 Ω range that uses about 8 mA of test current and has a resolution of 1 milliohm. Its specified accuracy (in the zero-ohm region) is +/- 20 counts, or 20 milliohms, but if I use the REL (zeroing) function and I'm patient, the readings are repeatable to about 5 milliohms or so. That's pretty good for a handheld DMM with a two-wire connection.
My Fluke 8846A bench DMM with a 10 Ω range uses 5 mA of test current and has a resolution of 10 microohms. Its specified accuracy would be +/- 3 milliohms, although it seems stable and repeatable down to about 100 microohms or so.
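A quick back-of-envelope on both meters (using the test currents and resolutions quoted above) shows why the last digits aren't fully usable:

```python
# Smallest voltage change each meter has to resolve per count is
# (test current) x (resolution). Figures are the ones quoted above.
meters = {
    "Fluke 289, Lo-Ohms 50 ohm range": (8e-3, 1e-3),   # 8 mA, 1 mOhm/count
    "Fluke 8846A, 10 ohm range": (5e-3, 10e-6),        # 5 mA, 10 uOhm/count
}

for name, (i_test, r_res) in meters.items():
    v_count = i_test * r_res
    print(f"{name}: {v_count*1e6:.2f} uV per count")
# 289:   8 mA * 1 mOhm  = 8 uV per count
# 8846A: 5 mA * 10 uOhm = 0.05 uV (50 nV) per count -- well below typical
# thermal EMFs, which is why the usable repeatability is coarser than the
# displayed resolution.
```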
The only way to do a lot better than that would be to use an instrument or homemade setup with a higher test current. If you use a bench supply to force 1 ampere through the resistance you want to measure, all you need is a DMM that can reliably measure 1 mV and you can measure 1 milliohm. For your purposes, that may do. Actual calibrated equipment that works down in those ranges is quite expensive.
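A minimal sketch of that arithmetic, with hypothetical readings:

```python
# Homemade high-current approach described above: force a known current
# from a bench supply, read the voltage across the unknown with a DMM,
# and compute R = V / I. The readings here are hypothetical examples.
i_forced = 1.0        # 1 A from the bench supply (verify with an ammeter)
v_measured = 1.23e-3  # 1.23 mV read directly across the unknown resistance

r = v_measured / i_forced
print(f"R = {r*1e3:.3f} mOhm")   # -> R = 1.230 mOhm

# Clip the DMM leads directly onto the resistance under test, not onto the
# supply terminals: the DMM input draws essentially no current, so its
# leads contribute no drop. That makes this a poor man's 4-wire setup.
```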