Wytnucls, as c4757p summed up, these are just two sides of the same coin.
My question is simply why they didn't include 100 kHz when designing the circuit or chipset in the first place. Is that because of a technical constraint that would incur higher cost?
The 100 kHz standard has been around in the industry for a long time, especially in power capacitor datasheets.
They are not two sides of the same coin. D is a direct indication of the health of the capacitor; ESR by itself doesn't tell you anything. You have to consult approximate reference tables to find out whether the cap is out of spec, or else work out D anyway to cross-check against the published datasheet (D = ESR x 2 x Pi x f x C). Those are cumbersome calculations if many different caps have to be checked in turn.
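If you do have to grind through that calculation for a pile of caps, it's easy to script. A minimal sketch of the formula above (the function name and example values are my own):

```python
import math

def dissipation_factor(esr_ohms, freq_hz, cap_farads):
    """D (tan delta) from a measured ESR, per D = ESR * 2 * pi * f * C."""
    return esr_ohms * 2 * math.pi * freq_hz * cap_farads

# Example: 0.5 ohm ESR measured at 1 kHz on a 100 uF electrolytic
d = dissipation_factor(0.5, 1e3, 100e-6)
print(f"D = {d:.3f}")  # ~0.314, i.e. about 31%
```
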
In my experience, a good rule of thumb is that any electrolytic cap with a D of 25% or more (0.25) needs to be replaced, especially if the capacitance has dropped more than 25% below its rated value.
If the caps are rated low ESR, a D of more than 10% (0.10) indicates a failed or failing capacitor.
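Those rules of thumb could be captured in a quick check like this (thresholds taken from the post above; the function name and flags are my own):

```python
def cap_looks_bad(d, low_esr=False, cap_drop_fraction=0.0):
    """Flag an electrolytic as failed/failing per the rules of thumb above:
    D >= 0.25 for standard parts (>= 0.10 for low-ESR parts), and
    a capacitance drop of more than 25% from the rated value."""
    d_limit = 0.10 if low_esr else 0.25
    return d >= d_limit or cap_drop_fraction > 0.25

print(cap_looks_bad(0.30))                # True: D too high for a standard cap
print(cap_looks_bad(0.12, low_esr=True))  # True: over the low-ESR limit
print(cap_looks_bad(0.05))                # False: healthy
```
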
A capacitance meter with just two test frequencies would suffice for most testing. Most capacitor datasheets reference values at either 120 Hz or 1 kHz. There are special circumstances, though, where testing caps at a higher frequency is beneficial.
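Since datasheets usually publish tan δ (D) at those frequencies rather than ESR, the formula from the earlier post can be rearranged to get the expected ESR at the datasheet frequency for cross-checking a meter reading. A sketch under that assumption (names and example values are my own):

```python
import math

def esr_from_d(d, freq_hz, cap_farads):
    """Expected ESR from a datasheet tan-delta: ESR = D / (2 * pi * f * C)."""
    return d / (2 * math.pi * freq_hz * cap_farads)

# Example: datasheet tan delta of 0.14 at 120 Hz for a 1000 uF cap
print(f"{esr_from_d(0.14, 120, 1000e-6):.3f} ohms")  # ~0.186 ohms
```
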