And which manuals would they be?
This one is good: http://www.componentsengineering.com/wp-content/uploads/pdfs/LCR-Measurement-Primer.pdf
Yes, that's a good one, but let me turn it on its head now: all of the ESR charts for capacitors that I've seen have been for electrolytics and are taken at 100 Hz or 120 Hz. In fact, some LCR/ESR testers even have a table of expected results printed on them, and once again those are for electrolytics at the same frequencies.
Are there tables that show what a good ESR reading should be at 100 kHz, and also at 10 kHz, for these other caps?
In my experience these other caps, like ceramics, have always had two failure modes: open or short circuit. The capacitance may drift a bit over time, and this can be checked on the testers, but nowhere have I ever come across any form of table, or even a mention, of acceptable ESR values for these, nor have I heard anyone raise this as a problem until this thread. So is this a consequence of the ever-increasing frequencies in use today, or what?
A long time ago in the first half of the 20th century, before computers, measuring capacitors was done with manual bridges; things that looked like this:
https://www.bing.com/images/search?q=impedance+bridge&qpvt=impedance+bridge&FORM=IGRE
At that time 1 kHz was more or less the default frequency for measuring caps.
After the second world war, there was a rapid increase in electronic technology and electrolytic capacitors became the standard item for ripple filtering in power supplies. Since electrolytic caps were usually polarized, it was necessary to avoid applying a reverse voltage to them, but to measure them it was necessary to apply an AC voltage, and AC voltage goes negative for half the cycle; what to do? As it happens, electrolytic caps don't mind a very small reverse voltage for a short time so the manufacturers got together and established a standard. The standard was to apply an AC frequency similar to what the cap would see in use. This would be double the grid frequency most of the time because of the common full wave rectifier circuits. So the standard, still in use today, is to measure at 120 Hz (twice the 60 Hz grid frequency; 100 Hz in Europe, close enough) with an applied voltage of 1/2 volt RMS.
In the 1940s and 1950s, when the standard was established, switching power supplies operating at frequencies above 20 kHz were almost non-existent, so there was no need to know or measure ESR at those frequencies. But nowadays switchers are ubiquitous, and manufacturers of electrolytics intended for use in switchers specify impedance at 100 kHz. The impedance and ESR of typical electrolytics have nearly the same numerical value at 100 kHz, so measuring impedance amounts to measuring ESR. This fact allows a low cost instrument to measure ESR; see the long thread I reference below for details. The 100 kHz frequency sits in the middle of commonly used switcher operating frequencies, although megahertz switchers are looming.
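The claim that impedance and ESR nearly coincide at 100 kHz is easy to check with the usual series ESR + C model. A minimal sketch (the 0.1 ohm ESR figure is an illustrative assumption, not a datasheet value):

```python
import math

def impedance_magnitude(esr_ohms, cap_farads, freq_hz):
    """Series ESR + C model: |Z| = sqrt(ESR^2 + Xc^2), Xc = 1/(2*pi*f*C)."""
    xc = 1.0 / (2.0 * math.pi * freq_hz * cap_farads)
    return math.hypot(esr_ohms, xc), xc

# A 100 uF electrolytic with an assumed ESR of 0.1 ohm, measured at 100 kHz:
z, xc = impedance_magnitude(0.1, 100e-6, 100e3)
print(f"Xc  at 100 kHz = {xc:.4f} ohm")  # ~0.016 ohm, well below the ESR
print(f"|Z| at 100 kHz = {z:.4f} ohm")   # ~0.101 ohm, essentially the ESR
```

The reactance contributes under 2% to the magnitude here, which is why a meter that only measures |Z| at 100 kHz effectively reads ESR for these parts.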
So these are the reasons for 100 Hz, 120 Hz, 1 kHz and 100 kHz measurement frequencies. But what about 10 kHz? When the first low cost LCR meters were being designed, 100 kHz was enough harder to do than 10 kHz that 10 kHz was what you got. It was still useful to get a better idea of capacitor parameters at switcher frequencies than just 1 kHz. And a user might have another reason to measure at or near 10 kHz, and it's easy to include it in modern instruments.
Finally, LCR meters 20 years ago were still somewhat expensive, and repair techs couldn't afford one. But then someone noticed that at 100 kHz a typical electrolytic with a capacitance of, say, 100 uF or more has a reactance much less than 1 ohm: about 0.016 ohms for 100 uF, and less for larger sizes. Because the reactance is so low, the ESR is commonly greater than the reactance, so the reactance can be ignored when measuring the ESR. At lower frequencies, where the reactance is larger than the ESR, a phase sensitive detector must be used to measure ESR, and this increases the cost of the meter. A method of using digital pulse techniques to measure ESR resulted in ESR meters costing much less than the typical LCR meter of the time. One of the first was the "Blue ESR meter":
https://anatekinstruments.com/products/fully-assembled-anatek-blue-esr-meter-besr
Many meters using this technique have been developed since then, and they are mainly useful to repair technicians (and hobbyists) for locating defective electrolytics. Since the concept of ESR was not very useful to a repair tech when the only available meters cost many thousands of dollars, those techs were not familiar with what a good cap's ESR would be. The availability of ESR meters a repair tech could afford meant techs now needed to know what good and bad ESR values would be for various electrolytics. Hence, tables of typical good ESR appeared:
https://www.eevblog.com/forum/beginners/esr-values-for-electrolytic-caps/
People complain about these tables, rightly so: in truth there is great variability in electrolytic capacitor ESR, and the tables are really just useful as rough guidelines, which is better than nothing. The best thing to do is measure the ESR of a known good sample of the electrolytic in question.
The price and capability of integrated circuits has continued to improve over the years, and now there are fully capable LCR meters available for about the same price as the pulse technique ESR meters, for example the DE5000.
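The digital pulse technique mentioned above can be illustrated with an idealized series ESR + C model: a short constant-current pulse produces an immediate I*ESR voltage step plus a slow capacitive ramp, and for a short enough pulse the ramp is negligible. A rough sketch; all component and pulse values are assumptions for illustration, not the design of any real meter:

```python
def pulse_voltage(esr_ohms, cap_farads, i_amps, t_seconds):
    """Voltage across the cap at the end of a constant-current pulse:
    an immediate ESR step plus a capacitive ramp of I*t/C."""
    return i_amps * esr_ohms + i_amps * t_seconds / cap_farads

i_pulse, t_pulse = 0.01, 1e-6   # assumed 10 mA pulse, 1 us wide
esr, cap = 0.5, 100e-6          # assumed worn 100 uF cap with 0.5 ohm ESR

v = pulse_voltage(esr, cap, i_pulse, t_pulse)
esr_estimate = v / i_pulse      # ignore the ramp; treat all voltage as I*ESR
print(f"estimated ESR = {esr_estimate:.4f} ohm")  # ~0.51 ohm vs the true 0.5
```

Because the capacitive ramp contributes only a couple of percent here, a meter built this way needs no phase sensitive detector, which is what made these instruments so cheap.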
I explain in detail about capacitor ESR here:
https://www.eevblog.com/forum/projects/impedance-lcr-esr-meters/msg459303/#msg459303
Also have a look at this thread:
https://www.eevblog.com/forum/projects/capacitor-measurements-on-an-impedance-analyzer/msg178362/#msg178362