For li-ion, AC ESR and complex AC impedance at 10kHz, or over frequency sweeps, are discussed in academia and used by cell manufacturers as quality-control tools. But every time someone claims they are super useful for battery system designers (those who buy the cells and build packs) or end users, I have to ask: how? So far I have not seen any kind of answer to that question.
Li-ion cell impedance is specified at 1 kHz. Check the datasheets of popular 18650s; the same goes for the specs of rechargeable NiMH. I am sorry that you are so uninformed about how useful and widespread battery ESR measurement is in certain industries. Search the web/YT for "Fluke battery analyzer", the #1 tool for high-capacity UPS technicians.
Yes, I know all this, and I know how to measure it. The question was: unless you are a cell manufacturer, what do you do with the number once you have obtained it? "I was tasked to measure it" doesn't count.
My experience is that quite crappy end-of-life cells can still show acceptable 1 kHz impedance. To decide whether a cell has actually gone bad, you need to measure impedance over the whole SoC range, and/or DC ESR, capacity, and possibly self-discharge current.
Evaluating SOH reliably requires more complexity than just measuring 1 kHz AC impedance and applying some good/bad thresholds. As trobbins has explained above, the lead-acid industry has come up with some quite complex SOH-estimation algorithms. But li-ion works differently and needs different algorithms. SOH estimation has been widely discussed in academia; I have skimmed through a lot of papers.
The reason why 1 kHz AC impedance doesn't correlate with real-world requirements too well is simple: the cells have capacitor-like behavior because their construction is similar to that of an ultracapacitor. Ion transfer takes over at lower frequencies (well below 1 kHz), but that is what is actually needed: if the load were happy taking just 1 ms pulses, it would have used capacitors, not batteries. Real loads need DC power from the battery, and the ion transfer is what supplies it. Even further, most real loads have enough bypass capacitance on their inputs that they pull insignificantly low currents at 1 kHz and beyond.
But when we measure 1 kHz AC Z, we are measuring the capacitance of the cell.
This is what makes capacitors and batteries so different. A capacitor is just a capacitor, but a battery is better modeled as a voltage source with a series resistor, PLUS, in parallel, a capacitor with another (possibly smaller) series resistor. Add inductance to the mix and the impedance-over-frequency curve becomes much more complex than a capacitor sweep. It becomes hard to see which part of the curve is caused by just the capacitance of the cell, and which part is relevant for evaluating the cell's ability to supply DC power, which is all that finally matters.
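To illustrate the point, here is a minimal sketch of that equivalent circuit in Python: an ideal source with series resistance Rs, in parallel with a capacitor branch (Rc + C), plus a small series inductance. All component values are made up for illustration, not taken from any datasheet:

```python
import numpy as np

def battery_z(f, Rs=0.05, Rc=0.02, C=1.5, L=300e-9):
    """Complex impedance of the toy model: Rs || (Rc + 1/jwC), plus series L.

    Rs is the "DC" branch resistance, Rc/C the capacitor-like branch,
    L the tab/winding inductance. Values are illustrative assumptions.
    """
    w = 2 * np.pi * np.asarray(f, dtype=float)
    zc = Rc + 1.0 / (1j * w * C)      # capacitor branch impedance
    z = (Rs * zc) / (Rs + zc)         # in parallel with the DC branch
    return z + 1j * w * L             # series inductance on top
```

With these numbers, |Z| at 1 kHz comes out around 14 mOhm, because the capacitor branch shunts the measurement, while at near-DC frequencies the capacitor branch is effectively open and you see the full 50 mOhm. That is exactly why a healthy-looking 1 kHz reading can hide a cell that sags badly under DC load.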
So I opt for measuring capacity and DC ESR. That requires a full charge-discharge cycle on a cell removed from the pack, which is very tedious, but it gives you the answers you actually need.
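The DC ESR part of that measurement is trivial arithmetic: apply a DC load step, let the voltage settle, and divide the sag by the step current. A one-line sketch, with made-up example numbers:

```python
def dc_esr(v_rest, v_loaded, i_load):
    """DC ESR from a load step: voltage sag divided by step current (ohms).

    v_rest:   cell voltage before the load step (V)
    v_loaded: settled cell voltage under load (V)
    i_load:   DC load current of the step (A)
    """
    return (v_rest - v_loaded) / i_load

# Illustrative numbers: 4.00 V resting, 3.85 V under a 3 A load
# gives 0.15 V / 3 A = 50 mOhm DC ESR.
```

Unlike the 1 kHz figure, this number directly tells you how much voltage the pack will lose at your actual load current.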
Analysing SOH in-circuit is interesting, but it is far more complex than measuring 10 kHz Z "because that's what the industry does".