"Why can your local IDE build not cache downloads? What does SSL have to do with that?"
It does. However, you are getting more and more specific, drilling down into just one issue.
The build system (Maven) will, on a fresh run, resolve the dependency tree and download each and every one of the hundreds of jar files. These can, and often do, total a figure best measured in gigabytes. Some wider build systems will also download Docker images, each of which can be several gigabytes.
Amongst them are artifacts that are built locally, built remotely, or built as part of a pipeline elsewhere and published to an internal repo.
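If you want to see the scale for yourself, something like this gives a rough picture (assuming a default Maven setup; the repo path may differ if you have overridden it):

```bash
# Print the full resolved dependency tree for the project in the current directory.
mvn dependency:tree

# See how much the local repository you have accumulated over time weighs on disk
# (default location; yours may be elsewhere if overridden in settings.xml).
du -sh ~/.m2/repository
```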
Remember this is development, with dev phases maybe only a few weeks long from "the first shovel" to rubbing your hands together in glee as it passes into production. So you spend quite a bit of time bumping and cross-linking versions of different libraries and services.
In an ideal world all the versioning would match up perfectly and everyone would remember to bump their version numbers, but inevitably issues occur with stale, stuck, or out-of-date dependencies in the local repo.
It can take hours and hours to pin down which one it is and why it's broken, delete it and redownload it. Just as in Windows a lot of problems are solved by "Reboot", in Maven a lot of problems are solved by "Update dependencies..." with "Force download snapshots" ticked. Either that, or delete the contents of the repo and rebuild.
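On the command line the equivalents look roughly like this (a sketch; the IDE buttons drive the same Maven machinery underneath):

```bash
# Force Maven to re-check remote repositories for updated snapshots/releases
# instead of trusting whatever is already sitting in the local repo.
mvn -U clean install

# More surgical: purge this project's dependencies from the local repo and
# re-resolve them (a maven-dependency-plugin goal).
mvn dependency:purge-local-repository

# The sledgehammer: delete the whole local repo and let the next build
# re-download everything from scratch.
rm -rf ~/.m2/repository
```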
There are other reasons to frequently dump your local repo. The build servers always start from a clean repo, so for consistency purposes there is always a defined "clean" starting point. If you want to replicate that locally, you need to dump your repo and invalidate your caches.
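One way to replicate that locally without nuking the repo you use day to day is to point Maven at a throwaway directory for a single build (a sketch, assuming nothing in your settings.xml already pins the repo location):

```bash
# Build against an empty, disposable local repository, exactly like a CI agent
# starting from scratch. Every dependency is downloaded fresh.
mvn -Dmaven.repo.local="$(mktemp -d)" clean verify
```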
Thus SSL is an issue, because there is nothing caching the incoming jar files. If someone, like QA, invalidates an entire stack of services and libraries, forcing a re-run of those pipelines... say, to initiate a deployment to UAT... you could see maybe 30 or 40 parallel builds all basically downloading the same dependencies over HTTPS. That's a terrible waste.
"Why not use an internal repo or mirror for all dependencies?"
I have worked in companies where that is indeed the norm. They have teams of dozens of engineers working full time just approving and mirroring Java dependency repos. Then they have to do the same for RHEL repos, Python repos, Anaconda repos, .NET repos, Docker repos, Helm repos... JS/TS repos. This costs millions a year to run for a large enterprise with in-house devs (like a bank). Not even government goes this far. The military probably do.
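For completeness, where such a mirror does exist (Nexus, Artifactory or similar), pointing a build at it is only a settings.xml change. The hostname below is made up purely for illustration:

```bash
# Write a minimal settings file that routes all dependency traffic through an
# internal caching mirror (hypothetical hostname), then build with it.
cat > /tmp/mirror-settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>internal-mirror</id>
      <mirrorOf>*</mirrorOf>
      <url>https://nexus.internal.example/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
EOF

mvn -s /tmp/mirror-settings.xml clean install
```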
The common CDN sources for these things are pretty tightly monitored and peer validated. They are not immune to supply-chain attacks, such as the recent scandal with "colors.js". However, they are watched by many, many security researchers across the planet, especially after the colors.js scandal. So with so many pairs of eyes on them, and basically none on your local repos, it is debatable which is more secure.
Online, webpage-style CDNs are not really part of my "scene"; I am a back-end developer. "colors.js" would not have been anywhere near as impactful on the backend. It's rare to see a backend application that uses a public CDN at runtime. At build time maybe, but the "sealed" output must pass testing before deployment, so a "colors.js" incident would be immediately spotted.