"For those making a living using those tools ... a free tool is simply too expensive to use."
That's not a particularly good excuse for avoiding gcc; you can always pay more to reduce your costs :-)
That *is* what the various vendors who charge for gcc are supposed to be doing: providing the ease of use, customer support, maintenance, testing, and "version change caution" on top of what the OSSW hackers who actually modify gcc are inclined to do. In theory, the chip vendors (Atmel, Microchip) provide similar assistance when they ship "custom" gcc-based tools.
And there is still scaling, in more ways than one. When I worked at cisco, we used gcc pretty much exclusively, starting on various 68k cpus and adding PPC, MIPS, x86, Sparc, ARM, and others as time went on. In the beginning, we were looking at one poorly supported, not-very-good free compiler (pcc, as shipped on various early 68k workstations?) versus a new compiler (gcc) that created noticeably better code (more than 10% smaller over a large system). As we added processors, architectures, people, and code, gcc pretty much kept up with us, and despite "looking", we didn't find any compelling competitors until Intel added bi-endian support for x86 (~2006). There were, of course, issues:
1) (in the early days) We had some compiler hackers. And lots of people who could argue with compiler hackers over whether various behaviors were bugs in the compiler or not.
2) We started paying (Cygnus) for stable versions of gcc.
3) Being able to patch the compiler and associated tools was invaluable.
4) We had a "compiler support team" that would apply our custom patches, interface with Cygnus, carefully version-control and test new versions to see whether there were surprises, and integrate the compilers into what was eventually a rather baroque build environment. Also a "tools team" that did non-compiler tools.
5) Keeping the same gcc compiler as we switched cpus was a MAJOR advantage; for a very large embedded system, the toolset surrounding your compiler becomes very large, and it's probably not practical to duplicate it for multiple vendors.
6) Running on multiple platforms was a major advantage; no Windows PC environment was likely to work as we went from 68k workstations to Solaris servers to farms of Linux devices and/or personal Linux systems.
7) Even so, gcc underwent significant and apparently random "churn" that kept us several versions behind while we figured out how to proceed. If we had had a particular compiler vendor, we might have been able to bend them to our needs, and that might have been bad for us in the long run (dealing with Intel was "interesting"). Really random things turned out to be major issues: discontinuation of multi-line string literals; a change in the ordering of macro expansion vs. string argument concatenation; weird things like that (a sketch of what those two changes looked like follows after this list). It's amazing how much stuff ends up in a large C program that isn't officially "specified" somewhere. (Or that "wasn't"; having a language standards committee make a decision counter to the way you've been doing things... sucks.)
8) I don't know that we ever used much "real-time debugging"; I never did. We had gdb working over serial (or network) connections to the DUT, plus our own debugging capabilities (not so different from printf). But it wasn't really a real-time system, either. (A sketch of the gdb setup is below, too.)
9) We reached a point where our code was pretty much too big and too monolithic for other compilers to handle. :-( Tools like Eclipse that like to build and read a database of the full code base were ... extremely outmatched.
10) I don't remember "code quality" ever being a significant issue, and we had people who would pore over generated code and do heavy-duty runtime analysis. gcc might not be the absolute best compiler around, but it was pretty good. Across ALL the cpus we used it on.
11) There's a tremendous amount of effort that SHOULD go into using a compiler in a professional environment, and I think a lot of smaller companies tend to overlook it. (I.e.: you should version-control your compiler and the entire toolchain used to build your system, and do massive testing anytime any of it changes; one cheap trick toward that is sketched after this list.) (It's really "not impressive" when a chip vendor's IDE (v2) fails to compile their own "demo system example program" originally shipped with the v1 IDE. Sigh.)
12) Being tied to a single compiler vendor for your platform is not particularly good. It's somewhat (a lot) better if that compiler is OSSW, though. (FTDI, Cypress, Zilog Z8, PIC8... All make me nervous.)
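Since I mentioned the string-literal and macro changes in (7), here's roughly what they looked like. The version is from memory, and the names and values here are made up for illustration, but the language behavior is standard C:

    /* Old gcc accepted a raw newline inside a string literal as an
     * extension (removed somewhere in the 3.x series, from memory):
     *
     *     char *banner = "first line
     *     second line";             <-- later gccs reject this
     *
     * The portable replacement: adjacent literals are concatenated
     * in translation phase 6, after macro expansion:
     */
    const char *banner =
        "first line\n"
        "second line\n";

    /* The macro-expansion-vs-stringizing gotcha: '#' does NOT expand
       its operand, so you need the classic two-level trick: */
    #define STR_(x) #x
    #define STR(x)  STR_(x)
    #define VERSION 42
    /* STR_(VERSION) expands to "VERSION"; STR(VERSION) expands to "42" */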
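And for (8), the gdb setup was nothing exotic; something along these lines (the device name, address, and symbol names are made up, but the commands are stock gdb remote debugging):

    (gdb) file image.elf                 # symbols for the build under test
    (gdb) target remote /dev/ttyS0       # gdb stub over a serial line
    (gdb) target remote 10.1.2.3:2345    # ...or the same stub over the network
    (gdb) break panic
    (gdb) continue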
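On (11): real version control of the toolchain means archiving the actual binaries (or their exact sources and build recipes), but one cheap extra safety net is to have the code itself refuse to build with a compiler you haven't qualified. The version numbers below are placeholders; the __GNUC__ macros are standard gcc predefines:

    /* Refuse to build with a compiler other than the one we tested.
       (4.9 is a placeholder; substitute whatever you qualified.) */
    #if !defined(__GNUC__) || (__GNUC__ != 4) || (__GNUC_MINOR__ != 9)
    #  error "This tree is qualified against gcc 4.9.x only"
    #endif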
(ramble, ramble.)
So ... is gcc suitable for professional development? Hell yes!
Is it really "free"? Almost certainly not; you always have to spend time and energy supporting your compiler. The more support you get from somewhere else, the less you have to spend on your own. Ideally, you find a vendor whose support strengths complement your own. A compiler vendor who spends a lot of time "supporting" C beginners may not be useful if your level of tool expertise is relatively deep. If your tool expertise is NOT deep, you want a vendor and toolset that corrects for that. (I mean, I can (probably) build an ARM gcc from source, figure out which switches are needed to support the combination of features on the particular ARM chip I'm using, and import (or even write) the .h files defining the on-chip registers for core and peripherals from the vendor's tools or datasheet. But I'd really rather not...)
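(To be concrete about "which switches": for, say, a Cortex-M4 part with a single-precision FPU, you end up reverse-engineering something like the line below. The flags are real gcc options, but the right combination is chip-specific, which is exactly the part the vendor toolchains figure out for you:

    arm-none-eabi-gcc -mcpu=cortex-m4 -mthumb -mfloat-abi=hard \
        -mfpu=fpv4-sp-d16 -Os -ffunction-sections -fdata-sections \
        -c main.c

...plus the kind of register definitions the vendor headers save you from typing in from the datasheet; the address here is made up:

    #include <stdint.h>
    /* Hypothetical memory-mapped GPIO output register: */
    #define GPIOA_ODR (*(volatile uint32_t *)0x40020014u)

)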
Is it cheaper than other compilers in the long run? That's a complicated question that depends on a lot of things.
What about for ARM? (This was a thread about ARM compilers...) I would not have any hesitation about using a gcc-based compiler for ARM. However, I would not be happy about trying to build my own gcc-arm toolchain from scratch for any particular ARM chip. One of the "relatively inexpensive" gcc-plus-IDE packages would be attractive, though. OTOH, I'm not religiously inclined to have a strong preference for open source compilers, either. If the company/class/eval-kit provides Keil or IAR or Green Hills (no one has mentioned GH yet?), I'm not going to throw a tantrum about wanting to use OSSW instead. (You laugh - but this is going on NOW in the UTexas Embedded Systems MOOC. Not a lot of it: there are about 30k people supposedly signed up for the class, and maybe a score have expressed reservations in the discussion forums about being able to continue past the 32k limitation (the class is using Keil) after the class is over. And one actual tantrum... Sigh.)