For a commercial user, $1500 is a pittance.
It exceeds the limits for discretionary spending, so it requires approval (possibly through several layers of management), careful accounting of depreciation, and hopefully prior inclusion in this year's budget. True, it is dwarfed by the cost of a programmer, and if it's known ahead of time to be required for a product design it shouldn't be a significant part of the overall budget, but calling it a "pittance" understates the friction involved. (I once spent about $1200 on a computer for work. The CFO was NOT HAPPY with me.)
(And this gets even more complicated if you're a moderately large company and have to start worrying about $1500 per seat instead of just $1500, or a license compliance scheme that may or may not be compatible with your infrastructure or culture.)
I had this epiphany recently. A professional software or microcontroller developer is essentially indistinguishable from a hobbyist when presented with a brand new chip architecture. That hypothetical time period where your 6-figure-salaried expert gets to order half a dozen different eval boards, each with its own development environment and $1500 price tag, and spend a couple of months figuring out which one will "best fit" the project? It doesn't usually exist. Chip selection is more likely to be made based on prior experience (corporate or individual). So a manufacturer needs to get a $30 eval board with free development tools to the people who might use it well before product development starts, so that they can buy it from petty cash or with their own funds, and when the NEXT product cycle comes along they might think "I was playing with X using the Y IDE and it was pretty neat; add it to the list of 'possibles.'"