What nctnico is trying to say is that the 8-bit PIC architecture is not tailored for C compilers. Yes, the XC8 compiler is much better than it used to be (the old compilers caused many traumas back then), but it still isn't up to the C99 spec.
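To make the "not tailored for C" point a bit more concrete: the small PIC cores have a hardware call stack but no real data stack, so as far as I know XC8 normally places locals and parameters with a compiled stack. That means perfectly ordinary C like the sketch below (recursion, reentrant functions) generally isn't supported there, or needs the slower software-stack model on the parts that have one:

    /* Perfectly legal C, but it leans on a real data stack. With a
     * compiled-stack model on the mid-range PICs, a recursive call like
     * this is (to my knowledge) rejected or unsafe. */
    unsigned int factorial(unsigned int n)
    {
        return (n < 2) ? 1 : n * factorial(n - 1);
    }

On an ARM or AVR you wouldn't even think twice about code like that.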
Back to OP's question:
In my experience some university teachers are very unwilling to bend their teaching program to cover a wider area. That is because they have invested quite a lot of time and effort in building their material and coursework around one ecosystem.
That does not mean their choice is the best; it's just one that was made so they avoid supporting 8 student groups, each with their own microcontroller + hardware + ecosystem. If you have to spend 30 minutes with each group debugging a flaky UART, you would have a very long day indeed.
However, this behaviour can sometimes overshoot: teachers will stick with what they knew 20 years ago and keep teaching 20-year-old crap with no hint of modern techniques or technology. E.g. I had an intern last year who had been taught ASM on a PIC16F54. No alternative was allowed, otherwise he wouldn't get a grade. There is also absolutely no practical value in that, because A) that PIC doesn't have any peripherals to speak of that could be taught, and B) who programs in ASM these days?
I was fortunate at my college: the teacher managing the embedded systems courses made sure we had tools for AVR, Altera, Analog Devices DSPs, Xilinx, PIC, Philips 8051, ARM and TI. Basic embedded courses were taught on PIC first, but were later refitted to AVR. The teacher was quite supportive of students using the tools (when available) for their own or college projects, and if he wasn't too busy he would take questions even about MCUs he hardly knew at all. The key here is that walking through your reasoning is a very important part of understanding & debugging problems.
Unfortunately I have also seen teachers who are like: "for this course you need to use VisualHDL" (which only runs on the college PCs). You could deviate from that choice, but then you're on your own. Great if it works; if it doesn't... bad luck, maybe try VisualHDL next time.
It seems like your professor responded a bit grumpily today. I wouldn't let it distract you too much, assuming you asked him a question he could give an honest answer to (although Arduino=crap is also a very honest opinion). What I mean is, I assume the question was about the project & not about the exact shenanigans of some Arduino library.
As for my MCU choices: most projects I've worked on are not very large volume. Even so, I never choose Arduino. The IDE is not productive enough for me, since I'm used to much more capable, full-featured IDEs. The Arduino libraries are fun for proofs-of-concept and hobby-grade one-offs, but IMHO at an engineering college you should refrain from using them.
If you skip the libraries and just use Arduino for the IDE & bootloader, you're basically programming it like any other ATmega328P, and maybe you should just say you tested your code on an AVR because you had one at home.
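For the record, here's roughly what I mean: a minimal blink program in plain avr-libc with nothing Arduino-specific in it beyond the bootloader you'd use to flash it. It assumes a 16 MHz clock and an LED on PB5 (Arduino pin 13), so adjust for your board:

    #define F_CPU 16000000UL        /* assumed clock; change to match your board */
    #include <avr/io.h>
    #include <util/delay.h>

    int main(void)
    {
        DDRB |= _BV(DDB5);          /* PB5 (Arduino pin 13) as output */

        for (;;) {
            PORTB ^= _BV(PORTB5);   /* toggle the LED */
            _delay_ms(500);
        }
    }

Reading the datasheet and talking to the registers directly like this teaches you a lot more than digitalWrite() ever will.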
For most projects I design, I drop in a common MCU I've used before. At work we used to do all designs with an LPC1768, which was sufficient for all our needs. Lately we've upgraded to the STM32F427. Cost is absolutely not an issue here; it's basically $2-4 extra per board, but productivity & sanity matter more (remember: low volume, <500 boards per year).
However, if a project has some odd but important requirement, I wouldn't go through great effort to make it work on that beloved MCU. For example, I would never consider doing Ethernet on an AVR anymore: plenty of ARM Cortex-M3 chips are around with a built-in 10/100 Ethernet MAC that only needs a ~$2 external PHY, giving far fewer lines of code with much better performance and integration.