Can't be answered without 1/ defining reliability and 2/ performing a proper system analysis. There's no simple answer to this; too bad for those looking for quick recipes.
Generally speaking, the more hardware components a system includes, the more possibilities for hardware failure. Now, replacing a lot of hardware with less hardware and more software can lower the overall failure probability - but not necessarily. Complex ICs like MCUs can "fail" for many reasons as well (not just software-wise). And then there's the software itself... it's well known that unless your system is very simple, bug-free software is notoriously hard to achieve. Again, only a thorough analysis will tell whether likely software bugs can lead to worse failures than the hardware failures you'd get by just having more components in the system instead...
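To put some back-of-the-envelope math on that first point, here's a minimal sketch - with completely made-up reliability figures, assuming independent failures in a series system (any single failure takes the whole system down):

```python
# Illustrative only: hypothetical one-year reliability figures, assuming
# independent failures in a series system (any one failure fails the system).

def series_reliability(component_reliabilities):
    """Series-system reliability: the product of the component reliabilities."""
    r = 1.0
    for c in component_reliabilities:
        r *= c
    return r

# Hypothetical discrete design: 20 parts, each 99.9% reliable over a year.
discrete = series_reliability([0.999] * 20)

# Hypothetical MCU design: one IC at 99.95%, plus firmware whose bug-related
# failure probability is the big unknown - so try a few guesses.
for fw_reliability in (0.999, 0.99, 0.95):
    mcu = series_reliability([0.9995, fw_reliability])
    print(f"discrete: {discrete:.4f}   mcu+firmware: {mcu:.4f}")
```

With these (again, invented) numbers, the MCU version wins as long as the firmware is reliable enough, and loses once it isn't - which is exactly the kind of comparison only a real analysis can settle.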
And the last part nctnico mentioned is an extremely common trap. Yes, hardware issues can be very costly to fix compared to software bugs, but the difference is often largely overestimated in practice - meaning the cost of software (developing AND fixing it) is largely underestimated. Just because software looks easy to fix compared to hardware doesn't mean it will cost less in the end. As for pure development, IME it's often the exact opposite: unless you're working on systems with extremely simple software, or on systems that are particularly tricky hardware-wise, software will most often cost you a lot more engineering time than hardware. And in many cases (not all, of course), engineering costs outweigh most other kinds of expenses on a given project...