Not sure what the debate is really about any more in this thread.
While "digital" can mean various things depending on the context, it generally refers to a technique for coding and processing information that is binary at its foundation (i.e., a 1 or a 0, typically the presence or absence of a voltage relative to some agreed-upon threshold), subject to shared conventions.
Of course, "digital" might refer to digits as in human fingers and toes, or to a digital readout such as the display of a clock, and it can probably mean a lot of things to a lot of people. But the vast majority of the Information Technology humans use to build Information Systems for managing information (including data, text, audio, images, and video) handles that information foundationally as bits (binary digits), i.e., 1s and 0s. Those 1s and 0s can be coded according to popular standards such as ASCII or Unicode (often written out in shorthand notations like hex), or using proprietary, special-purpose, or obscure schemes, but digital at its foundation is more often than not (there are exceptions to nearly everything) binary, i.e., 1s and 0s.
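To make that concrete, here is a minimal Python sketch (the code points are standard ASCII; the 8-bit formatting is just a choice for illustration) showing a short string reduced to the bits a computer actually stores:

    # A short string reduced to its underlying bits (standard ASCII code points)
    for ch in "Hi":
        code = ord(ch)                         # 'H' -> 72, 'i' -> 105
        print(ch, code, format(code, "08b"))   # 72 -> 01001000, 105 -> 01101001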
Sometimes, when trying to determine whether something is on or off, we might come up with "can't discern", "don't know", or maybe even "don't care". No doubt we can point to three states or conditions, or potentially n states and conditions. In fact we can do a lot of things a lot of different ways, but it turns out that with just 1s and 0s we can do a huge number of things, and the simplicity, consistency, and elegance of binary has made it the foundation of the biggest revolution in human history. So maybe, instead of obscuring that reality with what are at best interesting edge cases, we ought to acknowledge that digital is fundamentally, at its core, binary. I think we may be blurring bits with states, logic levels, and codes. The distinctions are meaningful, but we might be losing sight of the forest while examining the trees.
Using 1s and 0s we can represent virtually anything, but since our systems generally need to provide some practical utility, the 1s and 0s get quantized into bytes or other finite sets of bits, which in some cases come up short of analog's "continuously variable". And of course, humans are becoming very clever at developing mixed-signal techniques that cross between the two domains (A2D and D2A conversion). We live in an analog, physical world that we are learning to impact ever more through the application of digital technology, which generally, at its foundation, is binary, i.e., 1s and 0s.
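As a rough illustration of that quantization (the 8-bit depth and the 0-5 V reference are purely assumed values for the example, not anyone's actual hardware), here is a small Python sketch of how a continuously variable voltage gets collapsed onto a finite set of codes:

    # Map a continuously variable voltage onto one of 256 discrete codes (8 bits),
    # assuming a 0-5 V reference purely for illustration
    def adc_8bit(voltage, vref=5.0):
        code = round(voltage / vref * 255)   # collapse the continuum into 256 steps
        return max(0, min(255, code))        # clamp to the representable range

    print(adc_8bit(3.1415))  # -> 160
    print(adc_8bit(3.1430))  # -> 160 as well: nearby analog values share one code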
Net, net: I think we can give it to radiolistener that, for most people most of the time, "binary" and "digital" are closely and very commonly related, even if the two terms are not completely synonymous or fully interchangeable.