Darkman's answer is one I had used on similar systems, with RS-422 or repeated RS-232 links between nodes (back in the early 1980s). A simple protocol that included a device ID and device channel, as well as an I/O type and command, was all that was needed, along with start and end frame characters.
IIRC, I used '#' as the start delimiter, followed by two bytes for the device address ('00' to 'ff' as two ASCII hex characters), a byte for the command ('0'-'f') and a byte for the channel ('0'-'f'), followed by a state byte (depending on the command). All up, the longest commands used 10 ASCII characters (that's 80 bits), and the I/O devices each had 8 digital inputs, 8 digital outputs and 4 analogue inputs. Operation was poll-response, so the master station would poll each device in turn, or send out specific commands to particular devices and channels. The end delimiter was <CR>.
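A minimal sketch of building a frame in that layout (the function name, the optional-field handling and the exact field widths are my own reading of the description above, so treat them as illustrative rather than the original implementation):

```python
# Master-to-device frame: '#' + 2-hex-char address + command nibble
# [+ channel nibble [+ state byte]] + <CR>.
# Field layout is an assumption based on the description above.
START, END = '#', '\r'

def build_frame(device, command, channel=None, state=None):
    """Build an ASCII frame; device is 0x00-0xff, command/channel 0x0-0xf."""
    frame = f"{START}{device:02x}{command:x}"
    if channel is not None:
        frame += f"{channel:x}"       # optional channel nibble
    if state is not None:
        frame += f"{state:02x}"       # optional state byte as two hex chars
    return frame + END
```

For example, `build_frame(0x01, 0x1)` gives `'#011\r'`, the poll frame mentioned below, and the longest frames stay within a handful of ASCII characters.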
Testing could easily be done using a terminal, or HyperTerminal, as suggested. Typing something like '#011<CR>' would have the device return its 8 digital inputs, using a start delimiter of '$' (something like '$0113e<CR>', where the state of the 8 DIs was 0x3e). I ran this in a house for 15 years and did not need any error detection or correction. Some error rejection is already built in by allowing only the small set of characters '0'-'f', '#' and <CR> in a message; anything else causes the message to be dropped.
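The character-range rejection can be sketched as a response parser that drops anything outside the allowed set (the field layout here is inferred from the '$0113e<CR>' example, so the exact widths are an assumption):

```python
# Device-to-master response: '$' + 2-hex-char address + command/channel
# nibble + 2-hex-char state + <CR>. Reject anything malformed.
VALID_CHARS = set('0123456789abcdef')

def parse_response(frame):
    """Return (address, command, state) or None if the frame is rejected."""
    # Frame must be exactly '$' + 5 payload chars + <CR>.
    if len(frame) != 7 or frame[0] != '$' or frame[-1] != '\r':
        return None
    body = frame[1:-1]
    # Error rejection: only '0'-'f' allowed in the payload.
    if any(c not in VALID_CHARS for c in body):
        return None
    address = int(body[0:2], 16)
    command = int(body[2], 16)
    state = int(body[3:5], 16)
    return address, command, state
```

With this, `parse_response('$0113e\r')` yields address 0x01, command 0x1 and state 0x3e, while a frame containing any stray character is simply dropped, which is the cheap robustness the answer relies on.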