If the probe cal signal works and you can transmit data across the RS-232 link but you're not seeing a signal, then a connection problem seems the most likely to me. Especially ground, since you don't need to connect a ground lead for the probe cal signal. Try switching between the signal ground and the shield of the RS-232 connector (you can sort of clip onto the screws / posts). Try a different probe, or set the probe to 1x if available.
Like I said, if you set the computer to transmit continuously, you don't have to predict where to set the trigger, because you'll have a constant square wave. Just set it, probe, and play with the controls.
I don't know how I'm going to get the computer to transmit continuously. So far I can hit a key or several keys in a row manually (or hold a key down for as long as I like), or I can transmit a relatively long file full of characters (but even that will send a page or so in about a second). I like the idea of transmitting continuously because then, to your point, I could just play with the trigger until it finds something. Just not sure how to make it transmit continuously. Thx
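One way to keep the line busy is a short script that writes a steady stream of 'U' bytes (0x55, which gives alternating bits on the wire) out the serial port. This is just a sketch assuming the pyserial library; the port name and the `flood` function are illustrative, not something from this thread:

```python
# Sketch: keep the RS-232 line transmitting continuously so the scope
# always has edges to trigger on. Assumes pyserial is installed
# (pip install pyserial); port name is an example, adjust for your machine.
import time

def flood(port="COM1", baud=9600, seconds=10):
    import serial  # pyserial
    # 8N1: 8 data bits, no parity, 1 stop bit
    with serial.Serial(port, baud, bytesize=8, parity="N", stopbits=1) as s:
        end = time.time() + seconds
        while time.time() < end:
            s.write(b"U" * 64)  # 64 characters per write, back to back

# e.g. flood("COM1", 9600) on Windows, or flood("/dev/ttyS0", 9600) on Linux
```

Any terminal program that can repeat a character works too; the point is just that the gaps between characters disappear, so the trigger always has something to catch.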
I can use the DMM to probe Pin 5 GND (with DMM negative) and Pin 3 RxD (with DMM positive) and I get -12 volts (actually -11.98). When I connect the DMM probe to Pin 2 TxD (with DMM positive) I get +11.33 volts.
I think you're going to struggle to see anything much on an analogue scope.
Each RS232 character takes 10 bit times to transmit (assuming 8N1 data format). At 9600 baud, one character takes 1/960 sec, or about 1 millisecond to transmit. Even if you hold down a key in your terminal software and get 20 repeats/second, the line is busy for 20ms and idle for 980ms per second.
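A quick back-of-envelope check of that arithmetic (purely illustrative):

```python
baud = 9600
bit_time = 1 / baud           # seconds per bit
char_time = 10 * bit_time     # 8N1: start + 8 data + stop = 10 bits
print(f"bit time:  {bit_time * 1e6:.1f} us")   # ~104.2 us
print(f"char time: {char_time * 1e3:.2f} ms")  # ~1.04 ms

# 20 key repeats per second keeps the line busy for:
busy = 20 * char_time
print(f"busy {busy * 1e3:.1f} ms per second")  # ~20.8 ms, idle the rest
```

So even held-down auto-repeat leaves the line idle about 98% of the time, which is why a free-running analogue trace shows almost nothing.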
Set up the scope to trigger on rising edges, with a trigger level of 0V, and use normal (not auto) triggering. This will ensure you trigger on the beginning of the start bit, and only see the active period. Set the time base so each bit is about 1 division wide - say, 100us/div for 9600 baud.
Press a key and your scope should trigger. Most scopes have a trigger LED or similar indicator to let you know when they've triggered OK, so even if you don't see much on the screen, you should at least see the trigger light flash.
Once it's triggering OK, turn the CRT trace brightness way up, and see what you can pick out. If you can't see anything much when pressing keys in your terminal software, try sending a long file to keep the link actively transmitting more of the time.
This sort of sporadic, non-repetitive signal is just the sort of signal that digital storage scopes are much better at showing, because they can capture the trace once and then show it continuously until the next trigger comes along. The brightness and persistence problems you get with trying to examine the signal on an analogue scope just go away. If you do a lot of work with serial interfaces like this, an inexpensive DSO would be money well spent - and once you've tried one with serial UART decoding built in, you won't want to go back.
If no data is flowing the signal should be marking (i.e. negative). It should only be spacing (i.e. positive) to denote 0 bits in the transmission, or a break condition. This applies to both transmit and receive signals.
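In code form, the convention being described looks like this (the ±3 V receiver thresholds are from the RS-232 standard; the function name is made up for illustration):

```python
# RS-232 levels at the receiver: logic 1 (mark, also the idle state) is
# NEGATIVE voltage; logic 0 (space) is POSITIVE. Between -3 V and +3 V
# is undefined by the standard.
def rs232_logic_level(volts):
    if volts <= -3:
        return 1      # mark / idle
    if volts >= 3:
        return 0      # space
    return None       # undefined region

print(rs232_logic_level(-11.98))  # the idle line measured with the DMM -> 1
print(rs232_logic_level(11.33))   # a positive (spacing) level -> 0
```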
I think I need to get a grip on the expected voltage when the proper two wires are connected to the computer when it is powered up but nothing is being transmitted and what voltages are expected for each "0" and each "1" as they reach the scope.
dfmischler, thank you. So, are you saying (given what I described above) that when no data is flowing I should be at -12 volts? And are you saying that 0 bits (which are part of some characters) will be represented by approximately +11 volts? If so, what about "1s"? What voltage would you expect for 1s? (Will those be treated as "marks" at -12 volts, or something else?)
I'm also going to try a much slower rate as suggested by amyk; 2400 to 110 or so. Any suggestions for the time div settings at these lower speeds or do you still like 100us/div?
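One way to extend the earlier "one bit per division" rule of thumb to other baud rates is to round 1/baud to the nearest standard 1-2-5 timebase step. This is a sketch of the arithmetic, not a firm recommendation:

```python
def suggested_timebase(baud):
    """Nearest standard 1-2-5 time/div so one bit is about one division wide."""
    bit_time = 1 / baud
    candidates = [m * 10.0 ** e for e in range(-6, 0) for m in (1, 2, 5)]
    return min(candidates, key=lambda c: abs(c - bit_time))

for baud in (9600, 2400, 1200, 300, 110):
    print(f"{baud:5d} baud -> {suggested_timebase(baud) * 1e3:g} ms/div")
```

For 9600 baud this gives 0.1 ms/div (the 100us/div suggested earlier); for 1200 baud it gives 1 ms/div, and for 110 baud, 10 ms/div.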
That is correct. A logic 1 is the negative voltage (mark), which is also the standby state.
I'm not sure how familiar you are with RS-232, but basically there are two types of devices: Data Terminal Equipment (DTE) and Data Communications Equipment (DCE).
If you are connecting a DTE to a DCE then you use a straight through cable. Otherwise, a DTE to a DTE (or DCE to DCE) requires a crossover, or null modem, cable.
A computer generally has a DTE port.
This means that according to the standard, it should be transmitting on pin 3 of its port. If you are using a crossover/null modem, then you would be looking at pin 2.
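For reference, the usual DTE-side signal assignments depend on which connector you have, and the two common ones disagree on the pin numbers, which is worth double-checking here:

```python
# Standard DTE-side RS-232 signal names for the two common connectors.
DE9  = {2: "RxD", 3: "TxD", 5: "GND"}   # 9-pin (often called DB-9)
DB25 = {2: "TxD", 3: "RxD", 7: "GND"}   # 25-pin

print(DE9[3], DB25[2])  # TxD is pin 3 on a DE-9 but pin 2 on a DB-25
```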
Are you checking on the oscilloscope with both devices connected, or only the one?
I actually built a tester I use at work, which is basically a diode from ground through two different LEDs; one to pin 2, and one to pin 3. Whichever pin has the negative voltage on it lights the relevant colour LED. This tells me which pin is the output pin of that device (the one that lights up).
It's very rare that it doesn't work.
First up, here is that tester I made that I mentioned:
This is connected straight to the port. Being a computer, it is a DTE port and therefore transmitting on pin 3. The tester shows this with a red LED (and this is the pin I tested on my 'scope).
This is with a crossover in place. The port is now transmitting on pin 2, hence the green LED this time.
I then hooked up my 'scope, and used Realterm to transmit a constant string of capital U. I use this program because I know it outputs on the port as soon as you hit the key. I have found some terminal emulation software that only outputs once you hit 'Enter' or similar.
I have the baud rate on the port set to 1200. The scope is set to 2V/div vertical, and 1ms/div horizontal. I have it triggered off the negative edge roughly as it passes through ground.
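For what it's worth, you can see why 'U' makes such a convenient test character by writing out its 8N1 frame (data bits go out LSB first; the function name is just for illustration):

```python
def frame_8n1(byte):
    """Bit sequence on the wire: start bit (0), 8 data bits LSB first, stop bit (1)."""
    data = [(byte >> i) & 1 for i in range(8)]
    return [0] + data + [1]

# 'U' is 0x55 = 0b01010101, so the whole 10-bit frame alternates:
print(frame_8n1(ord("U")))  # [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
```

A stream of 'U' characters sent back to back therefore looks like a continuous square wave at half the baud rate, which is the easiest possible thing to trigger on.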
Just to confirm... You don't have any kind of flow control turned on, right?
Turning off flow control didn't seem to help (just curious.... what impact would flow control have on such a test?)
Yay
If you want to study the effect of changing the data format, it's really important that you get a stable trace which always triggers in the same place, ideally at the beginning of the start bit. The proper trigger condition for this is a rising edge around zero volts.
One difficulty you'll encounter sooner or later is that rising edges occur many times: every time the transmitted data changes from 1 to 0. If the scope triggers on data bits instead of just start bits, the bit patterns on successive sweeps won't necessarily line up, and you won't see a stable trace.
To work around this, you'll need to make use of the Hold Off control, which temporarily disables the trigger circuit for a period of time after each trigger event. Set the Hold Off period to be at least as long as it takes to send a complete character - 10 bit times at whatever baud rate you're using - and the trigger circuit won't re-arm until the character is complete and the line is idle again. This ensures that the first edge it sees will be the start bit.
This technique relies on there being an idle period in between characters on the line, which you'll get if you hold down a key. If you send a long file, triggering (not to mention identifying which bit is which) will be much more difficult.
Ok, 1200 baud (roughly 1200 bits per second?) would mean 10 bits would take 1/120th of a second; not sure if I calculated that right, but even if I did, how do I specify precisely that amount of hold off? On my scope I can use the hold off knob by simply eye-balling when the pattern stabilizes, but I'm not sure I can key in a specific time value - or maybe I'm confused some more.
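The arithmetic above checks out; a one-liner to verify it:

```python
baud = 1200
min_holdoff = 10 / baud               # one full character = 10 bit times
print(f"{min_holdoff * 1e3:.2f} ms")  # 8.33 ms, i.e. 1/120 s
```

Hold-off knobs on most analogue scopes aren't calibrated, so eye-balling it until the pattern stabilizes, as described above, is the normal way to use the control; anything at or above roughly 8.3 ms should work here.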