Has anyone got some code or ideas on how to sense the baud rate
of an asynchronous serial I/O link automatically? Ideally this should
be done continuously, allowing an RC-clocked PIC to adjust its
calibration as its operating frequency changes.
I have considered using the start bit to give the timing of the link
(baud rate), which, as you say, will only work for 50% of characters.
I was also thinking in terms of timing edge transitions and dividing
each interval by the bit time to give the number of bits at the same
level (mark or space). Has anyone else considered this, rather than
the more conventional approach of sampling the line at the bit rate
after a delay of half a bit time from the start edge?
I have seen several systems where it was required to press
the space bar as the first character. In serial communications,
the least significant bit is transmitted first. For a space (ASCII
0x20), the bit pattern on the line would look like the following:

    0   0 0 0 0 0 1 0 0   1 1
    |   \_____________/   \_/
  start   data, LSB first  stop

This conforms to 1 start bit, 8 data bits, no parity and two stop bits.
The processor then has six bits that are low in which to increment a
counter and calculate the serial data rate relative to its own clock
frequency. The longer time for measurement makes for a more accurate
determination of the serial data rate.
Another example would be to use a ( , 8 , H , X , h or an x. These
characters all give a measurement four bit times long (start bit plus
data bits 0-2, which are low in each of them). Although a little less
accurate, you only need to right shift your counter twice (divide by
four) in order to calculate the reload data for your bit counter.
The BASIC interpreter in the 8051AH-BASIC chip requires you to press
the space bar to log on, and then for the real time clock to work,
you give the command xtal = xx.xxxxxx to match the crystal connected
to the microcontroller.