Synchronous vs. Asynchronous
The technical theory behind asynchronous and synchronous communication is
effectively the same: device B needs to know when a transmission from device A
has started, and if it was understood correctly. However, the difference lies in
how the data transmission is broken down.
To put this in context, let's look at it in terms of a conversation between two people.
With asynchronous communication, you would need to stop after every word to
make sure your conversational partner understood what you had said and knew
you were about to speak the next word.
With synchronous communication, you would establish with your conversational
partner that you are speaking English, that you will speak words at measured
intervals, and that you will speak a complete sentence or paragraph before
pausing to confirm they understood. You would also establish with your
listener beforehand that any odd noises you make during the conversation
or between sentences (coughs, umms, errs, etc.) should be ignored.
Clearly the synchronous method is much faster, even though starting the
conversation may take slightly longer. In fact, by replacing the start, stop and
parity bits around individual words with start, stop and control (processing
instructions and error checking) sequences around large continuous data blocks,
synchronous communication is about 30% faster than asynchronous
communication, before any other factors are considered.
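To make the overhead comparison concrete, here is a back-of-the-envelope sketch in Python. The framing sizes (two overhead bits per word for the asynchronous case, one 48-bit control sequence per block for the synchronous case) are illustrative assumptions, not figures from any particular protocol.

```python
# Illustrative framing-overhead comparison. The per-byte and per-block
# overhead figures below are assumptions chosen for the arithmetic,
# not values taken from a specific protocol.

def async_bits(n_bytes, overhead_per_byte=2):
    # asynchronous: a start and a stop bit wrap every 8-bit word
    return n_bytes * (8 + overhead_per_byte)

def sync_bits(n_bytes, control_bits=48):
    # synchronous: one control/error-checking sequence wraps the whole block
    return n_bytes * 8 + control_bits

payload = 1000  # bytes
a = async_bits(payload)  # 10000 bits on the wire
s = sync_bits(payload)   # 8048 bits on the wire
savings = (a - s) / a    # about 0.195, i.e. roughly 20% fewer bits
```

With a parity bit added to each asynchronous word (three overhead bits per 8-bit word), the saving approaches the ~30% figure quoted above.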
Synchronous data transfer: sender and receiver use the same clock signal
o supports a high data transfer rate
o needs a clock signal between the sender and the receiver
o requires a master/slave configuration
Asynchronous data transfer: sender provides a synchronization signal to
the receiver before starting the transfer of each message
o does not need a clock signal between the sender and the receiver
o has a slower data transfer rate
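The asynchronous per-message synchronization can be sketched as UART-style framing. This is a minimal illustrative model (start bit, 8 data bits sent LSB-first, stop bit), not a full UART implementation:

```python
# Minimal sketch of asynchronous (UART-style) framing: the line idles
# high; each byte is sent as a start bit (0), 8 data bits LSB-first,
# and a stop bit (1). The start bit is the per-message synchronization
# signal described above.

def frame_byte(b):
    bits = [0]                                # start bit
    bits += [(b >> i) & 1 for i in range(8)]  # data bits, LSB first
    bits.append(1)                            # stop bit
    return bits

def unframe(bits):
    assert bits[0] == 0 and bits[9] == 1, "framing error"
    return sum(bits[1 + i] << i for i in range(8))

frame = frame_byte(ord("A"))   # 10 bits on the wire for 8 bits of data
assert unframe(frame) == ord("A")
```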
Notes:
There are many serial data transfer protocols, and they can be grouped into
two types: synchronous and asynchronous. For synchronous data transfer,
both the sender and receiver access the data according to the same clock.
Therefore, a special line for the clock signal is required. In synchronous
data transfer, a master (or one of the senders) provides the clock signal
to all the receivers.
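A toy model of this arrangement, assuming a hypothetical two-wire (clock + data) link in the spirit of SPI: the master drives both lines, and the receiver simply samples the data line on each rising clock edge, so no per-byte framing is needed.

```python
# Toy model of synchronous transfer over a hypothetical clock + data pair
# (illustrative only, not a specific bus protocol).

def send(bits):
    # Master drives (clock, data) pairs: data is set up while the clock
    # is low, then held while the clock goes high.
    for b in bits:
        yield (0, b)   # set up data
        yield (1, b)   # receiver samples on this rising edge

def receive(line):
    prev_clk = 0
    out = []
    for clk, data in line:
        if clk == 1 and prev_clk == 0:  # rising clock edge
            out.append(data)
        prev_clk = clk
    return out

bits = [1, 0, 1, 1, 0, 0, 0, 1]
assert receive(send(bits)) == bits
```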
For asynchronous data transfer, there is no common clock signal between the
sender and receivers. Therefore, the sender and the receiver first need to agree
on a data transfer speed. This speed usually does not change after the data
transfer starts. Both the sender and receiver set up their own internal circuits to
make sure that data access follows that agreement. However, just like
some watches run faster than others, computer clocks also differ in accuracy.
Although the difference is very small, it can accumulate fast and eventually cause
errors in data transfer. This problem is solved by adding synchronization bits at
the front, middle or end of the data. Since the synchronization is done
periodically, the receiver can correct the clock accumulation error. The
synchronization information may be added to every byte of data or to every
frame of data. Sending these extra synchronization bits may account for up to
50% data transfer overhead and hence slow down the actual data transfer rate.
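The effect of clock inaccuracy and periodic resynchronization can be sketched with a toy model; the 2% drift figure and the 10-bit frame length here are assumptions chosen for illustration, not properties of any real device:

```python
# Toy model of receiver clock drift: the receiver's clock runs 2% fast,
# so its sampling point drifts by 0.02 bit-times per bit. Resynchronizing
# at each sync point resets the accumulated error; a free-running clock
# lets it grow without bound.

def sample_error(bit_index, drift=0.02, resync_every=None):
    """Accumulated timing error, in bit-times, at a given bit position."""
    if resync_every:
        bit_index %= resync_every  # error resets at each sync point
    return bit_index * drift

# Free-running: after 30 bits the error exceeds half a bit-time, so the
# receiver samples inside the wrong bit cell.
free_running = sample_error(30)

# Resyncing every 10-bit frame keeps the worst-case error small
# (9 bits of drift at most before the next sync).
worst_with_resync = max(sample_error(k, resync_every=10) for k in range(100))
```

This is why the accumulated error stays bounded as long as synchronization bits arrive more often than the drift takes to reach half a bit-time.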