Date:      Tue, 13 Jan 2004 13:16:41 -0800
From:      "Crist J. Clark" <cristjc@comcast.net>
To:        freebsd-hackers@freebsd.org
Subject:   Measuring Time on Serial Port Events
Message-ID:  <20040113211641.GA99925@blossom.cjclark.org>

I'm doing some work involving measuring latencies of communications
over serial ports. To avoid the clock synchronization issues we would
have if we were running on separate machines, the configuration is one
modem hooked into /dev/cuaa0 and another into /dev/cuaa1. We talk to
the modem on cuaa0, which calls the modem on cuaa1; we tell that one
to answer, and then we throw data back and forth and take timestamps.

Right now, all of the code is running in userland.
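
Roughly, it's the obvious open/write/read with gettimeofday() around
it. A minimal sketch of what I mean (the 115200 rate, the buffer
sizes, and the elided dialing handshake are illustrative, not our
actual harness):

    #include <sys/time.h>

    #include <fcntl.h>
    #include <stdio.h>
    #include <termios.h>
    #include <unistd.h>

    static int
    open_port(const char *path)
    {
            struct termios t;
            int fd;

            if ((fd = open(path, O_RDWR | O_NOCTTY)) == -1)
                    return (-1);
            tcgetattr(fd, &t);
            cfmakeraw(&t);               /* raw mode: VMIN=1, VTIME=0 */
            cfsetspeed(&t, B115200);     /* assumed DTE rate */
            t.c_cflag |= CLOCAL | CREAD;
            tcsetattr(fd, TCSANOW, &t);
            return (fd);
    }

    int
    main(void)
    {
            struct timeval sent, rcvd;
            char buf[64];
            ssize_t n;
            int rx, tx;

            tx = open_port("/dev/cuaa0");   /* originating modem */
            rx = open_port("/dev/cuaa1");   /* answering modem */
            if (tx == -1 || rx == -1)
                    return (1);

            /* ... dial from cuaa0, tell cuaa1 to answer, wait ... */

            gettimeofday(&sent, NULL);      /* stamp just before write() */
            write(tx, "ping", 4);
            n = read(rx, buf, sizeof(buf)); /* blocks for first byte */
            gettimeofday(&rcvd, NULL);      /* stamp after read() returns */

            if (n > 0)
                    printf("latency: %ld us\n",
                        (rcvd.tv_sec - sent.tv_sec) * 1000000L +
                        (rcvd.tv_usec - sent.tv_usec));
            return (0);
    }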

I am trying to figure out what tuning we could do to get things as
accurate as possible. That is, the information we want is the time at
which a batch of bits leaves one COM port versus when it arrives on
the other. The actual path looks more like this:

   Userland    |    OS      | Comms Hardware |
               |            |                |
 [measuring]<->|<-[ sio  ]->|<---- UART ---->|<------->
 [ program ]   |  [driver]  |                |

so a userland timestamp doesn't account for the delay between the data
arriving in userland and the gettimeofday() call that stamps it, nor
for the other potential delays along the path.
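
At least the cost of the gettimeofday() call itself is easy to bound
with back-to-back calls; something like this (again just a sketch,
not our harness) puts a floor under the timestamping overhead:

    #include <sys/time.h>

    #include <stdio.h>

    int
    main(void)
    {
            struct timeval a, b;
            long delta, worst = 0;
            int i;

            /* Worst-case gap between two consecutive timestamps. */
            for (i = 0; i < 100000; i++) {
                    gettimeofday(&a, NULL);
                    gettimeofday(&b, NULL);
                    delta = (b.tv_sec - a.tv_sec) * 1000000L +
                        (b.tv_usec - a.tv_usec);
                    if (delta > worst)
                            worst = delta;
            }
            printf("worst back-to-back delta: %ld us\n", worst);
            return (0);
    }

That doesn't catch the scheduling delay between the UART interrupt and
our read() returning, though, which is the part I can't bound from
userland.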

I'm concerned about how far off our userland measurements will be from
when the bits actually leave and arrive on the wire. The data we are
concerned with has latencies of a few hundred milliseconds, but the
calibrations on the PSTN are typically around 50 ms. We need a few
significant digits of resolution below that, i.e., sub-millisecond
timestamps.

Any pointers?
-- 
Crist J. Clark                     |     cjclark@alum.mit.edu
                                   |     cjclark@jhu.edu
http://people.freebsd.org/~cjc/    |     cjc@freebsd.org


