Date:      Mon, 29 Aug 2005 14:07:46 -0500
From:      Sam Pierson <samuel.pierson@gmail.com>
To:        FreeBSD Hackers <freebsd-hackers@freebsd.org>
Subject:   Atheros driver and radiotap reliability
Message-ID:  <d9204e4c050829120713734e3d@mail.gmail.com>

Hi guys,

I'm trying to get an accurate measurement of signal strength (preferably
in dBm) on a per-packet basis between two atheros cards that I have.  I
had some correspondence with the ethereal developers and David Young
and apparently there is a bug in how ethereal handles the radiotap header.
David told me that tcpdump will correctly report whatever the device driver
tells it is the correct signal strength, but not to trust it until the
devices have been calibrated.  How does the ath driver report the signal
strength in the radiotap header?   From tcpdump, it's giving me this value:

/* taken from /sys/net80211/ieee80211_radiotap.h */
 * IEEE80211_RADIOTAP_DB_ANTSIGNAL      u_int8_t        decibel (dB)
 *
 *      RF signal power at the antenna, decibel difference from an
 *      arbitrary, fixed reference.
...

In this same file, there is a u_int8_t ANTSIGNAL reported in dBm.  It appears
as though everything is driver (and device, probably) dependent, so I'd like
to know how the driver computes this value.
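
For what it's worth, below is a minimal sketch of how I'm planning to pull
that byte out per-packet with libpcap, assuming the card is in monitor mode
and hands up DLT_IEEE802_11_RADIO frames.  The field walk and the little test
program are my own, built from the field list in ieee80211_radiotap.h, so
please treat it as a sketch of where the byte lives and not as how the driver
itself computes it:

/*
 * Sketch: print the per-packet antenna signal from the radiotap header.
 * Assumes a monitor-mode interface delivering DLT_IEEE802_11_RADIO frames;
 * a real tool should use a proper radiotap parser.
 *
 *   cc -o rtsig rtsig.c -lpcap
 *   ./rtsig ath0
 */
#include <sys/endian.h>

#include <pcap.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* size/alignment of radiotap fields for bits 0..5 (see ieee80211_radiotap.h) */
static const struct { size_t align, size; } rt_field[] = {
	{ 8, 8 },	/* 0 TSFT */
	{ 1, 1 },	/* 1 FLAGS */
	{ 1, 1 },	/* 2 RATE */
	{ 2, 4 },	/* 3 CHANNEL: freq + flags */
	{ 2, 2 },	/* 4 FHSS */
	{ 1, 1 },	/* 5 DBM_ANTSIGNAL */
};

static void
got_packet(u_char *user, const struct pcap_pkthdr *h, const u_char *pkt)
{
	uint16_t it_len;
	uint32_t present, word;
	size_t off;
	int bit;

	(void)user;
	if (h->caplen < 8)
		return;
	memcpy(&it_len, pkt + 2, sizeof(it_len));
	memcpy(&present, pkt + 4, sizeof(present));
	it_len = le16toh(it_len);		/* radiotap is little-endian */
	present = le32toh(present);

	/* skip any extended it_present words (bit 31 means another follows) */
	off = 8;
	word = present;
	while ((word & 0x80000000) && off + 4 <= h->caplen) {
		memcpy(&word, pkt + off, sizeof(word));
		word = le32toh(word);
		off += 4;
	}

	/* walk the fields in bit order until we reach DBM_ANTSIGNAL (bit 5) */
	for (bit = 0; bit <= 5; bit++) {
		if ((present & (1U << bit)) == 0)
			continue;
		/* pad to the field's natural alignment (from header start) */
		off = (off + rt_field[bit].align - 1) & ~(rt_field[bit].align - 1);
		if (off + rt_field[bit].size > h->caplen || off >= it_len)
			return;
		if (bit == 5) {			/* signed dBm value */
			printf("antsignal: %d dBm\n", (int)(int8_t)pkt[off]);
			return;
		}
		off += rt_field[bit].size;
	}
}

int
main(int argc, char **argv)
{
	char errbuf[PCAP_ERRBUF_SIZE];
	pcap_t *p;

	if (argc != 2) {
		fprintf(stderr, "usage: %s interface\n", argv[0]);
		return (1);
	}
	if ((p = pcap_open_live(argv[1], 256, 1, 100, errbuf)) == NULL) {
		fprintf(stderr, "pcap_open_live: %s\n", errbuf);
		return (1);
	}
	if (pcap_datalink(p) != DLT_IEEE802_11_RADIO) {
		fprintf(stderr, "%s: not a radiotap capture\n", argv[1]);
		return (1);
	}
	pcap_loop(p, -1, got_packet, NULL);
	pcap_close(p);
	return (0);
}

If that field walk is right, it should print whatever the ath driver stuffed
into DBM_ANTSIGNAL for each frame, which is exactly the number I'd like to
understand.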

As a side question, does anyone have an easier way to reliably measure
per-packet signal strength?  The area has a decent amount of traffic and
I have to be able to analyze the packets themselves, so a plain hardware
solution will not do.  Thanks,

Sam


