The goal of this project was to retrofit an old car with a direct TPMS system reasonably cheaply, but using something more dependable than the Aliexpress-special, bottom-of-the-barrel $20 kits like this one:
Due to the extra weight they are probably a no-go with rubber valve stems, and with their really tiny batteries they are probably more hassle than they are worth.
I wanted the usual kind of sensors with larger batteries, mounted inside the tyre. One of the OEM sensors that can be bought cheaply (around $5) on eBay-like sites is the Opel 13581561 (aka EG53MA4):
They are manufactured by Schrader, use OOK modulation, and protocol decoding is already supported by the rtl_433 project.
It looked like a solution that was both cheap and reliable, so I ordered one for testing. However, it quickly turned out that the one that arrived didn't measure pressure at all!
I think these sensors sold for $5 are most likely:
The one I got arrived in genuine-looking packaging, and the reason it didn't work was that the measuring hole was covered by… glue? Additionally, the stamped manufacturing date was 2014, which didn't bode well for longevity. I fixed it by removing the unexpected glue with pliers, but this clearly wasn't the reliable source of sensors I was looking for.
I developed the receiver anyway, using this suspicious sensor as the reference, but for mounting inside the tyres I bought Autel MX-Sensors:
These are universal sensors that can act as drop-in replacements for a huge list of OEM sensors. They are developed in China, but look decently professional (certainly better than the Aliexpress no-names!). Unfortunately this raises the cost quite a bit, with each sensor costing around $30. That still beats buying proper OEM sensors, which go for around $100. They need to be configured with a proprietary programmer, but buying one wasn't necessary because I had already selected the protocol I wanted, and sellers offer to program the sensors when ordering (for around $2 per sensor).
The EG53MA4 protocol uses OOK/ASK modulation (i.e. AM for digital data). Data is encoded using Manchester code at a symbol rate of 8 ks/s (125 μs per symbol), yielding an effective data rate of 4 kb/s. The total frame size is 15 bytes (30 ms), consisting of:
0xFF
Each packet is transmitted 6 times at 140 ms intervals.
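For illustration, this is roughly how a single data byte maps onto Manchester symbols; a minimal sketch only, and the polarity shown (1 → high-then-low) is an assumption that may well be the opposite of what the sensor actually uses:

#include <stdint.h>

// Each data bit becomes two 125 us symbols, so one byte takes 2 ms on air.
static uint16_t manchester_encode(uint8_t data)
{
    uint16_t symbols = 0;
    for (uint8_t i = 0; i < 8; i++) {
        symbols <<= 2;
        symbols |= (data & 0x80) ? 0b10 : 0b01;  // 1 -> "10", 0 -> "01" (assumed polarity)
        data <<= 1;
    }
    return symbols;
}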
Using an RTL-SDR with the previously mentioned rtl_433 software would be possible, but it would be inexcusable overkill. What I had in mind was a simple dedicated OOK receiver paired with a small microcontroller, printing decoded data through a UART line (which would be connected to a Linux SBC with an HDMI display; possibly more on that in a future article).
I first went with nearly the cheapest receiver possible, the RFM210LCF (around $1.30), a HopeRF module containing the single-chip CMT2210LC receiver.
Functionally it is equivalent to the once-popular crappy superregenerative receivers built from discrete components and a single op-amp: you supply power, and you get a single line with demodulated OOK data coming out. These modules are a bit more advanced though: they have a crystal oscillator and a receiver chip doing the RF mixing magic. They have an AGC circuit that works continuously, which means the module always outputs something, even if it's just noise.
I wired the receiver up to one of the devices from the 0-series AVR family, the ATtiny804. These are a nice upgrade from the previous series, and probably the best feature is the single-wire programming interface. Unfortunately they lack support in upstream avr-libc, and avrdude also doesn't work with the Atmel-ICE programmer in UPDI mode. This sadly means it is necessary to use the Microchip-supplied toolchain.
The software is quite simple, maybe even a bit too simple.
The timer is configured in frequency-counter mode (on an input event it captures the counter value to a register and resets the counter), with the edge detector on the signal input pin set as the event source. Because the edge detector cannot be set to trigger on both edges, the interrupt handler flips the detector configuration each time. It then listens for 10 half-bits of valid length in the packet preamble, at which point it considers itself synchronized with the incoming signal. Detection of the start of the packet payload relies on the assumption that the first two most significant bits are always 0b01.
Decoding of the frame contents uses the property of Manchester code that there is a level transition in the middle of each bit. The interrupt handler checks whether the time measured since the previous edge corresponds to a half- or whole-bit period, and if it expects to currently be in the middle of a bit, it captures the bit value. After capturing a whole frame it verifies the checksum, hex-encodes the whole packet into ASCII and dumps it through the UART.
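To make the idea concrete, here is a condensed sketch of how such an interrupt handler could look. This is an illustration only: the timing windows assume a 5 MHz timer clock (the ×5 scaling in the snippet further below suggests 5 ticks per microsecond), store_bit() and resync() are assumed to exist elsewhere, and the bit polarity depends on the Manchester convention:

#include <avr/io.h>
#include <avr/interrupt.h>

// Placeholder timing windows in timer ticks: 125 us half-bit, 250 us full bit, +/-30%.
#define HALF_MIN  437
#define HALF_MAX  812
#define FULL_MIN  875
#define FULL_MAX  1625

static void store_bit(uint8_t bit);     // assumed: shift bit into the frame buffer
static void resync(void);               // assumed: drop frame, wait for preamble again

static volatile uint8_t level;          // current line level, tracked in software
static volatile uint8_t in_middle;      // did this edge land in the middle of a bit?

ISR(TCB0_INT_vect)
{
    uint16_t interval = TCB0.CCMP;      // ticks since the previous edge
    TCB0.INTFLAGS = TCB_CAPT_bm;        // acknowledge the capture interrupt
    TCB0.EVCTRL ^= TCB_EDGE_bm;         // flip the detector to the opposite edge
    level ^= 1;                         // every edge toggles the line level

    if (interval >= HALF_MIN && interval <= HALF_MAX)
        in_middle = !in_middle;         // half-bit gap: bit phase flips
    else if (interval >= FULL_MIN && interval <= FULL_MAX)
        ;                               // full-bit gap: bit phase stays the same
    else {
        resync();                       // out-of-range gap: glitch, start over
        return;
    }

    if (in_middle)
        store_bit(level);               // mid-bit transition carries the data bit
}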
Wiring is exceptionally simple (an extra output was added for debugging):
VDD [==========] GND
DBG PA4 [==========] PA3
PA5 [==========] PA2
PA6 [==========] PA1 INPUT
PA7 [==========] PA0 (prog)
PB3 [==========] PB0
UART PB2 [==========] PB1
One thing this implementation completely neglects is bit synchronization: any glitch during level transitions will ruin the whole packet. However, when I looked at the signal traces, that wasn't even the biggest problem. The intervals between edges in the demodulated data weren't very stable; additionally, the output seemed biased so that high levels were disproportionately longer. This made short and long symbols impossible to tell apart. Here's a capture while receiving a 4 kHz 50% duty cycle square wave:
With that in mind, the program got an asymmetric pulse length comparison kludge, which somewhat improved reception range:
// nudge ambiguous pulses one way or the other, depending on whether
// the measured pulse was high or low
if (edge)
bit_long = (interval > ((187 - 40) * 5)); // low pulse
else
bit_long = (interval > ((187 + 40) * 5)); // high pulse
The road test of this receiver was disappointing, with high packet loss. Frames were received only occasionally, especially from the rear wheels. The total number of packets received from each sensor over around 10 km of driving was as follows (tested with the receiver placed under the passenger seat, using this antenna):
Remember that the sensors transmit every payload 6 times, so the amount of useful data is even lower than that!
The thing is, the receiver I'm using is specified for “Data Rate: 1.0 - 5.0 kbps”, but I want to receive a symbol rate of 8 ks/s. The internally configured RSSI peak detector settings are likely too slow for this rate, causing these problems. Clearly, another receiver was needed.
It would be possible to keep the same decoding method and microcontroller software by using a similar module, but with a chip variant equipped with EEPROM for storing configuration, which would allow configuring the RSSI peak detector for the proper symbol rate. That seemed like a lost opportunity though, because these more advanced parts also usually come with an integrated packet engine with a FIFO and an external command interface. I wasn't entirely happy with the previous method because of the aforementioned lack of a bit synchronizer, so I decided to change the architecture.
I considered the RFM219S, containing the CMT2219A chip, but rejected it for two reasons. First, the only documented way of configuring these is exporting a register dump from a very Chinese-looking Windows application. Secondly, it has a very weird interface that's a chimera of SPI and I²C (the datasheet innocently calls it “4-wire SPI”, but two of the signals are named SDA and SCL… huh?).
Finally I decided on the RFM65W module, which apparently contains an SX1239 under the HopeRF disguise. It uses a conventional SPI interface.
In this architecture most of the work is done by the receiver chip. The microcontroller software only needs to write configuration values to the registers, wait for an interrupt, and then read the payload from the FIFO buffer. There are a few caveats though.
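For reference, a minimal sketch of what the register access helpers could look like; the names are mine, SPI0 is assumed to be already configured as master with SS on PA4 (matching the wiring shown further down), and the "address MSB set = write" convention is how SX12xx-family register access generally works:

#include <avr/io.h>

static uint8_t spi_transfer(uint8_t out)
{
    SPI0.DATA = out;
    while (!(SPI0.INTFLAGS & SPI_IF_bm))
        ;                            // wait until the byte has been shifted out
    return SPI0.DATA;                // read the byte clocked in at the same time
}

static void rfm_write_reg(uint8_t addr, uint8_t value)
{
    PORTA.OUTCLR = PIN4_bm;          // assert SS (PA4)
    spi_transfer(addr | 0x80);       // MSB set selects write access
    spi_transfer(value);
    PORTA.OUTSET = PIN4_bm;          // release SS
}

static uint8_t rfm_read_reg(uint8_t addr)
{
    PORTA.OUTCLR = PIN4_bm;
    spi_transfer(addr & 0x7F);       // MSB clear selects read access
    uint8_t value = spi_transfer(0x00);
    PORTA.OUTSET = PIN4_bm;
    return value;
}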
In these packet-oriented chips the AGC (Automatic Gain Control) works differently. The previous receiver behaved as if the AGC were enabled all the time, adjusting the gain continuously. This was convenient, because we could listen continuously and wait to be triggered by a received signal that looks like a valid preamble.
Contrary to this, in the RFM65W the AGC measurement is a one-shot affair, performed only during the short period after the receiver is taken out of the WAIT state by the signal RSSI exceeding the previously configured RssiThreshold. We must therefore leave the receiver in the WAIT state most of the time, ready to be triggered by the RSSI crossing the configured RssiThreshold, because only then will the packet be received with the proper gain set by the AGC.
This presents a bit of a conundrum: when the threshold is set too high, valid packets might get ignored; when set too low, the receiver will be regularly triggered by noise, which will cause the AGC to tune the gain to the noise floor, and the packet engine will wait forever for a preamble that never arrives. This is handled by configuring the Timeout interrupt, which fires when the receiver has left the WAIT mode but no packet has been received within 32 ms. The microcontroller then issues the RestartRx command, placing the receiver back in the WAIT mode.
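In code this recovery could look something like the sketch below, reusing the register helpers from the earlier snippet; the register address and bit position are quoted from memory and should be double-checked against the SX1239 datasheet:

#define REG_PACKET_CONFIG2  0x3D     // assumed address of RegPacketConfig2
#define RESTART_RX_bm       0x04     // assumed position of the RestartRx bit

static void rfm_restart_rx(void)
{
    uint8_t cfg = rfm_read_reg(REG_PACKET_CONFIG2);
    rfm_write_reg(REG_PACKET_CONFIG2, cfg | RESTART_RX_bm);  // back to the WAIT state
}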
To avoid hardcoding the RssiThreshold (I expect the noise floor to change with environmental conditions), the software performs a dynamic adjustment dance: when a spurious trigger is detected, the threshold is increased; when no spurious triggers are detected for a full second, the threshold is lowered. This should settle at an equilibrium point not much above the noise floor, yet high enough that the receiver dead time caused by spurious triggers isn't excessive.
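A rough sketch of that adjustment logic follows; the step sizes, limits and initial value are made up, and note that (if I read the datasheet right) the RssiThresh register encodes the level as −2 × dBm, so a smaller register value means a less sensitive trigger:

#define REG_RSSI_THRESH  0x29           // assumed address of RegRssiThresh

static uint8_t rssi_thresh = 0xE4;      // initial guess, roughly -114 dBm
static uint8_t quiet_seconds;

static void on_spurious_trigger(void)   // Timeout fired without a packet
{
    if (rssi_thresh > 0x80)
        rssi_thresh -= 2;               // raise the threshold (less sensitive)
    rfm_write_reg(REG_RSSI_THRESH, rssi_thresh);
    quiet_seconds = 0;
}

static void on_second_elapsed(void)     // called from a 1 s periodic timer
{
    if (quiet_seconds < 255)
        quiet_seconds++;
    if (quiet_seconds >= 1 && rssi_thresh < 0xE4) {
        rssi_thresh += 1;               // creep back toward a more sensitive setting
        rfm_write_reg(REG_RSSI_THRESH, rssi_thresh);
    }
}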
Side note: the datasheet recommends that:
in OOK mode, the AGC will give better results if performed while receiving a constant “1” sequence
Unfortunately the data frame preamble is a 0101… pattern and we can do nothing about it; we have to live with this.
Another difficulty is that to use packet matching by sync words, I need to write the expected bytes into the module registers. That's a problem, because the packet payload begins with an unknown (possibly flags) field directly after the preamble. While all the sensors mounted on the wheels send 0x4D93 there, relying on that might not be safe, as some frames captured during bench testing contained 0x4C96 in this field. If this really is a flags field, it would be pretty bad to lose frames when e.g. the battery discharges and some bits of this value change. As such, the only other element available for matching is the preamble itself; but it is a uniform 0101… pattern, and the receiver's delay between detecting the signal and properly receiving bits causes random bit shifts of the whole packet. I handle this by not using the receiver chip's built-in Manchester decoding, configuring the sync word to 0xAAAAAAAA (4 of the total 10 symbol-bytes in the preamble), setting the packet length to the remaining maximum of 26 symbol-bytes, and then trimming the unnecessary preamble and shifting the bits into their proper positions in the microcontroller software, again relying on the assumption that the first two most significant bits of the payload are always 0b01.
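A rough sketch of that realignment step, with illustrative names and logic that only approximates what the firmware does (it looks for the point where the alternating preamble symbols stop alternating, then shifts the buffer so the payload symbols start at bit 0):

#include <stdint.h>

static uint8_t get_bit(const uint8_t *buf, uint16_t pos)
{
    return (buf[pos >> 3] >> (7 - (pos & 7))) & 1;
}

// Return the first bit position where two consecutive symbol bits are
// equal, i.e. where the 1010... preamble pattern breaks; 0 means the
// pattern never broke and the frame should be rejected.
static uint16_t find_break(const uint8_t *buf, uint16_t nbits)
{
    for (uint16_t i = 1; i < nbits; i++)
        if (get_bit(buf, i) == get_bit(buf, i - 1))
            return i;
    return 0;
}

// Shift the whole buffer left by 0..7 bit positions (whole bytes are
// dropped separately before calling this).
static void shift_left(uint8_t *buf, uint8_t len, uint8_t bits)
{
    for (uint8_t i = 0; i < len; i++) {
        buf[i] <<= bits;
        if (i + 1 < len)
            buf[i] |= buf[i + 1] >> (8 - bits);
    }
}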
Another wire is configured for the PayloadReady interrupt, which tells the microcontroller to drain the FIFO, bit-shift the packet into the right place, verify the checksum, and dump the hex-encoded ASCII packet through the UART interface.
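The UART end of it is simple enough that a sketch fits in a few lines; this assumes USART0 has already been configured with TXD on PB2, and skips the bit shifting and checksum steps covered above:

#include <avr/io.h>

static void uart_putc(char c)
{
    while (!(USART0.STATUS & USART_DREIF_bm))
        ;                               // wait for the TX data register to free up
    USART0.TXDATAL = c;
}

static void dump_hex(const uint8_t *payload, uint8_t len)
{
    static const char hex[] = "0123456789ABCDEF";
    for (uint8_t i = 0; i < len; i++) {
        uart_putc(hex[payload[i] >> 4]);
        uart_putc(hex[payload[i] & 0x0F]);
    }
    uart_putc('\r');
    uart_putc('\n');
}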
Wiring on this version is a bit more involved:
VDD [==========] GND
SS PA4 [==========] PA3 SCK
PA5 [==========] PA2 MISO
DIO1 PA6 [==========] PA1 MOSI
DIO0 PA7 [==========] PA0 (prog)
RESET PB3 [==========] PB0
UART PB2 [==========] PB1
Luckily, the road test results in comparable conditions are much better:
Using the fact that every payload is transmitted 6 times, I can calculate the packet loss for each sensor:
There's still a significant difference on the rear wheels, but keep in mind these numbers are packet loss, not unique payload loss! It's not guaranteed, but after looking at the data it seems very likely that no unique data payload was lost.
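For reference, my guess at how such a loss figure is derived: every distinct payload seen implies 6 expected transmissions, so anything short of that counts as lost:

#include <stdint.h>

static float packet_loss(uint16_t packets_received, uint16_t distinct_payloads)
{
    return 1.0f - (float)packets_received / (6.0f * distinct_payloads);
}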
According to the rtl_433 source code, the pressure value is expressed in 0.025 bar per bit, and the temperature in 1 °F per bit. I doubt these values are correct though: the pressure readings would be too low, and the temperature too high. From my limited testing, and a comparison of various guessed somewhat-round factors, I believe the correct conversion for pressure is 0.4 psi per bit. I'm not yet sure about the temperature.
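For completeness, a conversion helper based on that guessed factor (using 1 psi = 6.89476 kPa); the factor itself remains unverified:

#include <stdint.h>

// raw * 0.4 psi ~= raw * 2.758 kPa; result returned in tenths of kPa
static uint16_t pressure_kpa_x10(uint8_t raw)
{
    return (uint16_t)((raw * 2758UL + 50) / 100);
}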