Don't care.
30 degrees to me means wearing a jacket, not stripping down to your underwear.
I let Google do the conversion if I REALLY need to.
Bloody ignorant 'murricans, you are. And blue!
Anyway, I don't remember if I mentioned it before, but I've got a US-built device (an audio matrix) with a web server. The server presents the internal temperature in °F on the wire and converts it to °C with JavaScript on the client. I screen-scrape the value with curl and convert it with some awk:
tempmtx=$(curl -s http://xxxx.xxxxxxxxx.xx/nortxe_status.html | \
awk -F\' '
  # the status line starts with "xPower"; $2 is the quoted payload,
  # and its sixth space-separated token is the temperature in °F
  /^xPower/ {
    split($2, values, " ")
    printf "%.2f", (5/9)*(values[6]-32)   # °F to °C
  }')
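The value then goes into MariaDB; here's a minimal sketch of that step. The database name (envi), the readings table, and its temp/id columns are taken from the query below, but the exact invocation and the meaning of id = 4006 (I'm treating it as a sensor id) are my assumptions:

# hypothetical storage step; credentials/host handling omitted
mysql envi -e "INSERT INTO readings (id, temp) VALUES (4006, ${tempmtx});"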
I was initially suspicious of my code, since:
MariaDB [envi]> SELECT temp FROM readings WHERE temp != 0.00 AND id = 4006 LIMIT 10 ;
+-------+
| temp |
+-------+
| 41.00 |
| 41.00 |
| 41.00 |
| 41.00 |
| 41.00 |
| 41.00 |
| 41.00 |
| 41.00 |
| 41.00 |
| 41.00 |
+-------+
10 rows in set (0.00 sec)
...but as it turns out, the people at Extron must be reading the internal temperature in °C off the sensor on the board, as an int, converting it to °F (with, IIRC, 5 digits of precision) for HTTP transport, and JavaScripting it back to °C in the browser for display. I see no other sensible explanation for every value being exactly a whole number of °C...
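A quick sanity check of that round trip (my own illustration, 41 °C picked to match the readings above): an integer °C value survives °C → °F → °C exactly, even with the °F rounded to one decimal on the wire.

awk 'BEGIN { c = 41; f = c*9/5 + 32; printf "%.1f °F -> %.2f °C\n", f, (5/9)*(f-32) }'
# prints: 105.8 °F -> 41.00 °C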
Edit: This was my 666th post, slightly trailing Saskia.