Right, video streaming time. I wanted to have my E4 set up in another room and still be able to check both the measurements and the video, preferably without having to walk over there all the time. I already had telnet + scripts to take readings and snapshots and such, but no video until now. (Well, not quite true: I ran mplayer over an X11 tunnel, but that was rather lame.)
So now I have a streaming server that can be connected to over the network. If you feel like it, you could even make it remotely accessible, say because you want to check the thermal state of your beer before going home.
All you need is one piece of software: ffmpeg. Set up a TCP server with ffserver (working example config attached, rename to ffserver-flir.conf). Once that is running, send your E4 camera stream to ffserver using ffmpeg.
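In case the attached config ever goes missing, here's a rough sketch of what an ffserver-flir.conf could look like. The values are illustrative assumptions, not the attached config: port 8090 and the 512 kbps ASF stream match the commands here, the frame size is a guess, and older ffserver builds use Port/BindAddress instead of HTTPPort/HTTPBindAddress.

```
# Sketch of ffserver-flir.conf -- illustrative values, not the attached config
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 10
MaxBandwidth 2000

<Feed feed1.ffm>
File /tmp/ffserver_feed1.ffm
FileMaxSize 2048K
ACL allow 127.0.0.1
</Feed>

<Stream flir.asf>
Feed feed1.ffm
Format asf
VideoBitRate 512
VideoFrameRate 25
# Frame size is an assumption; match it to what your camera actually outputs
VideoSize 320x240
NoAudio
</Stream>

# Server status page
<Stream stat.html>
Format status
</Stream>
```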
# start streaming server
ffserver -f ffserver-flir.conf
# send flir E4 stream to ffserver
ffmpeg -r 25 -f video4linux2 -i /dev/video0 -vcodec copy http://localhost:8090/feed1.ffm
That's it. Now you can connect to the stream using your favorite media player or browser.
mplayer -nocache http://yourserver:8090/flir.asf
I tested it with three separate machines all displaying the stream simultaneously, and no problemo.
There were some sneaky gotchas to get it working, but those are all neatly embedded in the config + command line. Re-order certain options like "-r" at your own peril.
Forgot to mention... You can also have it start ffmpeg automatically, so all you need to do is start ffserver. I'd advise starting things up separately the first time though, just so you can see all the debug output in case something doesn't go as planned. At any rate, to automatically start ffmpeg, just add one line (the Launch directive) to the Feed section, like so:
<Feed feed1.ffm>
File /tmp/ffserver_feed1.ffm
FileMaxSize 2048K
ACL allow 127.0.0.1
Launch ffmpeg -r 25 -f video4linux2 -i /dev/video0 -vcodec copy
</Feed>
And another thing I didn't mention is that this uses UVC, so you will need kernel support for that, but that would seem rather obvious. Anyway, on most popular distros UVC should work out of the box. And one more thing ... this sends the RAW video stream directly to ffserver. That way ffmpeg does NOT do any encoding, nor does ffserver have to do any transcoding. ffserver takes the raw stream and directly encodes it (to a 512 kbps ASF stream in this case). If you want, you can have ffmpeg do some initial encoding to get some compression, send that to ffserver, and have it transcode before streaming. But the bitrate for the raw stream is pretty low, so IMO the current setup makes more sense.
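If you're not sure whether UVC is working on your box, a quick sanity check before starting anything (a sketch; it assumes the camera shows up as /dev/video0, same as in the ffmpeg command above):

```shell
#!/bin/sh
# Sanity check: is the uvcvideo module loaded, and is there a V4L2 node?
# /dev/video0 matches the ffmpeg command above; yours may differ if you
# have other capture devices plugged in.
check_uvc() {
    if lsmod | grep -q '^uvcvideo'; then
        echo "uvcvideo module is loaded"
    else
        echo "uvcvideo not loaded; try: sudo modprobe uvcvideo"
    fi
    if [ -e /dev/video0 ]; then
        echo "camera node found at /dev/video0"
    else
        echo "no /dev/video0; check dmesg after plugging in the camera"
    fi
}
check_uvc
```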
Doh! Also forgot to mention this ... you can check the server status by browsing to
http://yourserver:8090/stat.html