Archive for category Ubuntu

A dual-camera server for Zoneminder

A while ago I managed to acquire a used Asus WL500G Premium wireless router for $50. It’s a rather useful device, as it is supported by OpenWRT and has two USB 2.0 ports. That’s quite unusual for a wireless router, and it was a lucky find. My idea was to use it to run two USB webcams as IP cameras, connected over the household LAN to a machine I have running the very good Zoneminder security software. The aim was to get better resolution than the conventional analog CCD camera + capture card system that I was using, which tops out at about 640 x 480.

I’m going to describe the process I went through to get this working, in the hope that it might be useful to anyone else trying to assemble an OpenWRT-based multi-camera server. It’s a pity that relatively few devices are available that will run OpenWRT and have multiple USB 2.0 ports, but they do exist and I hope to show they are worth pursuing.

So, I bought a Logitech S5500 webcam, which was capable of 960 x 720, connected it to the router, installed OpenWRT and mjpg-streamer, and… got very mediocre results. The video stream kept breaking up, triggering Zoneminder’s alarms, and the image couldn’t be expanded much beyond 512 x 384 before the load on the Zoneminder machine’s CPU went over 1.0, indicating Imminent Problems. I assumed that the issue was the Zoneminder machine being underpowered; it was a 1.2GHz Pentium III with 256Mbyte of RAM.

So when I eventually replaced it with a nice new Pentium dual-core E6500 with 1Gbyte, I was surprised to find that the problems remained. Some checking of the Zoneminder logs soon turned up some evidence: repeated complaints of “Invalid JPEG file structure: two SOI markers”. So clearly either the camera or mjpg-streamer was generating bad JPEGs…

There is a moral here: always Google your error messages. I omitted to do this, or wasn’t thorough enough.

So I proceeded to build the latest stable version of OpenWRT Backfire for the WL500G. OpenWRT’s build process has been steadily improving; here is all that was needed. I did this build on an Atom-based netbook running Ubuntu Lucid:

sudo apt-get update
sudo apt-get install subversion build-essential libncurses5-dev zlib1g-dev gawk flex git-core quilt
mkdir openwrt
cd openwrt
svn checkout svn://

When svn completes its downloads you can edit backfire/feeds.conf.default and save some time and bandwidth by taking out feeds you won’t use. For this build I commented out the feeds for LuCI and the Xwrt WebUI.
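For reference, feeds.conf.default is just a list of feed sources, one per line, and commenting a line out disables that feed. The file ends up looking something like this (feed names from the Backfire era; the commented-out URLs are trimmed here):

```
src-svn packages svn://svn.openwrt.org/openwrt/packages
#src-svn luci ...     # commented out: not using the LuCI WebUI
#src-svn xwrt ...     # likewise the Xwrt WebUI
```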

cd backfire
./scripts/feeds update -a
./scripts/feeds install -a
make defconfig
make menuconfig

This will present a menu interface, in which I selected the following items (pressing space twice to set to ‘*’, not ‘M’). I could have done all the customisation of OpenWRT after building with the default settings, but it’s easy to do here. I didn’t want to use the wireless connectivity, and I did want mjpg-streamer and the ability to write to external USB drives (in case I wanted to configure the router as a time-lapse camera system), so:

Target System: (Broadcom BCM947xx/953xx)
Target Profile: (ASUS WL-500g Premium v1 (Atheros WiFi))
Base System:
     select block-mount
     deselect dnsmasq, firewall
     deselect iptables,ppp,wpad-mini
Kernel modules:
     select USB support: kmod-usb-core, kmod-usb-storage, kmod-usb2, kmod-usb-video, kmod-uhci
     select Video Support: kmod-video-core, kmod-video-uvc
     select Filesystems->kmod-fs-vfat, kmod-nls-xxxx
     select mjpg-streamer

make V=99 2>&1 | tee build.log | grep -i error

After this build (which may take some time, as it needs to download quite a lot of source), the firmware image can be found in backfire/bin/brcm47xx/openwrt-brcm47xx-squashfs.trx. The fancy make invocation ensures that if things go wrong the console should show errors and the build.log file has the complete log of the build.

Installing the image, given that OpenWRT was already running on the router, was easy. First, I set up an ftp server on the build machine with anonymous access enabled:

sudo useradd -d /home/ftp/ftp -s /bin/false ftp
sudo mkdir -p /home/ftp/ftp
sudo apt-get install vsftpd
sudo vi /etc/vsftpd.conf

I edited the configuration to set anonymous_enable=YES, then restarted the server to get it to reread the configuration file.
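The relevant excerpt of /etc/vsftpd.conf ends up looking like this (a sketch; anonymous_enable is the only change that was strictly needed here, the other lines are standard vsftpd options):

```
# /etc/vsftpd.conf (excerpt)
listen=YES
anonymous_enable=YES   # allow anonymous logins, so the router can fetch the image
local_enable=NO        # no local accounts needed for this job
```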

sudo service vsftpd restart

and copied the image into the server directory so the router could find it:

cp bin/brcm47xx/openwrt-brcm47xx-squashfs.trx /home/ftp/ftp

From here it’s a matter of logging in to OpenWRT on the router, and telling it to load the image over ftp and update to it:

cd /tmp
wget ftp://&lt;build machine IP&gt;/openwrt-brcm47xx-squashfs.trx
sysupgrade /tmp/openwrt-brcm47xx-squashfs.trx

The &lt;build machine IP&gt; happened to be the address the development machine had been given by my DHCP server; your mileage may vary.

OpenWRT Backfire doesn’t actually have the latest version of mjpg-streamer, but it swiftly became apparent that updating it hadn’t solved the problem. So my next step was to add some code to mjpg-streamer to filter out bad frames, on the assumption that these were coming from the webcam. The JPEG format is well documented (the Wikipedia article is a good source), and it isn’t difficult to parse the basic structure without diving into the compressed data. Two SOI markers should be easy to detect.
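The idea can be sketched as a quick shell check (this is just the approach, not the actual filter code I added to mjpg-streamer; note that an embedded EXIF thumbnail can legitimately contain a second SOI, so more than one is suspicious rather than definitely broken):

```shell
#!/bin/sh
# Count byte-aligned SOI (0xffd8) markers in a file.
# A well-formed JPEG carries exactly one, right at the start.
soi_count() {
    od -A n -v -t x1 "$1" | tr -d ' \n' | awk '{
        n = 0
        for (i = 1; i <= length($0) - 3; i += 2)   # two hex digits per byte
            if (substr($0, i, 4) == "ffd8") n++
        print n
    }'
}

# Usage: soi_count frame.jpg   (prints the number of SOI markers found)
```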

To cut a long story short, my filter code, once running on the router, didn’t find any such thing.

At this point I did what I should have done much earlier, which was to Google “zoneminder” and “Invalid JPEG file structure: two SOI markers” a bit more thoroughly. And there in the Zoneminder FAQ is the answer:

What causes “Invalid JPEG file structure: two SOI markers” from zmc (1.24.x)

Some settings that used to be global only are now per camera. On the Monitor Source tab, if you are using Remote Protocol “HTTP” and Remote Method “Simple”, try changing Remote Method to “Regexp”.

This was indeed the fix. It had nothing to do with the camera or mjpg-streamer; it was an internal problem in Zoneminder. I could now use the camera, but after only a day of testing a new problem became apparent. Some webcams, and the S5500 is one of them, have trouble with darkness. Here is a sample frame from the S5500 at night.

Night, badly rendered.

The little 8×8 blocks in this image are a giveaway that this really is a problem with the JPEG encoding. This time it was easy to prove that it was coming from the webcam, whose internal processor evidently can’t encode a uniformly black frame. Unfortunately, it doesn’t produce a consistent image like the one shown above, but a kaleidoscopic variety of them. This triggers Zoneminder’s motion detection, and you wind up with huge alarm events that last for hours and eat up your storage.

Fortunately, this problem is common enough that mjpg-streamer has a solution. The bad frames are all relatively small, there being no detail to encode, so a simple size test can detect them. The minimum size parameter (-m) allows us to throw away frames that are smaller than a certain size, and a little testing quickly revealed that at 960 x 720, any frame under 21000 bytes could be assumed to be bad. So:

/usr/bin/mjpg_streamer -i " -d /dev/video0 -r 960x720 -m 21000" -o " -p 80"

should be the solution? Well, nearly. The trouble is that what mjpg-streamer does with the bad frames is skip over them, not returning a result until it gets a good frame. That is fine for occasional bad frames, but when the camera produces a continuous stream of them – at night – mjpg-streamer may not return a result for hours. Zoneminder doesn’t like this either, and is prone to disable the camera altogether. This problem is apparently addressed in Zoneminder 1.25, but I’m running 1.24.2, so another solution was needed. The easiest thing to do was to edit input_uvc.c in the mjpg-streamer source so that instead of skipping the frame, mjpg-streamer returned the last good frame that it had. This was a matter of changing the lines

        if(pcontext->videoIn->buf.bytesused < minimum_size) {
            DBG("dropping too small frame, assuming it as broken\n");
            continue;
        }

to

        if(pcontext->videoIn->buf.bytesused < minimum_size) {
            DBG("dropping too small frame, assuming it as broken\n");
            /* Provide the previous frame, as this condition might last some time... */
            /* i.e. instead of continue-ing round the capture loop, fall through
               and hand the streaming side the last good frame it was given */
        }
This change isn’t really suitable for all uses as it means that the camera server may return an image of what was visible several hours ago when the scene was adequately lit, rather than current darkness. A more elegant solution would be to return a synthetic JPEG that was just a uniform field of black, but this was a quick workaround for my purposes.

Changing the code in the OpenWRT source and getting it to recompile, without the build system ignoring or overwriting the changes, was a bit tricky. The combination of OpenWRT’s interesting build system and mjpg-streamer’s makefile means that just changing input_uvc.c deep down in the build_dir directory and doing a make won’t work. After some trial and error I wrote a script to do the rebuild:

# MJPGDIR is wherever the build system unpacked the package source,
# something like build_dir/target-*/mjpg-streamer-*
touch $MJPGDIR/mjpg_streamer.c
rm $MJPGDIR/ipkg-brcm47xx/mjpg-streamer/usr/lib/*.so
rm $MJPGDIR/*.so
make V=99 >build.log

Having fixed this, I bought a second webcam (a Logitech C905) as it was now clear the whole idea would work as intended. The differences between the two webcams when looking at the same scene were surprising. Here is the S5500:

The C905 wins on detail, but also has quite different colour rendering.

The final step was to get both webcams to start up when OpenWRT booted.

OpenWRT’s initialisation is a little complicated, and is controlled by the files in /etc/rc.d, /etc/init.d, and /etc/config. The mjpg-streamer package includes appropriate entries in each of these directories to automatically start up a single instance, but there is nothing to prevent us from running two instances, provided that we tell them to serve on different TCP ports. The two webcams appear as /dev/video0 and /dev/video1. So we edit /etc/config/mjpg-streamer and add a couple more config options for device2 and port2:

config mjpg-streamer core
        option device           "/dev/video0"
        option device2          "/dev/video1"
        option resolution       "960x720"
        option minimumsize      "21000"
        option fps              "10"
        option port             "80"
        option port2            "8080"
        option enabled          "true"

Then we add an entry in init.d; the easiest way is to duplicate the existing script:

cd /etc/init.d
cp mjpg-streamer mjpg-streamer2

and edit the second script to use the device2 and port2 options, plus its own PID file name. The result looks like this (the START level and the $SSD, $PROG and $PIDF definitions follow the stock mjpg-streamer script, with the PID file renamed):

#!/bin/sh /etc/rc.common
# Copyright (C) 2009
START=50

SSD=start-stop-daemon
PROG=/usr/bin/mjpg_streamer
PIDF=/var/run/mjpg-streamer2.pid    # this instance's own PID file

start() {
        config_load mjpg-streamer
        config_get device core device2
        config_get resolution core resolution
        config_get fps core fps
        config_get port core port2
        config_get_bool enabled core enabled
        config_get minimumsize core minimumsize
        [ $enabled -gt 0 -a -c $device ] && sleep 3 && $SSD -S -m -p $PIDF -q -x $PROG\
         -- --input " --device $device --fps $fps --resolution $resolution"\
         --output " --port $port" &
}

stop() {
        $SSD -K -p $PIDF
}
Finally we need to add a link in the rc.d directory to the new init.d script:

cd /etc
ln -s ../init.d/mjpg-streamer2 rc.d/S50mjpg-streamer2

This will start up the WL500 with both cameras if both are present, and serve them on port 80 and port 8080. The images and movie streams can therefore be found at:

http://<IP address>/?action=snapshot

http://<IP address>:8080/?action=snapshot


http://<IP address>/?action=stream

http://<IP address>:8080/?action=stream

respectively. Entering the snapshot URLs above into Zoneminder has produced a working system. The only remaining issue was setting up Zoneminder to minimise the CPU usage, which is an art in itself and better covered in the Zoneminder forums…
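For completeness, these URLs map onto Zoneminder’s Monitor Source tab roughly as follows (field names from Zoneminder 1.24’s Remote source type; Regexp is the Remote Method fix from the FAQ quoted earlier):

```
Remote Protocol:   HTTP
Remote Method:     Regexp
Remote Host Name:  <IP address>
Remote Host Port:  80                 (8080 for the second camera)
Remote Host Path:  /?action=snapshot
```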



Trouble at t’phone

I have been revising my file server/firewall/ZoneMinder system that I described in some of my first blog posts, and it has all gone pretty smoothly. I bought a new motherboard and CPU, mostly to get gigabit Ethernet support, but also in the hope that ZoneMinder would work better with my IP camera (another story, for a later post). The new board is an Asus G41-based M-ATX part with a 2.9GHz Pentium Dual-Core E6500 on it. The bump in CPU power from a 1.2GHz Pentium 3 is substantial. I was able to reuse the old HP box and power supply, and power consumption has only gone up from 45W to ~53W, which is pretty good. It’s running Ubuntu Lucid Lynx (10.04), and installation and setup went very smoothly apart from some confusion with shorewall, whose default two-interface setup assumes that the local net is on eth1 and the Internet is on eth0. I have it the other way around, so all the config files needed changing.

This post, however, is about a new idea I wanted to try, which was to install my old Telecom CDMA phone in the server cupboard and use it to send and receive text messages to and from the server.

On the face of it this didn’t look difficult. There is a Linux/Windows/MacOS package called gnokii that does practically everything you could possibly want with a phone, provided that the phone supports Bluetooth or has a data cable. My Nokia 6165i has Bluetooth, so the first step was to buy a cheap Bluetooth adapter (Dick Smith, $20), plug it in, and fire up Bluetooth support:

sudo apt-get install bluez

Once I had that, I turned on the phone, told it to be discoverable on Bluetooth (down in Settings/Connectivity/Bluetooth/Bluetooth Settings) and told Bluetooth to look for it:

hcitool scan

Scanning …
00:12:D1:3F:00:8F    Michael H 6165i

It found it! Now to get gnokii:

sudo apt-get install gnokii

The version of gnokii that comes with Ubuntu 10.04 is 0.6.28, which has its preferences in ~/.gnokiirc or /etc/gnokiirc. Later versions change this, but for the moment it was enough to edit the heavily commented /etc/gnokiirc and use the Bluetooth ID given by hcitool so that the following lines were active:

port = 00:12:d1:3f:00:8f
model = series40
connection = bluetooth

debug = on

Debug is useful if you run into problems, as I did…

gnokii --identify

should then talk to the phone, whereupon we run into our first problem. The phone needs to pair to the Bluetooth transceiver on the PC, and if we were running desktop Ubuntu a nice dialog would come up on the desktop as well as the phone, we enter the same PIN into each dialog and all would be well. But this is running on a file server and I can’t do that. Some Googling eventually found a solution, which is to use one of the bluez examples to get the PIN code handshake done:

sudo /usr/share/doc/bluez/examples/simple-agent hci0 00:12:d1:3f:00:8f

This prompts for the PIN code; you enter it, the phone prompts, you enter the same number, and all is well.

Now sending a text message should be as simple as:

echo "This is a text message from the file server" | gnokii --sendsms number -r

But life isn’t that simple. It seems to work, but the text messages simply don’t go. It took a good deal of debugging, updating, and fiddling about before I discovered the simple truth:

gnokii doesn’t support CDMA phones. CDMA is an older technology, and the commands to send text messages are there, but subtly different. The only thing that might have worked was gnokii’s AT mode, which I could get at by changing model = series40 to model = AT in the preferences. But the answer to that is short and final:

PDU mode is not supported by the phone. This mobile supports only TEXT mode
while gnokii supports only PDU mode.
SMS Send failed (Unknown error – well better than nothing!!)
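The split between the two modes comes down to the standard GSM AT command for selecting the message format, AT+CMGF (this is general AT-command behaviour from the GSM specs, not something specific to this phone):

```
AT+CMGF=0    # PDU mode: binary-packed messages; the only mode gnokii speaks
AT+CMGF=1    # text mode: plain-text messages; the only mode this phone accepts
```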

So there we are; it isn’t going to work. It wasn’t going to work for long anyway, as the old CDMA network is going to be turned off next year, so the next step is to buy a GSM Bluetooth phone and try again (ones with cracked but otherwise functional screens can be had very cheaply on Trademe, I see…). I hope to extend this post with better news at some point in the future.


My Netbook runneth over; dual booting Ubuntu on the Eee 1000HA

I have recently acquired one of the newly popular netbook machines, an Asus Eee 1000HA. This is one of a startling variety of Asus Eee models, which Asus seems to turn out as quickly as it can come up with new arrangements of netbook components. This particular version has a 10″ screen, 1G of RAM and a 160G hard drive. It comes with Windows XP.

Windows XP is all very well, but I wanted to try running Linux on the Eee as well. 160G of hard drive is more than enough to run two different operating systems, so why not? There are several Linux distributions specifically designed for the Eee, so this should be easy, you might think. As it turns out, it’s not completely straightforward. Hence this post.


The first hurdle you encounter is that the Eee doesn’t have an optical drive, so there is no way to burn a CD and boot from it, which is the usual way to install Linux. The 1000HA does, however, have USB ports and an SD card slot, and it turns out it is capable of booting from either a USB flash drive or an SD card. I happened to have a 4Gbyte SDHC card, acquired for backup, already in the SD slot. I decided to install the aptly-named Eeebuntu 2.0 Standard distribution via this card.

So, under Windows, it was simply a matter of downloading the eeebuntu-2.0-standard.iso file (880Mbyte). The .iso file is an archive containing a bootable disk image, so you can’t just copy it onto the SD card and expect it to boot. An application called UNetbootin is needed to unpack it onto the SD card correctly; I downloaded unetbootin-windows-312.exe and ran it. UNetbootin presents a single window where you can select the Diskimage format, and browse to select the Eeebuntu .iso file. The SD card appears as a USB drive at letter I:\ (UNetbootin is clever enough to preselect this for you). Clicking on OK sets UNetbootin copying the .iso across to the SD card, which takes several minutes and may appear to hang at the extra-large filesystem.squashfs file.

Once this process completes, click on the ‘Reboot Now’ option that UNetbootin conveniently presents and, as soon as Windows has shut down, hold down the ESC key. As the Eee starts up it should bring up the boot device menu. The SD card is described rather misleadingly as “USB:Single Flash Reader”. Select this and the UNetbootin menu will follow. Select ‘Default’, or wait for it to select itself for you, and the Eeebuntu splash screen should follow. After a moment or so the Eeebuntu desktop comes up with the ‘Install’ icon at the top left. Opening this starts the installer, which asks some straightforward questions up to the point where it asks how to partition the disk.

Here we need to be a bit careful, because the disk layout is not quite what the installer expects. Asus for some reason divides the disk into three usable partitions; the first 80Gbyte partition contains Windows XP and all its associated files. The second is also formatted for Windows, but is empty and appears under XP as the D: drive. The third is a recovery partition for booting from when Windows borks itself as it sometimes does. The installer will default to resizing the first Windows partition to 17.5Gbyte and installing Eeebuntu in the remaining space. This is workable, but not very even-handed. I prefer to install Eeebuntu over the unused second Windows partition. To do this — assuming that your D: drive under XP is empty —  select the Manual option and click Forward.

In the next dialog, select the second partition (/dev/sda2, about 65711Mbyte) and click on ‘Delete partition’. This is necessary because the partition is formatted for NTFS, which Ubuntu can’t install to. You will also need a swap partition, and we must make space for that. Select the resulting free space and click ‘New partition’. Make a logical swap area partition that is larger than your RAM size (1024Mbyte in my case); I used a rather arbitrary 3000Mbyte. Then make a logical ext3 partition with a mount point of ‘/’ covering the remaining 62709Mbyte. These should be /dev/sda5 and /dev/sda6. Select the format checkbox for the ext3 partition. Click Forward only after checking the above carefully, as a mistake here could ruin your Windows install.
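For clarity, this is the layout I was aiming for, sketched out (sizes taken from the dialogs above; the recovery partition stays where Asus put it):

```
/dev/sda1   ntfs    ~80 Gbyte     Windows XP (C:)
/dev/sda5   swap    3000 Mbyte    logical; larger than the 1024 Mbyte of RAM
/dev/sda6   ext3    62709 Mbyte   logical; mounted at /
```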

After answering a few more questions the install should run smoothly. On restarting (you can ignore the instruction to remove the nonexistent disc and close the nonexistent tray, and just hit Enter) the Eee should display the Grub boot menu, which allows you to select Ubuntu (or just wait and it will default to Ubuntu itself).

Assorted fixes

All is not completely plain sailing with Eeebuntu from the start, however.

Your first stop should be to plug into a wired LAN with Internet access and run the Update Manager. This will offer to do a bunch of updates (151 when I tried). For some reason it will complain that these can’t be authenticated when you click on Apply; you have to ignore this and apply anyway. Restart.

Under Applications/System Tools you will find an application called Eeebuntu Config. You should run it. It doesn’t have a setting for the Eee 1000HA (or it didn’t when I tried; you might get an updated version), so I selected the 1000H, clicked on ALL, and clicked Execute. This runs a bunch of scripts that customise Ubuntu better for the Eee.

If you start Firefox you may be startled by a license agreement for an add-on called DownThemAll, which you have to accept as it comes up again and again if you try to decline it. This was mistakenly added to Eeebuntu when it was built. You can remove it by accepting the license agreement, then clicking Disable and Restart Firefox in the Add-ons window that comes up when you do so.

The grub boot menu can be made friendlier (and you can pick what it defaults to, and how long it waits) by editing /boot/grub/menu.lst. This needs to be done with supervisor privileges; I started a terminal session and entered ‘sudo gedit /boot/grub/menu.lst’, which gives you a nice visual editor. The file is reasonably self-explanatory.
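The settings most worth touching sit at the top of menu.lst (legacy GRUB syntax; menu entries are numbered from 0 in the order they appear in the file):

```
default   0     # which menu entry boots if you do nothing
timeout   10    # seconds to wait before booting the default entry
```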

The mouse pointer when busy is for some reason a rather ugly monochrome wristwatch instead of the normal Ubuntu rotating pattern. I haven’t been able to figure out why so far.

Wireless performance is, irritatingly enough, not very good. Signal strengths are lower and performance is spottier than under Windows XP, by a considerable margin. You might wonder how such a thing is possible. It turns out to be because the open-source ath5k drivers for the onboard 802.11 wireless card don’t work very well – Atheros, who make the wireless chip, don’t distribute detailed information about it, or driver source code. So the open-source driver has been written in the dark, as it were, and it’s a wonder that it works at all. There is actually an end-user solution to this, which involves using a tool called ndiswrapper to run the XP drivers inside Linux. But getting that working is another story…