Mediatomb on Ubuntu 12.04

One of the things I use Ubuntu for is to run my home server, which serves my video collection to various players around the house. This is achieved with a DLNA server called Mediatomb that ships with Ubuntu.

Unfortunately, despite having all the functionality I need, and being remarkably stable, Mediatomb hasn’t been under active development for the last year or two, and when I upgraded to the latest version of Ubuntu Server, I discovered that a key feature of Mediatomb had been disabled: support for writing import filters in JavaScript.

This feature allows the video collection to be sorted and filtered by its characteristics into a series of virtual folders, which the media players can then browse to find whatever content is required. You could have folders of films sorted by leading actor, or director. Or folders of films by certification. Or date of release. The options are endless. It’s a great feature, and you only discover quite how indispensable it is when it’s suddenly removed.

The reason the feature is disabled is that Mediatomb depends on an out-of-support version of libjs (SpiderMonkey, the Mozilla JavaScript engine). The Ubuntu developers don’t have time to fix this, so until the Mediatomb developers do, the Ubuntu developers have applied a quick fix and disabled the JavaScript support so that the package can still be shipped.

This post shows how to re-enable that JavaScript support on Ubuntu 12.04 Server. It’s a bit of a dirty hack, but it will work until one of the following happens:

  1. The Mediatomb developers fix this
  2. Someone (such as Raik Bieniek or Saito Kiyoshi) packages & maintains a better fix for 12.04 in a PPA
  3. This effort to replace the JavaScript support with Python support completes (I couldn’t get it to compile)

The basic approach here is to rebuild the shipped package of Mediatomb with the JavaScript support re-enabled. This requires a version of JavaScript on the system that Mediatomb can use. The current version in the 12.04 repositories won’t work (the APIs have changed), so we need to install an older “back-level” version, which we can get from Debian, who have still been applying security fixes to it. The steps are below; the whole sequence is also collected into a single transcript after the list.

  1. cd; mkdir temp; cd temp
  2. sudo apt-get install build-essential, to install the tools you will need to build packages on Ubuntu
  3. sudo apt-get build-dep mediatomb, to install all the dependencies needed for mediatomb to compile
  4. sudo apt-get source mediatomb, to get the source code for mediatomb, and unpack it into a convenient subdirectory
  5. sudo vi mediatomb-0.12.1/debian/rules, and change the line that says “--disable-libjs” to “--enable-libjs”
  6. Add a new entry to the changelog file in the same directory, incrementing the version number from zero to one. This will help prevent your changes being overwritten.
  7. Get an old copy of SpiderMonkey from the Debian Squeeze distribution (on which Ubuntu is ultimately based). You need libmozjs2d and libmozjs-dev, in either the amd64 or i386 versions, depending on whether you are running in 64-bit or 32-bit mode. To determine which version you need, enter the command “dpkg --print-architecture” in a terminal. Then install the appropriate packages using sudo dpkg -i packagename
  8. In all likelihood you will get an error from one or both of those installs, complaining about dependencies. To resolve them and complete the installs, simply enter sudo apt-get install -f
  9. cd mediatomb-0.12.1 and then sudo ./configure. Lots of content will scroll past, but at the end there should be a summary; look for a line that says something like “libjs : yes”. If it is present, then you have enabled JavaScript support in the build and satisfied the dependencies. You can now install any additional dependencies and reconfigure the build further if you wish.
  10. Switch back to your source code with cd ~/temp/mediatomb-0.12.1
  11. Start the compilation with sudo fakeroot debian/rules binary. Lots of compilation messages should scroll past.
  12. When it stops, you should have three .deb files in ~/temp. Change back to that directory (cd ~/temp) and install them with sudo dpkg -i mediatomb*.deb
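For convenience, here is the whole sequence collected into a single terminal transcript. Treat it as a sketch rather than a script to paste blindly: the libmozjs filenames are placeholders for whatever versions you actually download from Debian Squeeze, and two of the steps are interactive edits.

    cd; mkdir temp; cd temp
    sudo apt-get install build-essential
    sudo apt-get build-dep mediatomb
    sudo apt-get source mediatomb
    sudo vi mediatomb-0.12.1/debian/rules       # change --disable-libjs to --enable-libjs
    sudo vi mediatomb-0.12.1/debian/changelog   # add an entry, incrementing the version
    sudo dpkg -i libmozjs2d_*.deb libmozjs-dev_*.deb   # the Squeeze packages you fetched
    sudo apt-get install -f                     # resolve any dependency complaints
    cd mediatomb-0.12.1
    sudo ./configure                            # check the summary reports "libjs : yes"
    sudo fakeroot debian/rules binary
    cd ~/temp
    sudo dpkg -i mediatomb*.deb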

Finally, switch to root (sudo su) and then issue the command echo packagename hold | dpkg --set-selections for each of mediatomb, mediatomb-common, mediatomb-daemon, libmozjs2d and libmozjs-dev. Then drop back to your own user by pressing Ctrl-D. This will prevent your customised packages from being overwritten as part of the normal update process (they will be “held”).
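If you prefer, the same holds can be applied in one go with a small loop (run as root, as above):

    for pkg in mediatomb mediatomb-common mediatomb-daemon libmozjs2d libmozjs-dev; do
        echo "$pkg hold" | dpkg --set-selections
    done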

You can now configure Mediatomb normally, including the use of custom import.js scripts, by altering /etc/mediatomb/config.xml as desired; the relevant fragment is shown below.
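As an illustration, the part of config.xml that switches on the JavaScript virtual layout looks something like the fragment below. The script paths are the defaults I’d expect the Ubuntu package to install (adjust them if you keep your customised import.js elsewhere); the key change is setting the virtual-layout type to “js” rather than “builtin”:

    <import hidden-files="no">
      <scripting script-charset="UTF-8">
        <common-script>/usr/share/mediatomb/js/common.js</common-script>
        <playlist-script>/usr/share/mediatomb/js/playlists.js</playlist-script>
        <virtual-layout type="js">
          <import-script>/usr/share/mediatomb/js/import.js</import-script>
        </virtual-layout>
      </scripting>
    </import>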

Update: Having just been through a reboot on my server, it seems that Mediatomb isn’t set up to autostart properly. To resolve this you need to run the command sudo update-rc.d mediatomb defaults, which will install the various rcN.d startup and shutdown links.
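A quick way to confirm that the links were created (runlevel 2 shown here; the other runlevels are similar — you should see an “S” start link for mediatomb):

    ls -l /etc/rc2.d/ | grep mediatomb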

Update 2: I’ve noticed that sometimes after a reboot Mediatomb still isn’t autostarted properly. It turns out that there is a message in /var/log/mediatomb.log referring to “The connection to the MySQL database has failed: mysql_error (2002)”. What this means is that if you are using MySQL rather than SQLite, there is a race condition where Upstart sometimes tries to bring up Mediatomb before the MySQL database is available. You can resolve this by editing /etc/init/mediatomb.conf, and changing:

start on (local-filesystems and net-device-up IFACE!=lo)

to

start on (started mysql and local-filesystems and net-device-up IFACE!=lo)

Upstart will then ensure that MySQL is running before attempting to start Mediatomb.

Printer statistics (ongoing)

Time has been passing. And I’ve studiously ignored the warnings from the printer, and kept on sending it jobs. Which it has kept on printing. Beautifully.

I’m now at a total of 885 impressions (348 mono and 537 colour) and despite complaints that the magenta cartridge is now also “low”, there seems to be no difference in the quality of the output. I’ve now got replacement black and magenta toner cartridges “waiting to go”, but see no reason to install them until the print quality starts to actually degrade.

I’m cynically starting to wonder if Lexmark have set the warning levels artificially early as a means of promoting toner sales. It will be interesting to see if the printer enforces replacement before the toner actually runs out, and how much toner is left in the cartridge at that point…

Printer statistics

My Lexmark laser printer started complaining about low toner in the black cartridge today. I’ve had the printer just over a year, but even so, I was surprised that it had got through the 2500 sides of text that the black cartridge is rated for so quickly.

So I checked the statistics. The 543DN has an inbuilt web server that provides all kinds of helpful information, including the current levels of my cartridges:

  • Black: low
  • Yellow: 80%
  • Magenta: 30%
  • Cyan: 50%

There is lots of information about the pages printed, including average job length, job type, etc. It turns out that I’ve printed a total of 519 jobs, of which 474 were 1 or 2 pages long. My longest job was 23 pages.

I’ve printed a total of 312 mono A4 sides, and 499 colour A4 sides, for a grand total of 811 sides overall.

And because my “2500 sheet high capacity black toner cartridge” is nearly empty after only 811 printed sides, this is where I can point out that the old adage about “lies, damned lies and statistics” is absolutely true: the definition of a printed side, as used by the printer manufacturers, has no standing in the real world whatsoever.

If I assume my cartridge statistics are correct, 811 real-world A4 impressions cost me all of a black cartridge, 70% of a magenta, 50% of a cyan, and 20% of a yellow. That’s a total of 2.4 cartridges at £60 each, or £144, which works out at about 18p an impression. That seems expensive, but given my preponderance of colour printing, perhaps it isn’t as bad as it first seems.
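For anyone who wants to check the arithmetic, it’s simple enough to do at the shell:

    # (black + magenta + cyan + yellow) cartridges used, at £60 each, over 811 sides
    echo "scale=2; (1 + 0.7 + 0.5 + 0.2) * 60 * 100 / 811" | bc
    # prints 17.75 (pence per side)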

Upgraded study music system

Like a lot of people, I sometimes work from home. In general it’s a day or two a week, and I’m lucky enough to have a small (2m x 2m) study dedicated to that. And since it’s the one room in the house that is genuinely “mine” and no-one else’s, it’s something of a refuge as well as a place of work.

My need for music while I work has meant that over the years I’ve moved from a pair of earphones plugged into the laptop, to a Joggler acting as a DIY Squeezebox Touch. This fed a pair of powered computer speakers from its headphone socket, which produced some basic noise, but wasn’t exactly high fidelity.

For some time now I’ve been watching the discussions around Gainclone and T-class amplifiers on the internet with interest, and this month I decided to spend some of my “toy fund” on improving my study music system. In the end, my choice of improvements was somewhat limited by the Joggler (which I wanted to keep), as its only analogue audio output is a low-quality headphone socket. That meant I needed to move into the digital domain, and feed my music over the USB output into an outboard DAC and amplifier.

I looked at various options (mostly involving second-hand DACs) but then I discovered this review of the Topping TP30, which is a combination T-class amplifier and USB DAC in a conveniently small package. Further investigation led me to this further (and more detailed) review.

At this point I was pretty much convinced, especially as I was able to get one delivered from a UK eBay supplier for only £63. My only concern was that as the output is only 10 watts RMS per channel into 8-ohm loads, the speakers need to be reasonably efficient, even in as small a room as mine.

Fortunately I had an old pair of speakers from a Sony micro hi-fi, which looked like they might be good candidates – and were helpfully only 6-ohm loads too.

So, what does it sound like?

Surprisingly good. It’s no match for my old Audiolab 8000A and KEF Reference speakers in the lounge, but it’s a massive step up from the old set of 2.1 powered speakers that I was using. It produces a very pleasant sound, tending towards the lean and analytical – which is the sound I prefer anyway. With a good recording it can be quite revealing, and with the right track it happily grabs the beat and rocks along nicely. Volume levels are (as Rolls-Royce would say) “adequate”; half volume is as much as I would ever want in a room this size. My only criticism would be that it doesn’t have the reserves of power to deliver sudden big slams of bass; but then I wasn’t expecting that it would.

What’s interesting to me from having done this is the minimal cost of creating a very impressive-sounding system: the TP30 was £63 delivered, speakers like mine are about £25 on eBay, and Jogglers are £50 on eBay. Admittedly I already had the NAS with the Squeezeserver software and music store … but many people have a spare computer these days, or you could equally well feed the amplifier and speakers directly from a laptop. It’s hard to imagine a better value-for-money setup for a student. It’s even relatively portable.

Yellow light of death

My PS3 is one of the original 60GB launch systems; as such it’s getting a bit long in the tooth. But it still gets used every day for streaming my movies, or playing the kids’ games. Or both.

Until Tuesday evening, when it suddenly beeped three times and turned itself off, with the power light flashing red. Trying to turn it back on again didn’t work, and neither did power cycling it. At that point I did some research, and discovered that this is known as the “Yellow Light Of Death”, or YLOD for short. It’s the way the PS3 indicates a catch-all hardware problem, and basically means the console is dead and needs to be sent back to Sony for repair.

Which would be OK, but since mine is well out of warranty we’re talking a significant bill to repair it, and in all likelihood I wouldn’t get back my rather rare launch system, with all the additional hardware that helps with backwards compatibility for PS2 titles. Not good.

It transpires that early PS3 systems like mine used 90nm Cell B/E processors, which were very power-hungry, hot-running parts, and Sony didn’t do a very good job of mounting the heatsinks in the early systems. This resulted in the systems running hot, eventually causing dry solder joints to form between the CPU and GPU and the motherboard, producing these YLOD hardware failures. The problems are much less common on the newer PS3 Slims, which use newer 45nm Cell B/E processors that generate much less heat (around 200W total system consumption, compared to nearly 400W in mine).

Fortunately some enterprising people have managed to find a way of fixing this by making their own hot-air reflow soldering station out of a standard hot-air paint stripper, and have been able to reflow-solder all the surface-mounted components around the CPU/GPU, fixing the dry solder joints and restoring their PS3s to life. Better yet, they’ve posted videos of how to do it, so others can repeat their success.

So I did.

Going slowly, it took me about 3 hours to strip the PS3 down to its bare motherboard, heat up the components so their soldered connections re-flowed and remade themselves, and then rebuild it afterwards. I must admit that I wasn’t expecting it to work – but then again I had nothing to lose beyond a spare evening and £5 of thermal compound to remount the heatsinks with.

But the result is that the PS3 is now working again and, in practice, probably better than it has for a long time; by fixing the poorly mounted heatsinks and using a better quality of thermal compound, the fan can now run much more slowly and still provide better cooling. So it’s much quieter while streaming movies. Which is perfect.

Except I rather fancy getting a second generation Apple TV and hacking XBMC onto it to produce a really smart media hub, and fixing the PS3 has just made justifying the expenditure on that a lot harder!

International Broadcast Conference

As part of my new role supporting the Media and Entertainment industry for IBM, I’ve just spent a week in Amsterdam at the International Broadcast Conference (IBC2010), which was a simply fascinating experience.

Of course, I was primarily there to work – in my case, mainly acting as a host for some of my customers who were attending the show. But I’m also still learning how the broadcast industry works, so this was a great opportunity to do a lot of self-education, see many of the basic broadcast technologies “up close”, and talk to the suppliers to understand how they are used. I was also able to network with some of IBM’s senior technical and executive team who were attending from around the world, and spend time learning in detail about the IBM technologies that they were at the show to demonstrate to our customers.

There was no doubt as to the theme of the show … everywhere you looked was 3D. From the cameras that shoot it, through the specialist electronics that “fake” it, to the specialist displays that show it, there was absolutely no escaping it. What was interesting (to me anyway) was how unconvincing I found it. The effects were very impressive, but somehow they didn’t seem to add very much to the overall experience. Worse, I found that after a relatively short time watching something in 3D, I started to feel slightly ill – something like a cross between motion sickness and a headache. I had the same result whether I was watching an active system (with the shuttered glasses) or a passive system (with the polarised glasses). After several days of watching these systems in action, I’m not convinced that the technology is really ready for the home: it doesn’t work as well for an off-centre viewer, it’s inconvenient (at best) for anyone who wears spectacles, and (from my informal polling of other attendees) a goodly proportion of people don’t actually find the effect very pleasant.

Add in the significant investment required in new equipment (a new TV, plus several extra sets of glasses) and it will be interesting to see if it really takes off as quickly as the industry would like.

On the other hand, some of the 4K-resolution equipment was simply stunning. I spent a few minutes standing next to a Panasonic 152″, 4K (4096 x 2160) 3D plasma display. It was the size of a large wall, and yet even standing with my nose almost pressed to it, I still couldn’t see the individual pixels. The quality of the images it was displaying was simply astonishing – very, very impressive. I’d really like to think it could be the next big thing once the industry has got itself over 3D.

Hifi disaster, rescued by KEF (part ii)

Just to conclude this: I disassembled the speaker with the failed HF drive unit, disconnected the drive unit from the crossover, and measured the resistance through the suspect coil, which proved to be open circuit – i.e. dead. So I went ahead and ordered a matched pair of replacement HF drive units from KEF, for £78 with next-day delivery.

The units duly turned up the next day, along with detailed instructions on how to fit them. The job took about an hour in total, mostly because it involved some soldering of connections inside the speaker, which, although straightforward, was a bit physically awkward.

And the good news is that the speakers are now sounding as good as new again.

However, when that HF unit initially failed, I had started thinking of replacing my speakers with a 5.1 speaker set. And the thought of moving to a surround sound set-up has stuck with me, to the point where I started investigating how to create a 5.1 system using my KEF Reference 101/2s as my L+R front speakers.

Following some good advice from the AVForums website, and a chat with the historic-speaker experts at KEF, the recommendation was to find another pair of Reference 101/2s for my rear speakers, and a Reference 100c or 90c for the centre (dialogue) speaker.

By sheer fluke, I managed to find a pair of 101/2s up for sale locally (the only ones I’ve seen so far) and got them for £90 – which seemed a good price. I also found a KEF Reference 90c centre speaker on eBay for £65 delivered. I now need to decide on an AV receiver and a hi-fi rack (these look nice!), sort out some wall brackets, and ultimately decide whether I need a subwoofer or not. If I do, I hear very good reports about these ones.

So far, between repairs and purchases, the speakers for my fledgling surround system have cost me £235. Admittedly they are not brand new or modern “lifestyle” designs, but they are all top-quality, full-range speakers that would have cost nearly £1500 when new.

It’s not how I’d originally expected this to work out, but it is proving exceptionally cost-effective!