This work is © 2008–2012 by Yuval Levy and licensed under a Creative Commons License.

Not all ‘buntus are Born Equal

Precise Pangolin, or 12.04, is a long-term support (LTS) release.  One might think that particular care has been put into it.  11.10 (Oneiric Ocelot) served me well during my first year in law school, but I thought that moving to an LTS version would serve me better for the next two.  Wrong.

Compare and contrast Ubuntu 12.04 and Kubuntu 12.04.  When I boot my backup workstation, attached to a 47″ FullHD TV, with the installer CD, Ubuntu installs just fine: it recognizes the 1920×1080 resolution of the display and gives me a usable desktop.  Kubuntu, in contrast, sets it up with tiny fonts that are not even readable with a magnifying glass.  Moreover, I don’t even get to update the settings: the second or third window or dialog no longer opens or displays, and the old venerable 3GHz Athlon X2 is slower than a snail – the same machine that runs smoothly with Ubuntu.  Both on the same 12.04 LTS basis.  What is wrong?

One of my keyboards is a Logitech diNovo Edge.  Bluetooth.  A long-standing bug has not been fixed yet, so I must install with another keyboard and then pair the diNovo.  Up until 10.04 it used to work perfectly, without the need for such workarounds.  The bug seems to be fixed in the upcoming Quantal Quetzal release, but don’t hold your breath for it to be fixed in the long-term-supported Precise Pangolin, released less than six months ago.

My laptop’s wireless used to work.  Until a bug broke it.  I found a workaround: blacklist the kernel module (i.e. disable the driver, in Linux-speak) that was too greedy and grabbed the wireless card for itself even though it was not able to drive it.  The bug is fixed in the upcoming Quantal Quetzal release, but the developer asked me if I needed a backport (i.e. a fix in previous versions of ’buntu) and to mark the bug as invalid if I don’t.  Since I have the workaround with the blacklisted module, I would be lying if I said that I need it.  But the point is not whether I need it or not.  The point is: is Ubuntu 12.04 really supported until 2017, and what exactly does “support” mean, if obvious bugs that have been fixed in the development version are not fixed in the “supported” versions?
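For anyone hitting something similar: the workaround boils down to a one-line file in /etc/modprobe.d (the module name below is a placeholder – use whichever module `lsmod` shows grabbing your card):

```
# /etc/modprobe.d/blacklist-wireless.conf
# keep the greedy module from binding the wireless card at boot
blacklist acer_wmi
```

A reboot makes the blacklist take effect.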

Under such circumstances it is difficult to recommend any ‘buntu version.  Those are situations that the ordinary user should not be confronted with.  Upgrading?  What was I thinking…

Tipping Point

Embarrassing:  Monday morning I came to class, opened my notebook’s lid, and loud music filled the silent room.  I was caught by surprise.  It took me three long password-entry attempts to unlock the screen and hit the mute button – “Fn F8” did not work in locked-screen mode.  In the meantime the Tim Simenon and Mark Saunders remix of “Strangelove”, from the latest Depeche Mode release, a digital download, pumped up the beat.

Ex-post analysis: the previous evening I had my earbuds on while working on my exam summaries.  My last actions had been to close the notebook’s lid and unplug the earbuds, in that order.  Kubuntu remembers different audio settings for when earbuds are plugged in.  Unfortunately the sleeping notebook did not register the unplugging, and on waking up it did not bother to check whether something had changed.  This makes the difference between a useful feature and a useless one; and between a smooth experience and an embarrassing one.

This was also the moment I decided that my next personal computer will be a Mac.  I will make the switch at the next opportunity and relegate Kubuntu to where it belongs: the toy box.  I am thankful to Kubuntu for the great playtime, but I no longer have time to play.  I need a machine that delivers quickly and predictably.  I need to be productive.

Initially I thought I could get through law school using Kubuntu.  Indeed, the only thing that I could not make work was the exam-writing software.  Everything else I got working, including access to the library’s printing infrastructure, the university’s email system and knowledge base, and document interoperability with my Mac- and Windows-using friends.

However, this came at a cost I can no longer afford: time.  Three months in, I am calling the experiment off.  There is no major problem with Kubuntu, just a lot of papercuts – little issues that can mostly be worked around but cost me more time than I can afford.

When I started school, in September, I had set up both my workstation and my notebook from scratch.  I bought a new hard disk for the desktop and a new solid state drive for the notebook.  I installed a clean Kubuntu 11.10 from scratch when it was still beta but quite promising.  I needed a dependable infrastructure.

My conclusion after three months (and after years of general ’buntu usage) is that Open Source desktops do not yet merit the “dependable” label.  And I think that a purely free organization based solely on the principles of Open Source is not well equipped to make a dependable desktop.

A few examples:

I have a Brother all-in-one inkjet and a Brother laser printer.  On my first day at law school they were both fully functional.  During the past three months, I have twice lost printing functionality following updates of the cups package.  Once only the desktop was affected.  Once I also lost the ability to print to a PDF file.  I was able to re-install and re-configure things – not a wise use of my time when approaching a deadline for a paper that had to be submitted in print.  I also lost the scanning functionality, on the notebook but not on the desktop.

As a law student, I mostly deal with text.  LibreOffice stopped scrolling horizontally.  I can still move the cursor inside the window, but it is time-consuming.  In October LibreOffice Writer stopped displaying formatting characters.  The button in the toolbar would not toggle them.  Finding the workaround was time-consuming, and the issue reappears sporadically.  I can work around it by going to the menu Tools -> Options -> LibreOffice Writer -> Formatting Aids and checking the checkboxes.  Every time I have to do it I am tempted to save my time and buy a Mac.

At some point, GIMP stopped starting.  I found a workaround online, but it only fixed my notebook, not my workstation.

And of course I do email.  KMail has been updated and now it only saves email addresses, not the names associated with them.  It is a major time waster to manually connect semi-random sequences of letters and numbers to actual names.  I even took the time to file a bug report.  I have not had time to migrate to a different mail client, but I am seriously considering it, especially since I get a whole bunch of cryptic error messages from Akonadi and Nepomuk, usually just when I need them least, and they cover the area of the screen where I am typing class notes.

This duo of underlying technologies managed to fill my new 400 GB Kubuntu partition on the desktop and crash it with a disk-full error.  Now I only use webmail on that machine.  Akonadi is also terrible at dealing with mobility – when I move between home and campus, or even between different campus locations, I will not see new emails until I kill Akonadi and its spawned processes and restart KMail.  This alone is so complex that I had to write a script for it.  Why can’t Akonadi kill its own spawned processes?
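For the record, the script amounts to something like the following (a hedged sketch from memory; `akonadictl` and `kquitapp` are the KDE 4-era commands, and the exact process names may differ on your system):

```shell
# restart_mail: stop KMail, take Akonadi and its spawned agents down,
# then bring everything back up (sketch, KDE 4-era commands)
restart_mail() {
    kquitapp kmail 2>/dev/null   # ask KMail to quit cleanly
    akonadictl stop              # ask the Akonadi server to stop
    sleep 5                      # give it a moment to comply
    pkill -f akonadi             # kill any spawned processes that survived
    akonadictl start             # restart the Akonadi server
    kmail &                      # and finally KMail itself
}
```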

Another papercut I bump into constantly is the interference of the touchpad while typing.  Ubuntu has a setting for it, but I have not found anything equivalent in Kubuntu.  Disabling the touchpad while typing (i.e. for about a second after the last key stroke) should be the default on notebooks in 2011!
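In the meantime, the Synaptics driver ships a small daemon that does exactly this; here is a sketch of what an autostart entry could run (assuming the synaptics X driver and its `syndaemon` tool are installed):

```shell
# pause the touchpad while typing (sketch, assumes the Synaptics X driver)
typing_guard() {
    # -i 1: re-enable the touchpad 1 second after the last key stroke
    # -K:   ignore key combinations involving modifier keys
    # -d:   run as a background daemon
    syndaemon -i 1 -K -d
}
```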

I could go on listing plenty of such small issues.  Individually they may be small and insignificant, but together they consume a lot of my time.  And I am learning that listing them is not helping me, which brings me back to my conclusion: a purely free organization based solely on the principles of Open Source is not well equipped to make a dependable desktop.

Unless there is somebody who thinks of the user’s experience, end-to-end, integrating all the different bits and pieces of Open Source code into a cohesive unit that makes sense, Open Source desktops will be a waste of time, like this bug report.

In the coming weeks I need to focus on exams, and I will work around the papercuts until then.  But it is most likely that 11.10 will be relegated to be second (or third) in the boot up pecking order of my desktops soon.


This blog has been on hiatus for most of the summer.  And it is likely to stay so for an indefinite time.  Or maybe just change its frequency and focus.

Not that there was nothing to say – there was simply no time to write.  The summer went by extremely fast.  On the personal side, loads of joy and satisfaction from the next generation, growing up fast and discovering the world one day at a time.  One sad but inevitable moment was the passing of my last grandmother.

The Hugin release process has stalled.  I released release candidate 4 three weeks ago but don’t consider it of good enough quality to be declared final.  I announced at the beginning of the cycle that September 5 would be my last day of contribution for the foreseeable future, as I am embarking on a new and exciting path: law school.  Hugin will do fine without me.  Maybe somebody will pick up where I left off; or maybe 2011.2 will be one of those branches that dry out without blooming into a full release.  Trunk development has already moved on.

In August I suffered two hard disk crashes within less than two weeks, losing (temporarily) my netbook and my workstation.  In the recovery process I did not reinstall a Hugin development environment – a convenient way to focus myself on my new commitments.

Also in the recovery process, I learned a lot about setting up and optimizing a Solid State Drive (maybe I’ll write about it later).

I reinstalled Windows.  The exam software at my school works only on Windows or Mac OS X.  Since I already own an unused Windows license that came bundled with my netbook, I might as well give it a try.  If it is not good enough I may buy a Mac.  Student’s life.

Unintended consequence:  I also reconsidered my desktop environment.  Before the crashes my workstation worked well with Kubuntu Lucid 10.04 and my netbook worked well with the newer Kubuntu Natty 11.04.  Since I won’t have time to deal with the unexpected I decided that it is too early for me to jump on the Oneiric cycle and settled for Kubuntu Natty 11.04 on both the workstation and the netbook.  Or so I thought.

The netbook was no surprise.  It ran Natty already before the hard disk crash and on the optimized SSD it runs smoother than before.  Ready for school.

Kubuntu 11.04 on the workstation is an unexpected disappointment.  When more than a few windows are open, the next one opens black and displays its content only after downsizing.  As the number of windows grows, the maximum size at which the next window displays shrinks.  I never encountered this phenomenon with 10.04 – same hardware (dual display and nVidia 6150 integrated graphics – this is a very old setup) and same drivers (nVidia proprietary).  I quickly replaced KDE with Unity and the problem is gone.  Extra bonus:  Unity is much faster and more responsive than KDE.

KDE has served me well for the past two years.  I find KDE’s functionality works better for me than Gnome’s intentionally designed limitations.  Dolphin is a true help to access my files, while Nautilus feels more like an obstacle than a help to me.

I am aware that fully fledged desktops such as Gnome or KDE are more taxing on computing resources than lightweight ones.  The trade-off is acceptable to me as long as the system is functional and responsive and the overall user experience is fluid.  I was not aware of how badly KDE compares to other Linux desktops in terms of resource waste; and how badly they all compare to Windows.

So why not try Windows as my main desktop?  Honestly, I find the Windows 7 experience to be better than its reputation.  Smoother than my four-year-old memories of Windows XP.  Smoother than the Ubuntu/Gnome 2.x that I used when transitioning.  But also smoother than the KDE 4.x that I am using these days.  Windows Explorer is as useless as Gnome’s Nautilus.  On the positive side, Bluetooth devices work seamlessly and driver installation is no longer a painful disk-shuffling exercise.  Microsoft has done its homework.  And yet bare Windows 7 still feels hollow and devoid of functionality.

The two things that I am missing most in Windows 7 are

  1. An equivalent to the Debian package manager to enable easy access/addition of apps/functionalities.  I guess they will call it an “app store”.
  2. A complete toolchain to empower writing/modification of software and harness the power under the hood of the PC.

There are ways around those limitations.  A credit card helps.  Using a more advanced system helps too.  Apple delivers a complete toolchain free of charge and with the newest Mac OS X 10.7 Lion iteration there is an app store (although I am not sure about the choice inside the walled garden).

App stores are the future,  unfortunately.  Not because they are bad (one could argue that the Ubuntu repository is a big app store) but because they are being used to create walled gardens and restrict consumer choice.  That’s a battle for another day.

For now, my bottom line is that I am back to dual-booting Windows and Linux; and on the Linux side I need to find a usable desktop environment for my workstation because Kubuntu 11.04 fails miserably where Kubuntu 10.04 worked fine.

Hugin 2010.4.0beta1 released.

[Image: Hugin 2010 logo]

Yesterday I released the first tarball in the new Hugin release cycle.  The goal is to release 2010.4.0 before the end of the year.

It’s only a couple of months since the last release, but a lot has changed, in the code, in the process, and in the infrastructure.

I wrote about the infrastructure change three days ago.  The activity in the new bug tracker is massive.

In the code, the most important news is that with its own brand new control points detector, Hugin can be considered feature-complete.

To underscore this achievement, the project has given itself a new look, contributed by Cristian Marchi: an evolutionary face-lift of the original design by Dr. Luca Vascon, whose source files have been lost in time.

In terms of process, this time around we have more contributors than ever, on multiple disparate platforms.  The project will stick to its policy of releasing source code as soon as it is good to go, leaving it to the user communities on the different platforms to produce and distribute binaries: it makes no sense to delay the release of source code just because there are no binaries yet, nor to delay the release of binaries on a platform with faster builders just because other platforms are not ready.  However, the natural and inevitable lag between a released source package and a working binary package (which is what most users are waiting for) is likely to shrink for most platforms.

First to respond to my call was Matthew Petroff.  He made Windows binaries in four variations (32-bit/64-bit, installer and standalone zipped) available within a few hours, before anybody else.  Matthew joined the team recently and has done some excellent polishing work on the Windows side of things.

Then the indefatigable Harry van der Wolf followed up.  Building for OSX is always a little bit different and requires more effort than most other platforms.  He reported this morning that everything works and that he will produce the coveted bundle installer tomorrow.  What would Hugin Mac users do without Harry?

Andreas Metzler reported a “works for me” update to the Debian experimental source package.  Based on his work I will try to produce my first Ubuntu packages of Hugin for Lucid (my main system) and Jaunty (chrooted), and Gerry Patterson will tag-team for Maverick.

On the Fedora front things are quiet but no less up to date.  Between Bruno Postle and Terry Duell, recent versions of Fedora should be covered soon.

Lukáš Jirkovský will try to use OpenSuSE Build Service, but he’s very busy and there is no guarantee.

No promises.  There is always an inevitable lag between the release of a source tarball and that of a usable binary – at minimum the time it takes for the builder to download the tarball, build it, run a minimal test, and publish it.  But we are doing our best to make this the Hugin release with the shortest delay from source to binary.

This weekend is a test run.  The really interesting run will be when we approach the final release.  Keep your champagne cold for now.

And when will somebody report success building Hugin on Android or iOS?

From Subversion to Mercurial. Part 3, Implementation Day and Beyond

If you followed the steps described in the first and second parts of this series, you should have a Mercurial (Hg) repository ready to replace your project’s Subversion (SVN) repository.  In this third and last part we’ll go over Implementation Day, with particular detail on how to implement this migration on the SourceForge infrastructure.


You can’t test enough.  Your script produces an Hg repository that looks OK on superficial investigation with tools like hg log and hg view.  But does the code build?  Hugin’s build system had a couple of dependencies on SVN that needed to be updated.  Thomas Modes and Kornel Benko stepped up to the task.  Can developers and builders actually use this repository?


On Implementation Day the project transitions from SVN to Hg.  While all relevant contributors were proficient in SVN, Hg was new territory for many.  While progressing on the conversion I kept the community informed and took every opportunity to encourage learning of the new tools, including public tutoring that continues after the transition.  You want to encourage people to share their experiences and learn from each other.  Conceptually the biggest difference between SVN and Hg is that with Hg the repository sits on your local client.  A check-in to SVN is the equivalent of a check-in plus a push in Hg.  Offline operation is not possible with SVN but it is with Hg.  However, both are revision control systems (RCS) and very similar to use.

Implementation Day Overview

Warn everybody one last time.  Create a new repository on SourceForge for each migrating code line.  Lock down SVN by revoking write access from everybody but a few maintainers who will clean up after the transition.  Run the whole migration once again on a green field, from scratch, to be sure that everything down to the very last SVN commit is included.  Test this new local repository one last time (compare it to previous results); create the new repository on SourceForge and push your local repository to it.  Last but not least, configure the repository on SourceForge and announce the transition to the world.  Sounds easy.  The devil is in the details.

Mercurial on SourceForge

SourceForge has been very generous with the projects it is hosting: we can have unlimited Hg repositories.  Unfortunately there are rough edges.

To activate Mercurial for your project:

  • Login via the web as a project administrator and go to the “Develop” page for your project.
  • Select the Project Admin menu, and click on “Feature Settings”.
  • Select “Available Features”.
  • Select the checkbox to the left of the “Mercurial” heading. Your repository will be instantly enabled.

This first repository will be fine but if you want to activate more than one repository you will have to manually set them to be group writable.  To activate additional repositories:

  • Log on to SourceForge’s shell service (assuming you have set up your SSH key) with `ssh -t USER,PROJECTUNIXNAME@shell.sourceforge.net create`
  • Navigate to your project’s Mercurial space with `cd /home/scm_hg/P/PR/PROJECTUNIXNAME`, e.g. for Hugin this would be `cd /home/scm_hg/h/hu/hugin`
  • Create a new directory with the name you want for the repository.  E.g. for Hugin’s website this was `mkdir hugin-web`
  • Execute `hg init DIRNAME` (where DIRNAME is the directory you just created, e.g. `hg init hugin-web`). This will initialize the new repository.
  • Inside the new repository, edit the configuration file .hg/hgrc (see configuration section below)
  • SourceForge rough edge: group write access must be given manually `chmod -R g+w /home/scm_hg/P/PR/PROJECTUNIXNAME/DIRNAME`
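The steps above can be sketched as one small helper to run on the shell service (the path is the Hugin example from above; the repository name is passed as an argument):

```shell
# create_sf_hg_repo NAME — sketch of the steps above, to be run on the
# SourceForge shell service; the path below is Hugin's Mercurial space
create_sf_hg_repo() {
    cd /home/scm_hg/h/hu/hugin || return 1   # project's Mercurial space
    mkdir "$1"                               # e.g. hugin-web
    hg init "$1"                             # initialize the repository
    # remember to edit "$1"/.hg/hgrc (see configuration section below)
    chmod -R g+w "$1"                        # rough edge: group write access
}
```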

Configuration of the Mercurial Repository on SourceForge

SVN support on SourceForge is mature and projects are used to amenities such as email commit notifications.  Hg support is better than what the scant documentation suggests.  Most standard functionality, including email notification, works, even if it is officially unsupported.  One only has to find out how to configure it.  I played around with some trial and error already when optimizing the Enblend repository last year.  This is the hgrc file template that works for us:

[hooks]
changegroup.notify = python:hgext.notify.hook

[email]
from = NOTIFICATION_ADDRESS@lists.sourceforge.net

[smtp]
host = localhost

[web]
baseurl = http://PROJECT.hg.sourceforge.net/hgweb/PROJECT/DIRNAME

[notify]
sources = serve push pull bundle
test = False
config =
template = \ndetails:   {baseurl}{webroot}/rev/{node|short}\nchangeset: {rev}:{node|short}\nuser:      {author}\ndate:       {date|date}\ndescription:\n{desc}\n
maxdiff = -1

[usersubs]
NOTIFICATION_ADDRESS@lists.sourceforge.net = *

[trusted]
users = *

You’ll have to replace your own project unix name PROJECT; your own Hg repository top directory DIRNAME; and your own NOTIFICATION_ADDRESS mailing list.  The configuration options are documented.

Committer Write Access

With a dRCS like Mercurial, write access has a completely different meaning.  Everybody can `hg clone` an existing repository, and once cloned has full write access and can publish their own repository.  The d in dRCS stands for distributed.  Technically there are no more hierarchies and no more central control.  All clones are equal.  Whoever owns a clone can decide to publish it on the web, e.g. with `hg serve`, and give write access to whomever they want.  Granting write access on SourceForge only means that the committer can push to the repository hosted on SourceForge.  What makes a repository authoritative is users’ trust, and this is given implicitly by pulling from it.

SourceForge Rough Edges, Again

I wish there were a way to group-manage access rights on SourceForge.  I have not found it.  I needed to revoke SVN access from most developers and grant them Hg access.  I had to click through each and every contributor registered with the project and manage their access rights single-handedly.  To make things worse, the pseudo-Ajax web interface of SourceForge is anything but asynchronous: it reloads the page after each change.  Ajax cosmetics over underlying old technology from the last century.

One point projects on SourceForge need to pay attention to is default access rights.  I did not find a place to change those, so every new project member gets SVN access rights by default, unless you explicitly remove them.  It seems to me that the defaults on SourceForge are based on the principle of random uncoordinated historical growth.  Have they ever heard of the generally accepted principle of least privilege?  And the default file access for newly created extra Hg repositories is less than reasonable least privilege (see above).

<rant>And don’t tell me about SourceForge’s IdeaTorrent and ways to request an enhancement.  In my experience it does not work, and some things on that site have been broken for years even though the fix is simple, easy, and would not take much time.  Have you tried to use a SourceForge mailing list archive?</rant>


Now that everything is set, you can simply `hg push` from your local repository to the SourceForge one.  Or if you’re really confident, you can rsync the .hg directory (but don’t forget to edit the .hg/hgrc configuration file on the SourceForge end).

CMake Build System

Our CMake build system depended on SVN and after the push it was broken.  Kornel Benko and Thomas Modes fixed it.  Bruno Postle added a break in the CMake build system in the SVN repository, to warn users of that repository that newer versions are in Hg.  Harry van der Wolf updated the OSX build system.


The disruption was short.  A few hours after going live, developers started committing again, using Hg.  Builders started building and distributing again, using Hg.  The Google Summer of Code students cloned away their own copies of the source code and started working on the next major developments for Hugin.  After taking on the most complex of the code lines in SVN first, I migrated the remaining ones over a few hours on Sunday night.  Hugin and most of its related projects now live happily in Hg and can easily be converted to other formats, including Bazaar, git, and even SVN.

Initially I thought to mirror the default code branch from Hg to SVN, but our project does not really need that.  Subversion has been made completely redundant by a newer, superior tool.  Mercurial and its likes would not exist without Subversion, and should be seen as a continuum in the lineage rather than a break from the past.  With Mercurial, Hugin is freer than ever, and you are free to take it further on a journey to the future.  For now Hugin still lives on SourceForge, where the next critical bit of infrastructure is the bug tracker.  But with Mercurial, the dependency on SourceForge, and on any single central service or person, has been further reduced.  Long and Free live Hugin.

From Subversion to Mercurial. Part 2, Mapping the Road

In the first part we started a community buy-in process to support the migration and we set out the technical stage. In this part we’ll map out the road for moving the code from Subversion (SVN) to Mercurial (Hg).

Repository Layout

Source and target layouts are most likely different from one another.  You need to test whether the selected conversion tool supports the source layout.  Most tools handle standard/canonical layouts, but few repositories follow such layouts strictly and consistently over time.

The Hugin SVN repository was itself the result of a migration from an even older tool, CVS.  The subdivisions of the Hugin code line did not follow the canonical trunk/branches/tags subdivision to the letter: we had good reason to distinguish three kinds of branches: development branches, obsolete_branches, and releases.  Moreover, the repository contained seven unrelated code lines because of the SourceForge limitation of one SVN repository per project.  The sensible choice was to separate each of the seven code lines into its own Hg repository.  In Hg, branches and tags are not part of the layout and only need to be addressed in terms of history conversion.

History Clean Up

The next big question is how far back you need to go.  And to what level of detail?  We decided to keep the SVN repository publicly accessible to document history.  This freed us from the need for a detailed reconstruction of the past.

You will have a wide range of choices, from painstakingly reconstructing every single past changeset to pragmatically starting from scratch with a current code snapshot.  The trade-off is between effort, storage requirements, and benefits to the project.  I decided to go as far back and into as much detail as the automated tools enabled with little effort, and to step beyond that only where the benefits outweigh the extra effort.

This meant giving up on the history of past development branches.  The nature of SVN merge operations implicitly omits carrying the history of a development branch into trunk.  To fully reconstruct history one must extract the development branch and transplant it into the Hg default code line.  Maybe feasible, but time-consuming.

Save that time.  You will need it to comb a few knots you’ll find hidden inside SVN history.  The result of less than optimal manipulations, these knots are quickly fixed in subsequent SVN revisions so that they do not affect day to day operation.  They get forgotten until somebody has to dig up history.  We had two such knots in Hugin:

Movie files that do not belong in the repository landed there by mistake.  A few revisions later they were removed and stopped affecting daily checkout operations.  But they are still there, represent more than 75% of the weight of the Hugin SVN repository, and will affect the Hugin Hg repository if left untreated.

We also had an unorthodox switch of a branch to replace trunk completely.  It worked well while using SVN, but automated conversion tools trip over unconventional layout operations.  Luckily this only left a small cosmetic scar with the tool I retained.  I decided not to spend time on cosmetic aspects and left the scar untouched.


The advent of distributed RCS spurred development of a panoply of tools to efficiently move bits of code around.  It was difficult to discern upfront which tool would work for my specific scenario.  I tried a few of them, and the one that worked best for me was Mercurial’s own convert extension.  Another tool that was helpful in the process was Mercurial’s hgk extension.

Edit the following lines in your ~/.hgrc file (create it if it does not exist) to activate these extensions.  You will also need the directives in the [ui] section:

[extensions]
convert =
hgext.hgk =

[ui]
username = YOU <your@email.add>
verbose = True

Mapping Users

Changesets are committed by users.  The definition of a user in Hg differs from SVN.  We need to map SVN users to Hg users.  The syntax of the file is one user per line with a statement listing the SVN user and the corresponding Hg user, e.g.
yuv = Yuval Levy <yuv@example.com>

The following command will produce a file listing alphabetically all users that ever committed to SVN, one per line:
svn log -q | grep ^r | cut -d'|' -f 2 | sort | uniq > svn_users.txt

I used a quick script to generate SourceForge users addresses (@users.sourceforge.net) from that file, but some manual cleanup will be inevitable (and is a good opportunity to keep the buzz going and the stakeholders interested).
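The quick script was nothing fancy – something along these lines (a hypothetical reconstruction; two stand-in user names are inlined so the sketch is self-contained):

```shell
# hypothetical sketch: expand raw SVN user names into usermap lines with
# SourceForge addresses (stand-in input inlined to keep the sketch runnable)
printf 'bpostle\nyuv\n' > users.txt          # stand-in for svn_users.txt
while read -r user; do
    printf '%s = %s <%s@users.sourceforge.net>\n' "$user" "$user" "$user"
done < users.txt > users.map
```

The generated addresses then get hand-edited into real names and preferred email addresses where known.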

While it is possible to enter anything in the username directive of ~/.hgrc, the best practice is to put in a name and an email address.  This is important to establish the legitimacy of the committed code.

Conversion Process

Mapping out the conversion is an iterative process:  set up the conversion command, kick it off, go for a walk while the computer churns through the repository.  When you come back, hopefully there is an Hg repository that you can analyze to determine the next step.  Usually the next step will be to refine some of the configuration files or conversion parameters.  Rinse/repeat until the resulting Hg repository fulfills your expectations.

I strongly recommend that you document every single step and minuscule change.  Even better: if I were to start such a process again, I’d keep a shell script to run everything from scratch to reconstruct the current state.  You will find yourself going back to the same operations again and again, sometimes days or weeks later.  Memory may betray you on small details.
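For Hugin, such a script could be as simple as wrapping the convert invocation in a function that always starts from a clean slate (a sketch; the flags match the trunk conversion shown below):

```shell
# reconvert: rebuild the Hg repository from scratch, so every conversion
# step stays documented and repeatable (flags as used for Hugin's trunk)
reconvert() {
    rm -rf hugin-mercurial &&
    hg convert --branchsort \
        --config convert.svn.branches=hugin/branches \
        --config convert.svn.tags=hugin/tags \
        --config convert.svn.trunk=hugin/trunk \
        --authors svn_users.txt \
        --filemap hugin_filemap.txt \
        hugin-mirror hugin-mercurial
}
```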

Convert, Again, Again, and Again.

The basic command to convert a repository is
hg convert --branchsort --config convert.svn.branches=hugin/branches --config convert.svn.tags=hugin/tags --config convert.svn.trunk=hugin/trunk --authors svn_users.txt --filemap hugin_filemap.txt hugin-mirror hugin-mercurial

The paths to the branches, tags, and trunk depend on the SVN repository’s layout and on the intended outcome.  You’ll tweak those many times.

When I wanted to add the 2010.0 release branch on top of the converted trunk, the command was:

hg convert --branchsort \
    --config convert.svn.branches= \
    --config convert.svn.tags=hugin/tags \
    --config convert.svn.trunk=hugin/releases/2010.0 \
    --authors svn_users.txt --filemap hugin_filemap.txt \
    hugin-mirror hugin-mercurial

hugin_filemap.txt is used to include or exclude paths.  To filter out the heavy movies, I used the following:

exclude "GSoC 2007/Presentation 1"
exclude "GSoC 2007/Presentation 2"

Examine The Results

When you first walk into the newly converted repository with cd hugin-mercurial, it feels empty.  There is only one hidden .hg folder: the repository itself.  Use hg view to take a first look at the resulting revision tree.  To see more, you need to hg checkout a revision, or delve into the internals.  The file .hg/shamap lists every SVN revision, with path and revision number, against its Hg SHA1 changeset ID.  This is helpful in case you need to manipulate history, e.g. to skip some revisions or to link a disconnected part of history, such as a separately extracted branch, to its parent and child changesets.  For such manipulations you will use the --splicemap and --branchmap options.  Like --filemap, they point to files, but they work differently.  They are described in hg help convert and can help you fix the most broken of repositories.  I was thankful I did not have to deal with this: to add the release branches into the repository it was sufficient to simply run convert again on the same hugin-mercurial target.
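As a sketch, each line of .hg/shamap has the form <hg-sha1> svn:<uuid>/<path>@<rev>, so a given SVN revision can be looked up with grep.  The changeset ID, UUID and path below are made up for illustration:

```shell
# Each shamap line maps an Hg changeset to the SVN revision it came from:
#   <hg-sha1> svn:<uuid>/<path>@<rev>
# Simulate one line here (hypothetical IDs) and look up SVN revision 2000.
printf '0123456789abcdef0123456789abcdef01234567 svn:some-uuid/hugin/trunk@2000\n' > shamap.sample
grep '@2000$' shamap.sample
```

Against the real repository you would grep .hg/shamap directly instead of a sample file.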


As you proceed, your repository will improve iteration after iteration.  As soon as you have a result to show, pack it into a tarball and invite community contributors to download and try the repository inside.  Share as much information as you can, so that they can replicate what you did.  Unless you have unlimited time and resources, this is the only way to go beyond basic repository integrity checks.  These tests will reveal corrupted repositories; and if the contributors go one step further and try to build the code, they will also reveal dependencies on the build system that may require committing specially crafted code to support Hg instead of SVN.  Keep trying and refining until you have on your hard disk an Hg repository that is ready to replace the old SVN repository.  Then you’ll know you’re ready for Implementation Day.

Moved 2: From Subversion to Mercurial. Part 1, Setting the Stage

It’s less than four weeks since I drove that 26′ U-Haul truck full of stuff and I’ve had enough of moving for a while.  So why move again?  This is a different kind of move: a move to more efficient infrastructure.  To a decentralized source code repository.  Thank you Subversion, you’ve served us well over the past years.  Welcome Mercurial, a next-generation distributed revision control system (RCS).  In this series of three articles I describe how I moved Hugin from Subversion (SVN) to Mercurial (Hg).  In the first part I’ll describe how to kick off the process in the community and set the technical stage on your machine.  The second part deals with the technical code conversion, and the third with the conversion aftermath and the actual switch.  Once the road is mapped out, the process is a relatively straightforward one.  I made some mistakes while mapping the road, and I hope that if you find yourself in the same situation, these articles will help you avoid them.

Why Mercurial and why Now?

It could have been git, or Bazaar.  They are all equally good.  But I found Mercurial to be the one with the most mature client support, particularly GUI clients on disparate operating systems; and it is well supported at SourceForge, where Hugin is currently hosted.  Our project needs to accommodate contributors using Linux, Windows, OSX, and BSD, and we do not want to leave anybody behind.  To get all stakeholders to buy into the process, I started a public discussion.

Spring was the right time for repository cleaning.  With a tight integration schedule the team merged most outstanding development branches into the main code line.  Migrating before branching out again for a new set of Google Summer of Code projects will avoid extra complexity.

For more than two years Hugin has been humming along on an asynchronous development and release process that has helped increase the capacity of the project to absorb changes.  Despite a diligent, disciplined and careful team we seem to have hit a scalability ceiling.  It may be lack of resources (except for the Google Summer of Code students during their three months on Google’s payroll, we’re all here in our spare-time) but I suspect that it is also the infrastructure and I expect Mercurial will further increase the capacity of the project to absorb changes.


One of the first questions to arise from the community was the scope of the change.  If we’re already changing the RCS, why not review the whole infrastructure?  Hugin has been at SourceForge since its inception, and a lot has happened in the project hosting arena since.  Sites like GoogleCode, Launchpad, BerliOS, and GitHub offer a panoply of services: RCS, bug tracking, mailing lists, web and download hosting.  Often these are different implementations of the same Open Source tools, and mostly “free” (as in beer, but beware of the alcohol) for Open Source projects like Hugin.

The RCS, while central to the project, is just part of a project’s infrastructure.  Migrating the whole infrastructure is beyond the scope of this project.  And beyond the available resources too.   Just moving the nearly 200 open bug reports (many of which are stale or duplicate – the bug tracker needs a good spring cleaning too) to a new bug tracking system can keep a spare-timer busy for months.  SourceForge may not be the most fashionable choice, but it works for us.

Server and Client

On the one side is an existing SVN repository on the SourceForge server.  On the other side is a new Hg repository on the SourceForge server.  How do I move the code from one side to the other?  The first mistake I made was to work on the SourceForge server itself.  This slowed me down and ate their precious bandwidth.  I should have known better: SVN runs on the server sitting in my office closet.  Even that was too much overhead.  The most efficient way to go about the task is to mirror the SVN repository to a local client and work from there.

These are the steps for a K/X/Ubuntu distribution:

sudo apt-get install mercurial subversion python-subversion
svnadmin create hugin-mirror
cd hugin-mirror
echo '#!/bin/sh' > hooks/pre-revprop-change
echo 'exit 0' >> hooks/pre-revprop-change
chmod +x hooks/pre-revprop-change
export FROMREPO=https://hugin.svn.sourceforge.net/svnroot/hugin/
export TOREPO=file://`pwd`
svnsync init ${TOREPO} ${FROMREPO}
svnsync --non-interactive sync ${TOREPO}

The initial sync can take hours or more.  This is a good time to take a break.  If the sync is aborted, you may need to release the lock and restart it:

svn propdel svn:sync-lock --revprop -r 0 ${TOREPO}
svnsync --non-interactive sync ${TOREPO}

It’s a good idea to repeat the above two commands in a cron job or a startup job to keep in sync with the repository over time.
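For example, a crontab entry along these lines (added with crontab -e) would re-sync every hour.  The mirror path below is hypothetical; adjust it to wherever hugin-mirror actually lives on your machine:

```
# Hypothetical crontab entry: re-sync the local SVN mirror at the top of
# every hour.  Adjust the path to your own hugin-mirror location.
0 * * * * svnsync --non-interactive sync file:///home/yuv/hugin-mirror
```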

Your local machine is now set for the job.  Keep the discussion going in your community to get all relevant stakeholders to buy into the process.  In the next installment we’ll look at how to map the road.