  • License

    This work is © 2008-2012 by Yuval Levy and licensed under a Creative Commons License.

It’s a SONY (with Minolta Genetics)

I really should not be writing this post now. Too many critical deadlines before year’s end. But I could not resist. I upgraded. And the new toy is too attractive. Upgrading to SONY and not to Canon is specific to my situation. YMMV: you may find similarities and differences to your own situation; or you may want to skip to the rest of this post, describing what I like (and dislike) in the SONY Alpha 850 and how it plays with Ubuntu / Free software.

Why I Chose SONY over Canon This Time

In a nutshell: another case of vendor lock-in. Sort of.

Before switching to digital I was shooting with a Minolta Maxxum 700si, hence my interest in SONY. I kept the very good Minolta lenses and accessories over the years. Adding similar capabilities to a greenfield system would have set me back at least $3000, while the lenses’ resale value had plummeted with the fortunes of Minolta’s photo division.

I also have a Canon 350D, with lenses and accessories. The Canon was primarily a business tool and the investment was limited (as in: ROI is bigger if the investment is smaller). It still serves me well but is at the end of its useful life.

The decision to go Canon in 2005 was partly motivated by the 350D’s game-changing nature (although I kept shooting film with the 700si). But first and foremost it was frustration with Minolta’s digital products, which hopelessly trailed Canon and Nikon by a generation or two, leaving users like me orphaned. Eventually Minolta exited the dSLR business and SONY acquired the leftovers.

From the beginning it was clear that SONY was aiming for the top. And it got there. By 2008, in slightly less than three years, SONY caught up with rivals Canon and Nikon in the main markets. But that year I was not ready to buy a new dSLR yet: I needed FullHD video recording and preferred a dedicated camcorder over Canon’s 5D MkII.

In the autumn of 2009 I was in the market for a new photo camera. SONY and Canon each launched a game changer: the SONY Alpha 850 and the Canon 7D both offer radically more features than ever before at a price tag of $2000. And they represent two different approaches to the market.

Canon chose an APS-C sensor and the continuation of the photo/video hybrids that seem to appeal to the general public; SONY chose a full frame sensor and a camera designed entirely for traditional photography.

Convergence of photo and video is a good thing that will happen once the manufacturers get it right. For now SONY’s approach wins, if you ask me. In my opinion Canon’s design will need another iteration or two before hitting a sweet spot, and there are many possible sweet spots: the same camera with a full frame sensor would be one; a lighter and smaller form factor like Sigma’s DP1, around the same APS-C sensor, would be another. I wish my camcorder had an APS-C sensor. For now, I don’t see myself using a dSLR to capture movies – I need a tiltable display and a weight distribution that lets me ergonomically and steadily hand-hold the device for a few minutes. From my perspective, the 7D is almost right for photography: only the sensor size is wrong.

SONY, on the other hand, may have cut a few things from its flagship A900 to fit the A850 into the right price envelope; but it got the most important details right. The full frame sensor expands my creative range beyond what the APS-C sensor allows.

SONY Alpha A850: The Pros

  • Usability. SONY has added a layer of usability on top of the recognizable Minolta DNA. I did not need to read the handbook to start shooting, or even to access advanced functionality.
  • Ergonomics. Most functions are quickly and easily accessible, and are sorted logically.
  • In-body SteadyShot applies to all lenses. My good old Minolta lenses not only got a new lease on life, they are more useful when shooting hand-held and in low light conditions.
  • Intelligent Preview helps deciding on the right exposure.

SONY Alpha A850: The Cons

  • SteadyShot must be manually de-activated when shooting on a tripod, or the pictures will be blurry. I fell for this one on my first day of shooting.
  • Bracketing is limited to -2/+2 EV, like Canon’s. When will SONY learn from Pentax? Even Nikon has improved its bracketing.
  • No speed improvement when limiting capture to APS-C size. What’s the point, then?

SONY Alpha A850: The Nuisances

These are not real disadvantages, just stupid details that could have been handled better in my opinion.

  • The handbook lacks important technical details, e.g. about the difference between the RAW and cRAW formats (is cRAW lossy or lossless compression?) or the effect of creative mode on the RAW file (none – it only influences the JPEG and the default RAW conversion parameters). But who cares? Who reads handbooks anyway?
  • Memory Stick. SONY is a sore loser on this one. The dead weight and space occupied by this relic of proprietary technology could have been put to better use as a second compact flash slot, with switching functionality like Canon’s and Nikon’s. Stupid but not critical.
  • Proprietary USB plug. Sure, it also carries composite video on the same plug, but what is the point of displaying the camera’s output on a low resolution 640×480 display when the camera already has a built-in LCD with better resolution and an HDMI output? I’d rather have a standard USB plug. I don’t really use the USB (or the video output) anyway – I just extract the compact flash card and plug it into a card reader.
  • No support for my old Minolta 5400HS flash. It syncs at a paltry 1/200 and it does not set the exposure right. I’ll have to buy a SONY flash next year. Support for the 5400HS would have been too good to be true.

Software

SONY has a track record for being very proprietary about its products and file formats. I feared they would not play well with Free software and I was ready to return the camera if it could not fit in my workflow.

The good news is that SONY’s own Image Data Converter SR plays well with Wine on my Ubuntu 9.04 notebook, even at its underpowered 1.6 GHz. There were a few display hiccups, most notably with the Area Selection Tool. I did not try the Remote Camera Control and I did not like the Image Data Lightbox.

RAWstudio is my favorite RAW converter on Linux. I don’t know what SONY does to its RAW files, but using the in-camera white balance in RAWstudio yields a washed-out picture with a reddish color cast. A slightly better result is achieved with auto white balance:

The problem also affects LuminanceHDR (formerly qtpfsgui); this time the color cast is cyan.
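What a white-balance mismatch does to an image can be illustrated in a few lines. This is a minimal sketch in plain Python (hypothetical function and numbers, not RAWstudio’s actual code): RAW converters scale each channel by a gain, and gains derived from misread camera metadata tint every neutral tone.

```python
# Minimal illustration of how per-channel white-balance gains
# produce a color cast (hypothetical values, not RAWstudio's code).

def apply_wb(pixel, gains):
    """Scale an (R, G, B) pixel by per-channel gains, clipping to 1.0."""
    return tuple(min(c * g, 1.0) for c, g in zip(pixel, gains))

neutral_gray = (0.5, 0.5, 0.5)

# Correct gains keep gray neutral.
good = apply_wb(neutral_gray, (1.0, 1.0, 1.0))

# Gains that over-amplify red (as when white-balance metadata is
# misinterpreted) push every neutral tone toward red: a reddish cast.
reddish = apply_wb(neutral_gray, (1.4, 1.0, 1.0))

print(good)     # (0.5, 0.5, 0.5)
print(reddish)  # red channel lifted above green and blue
```

A cyan cast, as seen in LuminanceHDR, would be the opposite case: the red gain too low relative to green and blue.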

Results

While I prefer to shoot RAW, I shoot RAW + JPEG initially until I am confident that my workflow can process the RAW files. So the following 1:1 crop is from a JPEG, slightly edited with GIMP.

The SONY Alpha 850 is truly mouth-watering! But now I’d better get back to my prior obligations…

Why I Have Not Used Twitter so Far (and Why Maybe You Should)

Dilbert.com

  1. Why use Twitter? identi.ca is Open Source (and no, I haven’t used it either).
  2. Wearing two socks of different colors is embarrassing enough without telling the whole world about it.
  3. It used to be “verba volant, scripta manent”. Is it again?
  4. 140 characters? Too little for most articulate thoughts with proper grammar.
  5. I don’t need uninvited commercial interests to know who influences my thoughts and whom I influence. I love my privacy.
  6. Conversation? I prefer uninterrupted face-to-face!
  7. Shortened URLs? Where exactly are they landing?

You may want to use Twitter if you:

  1. Want to show the kids born after the fall of the wall that you can be cool too.
  2. Think you deserve the same following as <put your favorite starlet name here>.
  3. Want to learn to be concise.
  4. Are a multi-tasker who wants people to notice that during a conversation they get your undivided attention… not!
  5. Care to read what others say about your brand, your products, your services. Twitter is a one-way outlet for (links to) your press releases… not!
  6. Don’t mind others eavesdropping on your current conversations in a few years.
  7. Want to show that you were “in” in 2007.

A Good Thing

http://panoverflow.com/

Earth has a Fever

October 24 was 350.org’s day of action. November 11 was Remembrance Day. December 7 it’s the UN’s Climate Change Conference in Copenhagen. Three events in the last three months of 2009. How are they related?

Earth has a fever. The people of 350.org claim the cause is carbon dioxide, and they want to influence politicians at the climate change conference to reduce carbon dioxide emissions quickly, beyond the Kyoto protocol targets.

So far so good, but how does Remembrance Day fit into this?

In my opinion the current focus is on symptoms, not on the root cause. Human activity drives carbon dioxide emissions. Two factors drive human activity: our lifestyle and our demographics. Sure, we can and should make our lifestyles more efficient and reduce the per-capita carbon footprint. But the real fever affecting Earth is not carbon dioxide. It is the demographic explosion of the human species, from two billion in the 1950s to seven billion today.

The human being is part of nature. Like every species, its demographics evolve within two limits: the resources available to feed on, and the predators feeding on it. Advances in medical science have left the human being with no predator but himself. And when the time comes, when resources are scarce, he becomes a gruesomely efficient predator.

WWI and WWII were about lack of food. And now we have an excess of carbon dioxide that will adversely affect the resources we feed on. Ironically, the same chemist who developed chemical warfare in WWI also developed the nitrogen fixation process that is so important to the production of both explosives and fertilizers. Fertilizers did away with the limiting factor of the time. We see the consequences of removing that limit today.

Maybe science will expand the available resources once again, with carbon dioxide capture and storage. But we’d be dealing with the symptoms once again. The root cause is not carbon dioxide, it’s our demographics. And the long-term solution is birth control. A few years from now, China’s one-child policy may seem liberal and reasonable. Or maybe the predator will have taken care of himself.

Itches and Scratches

I feel numb. All of the itches that motivated me to contribute to Hugin are frozen, anesthetized by a few remarks from a user who seems to represent the silent majority of users – at least of Windows users.

A few excerpts of his accusations:

  1. hugin distribution politics is far from the high quality of the software.
  2. Did you do anything better?
  3. you don’t make it easy for me (and for others) to keep up this sympathy [for Hugin]. And I fear others aren’t as patient as me. In other words and if you still didn’t get it: I think you distract people from using or even trying hugin.
  4. And BTW.: I won’t read any longish reply to that. If it is as short as  “I’m sorry”, I will…

The only thing I am sorry about is wasting my time on such apparently useless things as:

  • getting the project accepted to Google Summer of Code in 2007, resulting in a US$75,000 donation from Google over three years (thank you!) that enabled 15 student-summer projects on Hugin and related projects; and more in-kind sponsoring from industry leaders Agnos and Nodal Ninja (thank you again) that helped motivate students and mentors beyond the Google Summer of Code. Running the team during two of the three participations and mentoring two students.
  • Mentoring other contributors.
  • Writing the Windows installer script for Hugin-0.7.0 when I was a user of that platform.
  • Re-engineering the project to encourage participation, resulting in an amazing documentation effort by users of all platforms.
  • Re-engineering the project’s repository, streamlining development and integration of new features, releasing Hugin-2009.2.0 in record time (and 2009.4.0 is around the corner, all of this without freezing development).
  • A few bug fixes, and little features. There was more work in progress, but at the moment I don’t feel like touching it. It’s in a tarball in a remote corner of my disk for now.
  • Organizing a Hugin art exhibition.

Actually, looking at the list I don’t feel sorry at all about what I did. I am sorry for those who ask whether I did anything better. I wish I did not have such a short fuse, but overall I feel good. New, different itches are coming back to me. The numbness is temporary. I can do other things with my free time in the Free world. There is life beyond Hugin.

The thing I like most about Free software is “scratch your own itch”: contribution is purely voluntary and everybody does what they want or like to do; with few exceptions (as in: being hired or otherwise paid to do a job) there is no obligation.

To me personally, the first motivation is to learn. Life is an exciting journey and there is always something new to learn. The day I rest my head on the pillow without having learned something new will be a sad day for me. Flat EEG.

Corollary: I like to move on to new things once the concept is proven. It’s not only the fuse that is short. Others can pick things up from there. If my byproduct is helpful to them, good for them. If it isn’t, that’s OK with me too.

Next, I’m in it for the social fun. I’m an eccentric, and I seem to find more people with comparable life experience, who are sometimes like-minded, among Open Source contributors. However, social fun does not mean pleasing others. In a business relationship I will please my customers, my boss, my co-workers. In a social fun relationship, I’ll do what I enjoy doing, with people who accept me as I am. Not that I won’t occasionally scratch somebody else’s itch. In the end this is my free time and I dispose of it as I please.

Last but not least, I’m in it to expand my toolbox, on two levels. As a photographer I’m pragmatic about Open Source and think that its tools complement, and sometimes substitute for, licensed proprietary tools. GIMP is unlikely to replace Photoshop in my workflow any time soon, but it has some unique features that Photoshop does not match. So I happily use both. As an apprentice coder, I’m all for Open Source. The method is better than anything else I have read or seen. But maybe that’s just because the superior proprietary method is such a well-kept secret? I doubt it.

Anyway, it is this time of the year again; and there are some important deadlines and changes on the horizon. Free time is going to be a very limited resource in the next weeks, and after that I’ll listen to my itches again.

The Most Welcoming Bed and Breakfast in London, Ontario

Londinium was established as a town by the Romans almost 2000 years ago. It was common practice for Romans to adopt and romanize native names for new settlements. Not so the Brits: when they settled North America, they brought their names with them. So there are a dozen Londons in North America, one of them in Canada.

We’ve been to the original London many times, and we visited London, Ontario for the first time last summer. We were lucky to find the most welcoming bed and breakfast in London, Just For You. Owned and operated by Ron and Gerard, two Dutch expatriates, this bed and breakfast combines the coziness and warmth of a B&B with a level of service that competes with the best five-star hotels of the world. It was an inspiring place to recharge our energy before and after our busy days in the city. Gerard served us healthy and creative breakfasts. One day he treated us to Poffertjes, a sweet complement to the varied and well-presented fruit salad, accompanied by original Dutch Hagelslag and real cumin Gouda.

The room was so welcoming that I felt inspired to shoot an HDR panorama and to test whether recent developments have made it easier for artists to create HDR panoramas. The bottom line: things will soon become easier. For this panorama I had to work through some issues in the tool chain.


A New Approach to High Dynamic Range (HDR) Panoramas

Disclaimer: the author has been contracting with Photomatix.

Stacking-and-stitching versus stitching-and-stacking of exposure stacks has been discussed in the past as a manually controlled workflow. It has been automated in Hugin 2009.2.0. As a reminder: stacking and stitching is generally more efficient, but it had a drawback for full spherical panoramas: the last time I tried, it suffered from vortex-like artifacts at nadir and zenith. The upcoming Enblend-Enfuse 4.0 introduces new algorithms that may lessen the problem. Time to try again.

The process in a nutshell:

  1. Shoot the brackets for a full spherical panorama using a tripod (i.e. perfectly aligned stacks).
  2. Merge the stacks to individual HDR pictures.
  3. Stitch the pictures as if they were LDR pictures (the current Stitcher tab in Hugin really needs some explanation – details below).

This is how I did it, in detail.

1. Shot bracketed RAWs; fed them to Photomatix Pro and batch-processed them into HDR. Optional: in the same batch Photomatix Pro can also generate an initial tonemapped version (needed to set up the stitching project).

There are many software packages that merge RAWs into HDR, some of them Open Source. The reason I use Photomatix is the automated batch processing that includes fixing chromatic aberration and noise. This automates the deterministic aspects of the process and lets me focus on the aspects where human creativity makes a difference.
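For the scripting-inclined, the merge step can be sketched in a few lines of Python. This is a naive weighted average in linear radiance space with a simple hat weighting (hypothetical helper names, not Photomatix’s algorithm; real tools add response-curve recovery plus the chromatic aberration and noise fixes mentioned above):

```python
# Naive HDR merge of one pixel from a perfectly aligned exposure stack:
# weighted average of the exposures in linear radiance space.
# A sketch only -- real merging tools are considerably more robust.

def weight(v):
    """Hat weight: trust mid-tones, distrust near-black/near-white values."""
    return max(0.0, 1.0 - abs(2.0 * v - 1.0))

def merge_pixel(samples):
    """samples: list of (linear_value, exposure_time) for one pixel."""
    num = sum(weight(v) * v / t for v, t in samples)
    den = sum(weight(v) for v, t in samples)
    # fall back to the shortest exposure if every sample is clipped
    return num / den if den else samples[-1][0] / samples[-1][1]

# One pixel captured at 1/4 s (nearly blown out), 1/15 s, and 1/60 s:
stack = [(0.98, 1 / 4), (0.60, 1 / 15), (0.15, 1 / 60)]
print(merge_pixel(stack))  # an estimate of the scene radiance
```

The same per-pixel operation, looped over three aligned frames, yields the HDR image that the stitcher then treats like any other input.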

Photomatix’ Manual recommends using a third-party RAW converter and gives detailed instructions for white balance, basic settings, and curves. I did not find any drawback in using Photomatix’ integrated RAW converter.

2. Identified control points between the images.

None of the control point detectors known to me supports HDR files (out of Photomatix) as input at this time. Hence the need for the initial tonemapped images. For this specific panorama, taken with an 8mm fisheye, there were not that many images to link, so I opened the HDR images in Hugin’s Control Points tab and clicked my way through them. A pass through that tab is mandatory anyway, to establish vertical control lines.

3. Stitch in “Normal” output mode.

This is the confusing part. Currently Hugin’s Stitcher tab has three main output modes, each with its variations: Normal, Exposure Fusion, and HDR Merging. Intuitively, this is HDR, right? But it is not merging: the stacks are already merged and only need to be stitched. We (the Hugin team) need to do our homework and improve the interface.

Another confusing point is the output format selection for “Normal” output. The only options available are TIFF, JPEG and PNG. Don’t worry: choose TIFF and the result can be loaded and tonemapped in most HDR software, including Qtpfsgui (soon to be renamed Luminance) and Photomatix (correction: with Photomatix Pro 3.2.6 I had to open it in Photoshop first and save as EXR). It would be nice if we also had EXR there, as for the merging process. And Radiance HDR too. Or at least an indication that the TIFF output will be 32-bit.

Hiccups and Fixes

The resulting HDR equirectangular had an artefact at the zenith. Enblend 4.0 pre-release, with default settings, produced a dark speckle instead of a vortex. Smaller, but still disturbing.

To study this, I pitched the panorama down 90°, bringing the zenith to the middle of the equirectangular, on the equator (and consequently the nadir onto the 360° seam at the equator). To make it visible, I used plain color images. The four images on the right are: a simple, unblended output of the six layers on top of each other; the blend with Enblend 3.2; the blend with Enblend 4.0; and last the working workaround, the blend with the --no-optimize option suggested by Christoph Spiel, release manager for Enblend 4.0. The non-optimized version enabled me to produce a usable HDR equirectangular and continue the process. On Christoph’s request, I dug deeper into the code to isolate where the artefact is introduced. With his patient guidance and a few experiments, it seems narrowed down to the seam-line vectorization or de-vectorization code.
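The 90° pitch trick is easy to verify numerically. A small sketch in plain Python (hypothetical helper, sign conventions assumed) rotates a viewing direction given as (yaw, pitch); the zenith lands on the equator, exactly where the seam artefact becomes easy to inspect:

```python
import math

def rotate_pitch(yaw_deg, pitch_deg, delta_deg):
    """Rotate a viewing direction (yaw, pitch) by delta degrees of pitch.

    Hypothetical helper for checking the remap: angles in degrees,
    pitch +90 = zenith, pitch -90 = nadir, x forward at yaw 0, z up.
    """
    yaw, pitch, d = (math.radians(a) for a in (yaw_deg, pitch_deg, delta_deg))
    # direction vector on the unit sphere
    x = math.cos(pitch) * math.cos(yaw)
    y = math.cos(pitch) * math.sin(yaw)
    z = math.sin(pitch)
    # pitch the whole sphere: rotation about the horizontal (y) axis
    x2 = x * math.cos(d) + z * math.sin(d)
    z2 = -x * math.sin(d) + z * math.cos(d)
    new_pitch = math.degrees(math.asin(max(-1.0, min(1.0, z2))))
    new_yaw = math.degrees(math.atan2(y, x2))
    return new_yaw, new_pitch

# Pitch the panorama down 90 degrees: the zenith (pitch +90)
# ends up on the equator (pitch 0), in the middle of the image.
print(rotate_pitch(0.0, 90.0, -90.0))
```

Running the remap on the zenith direction confirms a new pitch of 0°, which is why the artefact becomes visible and editable in the middle of the pitched equirectangular.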

Conclusion

The resulting HDR equirectangular is technically correct. Enblend 4.0 (pre-release) is an improvement, though not yet perfect. As an added bonus, on multi-core CPUs it is also significantly faster than previous versions.

The other remaining obstacle is tonemapping: many tonemappers, notably Qtpfsgui (soon to be renamed Luminance), don’t deal properly with zenith and nadir, limiting the user’s choices. The latest Photomatix has been tested to deal with the 360° seam and the zenith. In the meantime this is just a post-processing inconvenience that can be solved with a brush in Photoshop or GIMP.

Processing this panorama took much longer than expected, but it was worth it. I hope it helps advance Enblend 4.0 toward release.

E.T. Phone Home

If it was only E.T., I would not be worried.

More and more “stuff” uses my hardware and my internet connection to call home, often without asking for permission. I am one of those who surf the web with NoScript. I also run a local DNS server and do a few other things to stay safe online. Nevertheless, with every new generation of operating systems and software, integration with the Internet becomes more pervasive. I want to integrate on my terms, for example to exercise parental control.

I am reviewing my computing infrastructure, and the next step is to run every operating system in a sandbox: a VirtualBox guest inside a secured Linux distribution. The host’s firewall should be an inclusive firewall: block everything except traffic that is specifically allowed.
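As a starting point, an inclusive ruleset for the host could look like the following sketch (assuming iptables; the interface name eth0 and the allowed ports are illustrative, adapt to your setup):

```shell
#!/bin/sh
# Sketch of a default-deny ("inclusive") ruleset for the host.
# Illustrative only: interface and ports are assumptions, not a recipe.

iptables -P INPUT   DROP    # drop everything by default
iptables -P FORWARD DROP
iptables -P OUTPUT  DROP

# loopback traffic is always fine
iptables -A INPUT  -i lo -j ACCEPT
iptables -A OUTPUT -o lo -j ACCEPT

# allow replies to connections the host itself initiated
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# specifically allowed outbound traffic: DNS, HTTP, HTTPS
iptables -A OUTPUT -o eth0 -p udp --dport 53  -j ACCEPT
iptables -A OUTPUT -o eth0 -p tcp --dport 80  -j ACCEPT
iptables -A OUTPUT -o eth0 -p tcp --dport 443 -j ACCEPT
```

Everything not explicitly accepted is dropped by the default policies; the guests’ traffic would then need equally explicit FORWARD rules, which is where the real per-guest control happens.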

So far I have found some material about people using IPCop in one VirtualBox to firewall another VirtualBox, but this seems redundant to me. Why not run IPCop as the host system? Of course the host system would have no purpose other than firewalling the virtual machines, so even Ubuntu would run inside a VirtualBox. Am I missing something? Research continues; hints and help are welcome.

 

Are You a Social Learner?

If you read this blog, chances are that you contribute to Open Source in one way or another. This paper is the most detailed study I have found on the net about Open Source contributors like us and their motivations.

Hugin-2009.2.0 Windows Installer

Just as I was laying out the current status with regard to binary distributions, Allard Katan released a Windows binary installer of 2009.2.0 and uploaded it to the project’s Sourceforge page. I quickly fired up my ailing Windows XP partition to test it.

It is still based on the old 0.7.0 installer that I wrote more than two years ago. My verdict upfront: not of official release quality, if you ask me. The devil lies in the details.


The most serious issue is Enblend/Enfuse. Because Windows does not have a package manager, the Hugin installer must ship runtime dependencies with the actual Hugin binary. Enblend/Enfuse is a mandated runtime dependency, and a moving target. Compare the images on the right.

The first one shows six layers, simulating six images to be blended into a 360°×180° full spherical panorama.

The second one is the blend produced by Enblend 3.2 as shipped with this installer. The third one is the blend produced by Enblend’s 4.0 pre-release. All use default settings. This installer ships with an outdated version of Enblend.

Unlike Enblend/Enfuse which is a mandated dependency, a control point generator is not mandatory.

Ippei Ukai introduced an elegant modular plug-in solution for control point detectors in 0.8.0 on OSX, and the Windows version should be updated to a similar mechanism. It is only a few lines in the installer, but it requires an understanding of the process.

There are good reasons not to include control point detectors with Hugin binary distributions: patents. Linux distributions such as Fedora/Red Hat and Debian steer away from distributing Autopano-SIFT-C because of the patents. And the SURF algorithm used by Panomatic is restricted by a European patent. I personally would not have uploaded a binary installer with control point detectors to Sourceforge.

Last but not least: the release notes are unclean. They are simply copied and pasted from the source code release notes. To me this is a warning sign about the quality of the installer. This shortcut may not be critical, but how many other corners have been cut? In my opinion this installer’s quality is closer to an unofficial snapshot than to an official, long-term supported release. There are plenty of such snapshots around the net, some of them linked from the download page of this blog. It’s the beauty of the GPL: the code is set free and others can do almost anything they want with it.

I do not endorse this installer. But if it makes other users happy, so be it. When I wrote the community charter, I meant it. My opinion on the Windows installer is unimportant. I am not using Hugin on Windows anyway. Even if I disagree with Allard, I gave him full access rights so that he can go ahead and do what he thinks is right with the Windows installer. And I don’t regret it. I wrote the installer when I had an interest in understanding the process of building an installer for Windows, and even though I am its original author, the installer does not belong to me. I’m happy if you like it as-is. Just don’t complain that it does not match the binaries on OSX or Linux; that is a known bug and limitation.

Too Much Love Will Kill You

With the increased pace of development, added features, and improved stability, Hugin has become more visible and has received more attention and love from potential users – who of course have needs and wishes.

One recurring user request is: “why don’t you make binaries available?” John Riley argues for a “concerted effort to create binaries”, saying that Hugin “deserves a much wider audience than those who are willing and/or able to build it from source” and that “making it more accessible would increase its visibility, which could get more people interested in contributing”.

No doubt binary distributions would be nice. But how do they fit into the plan? What is the status of the project? Of the binaries? What are the priorities?

The Plan

Most contributors to Hugin are volunteers. Rule number one is that there is no master plan. Well, I do have my plan, but I cannot impose it on others; and others cannot impose their plans on me. Each contributor decides what is right for him, sets his own priorities and contributes what he wants. If the contribution makes sense, other contributors will adopt, follow, join, contribute.

I like to think of contributors as a free-flowing fluid on the project, like water on Earth. The fluid is naturally attracted to valleys where it forms rivers and lakes. Its passage shapes the landscape.

Controlling the fluid would require expensive dams – most often not a practical proposition. Project administrators only decide what is on Sourceforge, and even that is not control: the software is Free. No barrier can prevent the fluid from flowing in an alternate direction, and if that direction is more attractive, others will follow. It is purely meritocratic.

Status

Hugin is undergoing major change. Growth has exposed the limits of the previous structures, and we had to adapt to the new dynamic.

When I first got involved with Hugin three years ago, I found the learning curve extremely steep. We needed to make the project more attractive to contributors. In response to the difficulties encountered by the Google Summer of Code students, the project reacted with an SDK and documentation. Hugin has become more accessible to contributors. We see the result in new contributors coming on board almost every month (and without the incentive of a Google internship). Their contributions pose new challenges.

Last year we integrated four major contributed features: celeste, fast preview, batch processor, and new projections. The integration resulted in a development freeze for much of 2009, until the 0.8.0 release last July.

In the meantime, the queue of new contributed features waiting for integration had doubled:

  • GPU-stitching (to use modern hardware for faster stitching, integrated in 2009.2.0);
  • lens calibration (to develop a better mathematical model for the lens geometry, integrated in 2009.4.0 but not ready for practical use);
  • deghosting for enfuse (to reduce ghosts when merging stacks of images with enfuse, integrated in trunk and scheduled for release after 2009.4.0);
  • CPclean (to prune bad control points by statistical method, integrated in 2009.4.0);
  • Auto-crop (to reduce the blank areas around the image);
  • New layout mode (to improve the handling of exposure stacks);
  • XYZ-transform (to improve geometric positioning when the images are not taken from precisely the same no-parallax point);
  • Masking in GUI (a Google Summer of Code 2008 project that we’re still studying how to integrate).

And there are plenty of small incremental improvements here and there that would not have happened if we were in a freeze.

Extrapolating from the past, we would be frozen until 2011. That would effectively kill the project. I introduced parallel development to solve the bottleneck, and I hope we can work through most of the backlog before the end of the winter.

There is a cost to parallel development: desynchronization. And the less glamorous areas of the project suffer particularly. Building binaries for distribution is such an area.

Binaries

The Hugin project is committed to release source code that builds on a wide variety of platforms. The creation and distribution of binaries is left to the users’ communities. Of course we are users ourselves, and we do build binaries, mostly for testing purposes and only on our respective platforms. Our contribution to the distribution process is mostly documentation, because building for self-use and building for distribution are two completely different things, despite a common set of underlying steps.

We stop at the source code for many reasons. Primarily: limited access to all the platforms; limited resources; limited time. Building for distribution requires extra care (quality assurance) and extra steps that are critically important to make the difference between a good distribution and a dysfunctional one. Starting from a fresh source tree is strongly recommended to build a proper distribution. A developer’s source tree is seldom fresh. It contains the traces of the developers work, and this has the potential to contaminate the build process.

One of my activities over the past three months, as I took charge of our release process, has been to liaise with downstream distributors and help them produce better distributions, in the hope that new versions of Hugin will trickle down the distribution channel to end users faster. The result:

  • The Hugin FreeBSD port is now easier to maintain and maintainer Vasil Dimov tracks recent releases closely.
  • The Hugin Gentoo Linux ebuild is kept up to date by Thomas Pani, Gentoo maintainer.
  • Coordination with Andreas Metzler and the maintainer team at Debian resulted in the clean-up of some outstanding issues and better support of that platform and its derivatives, including Ubuntu. Unfortunately Hugin 2009.2.0 did not make the cut for Ubuntu 9.10, which has Hugin 0.8.0 available through its package manager.

Of course this is of little interest to people who do not use FreeBSD, Gentoo, Debian or Ubuntu. And the reality is that the majority of users are still on Windows. Systems without package managers (Windows and OSX) add effort and complexity to the building and distribution of binaries. Nevertheless, Harry van der Wolf, supported by some other Mac users, provides unofficial OSX binaries on a regular basis. And for Windows, I have recently guided a few users through parts of the build process. Unfortunately none of the resulting Windows binaries is stable enough to be declared official.

Priorities

Given the backlog of features to integrate, and the consequent knowledge that the current release of Hugin is short-lived and a newer, better one is just around the corner, why put in the effort to produce a Windows installer that will soon be obsolete? Ideally the installer should track development closely, but that seems to be too much effort at the moment. At some point, the gap between the installer and the code will grow wide enough to attract attention, and somebody may be motivated to provide a Windows installer.