Professional vs. Volunteer Open Source

A few weeks ago, a colleague asked me what I believed to be the biggest threat facing open source today. I answered that I think it’s full-time open source developers, and the effect they have on part-time volunteer developers.

Long ago (it actually hasn’t been very long, it just seems that way sometimes) open source was developed primarily by part-time hobbyist developers, working on their evenings and weekends on things that they were passionate about. Sure, there were full-time developers, but they were in the minority. Those of us working a few hours on the weekends envied them greatly. We wished that we, too, could get paid to do the thing that we love.

Now, 20 years on, the overwhelming majority of open source development is done by full-timers, working 9-5 on open source software. (No, I don’t have actual statistics on this. This is purely anecdotal, based on my daily observations. I’d love to see actual scientific data on this.) And those who are working nights and weekends are often made to feel that they are less important than those who are putting in the long hours.

Most of the time, this is unintentional. The full-timers are not intentionally marginalizing the part-timers. It just happens as a result of the time that they’re able to put into it.

Imagine, if you will, that you’re an evenings-and-weekends contributor to a project. You have an idea to add a new feature, and you propose it on the mailing list, as per your project culture. And you start working on it, a couple of hours on Friday evening, and then a few more hours on Saturday morning before you have to mow the lawn and take your kids to gymnastics practice. Then there’s the cross country meet, and next thing you know, it’s Monday morning, and you’re back at work.

All week you think about what you’re going to do next weekend.

But, Friday evening comes, and you `git pull`, and, lo and behold, one of the full-timers has taken your starting point, turned it in a new direction, completed the feature, and there’s been a new release of the project. All while you were punching the clock on your unrelated job.

This is great for the product, of course. It moves faster. Users get features faster. Releases come out faster.

But, meanwhile, you have been told pretty clearly that your contribution wasn’t good enough. Or, at the very least, that it wasn’t fast enough.

The Cost of Professionalism

And of course there are lots of other benefits, too. Open source code, as a whole, is probably better than it used to be, because people have more time to focus. The features are more driven by actual use cases, since there’s all sorts of customer feedback that goes into the road map. But the volunteerism that made open source work in the first place is getting slowly squelched.

This is happening daily across the open source world, and MOST of it is unintentional. People are just doing their jobs, after all.

We are also starting to see places where projects are actively shunning the part-timers, because they are not pulling their weight. Indeed, in recent weeks I’ve been told this explicitly by a prominent developer on a project that I follow. He feels that the part-timers are stealing his glory, because they have their names on the list of contributors, but they aren’t keeping up with the volume of his contributions.

But, whether or not it is intentional, I worry about what this will do to the culture of open source as a whole. I do not in any way begrudge the full-timers their jobs. It’s what I dreamed of for years when I was an evenings-and-weekends open source developer, and it’s what I have now. I am thrilled to be paid to work full time in the open source world. But I worry that most new open source projects are completely corporate driven, and have none of the passion, the scratch-your-own-itch, and the personal drive with which open source began.

While most of the professional open source developers I have worked with in my years at Red Hat have been passionate and personally invested in the projects that they work on, there’s a certain percentage of them for whom this is just a job. If they were reassigned to some other project tomorrow, they’d switch over with no remorse. And I see this more and more as the years go by. Open source as a passion is giving way to developers working on whatever their manager has assigned, with no personal investment whatsoever.

This doesn’t in any way mean that they are bad people. Work is honorable.

I just worry about what effect this will have, and what open source will look like 20 years from now.

Blues Brothers

We watched Blues Brothers last night.

Like many movies that have been hyped and built up to me over the years, this one completely failed to live up to the promise.

The premise of the movie is that Dan Aykroyd repeats a particular kinda funny line a dozen times, and lots of famous people are in unexpected situations.

I suspect that if I had watched Saturday Night Live more as a kid, the movie would have had more appeal. I didn’t, and it didn’t.

So I give this movie a B+ for a pretty decent, if utterly absurd, car chase at the end. But a D in all other respects. The plot was weak and the acting was terrible.

Ironically, as is often the case in situations like this, I think I actually would have enjoyed the movie more had it not been promised as the greatest movie of all time for so many years, by so many people.

Switching to KDE

I’m in the midst of switching from Gnome to KDE/Plasma. I’m doing this because Kdenlive crashes a lot less under KDE, and the every-3-minutes crashes were making editing videos amazingly painful.

I’m actually really liking it. The biggest problem right now (less than 24 hours in) is muscle memory making unexpected things happen.

One of the things I liked most about MacOS was that I had different applications on different virtual desktops, and I had my fingers trained so that if I wanted, say, to go to email, that was on desktop 2, and alt-2 took me there. This was never possible (or, at least, easy) on Gnome. But it’s easy on KDE, and I’m rapidly getting back into that habit, even though it’s been roughly 5 years since I’ve used a Mac.

There are, of course, small irritations, having more to do with what I’m used to than whether they are “good” or “bad”. But I think, over all, in addition to improving how long it takes to edit video, this will be a net win for productivity.

We’ll see.

Student supercomputing at ISC18

This is the second year that I’ve attended ISC-HPC in Frankfurt. One of the highlights of this event is the student cluster competition. This year there are 12 teams competing for the fastest and most energy-efficient student cluster.

This year I interviewed about half of the teams. It’s hard to find a time when they’re not actively working on projects, and one doesn’t want to interrupt.

There are several parts to the contest. In addition to solving some simulation problems that they’re presented with months in advance, there is always a secret application that is provided when they arrive on site. They have to program and optimise the solution, and then try to run it faster than any of the other teams. And they have to do all that without exceeding 3 kilowatts of power usage.

This year there are teams from Africa, Asia, South America, and Europe.

The awards ceremony is this afternoon, when we’ll find out which of these teams of young student geniuses will prevail.

Feathercast at FOSS Backstage

This week I am at FOSS Backstage, in Berlin. It’s a conference about what goes on behind the scenes in open source – issues of governance, licences and other legal stuff, community management, mentoring, and so on. Once a project gets beyond a few people, these issues start coming up.

As was observed in this morning’s keynote, this doesn’t feel like a first-time conference, but rather like something that’s been running smoothly for a while. I’m very impressed with the event.

I have been doing interviews for Feathercast this week, and have, so far, 13 interviews captured. So this is going to take some time to edit and publish. They’ll be on the Apache Software Foundation YouTube channel, as well as on the Feathercast site.

I’ve been talking to speakers from the event, and, since all of the sessions are being videoed, I’m trying not to simply reproduce their talk, but get some information about the project, organization, or concept that they were presenting. I hope you like what I’ve done.

Follow @FeatherCast on Twitter to find out when the episodes are published.

Upcoming events (June and beyond)

I’m about to head out for a few events again, and I’m in the process of planning several other events.

First, I’ll be in Berlin for FOSS Backstage, Berlin Buzzwords, and the Apache EU RoadShow. This is a trifecta of open source events happening at the Kulturbrauerei in Berlin. I’ll be speaking at Backstage about mentoring in open source, which, you might know, is something I’m passionate about. I’ll also be doing interviews for Feathercast, so if you’re going to be there, find me and do an interview.

I’ll be home for a week, and then I’ll be attending the ISC-HPC Supercomputing event in Frankfurt. This is the second time I’ll attend this event, which was my introduction to Supercomputing last year. I’ve learned so much since then, but I’m still an HPC newbie. While there, I hope to spend most of my time speaking with the EDUs and research orgs that are present, and doing interviews with the student supercomputing teams that are participating in the Student Cluster Competition.

Beyond that, I’m planning several events, where I’ll be representing CentOS.

In August, I’ll be attending DevConf in Boston, and on the day before the conference, we’ll be running a CentOS Dojo at Boston University. The call for papers for that event is now open, so if you’re doing anything interesting around CentOS, please submit a paper and come hang out with us.

Later in August, I will (maybe? probably?) be going to Vancouver for Open Source Summit North America (formerly Linuxcon) to represent CentOS.

In September, I’ll be at ApacheCon North America in Montreal. The schedule for this event is published, and registration is open. You should really come. ApacheCon is something I’ve been involved with for 20 years now, and I’d love to share it with you.

October is going to be very full.

CentOS is proudly sponsoring Ohio LinuxFest, which apparently I last attended in 2011! (That can’t be right, but that’s the last one I have photographic evidence for.) We (CentOS) will be sharing our booth/table space with Fedora, and possibly with some of the projects that use the CentOS CI infrastructure for their development process. More details as we get closer to the event. That’s October 12th – 13th in Columbus.

Then, on October 19th, we’ll be at CERN, in Meyrin, Switzerland, for the second annual CERN CentOS Dojo. Details, and the call for papers, are on the event website.

Immediately after that, I’ll be going (maybe? probably?) to Edinburgh for Open Source Summit Europe. This event was in Edinburgh a few years ago, and it was a great location.

Finally, in November, I plan to attend SuperComputing 18 in Dallas, which is the North American version of the HPC event in Frankfurt, although it tends to be MUCH bigger. Last year, at the event in Denver, I walked just over 4 miles one day on the show floor, visiting the various organizations presenting there.

So, that’s it for me, for the rest of the year, as far as I know. I would love to see you if you’ll be at, or near, any of these venues.

Web server performance problem solved, years later

(Geeky post alert. If you’re reading this on Facebook, the links and formatting are going to be all messed up.)

15 years ago, I wrote a blog post about a stereo cabinet glass door that spontaneously exploded. For some reason, this post attracted a lot of attention. If I had written it a few years later, one would say it “went viral.” It received tens of thousands of page views, and 330 comments.

At some point, I decided to export it to a static page, since every page load was causing my server – at the time, running on a Pentium in my home office across my DSL line – to slow down horribly. In the process, I managed to delete the page entirely (a long story within a long story) and I grabbed the page off of the Wayback Machine.

That page is HERE, by the way.

Each comment has a Gravatar logo next to it, which, due to the way curl (the tool I used to retrieve the static copy) works, has a name like avatar(230).php but is actually a jpeg file. That means that every time the page loads, it makes 330 calls to the php engine, which errors out because the file in question isn’t a php file, but is an actual on-disk jpeg file. Like this one, for example.

Then, several years ago, I switched from using mod_php to using php-fpm, which does the same thing, except more efficiently.

Finally, at some point, I added a mod_security ruleset that attempted to detect when people were DDoSing my site – the barrier it set was more than 30 requests in under a second.

These various things, all combined, resulted in a situation where whenever someone attempted to view that page, it would cause my server to crawl to a halt, and the visitor to be added to my mod_security deny list. This was not desired behavior.

Of course, this is all in retrospect. All I knew was that several times a day, I’d get failure notices from my server monitoring, and by the time I got there to see what was happening, the problem had cleared up. So, no big deal, right?

This has been going on for years.

Today, looking at error logs trying to figure out what was happening, I suddenly put all of the pieces together, and fixed the problem, in less time than it has taken me to write this blog post. The solution has a few parts.

First, we exclude anything in the /files/ directory from being processed by php:

# (old line) ProxyPassMatch ^/(.*\.php(/.*)?)$ fcgi://$1
ProxyPassMatch ^/(?!files)(.*\.php(/.*)?)$ fcgi://$1

That adds the (?!files) negative lookahead, which says “only do this if the path DOESN’T start with ‘files’.”
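If the lookahead is unfamiliar, here’s a minimal sketch of how it behaves, using Python’s re module (whose lookahead syntax matches the PCRE syntax Apache uses here):

```python
import re

# The pattern from the ProxyPassMatch line above, minus the fcgi:// target.
# (?!files) is a negative lookahead: the match fails if the text right
# after the leading "/" begins with "files".
pattern = re.compile(r"^/(?!files)(.*\.php(/.*)?)$")

# A normal PHP request still matches, and so still gets proxied...
print(bool(pattern.match("/blog/index.php")))         # True
# ...but the misnamed .php files under /files/ no longer do.
print(bool(pattern.match("/files/avatar(230).php")))  # False
```

One caveat: the lookahead excludes any path that merely *begins* with “files”, so a hypothetical /filesystem.php would also be skipped; using (?!files/) instead would be stricter.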

Next, we turn off the mod_security rule specifically for these requests:

<LocationMatch "Exploding">
SecRuleEngine Off
</LocationMatch>

Which says, don’t run the SecRuleEngine for requests that contain ‘Exploding’, which is in the URL of the static copy of the blog post.

Finally, I have to tell httpd that the .php files in the static copy are, in fact, jpeg files:

<Directory /var/www/vhosts/drbacchus/files>
AddType image/jpeg .php
</Directory>

This has the added benefit that if anybody dropped a .php file in my files directory, it would be defanged, so to speak, and wouldn’t execute.


300 TFTC

I just found my 300th geocache! I started Geocaching in March of 2003. It was a difficult time and I needed a reason to get out of the house and do something other than sit and stare at the walls. And so I started geocaching. I met a lot of good friends while geocaching, although I’m not in touch with very many of them anymore. Today I’m up in New Jersey for a wedding and took the opportunity to go out and get the last two to push me to the 300 line. Thanks for the cache.


I asserted to my daughter last week that a paper cup with water in it will not catch fire if placed directly in a fire. So, of course, we had to try it.

I was a little nervous, but it turns out this is completely true. The cup burned down to the water line, and then didn’t burn until the water had completely boiled off. The *instant* the last of the water boiled off, the cup burst into flames and was gone almost immediately. (Animated version of image is here.)

Why? Well, it’s because water boils at 212°F (100°C) and paper combusts at around 451°F (233°C), so as long as there is water in the cup, the heat of the cup is being conducted away into the water to heat it towards boiling, and the cup remains too cold to ignite. Once the water starts boiling, the cup is full of steam, which is quickly carrying away the heat. The moment the water has all evaporated, though, the cup is abruptly at combustion temperature and goes up in a flash.
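If you want to sanity-check those numbers yourself, the standard Fahrenheit-to-Celsius conversion is all you need; a quick sketch:

```python
def f_to_c(fahrenheit):
    """Convert a temperature from Fahrenheit to Celsius."""
    return (fahrenheit - 32) * 5 / 9

# Water boils at 212°F...
print(round(f_to_c(212)))  # 100
# ...and paper ignites around 451°F, far above water's boiling point.
print(round(f_to_c(451)))  # 233
```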

You should try it. It’s a great way to impress your kids. Or win a bet.

Blogging, and feedreaders

A week or two ago I had a conversation with Stormy about the lost art of blogging, and blog reading. Long ago, Google Reader was a daily routine, and kept me in touch with the blogs that I wanted to read, and made me more likely to write blogs of my own. When Google Reader died, nothing really took its place, and the things that kinda sorta did – Facebook and Twitter – do a terrible, terrible job of giving me the sources I actually want, and, instead, feed me a steady diet of pablum and clickbait.

Yesterday, Anil Dash tweeted about Google Reader, and made some great observations about what an important tool it was for a certain population.

The entire Twitter thread is worth reading … and would have made a good blog post.

These two things have inspired me to try Feedly again. It is much better than the last time I tried to use it, and I have high hopes that I’ll actually stick with it this time, and make it part of my daily routine again. I hope. I also hope that this will result in my actually writing again, like I used to do, on a nearly daily basis.

The Margin Is Too Narrow