Whelp, the latest IPCC report doesn’t beat about the bush. It whacks right into it. And, yes, while governments and companies need to take responsibility and finally start to DO something, we all still have a personal responsibility.
As KDE, we're working on the energy consumption of our applications (which has already shown something we, the Krita developers, can do right away). I've switched to buying the electricity I use for building Krita (and everything else) from the local windmill coop… I don't drive, never have, and that pile of wood next to the stove dates back to 2018: we're not going to burn it any more. (Why would we? It's too warm in winter now. When we came to live here in 2007, we needed that stove because it was too cold otherwise!)
All of that is peanuts.
But there is one place where we, as a free software community, have to take responsibility (that word again) and stop doing something we absolutely love to do, and for which we’ve been pining.
In-person conferences with a world-wide attendance.
I hate having to say this, I hate that I have to say it is inevitable, and I hate that many of my Krita collaborators won't actually get to meet me, the improved second release, but… we, as a free software community, need to have this discussion, and until now, all I've seen has been silence.
Flying always scared me irrationally, but that’s not the reason I think we should stop it. Cheap air travel is only possible because it’s subsidized, untaxed and airlines are being kept alive only as a matter of national pride. And that’s the only reason we can actually afford to fly people from all around the world to conferences, sprints, Akademies and the like.
Our conference model is built upon something we wouldn’t be able to afford if air travel was priced in accordance with its real cost.
And, we’re probably not going to miss the presentations, much? Most presentations at free software conferences are, honestly, a bit meh. A conference with presentations is something we’ve copied from Academia. What we’re really missing is the meetings, the chats, the sitting together at a table and making notes, the occasional pair programming, the making new acquaintances, the gossip.
And the remote conference format pretty much only provides the presentations, with a bit of chat, which we already have next to it. And I really cannot handle remote conferences myself. For meetings and BoF sessions, I get a headache in a minute or ten. For presentations, especially when pre-recorded, I get bored in under a minute. That sucks.
Maybe we can find a better on-line get-together activity. Max Miller from Tasting History holds a virtual cocktail party with his Patreon supporters. Maybe the next virtual conference could start by planning and facilitating the socializing, and only add a program as an afterthought?
But whatever, we still should not go back to burning enormous amounts of kerosene for our get-togethers. Would it be too much to say that that would be criminal?
(Recommended music while reading this: Peggy Seeger singing "The Housewife's Lament".)
I’ve been having nightmares, well, really weird dreams for quite some time now. Things like finding myself trying to clear up the floor of a large DIY store just before the first opening day, and not managing it, no matter how many friends were helping. Trying to pack all my luggage when leaving a rented holiday home, and never being able to fit everything in. Trying to pack furniture and stuff into boxes before moving out because someone needs to replace the floor, and having to squash ancient china cups in a carton box, breaking them all in the process. If I manage to wake up, I’m exhausted and tired.
And I guess I’ve grokked why I keep having these dreams. And it’s not lockdown or Covid-19. It’s this:
When we held our last fundraiser, in 2018, we decided to focus for a year on stability. We had about 300 open bugs then, and we've never managed to get close to that number since. And yet, in the past year, we've closed more than a thousand issues.
Clearly, something is wrong here. This isn’t sustainable. We have managed to deflect a lot of reports that were actually cries for support to a new community website, krita-artists.org and to r/krita. Our experiment with ask.krita.org wasn’t successful, but krita-artists really works. We still do get a lot of worthless bug reports, but I think it could be a lot worse, now that we really have millions of users on four platforms.
But it's still not sustainable. Every report needs to be read, often deciphered, triaged, investigated, fixed — and all those steps involve communication with the reporter. Even the most useless of bug reports can take an hour to resolve.
11:32:47: Eep, incoming bug interrupt. Bug 422729 – No action (assignable shortcut) for Magnetic Selection Tool.
11:33:24: Check: yes, the reporter is right.
11:34:32: Confirm. Check code. Notice that all .action files that define actions in the plugins/tools folder are wrong.
11:37:35: Add a note to the bug report. Fix all of them.
11:47:49: Reporter asks a question that needs to be answered, but is not very relevant.
11:56:29: Answer given, while I'm building Krita with my fix.
11:57:03: Tested fix: it isn't complete, more work needed. Oh wait, tested the wrong build. Tested the right build, but the new action isn't found.
12:00:00: Debug more.
12:25:45: Fixed some more broken XML files, ready to make the commit.
12:27:41: Bug is fixed.
12:31:33: Fix is backported to the krita/4.3 branch, plus an extra commit that will in future print a warning if the XML parser cannot read the .action files.
Back to this blog post… This was a fast fix: it took about an hour, and in between I also had to run downstairs to receive my new book on Jan van Eyck.
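That extra warning commit boils down to a simple idea: don't silently skip an .action file that isn't well-formed XML. Here is a sketch of the idea in Python (Krita's real code is C++/Qt; the function name and the warning text are made up for illustration):

```python
# Sketch: instead of silently ignoring an .action file that isn't
# well-formed XML, warn loudly so the breakage is noticed immediately.
# A broken .action file used to mean the actions just never showed up.
import xml.etree.ElementTree as ET

def load_action_file(path):
    """Parse one .action file; return the root element, or None on failure."""
    try:
        return ET.parse(path).getroot()
    except ET.ParseError as err:
        # This warning is the whole point of the extra commit.
        print(f"WARNING: could not parse action file {path}: {err}")
        return None
```

The real fix also repaired the broken XML files themselves; the warning is there so the next breakage surfaces in minutes instead of in a bug report.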
So, this bug report reported a small, but real issue and uncovered a deeper issue. Without the report we probably wouldn’t have noticed the issue. Krita is better because of the report, and the reporter is one of our “known-good” reporters: he has reported 16 bugs this year, one is still in Reported state because we couldn’t reproduce it, six are Confirmed, two are Assigned and seven are Resolved Fixed. Yu-Hsuan Lai has helped us make Krita materially better!
But… Even with me, Dmitry, Agata, Wolthera, Emmett, Eoin, Ivan, Mathias and more people fixing bugs all the time, we’re not getting the numbers down. The floor remains littered, the luggage unpacked and the china unboxed.
Of course, we're not the only ones in this situation: every project that takes bug reports from the general public faces it. Some people argue that any bug with a workaround should be documented and closed, but workarounds don't make for a happy workflow. Others argue that every bug report older than two weeks, or a month, should be closed because it's clearly not a priority. But the issue reported is real, and will get reported over and over again, with no way of chaining the reports together.
It's also possible to treat reports like a funnel: first ask people to report on a user-support forum like krita-artists.org, only create a bug report when it seems to be a real bug, and only put it in a queue once it's triaged. But the problem with that is that nobody's queue is ever empty. Queueing can be done by assigning the bug to one of us: currently we have 64 bugs in the Assigned state. That means, on average, about ten bugs in each person's queue, which in turn probably means between a week and a month of tasks already lined up… Which means we shouldn't actually look at any newly reported bugs before we've worked through our queues. (Another option would be to create issues on invent.kde.org for every bug we really want to work on, something we used to do with Phabricator… but pretty quickly we drowned in those tasks as well.)
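The queue arithmetic, as a back-of-the-envelope sketch; only the 64 assigned bugs is a real number from our tracker, the team size and the hours-per-bug range are my guesses:

```python
# Rough queue arithmetic: why the queues never empty. Only assigned_bugs
# is a real figure; active_fixers and the hour estimates are guesses.
assigned_bugs = 64
active_fixers = 6                          # rough guess at regular bug fixers
per_queue = assigned_bugs / active_fixers  # about ten bugs per person
hours_low, hours_high = 1, 8               # guessed effort range per bug
backlog_low = per_queue * hours_low        # hours of work already queued...
backlog_high = per_queue * hours_high      # ...before any new report is read
print(f"~{per_queue:.0f} bugs per person, "
      f"{backlog_low:.0f}-{backlog_high:.0f} hours of backlog each")
```

At a handful of volunteer hours per week, that range works out to somewhere between a week and a month of queued tasks per person.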
Which in turns means that either reporting bugs is useless or our todo queues are useless.
And those todo queues don't even include working on new features, refactoring old code to decrease technical debt, or working on regressions — that is, features we broke accidentally.
So, even though bug reports help make Krita better, the one thing I’m sure of is that we shouldn’t do anything that will make it easier to report bugs. We have more than enough reports to keep us busy.
It would be nice, though, if we could make it easier to make good bug reports, for instance by automatically attaching our system information and usage logs to a bug report made with the Help->Report Bug functionality.
Theoretically, that can be done: Bugzilla offers a complete REST interface. I seem to remember that we had something like that in KDE years ago, but that it broke with a Bugzilla update. There's a new version of Bugzilla coming, apparently, and maybe it'll be worthwhile to investigate this again in the future.
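Bugzilla 5.x does indeed have a REST API that could carry such a report. A sketch of what the Help->Report Bug plumbing might look like; the product and component names, the attachment file name and of course the API key are placeholder assumptions here, not Krita's actual values:

```python
# Hedged sketch: file a bug on a Bugzilla 5.x instance via its REST API
# and attach a usage log. The component name, log file name and API key
# handling are placeholders, not what Krita actually uses.
import base64
import json
from urllib import request

BUGZILLA = "https://bugs.kde.org"  # KDE's Bugzilla instance

def build_bug_payload(summary, description, api_key):
    """JSON body for POST /rest/bug, per the Bugzilla 5.x REST API."""
    return {
        "api_key": api_key,
        "product": "krita",
        "component": "general",   # assumed component name
        "version": "unspecified",
        "summary": summary,
        "description": description,
    }

def build_attachment_payload(bug_id, log_bytes, api_key):
    """JSON body for POST /rest/bug/{id}/attachment; data must be base64."""
    return {
        "api_key": api_key,
        "ids": [bug_id],
        "data": base64.b64encode(log_bytes).decode("ascii"),
        "file_name": "krita-usage-log.txt",
        "summary": "Usage log and system information",
        "content_type": "text/plain",
    }

def post(path, payload):
    """POST a JSON payload and decode the JSON response."""
    req = request.Request(
        BUGZILLA + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return json.load(request.urlopen(req))
```

The real work would of course be in generating the system information and log, and in error handling; this only shows that the API surface is there.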
Argh… I think the dreams will continue for a while.
When I read “Qt offering changes 2020” yesterday, my first reaction was to write a pissy blog post. I’m still writing a blog post with my thoughts about the changes, but I’ll be nice. There are three parts to this post: a short recap of my history with Qt and then my thoughts on what this means for KDE, for Krita and for free software.
I started programming with Qt and PyQt after I read about Qt in Linux Journal, which I was subscribing to back in 1996. That means I've been using Qt for about 25 years. I initially wanted to write an application for handling linguistic field data, and I evaluated GTK+, wxWidgets, Qt, Tk, FLTK, V and a few others that have been forgotten in the mists of time. I chose Qt because it had great documentation, a consistent API, the most logical (to me…) way of doing things like setting up a window with a menu or handling scrollbars, and finally because it made C++ as easy as Java.
I've stayed with Qt for 25 years because, through all the vicissitudes, it kept those qualities. Mostly. There are now a lot more modules, most of which aren't necessary for my work; there are people working on Qt who seem to be a bit ashamed that Qt makes C++ as easy as Java and want to make Qt as computer-sciency as C++; there have been the licensing issues with the QPL, the changes to GPL, to LGPL, and then some modules back to GPL again; there have been the Nokia years, the Digia times.
But I’ve always felt that I could build on Qt. And the reason for that is the KDE Free Qt Foundation. To summarize: this is a legal agreement that keeps Qt free software. If the Qt company won’t release a version of Qt under a free software license within a year of a release, Qt becomes licensed under the BSD license.
With yesterday’s message, the Qt company is searching the utter boundaries of this agreement. To recap:
Long Term Support releases remain commercial only (the post doesn’t mention this, but those releases also need to be released under a free software license within a year to adhere to the agreement, at least to my understanding).
Access to pre-built binaries will be restricted: either put behind an account wall or available only to commercial license holders.
And there’s a new, cheaper license for small companies that they can use to develop, but not deploy their work to customers.
This is a weirdly mixed bag of “changes”. The last one is a bit silly. Even the “commercial” side of Krita is too big to qualify! We’re five people and have a budget of about 125k…
The middle point is worth considering as well. Now there is nothing in any free software license that talks about a duty to make binaries available.
For a very long time, Krita, when part of KOffice, only made source tarballs available. Right now, we, like the Qt company, have binaries for Linux, Windows, macOS and (experimentally) Android. The Windows binaries are for sale in the Windows Store and on Steam, the Linux binaries are for sale on Steam. And all binaries can be downloaded for free from krita.org and other places.
This move by the Qt company would be like the Krita project shutting down the free downloads of our binaries and only making them available in the various stores. It would be legal, but not nice, and would cost us hundreds of thousands of users, if not millions. It is hard not to wonder what the cost to the Qt community will be.
The first change, the restriction of the LTS releases to commercial customers, has all kinds of unexpected ramifications.
First off, Linux distributions. Distributions already rarely use LTS releases, and in any case, with Qt 3 and Qt 4 there didn't use to be any LTS releases. But distributions do have to keep older versions of Qt around for unported applications for a longer time, so they do need security and bug fixes for those older versions of Qt.
Then there’s the issue of how fixes are going to land in the LTS releases. At the last Qt contributor summit the Qt project decided on a process where all fixes go through “dev” and then are ported to the stable branches/LTS branches. That’s going to break when Qt6 becomes dev: patches won’t apply to Qt 5.
Albert has already blogged about this change as well, but he only really focused on distributions and KDE Plasma; there is of course much more to KDE than the Plasma desktop and Linux distributions.
As for Krita, we’re using Qt 5.12 for our binaries because we carry a lot of patches that would need porting to Qt 5.13 or 5.14 and because Qt 5.13 turned out to be very, very buggy. For Krita, using a stable version of Qt that gets bug fixes is pretty important, and that will be a problem, because we will lose access to those versions.
In my opinion, while we've done without stable LTS releases of Qt for years, it's now inevitable that Qt 5.15 will be forked into a community edition that gets maintained, hopefully not just by KDE people, but by everyone who needs a stable, LGPL-licensed release of Qt 5 for years to come.
Splitting up the Qt community, already responsible for handling a huge amount of code, is not a good idea, but it looks like the Qt company has made it inevitable.
And once there’s a community maintained fork of Qt, would I contribute to the unforked Qt? Probably not. It’s already a lot of work to get patches in, and doing that work twice, nah, not interested. If there’s a maintained community version of Qt 5, would I be interested in porting to Qt 6? Probably not, either. It isn’t like the proposed changes for Qt 6 excite me. And I don’t expect to be the only one.
As for the more intangible consequences of these changes: I'm afraid those aren't so good. Even in our small Krita community, we've had people suggest it might be a good idea to see whether we couldn't port Krita to, say, Blender's development platform. This would be an all but impossible task, but that people start throwing out ideas like that is a clear sign that the Qt company has made Qt much less attractive.
If I were to start a new free software project, would I use Qt? Last Sunday the answer would have been "of course!". Today it's "hm, let's first check the alternatives". If I had a big GTK-based project that was really hampered by how bad, incomplete and hard to use GTK is, would I consider porting to Qt? Same thing. If the KDE Free Qt Foundation didn't have that agreement with the Qt company, the answer would probably be no; right now, it's still probably a yes.
Now, as for the actual announcement: I think the way the Qt company represents the changes is actually going to harm Qt's reputation. The announcement is full of weasel-wording…
“General Qt Account requirement” — this means that in order to download Qt binaries, everyone is going to need a Qt account. Apparently this will make open-source users more eager to report bugs, since they will already have an account. And, yay, wonderful, you need an account to access the completely useless Qt marketplace. And it allows, and now we’re getting at the core reason, the Qt company to see which companies are using the open source version of Qt and send salespeople their way. (But only if the people making the accounts are recognizable, of course, not if they make the account with their gmail address.) When I was working for Quby, I was unpleasantly surprised at how expensive Qt is, how little flexibility the Qt company shows when dealing with prospective customers — and how we never downloaded the installer anyway.
“LTS and offline installer to become commercial-only” — this will break every free software project that uses services like travis to make builds that download Qt in the build process. Of course, one can work around that, but the way the Qt company represents this is “We are making this change to encourage open-source users to quickly adopt new versions. This helps maximize the feedback we can get from the community and to emphasize the commercial support available to those with longer product life cycles that rely on a specific Qt version.” Which of course means “our regular releases are actually betas which we expect you freeloaders to test for us, to provide bug fixes for us, which we can use to provide the paying customers with stable releases”.
And yes, theoretically, the main development branch will have all bug fixes, too, and so nobody misses out on those bug fixes, and everyone has stability… Right? The problem is that Qt has become, over the years, bigger and buggier, and I doubt whether releases made fresh off the main development branch will be stable enough to provide, say, a stable version of Krita to our millions of users. Because, apart from all the bug fixes, they will also have all the new regressions.
“Summary”. “The Qt Company is committed to the open-source model of providing Qt technology now and in the future and we are investing now more than ever. ” — though only to the extent that the Qt Company is forced to adhere to the open-source model by the KDE Free Qt Foundation.
“We believe that these changes are necessary for our business model and the Qt ecosystem as a whole. ” — my fear is that the Qt Company will not survive the fracturing of the Qt ecosystem that this decision practically guarantees.
Well, that was three interesting articles on the same topic on the same day, namely, billionaires. And read in turn they explain exactly why the Linux Desktop is still at such a marginal market share, and why that’s not because we, who work hard on it, are failures who have been doing the wrong thing all the time. It is in the first place policies, bought with money, that allowed people to build monopolies, taxing individuals and so becoming even more rich and powerful.
(Similarly, it’s not individuals through their choices who are destroying the planet, it is policies bought by the very rich who somehow believe that their Florida resorts won’t sink, that they won’t be affected by burning up the planet so they can get richer. But that’s a digression.)
So, the first article, by Arwa Mahdawi, discussed the first part of this problem: with enough money, all policies are yours. It's just a squib, not the strongest article, but it lists the ways to become a billionaire:
Exploit a monopoly: this is illegal under the laws of the United States.
Exploit insider information. This is also illegal.
Buy a tax cut. This seemed uniquely USA'ian until the Dutch prime minister Rutte promised to abolish the dividend tax for Unilever. That would seem to be illegal as well, but IANAL.
Extort people who already have a lot of money. Extortion is illegal.
Inherit the money. This is the only legal way to become a billionaire.
Now the article entitled What Is a Billionaire, by Matt Stoller was posted to the Linux reddit today. Not surprisingly, many people completely didn’t get the point, and thought it was irrelevant for a Linux discussion forum, or was about capitalism vs socialism, or outdated Microsoft bashing.
However, what it is about is this question: why is Bill Gates not in jail for life, stripped of all his wealth? He's a criminal, and his crime has directly harmed us, the people working on free software, on the Linux Desktop.
So, to make things painfully clear: Bill Gates made it so that his company would tax every computer sold no matter whether it ran Windows or not. If a manufacturer wanted to sell computers running Windows, all the computers it sold were taxed by Microsoft. He would get paid for the work a Linux distribution was doing, and the Linux distribution would not get that money.
That means there's a gap of twice the amount of this illegal tax between Microsoft and a Linux distribution: if the distribution wanted to earn what Microsoft earned on a PC sale, the machine would have to carry both the Microsoft tax and the distribution's own, equal fee.
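With an invented per-machine fee, the arithmetic looks like this:

```python
# Toy numbers to make the gap concrete; the 50-per-machine fee is invented.
ms_tax = 50                        # what Microsoft collects on every PC sold
distro_fee = 50                    # what the distribution would need to match it

windows_pc_fees = ms_tax                # the Windows machine carries 50
linux_pc_fees = ms_tax + distro_fee     # the Linux machine carries 100,
                                        # because the Microsoft tax still applies
assert linux_pc_fees == 2 * ms_tax      # hence "a gap twice the tax"
```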
This cannot be done.
And I know, this has been said often before, and discussed often before, and yeah, I guess, poor Bill Gates, if he hadn’t been bothered so badly with the hugely unfair antitrust investigation, he would also have been able to monopolize mobile phones, and the world would have been so much sweeter. For him, for certain.
I guess we didn’t do all that badly with the Linux Desktop market share being what it is. This is a fight that cannot be won.
Monopolies must be broken up. It’s the law, after all.
I was going to attend the Linux App Summit, and even going to speak, about Krita and what happens to an open source project when it starts growing a lot, and what that would mean for the Linux desktop ecosystem, and so on. But that's not going to happen.
There was really bad flooding in the south of France, which damaged the TGV track between Montpellier and Barcelona. When we went home after the 2018 Libre Graphics Meeting, we took the train from Barcelona to Paris, and noticed how close the water was.
Well, I guess it was too close. And this is relevant, because I had planned to come to Barcelona by train. It’s a relatively easy one-day trip if everything is well, and gives ten hours of undisturbed coding time, too. Besides, I have kind of promised myself that I’m not going to force myself to take planes anymore. Flying terrifies me. So I didn’t consider flying from Amsterdam for a moment — I was only afraid that other people would try to force me to fly.
Then I learned that there is a connection: I would take the train to Montpellier, then a bus to Béziers, and then a train again. That would make the whole trip a two-day journey, with, as the travel agency I bought the tickets from recommended, a stop in Paris. That advice was wrong.
I should have taken my connecting train to Montpellier, got a hotel there, and continued the next day. At this point I was like, okay… I’m just going home. Which I am doing right now. I bet the trip back would be just as difficult, and my Spanish isn’t as good as my French, so I would have an even harder time getting people to tell me what to do exactly, when to travel and where to go.
At the 2019 Libre Graphics Meeting, illustrator Livio Fania presented a heartfelt plea for more professionalism in libre graphics.
And that was the moment I began to think a bit. What is it that makes one project professional, and another not? Where, in this case, I’d take “professional” to mean “someone can depend on it so they can earn their daily bread with no more than the problems you always have with any software, because all software sucks, and hardware even more so”.
As Livio said in his presentation, funding makes a difference. If a project can fund its development, its continuity will be better, it will be better able to implement its vision and deliver what it promises, simply because funding equals time. That’s also what I tried to argue in my previous blog post.
In practice, it's very hard to find funding for applications that people do not earn their income with. Of course, there are very accomplished free and open source video players, editors and file managers; those tend to be completely volunteer-driven and very underfunded. And there's a reasonably successful web browser, which is actually funded quite well — but it's funded, mainly by Google, to avoid Google being broken up as the monopolist that it is.
And of course, there are applications that people use daily and earn their bread with, and that are completely unfunded, often by choice, even if they have donation buttons and money in the bank: GIMP, Inkscape, Scribus, various RAW photo applications. This means that those projects are even more squeezed for time than a project like Krita. Just think how much difference Tavmjong Bah would make if he could be funded to work on Inkscape full-time! Tav gets $107 a month through Patreon… (Which I contribute to.)
But though Livio focused on the need for funding to accelerate development, and that's the first step, there's more to creating a professional application.
The second step is: focus on the user’s needs. That starts with knowing what you want to create. If your goal is to implement a standard specification fully, as is the case with Inkscape, then is that goal sufficiently user-oriented to form the basis for an application designers can earn their daily bread with? It’s possible, but it’s something to always be aware of.
And, as I argued recently, isn't looking inward, discussing the wheres and whys of Free Software, no matter how enjoyable, taking away time better spent getting to know your userbase, their needs and… the competition?
I would not be surprised if visiting the next Linux Application Summit would be less useful for Krita and its users than a week-long training in Photoshop would be for me, the Krita maintainer. We've all been there: we've all claimed we're not competing with the big, sovereign, proprietary applications that are seen as the standard in the field where our application is the "open source alternative".
But if you don’t compete, you cannot win. And if you don’t win, then millions of users will not use free and open source software. And I happen to believe that software freedom is important. And I’m slowly gaining the confidence to say: I am competing.
(And we are. Competing. Otherwise Adobe wouldn’t have been adding so many new features for painters and illustrators to their latest Photoshop release, some of them features Krita has had for a decade.)
And when we compete, which makes people trust us, and when our users fund our efforts, then we need to take another step towards professionalism.
That is, committing to continuity. I’ve always said “free software never dies”, but it does. Look at Amarok, which is thoroughly dead. Of course, Krita has been released since 2004, and there’s quite a bit of continuity already. But this does take commitment, which also needs to be public.
Finally, there's the point where you, as a full-time project member, as the project maintainer, can no longer say "We don't provide binaries. Get the source and build it, or wait for a distribution". You have to provide your application in a form your users can use directly; if you ask them to build it, it's their time you're telling them to spend on something they don't earn money with.
And then… We have to stop making excuses. Which is probably the hardest thing, because all software sucks, and all code sucks, and sometimes it’s impossible to fix a bug in Krita, because it’s in the OS, the Window manager or the development platform. Or the tablet driver, oh dear, the tablet drivers. But it’s your customer, your supporter, the person who depends on your work to earn their money who is stopped from doing that. And suddenly workarounds and hacks become necessary.
So: fund a core team of developers to start with; focus on the field instead of the free software community; have the will to compete; provide continuity; make sure your software can be used directly; and finally, don't make excuses when there are problems, but fix them.
In my previous blog post, I mentioned that being part of the wider Free Software community can be a drag on a project.
In this post, I want to put that in perspective. Elaborate a bit, if you will.
I wrote my first GPL'ed application in 1993 (it was a UUCP mail and Usenet client written in –gasp!– Visual Basic), and since then I've spent countless hours pleasantly occupied contemplating the whys and wherefores of Free Software. I am part of the KDE community, the Libre Graphics community, the Free Software community. Inside our communities, we've got engrossing discussions about licensing, diverting flamewars about business models, scintillating technical developments, awesome foes to smite, forks to be praised or lambasted. There are conferences to be visited, or be perorated at, sprints to organize and organizations to join.
In short, you can develop Free Software within the Free Software community while all the while looking in: "This is important for Free Software. That is bad for Free Software. If I make this effort, or take this initiative, or mentor this student, Free Software will improve."
And that’s fun, and gives one the satisfied feeling of making the world a better, freer place. And I care about software freedom, really I do!
It does take a lot of time, though, and that time — is that really spent making the world better for the people who use my software? Is all that caring about the ins and outs of the Free Software community the best use of my time, when I’m working on an end-user application? And does it give me the right kind of insight in what I should be doing?
That’s what I meant when I wrote that I was kind of agreeing with Ton that being part of the Free Software community could be a drag on Krita becoming successful.
If I’m spending my time on GNOME vs KDE, Flatpak vs Snap vs AppImage, deb vs RPM, then I’m only worrying about technical details that don’t mean a thing for someone who wants to paint a comic. That makes it at worst a waste of my time, or at best a hobby that is only tangentially related to Krita and its users.
If I’m looking inside the community, and not out of it, facing the people who will actually be using my software, I probably won’t be making the right kind of software.
If I’m visiting conferences like Akademy, Fosdem or Linux Application Summit, I’m spending time on nourishing the Free Software community, but not on meeting my users, meeting their expectations or even learning what my proprietary competitors are doing.
Like I said on Friday, we need funding to make sure our projects have a stable base of effort. That stable base makes our projects more attractive for other developers interested in helping out. But we also need to look out of the window. And that’s a message for me, personally, too. Because our delicious little Free Software feuds and ruckuses are almost as big a time-sink as Brexit.
Yesterday, Dmitry, Wolthera, Agata and I visited BlenderCon 2019. Intel had asked us to come to the conference to help set up the Intel/Acer booth, which was showing off Krita in all its HDR glory. After all, it's pretty cool when a free software project has a real, tangible, technical, artistic first! It was great being there, meeting people who were really pleased to finally meet Krita hackers in the flesh.
(Aside 2: we tacked on a small sprint to this event, because the Intel team that works on these projects with us wanted to visit me in Deventer, and I wanted them to meet some other Krita hackers. At the sprint, I had HDR-enabled laptops for Dmitry and Agata, who were in dire need of new hardware. Yoga C940, really nice devices. We also discussed how to progress with the resource system rewrite.)
But, cool as HDR is, and the hardware that eventually arrived at the booth was pretty cool, too, and interested as many people were, that’s not the main thing I took away from the conference. What struck me once again was the disparity between how Blender is looked at from inside the Linux Desktop community, as if it were a largely irrelevant niche hobby project of no big moment in the larger scheme of things, and the reality of Blender as one of the most successful end-user oriented free software projects.
(There was an embedded video with desktop market share numbers here.)
Let’s start comparing the numbers for some projects, roughly. Things I’m interested in are installed base, market share, funding levels, developer engagement, community engagement.
That puts the total Linux Desktop installed base at 90,000,000, of which one third is ChromeOS. All other Linux desktop projects, from Plasma to GNOME, from Mate to XFCE, from Deepin to i3, have to divide the remaining 60,000,000 installations.
Then financial stuff… In 2017, the GNOME project had an income of 246,869 euros; in 2018, of 965,441 euros, but that includes the one-time Handshake donation. For KDE, the 2017 income was about 150,000 euros, and 2018 about 600,000 euros, again including the Handshake donation. I can’t be bothered to look for similar statements for Mate and XFCE, which are tiny in any case. In a normal year, Desktop Linux seems to have a total budget of about 400,000 euros, excluding commercial investment, which is not controlled by the projects themselves. KDE e.V. does not use its budget to pay for development, and the GNOME Foundation supports development financially only in a very limited way.
The Free Desktop has three percent of the installed base of Windows/macOS.
LibreOffice has been estimated to have 200,000,000 active users. About 10% are Linux users, which means that they count all Linux desktops, probably because LibreOffice is installed by default by all Linux distributions. In 2018, the Document Foundation’s income was 855,847.78 euros, pretty much all from donations. According to OpenHub, LibreOffice had about 200 developers in the past year, has had 1900 developers over its entire existence, and 64 in the last month. Microsoft Office has five times the market share, at the very least.
LibreOffice’s installed base is one fifth of Microsoft’s.
Krita has, according to data from Microsoft, which counts every time an exe is started on Windows 10, 1,500,000 active users (that is, distinct systems on which Krita is started at least once a month). A dumb extrapolation to 4,500,000 users on all platforms, since Windows 10 has a 33% market share, is probably too coarse, but let it stand. It’s the order of magnitude that’s important here.
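For what it’s worth, that extrapolation fits in three lines; a back-of-the-envelope sketch, using only the numbers already quoted above:

```python
# Back-of-the-envelope extrapolation from the text: ~1.5M monthly-active
# Krita systems on Windows 10, and Windows 10 at roughly a third of the
# desktop market. Order of magnitude only.
win10_active_users = 1_500_000
win10_market_share = 1 / 3  # the "33%" from the text, as a fraction

estimated_total = win10_active_users / win10_market_share
print(round(estimated_total))  # roughly 4.5 million
```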
In 2010, Photoshop had 10,000,000 users. Of course, Adobe has since moved to a subscription service. In 2017, Adobe Cloud (which also includes subscriptions without Photoshop) had 12,000,000 subscribers. Looks like going to a subscription model is really curtailing their installed base. Also, no wonder that last year’s Photoshop update suddenly included all kinds of fancy features aimed at painters, not photographers.
Krita currently is at 3,000,000 downloads a year from krita.org, which excludes downloads from external download sites or Linux distribution installs.
Our current budget is in a bit of flux, since it’s rising, mostly because of income from Steam and the Windows Store, but it’s about 240,000 euros a year. All of that is used to support development.
Krita has had 52 developers contributing in the past year, and 450 over its entire existence, and 17 in the past month. We currently have five full-time developers.
(In case you want a comparison with projects similar to Krita: GIMP has had 675 developers over its entire existence, 74 in the past year and 21 in the past month; for Inkscape those numbers are 423, 94 and 18. But for those projects the translations are part of the git repo, while for Krita they are external, so the comparison is questionable…) I don’t know download numbers for GIMP or Inkscape, because those projects intentionally don’t track them.
Let’s go out on a limb: Krita’s installed base is about a quarter of Photoshop’s.
It’s pretty hard to find recent information about downloads or installed base for Blender. Of course, Blender works in an industry where the “market leader” Autodesk has about 100,000 users in total. One estimate, using information from 2014, claims 4,000,000 downloads for Blender and about 200,000 users. I haven’t found any information that’s more recent, but Blender 2.80 has made a big splash: I wouldn’t be surprised if it has made an order of magnitude difference. (Besides, if I compare our own download numbers with our installed base as reported by Microsoft, the calculation in that article doesn’t stand.)
The 2019 Blender Conference, the ostensible topic of this blog, has over 600 paying attendees. But Blender is a tool that makes its users money; nobody makes money just using a desktop. It makes sense for someone making money with Blender to pay to go to a conference this packed with useful information and contacts.
Akademy is free, though GUADEC asks for a registration fee. I couldn’t find information about how many people attended GUADEC in 2019, but about 160 people attended Akademy, almost all of them contributors, I’m sure.
The BlenderCon fee gets you food, drinks, clothing and a bag. The whole conference feels very professional, from the venue to the catering, from the hardware, to the level of excellence of the presentations. More importantly, there is an ecosystem around Blender that actually rents booths to show off their offering, puts leaflets, stickers and brochures in the swag bag. Almost all attendees are seriously committed blenderheads: but that means users. This is not just one developer community looking inside itself once a year.
Blender’s development fund currently brings in about 1,200,000 euros a year, which funds 20 full-time developers. That’s not the only source of funding. Blender has had about 172 developers in the past year, 550 over its entire existence, and 64 in the past month, the same as LibreOffice. Looking at the last number, it means that there are still more volunteer committers in the Blender community than paid developers. Funded development hasn’t eaten the community.
Let’s hazard a guess: Blender has four times the installed base of AutoDesk Maya. This is pretty rough, of course, so ingest with salt.
This is important, because one of the things that keeps getting argued in the KDE community is that paying for development destroys the community. So, where does that argument come from, and when might it be true, and when might it not be true? Remember, I’m only talking about projects aimed at end-users, about applications, about the desktop. Programming languages, libraries, web stuff, all that is irrelevant for this discussion. What is relevant is that we often hear people claim that it is impossible to successfully develop this type of software as free software.
Well, my guess is that the people who continue to claim that funding development on free software will destroy the community are either nostalgic for their nineties student days, or tried to build a business around free software that they recognized was going to be valuable, but that was built by others. They would have founded a company, hired the developers, then started working on projects and contracts for customers. Of course that will fail: customers only care about their needs. And once you’re invoicing per-hour, you’re bust. You will never find time to properly maintain the project you wanted to build your company around. You’ve turned into a leech, and will drain your central asset of its lifeblood.
The other way is what happens with Blender, with LibreOffice and with Krita. There is no company that uses the free project to build a business selfishly. Instead, it is the project itself that is funded. There might be businesses that profit from the existence of the successful project (like the cloud renderer companies that were showing off at the BlenderCon), but that’s not central to the funding of the project.
Instead, there is someone central to the project who drives its development and growth with enthusiasm and who cares for the community; the paid developers are not extraneous people uninvolved in the community, but part of it.
I was going to write “in my opinion”, but the facts are pretty clear. Funding the development of free software applications this way is essential to achieve real success.
Blender is a success. Blender has, in fact, won, as Ton says. Despite being 25 years old, it’s the tool young 3D creatives reach for automatically; Maya is for old fogies. It has support from all the hardware industry players: Intel, AMD, NVidia. It has support from closed source companies like Epic, and from companies like Adidas, who are simply users of Blender. It is becoming, if it hasn’t already become, an industry standard.
And if Blender, a free and open source application, can become an industry standard, users of Blender will find it easier to accept that Krita, which is also free and open source, can serve their needs just as well. Blender shows that free software can be first class, and it will drag every other willing project along in its wake.
At Krita, we’ve worked together with Intel for over 15 years now. If you compare the market numbers for Adobe and Autodesk, it’s clear there’s a much larger potential community of Krita users than Blender users (unless Blender turns Grease Pencil into a regular painting application…). There’s no reason we cannot have a community four times bigger than Photoshop’s installed base! We’ve got plenty of growth still before us.
But I used the word “community” here, instead of something like “market”. Because the second part that defines success is the ability to remain a community. Well, when it comes to that, Blender does show the way forward, too.
Autodesk has such a small user base that it cannot grow a community. Maya is too expensive for that. Which means that, long-term, it has already lost. (Besides, Autodesk cannot buy and shut down Blender, which is what it does when faced with competition.) The only thing it could do is make Maya cheaper, but unless they hire Mario Draghi, they cannot make Maya cheaper than Blender.
There are more Photoshop users, because Photoshop is cheaper, and Photoshop users have more of a community feeling, but the company is very proficient at alienating that community; people feel attached to the tool, but hate the company behind it.
For Krita, we’re trying to foster our community. We had a big sprint in August where we invited more artists than ever, and we’re funding the development of our YouTube channel — but there’s a way to go. I wish we had someone to set up Krita Artists, analogous to Blender Artists…
What is Niche?
My argument is that the model by which the Linux Desktop, GIMP or Inkscape are developed makes those efforts inevitably niche. Not that everyone agrees about what is niche. At the 2019 KDE Onboarding Sprint, I was told twice that Krita of course serves a niche market. Doing art is something only a very few people are interested in, after all, isn’t it? Not even the actual download numbers could change that impression. The desktop, KDE PIM, that sort of project: those are the important ones, providing something everyone needs. It’s fine to disregard anything outside the Linux world because nothing we do could ever succeed there. Nobody uses Qt applications on Windows. We all know they just don’t feel right. (Which would be news to Autodesk, since Maya is written in Qt.)
To me, looking at the numbers I’ve tried to assemble above, the Linux Desktop is a niche, LibreOffice, Blender and Krita are not.
Ton once told me he doesn’t feel connected in any way to the regular free software/open source crowd. Being Free Software is essential for Blender’s success. The GPL is core. But being part of the GNU/GNOME/KDE etc. world, he warned me, would be a drag on Krita becoming successful.
And you know what? Unless we can turn our own communities around, I’m beginning to think he’s right. To make a real difference, our communities have to cross boundaries and enter the wider world. To flourish, a free software project needs to have a budget to fund its core developers within the project, to implement the vision of the project.
Crowd-sourced beta testing platforms: do they exist? Especially as free software? I don’t actually know, but I’ve never seen a free software project use something like what I’ve got in mind.
That would be: a website where we could add any number of test scenarios.
People who wanted to help would get an account and make a profile with their hardware and OS listed. Then, a couple of weeks before we make a release, we’d release a beta, and the beta testers would log in and each get one of the test scenarios, randomly assigned, to test and report on. We’d match the tests to OS and hardware, and for some tests probably try to get the test executed by multiple testers.
Frequent participation would lead to badges or something playful like that. Testers would be able to browse tests, add comments and interact — and we, as developers, would get feedback: so many tests executed, so many reported failures or regressions, and we’d be able to improve before the release.
It would be a crowd-sourced alternative to something like Squish (which I’ve never found to be very useful, not for Krita, not at the companies where it was used), it would make beta testers not just feel useful, it would make beta testing useful. Of course, maintaining the test scripts would also take time.
It sounds simple to me, and kind of logical and useful, but given that nobody is doing this — does such a thing exist?
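The core matching step, at least, is trivial; here’s a minimal sketch in Python, with all scenario and profile names invented for illustration:

```python
import random

# Minimal sketch of the matching idea: testers register a profile with
# OS and hardware, and each beta round hands every tester one randomly
# chosen scenario that fits their platform. All names are invented.

TEST_SCENARIOS = [
    {"id": "brush-engine-smoke", "platforms": {"windows", "linux", "macos"}},
    {"id": "hdr-canvas", "platforms": {"windows"}},
    {"id": "tablet-input", "platforms": {"windows", "linux"}},
]

def assign_scenario(profile, scenarios, rng=random):
    """Pick one scenario at random among those matching the tester's OS."""
    candidates = [s for s in scenarios if profile["os"] in s["platforms"]]
    return rng.choice(candidates) if candidates else None

tester = {"name": "alice", "os": "macos", "gpu": "integrated"}
print(assign_scenario(tester, TEST_SCENARIOS)["id"])  # brush-engine-smoke
```

The hard part, of course, is not the matching but everything around it: maintaining the scenarios, collecting reports, and keeping testers engaged.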
These are my notes from the onboarding sprint. I had to miss the evenings, because I’m not very fit at the moment, so this is just my impression from the days I was able to be around, and from what I’ve been trying to do myself.
The KDE Onboarding Sprint happened in Nuremberg, 22 and 23 July. The goal of the sprint was to make it easier to get started working on existing projects in the KDE community; more specifically, this sprint was held to work on the technical side of the developer story. Of course, onboarding in the wider sense also means having excellent documentation and a place for newcomers to ask questions, both easy to find.
Ideally, an interested newcomer would be able to start work without having to bother building (many) dependencies, without needing the terminal at first, would be able to move on to improving libraries like KDE Frameworks as a next step, and be able to create a working and installable release of their work, to use or to share.
Other platforms have this story down pat, with the proviso that these platforms promote greenfield development rather than extending existing projects, and assume working within the existing platform framework, without additional dependencies:
* Apple: download Xcode, and you can get started.
* Windows: download Visual Studio, and you are all set.
* Qt: download the Qt installer with Qt Creator, and documentation, examples and project templates are all there, no matter which platform you develop for: macOS, Windows, Linux, Android or iOS.
GNOME Builder is also a one-download, get-started offering. But Builder adds additional features to the three above: it can download and build extra dependencies (with atrocious user-feedback, it has to be said), and it offers a list of existing GNOME projects to start hacking on. (Note: I do not know what happens when getting Builder on a system that lacks git, cmake or meson.)
KDE has nothing like this at the moment. Impressive as the kdesrc-build scripts are technically (thanks go to Michael Pyne for giving an in-depth presentation), they are not newcomer-friendly, with a complicated syntax, configuration files and dependency on working from the terminal. KDE also has much more diversity in its projects than GNOME:
* Unlike GNOME, KDE software is cross-platform — though note that not every person present at the sprint was convinced of that; some even dismissed KDE applications ported to Windows as “not used in the real world”.
* Part of KDE is the Frameworks set of additional Qt libraries that are used in many KDE projects.
* Some KDE projects, like Krita, build from a single git repository; some projects build from dozens of repositories, where adding one feature means working on half a dozen repositories at the same time, or, in the case of Plasma, replacing the entire desktop on Linux.
* Some KDE projects are also deployed to mobile systems (iOS, Android, Plasma Mobile).
Ideally, no matter the project the newcomer selects, the getting-started story should be the same!
When the sprint team started evaluating the technologies currently used in the KDE community to build KDE software, things got confusing quite quickly. Some of the technologies discussed were oriented towards power users, some towards making binary releases. It is necessary to first determine which components need to be delivered to make a seamless newcomer experience possible:
* Prerequisite tools: cmake, git, compiler
* A way to fetch the repository or repositories the newcomer wants to work on
* A way to fetch all the dependencies the project needs, where some of those dependencies might need to transition from dependency to project-being-worked-on
* A way to build, run and debug the project
* A way to generate a release from the project
* A way to submit the changes made to the original project
* An IDE that integrates all of this
The sprint has spent most of its time on the dependencies problem, which is particularly difficult on Linux. An inventory of ways KDE projects “solve” the problem of providing the dependencies for a given project currently includes:
* Using distribution-provided dependencies: this is unmaintainable, because there are too many distributions with too much variation in the names of their packages to make it possible to keep full and up-to-date lists per project — and newcomers cannot find the deps from the names given in the CMake find modules.
* Building the dependencies as CMake external projects, per project: this is only feasible for projects with a manageable number of dependencies and enough manpower to maintain it.
* Building the dependencies as CMake external projects on the binary factory, and using a docker image identical to the one used on the binary factory plus these builds to develop the project in: same problem.
* Building the (KDE, though now also some non-KDE) dependencies using kdesrc-build, and getting the rest of the dependencies as distribution packages: this combines the first problem with fragility and a big learning curve.
* Using the KDE flatpak SDK to provide the KDE dependencies, and building non-KDE dependencies manually, or fetching them from other flatpak SDKs. (Query: is this the right terminology?) This suffers from inter- and intra-community politicking problems.
ETA: I completely forgot to mention Craft here. Craft is a Python-based system, close to emerge, that has been around for ages. We used it initially for our Krita port to Windows; back then it was exclusively Windows-oriented. These days, it also works on Linux and macOS. It can build all the KDE and non-KDE dependencies that KDE applications need. But then why did I forget to mention it in my write-up? Was it because there was nobody at the sprint from the Craft team? Or because nobody had tried it on Linux, and there was a huge Linux bias in any case? I don’t know… It was discussed during the meeting, though.
As an aside, much time was spent discussing docker, but when it was discussed, it was discussed as part of the dependency problem. However, it is properly a solution for running a build without affecting the rest of the developer’s system. (Apparently, there are people who install their builds into their system folders.) Confusingly, part of this discussion was also about setting environment variables to make it possible to run builds that are installed outside the system locations, or not installed at all. Note: the XDG environment variables that featured in this discussion are irrelevant for Windows and macOS.
As a future solution, Ovidiu Bogdan presented Conan, which is a cross-platform binary package manager for C++ libraries. This could solve the dependency problem, and only the dependency problem, but at the expense of making the run problem much harder, because each library is in its own location. See https://conan.io/.
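To give an idea of what working with Conan looks like: a recipe is a small Python file declaring the package and its requirements. This is a hypothetical sketch for something kruler-shaped, not the recipe that was actually written at the sprint; the package references and versions are invented:

```python
# Hypothetical Conan (1.x-era) recipe sketch; the package references and
# versions below are invented for illustration.
from conans import ConanFile, CMake

class KRulerConan(ConanFile):
    name = "kruler"
    version = "19.08"
    settings = "os", "compiler", "build_type", "arch"
    generators = "cmake"
    # Every dependency is itself a Conan package; this is where the
    # "Qt as one big monolith" problem bites.
    requires = (
        "qt/5.13.0@hypothetical/stable",
        "kcoreaddons/5.62.0@hypothetical/stable",
    )

    def build(self):
        cmake = CMake(self)
        cmake.configure()
        cmake.build()
```

Conan resolves `requires` transitively and fetches binaries matching the `settings`, which is exactly why Qt would need to be split into per-module packages to stay manageable.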
The attendees decided to try to tackle the dependency problem. A certain amount of agreement was reached that this is a big problem, so it was discussed in depth. Note that the focus was, again, on Linux, relegating the cross-platform story to second place. Dmitry noted that when he tries to recruit students for Krita, only one in ten is familiar with Linux, pointing out that we’re really limiting ourselves with this attitude.
A KDE application, kruler, was selected as a prototype for building with dependencies provided either by flatpak or Conan.
Dmitry and Ovidiu dug into Conan. From what I observed, laying the groundwork is a lot of work: by the end of the evening, Dmitry and Ovidiu had packaged about half of the Qt and KDE dependencies for kruler. Though the Qt developers are considering moving to Conan for Qt’s 3rd-party deps, Qt in particular turned out to be a problem: Qt needs to be modularized in Conan, instead of being a big, fat monolith. See https://bintray.com/kde.
Aleix Pol had already begun integrating flatpak and docker support into KDevelop, as well as providing a flatpak runtime for KDE applications (https://community.kde.org/Flatpak).
This made it relatively easy to package kruler, okular and krita using flatpak. There are now maintained nightly stable and unstable flatpak builds for Krita.
The problems with flatpak, apart from the politicking, consist of two different opinions about what an appstream file should contain, checks that go beyond what freedesktop.org standards demand, weird errors in general (you cannot have a default git branch tag that contains a slash…), an opaque build system, and an appetite for memory that goes beyond healthy: my laptop overheated and hung when trying to build a Krita flatpak locally.
Note also that the flatpak (and docker) integration in KDevelop is not done yet, and didn’t work properly when we tested it. I am also worried that KDevelop is too complex and intimidating to use as the IDE that binds everything together for new developers. I’d almost suggest we repurpose/fork Builder for KDE…
We’re not done with the onboarding sprint goal, not by a country mile. It’s as hard to get started hacking on a KDE project, or starting a new KDE project, as it has ever been. Flatpak might be closer to ready than Conan for solving the dependency problem, and though flatpak solves more problems than just the dependency problem, it is Linux-only. Using Conan to solve the dependency problem will be very high-maintenance.
I do have a feeling we’ve been looking at this problem at a much too low level, but I don’t feel confident about what we should be doing instead. My questions are:
* Were we right to focus first on the dependency problem and nothing but the dependency problem?
* Apart from flatpak and conan, what solutions exist to deliver prepared build environments to new developers?
* Is kdevelop the right IDE to give new developers?
* How can we make sure our documentation is up to date and findable?
* What communication channels do we want to make visible?
* How much effort can we afford to put into this?