Why we shouldn’t blame ourselves for the Linux desktop’s microscopic market share

Well, those were three interesting articles on the same topic on the same day, namely, billionaires. Read together, they explain exactly why the Linux Desktop still has such a marginal market share, and why that is not because we, who work hard on it, are failures who have been doing the wrong thing all along. It is, in the first place, policies, bought with money, that allowed people to build monopolies, tax individuals, and so become even richer and more powerful.

(Similarly, it’s not individuals and their choices that are destroying the planet; it is policies bought by the very rich, who somehow believe that their Florida resorts won’t sink, that they won’t be affected by burning up the planet so they can get richer. But that’s a digression.)

So, the first article, by Arwa Mahdawi, discussed the first part of this problem: with enough money, all policies are yours. It’s just a squib, not the strongest article.

Then Robert Reich, a former US secretary of labor, enumerates the ways people can become so exceedingly rich, and none of them come down to being so hard-working and so successful:

  • Exploit a monopoly: this is illegal under the laws of the United States.
  • Exploit insider information. This is also illegal.
  • Buy a tax cut. This seemed uniquely USA’ian until the Dutch prime minister Rutte promised abolition of the dividend tax to Unilever. This would seem to be illegal as well, but IANAL.
  • Extort people who already have a lot of money. Extortion is illegal.
  • Inherit the money. This is the only legal way to become a billionaire.

Now the third article, What Is a Billionaire, by Matt Stoller, was posted to the Linux subreddit today. Not surprisingly, many people completely missed the point, and thought it was irrelevant to a Linux discussion forum, or that it was about capitalism versus socialism, or that it was outdated Microsoft bashing.

However, what it is about is this question: why is Bill Gates not in jail for life, with all his wealth stripped away? He’s a criminal, and his crime has directly harmed us, the people working on free software, on the Linux Desktop.

So, to make things painfully clear: Bill Gates made it so that his company would tax every computer sold no matter whether it ran Windows or not. If a manufacturer wanted to sell computers running Windows, all the computers it sold were taxed by Microsoft. He would get paid for the work a Linux distribution was doing, and the Linux distribution would not get that money.

That means there is a gap of twice the amount of this illegal tax between Microsoft and the Linux distribution. If a Linux distribution wanted to earn what Microsoft earned on a PC sale, it would have to cover the Microsoft tax and then ask for its own fee on top.

This cannot be done.

And I know, this has been said often before, and discussed often before, and yeah, I guess, poor Bill Gates, if he hadn’t been bothered so badly with the hugely unfair antitrust investigation, he would also have been able to monopolize mobile phones, and the world would have been so much sweeter. For him, for certain.

I guess we didn’t do all that badly with the Linux Desktop market share being what it is. This is a fight that cannot be won.

Monopolies must be broken up. It’s the law, after all.

Fail! No Linux App Summit for me.

Fail…

I was going to attend the Linux App Summit, and even going to speak, about Krita and what happens to an open source project when it starts growing a lot, and what that would mean for the Linux desktop ecosystem, and so on. But that’s not going to happen.

There was really bad flooding in the south of France, which damaged the TGV track between Montpellier and Barcelona. When we went home after the 2018 Libre Graphics Meeting, we took the train from Barcelona to Paris, and noticed how close the water was.

Well, I guess it was too close. And this is relevant, because I had planned to come to Barcelona by train. It’s a relatively easy one-day trip if everything goes well, and it gives ten hours of undisturbed coding time, too. Besides, I have kind of promised myself that I’m not going to force myself to take planes anymore. Flying terrifies me. So I didn’t consider flying from Amsterdam for a moment — I was only afraid that other people would try to force me to fly.

Then I learned that there still was a connection: I would take the train to Montpellier, then a bus to Béziers, and then a train again. It would make the whole trip a two-day journey, with, as recommended by the travel agency I bought the tickets from, a stop in Paris. That advice was wrong.

I should have taken my connecting train to Montpellier, got a hotel there, and continued the next day. At this point I was like, okay… I’m just going home. Which I am doing right now. I bet the trip back would be just as difficult, and my Spanish isn’t as good as my French, so I would have an even harder time getting people to tell me what to do exactly, when to travel and where to go.

Sorry, everybody.

Professional Free Software

At the 2019 Libre Graphics Meeting, illustrator Livio Fania presented a heart-felt plea for more professionalism in libre graphics.

And that was the moment I began to think a bit. What is it that makes one project professional, and another not? Where, in this case, I’d take “professional” to mean “someone can depend on it so they can earn their daily bread with no more than the problems you always have with any software, because all software sucks, and hardware even more so”.

As Livio said in his presentation, funding makes a difference. If a project can fund its development, its continuity will be better, it will be better able to implement its vision and deliver what it promises, simply because funding equals time. That’s also what I tried to argue in my previous blog post.

In practice, it’s very hard to find funding for applications that people do not earn their income with. Of course, there are very accomplished free and open source video players, editors and file managers. Those tend to still be completely volunteer-driven and very underfunded. And there’s a reasonably successful web browser, which is actually funded quite well, mainly by Google — but it’s funded so that Google can avoid being broken up as the monopolist that it is.

And of course, there are applications that people use daily and earn their bread with, and that are completely unfunded, even if they have donation buttons and money in the bank: GIMP, Inkscape, Scribus, various RAW photo applications, often by choice. This means that those projects are even more squeezed for time than a project like Krita. Just think how much difference Tavmjong Bah could make if he were funded to work on Inkscape full-time! Tav gets $107 a month through Patreon… (I contribute to that, too.)

But though Livio focused on the need for funding to accelerate development, and funding is indeed the first step, there’s more to creating a professional application.

The second step is: focus on the user’s needs. That starts with knowing what you want to create. If your goal is to implement a standard specification fully, as is the case with Inkscape, then is that goal sufficiently user-oriented to form the basis for an application designers can earn their daily bread with? It’s possible, but it’s something to always be aware of.

And as I argued recently, isn’t looking inward, discussing the whys and wherefores of Free Software, no matter how enjoyable, taking away time better spent getting to know your user base, their needs and… The competition?

I would not be surprised if visiting the next Linux Application Summit would be less useful for Krita and its users than a week-long training in Photoshop would be for me, as the Krita maintainer. We’ve all been there: we’ve all claimed we’re not competing with the big, sovereign, proprietary applications that are seen as the standard in the field where our application is the “open source alternative”.

But if you don’t compete, you cannot win. And if you don’t win, then millions of users will not use free and open source software. And I happen to believe that software freedom is important. And I’m slowly gaining the confidence to say: I am competing.

(And we are. Competing. Otherwise Adobe wouldn’t have been adding so many new features for painters and illustrators to their latest Photoshop release, some of them features Krita has had for a decade.)

And when we compete, which is what makes people trust us, and when our users fund our efforts, then we need to take another step towards professionalism.

That is, committing to continuity. I’ve always said “free software never dies”, but it does. Look at Amarok, which is thoroughly dead. Of course, Krita has been making releases since 2004, and there’s quite a bit of continuity already. But this does take commitment, and that commitment also needs to be public.

Finally, there’s the point where you, as a full-time project member, as the project maintainer, can no longer say “We don’t provide binaries. Get the source and build it, or wait for a distribution”. You have to provide your application in a form your users can use directly; if you ask them to build it themselves, you’re telling them to spend their time on something they don’t earn money with.

And then… We have to stop making excuses. Which is probably the hardest thing, because all software sucks, and all code sucks, and sometimes it’s impossible to fix a bug in Krita, because it’s in the OS, the window manager or the development platform. Or the tablet driver, oh dear, the tablet drivers. But it’s your customer, your supporter, the person who depends on your work to earn their money, who is stopped from doing that. And suddenly workarounds and hacks become necessary.

So: funding a core team of developers is the start; then come focusing on the field instead of on the free software community, a will to compete, committing to continuity, making sure your software can actually be used, and finally, not making excuses when there are problems, but fixing them.

Looking in or looking out?

In my previous blog post, I mentioned that being part of the wider Free Software community can be a drag on a project.

In this post, I want to put that in perspective. Elaborate a bit, if you will.

I wrote my first GPL’ed application in 1993 (it was a uucp mail and usenet client written in, gasp, Visual Basic), and since then I’ve spent countless hours pleasantly occupied contemplating the whys and wherefores of Free Software. I am part of the KDE community, the Libre Graphics community, the Free Software community. Inside our communities, we’ve got engrossing discussions about licensing, diverting flamewars about business models, scintillating technical developments, awesome foes to smite, forks to be praised or lambasted. There are conferences to be visited, or be perorated at, sprints to organize and organizations to join.

In short, you can develop Free Software within the Free Software community while looking in all the while: this is important for Free Software, that is bad for Free Software, if I make this effort, or take this initiative, or mentor this student, Free Software will improve.

And that’s fun, and gives one the satisfied feeling of making the world a better, freer place. And I care about software freedom, really I do!

It does take a lot of time, though, and that time — is that really spent making the world better for the people who use my software? Is all that caring about the ins and outs of the Free Software community the best use of my time, when I’m working on an end-user application? And does it give me the right kind of insight into what I should be doing?

That’s what I meant when I wrote that I was kind of agreeing with Ton that being part of the Free Software community could be a drag on Krita becoming successful.

If I’m spending my time on GNOME vs KDE, Flatpak vs Snap vs AppImage, deb vs RPM, then I’m only worrying about technical details that don’t mean a thing for someone who wants to paint a comic. That makes it at worst a waste of my time, or at best a hobby that is only tangentially related to Krita and its users.

If I’m looking inside the community, and not out of it, facing the people who will actually be using my software, I probably won’t be making the right kind of software.

If I’m visiting conferences like Akademy, Fosdem or Linux Application Summit, I’m spending time on nourishing the Free Software community, but not on meeting my users, meeting their expectations or even learning what my proprietary competitors are doing.

Like I said on Friday, we need funding to make sure our projects have a stable base of effort. That stable base makes our projects more attractive for other developers interested in helping out. But we also need to look out of the window. And that’s a message for me, personally, too. Because our delicious little Free Software feuds and ruckuses are almost as big a time-sink as Brexit.

Back from the Blender Conference 2019

Yesterday, Dmitry, Wolthera, Agata and I visited BlenderCon 2019. Intel had asked us to come to the conference to help set up the Intel/Acer booth, which was showing off Krita in all its HDR glory. After all, it’s pretty cool when a free software project has a real, tangible, technical, artistic first! It was great being there, meeting people who were really pleased to finally meet Krita hackers in the flesh.

(Aside 1: note how crowded that theatre is!)

(Aside 2: we tacked on a small sprint to this event, because the Intel team that works on these projects with us wanted to visit me in Deventer, and I wanted them to meet some other Krita hackers. At the sprint, I had HDR-enabled laptops for Dmitry and Agata, who were in dire need of new hardware. Yoga C940, really nice devices. We also discussed how to progress with the resource system rewrite.)

But, cool as HDR is (and the hardware that eventually arrived at the booth was pretty cool, too), and interested as many people were, that’s not the main thing I took away from the conference. What struck me once again was the disparity between how Blender is looked at from inside the Linux Desktop community, as if it were a largely irrelevant niche hobby project of no great moment in the larger scheme of things, and the reality of Blender as one of the most successful end-user oriented free software projects.

But first watch this:

Numbers

Let’s start comparing the numbers for some projects, roughly. Things I’m interested in are installed base, market share, funding levels, developer engagement, community engagement.

Desktop Linux has a market share of about 3%, including ChromeOS. Windows 10 runs on about 1,000,000,000 devices, and Windows 10 has about 33% market share. (You might want to quibble about these numbers, or the sources I pulled them from, but it’s pretty much the same everywhere. And exact numbers don’t matter in this discussion.)

That puts the total Linux Desktop installed base at 90,000,000, of which one third is ChromeOS. All other Linux desktop projects, from Plasma to Gnome, from Mate to XFCE, from Deepin to i3, have to divide the remaining 60,000,000 installations.
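
Spelled out, the back-of-the-envelope arithmetic behind those figures looks roughly like this; it is just a sketch using the rounded numbers quoted above, nothing more precise:

    # Back-of-the-envelope sketch, using only the rounded figures quoted above.
    windows10_devices = 1_000_000_000      # ~1 billion Windows 10 devices
    windows10_share = 0.33                 # ~33% of the desktop market
    linux_share = 0.03                     # ~3%, ChromeOS included

    total_desktops = windows10_devices / windows10_share   # ~3 billion desktops
    linux_desktops = total_desktops * linux_share           # ~90 million
    non_chromeos = linux_desktops * 2 / 3                    # ~60 million for Plasma, GNOME and the rest
    print(f"{linux_desktops:,.0f} Linux desktops, {non_chromeos:,.0f} excluding ChromeOS")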

Then financial stuff… In 2017, the Gnome project had an income of 246,869 euros, in 2018 of 965,441 euros, but that includes the Handshake donation, which is one-time. For KDE, the 2017 income was about 150,000 euros, and 2018 was about 600,000 euros, again including the Handshake donation. I can’t be bothered to look for similar statements for Mate and XFCE which are tiny in any case. In a normal year, Desktop Linux seems to have a total budget of about 400,000 euros, excluding commercial investment which is not controlled by the projects themselves. KDE e.V. does not use its budget to pay for development, and the GNOME Foundation supports development financially in a very limited way.

The Free Desktop has three percent of the installed base of Windows/macOS.

LibreOffice has been estimated to have 200,000,000 active users. About 10% of those are Linux users, which means they count all Linux desktops, probably because LibreOffice is installed by default by all Linux distributions. In 2018, the Document Foundation’s income was 855,847.78 euros, pretty much all from donations. According to OpenHub, LibreOffice has had about 200 developers in the past year, 1900 developers over its entire existence, and 64 in the last month. Microsoft Office has five times the market share, at the very least.

LibreOffice’s installed base is one fifth of Microsoft’s.

Krita has, according to data from Microsoft, which counts every time an exe is started on Windows 10, 1,500,000 active users (that is, distinct systems on which Krita is started at least once a month). A dumb extrapolation of that to 4,500,000 users on all platforms, since Windows 10 has 33% market share, is probably too coarse, but let it stand. It’s the order of magnitude that’s important here.
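
For what it’s worth, the dumb extrapolation is just one division; a napkin sketch, assuming Krita use is spread across platforms in the same proportions as the desktop market itself:

    # Napkin math: assumes Krita use mirrors the overall desktop market split.
    krita_on_windows10 = 1_500_000     # monthly active devices, per Microsoft's data
    windows10_share = 0.33             # Windows 10's share of the desktop market
    krita_everywhere = krita_on_windows10 / windows10_share   # ~4.5 million
    print(f"~{krita_everywhere:,.0f} Krita users across all platforms")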

In 2010, Photoshop had 10,000,000 users. Of course, Adobe has moved to a subscription service since. In 2017, Adobe Cloud (which also includes subscriptions without Photoshop), had 12,000,000 subscribers. Looks like going to a subscriber model is really curtailing their installed base. Also, no wonder that last year’s Photoshop update suddenly included all kinds of fancy features aimed at painters, not photographers.

Krita currently is at 3,000,000 downloads a year from krita.org, which excludes downloads from external download sites or Linux distribution installs.

Our current budget is in a bit of flux, since it’s rising, mostly because of income from Steam and the Windows Store, but it’s about 240,000 euros a year. All of that is used to support development.

Krita has had 52 developers contributing in the past year, and 450 over its entire existence, and 17 in the past month. We currently have five full-time developers.

(In case you want a comparison with projects similar to Krita: GIMP has had 675 developers over its existence, 74 in the past year and 21 in the past month; for Inkscape those numbers are 423, 94 and 18 — but for those projects, the translations are part of the git repo, while for Krita those are external, so the comparison is questionable…) I don’t know download numbers for GIMP or Inkscape, because those projects intentionally don’t track those.

Let’s risk a limb: Krita’s installed base is about a quarter of Photoshop’s.

It’s pretty hard to find recent information about downloads or installed base for Blender. Of course, Blender works in an industry where the “market leader” Autodesk has about 100,000 users in total. The same estimate, using information from 2014, claims 4,000,000 downloads for Blender and about 200,000 users. I haven’t found any information that’s more recent, but Blender 2.80 has made a big splash: I wouldn’t be surprised if it has made an order of magnitude difference. (Besides, if I compare our download numbers with our installed base as reported by Microsoft, the calculation in that article doesn’t hold up.)

The 2019 Blender Conference, the ostensible topic of this blog post, had over 600 paying attendees. But Blender is a tool that makes its users money; nobody makes money just by using a desktop. It makes sense for someone making money with Blender to pay to go to a conference this packed with useful information and contacts.

Akademy is free, though Guadec asks for a registration fee. I couldn’t find information about how many people attended Guadec in 2019, but about 160 people attended Akademy; almost all of them contributors, I am sure.

The BlenderCon fee gets you food, drinks, clothing and a bag. The whole conference feels very professional, from the venue to the catering, from the hardware to the level of excellence of the presentations. More importantly, there is an ecosystem around Blender, companies that actually rent booths to show off their offerings and put leaflets, stickers and brochures in the swag bag. Almost all attendees are seriously committed blenderheads: but that means they are users. This is not just one developer community looking inside itself once a year.

Blender’s development fund currently brings in about 1,200,000 euros a year, which funds 20 full-time developers, and that’s not the only source of funding. Blender has had about 172 developers in the past year, 550 over its entire existence, and 64 in the past month, the same as LibreOffice. Looking at that last number, there are in any case more volunteer committers in the Blender community than paid developers. Funded development hasn’t eaten the community.

Let’s hazard a guess: Blender has four times the installed base of AutoDesk Maya. This is pretty rough, of course, so ingest with salt.

Funding Development

This is important, because one of the things that keeps getting argued in the KDE community is that paying for development destroys the community. So, where does that argument come from, and when might it be true, and when might it not be true? Remember, I’m only talking about projects aimed at end-users, about applications, about the desktop. Programming languages, libraries, web stuff, all that is irrelevant for this discussion. What is relevant is that we often hear people claim that it is impossible to successfully develop this type of software as free software.

Well, my guess is that the people who continue to claim that funding development on free software will destroy the community are either nostalgic for their nineties student days, or tried to build a business around free software that they recognized was going to be valuable, but that was built by others. They would have founded a company, hired the developers, then started working on projects and contracts for customers. Of course that will fail: customers only care about their needs. And once you’re invoicing per-hour, you’re bust. You will never find time to properly maintain the project you wanted to build your company around. You’ve turned into a leech, and will drain your central asset of its lifeblood.

The other way is what happens with Blender, with LibreOffice and with Krita. There is no company that uses the free project to build a business selfishly. Instead, it is the project itself that is funded. There might be businesses that profit from the existence of the successful project (like the cloud renderer companies that were showing off at the BlenderCon), but that’s not central to the funding of the project.

Instead, there is someone central to the project who drives its development and growth with enthusiasm and who cares for the community; the paid developers are not extraneous people uninvolved in the community, but part of it.

I was going to write “in my opinion”, but the facts are pretty clear. Funding the development of free software applications this way is essential to achieve real success.

Defining Success

Blender is a success. Blender has, in fact, won, as Ton says. Despite being 25 years old, it’s the tool young 3D creatives reach for automatically; Maya is for old fogies. It has support from all hardware industry players: Intel, AMD, NVidia. It has support from closed source companies like Epic, and from companies like Adidas, who are simply users of Blender. It is becoming, if it hasn’t already become, an industry standard.

And if Blender, a free and open source application, can become an industry standard, users of Blender will find it easier to accept that Krita, which is also free and open source, can serve their needs just as well. Blender shows that free software can be first class, and that will pull every other willing project along in its wake.

At Krita, we’ve worked together with Intel for over 15 years now. If you compare the market numbers for Adobe and Autodesk, then it’s clear there’s a much larger potential community of Krita users than of Blender users (unless Blender turns Grease Pencil into a regular painting application…). There’s no reason we cannot have a community four times bigger than Photoshop’s installed base! We’ve got plenty of growth still before us.

But I used the word “community” here, instead of something like “market”. Because the second part that defines success is the ability to remain a community. Well, when it comes to that, Blender does show the way forward, too.

Autodesk has such a small user base that it cannot grow a community. It’s too expensive for that. Which means that, long-term, it has already lost. (Besides, it cannot buy and shut down Blender, which is what Autodesk does when faced with competition.) The only thing it could do is make Maya cheaper, but unless they hire Mario Draghi, they cannot make Maya cheaper than Blender.

There are more Photoshop users, because Photoshop is cheaper, and Photoshop users have more of a community feeling, but the company is very proficient at alienating that community; people feel attached to the tool, but hate the company behind it.

For Krita, we’re trying to foster our community. We had a big sprint in August where we invited more artists than ever, and we’re funding the development of our YouTube channel — but there’s a way to go. I wish we had someone to set up Krita Artists, analogous to Blender Artists…

What is Niche?

My argument is that the model by which the Linux Desktop, GIMP or Inkscape are developed makes those efforts inevitably niche. Not that everyone agrees about what is niche. At the 2019 KDE Onboarding Sprint, I was told twice that Krita of course serves a niche market. Doing art is something only a very few people are interested in, after all, isn’t it? Not even the actual download numbers could change that impression. The desktop, KDE PIM, that sort of project: those are the important ones, providing something everyone needs. It’s fine to disregard anything outside the Linux world because nothing we do could ever succeed there. Nobody uses Qt applications on Windows. We all know they just don’t feel right. (Which would be news to Autodesk, since Maya is written in Qt.)

To me, looking at the numbers I’ve tried to assemble above, the Linux Desktop is a niche, LibreOffice, Blender and Krita are not.

Ton once told me he doesn’t feel connected in any way to the regular free software/open source crowd. Being Free Software is essential for Blender’s success. The GPL is core. But being part of the GNU/GNOME/KDE etc. world, he warned me, would be a drag on Krita becoming successful.

And you know what? Unless we can turn our own communities around, I’m beginning to think he’s right. To make a real difference, our communities have to cross boundaries and enter the wider world. To flourish, a free software project needs to have a budget to fund its core developers within the project, to implement the vision of the project.

Distributed Beta Testing Platforms

Do they exist? Especially as free software? I don’t actually know, but I’ve never seen a free software project use something like what I’ve got in mind.

That would be: a website where we could add any number of test scenarios.

People who wanted to help would get an account and make a profile with their hardware and OS listed. Then, a couple of weeks before we make a release, we’d release a beta, and the beta testers would log in and be randomly assigned one of the test scenarios to test and report on. We’d match the tests to OS and hardware, and for some tests we’d probably try to get the test executed by multiple testers.
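
To make that a bit more concrete, here is a minimal sketch, in Python, of what the matching step could look like. Everything here is made up for illustration; it just shows the idea of handing each tester a scenario that fits their profile, preferring scenarios that have been assigned the fewest times:

    import random
    from dataclasses import dataclass

    @dataclass
    class Tester:
        name: str
        os: str                  # e.g. "Windows 10", "Kubuntu 19.04", "macOS 10.14"
        has_tablet: bool = False

    @dataclass
    class Scenario:
        title: str
        oses: set                # OSes this scenario is relevant for
        needs_tablet: bool = False

    def assign(scenarios, testers):
        """Give each tester one scenario that fits their profile, preferring
        scenarios that have been assigned to the fewest testers so far."""
        counts = {s.title: 0 for s in scenarios}
        assignments = {}
        for tester in testers:
            matching = [s for s in scenarios
                        if tester.os in s.oses
                        and (not s.needs_tablet or tester.has_tablet)]
            if not matching:
                continue
            least = min(counts[s.title] for s in matching)
            chosen = random.choice([s for s in matching if counts[s.title] == least])
            counts[chosen.title] += 1
            assignments[tester.name] = chosen.title
        return assignments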

Frequent participation would lead to badges or something playful like that; testers would be able to browse tests, add comments and interact — and we, as developers, would get feedback. So many tests executed, so many reported failures or regressions, and we’d be able to improve before the release.

It would be a crowd-sourced alternative to something like Squish (which I’ve never found to be very useful, not for Krita, not at the companies where it was used). It would make beta testers not just feel useful; it would make beta testing useful. Of course, maintaining the test scripts would also take time.

It sounds simple to me, and kind of logical and useful, but given that nobody is doing this — does such a thing exist?

KDE Onboarding Sprint Report

These are my notes from the onboarding sprint. I had to miss the evenings, because I’m not very fit at the moment, so this is just my impression from the days I was able to be around, and from what I’ve been trying to do myself.

Day 1

The KDE Onboarding Sprint happened in Nuremberg, 22 and 23 July. The goal of the sprint was to get closer to making it easier to start working on existing projects in the KDE community; more specifically, this sprint was held to work on the technical side of the developer story. Of course, onboarding in the wider sense also means having excellent documentation (that is easy to find) and a place for newcomers to ask questions (that is also easy to find).

Ideally, an interested newcomer would be able to start work without having to bother building (many) dependencies, without needing the terminal at first, would be able to start improving libraries like KDE Frameworks as a next step, and be able to create a working and installable release of their work, to use or to share.

Other platforms have this story down pat, with the proviso that these platforms promote greenfield development, not extending existing projects, as well as working within the existing platform framework, without additional dependencies:

  • Apple: download XCode, and you can get started.
  • Windows: download Visual Studio, and you are all set.
  • Qt: download the Qt installer with Qt Creator, and documentation, examples and project templates are all there, no matter which platform you develop for: macOS, Windows, Linux, Android or iOS.

GNOME Builder is also a one-download, get-started offering. But Builder adds additional features to the three above: it can download and build extra dependencies (with atrocious user-feedback, it has to be said), and it offers a list of existing GNOME projects to start hacking on. (Note: I do not know what happens when getting Builder on a system that lacks git, cmake or meson.)

KDE has nothing like this at the moment. Impressive as the kdesrc-build scripts are technically (thanks go to Michael Pyne for giving an in-depth presentation), they are not newcomer-friendly, with a complicated syntax, configuration files and dependency on working from the terminal. KDE also has much more diversity in its projects than GNOME:

  • Unlike GNOME, KDE software is cross-platform — though note that not every person present at the sprint was convinced of that, even dismissing KDE applications ported to Windows as “not used in the real world”.
  • Part of KDE is the Frameworks set of additional Qt libraries that are used in many KDE projects.
  • Some KDE projects, like Krita, build from a single git repository; some projects build from dozens of repositories, where adding one feature means working on half a dozen repositories at the same time, or, in the case of Plasma, replacing the entire desktop on Linux.
  • Some KDE projects are also deployed to mobile systems (iOS, Android, Plasma Mobile).

Ideally, no matter the project the newcomer selects, the getting-started story should be the same!

When the sprint team started evaluating technologies that are currently used in the KDE community to build KDE software, things got quite confused. Some of the technologies discussed were oriented towards power users, some towards making binary releases. It is necessary to first determine which components need to be delivered to make a seamless newcomer experience possible:

  • Prerequisite tools: cmake, git, compiler
  • A way to fetch the repository or repositories the newcomer wants to work on
  • A way to fetch all the dependencies the project needs, where some of those dependencies might need to transition from dependency to project-being-worked on
  • A way to build, run and debug the project
  • A way to generate a release from the project
  • A way to submit the changes made to the original project
  • An IDE that integrates all of this

The sprint spent most of its time on the dependency problem, which is particularly difficult on Linux. An inventory of the ways KDE projects currently “solve” the problem of providing the dependencies for a given project includes:

  • Using distribution-provided dependencies: this is unmaintainable because there are too many distributions with too much variation in the names of their packages to make it possible to keep full and up-to-date lists per project — and newcomers cannot find the deps from the name given in the cmake find modules.
  • Building the dependencies as CMake external projects, per project: this is only feasible for projects with a manageable number of dependencies and enough manpower to maintain it.
  • Building the dependencies as CMake external projects on binary factory, and using a docker image identical to the one used on the binary factory + these builds to develop the project in: same problem.
  • Building the (KDE, though now also some non-KDE) dependencies using kdesrc-build, getting the rest of the dependencies as distribution packages: this combines the first problem with fragility and a big learning curve.
  • Using the KDE flatpak SDK to provide the KDE dependencies, and building non-KDE dependencies manually, or fetching them from other flatpak SDKs. (Query: is this the right terminology?) This suffers from inter- and intra-community politicking problems.

ETA: I completely forgot to mention Craft here. Craft is a Python-based system, similar to emerge, that has been around for ages. We used it initially for our Krita port to Windows; back then it was exclusively Windows oriented. These days, it also works on Linux and macOS. It can build all the KDE and non-KDE dependencies that KDE applications need. But then why did I forget to mention it in my write-up? Was it because there was nobody at the sprint from the Craft team? Or because nobody had tried it on Linux, and there was a huge Linux bias in any case? I don’t know… It was discussed during the meeting, though.

As an aside, much time was spent discussing docker, but when it was discussed, it was discussed as part of the dependency problem. However, it is properly a solution for running a build without affecting the rest of the developer’s system. (Apparently, there are people who install their builds into their system folders.) Confusingly, part of this discussion was also about setting environment variables to make it possible to run builds that are installed outside the system locations, or not installed at all. Note: the XDG environment variables that featured in this discussion are irrelevant for Windows and macOS.

Currently, newcomers to pim, frameworks or plasma development are pointed towards kdesrc-build (https://community.kde.org/Get_Involved/development, four clicks from www.kde.org), which, not unexpectedly, leads to a heavy support burden in the KDE development communication channels. For other applications, like Krita, there are build guides (https://docs.krita.org/en/contributors_manual.html, two clicks from krita.org), and that also is not a feasible solution.

As a future solution, Ovidiu Bogdan presented Conan, a cross-platform binary package manager for C++ libraries. This could solve the dependency problem, and only the dependency problem, but at the expense of making the run problem much harder, because each library is in its own location. See https://conan.io/.
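
For illustration only, this is roughly what a Conan recipe for a small KDE application could look like; a hypothetical sketch in Conan 1.x style, with made-up package references for Qt and KDE Frameworks, since, as described below, most of those dependencies still had to be packaged:

    # Hypothetical conanfile.py sketch for a small KDE application (Conan 1.x style).
    # The qt and kcoreaddons package references are made-up placeholders.
    from conans import ConanFile, CMake

    class KRulerPrototypeConan(ConanFile):
        name = "kruler-prototype"
        version = "0.1"
        settings = "os", "compiler", "build_type", "arch"
        generators = "cmake"
        requires = (
            "qt/5.13.0@kde/testing",           # Qt itself would need to be packaged...
            "kcoreaddons/5.62.0@kde/testing",  # ...as would every KDE Framework used.
        )

        def build(self):
            # Conan points CMake at the packaged dependencies via generated files.
            cmake = CMake(self)
            cmake.configure()
            cmake.build()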

Day 2

The attendees decided to try to tackle the dependency problem. A certain amount of agreement was reached that this is a big problem, so it was discussed in depth. Note that the focus was, again, on Linux, relegating the cross-platform story to second place. Dmitry noted that when he tries to recruit students for Krita, only one in ten is familiar with Linux, pointing out that we’re really limiting ourselves with this attitude.

A KDE application, kruler, was selected as a prototype for building with dependencies provided either by flatpak or conan.

Dmitry and Ovidiu dug into Conan. From what I observed, laying the groundwork is a lot of work, and by the end of the evening, Dmitry and Ovidiu had packaged about half of the Qt and KDE dependencies for kruler. Though the Qt developers are considering moving to Conan for Qt’s 3rdparty deps, Qt in particular turned out to be a problem: Qt needs to be modularized in Conan, instead of being one big, fat monolith. See https://bintray.com/kde.

Aleix Pol had already made a start on integrating flatpak and docker support into KDevelop, as well as on providing a flatpak runtime for KDE applications (https://community.kde.org/Flatpak).

This made it relatively easy to package kruler, okular and krita using flatpak. There are now maintained nightly stable and unstable flatpak builds for Krita.

The problems with flatpak, apart from the politicking, consist of two different opinions on what an appstream file should contain, checks that go beyond what the freedesktop.org standards demand, weird errors in general (you cannot have a default git branch tag that contains a slash…), an opaque build system and an appetite for memory that goes beyond healthy: my laptop overheated and hung when trying to build a Krita flatpak locally.

Note also that the flatpak (and docker) integration in KDevelop is not done yet, and didn’t work properly when we tested it. I am also worried that KDevelop is too complex and intimidating to use as the IDE that binds everything together for new developers. I’d almost suggest we repurpose/fork Builder for KDE…

Conclusion

We’re not done with the onboarding sprint goal, not by a country mile. It’s as hard to get started with hacking on a KDE project, or with starting a new KDE project, as it has ever been. Flatpak might be closer to ready than conan for solving the dependency problem, but though flatpak solves more problems than just the dependency problem, it is Linux-only. Using conan to solve the dependency problem will be very high-maintenance.

I do have a feeling we’ve been looking at this problem at a much too low level, but I don’t feel confident about what we should be doing instead. My questions are:

  • Were we right to focus first on the dependency problem and nothing but the dependency problem?
  • Apart from flatpak and conan, what solutions exist to deliver prepared build environments to new developers?
  • Is KDevelop the right IDE to give new developers?
  • How can we make sure our documentation is up to date and findable?
  • What communication channels do we want to make visible?
  • How much effort can we afford to put into this?

Toying with iOS

So, two years ago I thought porting Krita to iOS or Android might make a dandy research project. A bit of context: if I spend 500 hours a year on an approved R&D project, I get a tax break. Plus, I like doing new stuff now and then. My 2018/2019 R&D project is Resource Management for the 21st Century; a previous one was Python Scripting.

In 2016, there wasn’t a decent Android tablet with a pen available anymore. The Wacom Cintiq Hybrid Companion was stuck on an ancient version of Android and wasn’t being made anymore, and Samsung’s Note tablet was an older model. The iPad Pro was new, so I decided to experiment with that. I got myself an iPad Pro, a Pencil and…

I tried to put a simple little example application on the iPad. I found something that demonstrated using the Pencil, and then discovered that Apple wouldn’t allow me to put code I had built myself on my iPad. I needed a developer account and keys and everything.

I told myself I would investigate that, but never had time to dig in.

Then in 2017, I gave the Cupertino Shylock the 99 ducats it wanted, and got the account. Mainly so we could sign our macOS builds and disk images — Apple making it deuced hard for people to run unsigned software. Now they’re going to make it even harder — they want applications in the macOS App Store to be notarized. But I digress…

So, now, at the end of 2018, in the week off I usually allow myself between Christmas and New Year’s Eve, I finally sat down to experiment a bit.

First, I loaded the test application I had selected into XCode. I plugged my iPad into my Macbook Pro — for the first time since I had bought the hardware! Stuff happened, and I had to click through various dialogs, and then the device popped up in XCode.

It was quite difficult to actually find where to put my Apple ID as the “Team”: it didn’t work to simply tell XCode what to sign the application with; it needed something it chose to call a “Team”.

But then everything worked! Yay!

Okay, next step. Get a Qt application running on the iPad. I downloaded Qt again — usually I build it myself with a bunch of patches, but I didn’t want to try to build Qt for iOS myself, nor mess with the development tree I use for Krita.

Qt’s documentation was excellent, and pretty soon I had the Tablet example application running on the iPad. It looks a bit weird, because that’s a QWidget-based application, but that’s fine. ClipStudio Pro on iOS also is a compleat Desktop Application, with popup dialogs and menus and everything, so I am sure Apple wouldn’t mind… And the Pencil was supported really well, so that’s very hopeful.

Now I only had to make one more experiment before starting to tackle maybe porting Krita: port the Tablet example to CMake, load that in Qt Creator and use Qt Creator to build it and deploy it to my iPad.

Well, that was my Waterloo. CMake doesn’t officially support iOS yet. G’Compris, which does, does that by providing a qmake file and some manual instructions. Google turns up a ton of conflicting advice, some old and outdated, some newer and more hopeful. I have tried to make a start on it, but no dice yet. If you know how to make CMake build and deploy to an iPad, answers on a postcard, please!

And we’re almost home…

We left really early this morning on the trains to Würzburg, Hannover and Deventer. I was pretty smart, if I may say so, because I had given us half an hour or more to change trains. Deutsche Bahn is a wonderful institution, but experience has taught me that, especially in summertime, six minutes is not enough of a safety margin. So we made all our changes, and even had time for lunch in Würzburg.

So, yesterday really was the last day of Akademy for Irina and me. And all of that day was taken up with the fundraising training by Florian Engel. And it was worth it! Oh, gosh, wake me up in the middle of the night and ask me whether it was worth it!

Practical, to-the-point, flexible, engaging, going deep where we needed that, giving examples from outside of free software so we were all getting new ideas just from those examples. I need to prepare my notes for a discussion during Krita’s Monday meeting, about our September campaign, software platform, and donation page…

For the rest, it was great to meet so many people I hadn’t seen for way too long, including Inge Wallin, with whom I, back in the Nokia days, had founded KO GmbH. Or people I work with every day, but had never met, like Ben Cooksley. Productive discussions about things as diverse as debug symbols in appimages or ways to attract and retain new contributors. Meeting and sitting down with Eliakin, my 2017 student, was awesome as well; pity KDE is so busy that we couldn’t spend more time together!

I went to Akademy feeling that the relationship between Krita and KDE is kinda difficult. Krita is part of KDE, but at the same time, Krita is getting really big. We’re using up quite a chunk of bandwidth: after plasmashell, we’re the project with the second-most bugs reported per year, and still people working on Krita don’t have much of a tie to KDE, and people working on KDE seldom have much of an idea what’s going on in Krita — other than nodding and telling me Krita is one of KDE’s flagship projects. Sure it is, and I was very much reassured that we’re not using too large a chunk of KDE’s resources, and could even use more. I’m not sure how to “fix” this, if a fix is possible. If we’d have our Krita sprint during Akademy, I’m sure that would help — but it would also be a pretty unproductive sprint for Krita.

This still needs mulling over.

Museum Day, or, the Benefit of Skiving Off

Tomorrow, there’s the fundraising training session. Given that we’ve been raising funds for Krita since time immemorial (our first fundraiser was for two Wacom tablets and art pens so we could implement support for them, the second was to let Lukas Tvrdy work on Krita for a couple of months, and after that we’ve had the Kickstarters), that might seem superfluous. But I’m still hoping to learn lots. After all, it’s not like we’re exactly awash in money.

But today, Irina and I went all-out for a day in Vienna. We just took the day off, had a lazy morning with breakfast in the hotel room (tea and croissants…), then took the underground to the Karlsplatz. From there, it was an easy walk to the KHM. Vienna is quite compact.

One thing I love about Vienna is the ubiquitous availability of non-sugary soft drinks. That is, soda zitrone — sparkling water with lemon juice. Half a litre of that in the museum cafe rehydrated us sufficiently to go out and see the parts that we hadn’t seen before. The French/Italian/Spanish parts of the museum are not as paralyzing as the Flemish/German/Dutch parts, but there was plenty! In particular, the three portraits of the Infanta of Spain, at ages 4, 6, 8 (or thereabouts) were touching. Gramps, being the Holy Roman Emperor of the German Nation, had asked his son-in-law for regular updates on his little darling grandchild, and got them, painted by Velasquez.

The Roman/Greek/Egyptian part was curious more than impressive: quantity over quality, perhaps, but still, interesting. It’s also the most unreconstructed part of the museum, with the exhibits often being labeled only in type-written German, on yellowing paper.

Having gone through that section, we were conveniently close to the museum cafe again, where they do serve excellent food. So we lunched there, then went back to our favourites in the Dutch/Flemish/German paintings sections. I spent half an hour with Rogier van der Weyden again, and if it weren’t for that fundraising workshop tomorrow, I would spend an hour in that room again tomorrow. But we’ve got a year pass and we will return. I like the KHM better than the Bodemuseum in Berlin… There were other paintings I stared at, trying to remember all of them, like the Reynolds in a little side room. I was going all squiggly-eyed, so I decided to try and find Irina.

As I was staggering towards the exit, I suddenly became aware of being spoken at by a clean-shaven person, in what I thought was Danish or Swedish or some other language I don’t speak. It turned out to be one of the other Akademy attendees, a Dutchman. I had so much trouble coming down to earth and realizing that he was speaking a language that I could understand! Afterwards, I felt like a loon.

From there, we went out in search of beer. It was, by now, afternoon, and a warm one. We failed, though! First we reached the Treasury. Our year pass is valid there as well, and we had been told the Treasury museum is in the medieval part of the Hofburg. And since the Hofburg is, sorry, weird (it’s like an ordinary, rather plain apartment building of the kind you find all over Vienna), we were like, let’s see what the medieval parts look like!

Well, there wasn’t much of that visible. But the presentation was really pretty good: excellent explanations, impressive exhibits, and lots of ancient costumes, too. What I really want to know, though, is: how can textiles dating back to the Norman kingdom in Sicily, C12, be as smooth and hale as the socks and tunics and orarion that are shown? Those 1000-year-old swords: how can the steel look like it was forged last year? I’m sure it’s that old, but how has it been conserved and preserved like that?

From there we went on, and found a Kurkonditorei — I guess it’s Kur, because you can only get beer in 0.3 and not 0.5 measures, which must have a slimming effect. Still, the beer was cool, my sandwich was good, Irina’s topfenknodel were good too, or so I have been told, and there were so many interesting people to watch… We had another beer.

And then it was time to go back to the hotel, shower, read mail, go back out to the venue area, find that the Bep Viet restaurant was packed, have a pizza at the pizza place, go back again, and realize that this has been one of the nicest Akademys I’ve attended, and that Vienna is one of the nicest places I’ve visited.