The word alternative is one of those shifty terms, with a definition that changes depending on perspective. For instance, something that is alternative to one person is the norm for another. Generally, the term alternative gets applied to whatever is not in the majority or the mainstream.
Then again, sometimes the term “alternative” gets attached to the second instance of something. If a web server, such as Apache, exists, then any time a different web server gets mentioned, it gets the alternative badge, because we all silently concede that whatever it is, it’s an alternative to that big one we all know about.
Problems of persistence
These thoughts occurred to me the other night while I was tracking down a bug in some simple animation software I wrote. In this software, a user clicks a frame in the timeline and that frame gets an overlay icon or badge to mark it as the current selection. If the user clicks the frame again, we assume the user is toggling the selection off, so the badge gets removed. Pretty obvious, typical user interface (UI) behavior.
Click on, click off.
The problem was that after toggling a frame off, selecting that same frame again failed, because the frame still believed itself to be the active selection. The fix was simple enough: some rudimentary garbage collection to clear out the stale state (the larger problem is that the application needs a more robust selection library, but I digress). Still, it dawned on me that this issue is similar to what we, as a community of computer users, experience when we talk about applications.
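To make the bug concrete, here is a minimal sketch of that toggle-and-cleanup pattern in Python. The Timeline and Frame classes are hypothetical stand-ins, not the actual animation code; the point is only that a stale per-frame flag has to be cleared before the next click can be interpreted correctly.

```python
# A hypothetical sketch of the timeline-selection toggle described above;
# names and structure are illustrative, not the author's actual code.

class Frame:
    def __init__(self, index):
        self.index = index
        self.selected = False      # whether this frame shows the badge


class Timeline:
    def __init__(self, frame_count):
        self.frames = [Frame(i) for i in range(frame_count)]
        self.current = None        # the frame that currently holds the badge

    def _collect_stale_badges(self):
        # The "rudimentary garbage collection": if the timeline thinks nothing
        # is selected, no frame should still claim to be the active selection.
        if self.current is None:
            for frame in self.frames:
                frame.selected = False

    def click(self, index):
        self._collect_stale_badges()
        frame = self.frames[index]
        if frame is self.current:
            # Second click on the selected frame: toggle the badge off.
            frame.selected = False
            self.current = None
        else:
            # Clicking any other frame (or re-selecting a previously
            # deselected one) moves the badge to it.
            if self.current is not None:
                self.current.selected = False
            frame.selected = True
            self.current = frame
```

Without that cleanup step, a frame toggled off could keep its selected flag set to True and refuse the next click, which is exactly the bug described.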
Whether an application is the first on the scene, the best marketed, or the one adopted by a majority of influential companies, we computerists often award it a badge early on, when it’s fresh. There’s an implication that the software earned that badge by merit. And as that software grows and develops, it gets to keep that badge.
The badge we give it is the right to be The One to which anything else is an alternative. We do it with open source projects and closed source projects alike. We assign this invisible and silent Seal of Authenticity without any RFC, without debate or survey. Sometimes the badge is, if only by default, accurate; if there really is no other application like it, then it’s hard to argue against calling software that comes later an alternative.
The problem is, there doesn’t seem to be a requisite renewal period for these badges that we unwittingly hand out on a first-come-first-served basis. We give our Seal of Authenticity to whatever makes the biggest (or only) splash at some point, and it becomes not just the standard in its class but the specification for everything that follows. You can’t make a word processor at this point without it being compared to Microsoft Word. To propose that Word is an insufficient measure of word processing power seems verboten, but for better or for worse, Word got the badge, and there’s been no garbage collection to free up memory for a second badge, or a new badge altogether.
There have been exceptions to this, of course—sometimes big popular applications finally fall out of favor, but more often than not, the computing public has an unnervingly long-term memory for its definitions list. You can rattle off general application types, and most people, Rorschach-style, have a brand name associated with each:
- Office: Microsoft
- Photo: Adobe
- Video: Apple
- Server: Linux
Is it really so clear, so obvious? Or are we just being trite?
Problems of scope
In programming and other fields there is a concept of scope, which defines the space in which something is true. In one function of an application, I might assign a value to a variable, but I only need that value within that function, so I make the variable local: it’s valid for this function, but another function knows nothing about it.
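In code, the idea looks something like this (a minimal Python sketch, with function names invented purely for illustration):

```python
def render_frame():
    badge = "selected"          # local variable: only render_frame() knows about it
    return f"This frame is {badge}."

def export_timeline():
    # badge is out of scope here; referring to it would raise a NameError,
    # because the value only ever existed inside render_frame().
    return "Exporting frames..."
```

A value that feels universal inside one function simply doesn’t exist in the next, which is exactly how “the obvious application” behaves from one industry to another.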
As it turns out, this is yet another great analogy for how we computer users define alternative software. Different people need different things from their computers, to the point that it may never even occur to someone that a particular piece of software not only exists but is the very linchpin of an entire industry. As someone who works in the visual effects industry, my definition of the obvious, de facto applications certainly differs greatly from that of someone who manages, say, construction material durability requirements, or even from someone who teaches the basics of video production to children.
The general computing public rarely acknowledges this, I suspect mostly because of marketing. It’s not in the interest of software ads, however disingenuous this may be, to acknowledge that competitors or alternatives exist. Every piece of software trying to sell itself is obligated to pretend that it’s the only real solution available: nothing else compares, and if you do find something else, you must compare it to this software, because this one’s the real one (it’s the one that got the seal, the badge).
And, strangely, outside of your own computing scope, your standard application becomes niche. You can sit down with your friends at the café and tell them how great this software is, but if it didn’t get the badge within their scope of computing, then you may as well be speaking Greek without UTF-8.
Reclaiming the term “alternative”
The requirements for getting the badge that makes all other software an alternative are pretty fuzzy. We’re not really sure whether it’s first-come-first-served, or market share, or brain share, or how we would even measure brain share. While those measurements do feel like obvious choices, it seems odd that availability rarely enters the equation.
Certainly in my own life, the natural barrier to entry for most everything I do, both professionally and as a hobby, has been one of acquisition. I only managed to get into audio production because Audacity existed and was $0 to use (I’ve since graduated to Qtractor, but Audacity was the gateway). It was available regardless of my financial state (which, as a college student at the time, was not good). FFmpeg single-handedly got me paid employment in the media industry, and I was able to learn and use it because it was available and cost nothing. The list goes on.
I realized some time ago that I live in an open source world. We all do, because open source drives so much of computing these days, but I mean that the way I compute is with open source at both the bottom and top of my stack—I use open source in my networking, I use an open source kernel to drive physical hardware, and I use open source applications at work and at home. To a degree, I live in a bubble, but it’s a bubble that I consciously built and it serves me well. So the question is: If the alternative is my everyday computing experience, why should I still define it as alternative? Surely my way of life is not alternative from my perspective.
OK, so alternative is a malleable term. But it’s bigger than that. It’s not just a question of life with The Munsters; it’s a question of who’s allowed in. With open source, there’s no exclusion; even in the worst case, where you feel unwelcome in some community that is building an open source application, you still have access to the code. Then the barrier to entry is your own resolve to learn a new application.
And that ought to be the standard, no matter what. My Rorschachian responses to application types default to open source; the alternatives are the ones you might choose to use if, for whatever reason, you find the tools that are available to everyone insufficient. You define your own alternatives, but my mainstream, day-to-day tools are not alternatives. They’re the ones that get my seal of authenticity, and they’re open to everyone.