[linux-elitists] Ethics v. Pragmatics (was: Why haven't you switched to MacOS X yet?)
Karsten M. Self
Mon Jan 13 16:51:22 PST 2003
Launching this as a new thread. And not to contest either Evan or Rick,
more taking both of them as foils...
on Fri, Jan 10, 2003 at 11:06:04AM -0800, Rick Moen (firstname.lastname@example.org) wrote:
> Quoting Mister Bad (email@example.com):
> > People who have them call 'em "ethics". People who don't call it
> > "dogma."
> Er, Evan, I hope you don't mind my saying this, but I get a little antsy
> when people immediately resort to gratuitous moralism in these debates:
> I see no reason to attribute lack of ethics to those who (mistakenly)
> label as "dogma" any preference for open-source. It's purblind and
> rude, yes; but hardly unethical.
> Furthermore, a preference for open source, where feasible, need not be
> rooted in "ethics" at all. Many people see it as a perfectly rational
> pragmatic response to decades of seeing one's IT agenda jerked around by
> third-party proprietary software publishers, resulting in uncontrolled
> business risk.
On the Ethics v. Pragmatics debate, I tend to come down slightly on
Rick's side: I use GNU/Linux because it does what I want it to do, and
I expect it to continue doing so. But I see the two issues as
inextricably bound.
I put this question to RMS at a dinner some years ago: do you believe
in free software because it is good, or because it is free? Richard's
answer: because it is free.
Richard is the idealist, the evangelist, the missionary. And to him
freedom is an absolute goal.
I lean to a more pragmatic bent: the tools I use should work. They
should be long-term credible. There should be a growth path. There
should be a consistency over time. They should be extensible. They
should be flexible.
When I began taking free software seriously, ~1997, I spent a couple of
years looking hard at the economic, legal, and technical underpinnings
of the movement. One of the more interesting items I turned up was a
history of the computing industry in the US, from the 1940s onward.
Cognition and Capabilities: Opportunities Seized and Missed in the
History of the Computer Industry
Richard N. Langlois
This begins with the emergence of IBM, from the ENIAC and UNIVAC
computers of the 1940s and 50s, displacing Remington Rand (aka Sperry
Rand) and beating out GE and RCA. A number of transition points are
worth noting:
- Emergence of programmable (rather than special-purpose) computers.
- Modularization -- of central processing units and peripherals, by
IBM in the 1950s to create new capabilities, combined with a leasing
model of hardware revenues, which reduced the cannibalization effect
of new products on the existing base (cf: The Innovator's Dilemma,
Clayton Christensen).
- Unification of the computing platform from diverse systems with
incompatible characteristics to a line with a common operating
system: System 360. This also extended the modular philosophy of
the hardware to the software.
- At about the same time, a transition from vacuum tubes to
solid-state circuitry, reducing costs, power consumption, and
floor-space requirements; and increasing efficiency.
- Emergence of the minicomputer. DEC's first systems had a higher
cost/performance ratio than IBM's offerings, but offered finer
granularity, and a lower entry point to enterprise computing.
- Emergence of open systems. Unix appeared on the scene in the mid
1970s, and came to trounce DEC's VMS systems in the 1980s based on
a minicomputer open architecture that extended not merely across
systems from one vendor but, at least at the source level, across
vendors.
- Personal computing also emerged in the enterprise in the 1980s,
largely as a pushback against decision making and control bottlenecks
in the "Glass House" corporate IT bureaucracy. PCs allowed both
software and hardware allocation decisions to be decentralized and
put in the hands of end users, or at least departments.
- Apple's loss of marketshare to the PC. Based on cost, decentralized
control, modularity, and flexibility, a competitive, though arguably
technically inferior, PC market overcame Apple's early lead, and
continues to do so to this day.
Langlois's analysis leaves off at this point, but I'd throw in a few
additional examples or trends:
- The dominance of free and open documentation formats such as HTML
(and variants) and DocBook over closed, proprietary, or
single-purpose standards: dead word-processor of your choice,
"OpenDoc" (an Apple initiative), the GNU "info" format, etc. This is
a theme Tim O'Reilly has raised on numerous occasions, and fits
hand-and-glove with the Langlois analysis.
- The success of the X11 windowing system over OpenLook / NeWS,
despite the arguable technical superiority of the latter, largely on
the basis of X11's liberal licensing.
- The success of GNU/Linux against both proprietary Unices and
Microsoft in the small to mid-sized server space. Similarly the
overwhelming success of GNU/Linux over all comers in the
embedded/handheld space, largely based on costs, capabilities, and
flexibility.
- The success of LAMP (GNU/Linux, Apache, MySQL, Perl) in dominating
the web applications space. This is turning, somewhat. More
accurately, LAMP is extending: Apache and GNU/Linux still dominate
the webserver and OS components, but we're seeing the addition of
Postgres, PHP, and Java in the database and language slots. The
competing and losing strategy is the legacy MS Windows, IIS, ASP,
SQL Server mix.
Turning this into an analytic tool rather than merely a laundry list, I
see the following principles and themes emerging:
- Worse is better. Many of the successful strategies pitted a
slightly inferior, *but good enough*, competitor against a more
elegant solution. In all cases, "worse" also tended to be: less
expensive (on a per-unit if not on a capabilities basis), more
flexible, more modular, *and less centrally controlled*.
- Cheaper is better. A product with reduced cost, *or reduced entry
cost*, dominates a more expensive one.
- Modular is better. Selling pieces to be assembled (or assembling
pieces and selling many different products) beats a highly tuned,
but monolithic, integrated solution.
- Decentralized is better. Reducing centralized control, often
written as "the right to fork" in free software discussions, means
that more ideas can be tried, and that the proving ground for new
development is larger. This is critical, as the inventor of a new
technology *never* foresees all of its possible applications. It also means
no patent royalties or other licensing restrictions.
This dynamic is key to understanding both the rise and the likely
fall of the Microsoft PC market. PCs emerged and succeeded as a
decentralization tool -- they enabled users and broke the
stranglehold of the corporate IT fiefdom. Today, Microsoft
represents to an ever greater extent the role of controlling authority,
dictating terms under which other actors in the IT market can
participate. GNU/Linux and free software offer decentralization and
autonomy to hardware, software, and service vendors, as well as end
users.
- Standard is better. Providing a uniform base on which to roll out
services tends to increase utility -- IBM's s360, DEC's minis, Unix,
the PC, and GNU/Linux, as well as industry and technology standards
such as ASCII, RFCs, etc.
So you ask, what the hell do freedom and ethics have to do with this?
Simply: the free software development model feeds each of these success
factors.
I'd like to revisit briefly the _foundations_ of free software -- the
principles on which it is grounded and which lead to its existence.
1. A development model: the open source "Bazaar" described by Eric
Raymond, which takes advantage of many eyes, tight development
cycles, and continuous evolution.
2. A legal framework of free software licensing, including both
copyleft (GNU GPL and similar) and less restrictive free licenses.
3. An economic model which provides sufficient benefit to individuals
or firms engaged in development of works not exclusively retained.
4. A software architecture consisting of largely independent, modular
design, allowing individual developers to "wrap their minds" around
a given problem, and for code to be readily shareable among
developers and projects.
5. A widespread, very low cost distribution network. The Internet.
6. Ready access to reasonably powerful computers with development tools.
7. Open, accessible, standards. GNU/Linux itself was based on the
convergence of the POSIX standard and x86 hardware, with additions
of TCP/IP networking, X11, numerous RFCs, etc. Closing standards,
or making them inaccessible (by licensing or royalty requirements)
is a serious threat to GNU/Linux.
And on top of these, factors indicating a probable ultimate success of
free software:
- Free software tends to expediency: people build stuff that works,
with an emphasis on "get it working now" rather than "get it
perfect".
- Though much is made of free as in speech, I feel that free as in
beer is also crucial. One of the factors contributing to Microsoft's
dominance of the PC market is its dual use of price as a weapon:
undercutting rivals' products when faced with competition, and
extracting usurious profits when in a monopoly position to finance
other projects.
Despite its market dominance, Microsoft is utterly dependent on only
two franchises -- operating systems and office software -- for all
its profits. If either can be undercut, the company is stuck
pitting revenues against marketshare. Losing on either count is not
long term tenable. Free software allows both cost and marketshare
to be attacked simultaneously.
- The free software development model is to produce largely
independent, modular, software. The exceptions to this rule are
largely "liberated" proprietary products, notably Mozilla (which
required a complete redesign) and OpenOffice.org (which will have to
face the same music). The Unix philosophy of small, specialized
tools to do one job largely applies. Even the larger projects --
emacs, perl, apache, the kernel itself -- are themselves highly
modular.
- Free software is by definition decentralized. Sun will eventually
learn this. The right to fork is *not* optional.
- GNU/Linux offers a uniform computing environment over virtually the
entire electronic landscape. This includes serving as the binary
compatibility standard on Intel architecture for POSIX environments,
and source compatibility across a wide range of processors (a dozen
architectures in the Debian project), scaling from wristwatches to
supercomputers.
...which gets us in a roundabout fashion to my point.
Free software *depends* on the intrinsic freedoms RMS espouses. This
doesn't mean that a given adopter needs to embrace these, have them as a
significant deciding factor, or even be aware of them. As a whole,
however, free software's success is pinned on these principles. And due
to the dynamic that they create -- technical, cost, and control
characteristics -- the ultimate success of free software is inevitable.
It's an inevitability that can be put off for a time, at a cost. But
it's where we're headed.
And no, I don't foresee a world in which _all_ software is free. But by
and large the key components will be, and any significant software
sector will have its free software alternatives. The proprietary tools
which do remain should be the better for the competition, and may be
successful if they provide a compelling advantage.
And since Google is my filing cabinet and I can never turn up the
Langlois document when I'm looking for it....:
Keywords: computer industry history economics ibm dec vax unix
microsoft apple richard n. langlois Cognition and Capabilities:
Opportunities Seized and Missed in the History of the Computer
Industry system 360 RISC ken olsen
Karsten M. Self <firstname.lastname@example.org> http://kmself.home.netcom.com/
What Part of "Gestalt" don't you understand?
Integrity, we've heard of it: http://www.theregister.co.uk/