Free Software on Campus: Something for Nothing?
This is the text of a paper being
presented at EduTex
in San Antonio on February 23. -- Ed.
Introduction
One of the most interesting features in the world of
information processing technology during the closing
years of the last century has been the meteoric rise of
"free" software. The ethos motivating this
movement has been the belief that information technology
has become so vital to the success of individuals in
contemporary society that it should be considered a public
resource with an international constituency.
Organizations such as the Free Software Foundation and
the University of California at Berkeley provided initial
implementations and the required legal apparatus to
guarantee that the work of individuals contributing to
this project would remain free to the public. The
explosion of Internet subscriptions in the last few years
added the communication mechanisms necessary for
programmers the world over to combine their efforts and
share their work.
One of the most widely reported products of this
activity has been the GNU/Linux computing environment.
Using an operating system implementation by Linus
Torvalds (and friends) and applications/utilities from
the GNU community, this computing environment now claims
over ten million users [1] and has been adopted as a
native computing environment by commercial hardware
vendors on everything from palm-tops to
supercomputers [2]. While Linux
has become the poster child of the free software
community, it is far from the only, or even first,
evidence of this philosophy. This discussion will often
use Linux as an example of features that are common to
several other free software environments as well.
Free Software: What is it and why should I care?
Much energy has been expended (largely dissipated as
heat) on the definition of free software. Rather
than attempt to constrain the discussion or argue
semantics, let's visit some of the objectives toward
which the various definitions attempt to direct our
focus.
The most obvious context assumed when you hear the
phrase "free software" is that the software is obtainable
for no money. This is often referred to as the "free as in
free beer" connotation. In point of fact, most free
software is available for download over the Internet if
you have a fast data link (or a great deal of patience)
and the technical expertise to install it. Free software
is also available in commercial distributions you can
purchase (free?) as a convenience, so cost must not be
the only issue. For example, you may have a friend who
bought a commercial distribution of software which they
are free to share with you. This second context
of free, known as the "free as in free speech"
connotation, is the definition at the very heart of the
free software movement.
The Free Software Foundation has been the most notable
exponent of the "free as in free speech" focus,
and has eloquently expressed this philosophy [3],
backing up philosophy with impressive productivity. The
Free Software Foundation defines four kinds of freedoms
for the users of software:
1. The freedom to run the program for any purpose.
2. The freedom to study how the program works and to
adapt it to your needs (access to the source code
is a precondition for this).
3. The freedom to redistribute copies so you can
help your neighbors.
4. The freedom to improve the program, and release
your improvements to the public, so that the
whole community benefits (access to the source
code is a precondition for this).
In keeping with Richard Stallman's philosophy of idealistic
pragmatism, the Free Software Foundation protects
these freedoms with a copyleft. In Stallman's
own words: "To copyleft a program, first we copyright
it; then we add distribution terms, which are a legal
instrument that gives everyone the rights to use, modify,
and redistribute the program's code or any program
derived from it, but only if the distribution terms are
unchanged. Thus, the code and the freedoms become legally
inseparable."
Software protected by copyleft is not the only kind of
free software. The Berkeley Software Distribution license
and "public domain" software include the
freedom to change the distribution terms of modified
software so that it is stripped of one or more of the
freedoms protected by the copyleft. This freedom to give
up your rights is idealism certainly, but perhaps without
pragmatism. The term "open source" is gaining
popularity, especially among vendors of proprietary
software. This term is sometimes used as a synonym for
free software, but can also harbor "freedoms"
such as unilateral revocation of one or more of the four
freedoms cited above. When software is offered as free
but is not copylefted, it is often a useful litmus test
to ask why. Many commercial vendors are courting the goodwill
(and free software and expertise) of the free software
community in hopes of gaining commercial advantages for
their proprietary software.
The argument that you as an educator should be aware
of this free software movement is therefore twofold. The
University of North Texas, my current employer, spends
about $100,000 annually on a site license for
Microsoft software (for faculty and staff - this is not
direct assistance to students). There is no reason to
single out Microsoft, of course, since another $100,000
or so is spent on Adobe software products, $19,000 on
SAS, and you can doubtless add to this list from your own
institutional budgets. This is not a diatribe about the
evils of commercial software, as many of these products
provide valuable, and in some instances, unique
functionality. There is, however, a noticeable
distinction between the involved exercise of justifying
purchases of hardware (obtaining multiple quotes,
qualifying vendors, support for historically
under-utilized businesses) to guarantee that the state
gets the lowest bid from approved vendors, and software
purchases where the state subsidizes virtual monopolies
without noticeable concern for the best value for its
money. Surely free (as in beer) software should merit
some attention in this context alone.
The more important and far-reaching context, from an
admittedly personal point of view, is the accessibility
of information technology to students (and, heaven
forbid, even interested faculty and staff). Exposure to
the Internet and acquisition of "computer
skills" is being given high priority in the
curriculum of secondary education, but higher education
should not be satisfied with "vocational
training". The dissemination of knowledge in this
critical technology must not be sacrificed to the
amassing of profits through intellectual property laws.
Imagine a campaign to increase literacy where the
language was "owned" by some business that
charged for the (non-exclusive and legally constrained)
right to use it; a right which the business could
unilaterally renegotiate or revoke. Free software is a
cultural feature of our place and time which demands
representation in any curriculum striving to produce
citizens capable of making informed decisions. The
freedom to share your ideas, your tools and even your
source code with others of similar interests around the
world is one of the fundamental rights from which a
global society might spring.
Free Software: Where does it come from?
One measure of the extent to which the
"business" model of interaction has permeated
our society is the puzzlement many feel when offered
something for free. If software production is an
expensive activity requiring considerable time from
highly skilled individuals, why would someone give it
away? The history of the manipulation, storage and
promulgation of information (the raison d'être of
software) is one of a struggle between opposing
priorities. On one hand, the sharing of information can
provide a kind of cross-fertilization which leads to the
discovery of new information; in some instances, this
discovery of new information can lead to a re-evaluation
of existing information - a change in the status quo.
Some human endeavors, notably the sciences, tend to place
a high priority on discovering new ideas in order to
verify or refine existing ones; other activity has a
vested interest in preserving the status quo. The human
condition seems drawn schizophrenically in both
directions.
Not surprisingly, the sciences provided the field for
some of the earliest free software. One of the great
boons to science provided by computers is the ability to
model processes which are difficult or impossible to
observe physically. Since the crucible of science is
disputation, your conclusions must be reproducible by any
of your colleagues inclined to investigate your
proposition. The idea of developing and sharing programs
which all interested parties could use as a standard for
testing conclusions was therefore obvious. Trading
software was one way to let others reproduce your
results. In addition, to be credited for your
contribution to the body of scientific knowledge, you
must be free and able to publish. Software intended to
aid in the publication of technical information was
developed in the late 1970s by Donald Knuth and was made
available for free to address this issue. This
typesetting system, named TeX, was provided with its own
fonts and font management software which, after decades
of refinement, can still process its original
documentation to publication-quality standards today
(how's that for backwards compatibility?). Literally
thousands of individuals have contributed to the
functionality, adding macro sets like LaTeX and even GUI
front-ends like LyX [4] (with which this document was
produced). TeX (augmented by customized macros) is the
preferred format for documents submitted to technical
journals like those of the American Mathematical Society,
and one of the formats employed by publishers like
O'Reilly and Associates.
The GNU project [5] was begun in 1984 to provide an
entire computer operating system which was free software.
The Free Software Foundation is a tax-exempt charity that
raises funds for work on the GNU project. The reason that
you often see GNU/Linux as the operating system name is
that only the kernel is Linux; most of the actual
applications, utilities and even the compilers used to
build the kernel are provided by the GNU project. When
you observe confrontations between
commercial vendors and the GNU project about the
"correct" definition of free software, it might
be useful to remember that the GNU project has been at
this for over a decade-and-a-half, long before it became
savvy marketing to appear "open".
The Berkeley distribution of UNIX (generally referred
to as BSD for Berkeley Software Distribution) is
available in several free and commercial versions. The
first release of the FreeBSD distribution was in December
1993, and it has since earned an enviable reputation for
stability and performance as a network server. Currently,
the most popular version of this environment is FreeBSD,
but NetBSD has been a leader in the area of hardware
portability (the ability to run on the cpu architecture
of many different vendors), while OpenBSD has long been
considered one of the most secure network platforms
available. The selection of this same 4.4BSD code base
for the new Apple operating environment (OS X) is at
least a "left-handed" compliment to the inherent
value and stability of this free software.
Distributions: Freedom of Choice
Many folks new to the free software community are
somewhat abashed at the proliferation of dozens of
different Linux distributions (there are also several
different BSD distributions). If you are more accustomed
to a computing environment where one corporate entity
enforces uniformity by making all the decisions about how
the environment works, you may be led to conclude (and
even encouraged by commercial vendors to believe) the
free software movement is "out of control".
Consider for a moment the form that control often takes:
these decisions about the computing environment become
product differentiation features and intellectual
property that are the subject of legal controversy
costing millions of dollars (costs borne by the consumer).
All this to ensure that one product does not infringe on
even the "look and feel" of another (and
coincidentally will not inter-operate usefully), and that
nobody can innovate on the product except the copyright
holder.
Now let's look at the out of control world of free
software. First, most of the different distributions of
Linux will inter-operate (i.e., you can move software from
one to the other without difficulty, assuming the
distributions support the same cpu architecture - most
free software is designed to run on multiple hardware
architectures). As a matter of fact, many BSD systems
will also execute Linux programs and considerable effort
is expended in standardizing languages and data formats
to promote interoperability. Unlike proprietary software,
free software can be distributed in source code format.
Source distributions generally employ an
autoconf-generated configure script which can probe the
target environment and compile the code correctly for
many different platforms. The
areas of incompatibility come from running versions of
software which are too widely separated in release dates
(i.e., features in newer versions will not be in older
versions). As an example, gcc (the compiler base at the
core of most free software) has been designed for maximum
portability and runs on practically all modern processors
(there's even a version that runs in the Microsoft
environment).
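To make this concrete, here is a minimal sketch (in
Python, itself free software) of the conventional
autoconf build sequence; the source directory name is
hypothetical, but any autoconf-based source tree follows
the same pattern:

    # Build a free software package from source the conventional
    # way. The directory name "hello-1.0" is a hypothetical
    # example; any autoconf-based source tree follows these steps.
    import subprocess

    def build_from_source(srcdir):
        # configure probes this machine's compiler, libraries and
        # headers, then writes Makefiles matching what it found.
        subprocess.run(["./configure"], cwd=srcdir, check=True)
        # make then compiles the code for this particular platform.
        subprocess.run(["make"], cwd=srcdir, check=True)

    build_from_source("hello-1.0")

The point is not the wrapper but the protocol: because the
source is available, one recipe builds correctly on many
different platforms.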
If the distributions inter-operate, what is the area
of "product differentiation"? Different
distributions generally target different audiences. Some
versions are optimized as personal workstations, while
others play to the network server crowd. All Linux
distributions use the same core components, but one
distribution may have a GUI installer application
intended to aid comparative novices, while another is
oriented toward batch installations of groups of systems
where user interaction needs to be minimized. Some
distributions are intended as minimal installations of a
secondary OS in a dual boot setup, assuming you will only
want to dabble with free software rather than use it as
your primary operating system. Other distributions are
designed to facilitate the implementation of clusters of
compute servers or optimize the system to serve as a
network appliance, or run one of the "tiny"
variants suitable for embedded products. Regardless of
the orientation of the distribution, nobody thinks there
is any reason to limit them (why would you only want one
breakfast cereal to choose from?). If you build a custom
distribution of Linux to suit your particular set
requirements, maybe other folks would prefer to use it
rather than re-invent the wheel. You could make your
distribution available on the net for free (speech and
beer), or get venture capital and try to sell it for
profit. In short, people innovate with free software
because they can, and this situation should be viewed as
normal rather than the artificially constrained legal
morass of proprietary software.
Applications: What can I do with Free Software?
A common retort to inquiries about free software is
that "there are no applications." Often, this
can be restated more accurately as "there is no
support for the commercial application I am accustomed to
using." There are very few application areas where
there is no free software counterpart. The most commonly
used commercial applications, of course, are the ones
most encumbered with legal constraints designed to make
it difficult to move to another product. This generally
means that you must be prepared to learn a new program
which performs the same task in order to move to free
software. So when you're tempted to ask "Why don't
they do this like <commercial product of choice>
does?", please remember the reason is the business
practices of your commercial vendor (which is probably
part of the reason you are having this struggle in the
first place).
The most prevalent application area among computer
users of my acquaintance is web surfing. As an aside, let
me point out that the World Wide Web, initial web
browsers, even the Internet, were not productions of
commercial software vendors, but began life as free
software. When e-businesses discuss all the innovation
involved with putting their services on the web, remember
that was only possible because the underlying technology
was made available to them for free; therefore, they
should not be allowed to copyright the whole enchilada
when they only sprinkled a few onions on top. The most
common web server on the Internet [6] (apache [7]) is
free software. The Mosaic and Mozilla offerings are
common web browsers (free as in beer), but there are
several interesting efforts to produce smaller browsers
which can run in more resource-constrained environments.
After web surfing the most common application seems to
be a communication client. Be it email, irc or some form
of online chat, people seem to find the peer-to-peer
communication provided by network-connected computers
extremely valuable. There are many email clients
available as free software, and sendmail (a program for
actually transferring email from one machine to another
across the Internet) is still the most common free mail
transfer agent among several available. The number of
chat-class clients depends on how recently you checked
the net (i.e., new ones seem to pop up almost daily).
Technical documentation was one of the early
applications for UNIX. TeX is easily as powerful a
document preparation system as any commercial product.
There are special macros for composing TeX documents and
GUI front-ends to smooth the learning curve. If your
writing needs are not so extensive, you might prefer the
word processor component of one of the "office"
suites like Gnome Office [8]. In
general, these suites contain a group of applications
considered useful for a business, and contain the usual
fare like spreadsheets, contact managers and personal
information managers, in addition to basic word
processing.
If you edit documents which are not intended for
printing (such as editing source code for computer
languages), you will probably find that word processors
are a pain because they have a tendency to embed lots of
"invisible" tokens in the text. A text editor
assumes, for the most part, that you will enter what you
want in the file. Editors are the application computer
programmers spend most of their time using, so there is
a plethora of choices in the free software community
because UNIX has been a major software development
platform since its inception. Some editors are very
small, simple and easy to learn and use, while others are
incredibly elaborate. Your choice depends only on your
preferences, and the amount of time you want to spend
examining all the choices. The default UNIX text editor
is named vi, but one of the many variations of emacs is
commonly found on most programmers' machines.
There are several free image processing applications
for Linux, but the most commonly mentioned these days is
the GIMP (GNU Image Manipulation Program [9]).
The GIMP is useful for photo retouching, image
composition and construction. It has many
user-contributed plug-ins, and is often the source of the
many decorations found on free software web sites.
A software component not generally thought of as an
application is the window manager. This is the software
that determines how the graphical user interface appears
and works. This is the product "look and feel"
that is the subject of so much legal wrangling between
commercial software vendors. Because free software has
been freed of these constraints, a system is not required
to look any particular way. You may change its appearance
as often as you like, and there are web sites dedicated
to window manager themes [10]. The degree of possible
customization is phenomenal, and the activity is one many
in the free software community seem to find engaging. The
basis of all this work is the X Window System, which was
first provided to the community by MIT. In recent years
it has been taken up by the XFree86 project and is the
most common GUI environment on free software systems. As
a measure of the power of free software, the X Window
System is now the default windowing environment on even
commercial UNIX systems, and there are commercial X
servers for Microsoft Windows.
Support: Freedom to Learn
To this point, my message has been primarily
evangelical: to convince you the free software concept is
an important one; the ethos laudable. Perhaps you have
been in the computer support role as many years as I, and
have come across software that you wouldn't deploy
"even if they gave it to me!" It is simply a
fact that deploying any new software incurs an increased
support burden; from the support perspective, there is no
such thing as free (beer) software. From another
perspective, however, free (speech) software returns some
time by relieving you of all the legal obligations which
are part of your contract with proprietary software
vendors. There are no license managers or anniversary
dates or audit trails to manage in order to keep you from
being sued. There are no students frustrated to find that
they cannot use this application at home on their own
computers without paying for a copy. There are no budget
proposals to justify the expenditure. You can sleep
nights (or days) knowing that nobody will expect you to
be able to document where every application is installed
or verify that older versions have been uninstalled.
There is another side to this free software business
that can cause support headaches. The very fact that it
is free (beer) means that individual users are more
likely to be installing software on their own systems. If
you currently enjoy absolute control over what software
is installed on the systems you support, this may seem a
chilling prospect. In point of fact, most commercial
systems will let users install free (beer) software
offered under various guises as browser plugins or MIME
documents. These programs do indeed cause support
headaches, and the problems are sometimes severe enough
to require measures which "protect" users from
receiving software of this type. These same measures
(email filtering, disk write permissions, etc) can be
employed on UNIX-like free software systems as well if
your situation requires.
For many of the computer support folks where I work,
the support issue is often boiled down to "yet
another computer environment to support." I suspect
there is also an element of "yet another computer
environment for me to learn before I can support
it." Indeed, it would be a mistake, in my opinion,
to deploy any software without obtaining (either learning
or hiring) a level of competence with it appropriate to
your responsibilities. The question becomes one of the
return offered for the energy expended. There can be no
avoidance of the fact that deploying new software will
cost you time and effort. I will argue that the time and
effort is, on the whole, unavoidable and required when
you upgrade commercial software as well. I believe the
process of learning is enhanced, however, when all
information on the topic is free and open for
examination. The Internet is full of web sites, news
groups, email lists and tutorials on free software, and
Linux has an entire section in most technical bookstores
these days. It has never been easier to learn about free
software.
Let's assume you decide you want to learn about free
software, but are under the impression that it is more
difficult to manage, or more insecure or more unreliable
than commercial software. Any of these propositions can
be true or false depending on the context. As a matter of
fact, these issues are interdependent to some extent - a
difficult-to-manage system may become unreliable or
insecure due to poor management. In the same vein, a
system which achieves ease of management by ignoring
security issues will be both insecure and thereby
unreliable when networked with other systems where
security becomes vital. Network connectivity raises the
support burden of any computing environment; the more
capable the system (in terms of network functionality),
the greater the support requirement. Often missing from
discussions of this topic is the observation that
"dumbing down" more capable network
environments is often more successful than trying to
improve functionality where the underlying technology
never assumed more than one user co-located with the
hardware.
If your use of computers has been restricted primarily
to Microsoft or Apple (that is, most folks), the first
feature of UNIX-like environments that may seem new to
you is the requirement to identify yourself (in other
words, to login). Not only do you have a user name, you
also have a password that is private information. I don't
hear the "this password business is silly nonsense;
I have nothing to hide!" argument as frequently
these days as in the past. Folks are beginning to
understand that you have individual responsibilities
under the law, and any computer environment that doesn't
recognize the inherent individuality of human society is
an unfortunate over-simplification of reality. Most
computer break-ins these days involve people wanting to
escape their own responsibilities by assuming your
identity. Consider other pieces of private information in
your life - your bank account number, credit card number,
social security number. If you are going to be held
individually responsible for activity done with your
computer, as you are for debts incurred by the use of
your credit card, it would be nice for your computer
environment to take some pains to identify you rather
than assume that anyone typing on its keyboard must be
you. Imagine using your ATM card without having to enter
your PIN!
When each user is a different individual to a computer
system, there is a level of account management overhead
that is a support cost not paid by the "any color as
long as it's black" approach. As compensation,
user management systems have been around for a long time,
and there are many tools which make the job easier.
Especially useful these days is the ability to do
individual user accounting, and the possibility of
disambiguating who did what, and when. On the desktop systems
I maintain, there are usually fewer than three or four
accounts which change rarely, so this is not a big
burden. On lab systems where several hundred students
need access, and the user base changes significantly
every few months, more complex systems based on
centralized account management must be employed. Most
contemporary commercial operating systems offer the
potential for differentiating users by roles (admin,
power-user, user etc). This functionality is inherent in
the design of UNIX-like free software systems.
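As a small illustration of how visible this per-user
structure is, the following Python sketch lists the
accounts a UNIX-like system knows about, with a crude
role guess based on uid conventions (the uid cutoff is an
assumption; distributions differ):

    # Enumerate local accounts with Python's standard pwd module,
    # which reads the system account database on UNIX-like systems.
    import pwd

    for entry in pwd.getpwall():
        # Role guess by convention (an assumption; uid layouts
        # vary): uid 0 is the administrator, low uids are system
        # services, and ordinary users typically start at 500.
        if entry.pw_uid == 0:
            role = "admin"
        elif entry.pw_uid < 500:
            role = "system"
        else:
            role = "user"
        print(entry.pw_name, entry.pw_uid, role, entry.pw_dir)

Every file and process on the system is tagged with one of
these identities, which is what makes individual
accounting and "who did what, when" questions answerable.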
Now let's address head-on the question "Does free
software require more support effort than commercial
software?" My response: absolutely not, for comparable
levels of security and services offered. At UNT there are
fourteen General Access computer labs open from eight to
twenty-four hours per day Monday through Thursday, and
for shorter periods Friday through Sunday. These labs are
primarily Microsoft oriented with a smattering of Macs.
None of these machines are remotely accessible (i.e., you
cannot access them from home); none provide
"server" class functionality (they are
generally powered down when the lab closes). There are
two Linux machines in one of these labs. These machines
have been configured to offer no remote access or server
capability (although these features are normally the kind
of functionality that draws people to Linux) and have
required no maintenance since their installation several
months ago even though they run 24/7 on the network; it
is undeniably true, of course, that they have fewer users
during the course of the day than the Microsoft systems.
There have been "break-ins" to some Linux
systems on campus, just as there have been on Microsoft
systems. The Linux systems that have been compromised
were being managed by faculty or staff untrained in
system management and without any assistance from the
computer support organizations on campus. There have also
been compromises to commercial systems that do have
professional support, so it seems unreasonable to label
Linux systems as insecure by default.
Experiences with Free Software
This section is an amalgamation of experiences from
the past few years using free software on the UNT campus.
Two application contexts which seem to make up most of
the "institutional", as opposed to individual,
systems here are lab machines and network services.
The Computer Science department at UNT has a
programming lab for CS majors which has been running free
software for about three years now. The GNU compilers
support the C++ programming classes, with emacs as a
development environment and DDD as a symbolic debugger.
Accounts are managed centrally via NIS, and login
directories are mounted automatically from a centralized
file server. This allows any CS student to login to any
available lab machine to access their personalized
environment. Because all the computer system hardware is
identical (or functionally so), the installation and
maintenance tasks are highly automated. The installation
time per workstation is under an hour and several can be
installed in parallel, lowering the per-seat installation
time even more. This lab is open to students
approximately 76 hours per week (systems are left running
24 hours per day, seven days per week), and average
uptime for nodes is calculated in excess of 99% for the
hours the lab is open. In addition, maintenance updates
are made through an automated system that allows all lab
computers to be updated in parallel from a remote host
without reinstallation.
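The parallel update amounts to running the same command
on every lab host at once. Here is a minimal sketch of
the idea in Python; the host names and update command are
hypothetical stand-ins for whatever your distribution
provides:

    # Push the same maintenance command to a group of lab machines
    # in parallel over ssh. Host names and the update command are
    # placeholders, not the actual UNT configuration.
    import subprocess

    hosts = ["lab01", "lab02", "lab03"]   # hypothetical lab nodes
    update_cmd = "apt-get -y upgrade"     # hypothetical update command

    # Launch every remote job first, then wait on them all, so the
    # machines update concurrently rather than one after another.
    jobs = [subprocess.Popen(["ssh", host, update_cmd])
            for host in hosts]
    for job in jobs:
        job.wait()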
As lab machines, these systems are not available for
remote access (i.e., after lab hours). In order to address
this need, a different group of Linux systems has been
made available as remote access nodes which students
cannot access physically, but can login to remotely. The
students must use a version of ssh (secure shell client)
to login to the nodes, but they have the same facility
for file serving and account maintenance as when using
the programming lab systems. In addition, these systems
run restricted web servers so students can practice web
software programming. Most of the College of Arts and
Sciences lab systems running Microsoft provide secure
shell clients which can also access these systems.
Another project [11] attempts to make Linux systems
available to students who are not pursuing computer
science degrees.
This project has provided a similar environment to all
members of the UNT community who use general access labs.
In order to get an account on one of the "Linux
Lab" machines, the student must print out an account
form (available over the web), sign it, and turn it in to
the computing center. This system features an automated
install (actually, there are a couple of questions to be
answered at the beginning of the installation) which
builds a generic lab system. All existing accounts are
then immediately available on that system without any
further management by lab personnel. While this system
has been built to "lower the bar" for lab
managers in terms of the installation of Linux, in the
nine months since it was announced, only one lab has been
willing to install two systems. Also of note is that only
three dozen or so accounts have been requested (although
this is partially due to the scarcity of machines). Your
analysis may conclude this project is a failure, but I
will respond that billions of dollars of marketing cannot
be easily or quickly counterpoised. As of this writing,
two more labs have requested installations, so we hope to
gain momentum to add to (not replace) the computing
options available to UNT students.
On the network server front, UNT has many free
software systems which provide yeoman service. The UNT
web services group [12] employs apache web servers on
Linux machines; the College of Arts and Sciences web
server [13] is also apache, but hosted on a FreeBSD
system. The student email system named EagleMail is based
on cyrus imap software [14] using IMP [15] as a web
front-end. EagleMail uses LDAP for authentication, and
the LDAP services are provided by OpenLDAP [16] running
on Linux systems. While the official email system for
campus faculty and staff is Novell Groupwise, email is
scanned for viruses before being delivered to Groupwise
by software running on a Linux system. Web
servers (apache on Linux) are employed by the Research
and Statistical Support group for documentation and
tutorials, and other apache servers are used for
classroom instruction in the School of Library Sciences
and the College of Education. The university student
union building has several Linux systems modified to run
only a web browser, with important UNT sites bookmarked.
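Because services like EagleMail are built on standard
protocols rather than proprietary ones, any conforming
client can talk to them. As a hedged illustration,
Python's standard imaplib module can converse with a
cyrus-style IMAP server directly (the host name and
credentials below are placeholders):

    # Speak the IMAP protocol to a mail server (such as cyrus)
    # using Python's standard imaplib module. Host, user and
    # password are placeholders for illustration only.
    import imaplib

    server = imaplib.IMAP4("imap.example.edu")  # hypothetical host
    server.login("student", "secret")           # hypothetical account
    server.select("INBOX")                      # open the mailbox
    status, data = server.search(None, "ALL")   # find all messages
    print(status, data)
    server.logout()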
This overview of free software being employed at UNT
does not begin to be comprehensive, simply because you
don't need to ask anyone for money to bring up free
software on your system. There is no official support
policy at UNT for free software, so users have formed an
ad hoc Linux Users Group [17] with
a mail list to communicate with other users on campus. We
have experimented with offering short training courses
during the summer, and the student ACM chapter [18] has a "Linux Install
Party" each semester and sells Linux CDs to help
raise money. Academic Computing Services makes a mirror
site available to campus network addresses that contains
Linux software and tracks the required patches to keep an
installation current. Grass roots support is in the best
tradition of free software.
Quotes
The following quotes are offered in the hope they will
be useful in charting the dimensions of some of the
issues discussed in this paper:
- Microsoft's Windows operating-system chief, Jim
Allchin: "Open source is an
intellectual-property destroyer," Allchin
said. "I can't imagine something that could
be worse than this for the software business and
the intellectual-property business."
- Patricia Schroeder, president of the Association
of American Publishers: "We have a very
serious issue with librarians." Concerning
libraries loaning out books: "Technology
people never gave their stuff away."
- One major issue is that UCITA (the Uniform
Computer Information Transactions Act) would
allow software distributors to reveal the terms
of their license agreements after a sale.
Requiring disclosures similar to those for used
cars or other hard goods "would create
tangible harm through increased costs, litigation
and a likely decrease in competition and product
choice," wrote the Commerce Coalition, whose
members include AOL, Microsoft and Intel.
- The United Nations Universal Declaration on Human
Rights, Article 19: "Everyone has the right
to freedom of opinion and expression; this right
includes the freedom to hold opinions without
interference and to seek, receive and impart
information and ideas through any means
regardless of frontiers."
- Speaking on behalf of the nation's librarians,
Miriam Nisbet of the American Library Association
stated, "the lower court's decision
seriously harms the public's ability to make
legitimate, fair use of digital works. As the
founders of our country and Constitution
recognized, free speech and fair use are critical
components of a democracy."
- "Over the next 50 years," the
journalist Simson Garfinkel writes in Database
Nation, "we will see new kinds of threats to
privacy that don't find their roots in
totalitarianism, but in capitalism, the free
market, advanced technology, and the unbridled
exchange of electronic information."
- Polls suggest that the public is gravely
concerned: a 1999 Wall Street Journal-NBC survey,
for instance, indicated that privacy is the issue
that concerns Americans most about the
twenty-first century, ahead of overpopulation,
racial tensions, and global warming.
- Scott McNealy, the chief executive officer of Sun
Microsystems, was asked whether privacy
safeguards had been built into a new
computer-networking system that Sun had just
released. McNealy responded that consumer-privacy
issues were nothing but a "red
herring," and went on to make a remark that
still resonates. "You have zero privacy
anyway," he snapped. "Get over
it."
Footnotes
[1] http://yahoo.cnet.com/news/0-1003-200-1546430.html shows sales of over one million copies in 1999 alone.
[2] Linux is the fastest-growing operating system program for running server computers, according to research firm IDC, accounting for 27 percent of unit shipments of server operating systems in 2000. Microsoft's Windows was the most popular on that basis, with 41 percent.
[3] http://www.gnu.org/philosophy/free-sw.html
[4] http://www.lyx.org
[5] http://www.gnu.org
[6] http://www.netcraft.com
[7] http://www.apache.org
[8] http://www.gnome.org/gnome-office/
[9] http://www.gimp.org
[10] http://www.themes.org for one.
[11] http://linuxlab.unt.edu
[12] http://www.unt.edu
[13] http://www.cas.unt.edu
[14] http://asg.web.cmu.edu/cyrus/
[15] http://www.horde.org
[16] http://www.openldap.org
[17] http://www.lug.unt.edu
[18] http://www.cs.unt.edu/~acm/linuxcds.html