B.C. Civil Liberties Association           http://www.bccla.org
425 - 815 West Hastings Street
Vancouver, B.C. Canada V6C 1B4
(604) 687-2919  fax: (604) 687-3045  e-mail: info@bccla.org


                            May, 1994

    Prepared by John Westwood, Dale Beyerstein, and Martin Hahn

A.  Introduction

If the media can be believed, there is a real danger, against which
our laws and police are powerless to protect us, lurking behind every
personal computer connected by a modem to the telephone line.  For
example, in a recent article originally published in the Ottawa
Citizen, luridly titled "Cybercrime: As the information highway
grows, so do the terrorists, vandals, paedophiles and other
criminals who use it", we read:

     Cyberspace is a lawless world....Vandals, thieves,
     terrorists, pedophiles and murderous thugs ride this
     electronic frontier, knowing law enforcement officers are
     far, far behind.  Police have a shortage of expertise,
     equipment and some would argue, laws to back them up.

     Computer users of any age, with a little bit of know-how,
     have access to hate propaganda, hard-core pornography,
     stolen credit card numbers, even a massive blueprint to
     circumvent telephone billing systems. (Reprinted in the
     Vancouver Sun, April 30, 1994, B5)

The picture presented here is frightening.  But how well does it
represent the reality of computer networking?  We shall maintain that
this picture grossly exaggerates the problems posed by computer
networking.  Thus the article's claim that we need new laws to
handle severe problems with computers is simply not warranted.
It is indeed the case, as many observers have noted, that present
laws restricting speech are difficult to enforce on computer
networks.  However, the BCCLA takes the position that these
laws should not exist in their present form anyway; so the fact that
they are unenforceable on computer networks is more of a blessing
than a problem.

There is, however, another type of situation where it is arguable
that changes are required in the law to provide needed enforcement
of court decisions on those involved in computer networking.  This
case is exemplified by the Ontario ruling in the Karla Homolka case
banning publication of evidence. The judge's ruling was enforceable
only in Canada, and U.S. media were free to report this evidence. 
And Americans who participate in discussions of current affairs on
discussion groups on computer networks were free to do so.  The trouble
arose because several of these discussion groups were accessible to
Canadians in Canada through the Internet.  Now, if the reader thinks
that the judge's ruling in this case was wrong, then that person will
not be greatly offended by the result of this case: that Canadians
who were interested got to find out some of the reasons behind a
judge's sentencing in this particular case. Nevertheless, the example
is disturbing for other cases where the reader does agree that a
judge's decision to ban publication is a just one -- say, in cases
under the Young Offenders Act.  We deal with this case later.

But first, some information about how these computer systems work
may be helpful. To begin, we should note that we are discussing systems
which allow a person to communicate with other people, or extract
information from other computers, by connecting her own computer to
other computers over the telephone lines. (The device which allows
this connection to work is called a modem, a feature that must be
added to standard home computers.)  The systems which coordinate
various computers with each other are electronic bulletin boards
(BBSs), university or college mainframe computers to which those
connected to the institution in some way get access (sometimes by
paying for it) via their home computers, or FreeNets.  These
systems serve two important functions in their own right: to allow
members to communicate with each other by leaving messages on the
system for each other; and to allow for the circulation of programs
from one user to another.  That is, User A may write a program to
do a particular task, and wish to share it with other users.  He will
upload it to the bulletin board, and any interested user may download
it for use on her machine.

Bulletin boards come in various types.  Many are operated by private
individuals on their personal computers in their basements, at no
charge to the user.  There are over a hundred such systems operating
in Vancouver at present.  They accept people as users by taking their
applications when they call in on their computers, getting the user
to choose a password, and usually vetting them by a voice telephone
call.  Other bulletin boards are run by institutions such as Airspace,
an anti-smoking lobby group, for their members or those who share
a similar view about smoking.  Similar to this are companies that
run bulletin boards as a way of allowing employees to effectively
communicate with each other, or to work from home.  Again, they limit
access to a relatively small group, and do not consider themselves
open to the public.  As well as these, there are a number of bulletin
boards that charge users a fee.  Some of these have gone beyond one
city.  The largest of these, CompuServe, is worldwide, and is like
the Internet, discussed below, in its scope.  FreeNets, which now
operate in several Canadian cities, exist not only to provide
people with the services already mentioned free of charge, but also
to provide terminals for those who do not own their own computers.

These systems also serve another, increasingly important, function. 
They serve as gateways to the Internet, a system which connects about
a million computers worldwide, allowing people to retrieve data from
remote computers.  The procedures for doing this are complicated,
requiring passwords for security and some expert knowledge.
Bulletin boards make it possible for relative computer novices to
make use of the Internet to extract information.

Users of a local bulletin board, as we said, trade messages and files
containing programs or pictures with each other.  Sometimes they send
messages "live" to each other, akin to sending messages back and forth
to one another on a teletype.  But usually they do so by leaving a
message for the other to read at some later time.  It is important
to distinguish two types of messages. One type, electronic mail
(e-mail), is, as the name suggests, a message sent privately from one
user to another.  The other type of message may be addressed to
one particular person, but even so it is meant to be seen by many
others.  These messages are posted in what are variously called
echoes, conferences, forums, etc., depending on the bulletin board. 
As the names suggest, these are electronic places where people get
together to talk about special interests, hobbies, or current events. 
There are literally thousands of such areas where one may post a
message to start up a new discussion or contribute to one that is
already taking place.  

The Internet has allowed these discussions to become worldwide.  One
can join a discussion group on a local bulletin board in which messages
are carried via the Internet and gathered together from all over the
world.  It is this latter, trans-national feature of these systems
which causes the problem alluded to above in our mention of the judge's
ban on dissemination of information about Karla Homolka.

Of course the Internet is a new way of transmitting information; and
this means new problems with respect to dealing with this information.
But does it pose special problems requiring special legislation? Before
answering this, we lay out the BCCLA's position on freedom of
expression generally.

B.  The BCCLA's position on freedom of expression

Our position is derived from the writings of the American philosopher
Alexander Meiklejohn. It could be called "the argument from democracy",
and it goes like this:

A democracy is a society which is organized in such a way that the
citizens, collectively, are the sovereign rulers. The citizens are
the ones who, in the end, determine what the laws and policies which
regulate our common behaviour should be, and so it is the citizens
who determine what the shape and tenor of our society will be.

It is essential to our role as sovereign citizens that we have access
to the widest possible range of ideas, ideas on any matter whatsoever
that touches on our common goals and aspirations. Such ideas would
obviously include those in the areas of politics, economics, social
welfare, and so on but would also include ideas about sex, race,
religion, morality. What is essential is not that each of us have
considered all such ideas -- though perhaps that would be a good thing
--but rather that there is nothing barring our access to such ideas.

In a democracy, the state -- that is, the government -- is the
machinery that we use to govern ourselves. We authorize the government
to enforce the laws and policies which regulate our common behaviour,
laws and policies enacted by our elected representatives. Thus, as
well as being sovereigns, we are also subjects, in that we are bound
by the laws and policies enacted by our representatives and enforced
by the state. But there is one area where we cannot allow the
government to regulate our behaviour, and that area is our access
to ideas.

Consider the consequences when the government is allowed to control
our access to ideas. To the extent that government is allowed to
control our access to ideas, government controls the minds of the
citizens. It controls the agenda for which ideas can be discussed
in the public forum. In so doing, it has taken over the sovereign
role that we as citizens are supposed to have. Anyone who controls
the range of information to which we can have access controls the
scope and nature of the public debate about how our society should
operate. Therefore, so long as we are committed to a democratic form
of self-government, we cannot allow the state to prevent our access
to ideas.

The present point here is not that in a free exchange of ideas, the
best ideas will survive exposure to criticism and evaluation -- the
cream will rise to the top (although we might wish to defend the value
of open discussion for this reason elsewhere, and discussions on the
forums of the internet could be seen as examples of this notion at
work). The point is, rather, that uncontrolled access to ideas by
sovereign citizens is fundamental to, or constitutive of, our
democratic way of life. Any withdrawal from that ideal is a withdrawal
from democracy.

Now it may be that as we survey what passes for the public debate
of important ideas, we are not so sure that democracy is a wise course
to follow. When we examine some of the ideas that have gained favour
amongst citizens, it is sometimes hard to believe that the future
of Canada ought to be left in their hands.  The same worries will
be generated by an examination of the wide variety of messages posted
on the various electronic forums, from "alt.anarchy" on the Internet
to the White House Forum on CompuServe. The type of person attracted
to computer networking  is often someone with eccentric opinions,
and without any reservations about expressing an opinion on a subject
far removed from his or her formal education.  Nevertheless, one cannot
help but be impressed with the willingness to be educated, the
tolerance for outré opinions, and the respect for the rules of argument
that are the norm rather than the exception on these forums. On the
whole, the discussion on these forums gives one hope for participatory
democracy.  The B.C. Civil Liberties Association is committed to
democracy.  Democracy is not a perfect solution to the problems of
social organization, and maybe in Canada we're farther away from
perfection than we could be.  However, democracy is a good deal better
than any of the alternatives. And freedom of expression is necessary
to make democracy work.

C.   Weighing freedom of expression against other goods

Freedom of expression, and the democracy it serves, are not absolute
values. There are eight areas in which state censorship of ideas
ought at least to be considered:

     (1)  the use of children in the production of pornography;

     (2)  the products of real, non-consensual violence (as opposed
          to the mere depiction of this) in a sexually explicit
          context.

The reason for allowing censorship in these two areas is that the
production of such expressions involves criminal behaviour. The
state is justified in banning expressions where it is essential
to the content of those expressions that criminal behaviour be
recorded and displayed. That is, the justification for censorship
derives from another reason, viz., the prevention of other
activities which are wrong. Restricting (1) or (2) reduces the
incidence of the criminal behaviour in those cases where it is
undertaken with the intention of profiting from the sale of the
depictions of it. Note that only a small proportion of sex or
violence is engaged in for these reasons; so it is not reasonable
to maintain that censorship is going to eliminate much of this
criminal behaviour. Nevertheless, it will eliminate some of it;
and this is a sufficient reason in the eyes of the BCCLA to
justify censorship here.

There is another rationale for censorship which is direct, in
that the information censored is morally or legally objectionable
in its own right, and not as a result of some prior objectionable
activity. Several types of case involve this type of rationale.

     (3)  the conveying of information that violates another's
          privacy.  Examples of this would be the posting of stolen
          credit card numbers, or of confidential medical information
          obtained in contravention of employees' oaths of
          confidentiality or the like.
Clearly there are grounds for preventing these invasions of
privacy no matter the forum in which they occur. Our question,
then, will be how to do this with the least disruption to the
things we value on the internet.

     (4)  judges' rulings designed to promote a fair trial or the
          protection of a minor.

This area has already been mentioned, and will be considered later.

There are two other areas in which regulation should not be
opposed in principle:

     (5)       Material viewed by children

It is appropriate that children should not have as wide an access
to ideas as adult citizens. It is not completely wrong-headed to
judge that because children's minds are in the process of being
formed, and because they may not have had the experience or
possess the balanced framework with which to assess certain
strong or controversial ideas, their access to such ideas should
be limited.  This limit on access might take the form of
legislation restricting the purchase of magazines or the viewing of
movies to those over a certain age.  Our present questions are
whether there should be, or even can be, analogous restrictions
on material carried on computer bulletin boards. We shall return
to this topic. But first we look at a type of case which calls
for something to be done, but much less needs to be done in these
cases than many people advocate:

     (6)       Offensive material encountered without warning

It is also appropriate for society to judge that people should
not be subjected against their will to ideas which they may find
repugnant or offensive. Thus, for example, banning a huge
billboard over the Burrard Street Bridge which contained a hate
message, or a sexually explicit photograph or drawing, is not
unreasonable.  Similarly, it may be argued that there ought to be
the equivalent of zoning regulations on the Internet.  Note how
this differs from censoring, i.e. preventing access to material
generally. When we return to this topic our concern will be to
find the best way of ensuring that there are adequate warnings
or placements of such material on the Internet. We are opposed to
the censorship of this material solely on the grounds that it is
offensive.
Finally we look at two types of case where censorship has been
proposed in recent discussions:

     (7)  Speech which could be viewed as incitement

Incitement occurs in circumstances where there is no realistic
opportunity for the content of the speech to be coolly considered,
its pros and cons weighed, and a rational judgment made on its
merits. Rather, the circumstances are such that the speech is
likely to cause action directly. Yelling "Fire!" in a crowded
theatre is a commonly cited example. Yelling "Lynch him!" to an
angry mob holding a brutal serial rapist might be another. What
is crucial in such examples is the circumstances in which the
speech act occurred, not its contents. The very same content, if
delivered in circumstances where time for deliberation was
available, would not constitute incitement, and so should not be
banned. It is clear that messages read on the computer, whether
e-mail or ones publicly broadcast on the Internet, will not
constitute incitement in this sense. One cannot immediately do
anything after reading such a message; and usually the next
message in the thread (a continuing discussion) will provide
rebuttal. Therefore  censorship of the Internet on the grounds of
incitement is unjustified and will not be considered further. 

     (8)       Material prohibited under the "hate" provisions of
               the Criminal Code, and the like

In E. we shall maintain that this provision does not justify
censoring material on the Internet.

D.  When should ideas be regulated?

The types of cases mentioned in the previous section show that
the proposition that citizens' free access to ideas is fundamental
to democracy does not entail that the distribution of ideas
should never be regulated or classified (as in cases (5) and (6))
or even completely censored in the extreme case.  However, such
measures must be applied only when they can be demonstrated to be
necessary.  Demonstrating this, we maintain, requires establishing
two separate points.  First, we must show that there is some
competing good, of greater value, that would require the
suppression of freedom of expression in order to survive. We
concede that this can sometimes be established in cases of types
(1) to (4) considered above in C.  But second, we must show that
the suppression of the expression is the most effective way of
preserving that other good. Given the nature of the Internet, we
maintain that there are alternatives to complete suppression
which are effective enough to protect these goods, and which will
not put as serious a damper on freedom of expression. Therefore
these alternatives to total censorship should be used.

These forms of regulation do not constitute unacceptable
censorship, since none of them prevents adult citizens from having
access to the ideas or images which are regulated.

For much the same reason as (2), the classification, as opposed
to regulation, of material destined for public viewing is well
within the democratic blueprint. There is much to be said for
people being able to know ahead of time whether a film is going
to contain offensive or repugnant material, so that they can
avoid this material, if they choose to. Similarly, classification
on computer bulletin boards could serve the same function.  Our
question will be "Who ought to be responsible for this
classification?"  Should it be the government, in the form of laws
enforced by police; the owners of the bulletin boards, Internet
services and the like; or the individuals, or their parents, who
are consuming the services?

Help with the answer to this question comes when we consider the
second point raised at the beginning of this section. That claim
was that in order to justify interference with freedom of
expression we must show not only, as in C above, that there is a
compelling reason for restricting the material in a given case,
but also that the interference will have the effect of reducing
the problem we set out to solve.  John Stuart Mill (On Liberty,
p. 15) sets out two tests that must be satisfied before
interference with our liberties could be justified.  We paraphrase
them as follows:
     (a)  Given the mechanisms of control available to the
          state, would the person or institution to be controlled
          behave better because of the sanction than he or she
          would if not regulated?

     (b)  Would the exercise of this control bring about
          greater evils than the ones it was designed to prevent?
According to Mill, interference is justified only if the answer
to (a) is "yes" and the answer to (b) is "no", in addition to the
demonstration that there is a real evil that needs to be
prevented. We shall maintain below that, given the way the
Internet is organized, neither condition is met when we consider
specific legislation applying to the Internet: the passage of new
legislation governing the Internet will bring about evils greater
than the ones it is designed to prevent. However, laws currently
in place can readily be applied to the Internet without being a
cure worse than the disease. And, more important, there are
non-legislative solutions for some of the problems we have been
discussing that are even better answers.

E.  Applications to electronic bulletin boards or networks

The issue at hand is how, if at all, this account of freedom of
expression in general applies to electronic bulletin boards or
networks. For brevity, we will use the term "bulletin board" to
apply to all such electronic media which allow users or members
to place messages, files (including those that display images) in
the medium, and to access them. We use the term "network" to
refer to a connected grouping of such bulletin boards.

First, messages on conferences, echoes or forums on various
networks are not private communications.  They are intended to be
public; that is, they are to be read by anyone who finds the
subject heading interesting. The same point holds for files
uploaded to a library or file area: the whole purpose of uploading
them is that they be shared. They can therefore be regarded as
public communications, or expressions in the public forum, and are
therefore subject to the same rules as any public expressions.
Libel, consumer protection and protection of privacy legislation
apply to Internet and bulletin board postings.

The third case we mentioned above, the invasion of privacy
involved in posting someone's credit card number or personal
information illegally obtained, poses a different sort of
problem.  There is justification here for the state to deter the
posting of these types of messages or files, or to ensure their
removal once they have been posted.  However, the anticipation of
these sorts of problems does not justify any sort of prior
censorship or control of the bulletin boards.  Following the
principles enunciated earlier from Mill, what is required is the
least onerous method that minimizes this problem.  Here, that
means leaving the regulation in the hands of the people running
the bulletin boards, rather than setting up
government agencies to scan BBSs for such material.  The simple
fact, despite the media attention, is that there are very few such
invasions of privacy on BBSs.  The celebrated cases of posted
credit card numbers have largely involved high-profile executives
of corporations in disputes with environmental organizations and
the like -- the postings are politically motivated, as opposed to
being done for private gain such as profit or revenge. Thus there
is not much criminality to police.
And, as we mentioned in A., the majority of sysops are hobbyists
operating for free, who simply cannot afford to pay for such
policing, and who cannot afford the risk of the time and money
that would need to be spent on defence against heavy-handed
prosecutions.  An important defence of our freedom of
expression is defending these BBSs from harassment.  The same
points hold for those BBSs run by lobbying groups and FreeNet. 
We shall address some concerns later about how self monitoring of
these systems could be effective.  We also defer until later a
discussion of information which defies a judge's ban.

As for the other types of cases cited in various discussions,
such as instructions on how to circumvent telephone billing
systems, we view these as being blown completely out of
proportion.  What is contained in this information is the very same
information that is contained in library books, courses in
electronics, and other sources that are readily available to
anyone.  After all, the worriers about this problem could
exercise an immediate calming effect on themselves by simply asking
where the hackers got the information they posted in the first
place!  Thus there is simply no need to invoke any law to handle
this problem.

The other type of case to have grabbed  media attention of late
is that of pedophiles establishing contact with young children
over computer bulletin boards, and arranging for a meeting in
person at some other time.  But again, it is important to keep
this issue in perspective.  The time spent at the computer is not
the problem -- contrary to the hype issued by the computer gurus
about cybersex.  What is of real concern is the meeting later. 
But no one would seriously blame the telephone or the mail if
these were the source of the original contact with people parents
think undesirable.  Parents understand the need to monitor their
children's magazine subscriptions or telephone contacts, and to
exercise control over their children's behaviour while away from
the home or parents.  The computer is no different in these
respects from the telephone or mail. Granted, bulletin boards
offering interactive games are more attractive to many children
than are magazines. And many parents are ignorant of the workings
of, and afraid of, the computers they bought for their children.
Nevertheless, it remains the parents' responsibility to monitor
their children's use of these instruments.  No law could possibly
be effective in supplanting this role, and where parents are
fulfilling the role, laws are unnecessary. Granted, parents
cannot lean over their children's shoulders every moment they are
at the computer; but software is presently available to restrict
access to certain places where certain subjects are discussed.
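Such access-restriction software might work, in outline, like the following sketch. Everything here is hypothetical: the account structure, the blocked-area names, and the function are invented for illustration, and are not taken from any actual product of the period.

```python
# Sketch of area-based access restriction of the kind a parent might
# configure. Account fields and area names are hypothetical examples.

BLOCKED_AREAS = {"alt.anarchy", "alt.sex"}  # areas a parent has blocked

def may_enter(account, area, blocked_areas=BLOCKED_AREAS):
    """Allow entry unless this account is restricted and the area is blocked."""
    if account.get("restricted") and area in blocked_areas:
        return False
    return True

child = {"name": "Pat", "restricted": True}
adult = {"name": "Lee", "restricted": False}

print(may_enter(child, "alt.anarchy"))  # False: restricted account, blocked area
print(may_enter(child, "rec.hobbies"))  # True: area is not on the blocklist
print(may_enter(adult, "alt.anarchy"))  # True: adult access is unrestricted
```

The point of the sketch is that the restriction lives on the user's own account, configured by the parent, rather than in legislation applied to every bulletin board.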

Unfortunately, Parliament and the B.C. Legislature have not been
persuaded by the BCCLA's account of freedom of expression. Messages
on a network could be subject to Criminal Code sanctions, and to
sanctions under the Human Rights Act.

For example:

It is likely a Criminal Code offence to possess and/or distribute
some of the material which could appear on networks.

     *  s. 318 prohibits advocating genocide;
     *  s. 319 prohibits promoting in a public place hatred
        against an identifiable group;
     *  s. 163 prohibits the distribution, and possession for
        the purpose of distribution, of obscene material;
     *  s. 163.1 prohibits the production, possession or
        distribution of child pornography (which includes counselling
        or advocating sex with someone under 18).

A conviction under the Criminal Code could lead to jail.

As well, B.C.'s Human Rights Act prohibits the public display of
any representation which is likely to expose a person or class of
persons to hatred or contempt. A proceeding under the Act could
lead to an order to financially compensate the complainant.

Finally, it may well be that the originator of a message or
image, the moderator of a forum, or the operator of the network
(the sysop) could be sued in civil court for defamation or for
invasion of privacy.

It is possible that there is something about the electronic
transmission of images on a network which would preclude it from
prosecution under the Criminal Code or Human Rights Act. If so,
lawyers could have a field day with this material. But even if
there are legal loopholes, it is likely that Parliament and the
Legislature would soon act to close them. 

State censorship of ideas has been confirmed twice recently by
the Supreme Court of Canada, and it is likely to be the law for
some time to come. Nevertheless, it is our position that these
sanctions have no place in a democratic society; and we therefore
oppose their extension into electronic media. Our position is
that self-regulation by the sysops and conference moderators
themselves, to prevent the sorts of invasions of privacy or other
genuine concerns discussed above, has worked up until now, and
there is no justification for legislation that would negatively
affect freedom of expression.

F.  Self-Regulation

We now turn to the question of how the self-regulation of
expression on networks would work in practice. Both sysops and
designated members of a network who moderate the conferences or
forums have the power either to delete material or to classify it
-- that is, place it in files where special passwords are needed
in order to access the material. In most cases there are practical
difficulties involved in requiring the sysop to be responsible
for all the material on his bulletin board: first, because of
the large quantity of material which some networks contain, and
second, because a great deal of the information gathering on a
typical bulletin board is automated.  A typical bulletin board
runs unattended a good part of the time, and messages are posted,
and even removed after they have been on the system for a preset
time, without the sysop doing or monitoring anything.  Also, many
of the messages on a typical bulletin board
are generated elsewhere, and passed along the network to all the
bulletin boards that take that conference or forum. On a typical
network such as FidoNet, the computer running one bulletin board
will automatically dial another computer to receive a very large
volume of files containing individual messages, files and the
like, retain some of that information for the bulletin board, but
merely pass on some to the next computer that will call it at a
preset time. Information travels across North America by one
computer calling another in this way (programmed in so as to
eliminate or minimise long distance calls).  All the decisions
involved in this transfer of information could have been made
months ago; and the sysop of a particular system would have no
need to scan any of the information that ended up on his or her
bulletin board, let alone the large volume that "passed through"
that computer.
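The automated store-and-forward relay just described can be sketched in Python. Everything in the sketch is illustrative: the node names, the message fields, and the single-chain routing are invented, and do not reflect FidoNet's actual packet format or topology.

```python
# Illustrative sketch of store-and-forward relay on a hobbyist network.
# Node names, message fields, and routing are invented for illustration;
# this is not FidoNet's actual packet format or topology.

class Node:
    def __init__(self, name, carried_conferences, next_hop=None):
        self.name = name
        self.carried = set(carried_conferences)  # conferences this BBS offers
        self.next_hop = next_hop                 # node called at a preset time
        self.local_messages = []                 # retained for local users

    def receive(self, bundle):
        """Keep messages for conferences we carry; pass everything along."""
        for msg in bundle:
            if msg["conference"] in self.carried:
                self.local_messages.append(msg)
        # The whole bundle is forwarded so downstream boards also get it.
        if self.next_hop is not None:
            self.next_hop.receive(list(bundle))

# A three-node chain: messages originate in Toronto, end in Vancouver.
vancouver = Node("Vancouver", {"current-affairs"})
calgary = Node("Calgary", {"hobbies"}, next_hop=vancouver)
toronto = Node("Toronto", {"current-affairs", "hobbies"}, next_hop=calgary)

toronto.receive([
    {"conference": "current-affairs", "text": "trial coverage"},
    {"conference": "hobbies", "text": "model trains"},
])
```

Once the chain is configured, no operator inspects the traffic: in this run the Calgary board retains only the hobbies message, while the current-affairs message merely passes through that machine on its way to Vancouver.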

This would suggest that the main burden for regulating these
messages for such matters as libel or general offensiveness would
have to fall on the moderators of the conferences, since these
are the only people whose job it is to actually read the
messages.  However, these moderators are usually volunteers, and
cannot be expected to be aware of all the laws and statutes that
might apply in all the various jurisdictions where the conference
is picked up.  For that matter, they are often not even aware of
all the localities where the conference is available.  Also,
because they are volunteers, moderators will often resign, and
conferences will run unmoderated for varying lengths of time.  As
well, there are several conferences on the Internet that pride
themselves on being unmoderated.

This problem, however, is by no means as serious as it looks. 
Our position is that the number of messages that should be weeded
out is very small.  Therefore the need for constant vigilance on
the part of the moderators is not great.  And we ought not to
underestimate the moral sense and acuity of the other readers of
messages on the system.  Where something truly offensive slips
by, such as the posting of someone's credit card number, other
readers will immediately send a spate of messages to anyone who
looks like they may have some authority to do something about it.

A similar set of considerations applies to judges' bans of
information.  The problem here is that the information that
works its way onto a BBS in an area under the jurisdiction of
the court is generated in a place not under the court's
jurisdiction. This problem is not difficult to solve, however.
The situation is conceptually the same as a newspaper picking up
a news item from a wire service such as Associated Press.  In
that case, there would be no difficulty prosecuting the
Vancouver Sun for publishing the item, despite its coming from
New York.  The same point, seemingly, would apply to BBSs.  This
solution requires no new legislation specific to electronic
media. The practical difficulties, however, were noted above.
Sysops or forum moderators would be required to pay attention to
the messages they carry, rather than relying on the automated
system. However, these difficulties are not insuperable. The
computer provides sophisticated ways of searching messages for
key words such as "Teale", "Homolka", etc., and messages about
banned items are likely to show up only in certain discussion
groups having to do with law or current affairs.  Those BBS
operators who have no interest in these areas do not carry them,
and have no problem.  Those who do carry them as a service to
their users will have to exercise a bit more caution when a
judge's order is made.  However, we should emphasize that in a
well-run society the number of orders that a sysop would have to
take cognizance of would be extremely small. This is not an
unfair burden to place on those who wish to provide this
service.  For those who run afoul of the law in this matter
there are already adequate provisions for punishment.
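The keyword search a sysop might apply is simple to sketch. The
banned terms and group names below are examples only, not drawn
from any actual court order.

```python
# Illustrative keyword screening of the kind described above.
# BANNED_TERMS and WATCHED_GROUPS are hypothetical examples.

BANNED_TERMS = {"teale", "homolka"}          # names under a ban
WATCHED_GROUPS = {"law", "current-affairs"}  # groups worth scanning

def flag_for_review(messages):
    """Return messages a sysop should inspect before distribution."""
    flagged = []
    for msg in messages:
        if msg["group"] not in WATCHED_GROUPS:
            continue  # other groups are unlikely to carry banned items
        text = msg["text"].lower()
        if any(term in text for term in BANNED_TERMS):
            flagged.append(msg)
    return flagged

messages = [
    {"group": "pets", "text": "My dog chewed the modem cable."},
    {"group": "law", "text": "Anyone following the Homolka case?"},
    {"group": "law", "text": "New rules on small claims court."},
]
print(len(flag_for_review(messages)))  # only the second message is flagged
```

A filter this crude will miss misspellings and flag innocent
mentions, which is why it supplements, rather than replaces, the
sysop's judgment.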

But, putting aside practical difficulties, should material --
especially sexually explicit material and hate messages -- be
deleted or regulated by private people? Would this be preferable
to censorship by the government, with all the safeguards demo
cratic governments build into their policies and procedures, and
the enforcement of them?

Let us first look at the deletion of material. There are at least
two types of material that must be distinguished. The first is
electronic mail; the second is the messages and files that are
intended for the public places on the network. With respect to
the first, privacy is the paramount concern of the users of
these systems, and therefore most users do not wish to see their
messages censored or monitored. However, we note that different
bulletin boards offer their e-mail services on different terms;
and we think that these terms should be left up to the consumer
and the sysop. When there is payment for this service, there is
an enforceable contract, and we think that civil law serves as
an adequate remedy for problems here. The general principle,
however, is Henry Stimson's: "Gentlemen do not read each
other's mail". We therefore think that the less interference in
this area, the better.

Public areas such as file areas or message areas are slightly
different. There is no reason why sysops or moderators should not
delete material which, for whatever reason, they wish to exclude
from access through the network. This situation is analogous to
that of a bookstore owner or gallery owner who makes a personal
decision not to carry or display a particular work. They are not
"censoring" the work -- that is, they are not using the power of
the state to prohibit the work from being distributed -- they
are simply refusing to participate in its distribution. The
author or artist is perfectly free to try to sell the work
through other channels.

An identical argument applies to sysops. They should not feel
themselves bound on principled grounds to allow messages on their
network which they find, for whatever reason, offensive or
dangerous. The person who wishes to transmit the message is
perfectly free to try to put it on another network, or to express
it through other means of communication.

However, there is a counterargument which deserves mention. Art
galleries and bookstores are about the only way that artists and
authors can get their work to the public. Some artists and
authors are controversial, in that their work challenges and
crosses accepted boundaries of taste and sensibility. It is, we
think, not wrong to say that they are the lifeblood of the
artistic community, for art when it is good challenges accepted
ways of looking at ourselves and our world. Can we conclude that
art gallery owners or bookstore owners, who profit from the
talents of artists and authors, and who represent the only venue
for the display of new works, are under a moral obligation to
take risks in showing or carrying avant-garde works, even if
their paying customers might be offended and stay away in droves,
and even if they risk criminal prosecution? We think that such an
argument bears some weight.

But can it be applied to sysops? We don't think so, at least not
to those running small bulletin boards from their own personal
computer. One of the beauties of the Internet as it has
presently evolved is its plurality. When someone is denied
access to a particular bulletin board or forum, this will not be
the only venue for such messages or images -- there are ample
opportunities for displaying such messages in other places.
Therefore, this argument cannot be easily applied to sysops.

The fee-for-service bulletin boards such as CompuServe are
slightly different.  They offer a public service, and in fact
will usually be charging money to the very person whose message
they are refusing to carry.  Nevertheless, first of all, these
services specify very clearly what services they are -- or are
not -- offering; and none of the ones presently operating offer
their clients unlimited rights to post messages.  If they did,
that would be another story.  Secondly, we might also think of
an analogy with letters to the editor of a newspaper.  It is not
a violation of anyone's rights, in the typical case, for an
editor to refuse to publish a letter.  We think that a sysop's
or a moderator's decision to remove a message for reasons of
relevance (wrong discussion group), good taste, or worries about
libel is defensible.  Even the extreme case of a moderator
removing a message he disagrees with, deplorable as this would
be, does not seem to us to be grounds for legislation to monitor
these decisions.

The case is even clearer for bulletin boards run by particular
groups for particular purposes.  The BBS for the Atari (Computer)
Users' Group should not be required to carry a message about the
evils of smoking, and Airspace should not be required to carry a
message on how to oil the door on a floppy drive.   

This is not to say that some sysops cannot choose to provide a
very wide latitude in the messages they will allow the network to
carry, even a completely unrestricted arena for ideas. They may
view themselves as providing a public good in doing so, but they
should not feel themselves obligated to allow extreme messages,
at least not in the same way that art gallery owners may have
such an obligation.

What about the regulation and classification, as opposed to
outright prohibition, of messages on a network? Does this run
afoul of any principled protection of freedom of speech? The
answer seems to be no.

As we mentioned earlier, the bccla accepts a range of means for
regulating access to expressions. These include restrictions on
the public display of controversial material, restricting access
by children, and alerting potential viewers ahead of time of the
nature of the content of a work. These are already in place in
the "real world" for films, videos, magazines and books. There is
no reason why the justification for such restrictions on access
to material shouldn't apply to networks. The issues are
identical. And, in fact, methods of regulation have already
grown up on the Internet.  Moderators of conferences and forums
regularly remove messages from one message area when they are
clearly more relevant on another (e.g. when a discussion of
homeopathic remedies for dogs surfaces on an alternative health
forum for people).  However, here it is almost universally the
case that the message is forwarded, e.g. to the pet health forum,
rather than destroyed.  Moderators also on occasion will restrict
access to someone who is rude or offensive to another person on
the forum.  It is only a small step to extend this practice to
other sorts of offensive messages. Self regulation would thus
seem sufficient to handle the problems raised by those calling
for government censorship.

As for keeping children from accessing certain message areas
with inappropriate material, the technology provides two
solutions, already widely used, that can be implemented by
sysops.
One solution is for the sysop to provide different users with
different types of access to the system when a person is accepted
for membership.  Thus a minor who is accepted onto a particular
bulletin board may simply be denied access to certain file or
message areas where this material is stored.  The defect with
this approach is that it cannot distinguish between different
family members who access the system with the same password.  A
better solution is to secure certain areas of the system with a
separate password that must be entered before access to that area
is given.  Critics of bulletin boards point out that there is no
direct contact between the sysop and the user; and so the sysop
cannot monitor whether a person logged on to the system is the
original adult who was granted access or his 9-year-old son who
has found out the password and is logging on when the adult is
not around.  While this is true, the same problem exists with the
distribution of magazines that are restricted to those over 18. 
Even if the shop owner is especially careful to sell these
magazines only to those over 18, how can she be assured that
they will not later fall into the hands of a minor?  We do not
think that this possibility justifies an outright ban on these
magazines.  In fact, the situation is probably better with
respect to
computer bulletin boards.  Whereas the most immature young person
can open up a magazine and look at the pictures, it requires a
certain level of maturity to operate a computer and log on to a
bulletin board. Granted, the kind of maturity involved in being a
computer wizard is different from that required to properly
appreciate pornography; but at this point we reiterate our
comments from V. that it is the parents' responsibility to
control the computers in their private homes, just as it is
theirs not to leave inappropriate reading material in the wrong
places. And this brings us to a third solution, which can be
implemented by parents. Software is already available which
blocks access to certain discussion groups by name, or which
screens messages with certain content. Thus parents do not have
to breathe down their children's necks every instant they are at
the computer in order to have control over what their children
are reading. Now granted, this software is no more sophisticated
than Bowdler in making its selections; but it certainly rules
out a great deal of what parents usually want excluded. The gap
between
these programs and parents' judgment about a specific message
ought to be bridged by discussion about general principles that
is central to parental guidance.
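The parental screening software just described amounts to two
very simple checks: block certain groups by name, and hide
messages containing certain words. A minimal sketch, with wholly
invented group names and keywords, looks like this:

```python
# Sketch of crude, Bowdler-style parental screening as described
# above. Group names and keywords are hypothetical examples.

BLOCKED_GROUPS = {"alt.tasteless"}        # blocked outright by name
SCREENED_KEYWORDS = {"explicit", "gore"}  # hide messages containing these

def visible_to_child(msg):
    """Return True if the message passes both checks."""
    if msg["group"] in BLOCKED_GROUPS:
        return False
    text = msg["text"].lower()
    return not any(word in text for word in SCREENED_KEYWORDS)

messages = [
    {"group": "rec.pets", "text": "How often should I feed a puppy?"},
    {"group": "alt.tasteless", "text": "You won't believe this..."},
    {"group": "rec.arts", "text": "This film is quite explicit."},
]
print([m["group"] for m in messages if visible_to_child(m)])
```

As the text notes, such a filter cannot match a parent's
judgment about any specific message; it only narrows the field.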

As a means of directing children to appropriate material, and for
warning adults to avoid material they would find offensive, the
Internet provides a kind of classification for messages.
Discussion groups with titles such as "alt.conspiracy.kennedy"
make it
very clear what is being discussed there. (If the cryptic title
doesn't give it away, almost all forums have a statement which
members read when they first sign on telling them, sometimes in
very explicit detail, what kinds of messages are welcome there.) 
And messages within these discussion groups are not displayed to
the browser on the system.  What is displayed is a short header,
such as "Oswald CIA payments".  This header may be used by
someone responding to the original message, and the person
responding to that, and so on. (A continuing discussion using the
same header is called a thread.) Thus casual browsers can be
warned of potentially offensive messages; or, once having
identified messages in a particular thread that are offensive,
readers will know not to look at other messages with that
heading.
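The notion of a thread described above is just a grouping of
messages by their shared header line. The following sketch uses
invented headers and message texts:

```python
# Sketch of how messages sharing a header form a "thread".
# Headers and texts are invented examples.

from collections import defaultdict

def group_into_threads(messages):
    """Group message texts under their short header line."""
    threads = defaultdict(list)
    for msg in messages:
        threads[msg["header"]].append(msg["text"])
    return dict(threads)

messages = [
    {"header": "Oswald CIA payments", "text": "Original claim..."},
    {"header": "Oswald CIA payments", "text": "A sceptical reply..."},
    {"header": "Book depository layout", "text": "Floor plan question"},
]
threads = group_into_threads(messages)
print(sorted(threads))  # two distinct threads
```

A reader who finds one header objectionable can thereby skip the
whole thread without opening any of its messages.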

With files the system provides an even better classification:
files are given descriptions, usually up to 5 lines long,
describing their contents. Files that display an image that some
may
find offensive are not immediately displayed on screen; viewers
will search by these descriptions and only then go through a
fairly complex and time consuming procedure to view them.  In
fact, because time generally costs the user money, the usual
procedure is to download these files to the home computer and
view them later.  Doing this requires special software, and an
understanding of how to use it.  A combination of extraordinary
incompetence with improbable bad luck would be necessary for
someone viewing an image by accident. The only problem is a
misdescribed image. And, given that the knowledge and skill
necessary to intentionally download and view such files is
reasonably great, and the correlation of these with maturity is
positive, though not by any means perfect, we submit that anyone
who can get the image up on the screen is not an obvious
candidate for paternalism. And where paternalism is appropriate,
it
ought to be parents practising it rather than the state.  After
all, the computer the child is using is in the home. 
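Browsing by description, as described above, can be sketched in
a few lines. The file names and descriptions here are invented
examples, not taken from any real BBS file area:

```python
# Sketch of searching a file area by its text descriptions.
# File names and descriptions are hypothetical.

FILES = {
    "SUNSET1.GIF": "Scanned photograph of a sunset over\n"
                   "English Bay, 640x480, 256 colours.",
    "ART0042.GIF": "Figure study; nudity -- adults only.\n"
                   "High resolution scan.",
}

def search_descriptions(keyword):
    """Return file names whose description mentions the keyword."""
    keyword = keyword.lower()
    return [name for name, desc in FILES.items()
            if keyword in desc.lower()]

print(search_descriptions("sunset"))
```

Only the matching name and description are listed; the image
itself is never shown until the user deliberately downloads and
views the file, which is the point the text is making.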

The other sorts of files that raise concern are text files that
can be downloaded.  These consist of threads of previous
exchanges of messages on topics someone thought would be of
interest, or long files -- sometimes rejected books, or parts of
books that never made it into the published version.  In
principle, these should be treated no differently than messages:
that is, classification and warning, as opposed to legislation
and policing, are what is required.

G.  Conclusion

Our position is that the public cry for policing of computer
bulletin boards and networks is misplaced. None of the eight
categories mentioned in C. where censorship might possibly be
justified turns out to require special legislation.

     (1) and (2), violent or sexual images produced using
children or non-consenting adults, are already covered under
legislation dealing with the criminal actions themselves. (5)
and (6), material offensive to some adults or inappropriate for
children, are best dealt with by the self-censorship mechanisms
outlined in F. (7), incitement, is, as we argue in C., not a
problem on the Internet. (8), material prohibited by the
Criminal Code and human rights legislation, we argue in E.,
should not be prohibited in the first place.

This leaves two categories that are problematic. (3), material
invading others' privacy, such as posting someone's credit card
number, is a serious, though infrequent, problem. Given its
infrequency, current legislation can be applied to those bulletin
boards that make no effort to remove these items once they are
brought to their attention, and to those individuals who post
the material. Critics of our position will maintain that it is
very difficult for authorities to enforce the current laws on
the Internet. This is, of course, true. But the same point
applies to any new laws anyone might propose. And if these new
laws erode the goodwill of bulletin board users and owners to
police themselves, they will only make matters worse.

Last, (4), the enforcement of court publication bans, is again a
serious problem. But we maintain that the same mechanisms for
enforcing these against wire services apply to the Internet. And,
as we have already argued, the legitimate publication bans are
few enough that they can be enforced against the very few who
defy them.

In conclusion, then, those who advocate extending state powers of
censorship to this new area should first do a better job of
demonstrating why it is needed where it presently exists. Our
position is that this job simply has not been done.