Experience should teach us to be most on our guard to protect liberty when the Government's purposes are beneficent. Men born to freedom are naturally alert to repel invasion of their liberty by evil-minded rulers. The greatest dangers to liberty lurk in insidious encroachment by men of zeal, well-meaning but without understanding.
Justice Louis D. Brandeis, Olmstead v. United States, 1928

II. Latent Effects of Computer-Based Record Keeping

The dangers latent in the spread of computer-based personal-data record keeping stem, in our view, from three effects of computers and computer-related technology on an organization's recordkeeping practices.

These three effects on personal-data record-keeping are seldom observed in isolation from one another. Indeed, they are usually interdependent and may acquire a self-reinforcing momentum. The discussion that follows is focused on their potentially adverse consequences for individuals, for organizations, and for the society as a whole. It concentrates on aspects of computer-based record keeping that highlight the influence of the technology, but also recognizes that organizational objectives, bureaucratic behavior, and public attitudes account in part for many of the potentially undesirable effects we have identified.

Too Much Data

The bare statement that computerization enables an organization to enlarge its capacity to process information deserves amplification. Although the computer enables a large organization to handle more data, the cost of changing from a manual to an automated operation may practically compel a smaller organization to exploit its data-processing capacity more fully. The cost of setting up an automated system includes not only that of equipment and special facilities, but also the cost of system analysis and design, of writing and testing computer programs, and of converting manual records into computer-accessible form. Thus, the manager of a newly automated system may have a strong economic incentive to spread the initial cost over as large a data-processing volume as he can, and to economize wherever possible in providing services that do not make a direct contribution to the efficient operation of the system itself. A typical result of this condition is that clients receive erroneous bills, unjustified dunning letters, duplicate magazine subscriptions, and countless other symptoms of inadequate system design and operation. Although these may be more a nuisance than a threat, they contribute heavily to the popular image of computerization as an offending and intrusive phenomenon.

The annoyance factor is worth more attention than many system managers give it. Resentment engendered in customers at the mercy of a computerized billing system, for example, spills over onto other computer operations, making unemotional discussion of computerization in fundamentally more significant contexts difficult.

An early incentive to concentrate on efficiency may also foster a tendency to behave as though data management were the primary goal of a computer-based record-keeping operation. When this occurs, unnecessary constraints may be placed on the gathering, processing, and output of data, with the result that the system becomes rigid and insensitive to the interests of data subjects. A commonly observed tendency in these situations is to make the data subject do as much of the data collection work as possible by forcing him to decide how to fit himself into a highly structured, but limited set of data categories (e.g., "Please check one of the following boxes.").

This can be a way to cut down errors in transcribing data from one form of record to another, but when done solely in the interest of economy the system may well sacrifice flexibility and accuracy. It is true that data compression and "shorthand" record entries did not originate with the computer; ill-adapted categorization has been the bane of bureaucracy for generations. However, manual record keeping can, at the stroke of a pen, take account of data that do not fit comfortably into pre-conceived categories, while a computer record is not usually amenable to any sort of annotation that was not expressly planned for in the design of the system. The relative inflexibility of computer-based record keeping, coupled with the constraints that some automated systems put on the freedom of data subjects to provide explanatory details in responding to questions, contributes to the so-called "dehumanizing" image of computerization.

A recent occurrence in France illustrates how the inflexibility of an automated personal data system can adversely affect large numbers of people.1 The computer facility of the national family allotment system, which disburses some $600 million annually to 700,000 families in the Paris area, succumbed to the confusion created by changes in the allotment rate for nonworking wives, young people, and the handicapped. Efforts to unravel the difficulty were unsuccessful, and the computer center had to be reorganized as a manual operation in order to clear up an enormous backlog of emergency allotment payments. The disruption of human lives resulting from the inability to use the computer-based payments system was undoubtedly great and demonstrates why the difficulty of making even minor changes in the computer programs of a complex system gives cause for concern. Human bureaucracies exhibit similar rigidities, but their procedures can usually be changed by management directive, often by a simple promulgation of rules, and in a reasonably short time. In computer systems, however, even a change that has the wholehearted support of all concerned may be difficult and slow to effectuate.

This problem can become even more serious when economies of scale are sought by consolidating the data-processing tasks of several organizations into one automated system serving all. The effects of dysfunction then fall not only on the customers of the system primarily at fault, but also on "bystander" data subjects and other organizations.

Easy Access

The second effect of computerization on personal-data record keeping-that it facilitates access to data within a single organization and across boundaries normally separating organizations-is another source of concern. Quick, cheap access to the contents of a very large automated file often prompts an organization or group of organizations to indulge in what might be called "dragnet behavior."2

An example of how a very carefully planned data system of ostensible social benefit operates as a dragnet is the National Driver Register of the Department of Transportation (more fully described in Appendix D). It provides a central data facility containing the names of individuals whose driver licenses are denied or withdrawn by a State. The purpose of the Register is to give each State access to the current revocation records of all other States, so that one may, if it wishes, avoid issuing a license to an individual whose license has been denied or withdrawn by another State.

Suppose that Missouri revokes John Doe's license for a serious offense. Doe applies in Illinois for a license, neglecting to mention the Missouri revocation. If Illinois issues Doe a license, it in effect nullifies Missouri's action, without knowing it is doing so. Before the National Driver Register was established, Illinois would have had to make specific inquiry to all other States in order to discover the Missouri record of license withdrawal. Because this was time-consuming, States tended to do it only for blatantly suspicious cases, with the presumable result that many fraudulent applications were never detected. Now that Doe's record of license withdrawal goes into the master file of the National Driver Register, however, one query to the Register from Illinois will bring the Missouri action to light within 24 hours, thus permitting Illinois to make a decision to grant or withhold a license based upon the original Missouri record.

How can a system whose only purpose is to prevent fraud by drivers of demonstrated unfitness have any adverse effect? The answer lies in the efficiency of the Register; it has become easier for most States to put all their license applications routinely onto magnetic tape to be searched against the Register's file, rather than to separate out the suspicious cases for special treatment. If one accepts the objectives of the system-to identify irresponsible or incompetent drivers, and thus to reduce the number of traffic fatalities-this is not in itself an objectionable practice. However, automated matching of queries against NDR records generates identity matches so imprecise that subsequent manual screening reduces the system's 5000 possible "hits" per day to about 500 probable ones. Of the probable hits, the operators of the Register estimate that about three quarters are true identifications; that is, they definitely relate to an individual who has misrepresented himself in a license application. Arithmetic does the rest; a quarter of the probable hits -- 125 individuals per day -- may find that they are required to prove that their licenses have not been withdrawn. In theory, a reply from the Register is supposed to be treated merely as a "flag" to inform the inquiring State that there may be a record on the individual about whom the query was made in the revocation files of another State. At least one State, however, makes the "flagged" applicant bear the full burden of proving that such a record does not exist. Here, the "dragnet effect" of cheap and easy data access-the fact that it is cheaper and more efficient to search the NDR on every license application-has resulted in occasional nuisance and potential injustice to some applicants.
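The arithmetic in the preceding paragraph can be restated as a short sketch. It is purely illustrative (it is not the Register's actual software); the daily totals and the three-quarters estimate are the figures quoted above.

```python
# Illustrative sketch of the NDR screening funnel described in the text.
# The figures are those quoted in the report, not live system data.
possible_hits_per_day = 5000   # raw matches against the Register's file
probable_hits_per_day = 500    # remaining after manual screening
true_fraction = 3 / 4          # operators' estimate of true identifications

true_hits = int(probable_hits_per_day * true_fraction)
false_hits = probable_hits_per_day - true_hits

print(true_hits)   # 375 applicants correctly flagged per day
print(false_hits)  # 125 applicants who may be asked to prove a negative
```

The point of the sketch is the last line: even a modest false-positive rate, applied routinely to every application, produces a steady daily stream of burdened individuals.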

The problems that can arise from the operation of the NDR stem from its role as a clearinghouse for information supplied and used by more than 50 independent driver licensing jurisdictions whose operations it does not control. Each jurisdiction using the Register risks being misled by incomplete or erroneous data submitted by another participating jurisdiction. Although mistakes propagated by the NDR can usually be corrected at small expense in time and trouble, other multi-jurisdictional clearinghouses can have potentially more serious effects on individuals. The criminal history file of the FBI's National Crime Information Center (NCIC) is one example.

The NCIC is a computerized clearinghouse of information about wanted persons, stolen property, and criminal history records3 that will eventually provide criminal justice agencies throughout the United States with computer-to-computer access to the data in its files. The ultimate objective of the NCIC criminal history file is to enable law enforcement agencies, courts, and correctional institutions to determine, in seconds, whether an individual has a criminal record. The NCIC would appear to lack the potential to be used as a dragnet because inquiries are made only about particular individuals with whom law enforcement agencies have contact under conditions that constitute cause for suspicion of wrongdoing. In this respect, it differs significantly from the operation of the National Driver Register. Furthermore, the problem of mistaken identification in using the criminal history files should not arise because of NCIC's requirement that fingerprints be used to identify arrest and offender records entered into the system. Errors of identification can and do occur in using the records in the wanted persons files because these are not identified by fingerprints. However, the ease with which inquiries can be made from remote terminals located in law enforcement and criminal justice agencies all over the country could lead to access to the NCIC criminal history files by more users and for checking on more individuals than is socially desirable.

Leaving aside the question of the probative value of arrest records, about which lively controversy exists, the consequences of excessive use of criminal history files might be innocuous if the NCIC records could be completely reliable. In practice, however, the NCIC, like the National Driver Register, does not have effective control over the accuracy of all the information in its files. The NCIC is essentially an automated receiver, searcher, and distributor of data furnished by others. If a subscribing system enters a partially inaccurate record, or fails to submit additions or corrections to the NCIC files (e.g., the recovery of a stolen vehicle or the disposition of an arrest), there is not much that the NCIC can do about it.

Furthermore, the risk of propagating information that may lead to unjust treatment of an individual by law enforcement authorities in subscribing jurisdictions cannot be fully prevented.4

The NCIC checks on records being entered into its files, and periodically audits its files to try to assure that system standards for completeness and accuracy of records are being met. When it detects errors or points of incompleteness, it can seek corrective action and can flag its records to warn users of possible deficiencies. In the case of an arrest record, however, even if the source agency does eventually submit information about the disposition of the arrest, there is no way that the NCIC can assure that all those who have had access to the record in the interim will receive the disposition information. Once a subscribing police department contributes an arrest report to the NCIC, that report is available to any qualified requestor in the system. In some States, this means that employers and licensing agencies (for physicians, barbers, plumbers, and the like) will have access to the record under State laws that require an arrest-record check on candidates for certain types of occupational certification. Thus, unless a criminal record information system is designed to keep track of all the ultimate users of each record released, and of every person who has seen it, any correction or emendation of the original record can never be certain to reach each holder of a copy.

Systems like the NCIC and the National Driver Register illustrate one of the potentially most significant effects of computerization on personal-data record keeping-the enhanced ability to gather, package, and deliver information from one organization to another in circumstances where lines of authority and responsibility are overlapping or ambiguous, and where the significance attached to data disseminated by the system may vary among subscribing organizations. Unless all organizations in a multi-jurisdictional system can be counted on to interpret and use data in the same way, the likelihood of unfair or inappropriate decisions about the individual to whom any given record pertains will be a problem, and a particularly acute problem whenever records are incomplete or compressed. The records of school children, for instance, while highly comparable within a single school district, will be less so among the districts of a single State, and even more disparate among different States. Thus, data systems that are established deliberately to pass information across jurisdictional lines must be very carefully designed so as to foster sensitive, discriminating use of personal data.

The untoward effects of such systems (or of any system, for that matter) do not stem in the main from poor technical security. Although public mistrust of the computer often centers on the possibility of unauthorized access to a central data bank for purposes of blackmail or commercial exploitation (such as the clandestine copying of a list of names and addresses), the purely technical difficulties that can be placed in the path of any but the most well-equipped intruder can make almost every computer installation more secure than its manual counterpart. Unless an intruder has detailed technical knowledge of the system, and possibly also clandestine access to the facility itself, most systems can be quite well defended against "unauthorized" access (although at the present time many systems may not be well-defended). The problem is how to prevent "authorized" access for "unauthorized" purposes, since most leakage of data from personal data systems, both automated and manual, appears to result from improper actions of employees either bribed to obtain information, or supplying it to outsiders under a "buddy system" arrangement.

Concern about abuses of authorized access to "integrated" data systems maintained by State and local governments can have a particularly debilitating effect on people's confidence in their governmental institutions. Ambitiously conceived integrated systems, no matter how secure technically, may have the effect of blurring, either in fact or appearance, established lines of political accountability and constitutionally prescribed boundaries between branches of government. When different branches arrange to share an integrated data-processing facility and its data, the executive usually will operate it. This happens partly because operational functions are normal for the executive, and partly because executive agencies usually have more experience with computer systems. It leads people to fear, however, that the needs of executive claimants may be met before the needs of legislative bodies and the judiciary. The priority system for allocating computer support will, of course, look fair on paper, but in practice the result may often be to shortchange the passengers on the system in favor of the driver.5 The recent development of mini-computers, much cheaper than the big systems of only five years ago but of comparable power, is providing an attractive economic alternative to large integrated systems. Large systems, however, are also becoming less expensive and there is no assurance that they will not become even more so as the result of new technological advance.

Finally, in terms of the historical classification of records in Chapter I, we recognize that combining bits and pieces of personal data from various records is one way of creating an intelligence record, or dossier. The possibility of using a large computer to assemble a number of data banks into a "master file" so that a dossier on nearly everybody could then be extracted is currently remote, since the ability to merge unrelated files efficiently depends heavily upon their having many features of technical structure in common, and also on having adequate information to match individual records with certainty.6 These technical obstacles are avoided if the capability to merge whole files is designed into a group of systems at the outset, a practice now characteristic of only a few multi-jurisdictional systems but perhaps becoming more prevalent. At the present time, however, compiling dossiers from a number of unrelated systems presents problems that few organizations, and probably no organizations outside of government, have the resources to solve.7

Nonetheless, public concern about such combinations of data through linkings and mergers of files is well founded since any compilation of records from other records can involve crossing functional as well as geographic and organizational boundaries. When data from an administrative record, for example, become part of an intelligence dossier, neither the data subject nor the new holder knows what purpose the data may some day serve. Moreover, the investigator may believe that no detail is too small to put into a dossier, while the subject, for his part, can never know when some piece of trivia will close a noose of circumstantial evidence around him. Public sensitivity to the possibility of such situations argues strongly for preserving the functional distinctions between different classes of personal data systems.

Technicians as Record Keepers

The reputation of the computer for impersonality and inhuman efficiency is due, in part, to the publicity given the computer as a poet, a chess-player, and a translator of exotic languages. "Machine intelligence" is a subject with fascinating technical and philosophical aspects. To date, however, there is no evidence that a computer capable of "taking over" anything it was not specifically programmed to take over is attainable. Indeed, as pointed out earlier, programming a computer to handle anything complicated is usually a very difficult and expensive job, requiring generous amounts of money, expertise, and management capability.

It seems safe to predict that economic and organizational constraints on the uses of computers will not change radically during the next few years. Although computing power and data-storage capability are steadily becoming cheaper, and problem-oriented programming is being improved, no dramatic breakthroughs are in sight. This prediction, however, cuts two ways. If we can comfortably assume that computers will not take control of anything on their own volition, we may still feel some disappointment that the application of computers will tend to remain in the hands of trained specialists whose competence is primarily in data processing rather than in the fields that data processing serves. Some would say that this circumstance results from an abdication by managers of their proper role, but whatever the reason, the effect can easily be to insulate the record-keeping functions of an organization from the pressures of both consumers and suppliers of data.

The presence of a specialized group of data-processing professionals in an organization can create a constituency within the organization whose interests are served by any increase in data use, without much regard for the intrinsic value of the increased use. The point is underlined by an experience common to many organizations. Some unit is already operating a computer facility for accounting, processing scientific or engineering data, or for some other straightforward application to which the technology is well-adapted. Because the facility has extra computer time available, it is soon discovered that attractive software packages can be purchased to enable the computer to enlarge its scope and become a "management information system."

Such systems are founded on the proposition that efficient decision making requires that managers have available to them a greater or more timely supply of relevant information than they have been getting. As commonly observed, however, most managers do not need more relevant information nearly as badly as they need less irrelevant raw data.8 Thus, until the theory of management itself has progressed to a stage where the necessary data content of management-oriented systems can be predicted, their users are likely to find them disappointing.

Another, potentially more serious, consequence of putting record keeping in the hands of a new class of data-processing specialists is that questions of record-keeping practice which involve issues of social policy are sometimes treated as if they were nothing more than questions of efficient technique. The pressure for establishing a simple identification scheme for locating records in computer-based systems is a case in point.

The technical argument for having a standard universal identifier for records about individuals focuses on increasing the efficiency of record keeping and record usage. Proponents argue that if every item of data entered into an automated system could be associated with an identifier unique to the individual to whom the data pertain, updating, merging, and linking operations would be greatly simplified and far less error-prone than they are today. Moreover, records could be used more intensively; administrative records indexed by Social Security number, for example, could also be used for certain types of research which require matching data on individuals from several different record systems.

To reap the full technical advantages of a standard identification scheme, it is necessary for each individual to supply the identifier assigned to him every time he has contact with a record-keeping organization using it. This practice is already familiar to the clients of banks, credit-card services, and many other organizations that have developed their own standard schemes. What worries people is that the inconvenience to record-keeping organizations of having to devise their own numbering arrangements will encourage the adoption of a single universal scheme for use in all computer-based personal data systems. If this happens, organizations that share an interest in monitoring and controlling the behavior of some portion of the population will acquire an enlarged capacity to do so, since they will all be able to know when an individual has contact with any one of them. Fingerprints, for example, are the standard method used by the police to identify persons arrested for crimes. Fingerprinting assures accurate identification and may seem a reasonable way of dealing with criminal offenders, but it is a dubious model for other types of record-keeping organizations to follow.
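The efficiency argument, and the worry that follows from it, can be sketched in a few lines. Every file, field name, and identifier below is invented for illustration; the point is only that a shared identifier reduces linkage to an exact lookup, while name-based matching is inherently ambiguous.

```python
# Hypothetical records in two separate systems; the identifiers,
# names, and field labels are invented for this sketch.
motor_vehicle = [
    {"id": "123-45-6789", "name": "J. Doe", "license": "revoked"},
    {"id": "987-65-4321", "name": "J. Doe", "license": "valid"},
]
welfare = [
    {"id": "123-45-6789", "name": "John Doe", "benefit": "active"},
]

# With a shared identifier, linking the files is an exact lookup.
by_id = {rec["id"]: rec for rec in motor_vehicle}
linked = [(w, by_id[w["id"]]) for w in welfare if w["id"] in by_id]
print(len(linked))   # 1 -- exact match, no ambiguity

# Without it, linking must fall back on names, and "J. Doe"
# matches two different individuals.
by_name = [m for m in motor_vehicle for w in welfare
           if w["name"].split()[-1] == m["name"].split()[-1]]
print(len(by_name))  # 2 -- ambiguous
```

The same exact-lookup property that eliminates matching errors is what would let every participating organization know when an individual has contact with any other.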

It is, of course, a long step from having each individual identified in the same way in every data system to creating a giant national data bank of dossiers constructed from fragments of records on citizens in widely dispersed data systems. There would have to be some strong incentive for "putting it all together," and as we noted earlier, it is doubtful that even the dollar cost of doing so could be justified on any reasonable grounds. However, it is not necessary to build a giant national data bank to experience some of the effects of having one. There are already systems in operation which have some of the control capabilities that such a centralized dossier system would create.

One computer-based personal data system that came to our attention was a comprehensive health information system developed and maintained by an agency of the Department of Health, Education, and Welfare on an Indian reservation in the Southwest. Approximately 10,000 Indians living in the area have records in the system and another 4,000 have records in it but, for one reason or another, are not part of the active patient population. These 14,000 record subjects are, by and large, an economically dependent population with very serious health problems. Within the confines of the geographic area covered by the system-about the size of Connecticut-they are also a highly mobile population, with each individual going by any one of several different names depending on circumstances.

The health facility consists of a combination of in-patient, out-patient, and field-clinic services. The purpose of its computer-based record-keeping system is to develop a complete, cradle-to-grave, medical dossier on each individual eligible to use the facility, so that all can benefit from a comprehensive diagnostic and treatment program that aims to control illness by preventing its occurrence, or by taking preemptive steps at the first sign of a medical problem.

The record-keeping system has three basic components: (1) an administrative one that notes and describes every contact each patient has with any segment of the health facility, including the "interdisciplinary" teams of doctors, nurses, and social workers who travel about administering tests and providing ambulatory health services; (2) a statistical-reporting one that attempts to observe fluctuations in the incidence of certain types of ailments and to pinpoint "high risk" groups needing special preventive attention; and (3) a "surveillance" one that consists of the recorded results of medical tests administered according to a schedule established by the health facility. The system is a little more than three years old. By the summer of 1972 it contained about 50 million characters of data, or approximately 3,500 characters per patient-record. It accommodates data in narrative as well as standard computer-accessible form.

The system is an elegant tool for addressing a complex set of social problems. It would be hard to argue that the patient population being cared for would be better off without the services the system makes possible. It is also apparent that knowing who an individual is, and the details of his medical history, can be of vital importance in treating patients, but the system has certain social control capabilities that should be noted nonetheless.

The surveillance component, for example, has the primary purpose of discovering incipient medical problems in individual patients. To do this effectively, each patient must be induced to comply with the health facility's testing schedule, and the health data system can be used to encourage compliance. As long as a patient has no need for medical treatment, he can avoid the testing program. However, once he becomes a patient, for whatever reason, his record will be there at the doctor's fingertips showing all tests he has not had but should be persuaded to have before he leaves the field clinic or wherever it is that he has come to the medical facility's attention. In discussing a system serving such patently humane purposes, words like "control" and "coercion" may have an objectionable ring, but the coercive potential of the surveillance component, especially in some other area of application, is evident.9

In another environment, the statistical-reporting component of the system could also have potentially unsavory consequences for individuals. It is characteristic of modern organizations to single out "high risk" categories of people to whom the normal standards and rules do not apply. Often these high risk groups are identified from statistical studies of populations that use the services an organization offers. The consequences for any given individual exhibiting the characteristics of the high risk group may range from total exclusion (uninsurability) to being made eligible for special treatment (remedial education, free medical care). Although there is nothing intrinsically harmful in such practices, in dealing with human populations it is essential not to assume that any single member of a statistically defined group will necessarily behave in the way predicted for the group as a whole. Theoretically, the adverse consequences of "statistical stereotyping" can be avoided by permitting an individual to know that he has been labelled a risk and to contest the label as applied to him. However, depending on the circumstances-and particularly on the stake that an organization may have in being able to predict the behavior of each individual in its clientele-a lone individual could have considerable difficulty making his case.

Even the administrative record-keeping component of a comprehensive data system can have coercive effects. When the administrative part of the health data system was described to the Committee, repeated reference was made to the advantages of knowing that a patient has previously been treated for an emotional disorder when he shows up at a clinic claiming that he has accidentally scratched his wrist on a rusty nail. One hopes that his chances of being discharged after some bandaging and a tetanus shot are about the same as his chances of being committed for treatment as a potential suicide. But are they? Should they be? In some other record-keeping environment, could an individual depend on having someone equivalent to a trained medical practitioner available to make such a judgment?

Finally, it is important to note that the health data system has grown very rapidly, that elements like the "high risk" categorization were not present in the beginning, and that the health facility is now trying to improve its method of identifying patients for the purpose of updating and retrieving the information it maintains about them. In this particular situation, the Social Security number happens to be considered a poor identification device because many patients are thought to have more than one; but the patients also tend to have several different names, so the managers of the data system are trying to develop their own unique numbering scheme cross-referenced with all known "aliases" for each patient.
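A cross-referenced numbering scheme of the kind the system's managers are described as developing might look, in miniature, like the sketch below. The names and the starting number are hypothetical; the design point is simply that every known alias is indexed to one facility-assigned number, so a record can be retrieved under any of a patient's names.

```python
# Hypothetical miniature of an alias cross-reference index:
# one facility-assigned number per patient, with every known
# name mapped back to that number.
patient_index = {}     # alias -> facility-assigned number
_next_number = [1000]  # hypothetical starting number

def register(aliases):
    """Assign a new facility number and index every known alias to it."""
    number = _next_number[0]
    _next_number[0] += 1
    for name in aliases:
        patient_index[name] = number
    return number

def look_up(name):
    """Retrieve a patient's facility number from any of his names."""
    return patient_index.get(name)

n = register(["John Doe", "J. Doe", "Johnny Doe"])
print(look_up("Johnny Doe") == n)  # True: any alias finds the same record
```

The convenience is real, but so is the effect the chapter describes: the index works precisely because it defeats the patient's ability to appear under different names.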

Scheduling, labelling, monitoring, improved methods of identifying records about individuals-these are being discussed in some quarters today as if they were mere tools for delivering services to people efficiently. In the health data system just described, the surveillance component is regarded as a way of providing preventive health care; of taking preemptive steps to halt the natural development of illnesses and conditions conducive to illness. It is hard to quarrel with those objectives, or for that matter with the objectives of a great many data systems now in operation or being planned. Should a national credit card service be prohibited from using a sophisticated personal data system to prevent its card holders from going on irresponsible spending sprees? Should school districts be forbidden to use personal data systems to help prevent children from becoming delinquents?

These are difficult questions to answer. Often the immediate costs of not using systems to take preemptive action against individuals can be estimated (in both dollars and predictable social disruption), while the long-term costs of increasing the capacity of organizations to anticipate, and thus to control, the behavior of individuals can be discussed only speculatively. One fact seems clear, however: systems with preemptive potential are typically developed by organizations, and groups of organizations, who see them primarily as attractive technological solutions to complex social problems. The individuals that the systems ultimately affect, the people about whom notations are made, the people who are being labelled and numbered, have, by comparison, a very weak role in determining whether many of these systems should exist, what data they should contain, and how they should be used.

The Net Effect on People

Today it is much easier for computer-based record keeping to affect people than for people to affect computer-based record keeping. This signal observation applies to a very broad range of automated personal data systems. When a machine tool produces shoddy products, the reaction of consumers (and of government regulatory agencies in some cases) is likely to give the factory managers prompt and strong incentives to improve their ways. This is much less likely to be the case when computerized record-keeping operations fail to meet acceptable standards.

There is some evidence that in commercial settings competition helps to prevent harmful or insensitive record-keeping practices, especially when a record-keeping organization (a bank, for instance) depends on continuous interaction with individual data subjects in order to keep its own records straight. It is also true that a number of schools and colleges have been forced to abandon automated registration and scheduling by determined student campaigns to fold, spindle, and mutilate. In governmental settings, however, the dissatisfied data subject usually has nowhere else to take his business and can even be penalized for refusing to cooperate. The result, of course, is that many organizations tend to behave like effective monopolies, which they are.

It is no wonder that people have come to distrust computer-based record-keeping operations. Even in non-governmental settings, an individual's control over the personal information that he gives to an organization, or that an organization obtains about him, is lessening as the relationship between the giver and receiver of personal data grows more attenuated, impersonal, and diffused. There was a time when information about an individual tended to be elicited in face-to-face contacts involving personal trust and a certain symmetry, or balance, between giver and receiver. Nowadays an individual must increasingly give information about himself to large and relatively faceless institutions, for handling and use by strangers-unknown, unseen and, all too frequently, unresponsive. Sometimes the individual does not even know that an organization maintains a record about him. Often he may not see it, much less contest its accuracy, control its dissemination, or challenge its use by others.

In more than one opinion survey, worries and anxieties about computers and personal privacy show up in the replies of about one third of those interviewed. More specific concerns are usually voiced by an even larger proportion.11 The public fear of a "Big Brother" system, in effect a pervasive network of intelligence dossiers, focuses on the computer, but it includes other marvels of twentieth-century engineering, such as the telephone tap, the wireless microphone, the automatic surveillance camera, and the rest of the modern investigator's technical equipage. Such worries seem naive and unrealistic to a data-processing specialist, but as in the case of campus protests against computerized registration systems, the apprehension and distrust of even a minority of the public can grossly complicate even a safe, straightforward data-gathering and record-keeping operation that may be of undoubted social advantage.

It may be that loss of control and confidence are more significant issues in the "computers and privacy" debate than the organizational appetite for information. An agrarian, frontier society undoubtedly permitted much less personal privacy than a modern urban society, and a small rural town today still permits less than a big city. The poet, the novelist, and the social scientist tell us, each in his own way, that the life of a small-town man, woman, or family is an open book compared to the more anonymous existence of urban dwellers. Yet the individual in a small town can retain his confidence because he can be more sure of retaining control. He lives in a face-to-face world, in a social system where irresponsible behavior can be identified and called to account. By contrast, the impersonal data system, and faceless users of the information it contains, tend to be accountable only in the formal sense of the word. In practice they are for the most part immune to whatever sanctions the individual can invoke.

1 New York Times, January 26, 1973, p. 4.

2 Although the term "dragnet" commonly connotes a system for catching criminals or others wanted by the authorities, the term, as used here, refers to any systematic screening of all members of a population in order to discover a few members with specified characteristics.

3 See Appendix E for a discussion of the development of computerized criminal justice information systems in the United States.

4 The NCIC system has been imitated by many city police departments whose systems respond to inquiries from law enforcement jurisdictions in adjacent suburbs. A suburban law enforcement officer first queries the city system to which his terminal is linked; if the file search there yields nothing, his query is passed on automatically to the State system and from there to the NCIC. These local systems have all the accuracy problems of the NCIC and some are currently the objects of law suits brought by their hapless victims. See, for example, "S.F.'s Forgetful Computer," San Francisco Examiner, May 9, 1973, p. 3, and "Coast Police Sued as Computer Errs," New York Times, May 5, 1973, p. 23. Almost all of these cases involve the failure of a local jurisdiction to report the recovery of a stolen vehicle or the revocation of a warrant.

5 For a discussion of political issues raised by computer-based information systems in urban government, see Anthony Downs, "The Political Payoffs in Urban Information Systems," in Alan F. Westin (Ed.), Information Technology in a Democracy (Cambridge, Mass.: Harvard University Press), 1971, pp. 311-321.

6 In addition to incompatibilities of file structure, the expectation that some day "it will all be put together" also runs afoul of the tenacity with which record-keeping organizations tend to protect their own turf. Certainly among private organizations competitive pressures sometimes inhibit the free circulation of information about clients and also induce resistance to sharing large blocks of individually identifiable data with government agencies. The California Bankers Association, for example, is currently involved in litigation (Stark v. Connally, 347 Fed. Supp. 1242, 1972) to prevent the Treasury Department from enforcing the reporting provisions of the so-called Bank Secrecy Act of 1970 (12 U.S.C. 1829b; 31 U.S.C. 1051-1122) with respect to domestic financial transactions.

7 It should be noted that the same characteristics of automated systems which inhibit the compilation of dossiers can also inhibit efforts by the press and public interest groups to penetrate the decision-making processes of record-keeping organizations and expose them to public scrutiny. This is particularly true when organizations destroy "hard-copy" records after putting the information in them into computer-accessible form. In such cases, the computer can become a formidable gatekeeper, enabling a record-keeping organization to control access to public-record information that previously had been available to anyone with the time and energy to sift through its paper files. Putting public-record data in computer-accessible form can also increase the cost of piecing information together from several different files. The same programming costs that make it uneconomical for law enforcement investigators and private detectives to "fish" in the automated files of a credit bureau could also make it prohibitively expensive for private citizens to examine public records.

8 See, for example, Russell Ackoff, "Management Misinformation Systems," in Westin, op. cit., pp. 264-271.

9 A computer-based information system designed to control the population of a prison is described in Appendix F.

10 For a cogent description of how this is done, see James B. Rule, Private Lives and Public Surveillance (London: Allen Lane), 1973, especially Chapter 6. See also Robert A. Hendrickson, The Cashless Society (New York: Dodd, Mead & Company), 1972.

11 See, for example, A National Survey of the Public's Attitudes Toward Computers (AFIPS-TIME, Inc.), 1971. This survey is discussed in Alan F. Westin and Michael A. Baker,
