Reprint of 1973 paper by:
R. W. Bemer
Honeywell Information Systems
Phoenix AZ US
NOTE 1: This was a slightly abridged version of a presentation to the NordData Conference in Copenhagen, Denmark, 1973 August 15-17. It was not submitted in time to be included in their published proceedings, and so was published in the Honeywell Computer Journal to rectify that omission.
NOTE 2: The scanned version of the original six HCJ pages is also available on this site.
NOTE 3: This HTM version is free to use to abstract and/or edit any part of the text.
Computer usage is classified as either 1) advisory, 2) leading to decisions by humans, or 3) with decisions being taken by a preprogrammed computer unless countermanded in time. Some examples of difficulties even in the first two categories imply that caution in the third is imperative. The computer technology learned from the space effort is not yet transferred to the bulk of computer usage. Both legal and voluntary (professional) measures against misuse are discussed.

INTRODUCTION
In 1950, after my "graveyard shift" at the RAND Corporation, I was still working at 0830 on a 604 board to take an 8-digit square root of an 8-digit number (until then not accomplished mechanically for that equipment). A round little man approached and asked what I was doing. I told him. He then asked about the calculator, and as I answered each question the next one got more difficult and penetrating, until I was really straining every faculty to answer correspondingly. He did not introduce himself, but I found out later that day that it was John von Neumann.
Naturally the incident remains very clear in my mind. I recall that he did not leave me saying "Use the tool well for the social benefit of mankind", or anything else in this vein. There were very few men in the computer world or business then that were considering social ramifications of this sort. Ed Berkeley was, and remains, an exception. To most of us it was just a time of freeing the mind to do far beyond our previous capabilities, at a fantastic rate. We were lured and beguiled; the newness and vast potential drew us, with so much waiting to be done. We took little time for speculation about the eventual effect of computers upon our society, or the extent and scope of the usage to come.
This insensitivity may also have been due to the fact that the first work was almost exclusively concerned with processes upon numbers. Even when I started in 1949, ten years after the first program-controlled calculator was designed, the manipulation of symbols was considered by only a few, and did not even become recognized as a proper computer function until 1956.
I intend to show that there has been a significant change in the type of applications made possible by computers, a change we are ill-prepared for. Any tool that provides leverage or amplification can be misused. I shall give some case histories to demonstrate some ways of misuse and why they continue to be effective. Then I shall outline some measures to reverse the trend and stop much of the misusage.
A CLASSIFICATION OF COMPUTER APPLICATIONS
For purposes of this talk, I propose a simple and perhaps novel classification of computer usage:

- Applications that do not lead to decisions affecting humans directly.

Examples come largely from the field of numerical computation, the earliest category of usage: computational results that might tend to prove or lead to a theory; calculations for spaceship or missile design (they don't have to be built or launched); programs for playing games, or associating payoffs with strategies, etc. We may term such computation advisory.

- Applications with computational results that lead to decisions by humans.

Some of these can get very close to integration into human affairs. For example, someone may be denied credit or refused an employment opportunity. It has turned out, in much practice, that the human decision to be taken may be perfunctory or mindless. Nevertheless there is recourse, no matter how time-consuming and difficult it may be, and regardless of what body of law may need to be enacted to protect people in such circumstances.

- Applications where the computer has been previously programmed to take a decision and action, and will in fact act unless countermanded in time.

Examples are online patient monitoring, control of nuclear power plants, air traffic control and collision avoidance systems, automatic transportation systems (e.g., BART, in San Francisco), and automobile braking and antiskid systems.

The hardware developments of about the last three years, leading to microprocessors on chips, portend a tremendous increase in the third class of application. This is why we must be on guard as to the propriety and systems aspects of such applications. Applied to digital wristwatches, this does not seem critical. Applied to automobiles, it could be extremely critical. One is reminded of power steering, a boon when it operates, perhaps, but a definite danger when power fails or is turned off.
A pair of questions indicates a possible dilemma:

- Q: Does technology exist to integrate computer components very closely into human affairs?
- A: Yes. For an example, see the 1974 US automobiles, which will not operate unless seat belts are fastened.
- Q: Are system design and good practice manuals available for such a level of technology, and/or is suitable indoctrination and education available in our educational institutions?
- A: Emphatically NO! And this is frightening enough to suggest a moratorium on such developments until we understand the tool better.

Consider the announcement of an experimental device that requires matching a certain procedure before you can start your automobile. The intent, and certainly an obvious usage, is to preclude drunken drivers from operating vehicles. But suppose that you are extremely shaken because your wife has just been killed, and your child needs to be taken to the hospital. Could you start the car then?
Or consider the case of online patient-monitoring reported in Datamation magazine of 1972 October. The programming was correct but the computer was not 100% reliable. This, as we know, is taken care of by having a customer engineer to fix it. But nobody remembered to find out whether the customer engineers would always be available over the weekend, and speedily. As reported, a patient died because confusion in the human system caused the computer to remain inoperable.
Certainly the US space effort has gathered ample experience in the matter of letting computers decide, when they are capable of it, and of overriding them sensibly when it is shown that they were programmed incorrectly or without consideration for all eventualities and malfunctions. We see many spin-offs from the space effort with respect to products, but very little in methodology which could be so very applicable to computer usage.
I now wish to make it quite clear that I like computers. I believe that they are presently more beneficial than harmful to society, and that this ratio can be increased if we take careful consideration and plan for their best and proper usage. If I were fatalistic, I should feel that they have arrived just in time to save us from our enemies, who are ourselves. In 25 years as a programmer I have never faced a day of working with computers without pleasant anticipation.
I also like a fire in the fireplace, but not arson. Both fire and computers are tools accessible to all of society in some form, and society uses such basic tools in many ways, some deemed good and some bad.
Fire was an early tool, useful for hollowing out logs to make vessels, to make transformations in food, and to heat enclosed air. It was also used to burn vegetation and trees, sometimes accidentally (which was thought bad) and sometimes deliberately, to clear for planting (which was thought good).
A major difficulty in analyzing the contribution of a tool is the inability to categorize, in an absolute way, its uses as being good or bad. This is not philosophical, but only to remind us that we make these judgments of good and bad in the narrow context of our mores and morals, which are in turn conditioned by our accumulated knowledge and analysis of the workings of our world. We have learned a little more of those workings lately, not because we sought the knowledge so much as because it has been made painfully evident to us that there is more or less coupling between all the elements of our world.
I quote from an interview with Dr. Carl Hammer of Univac, regarding a conversation with V. A. Trapeznikov, acting Co-Chairman of the United States and Russian Joint Commission on Scientific and Technical Cooperation: "He told me, as he told President Nixon one day earlier, that 'we all must cease to make wrong decisions on a large scale because mankind can no longer afford it. Mankind's resources are highly limited, and we can no longer squander them' ... we must develop not only national but international models for improving our decisionmaking processes. Decisions which at this time are made on a political or emotional base; neither way will produce optimal results".
So I touch on some bad uses of computers only to illustrate the problems to overcome by legislation, education, and professionalism to make computers serve us better.
THE POWER OF THE COMPUTER
Carl Hammer says "We have already built into our society a mind-amplifying factor of 2000 to one. Behind every man, woman, and child in this country (the US), there stands the power of 2000 human beings. The responsibility of any data processing manager of today, of the computer scientists ... is so enormous that even I cannot envision it. It is the greatest challenge that has ever faced mankind".
Power it is, in elemental form. IBM's recent advertising stresses "think of the computer as energy". Theoretically, the computer is vast power at the service of people, to be used as the imagination of the people leads it, subject of course to limiting legislation. But let us not be lulled by any advertising into thinking that the energy is just like electricity. Computer power is work power, but it is also knowledge power, of the kind that has been used throughout history for aggrandisement as well as the good of the people. In a time when technology stands at bay, it will be well to consider the dangers of computer misuse in prejudicing the population against a valuable tool, and of misuse by corrupt or ignorant officials.
There are no known instances of computers voluntarily stopping normal work to perform illegal acts without direction by humans. Consider the science fiction capability of walking through matter; we have seen it in the cinema, usually used to get into the bank vault or perform some other evil deed. But in the cinema it was a power accorded only to a few, being so technically difficult. Computer power is available widely, and we must not be surprised that some people should turn it to their own ends in disregard of the general benefit of society.
Consider the case of Jerry Schneider. There is no problem in mentioning his activities: he sent an abstract of a paper that he wanted to present at the 1973 National Computer Conference, describing how he tapped into a computerized ordering system and stole something like $1 million of telephone equipment by having it delivered to a telephone company van bought at auction. The computer program, not knowing how to bill and get payment, ignored the loss as being within loss limits. After being turned in by an employee, Schneider spent two months in jail and was back in business -- as a computer security consultant!
There is no question but that computer power may be abused by individuals. It may be so used by larger entities, such as corporations, to fool or defraud. It may be so used even by governments, wittingly or not.
Dr. Henry Bruck of M.I.T. spoke of this at ACM 70, in a talk entitled "To Redress the Balance". His thesis was that computers, because of cost and training investment, were more likely to become the tools of government and big business than the general public. Countering the argument that minicomputers, microcomputers, and hand calculators are available to individuals at low cost, he said that it was a fallacy to assume that this meant that computer power was available to the general public for this reason. Shovels for a penny are useless unless one knows how to dig, and has arms. It is the usage skill that is important.
He thought it would be unnecessary overspecialization to modify education so that imparting basic computer skills (and problem-solving techniques) would be given as much emphasis as learning one's own language. Nor would the answer be to reduce usage by government and business, for we have ever more need for decisionmaking information that is more likely to be accurate and complete, taking into account the overall advantage to people. However, he saw no reason why computer services could not be provided to the citizenry through public institutions.
I agree. There are many opportunities for computer services to be provided by municipalities and/or private ventures. One can imagine data banks that could serve as advisories for human action and choices. There is an experiment in Los Angeles where the computer serves as a general counselor for a multitude of services. Consumerism could be served in a great many ways -- product safety and efficiency, comparative shopping, financing aid for major purchases, reminders for preventive maintenance, etc.
Thus there are many ways to redress the balance by making computer power really available to everyone in a direct manner and without having to learn how to program. There is a need, however, for a certain amount of "computer literacy" in order to feel comfortable with such usage.
A PANORAMA OF EVILS ARISING FROM THE "AUTHORITY" OF THE COMPUTER
As a tool, the computer has become commonplace with a rapidity exceeded by no other, even the automobile. This has caused some dislocation and unease, which the practitioners have not been able to avoid. Most major tools, when introduced, have had their custodians, and then their guilds or professions that, from gradual experience, added to the body of law and practice those safeguards for usage that appeared necessary as occurrences of misuse accumulated.
This did not occur with computers, and perhaps we did not even use the time that was available to us, so caught up were we with the mystique and power. Certainly we did not familiarize people generally with computers; instead, they were publicized as "giant brains", and the mystique grew into authoritativeness. One of the main problems with authority is that it can be blamed. Surely you all know many examples, but I shall add a few to your knowledge:
The Authority of the Computer as a Scapegoat and Excuse

Perhaps it is a worldwide phenomenon. One calls the store that has made a mistake in the bill, the bank that has not returned the cancelled checks, the association that has blacklisted your credit -- and the voice replies "I'm sorry, sir, but we have a computer now...".

- The Allen Piano and Organ Company of Phoenix advertised by radio that its computer had made a mistake in ordering inventory; they were now overstocked and were therefore holding a sale. I wrote the company a letter, on behalf of the Association for Computing Machinery, offering to fix the computer or program so it would not make such a mistake anymore, on condition that if it developed that a human was at fault, and not a computer, they would acknowledge this in their subsequent broadcast advertising. Datamation magazine followed the story -- it turned out that the Allen Piano and Organ Company did not have a computer, nor did they use any computer facilities.

Note the conviction of the advertisers that a computer would give authority to their spurious claim of overstocking.

- One Mr. D'Unger, not of the computer community, wrote to several companies maintaining mailing lists containing his name, either for billing or solicitation, asking them to please spell it correctly. Not DUNGER, and not D UNGER, and not Dunger (for those with lower case capability). He received several replies, all saying that it was unfortunately impossible with their computer equipment. Learning of this from his letter to Computerworld, I called several of these data processing departments, to find in each case that the print chain was in fact an IBM chain that did have the apostrophe on it, but that they had not bothered to use it! It seems to me that a man's name is a dear possession, and not one to be treated cavalierly under cloak of computer authority.

- I once visited a home where four elderly women were playing bridge. When they found out that I was in the computer profession there was a chorus of horror stories. Then one brought out a letter from her bank, with a handwritten apology from the teller for the shortcomings of the computer. I was on the spot. To save face I called the bank vice president to see what could be done. They didn't have a computer either!

The Authority of the Computer as an Accomplice

The computer is a convenient means of implicitly or explicitly covering activities that run from illegal to self-serving, intentional or unintentional:

- The notorious Equity Funding scandal will certainly become a classic, even though the exact ways that it was perpetrated will take some time to discover. We know, even now, that it was a pyramiding operation, and that computers were used to give authority and extra layers of protection from discovery. Many corrective actions could arise from the case, such as new emphasis on EDP auditing. It appears that perhaps as many as 200 people were involved in collusion.

- The University of Michigan has a research service that projects the effect of various decisions and actions upon the GNP (Gross National Product) and its growth, with respect to the State of Michigan. The results could easily be given in regular typewritten (or typeset) reports, but they are not! A computer printout accompanies the report to give it authority. The set of results that I saw seemed both spurious and misleading, and perhaps others could have detected this had they been as unawed by computers as I am.

Perhaps there may come a day when the US augments its Environmental Protection Agency with a Human Protection Agency. Then, taking the lead from the present requirement to print on cigarette packages that "cigarette smoking is dangerous to your health", it could order that each computer-printed page be preceded by:

"WARNING -- these answers were produced by a computer, and could be hazardous to your health!"

Of course I am being facetious about the overkill, which does not seem to diminish smoking anyway, but I do recall a case when:

- Univac was attempting to sell the US Army an 1107. The benchmark process included a compilation and run of a certain FORTRAN program. The 1107 compiler printed a diagnostic indicating an entry into the middle of a DO loop. The General in charge indicated that this was impossible, as they had been running that same program for three years, and asked a programmer to examine the situation. He returned in a short while and said "Sorry, General. Three years of wrong answers".

The Computer as a Sewage System
A well-known truism of computer usage is "Garbage In, Garbage Out". But what happens when we put perfectly valid data in? Can we get it out again? Can someone else do so? If it does come out, is it legible?
We still live in the computer era where 90% or more of the data depends entirely upon the associated program to be turned into information. The data description of COBOL is a start to improve this, but why should the description be appended to the program rather than to the data itself?
Do you need a program to read a book in the library? At ACM 70, Dr. John Richardson of the US Dept. of Commerce said "Information Conserves Resources Through Better Decisions", but some of the valuable data that we need to make those better decisions is not, in fact, retrievable, exchangeable, or digestible. It cannot be turned into information. Indeed, one of the major findings in the various studies of data banks is that the sum of many small data banks is not a large data bank, at least not yet, contrary to the fears of many. And yet there are good as well as harmful reasons to consolidate data. If, for example, the US Congress had two reliable pieces of information -- 1) how much it was costing to not grow cotton, and 2) how much it was costing to promote the use of cotton -- the very juxtaposition might give rise to some better decisions. The organizing power of the computer depends completely upon legibility and interchangeability of data.
A classic example is the situation that arose when the EPA (US Environmental Protection Agency) was formed by consolidation of several diverse groups, each with its own information systems. When they tried to consolidate the data as well, surely one of the main reasons for the coalescence, they found out that data could not only not be exchanged between various components, but not even between the several computer systems in the subdivisions of the agencies! And, of course, the air masses travel over many states, each with its own computers and monitoring systems, and each incapable of making decisions that would optimize for the entire country, much less the world -- if that possibility were permitted.
Examples of the illegibility of computer data without the program are countless. Dr. Fred Whipple, the astronomer, once mentioned that only 1 % of his information from satellite and probe vehicles was being processed. I corrected him slightly to say "data", and he reiterated "information". I asked if anyone could process the tapes if the program were destroyed? He admitted that it would be impossible. "Data" it was.
The Las Vegas city police and county sheriff's department recently consolidated to form a "Metropolitan" Force. It will be many years before their computerized data files can also be consolidated to be of efficient use.
Of course this particular manifestation of swallowing of data and not giving it back to anyone else could be largely solved by using labels and data description on data media, so that the data can be self-descriptive. Congressman Brooks of the US has called for a "declaration of independence for data". Another way of not being able to get data out is to have the computer system fail.
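The principle of self-descriptive data media can be sketched in modern terms. The following is a minimal illustration (not any 1973 format): the labels and field descriptions travel with the data itself, so a second program, with no knowledge of the one that wrote the file, can still recover the information. The station codes and field names here are hypothetical.

```python
import csv
import io

def write_self_describing(records, fieldnames):
    """Serialize records with an embedded header row naming each field,
    so the description is appended to the data, not to the program."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()   # the label travels with the data
    writer.writerows(records)
    return buf.getvalue()

def read_without_original_program(text):
    """Recover the records using only the embedded description."""
    return list(csv.DictReader(io.StringIO(text)))

# Hypothetical monitoring readings, as a stand-in for any agency's data.
data = [{"station": "LAS", "reading": "41"},
        {"station": "PHX", "reading": "44"}]
text = write_self_describing(data, ["station", "reading"])
assert read_without_original_program(text) == data
```

Had the EPA's constituent agencies written their tapes in such a labeled form, exchange between their computer systems would have been a matter of reading the embedded description rather than reconstructing lost programs.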
Integration of computer systems into human affairs demands extreme reliability. We all know this, yet there are many times when one is tempted by the power of the computer to entrust to it a function that has some deadlines. I am guilty of this myself. We use a computer for the text processing and publication of the Honeywell Computer Journal. The problem is that we are forced to share a computer that is used for software experimentation and new system software validation, or for benchmarking in various configurations. While the hardware may be very reliable, newly-developed software is, unfortunately, not -- and we have entrusted our total text to the disk files. When difficulties occur, no manual methods, however desperate and strenuous, can be employed to do a makeshift job. It is the ultimately perfect job or none at all; we are at the mercy of a system that must be fully operational.
The point of my story is that it is a human failing to be optimistic that the computer will be up! So one does not plan for back-up, duplicating files on another system, or batch methods that work even when timesharing is down. Now we cannot even reprocess the sewage. We have given the computer valid and useful data and cannot get it back until too late.
SOME ACTIONS TAKEN FOR
Society long ago learned to impose minimum restrictions and educational or training requirements upon classes of workers whose operations affected the public safety or welfare. These constraints led to professions, with codes of ethics and a store of recommended practice often embodied in local law, such as building codes. Examinations by peers is a prerequisite to practice -- for doctors, lawyers, engineers, accountants, ad infinitum. Until now such restrictions have not been imposed upon the computer community; one can only suppose that the professions just mentioned did not materialize so abruptly before the social consciousness.
Some public exposure of malfeasances moved the legislature of the State of California to consider, in 1971, the certification of computer programmers as a class. This was given attention by the press and, together with the fact that the legislature was in a quandary, it was sufficient for assistance to be asked of AFIPS (American Federation of Information Processing Societies). AFIPS convened a System Certification Committee in 1972 February.
The committee arrived very quickly at the conclusion that there seemed to be no authoritative way to achieve certification. I proposed that a series of books of good practice should be conceived and constructed through AFIPS. This project is now underway. The first such book of good practice is on confidentiality and security, due to the very strong and justifiable interest in this topic at the moment, and is about to be field-tested. It is largely in checklist form. As a minor note, the committee has changed its name to "Systems Improvement", to emphasize the fact that it does not feel that any form of certification is feasible yet.
RELIABILITY FOR INTEGRATION INTO HUMAN AFFAIRS
This was the title of one of the sessions of the 1973 National Computer Conference in the US. The session had a certain distinction. The other sessions were, by design, to reflect a "vertical" or "end use" orientation. Here I deliberately chose, in planning the program, to take a further step, to see what aspects of computer systems design were common to many end uses for the specific reason that they were directly integrated into human affairs.
The panel included representatives from air collision avoidance systems, online patient monitoring, online power plant control, credit systems, ground transportation, and merchandising. Many of these applications are of Type 3; power control against blackout, for example, requires a response faster than a human can achieve. Air traffic control is another; in the 1980s there are expected to be 5000 people always in the air above Los Angeles, in 700 craft! The representative gave two major requirements:

- Predictable reliability should be astronomical.
- There should be "bail-out" capability for whenever the system fails unpredictably.

This second point created much discussion. Many of the builders of complex computer-controlled systems found that the people who ran such systems were seldom able to practice fixing them. When they did fail, they were not properly capable of coping. It was suggested that holding "fire drills" for such systems was a basic element of good practice.
Searching for other elements of good practice, it appeared that none of the panelists or their design teams knew of any source or reference book to use for reliability aspects of computer usage, even though there was much commonality in their applications. There are some specialists in this field, such as Bob Patrick, but no body of knowledge is available generally. Patrick gives some examples of bad design:

- One computer installation had back-up tapes in a fireproof vault, and "grandfather" tapes inside a mountain. But there was only one copy of the "run book" that told the operator how to read the tapes, and that copy was in the machine room, where it would be lost in a fire.

- A military installation had high security, and was very protective of the data. To ensure good readability, the tapes periodically had the first 20 metres or so clipped. The problem was that these tape strips were thrown away, in the custody of garbage men without clearances, and they had not been erased!

Donn Parker, who chaired the above-titled session, is an authority on computer-related crime. His estimate is that this now amounts to $300 million a year, and will reach $2000 million in the 1980s! Dick Mills of the First National City Bank says that the bank has $8000 million per day in its interchange "pipeline", so that even a small leak drains a lot. It would seem that we are not being overcautious in insisting upon reliability in such "people-sensitive" applications.
PROTECTIVE MEASURES AGAINST MISUSE
There are many examples of laws for involuntary personal protection. Construction workers must wear hard hats; motorcyclists must wear leather and helmets. These are occupational protections enforced upon the individual, presumably because he represents an investment by society.
The US Government has imposed certain requirements upon the manufacture of automobiles, i.e., to be constructed so as to withstand collision of X km/h without sustaining more than $Y in damage, or the like. The Government has stated that requiring such action is within its right to protect the safety of its citizens. It seems certain that the computer has a direct effect upon not only the safety of our citizens, but also upon other rights. It might thus be reasonable to demand that software and hardware should also be built to certain standards to protect these rights.
We are certainly going to have to build computer systems with facilities for confidentiality and security. Although there is no law on this, there is little doubt that US Government users will be demanding these features.
I shall not mention more, because this area is covered comprehensively in "Legal Aspects of Computerized Information Systems", a US Govt. report that the Honeywell Computer Journal was privileged to present in Vol. 7, No. 1.
Dr. Harold Sackman, Chairman of the AFIPS Committee on Social Implications of Computers, called recently for a "computer user society of America". This was to be a computer citizen's group active in social reliability, for the reason that the computer community really gets to see the problems first, and has the responsibility to expose the problems to those who can treat them. The ACM owes much to the Scandinavian creation of the ombudsman; its ombudsman program has solved many problems of bad computer usage.
There is a growing class of auditors versed in data processing, but we may have to take drastic measures to aid them. There are many current efforts for better methods for software construction. One hopes that increased simplicity will lead to more direct legibility and auditability of computer programs. Most programs are documented poorly, and I see only one hope of solution -- the program specifications, narrative documentation, and operating instructions must be integral! Using a block-structured language is vital to constructing auditable software. It also enables programmed devices to detect tampering with the running programs.
Handbooks of design and practice are required to be available before computing can truly be a profession. Many computer societies are in various stages of using codes of practice and certification of practitioners. One hopes that they will not stop short of general certification but will also adopt application-oriented certification in joint action with the professions of those applications.
We will have to equip our systems with performance measuring and evaluation capabilities. Wastage of resources has been considered an evil in other fields before this.
As custodians of the power source we have many responsibilities. When I planned the ACM 70 Conference it was as a model for a National Computer Year, which could possibly be followed by an International Computer Year. A possible list of goals for such a Year could be:

- To consciously put computers in service to international goals, to increase public understanding of the role and potential of computer usage, and to accent the role of the computer as servant by more humanization of applications and usage.
- To develop strategies for the best future use of computer systems (technological, social, educational, political, and legislative).
- To conserve, and maximize the utility of, those existing and future intellectual resources known as data and programs, by finding how to utilize them on multiple equipment and in multiple applications.
- To aid government, business, and private decisionmaking by opening up new and more complete data for those decisions, and to facilitate the making of those decisions by reducing the information volume required (as opposed to data volume).
- To plan a closed cycle for redistributing work assignments between people and computers, with re-education prior to change of assignment, so that people can best fulfill their potential.
- To ensure that public safety and welfare are considered adequately when computers are integrated directly into human activity.
- To set up new and broad interdisciplinary paths for exchange of information among hitherto segregated organizations, and to foster their maximum involvement on an international scale.
- To plan the most economical and effective interaction between computing systems and other systems such as communications.

It is not too soon for a comprehensive examination of the interaction between computers and our society. Two papers from the 1973 National Computer Conference support this view -- "The Social Implications of the Use of Computers Across National Boundaries" and "A New NSF Thrust -- Computer Impact on Society". NSF is the National Science Foundation of the US, and the latter paper demonstrates concern on at least a national level.
I have the feeling that it won't be so difficult for computers and society to adjust to each other if we really put our minds to making it happen. In 1970 an Assistant Postmaster General of the US observed that a third of all first class mail is machine-addressed, but only 6% arrives on the post office docks in Zipcode order. He asked why the computerized address files could not be ordered by Zipcode as well as any other way. So I asked many data processing departments the same question. The answer was that they had not thought about it, and would just as soon do it that way.
It's as simple as that.
- R. W. Bemer, "Computers and our society", NordData, Copenhagen, 1973 Aug; Honeywell Computer J. 8, No. 1, 49-54, 1974
  a -- Reprinted in Jurimetric J., 1974 Fall, 43-55
  b -- Reprinted (translated to German by Prof. Heinz Zemanek) as "Über den Computer in unserer Gesellschaft", Elektronische Rechenanlagen 19, No. 4, 167-172, 1977 Aug
  c -- Republished as "The frictional interface between computers and society", Computers and People, 14-19, 1975 Jan
  -- Computing Reviews 29244
  -- Computer Abstracts 77-2514