[The Montana Professor 1.1, Winter 1991 <http://mtprof.msun.edu>]
David Barnhart
Chemistry
Eastern Montana College
By the late 1950s the model of a calculating machine that we now call a computer had been developed. The intervening years have brought miniaturization processes allowing the storage of large amounts of data in small units and access to them in relatively short times. However, there has been little change in the basic structure of these machines, and while the future no doubt holds some departures from this state, for the most part the last thirty years have seen technical improvements of what existed in that period. There has been a change in the way many people conceive of the computer, as witnessed by the change of terminology from "calculating machine" to "data processor" and so on. One of the outstanding software developments of this period has been the adoption by many scientists of the FORTRAN programming language. FORTRAN permits the use of large data files and/or complex mathematical processes in the standard scientific method of testing hypotheses by statistics, in situations where the sheer size of the problem had made a solution impractical on any reasonable time scale. Below I will briefly discuss the influence of mathematicians on the development of computing, but for now we should note that FORTRAN was made possible primarily by technological developments. Higher-level language usage was achieved by the development first of the various machine languages, and second of the assembly languages, the latter being something of a hybrid of higher-level forms and machine language. I began my own programming by setting up hexadecimal codes for words and then physically toggling these into the central processing unit (CPU) of the computer. Present-day languages are translated by programs called compilers into machine-specific operations; usually several steps must be effected before the binary structure of the actual machine process is carried out. However it is done, there are many problems involving vast numbers of calculations that can now be accomplished in relatively short intervals. The non-linear least-squares process used in determining chemical structures, as in the determination of the helical structure of DNA by x-ray crystallography, is one well-known example.
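Schematically, such a refinement seeks the parameter set $\theta$ minimizing the weighted sum of squared residuals (a textbook form; crystallographic practice elaborates the weighting and parameterization considerably):

    $S(\theta) = \sum_i w_i \left[ y_i^{\mathrm{obs}} - f(x_i;\theta) \right]^2$

Because $f$ is non-linear in $\theta$, the minimum must be sought iteratively rather than in closed form, a task hopeless by hand and routine for the machine.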
University scientists generally work individually in their laboratories and find the relative non-portability of FORTRAN, and the difficulty outsiders have in reading it, not a major problem. The practical application to their problems is the main object, and to them the computer is a giant calculating machine. At the beginning of these programs one would find some machine-specific statements that set up the CPU so that the necessary temporary storage spaces could be allocated. These were followed by statements directing the computer to establish the formatting of the input and output of data and results. At the end, then, one would tell the computer to free these areas. In between were the groups of statements that were the heart of the program, i.e., the paradigms by which the calculations were made. The key to FORTRAN programming is the art by which these paradigms were constructed and written into the programs. They reflected the mathematical, theoretical, and empirical work of the scientist. I say art, because not everyone can efficiently write these wonderful programs. Computer centers attracted a breed of creature that was truly a mathematical artist, with a sheer creativity for transforming the analytical methods of the great French and German thinkers of the last century, and those of the great relativists and quantum mechanists of this one, into an almost poetic structure of economy and method. I, like many others, would carry my hopeful examples of programming to the center and watch entranced as these writers transformed my voluminous prose into such efficient poetry. When the paradigm was complete, the programmer in a carefree fashion would simply tell the computer to GO TO someplace else in the program. Credit should be given, of course, where credit is due. The basic processes of computing were not pulled from the air. For a long period, reaching a high point just prior to the Second World War, mathematicians had studied the problem of computability. A large number of these people came from central Europe and the British Isles./1/ War brought many of them to the U.S., where much of the subsequent work was done. The calculating machine became a reality based on the realization that there were connections between symbolic logic, Boolean algebra, mathematical algorithms, and electronics./2/ It did not just happen.
The necessities of war-based research led to the rapid development of calculating machines by such men as Von Neumann and, later, Ulam./3/ The programmers who make use of these machines at the basic level possess intimate knowledge of the underlying mathematics and logical principles, and while their results sometimes appear magical, this is in part creativity and in part a deep understanding of the processes./4/ I will say more on this later.
A key to understanding the difference between the latter-day languages and FORTRAN can be found in the attack upon the GOTO statement./5/ It is known that the "logical" languages (by current usage as applied to hierarchical forms, for example a WHILE language) can be put into a GOTO form, but the converse is not true./6/ This probably doesn't prove anything, but it gives an insight into the problems that may arise when "structure" and "logic" are made synonymous. The programmer knew where he or she was going (in a GOTO sense), but hardly anyone else did. The whole problem could be resolved by the use of non-executable comments in the program, but the nature of the programmer, i.e., the academic, usually meant that these did not appear.
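A minimal sketch, written in Pascal only because Pascal happens to admit both forms, shows the same trivial sum written both ways; my example is contrived, but the GOTO version computes correctly while the reader must trace the label to see where control goes:

    program LoopForms;
    label
      10;
    var
      i, total: integer;
    begin
      { GOTO form: correct, but the flow of control lives in the label }
      i := 1;
      total := 0;
    10:
      total := total + i;
      i := i + 1;
      if i <= 100 then goto 10;
      writeln('GOTO form:  ', total);
      { structured (WHILE) form of the same sum }
      i := 1;
      total := 0;
      while i <= 100 do
      begin
        total := total + i;
        i := i + 1
      end;
      writeln('WHILE form: ', total)
    end.

Both loops print 5050; the difference lies entirely in how plainly the structure of the computation can be read off the page.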
The as-yet unparalleled utility of the FORMAT statement in FORTRAN, along with the nature of the actual scientific-mathematical calculations, made it the language of choice for scientists. Academic scholars need to publish from within their laboratories reports that will stand the test of the American system of external peer review, a perhaps unparalleled method of ensuring scientific soundness. Scientists try, not always with success, to stay within the bounds of empiricism. I would venture to surmise that most academic scientists are in the field because of the pure enjoyment of the experimental processes that lead to discoveries of how the natural world of mass and energy operates. Blaise Pascal would approve of this behavior, as he states in his Pensées:
All our dignity consists, then, in thought. By it we must elevate ourselves, and not by space and time which we cannot fill. Let us endeavor, then, to think well; this is the principle of morality./7/
That he did think well there is no doubt. Perhaps many are aware of his achievement in developing the barometer for studies of atmospheric densities, for which modern science has awarded him the unit of pressure in the Système International, and those who have studied physics know of the outstanding work he did in hydrostatics. Many a student has had to puzzle over the conclusions to be reached by studying those intriguing forms known as Pascal's vases. In the same vein we might consider that Western man, who inherits the concept of being created in God's image, should reflect on the meaning of this in terms of human freedom and its direct results of service and creativity. The academic more than anyone else exists on these two principles, and methods that stifle creativity should be opposed. We are all too aware of what became of Scholasticism in the 16th century.
Scientists are human, however, and do sometimes confuse non-empirical, hence not operationally defined, concepts with the concrete results of experimental study. The quest to avoid a teleology in rational science has been carried on for two centuries now and has not succeeded very well. The mathematician-physicist has introduced various "axioms" in an attempt to avoid the metaphysical nature of modern "theories," but these so far have failed. Few scientists are interested in the mundane aspects of applied science, or at least few purport to be, and yet many scientists are really technologists. The largest deviations from these generalizations would be found in the life sciences, where the obvious medical uses have a more humanitarian appeal, i.e., service. Perhaps not all can be, as with a friend of mine who, asked by one of his doctoral committee questioners what practical application his mathematical study was valid for, wishfully replied, "None, I hope." Ulam thought much about such things. There are others also interested in escape from the fetters of a scientific conservatism of a non-empirical nature. David Bohm has written extensively on the subject./8/ Ulam states,
What a fertile field for mathematicians who, in contrast to physicists, need not be bound by considerations of the one and only "real" universe...so now a more general "physics" could be imagined, currently fictitious but perhaps useful in the future./9/
Decisions as to the axiomatic nature of human behavior, when made from a managerial standpoint, are highly teleological, and could prove extremely harmful to optimistic viewpoints such as that given above. Case states,
"Structured programming" (also known as "goto-less programming") was the logical place to start. This concept required that programmers structure their code in a particular manner to gain efficiency in the development, spread large programming projects controllably over several people, and reduce the effort required to maintain programs./10/
That such assumed "logical" methods are failing can perhaps be understood on the basis of the illogical conclusions drawn in the last parts of the statement.
Scientists in universities do not usually exist in a managerial system with respect to their work. The present trend, however, has seen the managerial term, to use the jargon of the organizational man, applied there more often than one would want. Academic freedom depends to a degree on a lack of administrative control, and FORTRAN suits the academic well. A recent article in Academe has pointed out that many American universities are adopting industrial managerial systems, with the expected degradation of academic quality that arises from such systems./11/ Many of the guidelines for these processes were established by these same universities, and a generation after industry adopted them, the universities are now adopting them in turn. That the processes are failing to fulfill their expectations in industry, and that many industries are beginning to question the validity of the process, seem not to deter the administrations./12/
Most campus computing centers were developed with the grant moneys of the science departments, and it wasn't until administrations began using those computers that managerial concerns began to shove the scientists aside. It is to be noted that one of the largest uses by administrations, and one of the most time-consuming, is the printing of mailing labels. In response, the scientists have, when possible, created micro-centers within their own departments. Industries and their associated research and development (R and D) institutions, however, have found the situation that exists for the university-based scientist intolerable. I propose that this has led to the development of languages, and the altering of FORTRAN, to reflect the managerial system as used in technological environments outside the universities. While one would like to believe that engineering and science follow the same processes in the two environments and would be more or less identical, that turns out not to be the case. It is understood that one difference lies in the fact that the university professor might not have a definite idea of where his or her work may lead, while the engineer has a preconceived idea of what his efforts are for; but other differences exist, and one might speculate on what their results may be.
A scientist of course must form a hypothesis of what he believes to be the case, and as the facts are gathered he alters the hypothesis, not the facts, until within statistical confidence levels he can argue his position. The engineer uses the facts available from several sources, not all from his own laboratory, seeking if he can the best pathway to his result. Then one must ask the question: is engineering "science" at all? We have a tendency in this country to regard anyone wearing a white coat as a scientist. But scientists don't necessarily consider gadgetry science, and there is the danger of losing empirical control to semi-mystical authority based on some hierarchical concept of what is axiomatic. Engineers work probably most of the time from collected data and hence become the managers of the data banks. Perhaps one of the reasons for the failure of the scientific method as applied to societal and historical problems is that either the hypotheses are not altered to meet the facts, or "facts" are invented to meet hypotheses, i.e., ultimate results, that are assumed to be axiomatic. The failure of Marx-Engels Communist economics certainly can be traced to this process./13/ One would think that technology was scientific, but even though machines may operate on natural principles, all other aspects of their development may have little scientific basis. Manufacturers argue sometimes that they give only what is desired, and sometimes that what they give is a natural end product. The difficulty in engineering education, as has long been faced by engineering colleges, is that engineering becomes unscientific when it assumes that a particular end product is the obvious result of some "natural" way, and engineers obviously don't understand this. It is simply the old question of why the chicken crosses the road, and the answer may not be to reach the other side. The technological world defines the results and the processes based on non-empirical data. Engineers used to control the processes of manufacture, and whatever deficiencies existed did so due to the problems of engineering education; but with the coming of the organizational man, the control of industry, R and D, and other related processes has passed to the professional manager, which means that the problems of both realms are now present. President Reagan boasted that the U.S. had become a nation of managers, and went so far as to imply that we would manage the affairs of the world. There are signs that all is not well with this concept.
There appear to be two problems faced in this issue. The first is the problem of hierarchy, and the second the basic nature of the computer. Putting the two problems together then makes for additional complexity. The problem of hierarchy is the one primarily addressed in this article. Berdyaev has considered the problem from an ethical standpoint, but points out,
The object of reverence in that case is not man, not human gifts and qualities, but the bearer of authority, of the impersonal hierarchical principle.... This kind of hierarchism is not human at all./14/
Incompetence then is perfectly allowable in this kind of hierarchism. The position within the hierarchy is more important than the quality of the entity within the system. The second problem has been addressed by Plank in a paper presented to the Modern Language Association. He states,
The fact that garbage-in-garbage-out introduces necessity between the input and output, the fact that binary machine language becomes a kind of perfect encodement and universal logic, means that a unique event has taken place in the signifier chain: there are two signifiers whose relation is no longer arbitrary..../15/
The key word "necessity" here should make users of computers realize that the computer is not operating as we would think of the "scientific" brain as operating. As Plank further points out, "The computer becomes a mirror of the logos...." In short, I propose that we are using the computer as a teleologically operated device to propagate a de-humanizing process under a false use of the concept of empirical science.
The quality of the American product and the productivity of American industry began to slide some time ago. This slide was recognized by an industry faced with the post-war world of international banking and commerce and with competition from previous allies and foes alike. The answer to the problem was then, and is now, sought in the technology of computers./16/ The computer, as a device for automation and the subsequent removal of the worker from the direct production process, is still looked to as the answer. It is interesting to note that in a daily episode of Buck Rogers during the 1930s, Buck made some comment (remember, he had slept for 500 years beginning with the First World War) about the lack of automation when the means for it were widely available. It appears that the reply indicated that such a concept was counter to what those advanced people had discovered, i.e., that people need to work in order to maintain their inherent strength of character. That seemed then a good idea, as one must also remember that the basic nature of the Buck Rogers strip was to inspire young people, and inspire them it did, to explore their universe and try to reach heights where none had gone before, employing the seemingly limitless possibilities of the human mind. That we have reversed this idea and imposed automation and managerial stagnation on the same young people is strange, given that we are faced with continual failure in our system. Some industries are attempting to modify the managerial system to include more input and control from people at various levels, but it will be as difficult to abandon the managerial system as it will be to undo the centralized bureaucracy of the federal government. Power gained is not easily given up. With respect to the present discussion it should be noted that in the process of developing the software of industry and its R and D facilities, the managerial system was applied. In the recently published book referred to above, Case specifically establishes a one-to-one correspondence between the hierarchy of a modern banking facility and the management system to be used in developing software. That all is not well in the software industry hardly needs to be stated, and many companies are beginning to question the processes. It remains to be seen here also whether the desire for existence, and certainly for profit, can overcome the damages of the managerial system. When one has so many people in positions of authority, most of them making the wrong decisions, it is difficult to see the remedy. When the situation arose in ancient Greece, a tyrant would arise, lop off heads, set things right, and retire. The problem was solved temporarily, until the next batch of incompetent power seekers was elected.
The managerial system of non-university research and development centers brings forth many more differences than those given above. The almost total adherence to the outside peer-reviewed structure of academic journal publication, versus the accepted in-house technical report of the R and D environment, is one. A more serious problem might lie with the "data management" concept of the latter versus the more stringent statistical quest for "confidence levels" of the former./17/ That data can be managed is not a popular concept among university professors. Serious embarrassments in history, both recent and past, have occurred in cases where scientists have done just that. There are degrees of data management, however, and the degree to which data are selected and used in developing a hypothesis and rational or empirical laws varies from serious to not-so-serious offenses. In many cases engineers are interested only in ad-hoc empirical relations such that they can attempt at least a trial-and-error development. It is when one encounters such events as the Challenger disaster, the breakdown of nuclear reactors, or the non-functioning of military weapons that one might begin to question the extent of data management procedures, and the possibility that they may have led to such serious failures.
PASCAL is a language that is quite popular among serious programmers in the data management system. It isn't clear to me that Pascal's name is an appropriate one to apply to a process that embodies the concept of the organizational man, a concept he would not have approved of. He makes quite clear what he thinks of the modern quests of technology when he states, also from the Pensées,
Hence, it comes that play and the society of women, war, and high posts, are so sought after. Not that there is in fact any happiness in them, or that men imagine true bliss to consist in money won at play, or in the hare which they hunt; we would not take these as a gift.
Perhaps we can forgive the pious philosopher his statement about the "society of women," but it is clear that the organization man is certainly not the type he would seek out. PASCAL presents many drawbacks for the programmer used to the FORTRAN style. It has so many reserved words that one is forced to develop long, literal variable names. This is highly recommended in the R and D environment, since it assures transportability of variable definitions. Where a professor might use "y" for efficiency, PASCAL would require "Barrels of Oil Daily," or something of that nature. Managers obviously want to know what the people they are managing are writing. There are vast glossaries of pre-written functions available, which of course is an advantage for systems where uniformity is required, and even a FORTRAN programmer can see the utility of the idea. FORTRAN is an easy language to use, with basically few restrictions on an individual's imagination in developing paradigms. It does wondrous things. PASCAL can also do wondrous things, but only at the expense of learning a complicated language and surrendering many freedoms. Obviously one can be inventive and even secretive in PASCAL if one wishes, ignoring the highly hierarchical internal structure, but R and D organizations have means to discourage this. The so-called "quality assurance" programs guarantee it. A large number of the personnel of the organizations are actually big brothers. Quality assurance is the doublespeak term for the watchdog process. Systems development people speak of "control" in the same sense as quality assurance. The actual meaning has very little to do with quality, but rather with control over deviations from managerial decisions. A manager at the higher levels must have documented in detail each step of a process. The easiest way to circumvent this is of course not to do anything. It is a disturbing feature of the environment, especially for those at the lower levels. Not only is the language managerial, its function is to channel everything upward from level to level in the hierarchy until, at the top, the final manager can, by pressing buttons, extract the information available from below at any level, and likewise at each step downwards. Whether there is any relationship between this and the process of channeling the capital of the nation from the lower levels to the top levels of society, leaving the lower levels with less and less capital or even the means to acquire capital, I could not say; but in the spirit of this inquiry I would not hesitate to speculate on it. Any language's basic format can be overridden, of course. The point here is not that PASCAL is absolute in its application to the managerial system, but that it is basically of a homologous form.
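A contrived fragment (the names are invented for illustration) shows the flavor of the style being enforced: where the professor's FORTRAN might carry a whole calculation in a variable called Y, the managerial idiom demands identifiers that document themselves to any auditor.

    program Throughput;
    var
      BarrelsOfOilDaily: real;   { the managerial idiom: the name documents itself }
      RatedCapacity: real;
      Efficiency: real;          { where a professor's FORTRAN might simply say Y }
    begin
      BarrelsOfOilDaily := 12500.0;
      RatedCapacity := 15000.0;
      Efficiency := BarrelsOfOilDaily / RatedCapacity;
      writeln('Efficiency = ', Efficiency:6:3)
    end.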
Within a computer's hardware, firmware, and software core, of course, decisions have to be made. Authority seems to be necessary in many activities, but it need not imply power or some rigid hierarchical system. The attempt is made, however, to identify authority and decision-making processes with a hierarchy of power. Writing a symphony by committee may not be a good idea either, but simply because a taskmaster within a computer needs to allocate tasks does not imply an axiomatic use of the managerial hierarchical system as is found in a large number of software development textbooks.
The problems that might arise, and certainly do, include the overabundance of managers in the system and the alienation of the lower levels from the decision-making process. There is certainly the problem that salary levels increase upwards, as does the separation of the top managers from the data acquisition process. The incredible process of getting contracts on multi-million and multi-billion dollar government projects has been well documented. Entire convoys of semi rigs haul the tons of paperwork into the bureaucratic centers, where form, artwork, and sheer bulk play the prominent role in selection. These bureaucratic agencies likewise are based on the managerial system, and the top managers are located in the centralized Washington, D.C. area, far from any regional contact with the processes being treated. This is vastly more true now than it was a dozen years ago, before the process became a way of life in the U.S. Most of the technical people working in this environment, either in the bureaucracies or the R and D centers, are involved not in the data-gathering process (this has become basically a blue-collar job using the automation processes of the computer itself), but in either the management of the data or the management of those managing the data. Thus the management of data, whatever that may be, becomes more important than its acquisition. Few of the technicians collecting new data, if that goes on at all, are trained to even a fraction of the level of their counterparts in the universities. Whenever a non-standard situation arises, then, there is generally a lack of the training needed to make an educated judgment on what to do. They must feel as Krazy Kat did when s/he said, "What to do, How to do it, and Why?"
One of the features of PASCAL, as different from FORTRAN as anything, is the absoluteness of the declarations that must be made before programming begins. This implies, of course, for the most part a data base, as opposed to data to be empirically collected. Associations are assumed, and all connectives are predetermined. These form what the programmers call "records." One then "points" to parts of the records, and these are then "managed." In non-scientific processes this might serve some value, such as finding, from a large number of known persons, individuals having certain characteristics, although one still has to recognize that certain a priori decisions have been made before the pointing process begins. This is the drawback to all such usages: pre-judgments are made. One cannot very easily approach data banks from an unprejudiced standpoint. One of the characteristics of the development of mathematics in the modern sense was the striving to eliminate "gut feelings" from mathematical systems, and certainly this approach has greatly expanded mathematics and its application in physics. Playing with numbers may give insights, and we all like to do it, but the line must be drawn between unproven intuitions, which might be false, and hard, empirically obtained facts. When one approaches the development of a program from the standpoint of data management rather than from a rational-empirical stance, great care must be taken with respect to the outcome. A data base assumes, from the computer's viewpoint, which becomes the programmer's also, that a universe of knowledge is available. This is certainly not true of a rational-empirical process, and therefore the results have to be, a priori, prejudiced.
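A minimal sketch of the record-and-pointer machinery being described (the field names are hypothetical) makes the point; note how much has been decided before a single datum arrives:

    program DataBank;
    type
      CompoundPtr = ^CompoundRec;
      CompoundRec = record
        FormulaWeight: real;    { every field fixed before any data exist }
        MeltingPoint: real;
        Next: CompoundPtr       { the predetermined linkage between records }
      end;
    var
      Head, P: CompoundPtr;
    begin
      Head := nil;
      new(P);                       { allocate a record of the declared shape... }
      P^.FormulaWeight := 180.16;   { ...and fill in the predeclared slots }
      P^.MeltingPoint := 146.0;
      P^.Next := Head;
      Head := P;
      writeln('first record: formula weight = ', Head^.FormulaWeight:8:2)
    end.

Every field, type, and linkage was fixed at the type declaration; the data can only ever confirm the shape that was declared for them.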
Let us assume that we now have a data base containing the results of all the experiments that have been run, perhaps even limited to some particular area. The molecular structural scientists referred to at the beginning of this paper have attempted to construct a data base of many thousands of structures. Let us assume that this information is available through a PASCAL program. We have a variety of pointers defined, and we can manage the data as we wish. We are seated at the terminal and are ready to begin. Plank, in a private communication, has presented the dilemma as: "What do we ask? What is it we wish to know?" How are we going to treat the data we receive? We have enormous quantities of predetermined values for all kinds of properties. What are we going to ask? Kowalski states,
Database queries are regarded as problem specifications, which are formulated without any concern for efficiency. It is the responsibility of the database management system to optimize the query and to convert it into a program which accesses the database./18/
With the above in mind, we see that the scientist is in a dizzying position. He now has power, as Plank rightly states. Before, he knew his question only vaguely, as a preliminary hypothesis, and would then seek out a method by which he could go into the laboratory and gather data. He would refine his hypothesis as the data were gathered, and eventually his rational and empirical processes would meet. He would have found how a small part of his universe operated. Now he need only find a question, but he isn't sure what to ask. In the back of his mind, going back to the process just described, he knows that if he had the proper question, he would have had the proper hypothesis to begin with; but he wonders if he has also lost control.
Let us look at a fairly recent example of data management and the question asked in that case. Let us assume that the situation is as described above for a data information system. The questioner asks if marijuana smoking leads to heroin addiction. He uses the pointer process involving heroin.marijuana.user, and he finds, as did those who actually did this, that there is a one-to-one correspondence, and his question is answered: yes, it does. This was worth a great deal of publicity. This is a rather trivial example of the misuse of a data base system, but if it happens in a trivial example, why not in not-so-trivial cases? There is a subtle point here. To use a data base system, questions must be asked, and the formation of the question may very likely predetermine the result. We are no longer following a natural process under the processes of science; we are interacting with data and possibly forcing the results by the very nature of the question we ask. We might go even further and inquire, as did Ulam, whether any empirical means we use might affect our results. If we use an optical device to observe nature, do we then describe nature as existing in waves? If we spend five billion dollars to build a particle explorer based on particle concepts, do we automatically find the next sublevel of particles we are looking for? One certainly has to wonder if there is a relation between Ptolemy's geocentric epicycles and quarks. Is rational science inherently flawed? If we can't answer this question, then we certainly want to be wary of the axiomatic statements of the purveyors of hierarchism.
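A toy sketch, with invented data and hypothetical field names, shows how the framing of the question, and not the data, produces the headline: the program faithfully counts what it was asked to count, and nothing more.

    program LoadedQuestion;
    type
      UserPtr = ^UserRec;
      UserRec = record
        Marijuana, Heroin: boolean;
        Next: UserPtr
      end;
    var
      Head, P: UserPtr;
      HeroinHits, MarijuanaOnly: integer;

    procedure Add(M, H: boolean);
    var
      N: UserPtr;
    begin
      new(N);
      N^.Marijuana := M;
      N^.Heroin := H;
      N^.Next := Head;
      Head := N
    end;

    begin
      Head := nil;
      Add(true, true);    { the one heroin user, who also smoked marijuana... }
      Add(true, false);   { ...and three marijuana smokers who never touched heroin }
      Add(true, false);
      Add(true, false);
      HeroinHits := 0;
      MarijuanaOnly := 0;
      P := Head;
      while P <> nil do
      begin
        if P^.Heroin and P^.Marijuana then
          HeroinHits := HeroinHits + 1;
        if P^.Marijuana and not P^.Heroin then
          MarijuanaOnly := MarijuanaOnly + 1;
        P := P^.Next
      end;
      writeln('heroin users who smoked marijuana: ', HeroinHits);
      writeln('marijuana users who never did:     ', MarijuanaOnly)
    end.

The second count is the one the original questioners never asked for; the program answered exactly, and only, the question as framed.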
To return to the basic question, then, is to ask whether PASCAL as a language is independent of the managerial system, or whether it reflects the manner in which the running of the nation's non-academic resources is now so well understood to occur. Certainly one cannot argue for efficiency. If that were the case, then programming would go in the other direction, i.e., toward a more machine-language orientation. In the older days some scientists even argued that point, especially before increased speeds and larger storage-retrieval systems developed. The cost of PASCAL in storage and time is enormous, but then the managerial systems are highly inefficient also. Does PASCAL reflect some basic principle of computing? Is there some known or unknown factor in the mysticism of cybernetics that has caused the natural development of PASCAL? Or has someone made a Marx-Engels decision that PASCAL is axiomatic? Or perhaps might we ask if culture decides language, and computer languages follow the same patterns as any other language, or vice-versa? Or is it really simpler than all that, and the language really is an attempt, consciously so, to make computers into super-managers, based on the same reasoning that managing is more important than the object being managed? To continue Pascal's last quote given above, we see that
they do not know that it is the chase, and not the quarry, which they seek.... The last act is tragic, however happy all the rest of the play is....
The answer to the question about the basic nature of the computer can be approached. As I mentioned above, there has been very little change in the structure. The machine is operated by forming words from binary bits that act on numbers stored by address. The process is the same as when I flicked the toggles by hand; only now the toggles are switched a mind-boggling number of times each second. The process uses either Boolean algebra or some symbolic logical process that can utilize the binary nature of the switching. For the most part the process is one of simple addition. To add two sticks to three sticks is a simple process of laying two sticks in a row alongside three sticks and observing that one has five sticks. The computer basically does just that, shifting bits around from place to place.

The concept of the use of the machines has changed since Von Neumann's time. At first the idea of a calculating machine prevailed, and there are still computers that are considered just that, in fact some of the biggest. Some computers are built entirely around the logical process. These we know less about, as it appears that these "game" type computers are deeply involved in military planning, a scary thought. There are other specialized computers, but these are generally made at high cost and are not the mainstream computers used in the R and D environment. The problem of the computer lies not in its hardware structure, since it is basically a logical device used to locate items and simply move them about in various cells; the arithmetic processes and logical steps are used in simple algorithms of these movements. To build a computer with complex processes suiting all needs is simply not economically feasible; hence most of "the computer" consists of software. The computer itself is generally of little utility and its cost is minimal. In order to use it, then, large sums must be spent on memory devices, input/output devices, other parts of the "system," and most of all the software: the operating systems and the various translators used. The computer actually becomes less complex as one approaches the basic CPU. Consequently the problems are those of language.

The software developers use supposedly "formal" languages, which are supposed to avoid the issues of semantics found in "non-formal" languages, but one feels that whatever problems are inherent in the latter may also be lurking in the former. At the beginning of the last decade there was a feeling among the programming specialists who were developing the operating systems that the software itself was inherently hierarchical. One could detect almost a plea that the processes be cast in such structured form. Relations between data structures and algorithms were drawn, and it was a successful campaign. The industry went full ahead with the concept; library shelves are full of texts on the subject, and the universities provided the personnel for its implementation. The writing of operating systems was to be done managerially, and the programs were to be written managerially. I believe there may be a confusion between authority and management in the minds of those caught up in what is turning out to be an industrial disaster. Authority comes from having to make a decision, but decisions can be made on some basis other than the arbitrariness of a managerial process.
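To make the bit-level picture above concrete: the standard Boolean relations for adding two binary digits $a_i$ and $b_i$ with an incoming carry $c_i$ (the "full adder" that any such machine implements in some form) are

    $s_i = a_i \oplus b_i \oplus c_i, \qquad c_{i+1} = (a_i \wedge b_i) \vee \bigl(c_i \wedge (a_i \oplus b_i)\bigr).$

Repeated across a word of bits, this is the laying of sticks alongside sticks; everything more elaborate is built up from such movements.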
The process must deal somewhere, of course, with data. In today's computers few programmers know much about where the data actually reside. Machines have been made such that memory becomes simply virtual rather than real. As part of the process described above, arguments were brought out to cast the operating system itself as a managerial process. So a user might be using a mixed programming technique, a quite common practice, in which the software that handles his data manipulation, hidden from him, might be managerial, while he is also using a higher-level language that itself requires such a structure. Then one begins to wonder whether he is in control of what he is doing, or whether some structure has been forced on him. To quote Kowalski again,
Since symbolic logic blurs the distinction between databases and programs, it is no surprise that logic, often in disguised form, is increasingly favored as a database query language./19/
He was not referring to PASCAL, but the point remains one of linguistics: are we masters of our language, or does the language dictate what we do? We have in our computer large numbers of things in which we wish, perhaps, to find some "truth." A physical chemist like myself might inquire after some sort of statistical basis for the behavior of such a large number of entities, each obeying a simple function yet affecting the system's behavior. The difference is that in classical theory, at least, the interacting entities do so randomly, without any addressing of each other, while in the computer each binary switch is addressed and consciously placed in a given state. In a quantum system we may impose constraints, but the particles are indistinguishable and unaddressable.
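In the simplest counting, ignoring level degeneracies, the contrast is stark: $N$ addressable two-state switches present $2^N$ distinguishable configurations, each of which can be set individually, whereas $N$ indistinguishable particles shared between two states are characterized by occupation numbers alone,

    $\Omega_{\text{switches}} = 2^N, \qquad \Omega_{\text{indistinguishable}} = N + 1.$

The machine's ensemble, unlike nature's, is one in which every member has a name and an address.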
The empiricists (empiricism here as a religion, that is, not as the scientist's working method) have long claimed that by technology, the application of reason to life, we will find utopia. A recent writer, touted as the "greatest philosopher of our age" on the dust covers of his books, stated, "I believe that cybernetics and evolutionary biology will eventually solve all of man's problems," or something to that effect. He certainly said "believe" in the sense of "belief." It seems we can do better than this at this stage of history. Our friend Pascal said,
Man is only a subject full of error, natural and ineffaceable without grace. Nothing shows him the truth. Everything deceives him. The two sources of truth, reason and the senses...deceive each other in turn. The senses mislead the reason with false appearances, and receive from reason in their turn the same trickery which they apply to her: reason has her revenge.
Modern man owes much to the past, but we appear to cling to some ideas that have long since lost their validity, and to ignore much that retains it. A balanced perception of the place of technology, especially with respect to the computer, is needed, and the man Pascal still has much to say to us today. It isn't clear that the PASCAL of the computing world does.
Notes
1. Ulam (see below) and Von Neumann stand out among this group. Von Neumann died in 1957, but see his incomplete volume The Computer and the Brain (New Haven: Yale UP, 1959).
2. An excellent text without direct use of computer languages is James T. Culbertson's Mathematics and Logic for Digital Devices (New Jersey: Van Nostrand, 1958).
3. Stanislaw Ulam wrote heavily in a popular vein as well as technically. A rather interesting book is his Science, Computers, and People (Boston: Birkhauser, 1986).
4. An extension of the material in note 2 above as directly applied to computers can be found in the readable book by Robert McNaughton, Elementary Computability, Formal Languages, and Automata (New Jersey: Prentice Hall, 1982).
5. One of the advocates of the structured systems approaches that have led to many disappointments is Albert F. Case, Jr. His book Information Systems Development: Principles of Computer-Aided Software Engineering is a model of the one-to-one relationship made with respect to the computer and management systems. See page 113 for his "goto-less" programming.
6. See McNaughton (note 4), page 57.
7. Several editions of this work exist. See W.F. Trotter's translation in the Everyman Library (E.P. Dutton).
8. David Bohm, Causality and Chance in Modern Physics (U of Pennsylvania Press, 1957); see also the Modern Library edition.
9. See Ulam (note 3), chapter 2.
10. See Case (note 5).
11. Laurie J. Bilik and Mark C. Blum, "Deja Vu All Over Again: Initiatives in Academic Management," Academe (Jan.-Feb. 1989): 10.
12. An interesting editorial appeared last year in Research and Development (June 1988: 11) entitled "Technology Has Too Many Managers and Bean Counters." This editorial lists a number of "solutions" without coming to grips with the failure of the system of bureaucracy.
13. An interesting viewpoint and discussion of this philosophy can be found in Matthew Spinka's volume Christian Thought from Erasmus to Berdyaev (New Jersey: Prentice Hall, 1962). The whole problem of teleology can be found in this survey treatment of modern philosophy.
14. Nicolas Berdyaev attempted to study the problems of man from a teleology-free viewpoint. See the chapter "Man" in his The Destiny of Man (Charles Scribner's, 1937; Hyperion, 1954; see also Harper Torchbooks).
15. William G. Plank, "Computer Assisted Research in Literature and Semiotics: Logos in, Logos Out," paper presented at the Modern Language Association, 1985, page 3. [Rpt. in this issue of The Montana Professor.]
16. Academic Press publishes the A.P.I.C. Studies in Data Processing series. All of the series can be read with limited backgrounds. R.W. Doran, in Computer Architecture: A Structured Approach (Vol. 15, 1979), becomes almost pleading in his tone to sell the approach. On page one he states, "Whenever the word 'computer' appears...replace it by the term 'Data Processor.'"
17. The process is outlined in the Handbook of Software Quality Assurance by G. Gordon Schulmeyer and James I. McManus (Van Nostrand Reinhold Company, 1987). "QA" as it is known in industry is used in almost all processes of the management system, not just in software.
18. Volume 16 of the above-cited series [note 16], edited by Clark and Tarnlund, Logic Programming, introductory chapter by R.A. Kowalski, "Logic as a Computer Language," page 6.
19. Ibid.