Devin Carraway

AMCS 350, Term Paper



Faun Gray

Of Ethnicity and Gender in the Computer Field



The field of computer science is currently one of very rapid growth and change. While computer science is not in any sense a new field, seldom in history has a single technological or scientific evolution produced changes so rapid or so dramatic, both to itself and to economic and social conditions. This rapid growth has swept many concurrent processes along in its wake, while accruing potentially serious social deficiencies which may prove damaging in the long term, or require substantial remedial effort to correct. At the same time, a period of rapid growth is an opportunity to correct previous problems, when conservative inertia may be unseated by the need for rapid adaptation.



For most of the history of the personal computer - the period beginning when computers became one of several electronic gadgets accessible to consumers in price and physical size - computers have been available in a multitude of subtle variations on a single color. The official designation for this color, now without clear attribution, is "Faun Gray." It is a benign shade of faintly beige-tinted light gray, instantly forgettable. Until very recently the color was almost wholly ubiquitous, and had been so since the first mass-marketed computers of the 1960s. Unlike most consumer electronic devices, computers had until recently never been sold adorned with artificial wood-grain panelling, chrome plating, or sepia tones, nor in any shape other than the most basic one or two boxes with wires between them. Not until an unexpectedly fortuitous last-ditch marketing campaign by Apple Computer in 1998(1) did consumer-market computers even achieve the level of color variety achieved by prophylactic contraception many years previously.



The motivation for this monotony in computer cosmetics, inasmuch as there can be said to have been one, is most probably a matter of minor marketing technique. As with popular media, clothing fashion and Muzak, marketers rapidly settled on a color most broadly acceptable and non-offensive to the largest possible number of people (at about the same time, consumer electronics departed from brushed aluminum and came to be sold overwhelmingly in black). Little of this monotony may be attributed to anything beyond inertia and prioritization - consumers had never exhibited substantial preference for the coloration of a device the principal purpose of which had nothing whatsoever to do with its appearance.



By coincidence, the introduction of colorful computers corresponded with a rapid escalation in media attention - and by consequence attention on the part of the United States government - to the racial and gender disparities attending the computer industry. Not since the Manhattan Project had there existed a more "lily-white" (and overwhelmingly male) professional and academic work force. Indeed, the computer field and the Manhattan Project shared common origins in military sponsorship, and both were subject to more or less strident precautions of wartime secrecy. Ironically, at the time the term "computer" in fact referred to a woman: specifically, any one of thousands of women employed by the US Army, equipped with a mechanical adding machine and tasked with the assembly-line calculation of parabolic artillery-shell trajectories. These calculations, even with the assistance of crude analog differential-equation calculating machines, could require as much as a day for one "computer" to produce - the goal of the ENIAC, arguably the first digital computer, was to assume this task. ENIAC, when completed (almost three years after VJ-day and two hundred percent over budget), could calculate a shell trajectory in sixteen seconds. With ENIAC's completion came the virtual end of female participation in the computer industry for some fifteen years.



Outside of military history, it is worth considering the origins of computer science. Most of the credit for the designs of early computers is attributed to mathematicians; indeed the principles by which computers work were established in mathematics long before they were ever etched in silicon. Much of the early development of computers (including ENIAC and the artillery trajectories) was performed with mathematics in mind. Mathematics, as an academic field, has still not achieved gender parity, though its ethnic diversity has been aided substantially by the broad applicability of mathematics throughout the world, and by the many contributions made to it throughout history by different ethnic groups. Perhaps the most pronounced example was the development of functional algebra by Arabic mathematicians during a period when western mathematics was crippled by the domination of the Roman empire.(2) Functional mathematics, which describes a system of association between a set of inputs and a set of outputs, where the selection of the former defines the function's behavior with respect to the latter, is now an essential component of most higher math, and is equally essential in computer programming, where the concept has remained largely unchanged.
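The continuity between the mathematical and the programming notions of a function can be sketched in a few lines. The following is a generic illustration in Python, not drawn from any source cited in this paper: a function such as f(x) = x² + 1 associates each input with exactly one output, and a program expresses the same association directly.

```python
# A function in mathematics associates each member of a set of inputs
# with exactly one output; a function in a program does the same.

def f(x):
    """f(x) = x**2 + 1: the choice of input determines the output."""
    return x ** 2 + 1

# Applying the association to a set of inputs yields the set of outputs.
inputs = [0, 1, 2, 3]
outputs = [f(x) for x in inputs]
print(outputs)  # [1, 2, 5, 10]
```

The programming form adds nothing conceptually; it merely makes the input-output association executable, which is precisely the sense in which the concept has carried over from mathematics unchanged.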



The single most widely recognized woman in computer science, the late Rear Admiral Grace Murray Hopper, began her career in mathematics, earning her MA in 1930 and her doctorate in 1934 at Yale University, a not insignificant accomplishment in itself. Hopper is credited with many major advancements in the field of computer science, most notably the introduction of higher-level programming languages and compilers, two contributions now inextricable from the computing field. Hopper, beyond her skills in computing, was simultaneously skilled at the marketing of computers, and foresaw nonscientific uses for them years before the fact. She received the first "Man of the Year" Award from the Data Processing Management Association in 1969, and was the first woman named a Distinguished Fellow of the British Computer Society, in 1973 (CWC 1994).



Grace Hopper, sadly, was an exception in her field. Today female enrollment in computer science programs is drastically disproportionate to female enrollment in American colleges generally, and is rising only very slowly. Women in computer science are much more likely to be pursuing their studies in service of career goals, and more female than male computer science students are doing so as a secondary career choice. After aggressive recruitment efforts, Carnegie Mellon University announced an anticipated female enrollment for Fall 1999 of 37% - up from 8% in 1995 (SL Post-Dispatch 1999). Ethnic enrollment in collegiate computer science programs is subject to the issues attending ethnic diversity in higher education more generally, which apply as strongly in computer science as in any field.



To see the origins of today's computer scientists, it is especially useful to consider their backgrounds. While I have previously discussed the mathematical foundations of computer science, mathematics is only one of the two pillars of computing. The other, and numerically the more significant, is engineering. Indeed, computing has spawned its own selection of engineering titles, as well as drawing on electrical and mechanical engineering. Computers are foremost agglomerations of electronic circuits, but they embody a complexity almost unprecedented in human engineering history. NASA's Space Shuttle is generally regarded as the most complex machine ever built - but largely because it has computers aboard. Much of that complexity is the result of evolved engineering, and scarcely anyone attempts to understand a computer in its entirety. The arcane nature of computing, however, places it in a position shared with other technically specialized fields: it is an area of particular interest for those inclined towards experimentation and tinkering.



The computer industry, and to a lesser extent academic computer science, is overwhelmed by one specific class of participant: the young, intelligent white male with background interests in science and mathematics. Most of these young men began experimenting with computers at a young age, often in primary school and often (initially) encouraged by their parents. Early experimentation with computers is highly prevalent among computer science students, and these early explorations provide a dramatic advantage over students who lack them. Not only does early experience provide useful background, but these first explorations frequently begin a process of personal development which "culminates" at the ripe old age (in the computer industry's system of values) of roughly 21 through 25. As of this writing, the computer industry represents the single most prominently developing segment of the American economy, and its demand for workers - among whom these young men are perceived as the most valuable - is insatiable.



Whether emerging fields have always attracted a disproportionate representation of young men of a dominant culture is an issue upon which I am unqualified to speculate. More germane to our discussion, however, is the effect of this particular technical elite upon the diversity, resiliency and direction of the field.



The growth of the computer industry over the past 15-20 years has been explosive - though a more apt term, if one discounts the pejorative, might be cancerous. Rapid progress, at least in the United States over the last century, has often come as a product of intense specialization by a small number of scientists or personages. The computer industry most strongly attracts, and most richly rewards, those who have specialized in learning computer science to the deprecation, even exclusion, of all other areas of personal interest. The analogy to cancer proves apt in one respect: a cancerous cell is one which, due to corruption of its genetic code, reproduces at the maximum possible rate, to the exclusion of all other normal cell operations. Such cells accomplish reproduction beyond any normal facility, but are largely incapable of any other function. Correspondingly, the extreme focus demanded of computer industry professionals, most particularly those in engineering positions, leaves little time or energy for those other pursuits which make no difference to technical productivity, but may make a substantial difference in personal breadth. David James Duncan wrote of the hazards of specialization by analogy to subterranean mining: "The more deeply one delves, the richer the rewards, and the more remunerative the gems or ore. But so too does the tunnel grow narrower, and the dangers increase of never finding one's way back to the surface."(3) (Duncan 1996; paraphrased)



I have cast the members of the computer industry in a mathematical and academic light thus far. I should clarify, however, that this is only one of the traditional points of entry to the computer field, and numerically one of the less significant. While a college education has historically been the first major prerequisite for admission into a technical field, the computer science field has to a large extent demoted education where it presented practical difficulties. Firstly, academic computer science has seldom kept pace with commercially and government-sponsored research in the computer field. Unlike most sciences, the computer field currently grows too quickly for academics to adapt. For this reason academia is held in a modest level of contempt by large portions of the industry, which values rapid evolution above all things. Academic research does still spur new developments in the computer field, but this research is done by the very few students who delay the immediate rewards of industry to seek post-baccalaureate study. For most graduates, the immediate rewards of departing academia for the industry hold tremendous temptation - entry-level positions for graduates averaged $48,000/year as of 1998 (USENIX 1999), and promise rapid escalation, generally within about four years, to the nation's uppermost economic quintile (ibid.), if a graduate's career is structured with income as a goal.



Secondly, while a university degree does enhance a computer industry worker's initial job placement, its on-the-job benefits tend to extend little further. In the perception of the industry, baccalaureate (and hence non-research-oriented) education does little more than formally codify knowledge which has both limited applicability and a potential to limit one's flexibility on the job. On the former point, it should be understood that computer science, though an evolving field, has achieved the status of a component-reuse practice, in which new works may be built from preexisting hardware and software components, eliminating the necessity of re-engineering those components. This is a necessary advantage in an industry which often operates under extremes of time pressure and thin (or nonexistent) profit margins. On the second, there exists a level of criticism by the industry of the formality and rigor of higher education, which is often likened to the plodding, formal pace of the industry's own tradition-bound "dinosaurs," colloquially including IBM and Hewlett-Packard. Indeed, the computer industry is generally willing to forgo higher-education requirements completely in favor of "equivalent experience." Financial evidence amplifies this disparity: the four to six years required for a baccalaureate degree in computer science, if spent working in the industry, will yield higher salaries than a degree, and as stated earlier, experience then outweighs the possession of a degree in subsequent job placement and rates of pay.



At the same time, the persistent demand for rapid development has suppressed nearly every other consideration. Computer companies seeking qualified employees - as virtually every company not in imminent danger of collapse does, continuously and with great effort - seek the field's specialists in preference to more rounded applicants with broader skills or education. This has led, in some respects, to an "immature" work force, whose youth was spent tinkering with circuits before an immediate transition to highly-paid 60-hour weeks without pause. Many computer industry workers, especially those who forgo or abandon formal education for the hands-on application of their skills to be found in industry, arrive with overall social deficits. The typical geek (the generally accepted non-pejorative label preferred by computer professionals, especially the young) does more or less fit the public stereotype in social terms, if not in degree. Many, if not most, computer professionals are socially undeveloped, with resultant problems that would be seen as severe, even career-threatening, in any other industry. In a sense they learn social skills "on the job," which may carry penalties for their colleagues. Being young, and drawn disproportionately from both U.S. coasts, few come to the field with overtly sexist or racist viewpoints. However, being young and inexperienced in non-technical matters, for most this is not a considered viewpoint; correspondingly, they tend to echo unconsciously the attitudes of their backgrounds while appearing overtly blind to race or gender.





These qualities of the environment for, and demands upon, computer industry workers have had a homogenizing effect on the entry of new members into the field. A CMU study on the entry of female computer science undergraduates highlighted the inhospitability of the field to women. Regarding perceptions of the requirements to study computer science, the study found entrants "[need] to be 'super smart,' to experiencing work overload, to liking to 'sit in front of the computer all day' and 'talking about nothing but computer science'" (Margolis, Fisher 1997). The study surmised that this environment, combined with recurrent examples of the field's "boy wonder[s]," creates an inhospitable environment for female entrants. A great deal of inertia is likely involved here as well - in both industry and academia, there is a dramatic under-representation of every demographic group other than European-descended white males. The same study, after interviewing CMU's female undergraduates, found a much stronger interest than among male undergraduates in the specific applications of computer science "to a larger agenda, i.e. what can computer do in the world for the betterment of people" (ibid.). The present climate of growth in the computer industry, however, suggests that these goals will not soon be widely appreciated. Much of the industry, at present, is engaged in servicing itself, with products and services designed by and for computer professionals, not with an eye towards broader applications outside Silicon Valley.



Working conditions in the computer industry, most particularly in its areas of geographical concentration (of which Silicon Valley is the canonical example), may also be inhospitable to those with stronger family and cultural ties. Silicon Valley exhibits one of the highest average ages at time of marriage, low birth and marriage rates, and one of the highest divorce rates of any industrially ascendant region. All these aspects are fueled directly by the overwhelming drive for maximum productivity in the industry, which often entails long hours and a high-stress workplace, whose detriments to family survival are widely reported, if anecdotally. This environment may be particularly repellent to those persons with a stronger attachment to family and culture.



There exists as yet very little usable evidence concerning racial and ethnic issues in the computer field itself. Most recent attention of both media and government has concerned the so-called "Digital Divide," usually summarized as the disparity between the technological haves and have-nots. There does appear to be a disparity in computer ownership and Internet access among minorities, not all of which falls along socioeconomic lines. A Vanderbilt University study found demonstrable distinctions between Caucasian and African-American families: whites were more likely to own personal computers (44.2% vs. 29%), and whites who desired Internet access were more likely already to have obtained it (16.7% of whites vs. 27.2% of African-Americans indicated a desire to acquire net access) (Novak, Hoffman 1998). That same study found a direct correlation between education and access to the Internet via the workplace, but insufficient correlation between education (and thus median income) and home computer ownership to explain the disparity in the latter. While access to a computer and to the Internet does bear on the potential for participation in the computer industry, most studies in this area have been concerned with the potential for stratification of society along informational lines, or even the development of an "infocracy."



The most prominent issue of race in the computer industry at present is that of immigration. The demands of industry have so far exceeded the supply of qualified workers that there is now substantial demand for immigrant labor. Unlike previous instances of such demand, however, it is not for cheap, unskilled laborers during wartime or economic boom, but rather for educated and trained workers willing to immigrate to the United States in return for the high rates of pay attending the computer field. The demand for immigration has rapidly exceeded the preexisting limitations applied by the United States government. Under current U.S. immigration law, immigration is restricted relative to the preexisting population of any given nationality already present in the country. The path to U.S. citizenship is laborious for any immigrant, and is subject both to immigration restrictions and to the bureaucratic obstacles of the Immigration and Naturalization Service. The more common approach is to seek H1-B work visas for overseas labor, which carry lesser restrictions than full immigration. However, quotas on H1-B allocation have been fully saturated by the demands of the industry, and the INS faces a two- to four-year backlog in visa processing. Efforts on the part of computer companies to address these obstacles through legislation have met with attention, but as yet little actual progress. This difficulty may be due in part to congressional resistance to immigration, especially in California, where a protracted battle over immigration has been waged for most of the state's history. Recent federal resistance to immigration has led to a partial fiscal and legal immobilization of those portions of the INS not directly associated with the prevention of illegal immigration and border patrols. The resulting paralysis on the part of the INS has greatly impeded the immigration of foreign technical workers, even those with outstanding qualifications and the direct support of the U.S. companies for which they are often already working. Perhaps the most famous recent example is that of Linus Torvalds, the highly popular originator of the tremendously successful Linux operating system and an employee of Transmeta, a prominent Silicon Valley research company and chip maker. Torvalds, whose contributions to the U.S. economy (even while living in his native Finland) would please even the most staunch conservatives, is now eligible for deportation, as his H1-B visa has expired and its renewal has been stranded in the INS for almost two years.



The specific environment of Silicon Valley demands notice also of its living conditions. Skyrocketing living costs have placed a tremendous burden on anyone not working directly in the computer field, and thus not drawing a commensurate salary. As of mid-1998, the qualifying income threshold for federal housing subsidies in the area had risen above the median pay for elementary school teachers, police officers and firefighters.



Work is easy to find in Silicon Valley, even for those with only modest skills. Despite this, such workers find it difficult to find housing, and are obliged to fall back upon homeless shelters despite working full-time for wages which would easily secure basic comfortable housing and necessities in a less inflated local economy. "'Easily' 40 percent of the people who use the Emergency Housing Consortium already have jobs," according to a San Jose MetroActive report of late 1999. Demand for homeless shelters and food banks is soaring in direct correlation with housing costs. Simultaneously, contributions to charitable organizations for the needy have declined 34% since 1970 (Goldberg 1999).



Unemployment, while at an all-time low both in Silicon Valley and in the nation generally, has not prevented marginalization and hopelessness, owing to a tremendously inflated cost of living and wage stagnation among lower- and lower-middle-income workers, among whom California's minority workers are disproportionately represented. According to a joint study by the Economic Policy Institute and Working Partnerships USA, "wages for the bottom 25 percent of the workforce have actually declined by more than 13 percent in inflation-adjusted terms since 1989," while "average rents have increased 28 percent in the last four years" (ibid.). Quite recently, media attention has focused on strikes by janitorial unions in technologically driven local economies, where maintenance workers employed by technical companies can neither afford the cost of living in the areas where they work nor bear the costs of commuting from any more affordable area(4).



While the desperate need of the computer industry for labor has in some respects made it easier for ethnic minorities and women to enter the industry - to the degree, at least, that they are currently equipped and desire to do so (see above) - the issue of age discrimination is largely unbroached. In the computer industry at large, "elderly" is a designation whose qualifications start at around age 35 - by middle age, technical workers are expected to have moved into management. Those who do not, usually owing to the industry's pervasive attachment to hands-on work (for most computer engineers, "manager" is a modestly derogatory label), often find it very difficult to get work, even work for which they are highly qualified. There is a prevalent attitude in the industry that computers are comprehensible predominantly to the young, likely an artifact of the rapid pace of obsolescence in computer equipment. Computer equipment has a very short "mean time to obsolescence" (a parody of "mean time to failure," the standard measure of hardware reliability, applied also to employee burnout), and that attitude is transferred directly to the age of a prospective employee as well. This phenomenon may even reflect an estrangement between the youthful hyper-specialists who dominate the industry and their own familial or cultural backgrounds.



There is also the issue of the social applicability of the products of the computer industry. It should be remembered that computers were originally conceived of as scientific tools, and later found applications in business. The precursors to actual "computers" included tabulating machines employed for the US Census, the development of which provided many of the input-output systems employed by early computers (Shurkin 1996). Throughout the history of the computer, however, much of computer development, especially in software, has been intended for consumption by the makers of computers themselves, with the narrow sociological definitions that implies. Not until 1968 was the issue of user interface made the subject of any significant research. In developing the IBM System/360, the first broadly successful computer intended for business use, the extent of interface design was quite literally to ensure that operator consoles were at an appropriate height (ibid.). The notion of the computer as a general-purpose tool for which no specialized skills should be required is still in its infancy. The first "windowing" environment was proposed at MIT in 1968, but not until ten years later did its first implementation appear, in an internal-use-only prototype at Xerox's PARC research facilities.



The broad purpose of a computer's user interface (for which the proper term now appears to be human-computer interface, or HCI) is to make the task of operating a computer resemble more conventional human behavior patterns. The usual term for this quality is "intuitiveness": the degree to which the actual interchange with the computer approximates what the user will assume to be correct at any given point. Apple Computer conducted substantial HCI research throughout the development of its MacOS operating system, and the result is widely regarded as the best, or at least the most widely acceptable, interface yet attained. However, to date virtually all HCI development has been conducted using only a very small population sample (predominantly nontechnical persons living near the Silicon Valley research centers). As a result, seldom has adequate attention been paid to testing experimental interfaces with a broad range of ethnic groups, with a correspondingly narrow range of options with respect to linguistic and cognitive differences among cultures. Only recently has the first interface been designed which can accommodate, for example, Arabic scripts, which are read right-to-left, or Hebrew scripts, which not only read right-to-left but cannot be "selected" contiguously with a mouse. The oft-cited cause of this lack is simply the economic necessity of serving the largest possible section of the market (and thus attracting the maximum possible number of customers) with the minimum investment. Under this system, addressing the specific needs of a small portion of the readily evident market is either assigned a minimal priority, or else regarded as a "niche" market to be sought out only by specialized companies, who carry little weight in overall industry direction.



We have surveyed some of the more prominent diversity issues currently facing the computer field. These issues, principally associated with representation of cultures and genders in industry and academia commensurate with their real populations, may yet be surmountable. A time of rapid growth and upset has the potential to be an opportunity for all, and the current period of rapid growth may yet prevail upon itself to correct its inequities. However, I suggest that these difficulties may prove more significant than problems of the computer field alone. There is some evidence that the U.S. economy may come to reorient itself as an "information economy," which would depend in large part upon the computer industry. As such, inequities in that industry could have magnified consequences in the larger economic systems built upon it. Simultaneously, there is cause for concern along similar lines should the information and tools provided by computers come to have a strong structural influence upon the larger societies of the United States and beyond.



















References



Margolis, Jane and Fisher, Allan. "Geek Mythology and Attracting Undergraduate Women to Computer Science." Carnegie Mellon University, 1997.

http://www.cs.cmu.edu/~gendergap/papers/wepan97.html



Grace Hopper Celebration of Women in Computing (CWC), conference proceedings, 1994.

http://www.cs.yale.edu/HTML/YALE/CS/HyPlans/tap/Files/hopper-story.html



"Why Girls Dislike Technology." Forum, Slashdot, October 1998, referencing Wired Magazine article of same title by Judy DeMocker, Oct. 14 1998.

http://slashdot.org/articles/98/10/16/1012242.shtml

http://www.wired.com/news/news/culture/story/15610.html



"Encouraging Female Programmers." Forum, Slashdot, August 1999, referencing St. Louis Post Dispatch article (now inaccessible, second URL)

http://slashdot.org/articles/99/08/22/174242.shtml

http://www.postnet.com/postnet/stories.nsf/ByDocID/6872E6A944ED94D0862567D4002D88DA?OpenDocument



Shurkin, Joel N. Engines of the Mind. W.W. Norton & Company, 1996.



Duncan, David James. The Brothers K. Bantam Books, 1996.



Novak, Thomas and Hoffman, Donna. "Bridging the Digital Divide: The Impact of Race on Computer Access and Internet Use." Vanderbilt University, 1998.

http://www2000.ogsm.vanderbilt.edu/papers/race/science.html



Goldberg, Michelle. "Let them Eat Microchips." San Jose MetroActive, December 1999.

http://www.metroactive.com/papers/metro/12.16.99/cover/charity-9950.html



USENIX Association, System Administrators Guild. SAGE 1998 Salary Survey. February, 1999.



Footnotes



1. Referring, of course, to the hugely popular "iMac," a product which comprised no technological evolution whatsoever, but which represented the first major effort on the part of a computer manufacturer to consider individual taste or aesthetic appeal in product choice. The iMac is now credited as the turnaround that likely saved Apple from collapse. Silicon Graphics, Inc. produced high-end workstations in vivid purple and blue for years before, and Sun Microsystems began adding colorful touches to its computers in the mid-1980s, but both product lines were targeted at geeks who spent so much time with the machines as to notice such details.

2. Roman mathematics now survives only in specialized and esoteric uses of its awkward and nondescriptive numbering system; Roman math in particular had neither zero, nor negative numbers, nor fractions, and is widely believed to have earned its fate. At the height of Roman power, mathematical systems other than the Roman one were banned from Roman-controlled ports, in large degree due to the unfair advantage lent by Middle Eastern and other more powerful systems of mathematics.

3. For this quote I must apologize; I find myself unable to locate a copy from which to quote more precisely, so I'm drawing from memory. Duncan's comments about the products of overspecialization and its consequences are excellent, as is the rest of the book.

4. Of which, practically speaking, there aren't any. The two options for cheaper housing than Silicon Valley are San Francisco and Santa Cruz counties, both of which have housing crises of their own.