ARQ (Santiago)

Online ISSN 0717-6996

No. 96, Santiago, August 2017

http://dx.doi.org/10.4067/S0717-69962017000200036 

Readings

Missiles and timers: architecture’s instrumentality after management tools

Gonzalo Carrasco Purull 1  

1 Lecturer, Escuela de Arquitectura, Facultad de Arquitectura y Diseño, Universidad Finis Terrae, Santiago, Chile. gonzalocarrasco3@gmail.com

Abstract:

It is a well-known argument that many of the technological innovations we use daily were invented for war. However, we know less about the impact of those developments on architecture. With the Cold War as a backdrop, this research shows how management tools developed during the postwar period shaped architecture's work processes, making its smooth insertion into the logics of capitalism possible.

Keywords: Cold War; Vietnam; CPM; PERT; SOM

During the sixties, the relationships between manufacturing, technology and architecture were strongly determined by Cold War tensions. This scenario led to technological transfers between architecture and the industrial sector, drastically modifying disciplinary practices at the end of the 20th century. Among these changes was the introduction into architecture of advanced scientific management systems such as CPM (Critical Path Method) and PERT (Program Evaluation and Review Technique). Developed at the end of the 1950s in the military and industrial fields, and accompanied by the increasing use of computers in design and calculation processes, these instruments came to transform both the means and the criteria of a new architecture. The latter thus became fundamental to the reproduction of the political and economic conditions that dominated the final third of the last century, foreshadowing many of the discipline's current contradictions.

“Where is your data? Give me something I can write on the computer. Don’t give me your poetry.” Robert McNamara, after being informed by a White House adviser that the Vietnam War was doomed; quoted in Bousquet, 2008:77.

Polaris

At 12:39 on July 20, 1960, from the submarine USS George Washington thirty miles off Florida’s coast, the U.S. Navy launched Polaris, its first missile fired from a submarine. The feat gave ample mobility to the American nuclear arsenal which, until then, had been securely transported only by the giant bombers of the Strategic Air Command (Figure 1). Polaris thus inaugurated the FBM (Fleet Ballistic Missile), an IRBM (Intermediate-Range Ballistic Missile) armament type that became fundamental to the containment strategies implemented by the United States during the Cold War, as its technology accelerated response times in the face of a possible nuclear attack: from the six hours that the B-52 bombers took to cover 5,000 kilometers to only 25 minutes, putting IRBM missiles on a par, in effective reach, with ICBMs (Intercontinental Ballistic Missiles). This uncertain scenario had intensified in 1957, when it became known that Moscow was experimenting at Tyuratam with the SS-6, an ICBM.

Source: © Creative Commons

Figure 1 USS Observation Island (EAG-154) at Norfolk Naval Shipyard, c. 1959. The ship is firing a test model of the Polaris missile, the result of upgrades performed to equip the vessel with the Fleet Ballistic Missile System. U.S. Navy.

In this context, the Polaris program was born following the Killian committee’s recommendation to develop ballistic missiles with a range of 1,500 nautical miles (Wyndham, 1963). The National Security Council raised this suggestion to President Eisenhower, who approved it and assigned its execution to the Department of Defense. This is how Charles E. Wilson, as Secretary of Defense, initiated two IRBM programs, run jointly by the Army and the Navy, at least in a first stage. Thus, while the Army’s work resulted in the creation of the Jupiter missile, the Navy’s work ended in the assembly of Polaris.

Conscious of being in the midst of an arms race with the Soviet Union, Secretary Wilson demanded from the beginning the implementation of a project management system aimed at speeding up manufacturing times (Figure 2). This was because the Polaris program not only involved solving the problems of building a missile capable of taking off from a mobile platform on the high seas, but also required developing the technologies to create the submarine from which the missile would be launched. This management system, designed to administer the times and costs of the entire Polaris program, was given the name PERT (Program Evaluation and Review Technique), a coordination tool rooted both in Operations Research - developed in England in the 1930s and successfully implemented in World War II - and in the latest advances of newly-born cybernetics.

Source: ARQ redrawing of the original in Malcolm, Roseboom & Clark, 1959:665.

Figure 2 PERT integrated overview of the Polaris missile program.

The PERT method - widely adopted by manufacturing and business in the 1960s and 1970s - played a key role in the Polaris program’s success. Initially, the calendar set 1963 as the deadline for the beginning of trials, with the project to be completed by 1965. However, an event not foreseen by the planners overturned all previous estimates and forced the acceleration of the program. On October 4, 1957, the Soviet Union put Sputnik 1 into orbit, a satellite of only 58 centimeters in diameter and 83 kilograms in weight, causing panic among the American population, which felt vulnerable against an apparent Soviet domination of space. As a result, both the Government and public opinion demanded the acceleration of the North American missile program. The sense of emergency only increased when news of the launch of a second Soviet satellite broke on November 3 of that same year. The implementation of PERT thus had to be adjusted to this new scenario, leaving aside a series of tests and components initially considered necessary but requiring an amount of time no longer available. Strict adherence to the schedule set by PERT, the elimination of many of the tasks diagnosed as holdups (and therefore prone to generate delays), and appeals to the patriotism of each worker and contractor involved made it possible to reach a goal that seemed impossible at the beginning: to build the Polaris submarine faster than any other ship ever assembled by the United States in peacetime.

At a time when byproducts of the military industry - such as the electric washing machine, the TV, or the vacuum cleaner - enjoyed undisputed recognition for their quality and transformed daily life in the second postwar period, the success of the Polaris program (as shown in the images of its silhouette emerging amid a tower of foam and water) was the best argument for PERT’s dissemination across several manufacturing areas, obviously including construction.

Maps to the future

Like any other management tool, what PERT offered was a “map to the future” (Getz, 1964), a cartography that required simplifying reality into a model with a finite number of variables. Unlike other methods inspired by the doctrines of Frederick Winslow Taylor, such as the charts elaborated by Henry Laurence Gantt, PERT decomposed the tasks required to reach a goal not along a schedule of parallel and independent lanes, but into a ‘network’ of multiple connections with varying degrees of dependence and autonomy. PERT was thus implemented from the identification of final objectives or goals, for which it was necessary to determine a series of activities or ‘events’ to be executed. These were represented through different ‘boxes’ that could be filled with an expert, a machine, a process or a contractor; likewise, arrows indicating the end of one activity and the beginning of another showed the links between such boxes. PERT then associated each activity with the time it consumed, defining three estimates - an optimistic, a pessimistic and a probable one - where the expected time resulted from a mathematical combination of these three figures. Within this data universe, constituted by a network that set not only the logical sequence of events but also the interdependencies between them, the ‘critical path’ emerged as the flow of activities demanding the most time. Accordingly, PERT provided feedback on the initial budget model, showing the decisions that had to be made in order to achieve reductions in time or cost.
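In its standard textbook form (stated here as the classical formula, not as a claim about the Navy's exact implementation), that 'mathematical combination' is a weighted average of the optimistic, most probable and pessimistic estimates, derived from a beta-distribution assumption, with an associated variance:

    t_e = (t_o + 4·t_m + t_p) / 6        σ² = ((t_p − t_o) / 6)²

For instance, an activity estimated at 4 weeks (optimistic), 6 weeks (probable) and 14 weeks (pessimistic) yields t_e = (4 + 24 + 14) / 6 = 7 weeks.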

Amid practices of trial and error, PERT offered a backdrop of certainty. Decisions were not only validated by numbers; together they constituted an interconnected system with continuous feedback, allowing decision-making processes to adjust to contingencies and thus reducing degrees of uncertainty. Intuitive or experience-based knowledge was gradually replaced by seemingly objective criteria that could be measured and quantified through the variables of time and cost. These criteria were transferred first to industry, then to public administration and construction, and finally to architecture, entailing a displacement of knowledge, since the latter worked with qualitative rather than quantitative variables. Such displacement implied a transformation process subject to strong pressures, since it demanded that architecture redefine its position within the capitalist productive system.

Isolated worlds

PERT forces logical thinking. It encourages program planners to recognize the relationship of parts to the whole; as a consequence, PERT is natural as a planning tool within an armament system (Getz, 1964:15).

What PERT, like CPM, attempted to control was the apparent disorder of a world dominated by contingency and accident, a goal shared by a second technology developed in the postwar period: cybernetics. Cybernetics can be understood as the overcoming of 19th-century categories based on biology and zoology, which assumed complex organisms to be hierarchical organizations formed by a structure and its organs. For Norbert Wiener, one of the creators of cybernetics, such a way of describing the organization of machines and organisms was close to the mechanical vision derived from the clock systems of the 17th and 18th centuries, or to the circulatory models inspired by the steam engines of the 19th century. In the 20th century, however, both organisms and machines could be thought of primarily as communication and control systems in which information, their raw material, is defined by its opposite: entropy.

Based on the second law of thermodynamics, which states that the total entropy - the level of disorder - of an isolated system tends to increase over time, Wiener noted that, like energy, the amount of information within a system is subject to similar processes of reduction and leveling. For Wiener, any complex organizational system generates an output when immersed in entropic environments. Thus, while the complete domination of entropy within a system would lead to its death, adaptation would involve the activation of an anti-entropic process called homeostasis. What makes an information system or organization anti-entropic is its ability to be regulated through a continuous feedback cycle, which allows it to make whatever corrections are required to respond to variations in environmental data (input) through a rectification of the system itself (output).
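As a minimal sketch of this input-correction-output cycle (the names, the gain value and the disturbance model below are illustrative assumptions, not Wiener's own formalism), a homeostatic system can be reduced to a loop that measures its deviation from a goal and feeds a proportional correction back into its own state:

```python
import random

# A toy homeostat: the state drifts under entropic "noise" from the
# environment; the feedback loop reads the deviation (input) and applies
# a proportional correction to the system itself (output).
def run_homeostat(setpoint=20.0, gain=0.5, steps=10, seed=1):
    rng = random.Random(seed)
    state = setpoint
    for t in range(steps):
        state += rng.uniform(-2.0, 2.0)   # entropic disturbance
        error = setpoint - state          # input: measured variation
        state += gain * error             # output: self-rectification
        print(f"t={t}: state={state:6.2f}  error={error:+.2f}")

run_homeostat()
```

With gain set to zero the state wanders without bound, the entropic case; any positive gain below 1 keeps it oscillating around the setpoint, which is all that 'homeostasis' asks of the loop.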

Management systems thus share with Wiener’s postulates not only their approach - both being developed in environments built from information or data systems - but also the need to find mechanisms for overcoming dominant entropy from their output, through forms of self-regulation (feedback) that help to amend the course. As activities are understood in terms of functions aimed at achieving a specific goal or objective, Wiener binds the destiny of both organisms and machines to the fulfillment of a purpose. Indeed, the decisions of a management program are measured from this teleological perspective, with three possible answers to the destabilizing effects of contingency: increasing the resources invested, increasing the threat of failure or risk, or improving performance - the latter often involving the removal of certain items or the modification of entire parts of the project.

As noted by Reinhold Martin (1998:112), one of the concepts that Wiener’s ideas came to modify was that of organization. Since the 19th century, organization had rested on the idea of a hierarchical structure subordinating parts to a whole. For Wiener, on the other hand, an organization - be it machine or organism - is an information system that self-regulates its own parts, all of them responding to data flows transmitted in every direction. Such self-regulation or feedback implies the existence of a highly efficient communication system between each of the parts. However, while the immediacy of responses to an entropic medium constantly changes hierarchies within the system, the system always maintains its isolated condition. This is Martin’s platform for pointing out that, in the informational organizations discussed by Wiener, the categories of ‘interior’ and ‘exterior’ lose their meaning (1998:113). Whereas the 19th-century idea of the body involved an interior comprising a hierarchical organization of organs, in an information system - where each part is connected to the others by a network of relations - the external environment itself is included within the system through the mechanism of self-regulation or feedback. Thus, defining any degree of interiority in an isolated information system becomes a matter of highly diffuse boundaries. The loss of both vertical hierarchy (the top-down structure proper to Taylorist organizational systems) and of the isolated character of information systems would have consequences for the fields of construction and architecture with the adoption of organizational and managerial tools derived from the military industry, as was the case with PERT.

Although the hierarchical fragmentation of tasks found in Taylor its main ideologue and in Henry Ford its most celebrated promoter, it was not until World War II that it reached full development, as exemplified by the organizational system adopted by U.S. military aviation for coordinating the manufacture of large bombers. Such war logics - dominated by strict military verticality coupled with the atomization of assignments, translated into a series of ‘boxes’ filled by departments responsible for a finite number of tasks - were implemented not only in large manufacturing areas, but also in engineering and architecture offices such as The Austin Company (1878), Albert Kahn Associates (1895), and Skidmore, Owings & Merrill (1936). These practices benefited greatly from the optimization of labor times resulting from standardization and from the professional division between architects, engineers and contractors required for the construction of armament factories. The organizational transformation of Albert Kahn’s architectural firm was described by Henry-Russell Hitchcock as “the bureaucratic office par excellence,” its main strength being that design success no longer depended on the genius of a single man, but on “the organizational genius which can establish a fool-proof system of rapid and complete plan production” (Hitchcock, 1947:3-6).

The efforts to systematize work within postwar architecture offices came to modify an organization historically based on the model of the artist’s atelier, where the very instruments of architectural creation defined quality criteria closer to the world of craftsmanship than to that of industrial capitalist production. The case of SOM is part of this transformation, giving great visibility to this new work model while becoming the first corporate office to have an exhibition at MoMA.

SOM maintained close connections with the U.S. Army, with which it collaborated on major projects during World War II. The most important was the design of the city of Oak Ridge, Tennessee, also known as Atom City (1945). This settlement of 75,000 people was designed to house the contractors, scientists and engineers working on the Manhattan Project. A secret project, Atom City was built from scratch between November 1942 and 1945. To comply with the tight schedule set by the Government, SOM had to make adjustments such as the massive use of prefabricated components. After the war, SOM would follow and develop the lessons learned through the Oak Ridge experience (Figures 3 and 4).

Source: ARQ redrawing of the original in Galison, 2001:9.

Figure 3 Organizational chart, U.S. Strategic Bombing Survey (USSBS).

Source: ARQ redrawing of the original in Boyle, 1984:283.

Figure 4 Organizational scheme of the Skidmore, Owings & Merrill architectural office, 1957.

Following SOM’s model, the corporate architecture office was broken down into a series of ‘boxes’ subordinated to a board of directors. The division into departments - administration, design, production and construction - affected the team’s notion of the building itself, which was fragmented into a series of compartmentalized functions: ‘black-box’ groups identified by quantitative characteristics, such as the definition of built volumes and generic functions shown in performance indexes. The notion of the postwar corporate building - perhaps SOM’s greatest legacy - was assumed years later in a similar way: as a complex organism whose components could be simplified into specific functions. However, unlike those generated in the late 1950s, these organizational systems retained a strong vertical, top-down hierarchy. Both in the production scheme for American Air Force bombers and in SOM’s work structure, every department was headed by a director who answered to a superior, who, in turn, was subject to the decisions of a board - embodied in a person or group controlling the official ‘vision’ or idea of the whole project.

With the assimilation of management systems such as PERT or CPM, the subdivision of work was radicalized. If labor had once been represented by a department or group of professionals and contractors, it now became a mere ‘activity.’ Engineers, architects and contractors, together with equipment, materials and bureaucracy, were placed on the same hierarchical level, all oriented to the accomplishment of a certain task in pursuit of a goal. The activities themselves, rather than having an a priori relevance, held a relative importance measurable in the number of connections to the whole system - that is, in volumes of information - translated into the amount of time and resources each activity demanded. In this way, attention was placed on those critical points prone to generate ‘bottlenecks,’ where the time spent could exceed what was initially set. The system was regulated by each of these contingencies, rearranging, postponing or leaving aside all those activities that consumed the most time and resources. The response of this self-regulation or feedback to the entropic working environment was to make the decisions necessary for every activity to be completed.
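The mechanics behind those ‘bottlenecks’ can be sketched in a few lines of code. The following is a minimal illustration - with invented activity names and durations, not a reconstruction of any historical PERT software - of the forward and backward passes that compute each activity’s slack; zero-slack activities form the critical path, the chain where any delay propagates to the whole project:

```python
# Toy activity network: activity -> (expected duration, predecessors).
# Listed in an order where predecessors always come first.
activities = {
    "design":      (4, []),
    "foundations": (3, ["design"]),
    "structure":   (6, ["foundations"]),
    "facade":      (4, ["structure"]),
    "services":    (5, ["structure"]),
    "finishes":    (3, ["facade", "services"]),
}

# Forward pass: earliest possible (start, finish) for each activity.
early = {}
for name, (dur, preds) in activities.items():
    start = max((early[p][1] for p in preds), default=0)
    early[name] = (start, start + dur)

project_end = max(finish for _, finish in early.values())

# Backward pass: latest allowable (start, finish) without delaying the end.
late = {}
for name in reversed(list(activities)):
    dur, _ = activities[name]
    succs = [s for s, (_, preds) in activities.items() if name in preds]
    finish = min((late[s][0] for s in succs), default=project_end)
    late[name] = (finish - dur, finish)

# Slack = latest start - earliest start; zero slack marks the critical path.
for name in activities:
    slack = late[name][0] - early[name][0]
    tag = "  <- critical (bottleneck risk)" if slack == 0 else ""
    print(f"{name:12s} slack={slack}{tag}")
```

Running the sketch flags every activity except ‘facade’ as critical: delaying any of them delays the entire project, which is precisely the information the planner used when deciding what to rearrange, postpone or cut.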

The contemporary architect is thus fixed within a total program as the one who provides the system with its initial input - the design - which must be adapted at each stage of execution to the responses or feedback generated by the rest of the program’s components, always changing in pursuit of a single purpose: the successful conclusion of the project, no longer measured in terms of quality but in terms of estimated time against time spent, and available resources against those consumed.

The need to handle large amounts of data in organizational models forced the use of computers for calculations. It thus became necessary to translate the variables present in each problem into computable data, an operation that privileged abstract formalisms over experience and situated knowledge (Bousquet, 2008:82). The very fact of having to work with systems implied understanding reality as a finite, manageable and computable scenario, capable of being predicted and controlled. Whatever could not be translated into numbers could not be manipulated or take the form of a quantifiable answer. This was the basis of the Operations Research teams from which tools such as PERT and CPM emerged. Thus, in the early 1960s, organizational systems reinforced by mathematics seemed to offer the most ‘scientifically correct’ path for facing a problem (Bousquet, 2008:90).

Towards the end of the 1960s, these tools were introduced into the curricula of architecture schools, as was the case at the University of Washington in 1968 (Montgomer & Boxerman, 1968). This assimilation reached Chile through the Corporación de Mejoramiento Urbano (CORMU), in the cases of the Remodelación San Borja and the UNCTAD III building, where PERT calculations were made using the IBM 360/30 computer (the first to use micro-transistors instead of vacuum tubes), which had arrived at Universidad de Chile in 1967.

Hyperreal

When the Nixon Administration took over in 1969, all the data on North Vietnam and on the United States was fed into a Pentagon computer - population, gross national product, manufacturing capability, number of tanks, ships, and aircraft, size of the armed forces, and the like. The computer was then asked, “When will we win?” It took only a moment to give the answer: “You won in 1964!” Colonel Harry Summers, quoted in Bousquet, 2008:98.

In 1961 Robert McNamara was appointed Secretary of Defense in the Kennedy administration, a triumph for the group of specialists behind the RAND (Research and Development) program, who were transforming the way modern warfare was conceived. McNamara, who had excelled during World War II for his work as an analyst in the Statistical Control Office, had transferred many of the tools developed within Operations Research to his subsequent position as president of the Ford Motor Company. With McNamara in charge of defense, management tools displaced those used in the previous war administration, which relied on the existence of a Central Command - a team of senior officers who, drawing on extensive experience and a soldier’s instinct, defined the scenarios and the decisions to be made. This model was replaced by the concept of Command-Control: a team mainly of technicians who based their decisions on data analysis. The war scenario was thereby translated into a series of measurable indexes, turning decision-making into an objective and, most importantly, predictable process. Vietnam emerged as the best field for applying such uncertainty-control tools.

Like the war designed and executed by McNamara (who entrusted the prospect of victory to the certainty of management tools), postwar corporate architecture - and especially urban planning - was managed from an approach in which reality was represented as fragmentary, built out of highly specialized compartments whose relationships were based on the communication of facts capable of being translated into objective values: data whose analysis could only follow the same criteria under which they had been collected, that is, criteria fitting time-cost and cost-resource equations. This conception transformed most of corporate architecture into a closed universe of reference, a world of isolated, anti-entropic systems which, through the ostensible control of the uncertainties inherent to buildings themselves - which operate on the basis of the qualifiable rather than the quantifiable - made the construction of architecture a simulacrum. At times, this world became more real than reality itself, to the point that, like McNamara’s techno-war, it crashed into the city’s own contingency, in one case, and into that of the jungle, in the other.

References

ALSAKER, E. T. “La técnica básica: análisis de la red” (1964). In: STILIAN, Gabriel. PERT: un nuevo instrumento de planificación y control. Bilbao: Ediciones Deusto, 1973.

BOUSQUET, Antoine. “Cyberneticizing the American war machine: science and computers in the Cold War”. Cold War History, Vol. 8, Issue 1 (2/1/2008):77-102.

BOYLE, B. “El ejercicio de la arquitectura en América, 1865-1965. Ideal y realidad”. In: KOSTOF, S. El arquitecto. Historia de una profesión. Madrid: Cátedra, 1984.

ENGELHARDT, T. El fin de la cultura de la victoria. Estados Unidos, la Guerra Fría y el desencanto de una generación. Barcelona: Editorial Paidós, 1995.

GALISON, Peter. “The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision”. Critical Inquiry 21 (1) (1994):228-266.

GETZ, C. W. “Visión general del PERT” (1964). In: STILIAN, Gabriel. PERT: un nuevo instrumento de planificación y control. Bilbao: Ediciones Deusto, 1973.

HITCHCOCK, Henry-Russell. “The Architecture of Bureaucracy and the Architecture of Genius”. The Architectural Review (Oct., 1947):3-6.

MALCOLM, D. G.; ROSEBOOM, J. H.; CLARK, C. E. “Application of a Technique for Research and Development Program Evaluation”. Operations Research, Vol. 7, Issue 5 (1959):646-669.

MARTIN, Reinhold. “The Organizational Complex: Cybernetics, Space, Discourse”. Assemblage 37 (1998):102-127.

MAULÉN, David. “Una trayectoria excepcional. Integración cívica y diseño colectivo en el edificio UNCTAD III”. ARQ 92 (Abril, 2016):68-79.

MONTGOMER, R.; BOXERMAN, S. “An Applied Mathematics Course for Architects and Urban Designers”. Journal of Architectural Education, Vol. 22, No. 2/3 (Mar-May, 1968):29-31.

SCOTT, Felicity. “On Architecture under Capitalism”. Grey Room 6 (2002):44-65.

WYNDHAM, M. D. “The Polaris”. Technology and Culture, Vol. 4, Issue 4 (1963):478-489.

Creative Commons License. This is an open-access article published under a Creative Commons license.