Technological Choice

Module by: William Frey

Summary: This module will help you explore the broader social and technical impacts of business systems, services, and products. It has been developed to complement another module, "Social and Technical Systems for Professional Decision-Making" (m14025). It presents three lenses under which to view social and technical impacts: Technological Determinism, Social Construction, and Technological Politics. These lenses come from Robert Heilbroner, Trevor Pinch and Wiebe Bijker, and Langdon Winner. It provides exercises that can be used to complement the readings in Technology and Society, edited by Deborah Johnson and Jameson Wetmore. This "Technological Choice" module is part of the EAC Toolkit, a project developed under National Science Foundation grant SES 0551779.

Test Cases

  • Your company, Cogentrix, proposes a cogeneration plant that uses coal to produce electricity and steam, both of which it sells to make money. Because western Puerto Rico lacks electricity generating capacity and because the steam by-product can be sold to nearby tuna canning plants, your company finds the Mayagüez area particularly attractive. But there are rumors that different local constituencies oppose this project because of concerns about the environmental impact of coal use and industrialization. Prepare an STS analysis of the Mayagüez area. Can you identify any potential value mismatches between this system and the cogeneration plant your company is planning? How can these mismatches be mitigated or eliminated?
  • Your company, Southern Gold Resources, wants to mine different regions in central Puerto Rico for copper and gold. But you know that twenty years earlier, two proposals by two international mining companies were turned down by the Puerto Rican government due to strong local opposition. Carry out a socio-technical system analysis that concentrates on the financial, social, and environmental impact of a smaller scale mining project. How would you address the opposition to the older mining proposals should it also arise in relation to yours? What does your STS analysis tell you about social and ethical impacts, financial potential, and possible grassroots opposition? Are profitable mining operations compatible with community and environmental values? What is your recommendation based on your STS analysis?
  • Windmar, a company that manufactures and operates windmills for electricity generation, has proposed to build a windmill farm adjacent to the Bosque Seco de Guanica. It has encountered considerable local opposition, which stems from three concerns: (1) Given that Windmar is a private business, can it be trusted to carry out its project as proposed? Can any private business be trusted to "keep its word"? (2) Will locating the windmill farm so close to the Bosque Seco de Guanica have a harmful impact on the fragile ecosystem and its non-human inhabitants? (3) Why were public hearings on the project held so far away from the very communities that would most likely suffer its impacts? Windmill technology has traditionally been considered one of the cleanest ways to generate electricity, but this does not mean it represents a harm-free technology. How can windmills harm the environment? How can their construction and operation harm the communities in which they are housed? Carry out a socio-technical system analysis to understand and clarify this opposition. Can the concerns of local stakeholders be integrated with a profitable, privately owned and operated windmill farm? How should the windmill project be modified to improve the chances of its being implemented?
  • Assume that the Puerto Rico government has decided to give a laptop computer to every public school student. What would happen? What would be the benefits? What would be the harms? Construct a socio-technical analysis of the Puerto Rico public school system and study the impact of the laptop project on this system. Are there any mismatches between the values embedded in laptop technology and this STS? Would the laptop project be feasible? (What constraints are likely to make integration difficult?) Would it be necessary to redesign laptop computers to make this technology more responsive to the special needs of children? What changes or adjustments would need to be made in the Puerto Rico public school STS?


This module is a companion to the module, "Socio-Technical Systems in Professional Decision-Making" (m14025). It also responds to recent work in an area dubbed Science and Technology Studies. (Johnson and Wetmore's anthology, Technology and Society: Building Our Sociotechnical Future, provides a good sampling of recent articles in this area.) You will be provided with three lenses through which to view technologies. Each lens presents a different conception of the relation of technology to society; no single lens is completely true or completely false. Rather, each is distinguished by the way in which it selects certain elements from experience as areas for concentration and focus. Thus, lenses are tools that will prove useful as you navigate through the complexity of different socio-technical systems. Working with these lenses will give you a multi-layered and multi-dimensional view of the different ecologies (social, technical, and natural) that surround you and within which you work.

What you need to know

Lenses are not ideologies

  • An ideology presents a particular world view as the truth. Thomas Kuhn characterized ideologies or world views as "paradigms" in his book, The Structure of Scientific Revolutions. For Kuhn, paradigms form self-contained accounts of the world and are incommensurable with one another. During certain "normal" periods, these paradigms can sustain positive and useful lines of inquiry and discovery.
  • But incommensurable paradigms also battle with one another for dominance during "revolutionary" periods. Kuhn's highly controversial claim is that disputes between rival paradigms cannot be resolved by recourse to rational means. Instead, they become power struggles, not unlike the power struggles in the political realm between competing classes and their supporting ideologies.
  • Treating different views on the relation between technology and society as lenses, rather than as incommensurable ideologies or paradigms, allows us to explore and compare the different lines of inquiry each opens. Lenses are tools that support inquiry, drive discovery, and refashion the surrounding world. Each lens provides a partial view of experience. Viewing experience through multiple lenses helps us build a richer, multi-level and multi-perspective view for troubleshooting and problem-solving.
  • In this module, you will view the four cases presented above through the lenses of technological determinism, social construction, and technological politics. These different lenses should help you to understand and control technology more effectively and safely.

Lens One: Technological Determinism

  • Marx provides the classical statement of technological determinism: “In acquiring new productive forces men change their mode of production; and in changing their mode of production, in changing the way of earning their living, they change all their social relations. The hand-mill gives you society with the feudal lord; the steam-mill society with the industrial capitalist.” Quoted by Langdon Winner in Autonomous Technology, 79
  • Technological determinism thus claims that certain technological devices (electricity, the automobile, the computer) recreate our material conditions in such a way that they determine the nature of our social consciousness and restructure our social and economic relations to one another.
  • The following quote shows that, for Heilbroner, technological determinism comes from a unique convergence of events during modern times. Science has advanced to a particular point in harmony with certain machine-oriented skills.
  • “Technological Determinism is thus peculiarly a problem of a certain historical epoch—specifically that of high capitalism and low socialism—in which the forces of technical change have been unleashed but when the agencies for the control or guidance of technology are still rudimentary.” (Johnson and Wetmore, 104)
  • Thus, a knowledge base (formed out of value-neutral, mechanistic science) has been combined with a platform of technical know-how (such as the ability to fashion metal with precision into complex machines) to give rise to certain economic relations (capitalist to worker). But, because our political system was developed in pre-industrial times, it is not able to control the current technological revolution. The technology controls us much to our detriment. To take back control, we must radically reconstitute both our technology and our social and economic relations.

Feenberg on Technological Determinism

  • Andrew Feenberg in Questioning Technology provides a concise characterization of technological determinism. It is based on the assumptions of unilinear progress and determinism by base.
  • Unilinear Progress: "Technological progress appears to follow a unilinear course, a fixed track, from less to more advanced configurations. Each stage of technological development enables the next, and there are no branches off the main line." Feenberg, 77
  • Determinism by Base: "Technological determinism also affirms that social institutions must adapt to the 'imperatives' of the technological base. This view, which no doubt has its source in a certain reading of Marx, is long since the common sense of the social sciences. Adopting a technology necessarily constrains one to adopt certain practices that are connected with its employment." Feenberg, 77
  • Leslie White on determinism by base: "We may view a cultural system as a series of three horizontal strata: the technological layer on the bottom, the philosophical on the top, the sociological stratum in between. These positions express their respective role in the culture process. The technological system is basic and primary. Social systems are functions of technologies; and philosophies express technological forces and reflect social systems. The technological factor is therefore the determinant of a cultural system as a whole. It determines the form of social systems, and technology and society together determine the content and orientation of philosophy." Quoted by Winner, Autonomous Technology, 79.

Lens Questions:

  1. What forms of social and political organization does the technology create as it is integrated into the surrounding socio-technical system? This general question can be specified in the following ways:
  2. Which organizational decision-making approach is elicited by the technology? A vertical approach where those at the bottom carry out mandates set by those at the top? Or a horizontal structure, where decision-makers collaborate in a consensus-based approach to problem-solving and decision making?
  3. Does the technology elicit a division of work tasks into specialized roles that are coordinated from above? (By high-level managers?) Or does it encourage a more holistic approach to work that consists of overlapping roles and constant communication between these roles?
  4. Does the technology lead to centralization or decentralization of power and control? For example, many advocate windmills (and other technologies on the "soft path") because they allow for the generation of electricity from small, local, and dispersed areas of production. (This despite the fact that windmill technology is becoming more complex and windmill "farms" represent larger centers of energy production.) On the other hand, nuclear technology requires highly centralized operating and decision procedures because of the risk of accidents of high magnitude and scope. Centralization enhances control which, supposedly, reduces the chance for disastrous accidents.
  5. Following Mumford, we might ask whether the technology elicits a democratic or authoritarian organization of economic, social and political activities as it is implemented? Democratic exercise of power would take place through horizontally organized, decentralized, and locally situated centers of control and power. Authoritarian exercise of power would take place through vertically organized, centralized, remotely situated centers of control and power.

Lens Two: Social Construction

  • This lens comes from Pinch and Bijker’s article, “The Social Construction of Facts and Artifacts.” Social construction makes the opposite claim to technological determinism. Instead of holding that technology determines society, the social constructionist argues that society determines or “constructs” the technology. This lens, then, will help you to see the contribution that individuals and groups make to the social construction of technologies.
  • Pinch and Bijker begin with an application of epistemological relativism to science and technology.
  • Relativism may be a misnomer here, since classical relativism argues that individuals or groups bestow truth and value on the surrounding world. Humans, according to the Greek thinker Protagoras, are the measure of all things: of those that are, that they are, and of those that are not, that they are not. So classical relativism holds that humans (as individuals or as groups) provide the standards by which all things are assessed.
  • But the relativism that Bijker and Pinch advocate serves methodological, not ontological, purposes. All scientific theory proposals and all technological variations are treated the same, whether successes or failures; all are grist for the historian's mill. This gives us special insight into how they are generated, how they compete with one another, how individuals interact with them, and how, finally, the successes are selected and the failures de-selected. This methodological relativism lays bare the process of social construction concealed in the final product.

Looking at the development of technologies, Pinch and Bijker identify three stages:

  1. The first stage exhibits interpretive flexibility. Because the design of an artifact and its meaning are open, social interaction and transaction generate different variations. (Their example is the different bicycle designs that competed for market share before the small-wheeled, safer version won out.) Many variations are generated, and these compete with one another. This positive competition stimulates creativity. Individuals interact with the variations that are produced, experimenting with them and, through this experimentation, clarifying their interests, values, and concerns. The interests, needs, and problems thus clarified become filters that select and de-select variations.
  2. The second stage is characterized as the closing of interpretive flexibility. Needs, interests, and problems stabilize. They select and de-select variations so that most drop off to the side. Because individuals interact with facts and artifacts, because they experiment with them, select those that meet their needs and de-select those that don't, they literally and socially construct them.
  3. In the third stage, closure is achieved through rhetorical means (such as advertising), problem definition (which keeps some problems and dissolves others), and inclusion in a wider context where the variations selected fit into the surrounding socio-technical system. Closure leads us to forget the historical process of social construction, i.e., interpretive flexibility and closure of interpretive flexibility. Hence, we treat the final technology as a black box that has always been there and is somehow inevitable. But re-opening the historical process reminds us that the black box has been constructed and selected to incorporate our needs, problems, and values.
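The three stages read almost like a generate-and-filter process, and a toy sketch can make the selection logic concrete. All variant names, attributes, and thresholds below are invented for illustration; the actual process Pinch and Bijker describe is social negotiation, not numerical scoring.

```python
# Toy sketch of Pinch and Bijker's stages, using hypothetical
# bicycle-design variants and stakeholder criteria (all names and
# numbers are invented for illustration).

# Stage 1: interpretive flexibility -- many variants coexist.
variants = {
    "high-wheeler": {"speed": 9, "safety": 2},
    "safety-bike":  {"speed": 6, "safety": 8},
    "tricycle":     {"speed": 3, "safety": 9},
}

# Stakeholder groups and the minimum scores each group demands.
stakeholders = {
    "racers":    {"speed": 5},
    "commuters": {"safety": 5},
}

def survives(design, groups):
    """A variant survives selection only if it meets every group's threshold."""
    return all(design.get(attr, 0) >= need
               for reqs in groups.values()
               for attr, need in reqs.items())

# Stage 2: closure of interpretive flexibility -- needs and interests
# act as filters that select and de-select variants.
closed = [name for name, design in variants.items()
          if survives(design, stakeholders)]

# Stage 3: the surviving design becomes the "black box" we take for granted.
print(closed)
```

Re-opening the "black box" in the historical sense corresponds to recovering the discarded variants and the criteria that filtered them out.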

Lens Questions

  1. What is the historical process that has culminated in the current form the technology has taken? Specifically...
  2. Did users and non-experts participate in the process of generating alternative interpretations of the technology? How did they participate? Do these alternatives embody the values and interests of stakeholders in their designs? (This corresponds to interpretive flexibility.)
  3. Did users and non-experts participate in the closing of flexibility of interpretation by helping to select "winners" from among the competing forms? How did they participate and how does the design of the "winners" reflect their interests and values? (This corresponds to the closing of interpretive flexibility.)
  4. What are the final criteria embodied in the closed and fixed technological design? Did a broad range of stakeholders participate in establishing these criteria? Did these criteria play a direct role in selecting the final design from among the initial variants? Does the final design or "black box" adequately reflect the needs, interests, and values of the broad range of stakeholders affected by this technology? (This reflects the final or closure stage.)

Lens Three: Technological Politics

Background from Autonomous Technology by Langdon Winner (From Hickman, John Dewey’s Pragmatic Technology, 148 and following.)

Winner starts by criticizing the "straight-line" notion of tool use: tools serve ends bestowed on them by the user. There are four reasons why this doesn't work (the second item below elaborates the first with an example):

  1. Manifest Complexity: The technology or tool displays complexity such as "tightly coupled systems" and "non-linear" chains of causality. For example, nuclear reactors are highly complicated and, therefore, difficult to control. Because they are tightly coupled, they are subject to what Perrow calls "normal accidents," where minor failures produce a chain reaction of other failures because these failures cannot be isolated.
  2. An example will help. When systems are tightly coupled, prediction is rendered difficult because components interact in unexpected ways and a breakdown in one part quickly spreads to others. Think about a tightly coupled schedule: when one part changes (you are called into work because a co-worker didn't show up), it spills over into other parts of your schedule (you do poorly on the next day's test because work left you no time to study).
  3. Concealed Complexity: Technologies are frequently backed by decision-making procedures that are opaque to independent scrutiny. For example, the procedure by which nuclear reactors are regulated is extraordinarily complicated. This makes it difficult to assess independently whether these procedures guarantee that only safe reactor designs will be approved by the regulatory process.
  4. Technological Imperative: Technologies transform and redefine human needs. Machine needs become imperative and trump human needs. For example, food, clothing, and shelter (basic human needs) are displaced by machine requirements such as electrical power, highways, bridges, sewers, and other infrastructure. Technologies (in the form of complicated machines) have requirements that tend to push aside our own needs, values, and interests. We build infrastructure to respond to these machine needs. The tool no longer serves us; we serve the tool.
  5. Reverse Adaptation: Because complex technologies redefine needs (and values), we are forced to adapt ourselves (and our needs) to them. (It is assumed that we cannot adapt them to our needs because of manifest and concealed complexity.)
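Perrow's point about tight coupling can be illustrated with a toy simulation. This is an invented model, not Perrow's: each component's failure spreads to the next component in a chain with a probability set by the coupling strength, so tightly coupled chains turn the same initial fault into much longer cascades.

```python
import random

# Invented toy model of failure propagation in a chain of components:
# component 0 fails, and each failure spreads to the next component
# with probability `coupling`.

def cascade(n_components, coupling, rng):
    """Fail component 0, propagate along the chain, and return how
    many components end up failed."""
    failed = [False] * n_components
    failed[0] = True
    for i in range(1, n_components):
        if failed[i - 1] and rng.random() < coupling:
            failed[i] = True
    return sum(failed)

rng = random.Random(42)   # fixed seed for reproducibility
trials = 10_000
loose = sum(cascade(10, 0.1, rng) for _ in range(trials)) / trials
tight = sum(cascade(10, 0.9, rng) for _ in range(trials)) / trials

print(f"avg failures, loose coupling: {loose:.2f}")
print(f"avg failures, tight coupling: {tight:.2f}")
```

The averages show the asymmetry: with loose coupling the initial fault usually stays isolated, while with tight coupling most of the chain fails, which is the structural intuition behind "normal accidents."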

Lens Questions

  1. Assess the manifest complexity of the technology in question. For example, what is the manifest complexity of windmills? (Do they present tightly coupled systems that lead to unpredictable breakdowns?) Which is more manifestly complex, nuclear reactors or windmill turbines?
  2. Assess the concealed complexity. For example, do the operating procedures of windmills conceal complexity? Do nuclear reactors conceal complexity in the complicated regulation process that has developed between manufacturers and the Nuclear Regulatory Commission? (Perhaps complexity is concealed in the divergence between formal and informal regulatory procedures, the latter having evolved as the NRC has been "captured" by reactor manufacturers.) See Ford (1981).
  3. Technological Imperative: Does the technology redefine or displace basic human needs or basic values? Does it require that we adapt ourselves to it?
  4. Reverse Adaptation: Does the technology require reverse adaptation? If yes, are there any viable "work around" strategies that could be implemented to better align the technology's needs with our own?

What you are going to do

Exercise One: Construct a Socio-Technical System Grid

  • Choose a test case from above. (The alternatives include Cogentrix, Copper Mining, Windmills, and Laptops.)
  • Read the module, Socio-Technical Systems in Professional Decision-Making, and modify the STS table for Puerto Rico to fit the test case you are using.
  • Identify the values embedded in the technology of your test case and the STS you have modeled.
  • Identify any possible value mismatches between the technology to be introduced and the underlying STS.
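The grid-and-mismatch steps above can be sketched as a small data structure. Everything here is hypothetical: the component names, the embedded values, and the Cogentrix-style entries are invented for illustration, and the real STS table comes from the m14025 module.

```python
# Minimal sketch of an STS grid with a naive value-mismatch check.
# All component names and values are hypothetical examples, loosely
# inspired by the Cogentrix test case.

sts_grid = {
    "physical surroundings": {"values": {"environmental integrity"}},
    "people and groups":     {"values": {"health", "local control"}},
    "procedures":            {"values": {"public participation"}},
}

proposed_tech = {
    "name": "coal cogeneration plant",
    "embedded values": {"efficiency", "centralized control"},
}

# A "mismatch" in this sketch is simply a value the technology embeds
# that no STS component shares, or an STS value the technology ignores.
sts_values = set().union(*(c["values"] for c in sts_grid.values()))
unshared = proposed_tech["embedded values"] - sts_values
ignored = sts_values - proposed_tech["embedded values"]

print("values the STS does not share:", sorted(unshared))
print("STS values the technology ignores:", sorted(ignored))
```

A set difference is, of course, far cruder than the judgment the exercise asks for; the sketch only shows what "comparing embedded values against the underlying STS" means mechanically.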

Exercise Two

  • Select two of the lenses outlined above.
  • Examine your test case under the first lens by answering the questions. Give a global assessment of whether your test case technology is acceptable under the lens.
  • Examine your test case under the second lens by answering the questions. Give a global assessment of whether your test case technology is acceptable under this second lens.
  • Compare the results of the two lenses. Discuss areas of divergence between the two lenses. Discuss the areas of convergence.

Preparatory Questions and Module Worksheet

Technology Choice Preparatory Questions

Media File: Questions to Prepare for Technology Choice Module.docx

Technology Choice Worksheet

Media File: Worksheet on Technological Choice.docx

STS Presentation for Technological Choice

Media File: Technological Choice Module.pptx

Table displaying components of STSs

Media File: Socio-technical Systems.docx

Presentation on Capabilities Approach

Media File: Capabilities Approach.pptx

The Legal Environment: Civil and Criminal Responsibility

Media File: Moral and Legal Responsibility.pptx

Technology Choice Jeopardies

Technological Choice Cases Jeopardy

Media File: Technological Choice Cases.pptx

Socio Technical Systems Jeopardy

Media File: Socio Technical Systems.pptx

Jeopardy and Responsibility

Media File: Jeopardy_Responsibility.pptx


Evaluate the Lenses

  • Which of the three lenses presented in this module would you eliminate?
  • Which lens did you find most helpful? Why?
  • Would you recommend a new lens? What is it?

Muddy Point

  • What was the most obscure or muddiest point? (What didn’t make sense to you? What did you find objectionable?)
  • What was the strongest point of this module? What did you learn? Will you be able to put it to use?


  1. Feenberg, Andrew. (2002). Transforming Technology: A Critical Theory Revisited. Oxford, UK: Oxford University Press.
  2. Feenberg, Andrew. (1999). Questioning Technology. London: Routledge.
  3. Ford, D. (1981). A Reporter At Large: Three Mile Island. In The New Yorker, April 6, 1981: 49-106.
  4. Heilbroner, R.L. (2009). Do Machines Make History? In Technology and Society: Building Our Sociotechnical Future, Johnson, D.G. and Wetmore, J.M., (Eds.). Cambridge, Mass: MIT Press: 97-106.
  5. Hickman, L. (1990). John Dewey’s Pragmatic Technology. Bloomington, IN: Indiana University Press: 140-153.
  6. Hickman, L. (2001). Philosophical Tools for Technological Culture: Putting Pragmatism to Work. Bloomington, IN: Indiana University Press.
  7. Huff, C. and Finholt, T. (1994). Social Issues In Computing: Putting Computing in its Place. New York: McGraw-Hill.
  8. Kuhn, T. (1970). The Structure of Scientific Revolutions, 2nd Edition. Chicago, IL: University of Chicago Press.
  9. Mason, J. (1979). The accident that shouldn't have happened: An analysis of Three Mile Island. In IEEE Spectrum, November 1979: 33-42.
  10. Perrow, C. (1984). Normal Accidents: Living With High-Risk Technologies. Basic Books.
  11. Pinch, T.J. and Bijker, W. (2009). The Social Construction of Facts and Artifacts. In Technology and Society: Building Our Sociotechnical Future, Johnson, D.G. and Wetmore, J.M., (Eds.). Cambridge, Mass: MIT Press: 107-139.
  12. Reason, J. (1990). Human Error. Cambridge, UK: Cambridge University Press.
  13. Sismondo, S. (2004). An Introduction to Science and Technology Studies. Oxford, UK: Blackwell Publishing: 51-52.
  14. Trent, March. (1992). The AES Corporation: Management Institute for Environment and Business. In Ethical Issues in Business: A Philosophical Approach, 5th Edition. Donaldson, T. and Werhane, P. (Eds.). Upper Saddle River, NJ: Prentice Hall: 424-440.
  15. White, Leslie. (1949). The Science of Culture. New York: Farrar, Straus and Giroux, 366.
  16. Winner, L. (2009). Do Artifacts Have Politics? In Technology and Society: Building Our Sociotechnical Future, Johnson, D.G. and Wetmore, J.M., (Eds.). Cambridge, Mass: MIT Press: 209-226.
  17. Winner, L. (1978). Autonomous Technology: Technics-out-of-Control as a Theme in Political Thought. Cambridge, Mass: MIT Press paperback edition.


Practical Lenses for Socio-Technical Systems

Media File: Practical Lenses for Socio-Technical Systems.pptx
