
    Lens questions

  1. What is the historical process that has culminated in the current form the technology has taken? Specifically...
  2. Did users and non-experts participate in the process of generating alternative interpretations of the technology? How did they participate? Do these alternatives embody the values and interests of stakeholders in their designs? (This corresponds to interpretive flexibility.)
  3. Did users and non-experts participate in the closing of interpretive flexibility by helping to select "winners" from among the competing forms? How did they participate, and how does the design of the "winners" reflect their interests and values? (This corresponds to the closing of interpretive flexibility.)
  4. What are the final criteria embodied in the closed and fixed technological design? Did a broad range of stakeholders participate in establishing these criteria? Did these criteria play a direct role in selecting the final design from among the initial variants? Does the final design or "black box" adequately reflect the needs, interests, and values of the broad range of stakeholders affected by this technology? (This reflects the final or closure stage.)

Lens three: technologies and politics

Background from Autonomous Technology by Langdon Winner (from Hickman, John Dewey’s Pragmatic Technology, 148 and following.)

    Winner starts by criticizing the “straight-line” notion of tool use: tools serve ends bestowed on them by the user. There are four reasons why this doesn’t work:

  1. Manifest Complexity : The technology or tool displays complexity such as “tightly coupled systems” and “non-linear” chains of causality. For example, nuclear reactors are highly complicated and, therefore, difficult to control. Because they are tightly coupled, they are subject to what Perrow calls “normal accidents,” where minor failures produce a chain reaction of other failures because these failures cannot be isolated. (An everyday example helps here. When systems are tightly coupled, prediction is difficult because the systems interact in unexpected ways and a breakdown in one part quickly spreads to others. Think of a tightly coupled schedule: when one part changes--you are called into work because a co-worker didn’t show up--the change spills over into other parts of your schedule--you do poorly on the test the next day because working left you no time to study.)
  2. Concealed Complexity : Technologies are frequently backed by decision-making procedures that are opaque to independent scrutiny. For example, the procedure by which nuclear reactors are regulated is extraordinarily complicated. This makes it difficult to assess independently whether these procedures guarantee that only safe reactor designs will be approved by the regulatory process.
  3. Technological Imperative : Technologies transform and redefine human needs. Machine needs become imperative and trump human needs. For example, food, clothing, and shelter (basic human needs) are displaced by machine requirements such as electrical power, highways, bridges, sewers, and other infrastructure. Technologies (in the form of complicated machines) have requirements that tend to push aside our own needs, values, and interests. We build infrastructure to respond to these machine needs. The tool no longer serves us; we serve the tool.
  4. Reverse Adaptation : Because complex technologies redefine needs (and values), we are forced to adapt ourselves (and our needs) to them. (It is assumed that we cannot adapt them to our needs because of manifest and concealed complexity.)





Source:  OpenStax, Civis project - uprm. OpenStax CNX. Nov 20, 2013 Download for free at http://cnx.org/content/col11359/1.4
