
Good Computing Reports for Computer Ethics

Module by: William Frey

Summary: This module presents a version of the Social Impact Statement exercise developed by Charles Huff in "Practical Guidance for Teaching the Social Impact Statement (SIS)" from the Proceedings of the 1996 Symposium on Computers and the Quality of Life. This adaptation has been used successfully at the University of Puerto Rico at Mayaguez for three years. Students, working in small groups of 4-5, select a computing system to study in detail with the objective of identifying and mitigating ethical and social problems that stem from conflicts in the values embedded in the system and its surrounding societal context. This module is based on the assumptions that computing systems embody values and that there is an analogy between designing and problem solving in ethics. This module is being developed as part of an NSF-funded project, "Collaborative Development of Ethics Across the Curriculum Resources and Sharing of Best Practices," NSF SES 0551779.

Good Computing Reports (from Charles Huff, "Practical Guidance for Teaching the Social Impact Statement (SIS)," Proceedings of the 1996 Symposium on Computers and the Quality of Life, pp. 86-89. New York: ACM Press.)

Key Links

1. Materials from Magic Copy Center: Good Computing: A Virtue Ethics Approach to Computer Ethics, Chapter Two, Huff/Frey

Objectives



1. To uncover ethical surprises in major design projects. (These are ethical issues—potential ethical problems—that are embedded in the design project.)

2. To communicate effectively to the client the importance of considering ethical issues and problems associated with design projects upstream in the design process. (This means raising ethical issues from the beginning of the design process and continuously throughout it, rather than waiting until the design is finished to raise them.)

Four Presuppositions

1. Socio-technical systems and their components (hardware, software, physical surroundings, people/groups/roles, procedures, laws, data/data structures) embody values.

2. Computing technologies (CTs) are always embedded in socio-technical systems.

3. CTs instrument (magnify or augment) human action.

4. There is a close analogy between solving ethical and design problems:

The table below provides a summary of this analogy, which helps to introduce the Software Development Cycle. For a more complete account of this analogy, see Carolyn Whitbeck (the link is provided above in this module).

Table 1: Analogy between ethical and design problems

Design Problem: Construct a prototype that optimizes (or satisfices) designated specifications.
Ethical Problem: Construct a solution that realizes ethical values such as justice, responsibility, reasonableness, respect, and safety.

Design Problem: Conflicts between specifications are resolved through integration.
Ethical Problem: Conflicts between values (moral vs. moral, moral vs. non-moral) are resolved, where possible, through integration.

Design Problem: Designed products or services must be implemented over background constraints.
Ethical Problem: Ethical solutions must be implemented over resource, interest, and technical constraints.

Normative Methodology:

The Software Development Cycle (SDC) arises out of the analogy between design and problem solving in ethics. The core sections of Good Computing Reports are based on its four stages of problem specification, solution generation, solution testing, and solution implementation.

Problem specification: This stage requires specifying the socio-technical system that influences the software in question, recognizing the values embodied in the system, and uncovering the conflicts between these values, for example, between efficiency and safety.

Solution generation: This stage requires resolving value conflicts by changing the design or requirements, or by recommending change to other aspects of the socio-technical system. Brainstorming forms an essential part of this stage.

Solution Testing: The solutions developed in the second stage must be tested in terms of ethics tests (reversibility, harm/beneficence, and public identification) and a code test.

Solution implementation: The chosen solution must be examined in terms of how well it responds to various situational constraints that could impede its implementation. What will be its costs? Can it be implemented within necessary time constraints? Does it honor recognized technical limitations or does it require pushing these back through innovation and discovery? Does it comply with legal and regulatory requirements? Finally, how does it respond to the general social and political conditions surrounding implementation?

Empirical Methodology:

This stage employs various methods for collecting and reviewing data including (1) constructing questionnaires and surveys, (2) holding open and structured interviews with clients and stakeholders, (3) employing methods of participatory observation including on-site visits and day-in-the-life scenarios, and (4) conducting archival research that includes online searches and reading operating manuals.

Basic Format of the Report: This report has nine sections: (a) executive summary, (b) problem specification, (c) solution generation, (d) solution testing, (e) solution implementation, (f) documenting ethical data collection, (g) reader's guide, (h) methodological appendix, (i) group self-evaluation. These sections are set forth in the following description, which includes tables and matrices.

1. Executive Summary (from Huff, "Practical Guidance"):

A one- to two-page summary of the report that includes (1) a description of the report and of the system, (2) a discussion of the significant issues discovered, and (3) a list of the top recommendations, highlighted on the page and keyed to page numbers in the longer report. The idea is to provide a summary that an executive can read in 5 to 10 minutes to get the basic information in the report.

2. SDC: Problem Specification

2a. Identify the values embedded in the system and the STS component in which they are specifically located. Use this table and then add a detailed written explanation.

Table 2: Value/STS component matrix (fill in each cell)

Columns (STS components): Hardware | Software | Physical Surroundings | People, Groups, & Roles | Procedures | Laws & Regulations | Data/Data Structures

Rows (values): Safety (Responsibility) | Property (Respect) | Privacy (Respect) | Free Speech (Respect) | Equity & Access (Justice)

2b. Specify the problem using the following problem classification matrix. More information can be found at www.computingcases.org or in Good Computing: A Virtue Approach to Computer Ethics.

Table 3: Problem classification matrix

Problem Type: Factual
Solution Outline: Type and mode of gathering information (Archival Research, Interview, Participatory Observation, Survey)

Problem Type: Conceptual
Solution Outline: Concept in dispute and method for agreeing on its definition

Problem Type: Value Conflict
Sub-Types: Moral vs. moral; Non-moral vs. moral; Non-moral vs. non-moral
Solution Outline: Value integrative; Partially value integrative; Trade off

Problem Type: Social Justice / Value Realization
Solution Outline: Strategy for maintaining integrity; Strategy for restoring justice; Value integrative design strategy

Problem Type: Intermediate Moral Value
Sub-Types: Safety, Property, Privacy, Free Speech, Equity & Access
Solution Outline: Realizing value; Removing value conflicts; Prioritizing values for trade-offs

2c. The problem classification matrix must also be accompanied by a verbal explanation of your problem classification.

2d. Be sure that your problem description corresponds with the elements of the above matrix.

2e. Be sure that you have shown that the solution you eventually propose responds to the components of the problem you have specified.

3. SDC: Solution Generation

Brainstorm solutions to the problem specified above: (a) describe the brainstorming methodology you employed, (b) include the preliminary brainstorming list and provide at least ten (10) solutions, (c) follow with the refined list, (d) explain the process used to refine solutions, and (e) briefly describe how the solutions on the refined list respond to the components identified in the problem specification stage.

4. SDC: Solution Evaluation

Do a comparative evaluation of the solutions you designed in the previous stage. Structure your evaluation around the following matrix. (More information can be found at www.computingcases.org or in Good Computing: A Virtue Approach to Computer Ethics.)

Table 4: Solution evaluation matrix

Tests (columns), with the question each test poses:

  • Reversibility or Rights: Is the solution reversible with stakeholders? Does it honor basic rights?
  • Harms/Beneficence or Net Utility: Does it produce the best benefit/harm ratio or maximize utility?
  • Value: Are moral values realized or frustrated? Are value conflicts resolved or exacerbated?
  • Code: Does the recommendation violate code provisions?
  • Global Feasibility: What resource, technical, or interest constraints could impede implementation?

Solutions (rows, each to be evaluated under every test): Best recommendation; Second best recommendation; Status quo

4a. Accompany this matrix with an in-depth verbal comparison of these alternatives. Recapitulate how each alternative fares under each test. Then provide a justification for each recommendation.

5. SDC: Solution Implementation

Fill out a Feasibility Matrix. Then discuss the obstacles that could impede the implementation of your solution and how you plan to overcome them. You may want to formulate and have ready a Plan B in case these obstacles prove insurmountable. (More information can be found at www.computingcases.org or in Good Computing: A Virtue Approach to Computer Ethics.)

Table 5: Feasibility Matrix

Resource Constraints: Time; Cost; Available materials, labor, etc.
Technical Constraints: Applicable technology; Manufacturability
Interest Constraints: Personalities; Organizational; Legal; Social, Political, Cultural

6. Discuss measures taken to avoid ethical problems that could arise in carrying out a Good Computing analysis. Use the following table to help identify the pertinent topics.

Table 6: Avoiding ethical problems in data collection, analysis, and reporting

Collection:

  • Establish a client-professional relationship
  • Get client consent
  • Respect: listen to the client, dress professionally, and thank the interviewee/client for their time
  • Confidentiality: (a) explain how you are going to use information; (b) design measures to prevent unauthorized access; (c) destroy raw data after it has been analyzed

Analysis:

  • All information, claims, and solutions need to be tested, triangulated, and validated
  • Avoid misconduct: the three sins against academic integrity are plagiarism, falsification, and fabrication
  • Minimize bias and avoid loaded and complex questions
  • Triangulate, that is, use different methods to collect the same data to overcome the limits and biases associated with each data collection method

Reporting:

  • Intelligibility: check for coherence between problems and recommendations; prepare a clear executive summary; document and attribute; discuss and justify methodology
  • Comprehensiveness: (a) scope out the topic carefully; (b) avoid the extremes of covering too much and too little; (c) be open about the limits and boundaries of the investigation
  • Objectivity and impartiality: be sure to report each stakeholder perspective
  • Deliver bad news proactively by minimizing blame language, presenting solutions, and providing clear and comprehensive justifications of recommendations

7. A Reader's Guide: an annotated bibliography of materials that could provide the client with detailed background on the Social Impact Analysis

8. A Methodological Appendix that includes the following:

  • a) Rationale for the particular methods chosen
  • b) Detailed and concrete descriptions of those methods
  • c) Notes on individual interviews, respecting privacy and confidentiality
  • d) Description of field observation, including significant events looked for, significant events discovered, changes made in the observation protocol, etc.
  • e) DLSs, or day-in-the-life scenarios, along with a rationale for the choice of perspectives and time frames, the information from which they were compiled (e.g., interviews, manuals, etc.), and finally, the detailed scenarios themselves.

9. Group Self-Evaluations

  • Repeat Group Values and provide an objective assessment of how well these have been met during the semester.
  • List Group Work Pitfalls and describe measures taken by your group during the semester to prevent or mitigate them. Assess objectively your success in preventing or mitigating them.
  • Discuss the obstacles to successful group work that arose during the semester and the measures your group designed to overcome them. Objectively assess these measures. Would your group recommend these practices as "Best Practices" to other groups? Are they original? Robust?
  • Evaluate how effectively the team members worked together, referring to the "Team Member Evaluation Form" (see the form and ten criteria just below).

Check List

  1. ____Group Goals (copy)
  2. ____Preliminary Topic Report (copy)
  3. ____Final GCR Presentation (copy in PowerPoint format or online display)
  4. ____Final GCR written report (10-20 pages) due on December 8, 2006
  5. ____Group Self-Evaluation including Team Member Rating Sheets
  6. ____Portfolio including Hughes Solution Evaluation Matrix, Virtue Table, and Right Table, and Presentation Evaluation Prepared by Instructor

I certify that these materials have been prepared by those who have signed below and no one else. I also certify that we have not plagiarized any material but have given due acknowledgement to all the sources used. All who sign below and whose names are included on the title page of this report have participated fully in the preparation of this project and are equally and fully responsible for its results.


Figure 1: This evaluation form will be used by the instructor to give your group preliminary feedback on the presentation and to identify issues that need to be integrated into the final report.
Presentation Evaluation Form
Media File: Presentation Evaluation Form.doc
Figure 2: This form needs to be filled out by each team member and turned in confidentially to the instructor. Be sure to evaluate each team member, including yourself.
Team Member Rating Sheet
