# Connexions



#### Affiliated with

This module is included in the lens "Florida Orange Grove Textbooks" by the Florida Orange Grove, as part of the collection "Business Ethics," and in the lens "Collaborative Development of Ethics Across the Curriculum Resources and Sharing of Best Practices" (EAC Toolkit) by the University of Puerto Rico at Mayaguez - College of Business Administration.


Inside Collection (Course):

Course by: Chuck Bearden.

# Rowspan Table - Feasibility Constraints

Module by: William Frey, Jose A. Cruz-Cruz.

Summary: Toysmart, a dot-com that sold educational toys for children, went bankrupt in June 2000. The ethical issues surrounding e-business come into sharp focus as one reviews the creation, operation, and dissolution of this corporation. Student exercises in business and computer ethics form the content of this module, which links to the Toysmart case narrative displayed at the Computing Cases website (http://computingcases.com). Computing Cases is an NSF-funded project devoted to developing and displaying case studies in computer ethics in an online format. Toysmart, along with nine other cases, will be published by Jones & Bartlett as Good Computing: A Virtue Approach to Computer Ethics, a textbook in computer ethics. Exercises in this module will provide students with frameworks that allow them to identify key facts, separate relevant from irrelevant materials, and draw from comprehensive historical case descriptions the material needed to inform decision-making and problem-solving. Making use of an analogy between ethics and design in problem-solving, students will (1) specify problems using socio-technical analysis, (2) design solutions to these problems, (3) employ ethics tests to compare, rank, and evaluate solutions, and (4) use a feasibility test to anticipate obstacles to solution implementation. This module is being developed as part of a project funded by the National Science Foundation, "Collaborative Development of Ethics Across the Curriculum Resources and Sharing of Best Practices," NSF-SES-0551779.


## Introduction

In this module you will study a real world ethical problem, the Toysmart case, and employ frameworks based on the software development cycle to (1) specify ethical and technical problems, (2) generate solutions that integrate ethical value, (3) test these solutions, and (4) implement them over situation-based constraints. This module will provide you with an opportunity to practice integrating ethical considerations into real world decision-making and problem-solving in business and computing. This whole approach is based on an analogy between ethics and design (Whitbeck).

Large real world cases like Toysmart pivot around crucial decision points. You will take on the role of one of the participants in the Toysmart case and problem-solve in teams from one of three decision points. Problem-solving in the real world requires perseverance, moral creativity, moral imagination, and reasonableness; one appropriates these skills through practice in different contexts. Designing and implementing solutions requires identifying conflicting values and interests, balancing them in creative and dynamic solutions, overcoming technical limits, and responding creatively to real world constraints.

Each decision point requires that you take up the position of a participant in the case and work through decision-making frameworks from his or her perspective. You may be tempted to back out and adopt an evaluative posture from which to judge the participants. Resist this temptation. This module is specifically designed to give you practice in making real world decisions. These skills emerge when you role play from one of the standpoints within the case. You will learn that decision-making requires taking stock of one’s situation from within a clearly defined standpoint and then accepting responsibility for what arises from within that standpoint.

Cases such as Toysmart are challenging because of the large amount of information gathering and sorting they require. Moral imagination responds to this challenge by providing different framings that help to filter out irrelevant data and structure what remains. Framing plays a central role in problem specification. For example, Toysmart could be framed as the need to develop more effective software to help negotiate the exchange of information online; in this case, a software programming expert would be brought in to improve P3P programs. Or it could be framed as a legal problem that requires amending the Bankruptcy Code. What is important at this stage is that you and your group experiment with multiple framings of the case around your decision point. This opens up avenues of solution that would remain closed under a single framing.

Tackling large cases in small teams also helps develop the communication and collaboration skills that group work requires. Take time to develop strategies for dividing the workload among your team members. The trick is to distribute it equally but, at the same time, to assign tasks according to the different abilities of your team members. Some individuals are better at research while others excel in interviewing or writing. Also, make sure to set aside time when you finish for integrating your work with that of your teammates. Start by quickly reviewing the information available on the case. This is called "scoping the case." Then formulate specific questions to focus further research on information relevant to your problem-solving efforts. This includes information pertinent to constructing a socio-technical analysis, identifying key "embedded" ethical issues, and uncovering existing best and worst practices.

A case narrative, STS (socio-technical system) description, and two ethical reflections have been published at http://computingcases.org. This module also links to websites on bankruptcy and privacy law, the Model Business Corporation Act, consumer privacy information, and the TRUSTe website.

### Toysmart Narrative

Toysmart was a Disney-supported company that sold educational toys online from December 1998 to May 2000. After disappointing Christmas sales in 1999, Disney withdrew its financial support, and the greatly weakened dot-com lasted less than a year afterward. On May 22, 2000, Toysmart announced that it was closing down and brought in a consulting firm, The Recovery Group, to evaluate its assets, including a customer database of 260,000 profiles, each worth up to $500. Fierce opposition emerged when Toysmart placed ads in the Wall Street Journal and the Boston Globe to sell this database. Customer interest groups pointed out that Toysmart had promised not to share customer information with third parties. Toysmart also prominently displayed the TRUSTe seal, which testified further to the company's obligations to respect customer privacy and security. Selling this data to third parties would break Toysmart's promises, violate TRUSTe policies, and undermine consumer confidence in the security and privacy of online transactions. Toysmart's obligations to its customers thus came into direct conflict with its financial obligations to its investors and creditors. TRUSTe reported Toysmart's intention to sell its database to the FTC (Federal Trade Commission), which on July 10, 2000, filed a complaint "seeking injunctive and declaratory relief to prevent the sale of confidential, personal customer information" (FTC article). Toysmart's promise never to share customer PII with third parties provided the legal foundation for this complaint. According to the FTC, Toysmart "violated Section 5 of the FTC Act by misrepresenting to customers that personal information would never be shared with third parties, then disclosing, selling, or offering that information for sale." Finally, because it collected data from children under 13 who entered various contests offered on its website, Toysmart was also cited for violating the Children's Online Privacy Protection Act (COPPA).
The FTC reached a settlement with Toysmart. The bankrupt dot-com had to "file an order in the bankruptcy court prohibiting the sale of its customer data as a 'stand-alone asset'." In other words, the rights bundled into the liquidation and sale of Toysmart did not include the liberty of buyers to dispose of the asset in whatever way they saw fit. According to the negotiated settlement, buyers were bound by the commitments and promises of the original owners: Toysmart creditors "can sell electronic assets only if the purchasing company abided by the same privacy policy." In essence, the FTC asked Toysmart creditors to honor the spirit, if not the letter, of Toysmart's original promise to its customers not to sell their PII to third parties. Creditors now had to guarantee that (1) the buyer had the same basic values as Toysmart (for example, a commitment to selling quality, educational toys), (2) the buyer would use the data in the same way that Toysmart had promised to use it when collecting it, and (3) the buyer would not transfer the information to third parties without customer consent. In this way, the settlement proposed to protect Toysmart customers' privacy interests while allowing creditors to recover their losses through the sale of the bankrupt company's "crown jewel," its customer database. On August 17, 2000, the Federal Bankruptcy Court declined to accept the Toysmart-FTC settlement. Instead, it argued that Toysmart and the FTC should wait to see whether any parties willing to buy the database would come forward. The Bankruptcy Court felt that potential buyers would be scared off by the FTC suit and the pre-existing obligations created by Toysmart's promises and TRUSTe standards. Should a buyer come forth, the court would evaluate the offer in terms of the FTC-Toysmart settlement designed to honor the privacy and security commitments made to Toysmart customers. A final settlement was reached on January 10, 2001.
When a buyer did not come forward, Buena Vista Toy Company, a Disney Internet subsidiary that was also a major Toysmart creditor, agreed to buy the database for $50,000 with the understanding that it would be immediately destroyed. The database was then deleted, and affidavits were provided to this effect.

### Here are some constraints that outline your decision

• As a member of the Creditors' Committee, you have a fiduciary duty to Toysmart creditors in working to distribute fairly the remaining Toysmart assets. This would, all things being equal, lead to recommending selling the Toysmart customer database.
• There are some provisions in the bankruptcy code that may require or allow overriding fiduciary duties given prior legal commitments made by Toysmart. These commitments, in the form of strong privacy guarantees made to customers by Toysmart on its webpage, may constitute an "executory contract." See the Legal Trail table in the Toysmart case narrative and also Larren M. Nashelsky, "On-Line Privacy Collides With Bankruptcy Creditors," New York Law Journal, New York Law Publishing Company, August 28, 2000.
• Finally, Nashelsky makes an interesting argument. While deontological considerations would require setting aside creditor interests and honoring Toysmart privacy promises, a justice-based argument would recommend a compromise. Bankruptcy proceedings start from the fact that harm (financial) has been done. Consequently, the important justice consideration is to distribute fairly the harms involved among the harmed parties. Harm distributions are correlated with benefit distributions. Because Toysmart customers benefited from Toysmart offerings, they should also bear a share of the harms produced when the company goes bankrupt. This requires that they allow the distribution of their PII under certain conditions.

### Things to consider in your decision-making

• How do you balance your obligations to PAN with those to other Toysmart creditors as a member of the Creditors' Committee?
• How should you approach the conflict between honoring Toysmart promises and carrying out Creditor Committee fiduciary duties? Do you agree with Nashelsky's argument characterized above?
• Should the Bankruptcy Code be changed to reflect issues such as these? Should privacy promises be considered an “executory contract” that overrides the duty to fairly and exhaustively distribute a company's assets?
• Finally, what do you think about the FTC's recommendation? The Bankruptcy Court's response? The final accommodation between Toysmart and Buena Vista Toy Company?

## What you will do ...

In this section, you will learn about this module’s exercises. The required links above provide information on the frameworks used in each section. For example, the Socio-Technical System module provides background information on socio-technical analysis. The "Three Frameworks" module provides a further description of the ethics tests, their pitfalls, and the feasibility test. These exercises will provide step by step instructions on how to work through the decision points presented above.

## Exercise One: Problem Specification

In this exercise, you will specify the problem using socio-technical analysis. The STS section of the Toysmart case narrative (found at Computing Cases) provides a good starting point. In the first table, enter the information from the Toysmart case materials pertinent to the general components of an STS: its hardware, software, physical surroundings, people/groups/roles, procedures, laws, and data. Some examples taken from the STS description at Computing Cases are provided to get you started. Then, using the second table, identify the values that are embedded in the different components of the STS. For example, PICS (platforms for internet content selection) embody the values of security and privacy. Finally, using the data from your socio-technical analysis, formulate a concise problem statement.

### Exercise 1a:

Read the socio-technical system analysis of the Toysmart case at http://computingcases.org. Fill in the table below with elements from this analysis that pertain to your decision point.

| Hardware | Software | Physical Surroundings | People/Groups/Roles | Procedures | Laws, Codes, Regulations | Data and Data Structures |
|---|---|---|---|---|---|---|
| Holt Education Outlet | Platforms for Internet Content Selection | Cyber Space | Toysmart the corporation | Buying Toys Online | COPPA | Toysmart Customer Database |

### Instructions for Table 1:

1. Go to http://computingcases.org and review the STS description provided for the Toysmart case.
2. Pull out the elements of the STS description that are relevant to your decision point. List them under the appropriate STS component in the above table.
3. Think about possible ways in which these components of the Toysmart STS interact. For example, what kinds of legal restrictions govern the way data is collected, stored, and disseminated?
4. Develop your STS table with an eye to documenting possible ethical conflicts that can arise and are relevant to your decision point.
| Value Embedded | PICS (Platforms for Internet Content Selection) | P3P (Platform for Privacy Preferences) | SSLs (Secure Socket Layers) that encrypt pages asking for SS numbers |
|---|---|---|---|
| Security | Embodies privacy and security by filtering objectionable data; security selected over free speech. | Integrates property with security and privacy by converting information into property. | Realizes and supports security by sealing off domains of information. |
| Privacy | Embodies privacy and security by filtering objectionable data; security selected over free speech. | Integrates property with security and privacy by converting information into property. | Realizes and supports privacy by sealing off domains of information. |
| Property | | Integrates property with security and privacy by converting information into property. | Realizes and supports property by restricting access (intellectual property protected by excluding non-authorized access). |
| Free Speech | Interferes with free speech by filtering content; content can be filtered without the recipient's awareness. | Facilitates free speech by permitting information exchange on the model of property exchange, but limits exchange by assigning it a price. | Restricts access. |
| Justice (Equity and Access) | Could be used to restrict access to ideas by filtering; thus it could cut off the flow of information into the intellectual commons. | Facilitates access by permitting information exchange on the model of property exchange, but limits exchange by assigning it a price. | Because it restricts access to a domain, it can be used to reduce or cut off the flow of information into the intellectual commons. |

### Exercise 1b

Examine the values embedded in the STS surrounding this decision point. Locate your values under the appropriate component in the Toysmart STS. For example, according to the STS description for Toysmart found at Computing Cases, the software programs prominent in this case embody certain values; SSLs embody security and privacy, P3P property, and PICS privacy. Next, look for areas where key values can come into conflict.

| Value | Hardware | Software | Physical Surroundings | People/Groups/Roles | Procedures | Laws/Codes/Regulations | Data/Data Structures |
|---|---|---|---|---|---|---|---|
| Security | | | | | | | |
| Privacy | | | | | | | |
| Property | | | | | | | |
| Justice (Equity/Access) | | | | | | | |
| Free Speech | | | | | | | |

### Instructions for Table 2:

1. This module links to another Connexions module, Socio-Technical Systems in Professional Decision-Making. There you will find short profiles of the values listed in the above table: security, privacy, property, justice, and free speech. These profiles will help you to characterize the values listed in the above table.
2. The second ethical reflection in the Toysmart case narrative (at Computing Cases) also contains a discussion of how property comes into conflict with privacy.
3. Identify those components of the Toysmart STS that embody or embed value. For example, list the values realized and frustrated by the software components discussed in the Toysmart case in the STS description.
4. Look for ways in which different elements of the STS that embed value can interact and produce value conflicts. These conflicts are likely sources for problems that you should discuss in your problem statement and address in your solution.

### Exercise 1c:

Write out the requirements (ethical and practical) for a good solution. Identify the parts of the STS that need changing. Then, develop a concise summary statement of the central problem your decision point raises. As you design solutions to this problem, you may want to revise this problem statement. Be sure to experiment with different ways of framing this problem.

## Exercise Two: Solution Generation

### Generate solutions to the problem(s) you have specified in Exercise 1. This requires that...

• each member of your group develop a list of solutions,
• the group combines these individual lists into a group list, and...
• the group reduces this preliminary list to a manageable number of refined and clarified solutions for testing in the next stage.

### 1. Solution generation requires proficiency in the skills of moral imagination and moral creativity.

Moral imagination is the ability to open up avenues of solution by framing a problem in different ways. Toysmart could be framed as a technical problem requiring problem-solving skills that integrate ethical considerations into innovative designs. Moral creativity is the ability to formulate non-obvious solutions that integrate ethical considerations over various situational constraints.

### 2. Problems can be formulated as interest conflicts. In this case different solution options are available.

• Gather Information. Many disagreements can be resolved by gathering more information. Because this is the easiest and least painful way of reaching consensus, it is almost always best to start here. Gathering information may not be possible because of different constraints: there may not be enough time, the facts may be too expensive to gather, or the information required goes beyond scientific or technical knowledge. Sometimes gathering more information does not solve the problem but allows for a new, more fruitful formulation of the problem. Harris, Pritchard, and Rabins in Engineering Ethics: Concepts and Cases show how solving a factual disagreement allows a more profound conceptual disagreement to emerge.
• Nolo Contendere. Nolo contendere is Latin for not opposing or contending. Your interests may conflict with your supervisor's, but he or she may be too powerful to reason with or oppose, so your only choice is to give in to his or her interests. The problem with nolo contendere is that non-opposition is often taken as agreement. You may need to document (e.g., through memos) that you disagree with a course of action and that your choosing not to oppose it does not indicate agreement.
• Negotiate. Good communication and diplomatic skills may make it possible to negotiate a solution that respects the different interests. Value integrative solutions are designed to integrate conflicting values. Compromises allow for partial realization of the conflicting interests. (See the module, The Ethics of Team Work, for compromise strategies such as logrolling or bridging.) Sometimes it may be necessary to set aside one's interests for the present with the understanding that these will be taken care of at a later time. This requires trust.
• Oppose. If nolo contendere and negotiation are not possible, then opposition may be necessary. Opposition requires marshalling evidence to document one's position persuasively and impartially. It makes use of strategies such as leading an "organizational charge" or "blowing the whistle." For more on whistle-blowing, consult the discussion in the Hughes case at Computing Cases.
• Exit. Opposition may not be possible if one lacks organizational power or documented evidence. Nolo contendere will not suffice if non-opposition implicates one in wrongdoing. Negotiation will not succeed without a necessary basis of trust or a serious value integrative solution. As a last resort, one may have to exit from the situation by asking for reassignment or resigning.

### 3. Solutions can be generated by readjusting different components of the STS.

• Technical Puzzle. If the problem is framed as a technical puzzle, then solutions would revolve around developing designs that optimize both ethical and technical specifications, that is, resolve the technical issues and realize ethical value. In this instance, the problem-solver must concentrate on the hardware and software components of the STS.
• Social Problem. If the problem is framed as a social problem, then solutions would revolve around changing laws or bringing about systemic reform through political action. This would lead one to focus on the people/groups/roles component (working to change social practices) or the legal component.
• Stakeholder Conflict. If the problem is framed as a conflict between different stakeholder interests, then the solution would concentrate on getting stakeholders (both individuals and groups) to agree on integrative or interest compromising solutions. This requires concentrating on the people/group/role component of the STS. (Note: A stakeholder is any group or individual with a vital interest at play in the situation.)
• Management Problem. Finally, if the problem is framed as a management problem, then the solution would revolve around changing an organization's procedures. Along these lines, it would address the (1) fundamental goals, (2) decision recognition procedures, (3) organizational roles, or (4) decision-making hierarchy of the organization. These are the four components of the CID (corporate internal decision) structure described in the “Ethical Reflections” section of the Toysmart case.
• Nota Bene: Financial issues are covered by the feasibility test in the solution implementation stage. As such, they pose side issues or constraints that enter not into the solution generation phase but into the solution implementation phase.

### 4. Brainstorming. Moral creativity, which involves designing non-obvious solutions, forms an essential part of solution generation. Here are some guidelines to get you started.

• Individually make out a list of solutions before the group meeting. Work quickly to realize a pre-established quota of five to ten solutions. After composing a quick first draft, revise the list for clarity only; make no substantial changes.
• Start the group brainstorming process by having the group review and assemble all the individual solutions. Do this quickly and without criticism. Beginning criticism at this stage will kill the creativity necessary for brainstorming and shut down the more timid (but creative) members of the group.
• Review the list and identify solutions that are identical or overlap. Begin the refining process by combining these solutions.
• Having reviewed all the brainstormed solutions, it is now time to bring in criticism. Begin by eliminating solutions with major ethical problems such as those that violate rights, produce injustices, or cause extensive harm.
• Identify but do not eliminate solutions that are ethical but raise serious practical problems. Do not initially eliminate an ethical solution because there are obstacles standing in the way of its implementation. Be descriptive. Identify and impartially describe the obstacles. Later, in the solution implementation stage, you may be able to design creative responses to these obstacles.
• Identify solutions that do not "fit" your problem statement. These require a decision. You can throw out the solution because it does not solve the problem or you can change the problem. If a solution does not fit the problem but, intuitively, seems good, this is a sign that you need to take another look at your problem statement.
• Don’t automatically reject partial solutions. For example, sending memos through email rather than printing them out and wasting paper may not solve the entire recycling problem for your company. But it represents a good, partial solution that can be combined with other partial solutions to address the bigger problem.
• Through these different measures, you will gradually integrate criticism into your brainstorming process. This will facilitate working toward a manageable, refined list of solutions for testing in the next stage.

### Exercise 3: Develop a Solution List

• Have each member of your team prepare a solution list and bring it to the next group meeting. Set a quota for this individual list, say, 5 to 10 solutions.
• Prepare a group list out of the lists of the individual members. Work to combine similar solutions. Be sure to set aside criticism until the preliminary group list is complete.
• Make use of the following table.
• Refine the group list into a manageable number of solutions for testing in the next stage. Combine overlapping solutions. Eliminate solutions that do not respond to the requirements and the problem statement that you prepared in the previous exercise. Eliminate solutions that violate important ethical considerations, i.e., solutions that violate rights, produce harms, etc.
• Check your refined solution list against your problem statement. If they do not match, either eliminate the solution or redefine the problem.
| Solution Ranking | Description of Solution | Justification (fits requirements, fits problem) |
|---|---|---|
| Best Solution | | |
| Second Best Solution | | |
| Third Best Solution | | |
| Fourth Best Solution | | |
| Fifth Best Solution | | |
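One way to keep a solution-evaluation matrix manageable is to record each candidate's test results in a small data structure and sort by an overall score. The Python sketch below is purely illustrative: the solutions and scores are hypothetical, and simple summation is only one of many ways to combine the ethics tests with feasibility.

```python
# Hypothetical solution-evaluation matrix: each candidate solution gets a
# score (1 = worst, 5 = best) on each ethics test plus global feasibility.
# The solutions and scores below are invented for illustration only.

solutions = [
    {"name": "sell database under original privacy terms",
     "harm_beneficence": 4, "reversibility": 4, "public_id": 3, "feasibility": 3},
    {"name": "destroy database outright",
     "harm_beneficence": 3, "reversibility": 5, "public_id": 5, "feasibility": 4},
    {"name": "sell database as stand-alone asset",
     "harm_beneficence": 2, "reversibility": 1, "public_id": 1, "feasibility": 5},
]

TESTS = ("harm_beneficence", "reversibility", "public_id", "feasibility")

def overall(sol):
    """Total score across all tests; a tie could be broken by feasibility."""
    return sum(sol[t] for t in TESTS)

# Rank solutions from best to worst by overall score.
ranked = sorted(solutions, key=overall, reverse=True)
for rank, sol in enumerate(ranked, start=1):
    print(rank, sol["name"], overall(sol))
```

A group could fill in the same structure with its own refined solution list; the point of the sketch is only that making the scores explicit forces the justification column of the matrix to be argued rather than assumed.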

## Exercise Three: Solution Testing

In this section, you will test the solutions on the refined list your group produced in the previous exercise. Three ethics tests, described below, will help you to integrate ethical considerations in the problem-solving process. A global feasibility test will help to identify solutions with serious practical problems. Finally, a Solution Evaluation Matrix summarizes the results for class debriefings.

### Setting up for the test.

• Identify the agent perspective from which the decision will be made.
• Describe the action as concisely and clearly as possible.
• Identify the stakeholders surrounding the decision, i.e., those who will suffer strong impacts (positive or negative) from the implementation of your decision. Stakeholders have a vital or essential interest (right, good, money, etc.) at play in this decision.
• In the harm/beneficence test, identify the likely results of the action and sort these into harms and benefits.
• For the reversibility test, identify the stakeholders with whom you will reverse positions.
• For the public identification test, identify the values, virtues, or vices your action embodies. Associate these with the character of the agent.

### Harm/Beneficence Test

1. What are the harms your solution is likely to produce? What are its benefits? Does this solution produce the least harms and the most benefits when compared to the available alternatives?
2. Pitfall—Too much. In this "paralysis of analysis," one factors in too many consequences. To avoid this pitfall, restrict the analysis to the most likely consequences with the greatest magnitude (magnitude indicates the range and severity of impact).
3. Pitfall—Too Little. A biased or incomplete analysis results when significant impacts are overlooked. Take time to uncover all the significant impacts, both in terms of likelihood and in terms of magnitude.
4. Pitfall—Distribution of Impacts. Consider not only the overall balance of harms and benefits but also how harms and benefits are distributed among the stakeholders. If they are equally or fairly distributed, this counts in the solution's favor; if they are unequally or unfairly distributed, this counts against the solution. Be ready to redesign the solution to distribute the harmful and beneficial results more equitably and fairly.
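The weighing the harm/beneficence test describes can be sketched as a simple calculation. In the following Python sketch, every name and number is hypothetical: each impact is weighted by likelihood and magnitude, the weighted impacts are summed per solution, and a per-stakeholder breakdown guards against the distribution pitfall above. It is a minimal illustration, not the module's prescribed method.

```python
# Illustrative harm/beneficence bookkeeping. Each entry is
# (stakeholder, likelihood 0-1, signed magnitude): positive values are
# benefits, negative values are harms. All data here is hypothetical.

impacts = {
    "sell database with consent": [
        ("creditors", 0.9, +3),
        ("customers", 0.4, -2),
    ],
    "destroy database": [
        ("creditors", 1.0, -2),
        ("customers", 1.0, +3),
    ],
}

def net_score(entries):
    """Expected net benefit: sum of likelihood * magnitude."""
    return sum(p * m for _, p, m in entries)

def distribution(entries):
    """Net impact per stakeholder, to check whether harms fall unfairly."""
    per = {}
    for who, p, m in entries:
        per[who] = per.get(who, 0.0) + p * m
    return per

for solution, entries in impacts.items():
    print(solution, round(net_score(entries), 2), distribution(entries))
```

Note how the distribution check matters: a solution can have a respectable net score while concentrating all of its harms on one stakeholder group, which the test asks you to count against it.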

### Reversibility Test

1. Would this solution alternative be acceptable to those who stand to be most affected by it? To answer this question, change places with those who are targeted by the action and ask whether, from this new perspective, the action is still acceptable.
2. Pitfall—Too much. When reversing with Hitler, a moral action appears immoral and an immoral action appears moral. The problem here is that the agent who projects into the immoral standpoint loses his or her moral bearings. The reversibility test requires viewing the action from the standpoint of its different targets. But understanding the action from different stakeholder views does not require that one abandon himself or herself to these views.
3. Pitfall—Too little. In this pitfall, moral imagination falls short, and the agent fails to view the action from another stakeholder standpoint. The key in the reversibility test is to find the middle ground between too much immersion in the viewpoint of another and too little.
4. Pitfall—Reducing Reversibility to Harm/Beneficence. The reversibility test requires that one assess the impacts of the action under consideration on others. But it is more than a simple listing of the consequences of the action. These are viewed from the standpoint of different stakeholders. The reversibility test also goes beyond considering impacts to considering whether the action treats different stakeholders respectfully. This especially holds when the agent disagrees with a stakeholder. In these disagreements, it is important to work out what it means to disagree with another respectfully.
5. Pitfall—Incomplete survey of stakeholders. Leaving out significant stakeholder perspectives skews the results of the reversibility test. Building an excellent death chamber works when one considers the action from the standpoint of Hitler; after all, it’s what he wants. But treating an individual with respect does not require capitulating to his or her desires, especially when these are immoral. And considering the action from the standpoint of other stakeholders (say the possible victims of newer, more efficient gas chambers) brings out new and radically different information.
6. Pitfall—Not Weighing and Balancing Stakeholder Positions. This pitfall is continuous with the previous one. Different stakeholders have different interests and view events from unique perspectives. The reversibility test requires reviewing these interests and perspectives, weighing them against one another, and balancing out their differences and conflicts in an overall, global assessment.

### Publicity (or Public Identification) Test

1. Would you want to be publicly associated or identified with this action? In other words, assume that you will be judged as a person by others in terms of the moral values expressed in the action under consideration. Does this accord with how you would want to or aspire to be judged?
2. Pitfall—Failure to associate the action with the character of the agent. In the publicity test, the spotlight of analysis moves from the action to the agent. Successfully carrying out this test requires identifying the agent, describing the action, and associating the agent with the action. The moral qualities exhibited in the action are seen as expressing the moral character of the agent. The publicity test, thus, rests on the idea that an agent's responsible actions arise from and express his or her character.
3. Pitfall—Failure to appreciate the moral color of the action. The publicity test assumes that actions are colored by the ends or goods they pursue. This means that actions are morally colored. They can express responsibility or irresponsibility, courage or cowardice, reasonableness or unreasonableness, honesty or dishonesty, integrity or corruption, loyalty or betrayal, and so forth. An analysis can go astray by failing to bring out the moral quality (or qualities) that an action expresses.
4. Pitfall—Reducing Publicity to Harm/Beneficence Test. Instead of asking what the action says about the agent, many reduce this test to considering the consequences of publicizing the action. So one might argue that an action is wrong because it damages the reputation of the agent or some other stakeholder. But this doesn't go deep enough. The publicity test requires, not that one calculate the consequences of widespread knowledge of the action under consideration, but that one draw from the action the information it reveals about the character of the agent. The consequences of bad publicity are covered by the harm/beneficence test and do not need to be repeated in the public identification test. The publicity test provides new information by turning from the action to the agent. It focuses on what the action (its moral qualities and the goods it seeks) says about the agent.

### Comparing the Test Results: Meta-Tests

1. The ethics tests will not always converge on the same solution because each test (and the ethical theories it encapsulates) covers a different dimension of the action: (1) harm/beneficence looks at the outcomes or consequences of the action, (2) reversibility focuses on the formal characteristics of the action, and (3) publicity zeros in on the moral character of the agent.
2. The meta-tests turn this surface disagreement into an advantage. The convergence or divergence of the ethics tests becomes an indicator of solution strength or weakness.
3. Convergence. When the ethics tests converge on a given solution, this indicates solution strength and robustness.
4. Divergence. When tests diverge on a solution—a solution does well under one test but poorly under another—this signifies that it needs further development and revision. Test divergence is not a sign that one test is relevant while the others are not. Divergence indicates solution weakness and is a call to modify the solution to make it stronger.

### Exercise 3: Summarize your results in a Solution Evaluation Matrix

1. Place test results in the appropriate cell.
2. Add a verbal explanation to the SEM table.
3. Conclude with a global feasibility test that asks, simply, whether or not there exist significant obstacles to the implementation of the solution in the real world.
4. Finish by looking at how the tests converge on a given solution. Convergence indicates solution strength; divergence signals solution weakness.
| Solution/Test | Harm/Beneficence | Reversibility | Publicity (public identification) | Feasibility |
| --- | --- | --- | --- | --- |
| First Solution | | | | |
| Second Solution | | | | |
| Third Solution | | | | |
| Fourth Solution | | | | |
| Fifth Solution | | | | |
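For groups that want to tally their results electronically, the matrix and the convergence meta-test can be sketched as a small script. This is a hypothetical illustration only; the scoring scale and the example solutions are assumptions, not part of the module:

```python
# Hypothetical sketch of a Solution Evaluation Matrix (SEM).
# Illustrative scoring scale: +1 (passes test), 0 (mixed), -1 (fails test).

TESTS = ["harm/beneficence", "reversibility", "publicity", "feasibility"]

def convergence(scores):
    """Meta-test: do the ethics tests converge on this solution?

    Returns 'strong' when all tests agree (convergence indicates
    solution strength) and 'weak' when they diverge, signaling that
    the solution needs further development and revision.
    """
    return "strong" if len(set(scores.values())) == 1 else "weak"

# Example matrix with assumed scores for two candidate solutions.
sem = {
    "First Solution": {"harm/beneficence": 1, "reversibility": 1,
                       "publicity": 1, "feasibility": 1},
    "Second Solution": {"harm/beneficence": 1, "reversibility": -1,
                        "publicity": 0, "feasibility": 1},
}

for solution, scores in sem.items():
    print(solution, "->", convergence(scores))
```

A verbal explanation should still accompany each cell; the script only summarizes where the tests agree or disagree.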

## Exercise Four: Solution Implementation

In this section, you will trouble-shoot the solution implementation process by uncovering and defusing potential obstacles. These can be identified by looking at the constraints that border the action. Although constraints specify limits to what can be realized in a given situation, they are more flexible than generally thought. Promptly identifying these constraints allows for proactive planning that can push back obstacles to solution implementation and allow for realization of at least some of the value embodied in the solution.

A Feasibility Test focuses on these situational constraints and poses useful questions early on in the implementation process. What conditions could arise that would hinder the implementation of a solution? Should the solution be modified to ease implementation under these constraints? Can the constraints be removed or modified through activities such as negotiation, compromise, or education? Can solution implementation be facilitated by modifying both the solution and the constraints?

Table 8: Feasibility Constraints

| Category | Sub-Categories |
| --- | --- |
| Resource | Time; Financial; Resource availability |
| Interest | Individual (supervisor); Organizational; Legal (laws, regulations); Political/Social |
| Technical | Technology does not exist; Technology patented; Technology needs modification |

### Resource Constraints:

• Does the situation impose constraints on resources that could limit the realization of the solution under consideration?
• Time. Is there a deadline within which the solution has to be enacted? Is this deadline fixed or negotiable?
• Financial. Are there cost constraints on implementing the ethical solution? Can these be extended by raising more funds? Can they be extended by cutting existing costs? Can agents negotiate for more money for implementation?
• Resource. Are necessary resources available? Is it necessary to plan ahead to identify and procure resources? If key resources are not available, is it possible to substitute other, more available resources? Would any significant moral or non-moral value be lost in this substitution?

### Interest Constraints

• Does the solution threaten stakeholder interests? Could it be perceived as so threatening to a stakeholder’s interests that the stakeholder would oppose its implementation?
• Individual Interests. Does the solution threaten the interests of supervisors? Would they take measures to block its realization? For example, a supervisor might perceive the solution as undermining his or her authority. Or, conflicting sub-group interests could generate opposition to the implementation of the solution even though it would promote broader organizational objectives.
• Organizational Interests. Does the solution go against an organization's SOPs (standard operating procedures), formal objectives, or informal objectives? Could acting on this solution disrupt organization power structures? (Perhaps it is necessary to enlist the support of an individual higher up in the organizational hierarchy in order to realize a solution that threatens a supervisor or a powerful sub-group.)
• Legal Interests. Are there laws, statutes, regulations, or common law traditions that oppose the implementation of the solution? Is it necessary to write an impact statement, develop a legal compliance plan, or receive regulatory approval in order to implement the solution?
• Political/Social/Historical Constraints. Would the solution threaten or appear to threaten the status of a political party? Could it generate social opposition by threatening or appearing to threaten the interests of a public action group such as an environmental group? Are there historical traditions that conflict with the values embedded in the solution?

### Technical Constraints

• Technology does not yet exist. Would the implementation of the solution require breaking new technological ground?
• Technology Protected by Patent. The technology exists but is inaccessible because it is still under a patent held by a competitor.
• Technology Requires Modification. The technology required to implement the solution exists but needs to be modified to fit the context of the solution. Important considerations to factor in would be the extent of the modification, its cost, and how long it would take to bring about the modification.
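The constraint categories above can be organized as a simple checklist structure for group use. This is a hypothetical sketch; the function name and category labels are assumptions for illustration, not part of the module's method:

```python
# Hypothetical checklist mirroring the feasibility constraints in Table 8.
FEASIBILITY_CONSTRAINTS = {
    "Resource": ["Time", "Financial", "Resource availability"],
    "Interest": ["Individual (supervisor)", "Organizational",
                 "Legal (laws, regulations)", "Political/Social/Historical"],
    "Technical": ["Technology does not exist", "Technology patented",
                  "Technology needs modification"],
}

def flag_obstacles(answers):
    """Return the sub-categories the group flagged as obstacles.

    `answers` maps (category, sub_category) to True when the group
    judges that constraint a significant obstacle to implementing
    the solution under consideration.
    """
    return [(cat, sub) for cat, subs in FEASIBILITY_CONSTRAINTS.items()
            for sub in subs if answers.get((cat, sub), False)]
```

Flagged constraints are candidates for the proactive responses discussed above: negotiation, compromise, education, or modification of the solution itself.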

## Exercise Five: Ethical Perspective Pieces

### Getting Consent to Information Transfer

If you have followed the case so far, you see that while the money Toysmart owes to Citibank may be just a drop in the bucket, the welfare and even survival of other Toysmart creditors depends on how much money can be retrieved through the bankruptcy process. The following ethical perspective argues that the creditors' right to their money cannot simply be traded off against the privacy rights of the Toysmart customers profiled in the now valuable database. These two stakeholders and their stakes, in this case rights, need to be integrated as fully as possible. The key lies in executing the customers' right to be informed about, and to freely consent to, the transfer of their data to third parties. Executing this right must address three important aspects.

• Customer consent must be obtained by having customers opt in to, rather than opt out of, the transfer of PII. Opt-in represents a more active mode of consent, opt-out a more passive one. By opting into the data transfer, Toysmart customers consent explicitly, knowingly, and freely to the transfer of their information. Opt-out is passive because the transfer of PII to a third party will occur unless customers expressly forbid it. The chances are that many customers will consent only if compensated, and the mechanics of obtaining positive opt-in consent are complicated. Is this done by email or snail mail? How can Toysmart customers be fully informed? What kind of timeline is necessary for their full consent? Opt-in consent is morally more adequate, but it is also more difficult, time-consuming, and costly to implement.
• Any exchange of information must be in accord with TRUSTe standards which Toysmart agreed to when they solicited the right to use the TRUSTe seal. TRUSTe has its own standards (they can be found through the link above) which reinforce the above discussion of informed consent but also bring in other matters. Important here is the utilitarian concern of building and maintaining consumer trust to encourage their using the Internet for e-business. Web site certification agencies like TRUSTe exist to validate that a web site is trustworthy; but to maintain this validation, customers must know that TRUSTe will enforce its standards when websites become reluctant to follow them. TRUSTe must be aggressive and strict here in order to maintain the high level of trust they have generated with e-business customers.
• An important part of TRUSTe standards on the transfer of PII to third parties is their insistence that these third parties share the values of those who have been given the information. Toysmart cultivated a reputation as a trustworthy company devoted to producing safe, high-quality, educational toys. The customer database should be transferred only to concerns that share these goals and the accompanying values. (What are these?) Did Toysmart compromise on these goals and values when it agreed to accept Disney financing and advertising support? What are Toysmart's values? What are Disney's values?

In conclusion, this perspective piece is designed to get you to think about the right of informed consent, whether it can be reconciled with the financial interests and rights of Toysmart creditors, and how this right can be implemented in the concrete details of this case. It has argued that customer PII can be transferred, but only with the consent of the customers themselves. It has defined this consent as the customers' expressly opting into the transfer. It has also argued that the third party must share the values and goals of Toysmart, especially those values accompanying Toysmart's promises to its customers.

## Group Exercise

### Identify the role played and the values held by each of the following participants:

1. David Lord (CEO of Toysmart)
2. Disney (as venture capitalist)
3. TRUSTe (as non-profit)
4. Toysmart Creditors (Pan Communications)
5. FTC (government regulatory agency)
6. Toysmart Customers

### Toysmart's customer database

1. Should Toysmart creditors be allowed to sell the customer database to third parties? Respond to arguments pro and con given by participants in the case.
2. Assume Toysmart should be allowed to sell the database to a third party. What kind of values should this third party have?
3. Assume Toysmart has to get customer consent before selling the database. How should customer consent be obtained? (What counts as customer consent?)

## What did you learn?

This section provides closure to the module for students. It may consist of a formal conclusion that summarizes the module and outlines its learning objectives. It could provide questions to help students debrief and reflect on what they have learned. Assessment forms (e.g., the “Muddiest Point” Form) could be used to evaluate the quality of the learning experience. In short, this section specifies the strategy for bringing the module to a close.

### In this module, you have…

• studied a real world case that raised serious problems with intellectual property, privacy, security, and free speech. Working with these problems has helped you to develop a better “working” understanding of these key concepts,
• studied and practiced using four decision-making frameworks: (1) using socio-technical analysis to specify the problem in a complex, real world case, (2) using brainstorming techniques to develop and refine solutions that respond to your problem, (3) employing three ethics tests to integrate ethical considerations into your solutions and to test these solutions in terms of their ethics, and (4) applying a feasibility analysis to your solutions to identify and trouble-shoot obstacles to the implementation of your ethical solution,
• explored the analogy between solving ethical and design problems,
• practiced the skills of moral imagination, moral creativity, reasonableness, and perseverance, and…
• experienced, through key participant perspectives, the challenges of ethics advocacy “under the gun.”

### Debrief on your group work before the rest of the class

1. Provide a concise statement and justification of the problem your group specified
2. Present the refined solution generation list your group developed in exercise 2.
3. Present and provide a quick summary explanation of the results of your group’s solution evaluation matrix.
4. Show your group’s feasibility matrix and summarize your assessment of the feasibility of implementing the solution alternatives you tested in exercise three.

### Group Debriefing

1. Were there any problems your group had working together to carry out this case analysis? What were the problems and how did you go about solving them?
2. What problems did you have with understanding and practicing the four frameworks for solving problems? How did you go about solving these problems? Does your group have any outstanding questions or doubts?
3. Now that you have heard the other groups present their results, what differences emerged between your group’s analysis and those of the other groups? Have you modified your analysis in light of the analyses of the other groups? If so how? Do the other groups need to take into account any aspects of your group’s debriefing?

## Appendix

### Toysmart References

1. Morehead, N. Toysmart: Bankruptcy Litmus Test. Wired Magazine, 7/12/00. Accessed 10/4/10. http://www.wired.com/techbiz/media/news/2000/07/37517
2. Toysmart Settles: Database Killed. Associated Press. Accessed through Wired Magazine on 10/4/10 at http://www.wired.com/politics/law/news/2001/01/41102ere
3. Kaufman, J. and Wrathall, J. "Internet Customer Data Bases" National Law Journal, September 18, 2000. Accessed July 12, 2001 Lexis Nexis Academic University.
4. "FTC Sues Failed Website, Toysmart.com, for Deceptively Offering for Sale Personal Information of Website Visitors." July 10, 2000. Accessed at www.ftc.gov on 10/4/10.
5. "FTC Announces Settlement With Bankrupt Website, Toysmart.com, Regarding Alleged Privacy Policy Violations." July 21, 2000. Accessed at www.ftc.com on 10/4/10
6. "37 Attorneys General Resolve Protection of Consumer Privacy" National Association of Attorneys General. AG Bulletin. December 2000. Accessed 2/12/01 through Lexis Nexis Academic University.
7. Salizar, L. "The Difficulties Practitioners Can Face When Dealing with Dot-Com Bankruptcies." Nov 2000. Accessed through Lexis Nexis Academic University on 7/12/01.
8. "FTC Sues Toysmart Over Database" Reuters. 7/10/00 Accessed at http://www.wired.com/politics/law/news/2000/07/37484 on 10/4/10.
9. "On Shaky Ground" Karen. September 2000. American Lawyer Newspapers. Accessed from Lexis Nexis Academic University on July 12, 2000.
10. "FTC Files Suit Against Failed Toy Retailer Over Privacy Promise" Associated Press. 7/10/00. Accessed 7/18/01. TRUSTe Spokesperson: "Bottom line--it's unacceptable, ethically wrong, and potentially illegal for a company to say one thing and do something different."
11. Lorek, Laura. "When Toysmart Broke" Inter@ctive Week. August 21, 2000. zdnet.com. Provides biographical information on Lord and the brick-and-mortar company Hold Educational Outlet.
12. Rosencrance, Linda. "FTC Settles With Toysmart" Computer World. July 21, 2000. Accessed 7/16/01.
13. Nasholsky, Larren. "Online Privacy Collides with Bankruptcy Creditors: Potential Resolutions for Competing Concerns." New York Law Journal, 8/28/00. Accessed through Lexis Nexis Academic University on 7/12/00.
14. Tavani, H. (2004). Ethics and Technology: Ethical Issues in an Age of Information and Communication Technology. Danvers, MA: John Wiley and Sons.

This optional section contains additional or supplementary information related to this module. It could include: assessment, background such as supporting ethical theories and frameworks, technical information, discipline specific information, and references or links.

### References

1. Brincat, Cynthia A. and Wike, Victoria S. (2000) Morality and the Professional Life: Values at Work. Upper Saddle River, NJ: Prentice Hall.
2. Cruz, J. A. and Frey, W. J. (2003) An Effective Strategy for Integrating Ethics Across the Curriculum in Engineering: An ABET 2000 Challenge, Science and Engineering Ethics, 9(4): 543-568.
3. Davis, M., Ethics and the University, Routledge, London and New York, 1999: 166-167.
4. Richard T. De George, "Ethical Responsibilities of Engineers in Large Organizations: The Pinto Case," in Ethical Issues in Engineering, ed. Deborah G. Johnson (1991) New Jersey: Prentice-Hall: 175-186.
5. Charles Harris, Michael Pritchard and Michael Rabins (2005) Engineering Ethics: Concepts and Cases, 3rd Ed. Belmont, CA: Thomson/Wadsworth: 203-206.
6. Huff, Chuck and Jawer, Bruce, "Toward a Design Ethics for Computing Professionals," in Social Issues in Computing: Putting Computing in its Place, Huff, Chuck and Finholt, Thomas Eds. (1994) New York: McGraw-Hill, Inc.
7. Solomon, Robert C. (1999) A Better Way to Think About Business: How Personal Integrity Leads to Corporate Success. Oxford, UK: Oxford University Press.
8. Anthony Weston. (2001) A Practical Companion to Ethics, 2nd ed. USA: Oxford University Press, 2001, Chapter 3.
9. Carolyn Whitbeck (1998) Ethics in Engineering Practice and Research. UK: Cambridge University Press: 55-72 and 176-181.
10. Wike, Victoria S. (2001) "Professional Engineering Ethics Behavior: A Values-based Approach," Proceedings of the 2001 American Society for Engineering Education Annual Conference and Exposition, Session 2461.

## EAC ToolKit Project

This module is a WORK-IN-PROGRESS; the author(s) may update the content as needed. Others are welcome to use this module or create a new derived module. You can COLLABORATE to improve this module by providing suggestions and/or feedback on your experiences with this module.

Please see the Creative Commons License regarding permission to reuse this material.
