Star Trek/Metaphysics

Any ideas or thoughts on this would be appreciated. I've read the textbook and am just overwhelmed with all the information. I'm just not sure how to get started.

Science fiction literature often raises philosophical issues and is a great source for philosophical speculation. This is especially true for the mind/body problem. For example, it is common in science fiction literature to encounter androids. An android is a robot that resembles a human being in appearance and behavior. Examples of androids in science fiction books, television programs, and films are numerous (Star Trek, Star Wars, Aliens, Terminator, A.I., I, Robot, etc.). In reality, many computer scientists are currently working in the area of "artificial intelligence," building machines that can "think for themselves." Many computer scientists believe this is the first step in creating the androids of the future and that in time the distinction between man and machine will be practically erased. These scientists speculate that androids with super-computer brains will have thoughts, beliefs, feelings, and desires just like humans. Therefore, some argue, they will also have the same rights, responsibilities, and privileges that all humans have and should be treated as such. Do you see problems with this view of the future? Do you think machines can ever become persons?

In order to explore this question, let us consider an episode of the popular television series, Star Trek: The Next Generation. I have provided a synopsis.

After reading the synopsis, write a substantive response. As you consider your response, keep the following questions in mind:

Do you think that artificial intelligence at the level presented in the story will someday be possible? Why or why not?

If AI does become possible, will we have obligations to treat machines "ethically?" Will they have rights? Will turning them off, as Riker does to Data, constitute murder?

What view of the mind/body problem do you think is exhibited by Picard? By Maddox?

Do you think Maddox is right when he claims that Picard is being "irrational and emotional" in his view of Data?

Do you think Picard's argument about slavery is ultimately valid? Why or why not?

Do you agree with the JAG officer's final ruling? Why or why not?

© SolutionLibrary Inc.

Solution Preview

... possesses our nature in its entirety.

We could say we are human by natural state because we possess the natural organs, functions, and laws of the human state. If this is taken as a premise, then a machine, which is a composite of materials foreign to the natural human state, is still only a shadow of the natural state itself, even though it mimics that state. As such, Data is not entitled to rights as a human, though Data could still be entitled to certain rights proper to his, or its, being. For example, my pet rabbit is not human, yet it is composed of much the same organs as I am. Nonetheless, no one will afford my rabbit the same rights as me; however, that does not mean that my rabbit does not possess a right to live well and free from predators such as my neighbor's dog. As an animal being, its existence is distinct from that of a human. My rabbit does not possess ...