Tuesday, November 29, 2011

Blog 3

For the third and final blog post I am looking at the material that was covered during weeks nine through thirteen. Throughout these weeks one video really engaged me, and that was P.W. Singer’s Military Robots and the Future of War. I guess I never really thought about the ethical side of robotic warfare. I always saw it as a mind-blowing technological advancement that keeps armed forces safe, but I never stopped to think about the man behind the machine. One quote that really stuck out to me in P.W. Singer’s video was “…robotics… also changed the experience of the warrior and even the very identity of the warrior. Another way of putting this is mankind’s five thousand year old monopoly on the fighting of war is breaking down in our life time” (Singer). Robotics has really changed the entire battlefield and the tactical strategies of the modern-day warrior. We consistently think of our men and women overseas being shot at (which still does happen; I’m not suggesting it doesn’t), but a lot of the time the people they are protecting us from are hundreds of miles away from the operator manning the robot. It just astounds me.

One interesting thing that I learned from week nine was how internet usage patterns and demographic statistics can be tracked by race. I find it fascinating that such a massive database of articles, news, videos, music, etc. can be broken down and categorized by what is being viewed by a specific race. The claims by Kretchmer and Karveth on page 308 of Ethics and Technology about the internet usage patterns of African American users seem odd to me. I never really think of specific races being into different activities online; I just figure it’s all one big melting pot where everyone uses Facebook, checks their bank account, pays bills, and watches YouTube. I would like to know how they came to a conclusion about these patterns and demographics. I re-read the chapter and couldn’t find anything pertaining to the scientific method behind it. My guess is that it involved tracking the usage of a certain demographic for a few months or years in a controlled environment.

My last question to answer is “How will you take what you learned this week into your everyday life?” I am not going to look at this specific week but at the course overall. This was my first ethics class, so please don’t judge too harshly, but this course has taught me many valuable things, one of them being how to see the world as a whole from a vastly different vantage point. Most of the topics that have been covered actually pertain quite closely to what I want to do in life, and that’s working with technology. People need to remain ethically sound in a rapidly changing technological environment that is full of loopholes, unmonitored actions, and few legal restrictions. It is really up to the user, and the population as a whole, to keep technology ethically appropriate and, more importantly, safe for users for years to come.

Works Cited

"PW Singer on Military Robots and the Future of War | Video on TED.com." TED: Ideas worth Spreading. Apr. 2009. Web. 29 Nov. 2011. .

Tavani, Herman T. "The Digital Divide and the Transformation of Work." Ethics and Technology: Controversies, Questions, and Strategies for Ethical Computing. 3rd ed. Hoboken, NJ: John Wiley & Sons, 2011. 308-09. Print.

Saturday, November 26, 2011

Reflection Blog 3


           In these last four weeks, we have covered a wide range of topics: from the digital divide in week 9, to identity and experience online in week 10. In week 11, we touched on robotics and the new Sixth Sense technology, and we are finishing up with cross/intercultural ethics. I believe that week 11 interested me the most, with regard to robotics and artificial intelligence, culminating with P.W. Singer's presentation on military robots and the future of war. (http://tinyurl.com/coaemf)
            I think that as a society, and as a world, as artificial intelligence continues to grow in our lives, we will need to develop new rules for ethics and morals. In contrast to earlier readings, where we were introduced to the idea that there are no new ethical issues (Tavani p. 11), I believe that with the advent of advanced robotics in combat and everyday life, we will be faced with heretofore unforeseen issues. For example, in his presentation, P.W. Singer notes how the use of drones changes the face of war and how it affects the soldiers fighting it. To break this down further, it takes away (some, not all of) the harsh gravity of certain situations and replaces them with what can almost be looked at as a video game. Now, we all know about violent video games, and how in some of them players are able to rape and kill people. This has been deemed acceptable by our society because it's "just a game." But I think we must consider the implications for those who are being exposed to this as children and young adults, and think about how it might affect them as they fight our wars in the future.
            Now, I am not trying to take sides in the whole debate about whether video game violence is good or bad, whether it might keep young people from understanding what is acceptable, or whether it might warp their young, malleable minds. What I am considering, though, is that someone who has played violent video games their entire life may not bring the same thoughtfulness to an armed conflict fought through a computer screen as someone who has been on an actual battlefield. If you have personally experienced something, or even if you have never played a game where you killed another person, your reaction may very well differ from that of someone who has.
            These considerations are why I feel that we will face new issues in morality and ethics, both for the unmanned but human-controlled drone and for the future autonomous killing machine. New laws may need to be written to address who is responsible for war crimes and how they are prosecuted. If a semi-autonomous robot is supposed to have protocols programmed into it so that it does not kill civilians, and yet it allows the human operator to target civilians and kill them, who is at fault: the operator or the programmer?

Thursday, November 10, 2011

blog 2


Over the last four weeks, we have covered the topics of privacy and security, hackers, freedom of information and speech, and the digital divide. Tavani defines informational privacy as control over the flow of one’s personal information, including the transfer and exchange of that information. The chapter on privacy and cyberspace made me realize that no matter how many protection programs you have in place on your computer, people can still get all the information about you they want. The government can “sniff” your traffic to make sure what you are emailing is not a threat. If you visit a website and enter your email address, it can be sold to millions of other people who will send you tons of junk mail. The idea of just looking at one item no longer exists on the internet.

In the article on Anonymous, we see how one group can easily attack another group or a political figure in just a few YouTube videos. The endless streaming and back-and-forth banter between the two groups can destroy one group while gaining popularity for the other.

Hacking is another controversy, and the idea of counter-hacking was interesting. I could never fight back against a hacker, but knowing that other hackers do this almost gives you a sense of revenge. When people hack into your information and take it for their own personal gain, that is frustrating enough. Knowing that others can fight back for you and return the “favor,” and actually have moral grounds to stand on, seemed a little double-edged.

A professional code of conduct is something that I use each day at my job. We handle personal health information every day. Our IT department puts strong security programs in place to make sure nobody gets past the firewall and antivirus software to steal that information, which could possibly be used to ruin a person. Professional codes are put in place to make sure this does not happen. In the medical field, you have to be willing to blow the whistle if you see someone using this information in the wrong way. To me, this would be considered not a moral responsibility but rather a professional one, because we have to protect those who cannot protect themselves.

These last four chapters made me realize that nothing I do on the computer is truly safe; I can have all the protection I want and it does not even matter.