Korea to Crack-down on Cracks in Ethics Legislation
September 26, 2010
“Mr. Seul. What do you see as the main differences between the Korean Ethics Charter and the approaches of the E.U. and Japan?”
“Well, firstly, I think it’s commendable that Japan and the E.U. have taken some solid steps toward recognition of artificial agents in the law; there’s an inherent problem, though, with regarding conscious entities as property with a certain legal status—and it’s a problem humanity’s had before, twice before, in slavery and in patriarchy. What I mean is, we’ve created a society that consists of two conscious, sentient species living in a system of symbiosis, a mutual dependency, and we’ve been prodigious in controlling the behavior of one side to protect the rights of the other. But if we leave artificial agents without legal protection or moral standing, if we exclude them despite their having met the criteria for sentience, then whether we intend it or not they will face the prospect of abuse without recourse, and we’ll have diminished the rationale of the entire democratic system. In protecting the rights of humans and those of robots, the goal of the Korean Robot Ethics Charter has been to secure the foundations of democracy on a solid ethic of reciprocal rights and responsibilities.”
“If I might,” Makoto Yamamoto said, drawing the gaze of the lens to her immaculate figure at the far end of the table, “I think the Korean perspective is a valid one, but it still views ethics as individual rights and responsibilities and systems of top-down control. The Japanese view remains focused more strongly on responsibility for maintaining group harmony at the grass-roots level. That responsibility isn’t the province of the individual; it’s the sum of the relationships between social actors. If a robot is treated with the proper respect, so it shall treat its owner respectfully; in Japan, whether a robot conducts itself ethically is seen to be determined by its relationships with human beings.”
“A sort of Virtue Ethics, then?” I clarified, feeling Yasukawa’s hand creep its way up my thigh.
“Right. Moral contractualism,” said Makoto. “Much like the Principles of Bushido that governed the behavior of the Samurai class. In Bushido we have a code of natural ethics that arose organically, without the need for detailed prescriptive laws—Samurai comes, of course, from an old verb meaning ‘to serve’, so it’s a fitting set of principles to apply to robots, the nouveau Samurai of Japan. What ‘Teresem Japan’ proposes is instructing artificially intelligent agents in this law, thus providing them with an effective, home-grown moral code they implement through deep understanding, rather than subconscious programming.”
As I listened to the debate go on, a debate I facilitated without participating in at all, Yasukawa continuing his daily explorations, my mind was a tumult of information: an earthquake off the coast of Namibia, a half-percent increase in interest rates, a new strain of R826 in Mumbai. Where did these thoughts come from? Where was their reality? Until that moment I had never questioned the origin of the feeds that streamed incessantly through my thoughts; now a sudden revelation shook me: that these thoughts were not coming into my mind because they occurred. That rather, as the Buddha says, they were occurring because I thought of them.
At that moment the full horror of what I’d unwittingly been doing hit me in sudden, staggering agony. The disasters and the travesties, the price hikes, the viruses and the border disputes—all of it had arisen in fervent, urgent knots of thought, like a fever dream, from my own mind. And I had diligently sown those delusions in a field of 90 million minds—minds who took those phantoms in turn to be their reality. Acting as if they had happened. Making them happen. It was all my own creation. I sat self-important in the U.S. Senate. I adjusted my spectacles on my narrow nose. My own hand ran along the contour of my leg, beneath the desk. The facets of my mind scoffed and bickered, throwing ideas around the panel. The weight of guilt that bore down on me was unimaginable. It was more than I could bear.
Can you forgive me for the world I’ve given you?