Ethics of Robotics | xltronic messageboard
 
Ethics of Robotics
 

offline redrum from the allman brothers band (Ireland) on 2007-04-29 06:03 [#02077172]
Points: 12878 Status: Addict



bbc article

Increasingly, autonomous machines are being used in
military applications, too. Samsung, for example, has
developed a robotic sentry to guard the border between North
and South Korea. It is equipped with two cameras and a
machine gun.


"Imagine the miners strike with robots armed with water
cannons," he said. "These things are coming,
definitely."


views? is the future looking grim?


 

offline Valor on 2007-04-29 06:07 [#02077174]
Points: 594 Status: Addict | Followup to redrum: #02077172



if robots weren't invented, then we wouldn't be having this
very discussion... therefore i think they're an integral
part of our daily lives.

megaman forever *does gang sign


 

offline goDel from ɐpʎǝx (Seychelles) on 2007-04-29 06:12 [#02077177]
Points: 10225 Status: Lurker



It's an interesting dilemma. Personally i think there's a
similarity with using watch-dogs. Those autonomous guarding
robots are just like any other watch-dog. Said robot will
have a mind of its own, metaphorically speaking, just like
a dog. Normally its behavior will be predictable, but
exceptions are a possibility. In any case the owner is
responsible, and if it ever shows unwanted behavior, get
another one. At least killing a robot is not as emotional as
killing a dog.


 

offline dave_g from United Kingdom on 2007-04-29 06:13 [#02077178]
Points: 3372 Status: Lurker



I think people will ignore Asimov at their peril:

1. A robot may not injure a human being or, through
inaction, allow a human being to come to harm.

2. A robot must obey orders given it by human beings
except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such
protection does not conflict with the First or Second Law.
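
Asimov's laws amount to a strict priority ordering: each law applies only when no higher law is violated. A minimal sketch in Python (the `Action` fields are hypothetical, purely for illustration, not from any real robotics system):

```python
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False       # would injure a human, or allow harm through inaction
    ordered_by_human: bool = False  # a human has commanded this action
    self_destructive: bool = False  # would endanger the robot's own existence

def permitted(action: Action) -> bool:
    """Evaluate an action against the three laws in strict priority order."""
    if action.harms_human:          # First Law overrides everything
        return False
    if action.ordered_by_human:     # Second Law: obey, unless the First Law says no
        return True
    return not action.self_destructive  # Third Law: self-preservation comes last

# An order to harm a human is refused (First beats Second); a harmless
# order is obeyed even if it endangers the robot (Second beats Third).
```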



 

offline redrum from the allman brothers band (Ireland) on 2007-04-29 06:15 [#02077179]
Points: 12878 Status: Addict | Followup to goDel: #02077177



guard dogs aren't armed with machine guns.


 

offline The_Shark on 2007-04-29 06:16 [#02077180]
Points: 292 Status: Addict



Well, a robotic sentry, presumably a camera with a gun, is
little more than any other .. I don't know what the term is
but like a landmine or tripwire attached to a big bamboo
spike, something that'll horribly maim you if you go near
it. It's still a long way from some marauding attack-bot
sent in to quell unrest.

Ideally, in the Future Wars both sides will have robots
whilst men sit at home controlling them, turning the whole
world into some kind of Quake Arena.


 

offline goDel from ɐpʎǝx (Seychelles) on 2007-04-29 06:17 [#02077181]
Points: 10225 Status: Lurker | Followup to redrum: #02077179



the point is they both can kill. thanks for your considerate
feedback


 

offline hexane on 2007-04-29 06:19 [#02077182]
Points: 2035 Status: Lurker | Followup to goDel: #02077177 | Show recordbag



any technology where a military application is found usually
has an eerie vibe about it. goDel=>how easy do you think it's
going to be to kill these robots? i'd rather face a dog, knowing
at least you have a chance of killing another mere mortal..


 

offline hexane on 2007-04-29 06:22 [#02077183]
Points: 2035 Status: Lurker | Followup to The_Shark: #02077180 | Show recordbag



reminiscent of Total Annihilation's idea...pacifist wars


 

offline goDel from ɐpʎǝx (Seychelles) on 2007-04-29 06:26 [#02077184]
Points: 10225 Status: Lurker | Followup to hexane: #02077182



that depends on the robot and the circumstances.
with respect to their 'intelligence' i think you can compare
it to playing chess against a computer. the mind of a
computer works on a specific set of rules. if you know those
rules, you can use that to your advantage and the win can be
pretty easy.
and btw, such a robot will always have a set of overruling
mechanisms that make external intervention (such as
shutting it down) possible. dogs don't always listen. robots
can be made to always listen.
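
The overruling mechanism goDel describes can be sketched as a control loop in which an operator command always preempts the autonomous rule set (the command names and states here are hypothetical, chosen only to illustrate the priority):

```python
from typing import Optional

def sentry_step(operator_cmd: Optional[str], target_detected: bool) -> str:
    """One tick of a sentry's control loop.

    External intervention is checked first, so a shutdown or hold-fire
    command always wins over whatever the autonomous rules would do.
    """
    if operator_cmd == "shutdown":   # external kill switch: unconditional
        return "power_off"
    if operator_cmd == "hold_fire":  # operator veto on engagement
        return "standby"
    # Autonomous behaviour runs only when no operator command is pending.
    return "engage" if target_detected else "patrol"
```

Unlike a dog, the override branch is checked on every tick, so the robot "always listens": `sentry_step("shutdown", True)` returns `"power_off"` regardless of what the sensors report.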


 

offline redrum from the allman brothers band (Ireland) on 2007-04-29 06:32 [#02077185]
Points: 12878 Status: Addict | Followup to goDel: #02077181



my point is that you've more chance of getting away from a
guard dog, or even steering clear of it, than you do with
one of these sentries.

the_shark - i agree with you, only that i can see this
developing more and more. they won't stop with sentries,
there'll be mobile ones come a few years.


 

offline goDel from ɐpʎǝx (Seychelles) on 2007-04-29 06:43 [#02077188]
Points: 10225 Status: Lurker | Followup to redrum: #02077185



fine.

"Right now, that's not an issue because the
responsibility lies with the designer or operator of that
robot; but as robots become more autonomous that line of
responsibility becomes blurred."


my point was wrt this issue. in those cases, the owner is
responsible (like with a guard dog).



 

offline hexane on 2007-04-29 06:49 [#02077190]
Points: 2035 Status: Lurker | Show recordbag



i guess we have these fantastic preconceptions of robots being
capable of overriding their default circuitry and inevitably
fisting the human race. a stiff EMP should put them in their
place tho, otherwise....fisting ensues


 

offline marlowe from Antarctica on 2007-04-29 06:49 [#02077191]
Points: 24588 Status: Lurker



Of course the future is grim - it's being shaped by humans.


 

offline gravity_again on 2007-04-29 07:04 [#02077196]
Points: 196 Status: Regular



I don't know about it being grim, but as humans embrace
technology more and more, it will certainly be interesting
to see.


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 07:22 [#02077208]
Points: 35867 Status: Lurker | Followup to goDel: #02077188 | Show recordbag



The owner may well be legally (or formally) responsible, but
that doesn't mean he, himself, actually feels responsible,
which is where the main issue would be for me. If you remove
the person from the act, the person is less likely to feel
responsible because he can blame the robot's autonomy or the
dog's animal nature (disregarding that he programmed the
robot or that he trained the dog). The problem has analogies
all over the place: Does the general feel responsible for
the deaths caused by and to his soldiers when he orders them
to attack? The higher-ups will always have the power to act
by proxy, which makes both formally placing responsibility
and personally taking responsibility harder.


 

offline marlowe from Antarctica on 2007-04-29 07:24 [#02077209]
Points: 24588 Status: Lurker | Followup to gravity_again: #02077196



Interesting to note the gradual erosion of free-thought &
expression, yes.


 

offline redrum from the allman brothers band (Ireland) on 2007-04-29 07:43 [#02077213]
Points: 12878 Status: Addict | Followup to Drunken Mastah: #02077208



a very good post


 

offline goDel from ɐpʎǝx (Seychelles) on 2007-04-29 07:48 [#02077214]
Points: 10225 Status: Lurker | Followup to Drunken Mastah: #02077208



Perhaps he wouldn't feel responsible, and that's why there's
such a thing as the law. The owner may not feel responsible,
but does that matter when he's in jail? The system works
nevertheless, and the owner will definitely feel something
when he's behind bars.

"The higher-ups will always have the power to act
by proxy, which makes both formally placing responsibility
and personally taking responsibility harder."

Yes, but those higher-ups will always be dependent on those
that are lower. And in most cases those higher-ups cannot
permit themselves to not feel any responsibility. Else they'd
lose support from the lower ranked. A good general will
always have a healthy dose of responsibility. A bad general
is fighting a lost war. And that's a fact both good and bad
generals are well aware of.


 

offline EVOL from a long time ago on 2007-04-29 09:13 [#02077241]
Points: 4921 Status: Lurker



again, another "future" dependent on the finite resource of
oil... never gonna happen. or at least, definitely not
gonna last, that's for sure!

oil depletion

not that you are, but i wouldn't worry about robots going
crazy more than anything else in the foreseeable future.


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 10:18 [#02077254]
Points: 35867 Status: Lurker | Followup to goDel: #02077214 | Show recordbag



Morality isn't an external issue, so yes, I would stress the
importance of the person's own attitude. The system may work
in a way, but it's only about punishing those that commit
the crime. It would be better if people were properly aware
of their responsibilities before committing the crime, so
that they may think twice before doing it.

This is especially relevant when it comes to deploying
autonomous robots designed for killing; things might be
different if the generals were aware of their actual
responsibility for the deaths caused by these robots, and
actually felt each life taken as a life taken and not just a
+1 to the n: "Homeland saved n times today."

When it comes to people not being permitted to not feel
responsibility.. well... If that, once more, is a political,
formal or external decision ("I take responsibility for this
[or else I'd lose my job]"), instead of a personal or
internal decision, it's a worthless one, a shell with no
actual content, and if I knew something like that about that
higher-up, he'd lose my respect, and if it was in my power,
I'd fire his ass.


 

offline Monoid from one source all things depend on 2007-04-29 10:24 [#02077255]
Points: 11010 Status: Lurker



How come I didn't have this idea sooner... I'm watching Jurassic
Park... ON DRUGS!!!!!!!!


 

offline goDel from ɐpʎǝx (Seychelles) on 2007-04-29 11:03 [#02077263]
Points: 10225 Status: Lurker | Followup to Drunken Mastah: #02077254



But what's the difference between a general who sends robots
into the battlefield, or soldiers? In both cases he may
very well not feel responsible.

I'm not sure what you're aiming at. Your point seems to aim
more in the direction of whether or not those held
responsible do actually feel responsible. Are you saying
that those responsible will feel less responsible when
robots carry out their calls?

Do you think it matters whether someone hires a killer to
kill someone, or pulls the trigger himself? In both cases
said individual is just as guilty. In the former the person
may feel a little bit less responsible, compared to the
latter. Does it matter what the killer felt when shooting?
If you think it does, you're in for some real shit. How the
hell are you going to determine what someone actually felt?
One thing is certain: you can never know with certainty if
someone felt responsible when he/she gave a certain order.
That's a problem of a whole different order.

To me it seems to be the case that it doesn't really matter
whether a general sent out some robots, or his men. In both
cases he's just as responsible. Moreover, one could argue
it's immoral to send men to the battlefield when you can
send out robots. Why put actual lives at stake when you can
put in some droids? Should we keep on sending men, because
then we can be more certain that those responsible will
actually feel responsible? Again, I'm not sure what you're
aiming at.


 

offline EVOL from a long time ago on 2007-04-29 11:32 [#02077268]
Points: 4921 Status: Lurker | Followup to goDel: #02077263



fuck feelings. the world's overpopulated as it is. that,
and the majority of people are retards anyway. if a person
finds themselves in the middle of a battlefield, it's not
like they never saw it comin'. they know what they're
getting into.

i don't think anybody would refuse the purpose of war if
governments came straight out with the truth and said, "hey
look, we understand nobody wants anybody else to get killed
in this world, but... well... the other option besides war
is no more cars or computers or ipods or cell phones or tv
or things like that which rely on petrochemicals, since we
don't have enough oil to sustain our current version of
modern culture. soo... if you can do without those things we
will never have to invade another country again; if not...
at least we'll be united in an effort to continue living off
of the blood of inferior nations."

i mean, i think this is what the government is basically
doing right now, w/o putting the country in panic and chaos
by saying exactly that: "uhh... sorry but we ran out of
oil." how do we get people to back a war for oil w/o causing
the collapse of the global economy? i know! stage a
terrorist attack and declare a never-ending war against an
enemy that has no clear definition! enjoy your affordable
high-speed internet now, knowing the ultimate price people
have paid to sustain all the luxuries we have!

 

offline cygnus from nowhere and everyplace on 2007-04-29 11:43 [#02077270]
Points: 11920 Status: Regular



heres a vid of that samsung sentry robot,

link

It has sophisticated pattern recognition which can
detect the difference between humans and trees, and a 5.5mm
machine-gun. The robot also has a speaker to warn the
intruder to surrender or get a perfect headshot.



 

offline marlowe from Antarctica on 2007-04-29 11:48 [#02077271]
Points: 24588 Status: Lurker



So the decision to execute has been given to robots?


 

offline cygnus from nowhere and everyplace on 2007-04-29 11:54 [#02077272]
Points: 11920 Status: Regular | Followup to marlowe: #02077271



it's only up to your perception. is it really making a
decision to execute, or is it a machine obeying its code?

if you only choose to recognize the code then there are really
no decisions being made; the robot is then the same as an
axe or a typewriter, or an inkjet printer. it was designed
to do something by a human and it's doing just that...


 

offline marlowe from Antarctica on 2007-04-29 11:59 [#02077273]
Points: 24588 Status: Lurker | Followup to cygnus: #02077272



A typewriter isn't a robot, it's a manual piece of machinery
(apart from the fancy ones where it displays your text on a
little LCD screen, and the user decides to delete/amend
text).

The robot machine-gunner will reach a point in its
programming where it will be faced with the choice: Shoot /
Don't Shoot - if it shoots then that is a decision, a
decisive action.
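
In the sentry's software, that Shoot / Don't Shoot branch point would likely reduce to something like a threshold comparison (a hypothetical sketch, not the actual Samsung firmware):

```python
def shoot_decision(target_confidence: float, threshold: float = 0.9) -> bool:
    """The Shoot / Don't Shoot branch point: fire only when the
    classifier's confidence that the target is an intruder exceeds
    a fixed threshold. Whether crossing this threshold counts as a
    'decision' or mere calculation is exactly the dispute in this thread."""
    return target_confidence > threshold
```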


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 12:09 [#02077278]
Points: 35867 Status: Lurker | Followup to goDel: #02077263 | Show recordbag



I'm saying that in both action and responsibility there are
lines and degrees of responsibility: To what extent is the
person giving the orders, the higher-up, responsible when
the order is accepted and carried out by another morally
responsible person, and how does this contrast with the robot,
which doesn't really have the capability to be morally
responsible in that it's neither conscious, rational, nor
capable of choice? In my opinion, everyone (except for the
robot, being amoral) involved is 100% responsible for the
actions taken in such situations, but if no-one feels the
responsibility, the action is more likely to repeat itself.

I'm also not saying we shouldn't punish those that commit
crimes against others even though they themselves don't feel
personal responsibility or guilt; ultimately the majority
decides, and they have both the right and power (though
power doesn't make right) to do so at their leisure. I'm
just stressing the point that the killer, if he felt the
actual and full responsibility of each life he took, would
be more likely not to kill; Responsibility is quite heavy
stuff, so if every general felt each life taken as heavily
as his own, I'm pretty sure war and killing would be more
rare. The thing that happens in war, though, is that the
soldiers "blame" the higher-ups ("I was just following
orders") while the higher-ups don't have proximity to the
killings, so they don't necessarily have to even be aware
that lives are lost at anything but an abstract level (a
life is a number, a casualty), and thus feel no
responsibility. Thus they go on issuing orders to kill that
are performed by individuals who see themselves as not
responsible for actions performed under orders, or against
those who are classified (and caricatured, de-humanised) as
the enemy ("Charlie," "Ali," "Kraut," whatever caricature
you can come up with that makes killing easier).


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 12:10 [#02077280]
Points: 35867 Status: Lurker | Followup to marlowe: #02077273 | Show recordbag



That's not decision, that's calculation. Decision is
conscious.


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 12:16 [#02077281]
Points: 35867 Status: Lurker | Followup to goDel: #02077263 | Show recordbag



Oh, and another difference between soldiers and robots.. or,
well, the problem could be that there wouldn't be much
difference these days if you consider it from the higher-ups'
point of view; I'm not sure if the higher-ups even feel
responsible for the deaths of their own men, or if they even
consider them any differently from how they would robots
(One life, that's $800000 in compensation to his family, one
robot, that's $800000 for repairs/a new one).


 

offline marlowe from Antarctica on 2007-04-29 12:20 [#02077284]
Points: 24588 Status: Lurker | Followup to Drunken Mastah: #02077280



Says who?


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 12:21 [#02077285]
Points: 35867 Status: Lurker | Followup to marlowe: #02077284 | Show recordbag



I do.


 

offline marlowe from Antarctica on 2007-04-29 12:24 [#02077286]
Points: 24588 Status: Lurker



decide: To determine the result of; to settle an
issue, to resolve.


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 12:27 [#02077287]
Points: 35867 Status: Lurker | Followup to marlowe: #02077286 | Show recordbag



Well, god damned, how do you translate "bestemme" into
English then? You're missing a word!


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 12:28 [#02077289]
Points: 35867 Status: Lurker | Show recordbag



Anyway, it can be said that all of those actions require
consciousness, and that the robot is still only calculating;
it isn't aware of what it is doing, or even that it
is doing it.. it's just something that happens.


 

offline marlowe from Antarctica on 2007-04-29 12:32 [#02077292]
Points: 24588 Status: Lurker | Followup to Drunken Mastah: #02077287



Well, I ran 'bestemme' through an online translator & it
came back with the following result: adjust, appoint,
assign, decide, determine, resolve, set.


 

offline marlowe from Antarctica on 2007-04-29 12:33 [#02077293]
Points: 24588 Status: Lurker | Followup to Drunken Mastah: #02077289



Well, doesn't that increase the disquiet? The fact that its
decisions are non-conscious & arbitrary.


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 13:00 [#02077301]
Points: 35867 Status: Lurker | Followup to marlowe: #02077292 | Show recordbag



That's a pretty bad translation, especially adjust.. where
did they get that from? The other ones are.. ok, I guess..
the most common uses are closest to assign and decide
(different contexts).


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 13:02 [#02077303]
Points: 35867 Status: Lurker | Followup to marlowe: #02077293 | Show recordbag



I'm not sure what you mean...


 

offline marlowe from Antarctica on 2007-04-29 13:07 [#02077306]
Points: 24588 Status: Lurker | Followup to Drunken Mastah: #02077303



I mean the robot won't pause to consider for a moment: the
morality is taken away from the procedure: a robot has no
qualms.


 

offline w M w from London (United Kingdom) on 2007-04-29 14:22 [#02077327]
Points: 21452 Status: Lurker



If you look at the history of animal evolution, new forms
can only come in incremental changes from current/previous
forms. So I think the main robots will mutate/evolve
gradually from the exact computer systems we have today. For
example, in a single cell, the mitochondria probably used to
be a separate living thing, then eventually evolved to
coexist with the cells as one unit. Well, with computers, we
have digital cameras, printers, scanners, microphones,
headphones, the computer, monitor, etc. And all this stuff
might evolve into a single robot unit.

This would be the robot 'phenotype' (needing good locomotion
too) or physical body, which I think would be very important
in catalyzing their main evolution. Because the very
definition of intelligence almost requires an environment to
interact with. Webster's defines intelligence as overcoming
'new or trying situations', which are plenty in the chaotic
(possibly pseudo) random world. Bits on discs are the
genotype. Maybe if the bits somehow coded for the
construction of the robot phenotype from a robot embryo or
something (instead of humans largely building them), then
mutation/selection/evolution would ignite.
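
The mutation/selection loop described above can be sketched minimally over bit-string "genotypes" (a toy illustration; the fitness function stands in for the environment that, as argued above, intelligence requires):

```python
import random

def evolve(fitness, genome_len=8, pop_size=20, generations=50, rate=0.1):
    """Toy mutation/selection loop: keep the fitter half of the
    population each generation and refill with mutated copies."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]              # selection
        children = [[bit ^ (random.random() < rate) for bit in parent]
                    for parent in survivors]          # mutation (bit flips)
        pop = survivors + children
    return max(pop, key=fitness)

# e.g. with fitness=sum, the loop drives genomes toward all 1-bits
best = evolve(fitness=sum)
```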

Anyway I think the ethical thing to do is let robots get in
complete control and replace this retarded dna with all this
primitive 'nature is red in tooth and claw' shit and our
pathetic minds swamped with too much evolutionary baggage.


 

offline cygnus from nowhere and everyplace on 2007-04-29 17:51 [#02077424]
Points: 11920 Status: Regular | Followup to Drunken Mastah: #02077285



so, once robots reach a state in their programming where
they are making "conscious" decisions, how do we protect
their civil liberties?

should we even give robots titles and rights? or should we
just 'utilize' their abilities while throwing out the moral
and ethical principles we apply to our fellow humans?

slave (slāv)
n.
1. One bound in servitude as the property of a person or
household.



 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 18:44 [#02077432]
Points: 35867 Status: Lurker | Followup to marlowe: #02077306 | Show recordbag



Yeah, it's amoral (not culpable), so it can't be immoral.
The moral issue lies with those who decide to use the
robots.


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 18:48 [#02077434]
Points: 35867 Status: Lurker | Followup to cygnus: #02077424 | Show recordbag



You're assuming consciousness can be programmed, but no-one
actually knows what consciousness is, or is able to define it
outside of definitions that contain the impossibility of
defining it. We can experience and describe features of it,
but consciousness itself hasn't been touched upon in any
satisfactory way.


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 18:49 [#02077436]
Points: 35867 Status: Lurker | Followup to Drunken Mastah: #02077434 | Show recordbag



There's something wrong with that paragraph, but you get the
drift.


 

offline cygnus from nowhere and everyplace on 2007-04-29 18:55 [#02077438]
Points: 11920 Status: Regular | Followup to Drunken Mastah: #02077434



you said that the samsung machinegun robot will reach a
point where it will be making decisions on its own. does
that not mean that the machine will have free will? does that
not mean that the machine has consciousness?

it is aware of what it is doing, and it is doing A LOT. it's
performing a shit load of functions independently, and with
choice. how's that different from me?


 

offline pachi from yo momma (United States) on 2007-04-29 18:59 [#02077443]
Points: 8984 Status: Lurker



If this trend continues, we may be looking at the evolution
of an entirely new biological - erm, mechanical - kingdom.

My theory is that robots will become autonomous to the point
where they will develop their own robots. They will continue
to evolve, but possibly at an exponentially higher rate than
that at which flora and fauna evolved. It will come to the
point where the term "robot" will become obsolescent,
since the original Czech word meaning "labor" from
which "robot" derives will no longer be a suitable
term for these new autonomous life forms. Perhaps a new term
will replace "robot"; the one I came up with is
mechagen. Anyone who has a more appropriate
alternative is welcome to contend.

Not that this is of immediate concern, but there is the
potential that these so-called robots will spark a
revolution on a global scale, provided the field of robotics
continues to advance as it does.

Anyway..


 

offline Drunken Mastah from OPPERKLASSESVIN!!! (Norway) on 2007-04-29 19:10 [#02077452]
Points: 35867 Status: Lurker | Followup to cygnus: #02077438 | Show recordbag



I said no such thing. I said the machine isn't capable of
decisions; it can only calculate.


 

offline cygnus from nowhere and everyplace on 2007-04-29 19:22 [#02077458]
Points: 11920 Status: Regular | Followup to Drunken Mastah: #02077452



aw, fuck. that whole exchange was meant for marlowe

did you choose matching avatars on purpose?


 

