|
|
Anus_Presley
on 2003-05-27 08:44 [#00716573]
Points: 23472 Status: Lurker
|
|
if they made a robot that had emotions, and could fear death, and felt pain, would it still be ok to treat it badly simply because it is not human?
|
|
Anus_Presley
on 2003-05-27 08:44 [#00716574]
Points: 23472 Status: Lurker
|
|
that was a lame effort at trying to spark some serious debate, because i am interested. what makes us human?
|
|
qrter
from the future, and it works (Netherlands, The) on 2003-05-27 08:45 [#00716575]
Points: 47414 Status: Moderator
|
|
kind of depends on whether you think it can then be seen as being conscious.
|
|
dariusgriffin
from cool on 2003-05-27 08:46 [#00716579]
Points: 12430 Status: Regular | Followup to qrter: #00716575
|
|
I guess that anything that can fear death could be considered as conscious...
|
|
Anus_Presley
on 2003-05-27 08:47 [#00716580]
Points: 23472 Status: Lurker
|
|
if i punch, it is a chemical reaction that makes you feel pain, and the same if i make fun of you: the upset you feel is caused by chemicals.
if this robot is made fun of, its emotions are caused by electrical signals, like human neurons in a sense, so is it just as bad?
|
|
Anus_Presley
on 2003-05-27 08:48 [#00716581]
Points: 23472 Status: Lurker
|
|
if i punch a human*
|
|
Ceri JC
from Jefferson City (United States) on 2003-05-27 08:56 [#00716586]
Points: 23533 Status: Moderator | Show recordbag
|
|
My AI lecturer is really into this. If you're interested, read all about it...
His ideas relating to "can a robot have a soul" are here. It's worth mentioning he's a practicing Roman Catholic himself...
He's one of the leaders in the field.
|
|
Anus_Presley
on 2003-05-27 08:59 [#00716593]
Points: 23472 Status: Lurker | Followup to Ceri JC: #00716586
|
|
o nice one
|
|
korben dallas
from nz on 2003-05-27 09:08 [#00716600]
Points: 4605 Status: Regular
|
|
does it matter?
the fact that people are human doesn't stop us from treating some of them badly, does it?
|
|
Anus_Presley
on 2003-05-27 09:10 [#00716603]
Points: 23472 Status: Lurker | Followup to korben dallas: #00716600
|
|
that's not my point
|
|
Morgoth
from Stella-town (Belgium) on 2003-05-27 09:19 [#00716613]
Points: 1264 Status: Regular | Followup to korben dallas: #00716600
|
|
some treat other humans badly, and cause them to feel pain. I try to minimize my negative influence on my fellow human beings. I'm not saying that I always succeed, but most people try to.
We're all emotionally touched by others' pain, like seeing images of people in Algeria after the earthquake.
|
|
korben dallas
from nz on 2003-05-27 09:28 [#00716622]
Points: 4605 Status: Regular
|
|
agreed ... 'tis deviating from the "what makes us human" question.
sticking to topic then:
could go the zen/schopenhauer route, and say that the human/non-human distinction is arbitrary, as we are all manifestations of the same underlying ONE.
or Heidegger: the human/Dasein is a being for whom its Being is always an issue; its essence cannot be explicated in a list or axiomatic form.
|
|
korben dallas
from nz on 2003-05-27 09:29 [#00716623]
Points: 4605 Status: Regular
|
|
in light of the ZEN option then ... the robot scenario is interesting, as we "create" robots in a non-biological way ... unless it's considered that we give "birth" to their consciousness or something.
|
|
uzim
on 2003-05-27 09:51 [#00716636]
Points: 17716 Status: Lurker
|
|
of course! i'd even marry a robot, if (s)he would. in another, better life.
and then we'd be living happily, wandering around the world in a wooden spaceship without children, and go to the beach and throw magical pink rings on police seagulls to make them sing, unless we'd have kernel panics. and then we'd sleep together and have a threesome with a black shadow ghost, and eat curry ice cream with strawberries.
now robots shouldn't be perfect imo, otherwise humans would get jealous (that's what happens in chobits so far)...
|
|
uzim
on 2003-05-27 09:54 [#00716638]
Points: 17716 Status: Lurker
|
|
maybe we're just biological robots; that wouldn't bother me or surprise me.
|
|
plaidzebra
from so long, xlt on 2003-05-27 10:21 [#00716650]
Points: 5678 Status: Lurker
|
|
being a biological robot will bother you eventually, believe me.
eventually, a human can realize that while an animal, human, or robot may recover from the pain inflicted, the choice to inflict pain is a human's own poison.
that is to say, what's important is not the robot registering pain, but our self-reflection on the motivation to inflict pain.
|
|
uzim
on 2003-05-27 10:35 [#00716659]
Points: 17716 Status: Lurker
|
|
i don't get your point....
bah, i barely get anything anyway.
|
|
Q4Z2X
on 2003-05-27 10:39 [#00716662]
Points: 5264 Status: Lurker
|
|
i know there is some type of difference, but i am too lazy to explain it.
|
|
elusive
from detroit (United States) on 2003-05-27 11:08 [#00716683]
Points: 18368 Status: Lurker | Show recordbag
|
|
"I guess that anything that can fear death could be considered as conscious... "
no, that would be realizing that one day we will die, not just fearing it.
Yeah, I'm here because of my thumb. We owe everything to our thumbs, bow down to them. I think what makes something human is realizing and taking into account what will happen TOMORROW as a result of his/her actions today. Being able to realistically calculate cause & effect.
If a robot had AI, you could treat it however you want. There is no right or wrong in how you treat a robot.
The difference is that you (as a human) have now placed a human characteristic on a nonliving object.
By doing this, you will react to and be affected by this object, and give up part of your emotional control to it.
Building AI into robots will not cause the robot harm, only the human who allows himself/herself to become emotionally attached to the object.
|
|
elusive
from detroit (United States) on 2003-05-27 18:58 [#00717258]
Points: 18368 Status: Lurker | Show recordbag
|
|
no?
|
|
lichtswitch
from playskull.com (United States) on 2003-05-27 19:05 [#00717267]
Points: 165 Status: Regular
|
|
ai action ALIZTHINK 20 2 5 1 40
|
|
lichtswitch
from playskull.com (United States) on 2003-05-27 19:06 [#00717268]
Points: 165 Status: Regular
|
|
(ai) (name) (action) (move) (basic ai type)
|
|
mimi
on 2003-05-27 19:21 [#00717273]
Points: 5721 Status: Regular
|
|
damn - i thought this was going to be about freeform.
|
|
weatheredstoner
from same shit babes. (United States) on 2003-05-27 19:39 [#00717291]
Points: 12585 Status: Lurker
|
|
...but if I kill her and then rape her dead body, does that make me a bad person?
|
|
weatheredstoner
from same shit babes. (United States) on 2003-05-27 19:40 [#00717293]
Points: 12585 Status: Lurker | Followup to weatheredstoner: #00717291
|
|
oops! posted in the wrong topic... boy, am I embarrassed.
|
|
od_step_cloak
from Pleth (Australia) on 2003-05-27 20:13 [#00717323]
Points: 3803 Status: Regular
|
|
I wouldn't treat a robot badly even if it didn't have a consciousness.
i never bashed my toys, even.
i just never wanted to.
|
|