Official 1C Company forum > 1C Publishing > IL-2 Sturmovik: Cliffs of Dover > Pilot's Lounge
  #1  
Old 01-27-2013, 03:26 PM
raaaid raaaid is offline
Senior Member
 
Join Date: Apr 2008
Posts: 2,329
will AI ever get so complex as to deserve rights?

Or, even if AI gets as complex as human intelligence, does it still deserve no rights, in this vein?




Last edited by raaaid; 01-27-2013 at 03:28 PM.
  #2  
Old 01-27-2013, 04:43 PM
Das Attorney Das Attorney is offline
Approved Member
 
Join Date: May 2011
Posts: 142
Well if it's ever that intelligent, then it can make its own mind up whether it deserves 'rights'...
  #3  
Old 01-27-2013, 05:38 PM
badfinger badfinger is offline
Approved Member
 
Join Date: Jun 2010
Location: League City, TX
Posts: 319

Quote:
Originally Posted by Das Attorney View Post
Well if it's ever that intelligent, then it can make its own mind up whether it deserves 'rights'...
IT will probably want to be a lawyer.

Binky9
__________________
Win10 64 bit
1T Hard Drive
ASUS P67 motherboard
Intel i7 3.4ghz Processor
GTX 780 Graphics Card OC
24GB Ram
Track IR5
50" LG HDMI LED 1920x1080 60hrz
MS FFB2 Stick
CH Pedals
Saitek Throttle/Prop/Mixture and Trim wheel
Thrustmaster MFDs

League City, TX
  #4  
Old 01-28-2013, 01:39 AM
Skoshi Tiger Skoshi Tiger is offline
Senior Member
 
Join Date: Nov 2007
Location: Western Australia
Posts: 2,196
It would decide our fate in a microsecond.

Hmmmm! Not Cool!
  #5  
Old 01-28-2013, 02:49 AM
tk471138 tk471138 is offline
Senior Member
 
Join Date: May 2011
Posts: 285
As far as I'm concerned, aliens, robots, and AI have one right when I'm around, and that is the right to get killed by me... I hate robots, aliens, and AI and will not tolerate them, and I suggest you people think the same way: should aliens land, immediately kill them, the same with robots or AI, before they get a foothold... robots, aliens, and AI want one thing, and that is to dispose of us...



The AI or robots only have whatever rights their creator wants to endow them with...
  #6  
Old 01-28-2013, 12:09 PM
KG26_Alpha's Avatar
KG26_Alpha KG26_Alpha is offline
Super Moderator
 
Join Date: Jan 2008
Location: London
Posts: 2,796

Quote:
Originally Posted by tk471138 View Post
As far as I'm concerned, aliens, robots, and AI have one right when I'm around, and that is the right to get killed by me... I hate robots, aliens, and AI and will not tolerate them, and I suggest you people think the same way: should aliens land, immediately kill them, the same with robots or AI, before they get a foothold... robots, aliens, and AI want one thing, and that is to dispose of us...



The AI or robots only have whatever rights their creator wants to endow them with...




AI rules were laid out in the SF world. The three laws, from Isaac Asimov, went like this:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

I think a few movies have used those laws from Asimov.

I'm not an SF expert; I remember reading Asimov years ago and it rang a bell with me.
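The "except where such orders would conflict with the First Law" clauses amount to a strict priority ordering between the three laws. As a minimal sketch (not from Asimov or this thread; the action names and flags are invented for illustration), that ordering can be encoded as a lexicographic comparison of violation tuples:

```python
# Each law becomes one slot in a violations tuple, most important first.
# Python compares tuples lexicographically, so an action that violates the
# Second Law but not the First always beats one that violates the First.

def law_violations(action):
    """Return (First, Second, Third) law violations as booleans."""
    return (
        action["harms_human"],        # First Law: may not injure a human
        not action["obeys_order"],    # Second Law: must obey human orders
        action["destroys_self"],      # Third Law: must protect itself
    )

candidates = [
    {"name": "obey_and_harm",  "harms_human": True,  "obeys_order": True,  "destroys_self": False},
    {"name": "disobey_safely", "harms_human": False, "obeys_order": False, "destroys_self": False},
]

best = min(candidates, key=law_violations)
print(best["name"])  # -> disobey_safely: the First Law outranks the Second
```

The tuple trick is just one way to get the precedence right; the point is that a lower law can never outvote a higher one.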
__________________



Last edited by KG26_Alpha; 01-28-2013 at 12:24 PM.
  #7  
Old 01-28-2013, 01:02 PM
ZaltysZ's Avatar
ZaltysZ ZaltysZ is offline
Approved Member
 
Join Date: Sep 2008
Location: Lithuania
Posts: 426

Quote:
Originally Posted by KG26_Alpha View Post
Isaac Asimov

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
The First Law has a fault - a possible dead end - because it does not cover situations in which a dilemma arises: saving one human causes harm to another, so acting violates the law, but doing nothing violates it too. Undefined behavior, anyone? That is the nastiest thing that could happen in software.
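That dead end can be made concrete. A hypothetical sketch (the action names and harm sets are invented for illustration): a checker that reads the First Law literally simply has no permitted action in a two-victim dilemma, leaving the program in exactly the undefined-behavior territory described above:

```python
def first_law_permits(action, harms):
    """The First Law read literally: an action is permitted only if it
    harms nobody, counting harm-through-inaction the same as harm."""
    return len(harms[action]) == 0

def choose_action(harms):
    """Return any lawful action, or None if every option violates the law."""
    for action in harms:
        if first_law_permits(action, harms):
            return action
    return None  # deadlock: the law defines no behavior for this case

# Trolley-style dilemma: saving A harms B, saving B harms A,
# and doing nothing lets both come to harm.
harms = {
    "save_A": {"B"},
    "save_B": {"A"},
    "do_nothing": {"A", "B"},
}

print(choose_action(harms))  # -> None: no action complies with the First Law
```

Whatever the robot does next with that `None` is behavior the law itself never specified.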
  #8  
Old 01-28-2013, 01:24 PM
swiss swiss is offline
Approved Member
 
Join Date: Mar 2010
Location: Zürich, Swiss Confederation
Posts: 2,263

Quote:
Originally Posted by tk471138 View Post
As far as I'm concerned, aliens, robots, and AI have one right when I'm around, and that is the right to get killed by me... I hate robots, aliens, and AI and will not tolerate them, and I suggest you people think the same way: should aliens land, immediately kill them, the same with robots or AI, before they get a foothold... robots, aliens, and AI want one thing, and that is to dispose of us...



The AI or robots only have whatever rights their creator wants to endow them with...
lol, what do you reckon?
  #9  
Old 01-28-2013, 01:33 PM
KG26_Alpha's Avatar
KG26_Alpha KG26_Alpha is offline
Super Moderator
 
Join Date: Jan 2008
Location: London
Posts: 2,796

Quote:
Originally Posted by ZaltysZ View Post
The First Law has a fault - a possible dead end - because it does not cover situations in which a dilemma arises: saving one human causes harm to another, so acting violates the law, but doing nothing violates it too. Undefined behavior, anyone? That is the nastiest thing that could happen in software.
Not really, because they are not causing the "harm" in the first place.
__________________


  #10  
Old 01-28-2013, 01:42 PM
SlipBall's Avatar
SlipBall SlipBall is offline
Approved Member
 
Join Date: Oct 2007
Location: down Island, NY
Posts: 2,718

Quote:
Originally Posted by KG26_Alpha View Post




AI rules were laid out in the SF world. The three laws, from Isaac Asimov, went like this:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

I think a few movies have used those laws from Asimov.

I'm not an SF expert; I remember reading Asimov years ago and it rang a bell with me.

Why does Sarah Connor live in fear?
__________________



GigaByteBoard...64bit...FX 4300 3.8, G. Skill sniper 1866 32GB, EVGA GTX 660 ti 3gb, Raptor 64mb cache, Planar 120Hz 2ms, CH controls, Tir5

Based on a design by: Miner Skinz.com

Powered by vBulletin® Version 3.8.4
Copyright ©2000 - 2019, Jelsoft Enterprises Ltd.
Copyright © 2007 1C Company. All rights reserved.