
Alien and Robot rights?


archimage_a



I don't "know" how to swallow food, for example; it's an innate ability.

 

 

 

You really don't consider yourself over-analytical? :mrgreen:

 

 

 

Heh, it's difficult to talk semantics without being anal.

La lune ne garde aucune rancune. (The moon bears no grudge.)


And I believe that salmon 'know' they have to get back to the top of the river. They don't feel inclined; they just know they have to, and no article or page I have read tells me that a salmon is aware of the fact (or theory) that the eggs have a better chance of survival up there.

 

By the same token, if you do not know how to swallow food (I don't either, I would like to point out), then you are not sapient, by your own definition.

 

 

 

I am trying to argue this, but we are both arguing over such a small area that it's difficult to stay within it.

 

Sapience would indicate that some things are passed on from generation to generation, in line with Lamarck's theory of evolution, which is largely discredited.

 

Whereas sentience would indicate that things have to be relearned each generation, but some things are chemically forced (the impulses that force the heart to beat, for instance), in line with Darwin's theory of evolution, which is currently accepted in most of the world.

Well I knew you wouldn't agree. I know how you hate facing facts.


How does sapience indicate the passing on of any kind of genetic knowledge? All that's passed on is sapience itself. That is, the ability to 'hear', question, rationalise and think about your own thoughts, or metacognition, which as I said earlier is considered one of the definitions of sapience. It should also be noted that I never said sapience and sentience were mutually exclusive, since I should rather hope that they're both things we could each claim to be.


The knowledge has to come from somewhere, and if it isn't learned then it must be passed on.

 

 

 

I think they are just two interpretations of the same thing, but let's move on with the term sentience, because it doesn't have any issues around its use, save quibbling.


I think if we actually did come across aliens and for whatever reason did not destroy them, they wouldn't exactly need us to give them "rights". It's not like they'd be colonizing our planet and living among us.


Command the Murderous Chalices! Drink ye harpooners! drink and swear, ye men that man the deathful whaleboat's bow- Death to Moby Dick!

BLOOD FOR THE BLOOD GOD! SKULLS FOR THE SKULL THRONE!


Well, there is always the option of visits... and black marketeers, and all that sort of stuff. Unless we really made an effort to stop them (and even then it would happen eventually), if we let them survive (and vice versa) then we would mix. Then the issue would become very important, because with mixing comes crime... if not from them, then certainly from us, since so many of us consider them to not have rights ;) .

 

 

 

As to silver wits: if mind control became available, would humans lose their sentience, since we could just reprogram them? In that case none of us are sentient; we can all be 'reprogrammed', it is simply a lot cruder and takes a lot longer at the moment...


As to silver wits: if mind control became available, would humans lose their sentience, since we could just reprogram them? In that case none of us are sentient; we can all be 'reprogrammed', it is simply a lot cruder and takes a lot longer at the moment...

 

It's not mind control. Robots pretty much are just machines, and machines can be reprogrammed.


(I am shamelessly going to rip this argument from Star Trek)

 

 

 

"We are machines as well, just of a different type. Robots are created by (in this case) humans; do we deny that? No. Children are created from the building blocks of their parents' DNA. Are they property?"

 

 

 

I find it hard to believe that it is not considered mind control to forcibly change something's mind about something. We practice this weak form of mind control on a daily basis, telling people tiny bits of information, mostly irrelevantly, sometimes importantly... but every second we are changing the people around us simply by being there. The number of what-ifs is incredible. What if:

 

 

 

A random person in the street had side-stepped a man carrying a box. Because they side-stepped him, he was slightly faster, meaning he got back to his house slightly earlier than he should have. As a result he fumbles with his keys, which have moved slightly further down into his pocket, and as a result of that his dog is closer to the door than it would have been, so he trips over it, injuring himself. Because he has injured himself he can't meet his wife at the theatre that evening. After waiting for twenty minutes she is furious with him and heads to the exit. She collides with a person who has just stood up. The gun they have in their hand discharges, and she screams. Someone tackles the gunman to the ground and he is arrested. As a result of this, Lincoln remains president of the United States of America.

 

The future would be irrevocably altered. Supposing during his extended presidency he shakes the hand of someone who, on a different day, would have been killed by a falling chimney stack. Because he isn't dead, he gets married, has children, and his great-great-great-granddaughter becomes the first woman in space, on an ill-fated UA (United Americas, since Canada became part of the USA, and since space travel was never nationalised as NASA) shuttle craft. However, because she was there, she spotted the hairline fracture in the radiation shield, which saved everyone on board.

 

And so on and so on.

 

 

 

This demonstrates two things. First, that mind control (the wife thinks that her husband is drinking again after he promised her he would go to the theatre) happens every day in an almost undetectable way. The second is that the smallest action can have the most extreme consequences. A large action almost always changes the world, and making it immoral and illegal to alter robots' memories would be a major action.

 

 

 

For instance: if you had (another Star Trek argument) thousands of robots, without rights to consider, without maintenance to consider (it is cheaper to buy something than to repair it, with most things anyway)... imagine the possibilities. Why send a human into a dangerous area when a robot can be sent in to do the job for you? Of course you don't just want a robot like the Mars lander, something that needs constant directions; you want something that will follow your orders but adapt to the situation. They would be intelligent, they would just be very precisely programmed not to want freedom, not to consider themselves important, not to challenge. In a sense they would be 'educated' to be subservient to humanity, almost like a dog.

 

 

 

Now it depends on whether or not you think that human life is equal to other life. Is it right to send something that can feel pain and suffering (you want something to be able to feel if it is going to be doing something delicate; of course you could just give it a low threshold for pain, which would make sense, but then you would have to reboot it or it wouldn't realise it had walked onto a [bleep]e or whatever... I am sure that there are ways around that, but why bother? It is only a machine, and programming it with all of these things is going to cost so much more) to do something that will certainly (not maybe, certainly) result in its death?

 

 

 

Is it right to do this when the only other option is that a human will have to go? It would give the human a chance to make a moral statement, wouldn't it, selflessly giving up their life for the machine. But it is a safeguard, isn't it? You are making completely certain that the humans survive, if they want to. You are giving them the choice of whether or not they want to die. It would make all humans equal before death's judgement; they can choose. (Stopping now because I am repeating myself.)

 

 

 

So, ending with a story from the good old USSR. During the Second World War the Russians needed an anti-tank weapon that was better than sending cowardly, and expensive, humans to fight. So they trained dogs to look for food under tanks. Now this may sound stupid, but they then strapped mines to the dogs' backs before they sent them into battle. Now this sounds like a great (terribly cruel, but great) tactic, doesn't it? You can produce dogs fairly quickly and fairly cheaply; the mine is probably more expensive than raising the dog. Anyway, they trained up these dogs, strapped the mines to their backs and released them during a battle. Having been trained on Russian tanks, the dogs understandably headed towards the Russian tanks.

 

This is not meant to sway you this way or that, just to give you an example of a real situation of using a sub-human outside of slavery.


Your argument is flawed for the simple fact that the example you gave using the random person sidestepping the man with a box is exactly what mind control isn't.

 

 

 

Cause and effect has nothing to do with directly influencing an individual's thought processes.

 

 

 

Robots and trained dogs make for a fundamentally flawed comparison. First, the dog is, by nature, created and suited for a task other than the one it is being employed in at that present time. Evolution has built it to be a predator or scavenger, and only by some trick of adaptation has it become domesticated.

 

 

 

Not so with the robot! From the very beginning, it has been created for the express purpose of performing tasks in place of humans. They are tools, nothing more. You do not give rights to a hammer or a saw, even automated ones, on the off chance that they might be inconvenienced by their task.

 

 

 

The robot is not being 'educated' to be subservient to humanity - it is subservient to humanity right from the outset. It does not need to be driven to tasks and actions that it would not normally do, as in the case of the dogs. You do not make a hammer and then teach it how to pound nails into wood. You do not make a wheel and then teach it how to roll.

 

 

 

We do not need to program sentience into robots. We already have systems which adapt and operate efficiently under changing circumstances with only limited human direction (e.g. master systems and AI). We do not need to program these to not seek freedom or not be self-conscious, because they do not know what these things are. We create them so that the eventuality does not even arise in the first place.

 

 

 

How, then, can you equate robots to humans? No matter how indoctrinated into particular modes of thinking, the human ultimately has a capacity for independent thought and action, and always retains the potential to contradict his established patterns of thought and behaviour. The robot is specifically made to not even have this capability.


(I am shamelessly going to rip this argument from Star Trek)

 

 

 

"We are machines as well, just of a different type. Robots are created by (in this case) humans; do we deny that? No. Children are created from the building blocks of their parents' DNA. Are they property?"

 

 

 

I find it hard to believe that it is not considered mind control to forcibly change something's mind about something. We practice this weak form of mind control on a daily basis, telling people tiny bits of information, mostly irrelevantly, sometimes importantly... but every second we are changing the people around us simply by being there. The number of what-ifs is incredible. What if:

 

 

 

A random person in the street had side-stepped a man carrying a box. Because they side-stepped him, he was slightly faster, meaning he got back to his house slightly earlier than he should have. As a result he fumbles with his keys, which have moved slightly further down into his pocket, and as a result of that his dog is closer to the door than it would have been, so he trips over it, injuring himself. Because he has injured himself he can't meet his wife at the theatre that evening. After waiting for twenty minutes she is furious with him and heads to the exit. She collides with a person who has just stood up. The gun they have in their hand discharges, and she screams. Someone tackles the gunman to the ground and he is arrested. As a result of this, Lincoln remains president of the United States of America.

 

The future would be irrevocably altered. Supposing during his extended presidency he shakes the hand of someone who, on a different day, would have been killed by a falling chimney stack. Because he isn't dead, he gets married, has children, and his great-great-great-granddaughter becomes the first woman in space, on an ill-fated UA (United Americas, since Canada became part of the USA, and since space travel was never nationalised as NASA) shuttle craft. However, because she was there, she spotted the hairline fracture in the radiation shield, which saved everyone on board.

 

And so on and so on.

 

 

 

This demonstrates two things. First, that mind control (the wife thinks that her husband is drinking again after he promised her he would go to the theatre) happens every day in an almost undetectable way. The second is that the smallest action can have the most extreme consequences. A large action almost always changes the world, and making it immoral and illegal to alter robots' memories would be a major action.

 

First off, that's not mind control. That's a random sequence of events leading to a big change over time.

 

 

 

Second off, that has nothing to do with reprogramming machines. We already reprogram machines if they're not working correctly. This is little different, and it's a process we already know how to do without conditioning them as the Russians did with their dogs.

 

 

 

Third off, we wouldn't know if it changes the world or not, or if it did for better or for worse. I don't think laws can be made on huge "what-if?" statements.

 

 

 

And fourth off, if you're creating sentient robots in the first place, you're not doing it right.


Sigh....I hate not being able to talk in person, I digress and end up making a fool of myself.

 

 

 

Cause and effect has nothing to do with directly influencing an individual's thought processes.

 

OK, I am a rationalist (someone who believes that things have rational causes and follow their own logic, whereas an irrationalist will say that things have a set plan and individuals cannot influence that plan in any major way... things will happen anyway).

 

I believe that when we educate people to think in a certain way, that is what we get: people who have been programmed to believe something. Saying that democracy, or anarchy, or communism, or the Spice Girls are good is not something that can be developed independently of education. If robots are programmed to be subservient, then that is what they will be; you are taking away the choice, in exactly the same way that educating a child that they are inferior in every aspect, that they are only here to serve, takes away the choice.

 

The only difference here is that you are dealing with a machine.

 

 

 

OK. You also make a flawed comparison. We are not talking about a hammer: a lump of metal or rock on a stick, something that has been used for thousands of years and does not, to our knowledge, feel or think.

 

An automatic hammer is the same thing. You have a lump of metal or rock on a stick. You then have an arm equivalent and a whole range of mechanical material. You have things that work by smashing themselves into other things. Cogs, hammers, spears: everything humanity has developed requires something to smash itself into something else to yield a result.

 

 

 

A robot that can have intelligence (we are assuming that you personally did not make the robot, and that the person who did gave it intelligence) is therefore not something that cannot think or feel. It is not something that is limited by humanity's limited manipulatory abilities to smashing itself against something else to yield a result. It is NOT built with the express purpose of performing tasks. It is built with the express purpose of being alive. Once it is alive it needs to be given rights or be subjugated. That is what this is about, not what we personally feel a robot should be designed to do. Yes, that plays an important part, a critical part in fact, in the decision-making process, but you have to realise we are not talking about

 

"Oh well, in fifty years I will have a working prototype, and before I start I thought I should ask if I should put emotions in."

 

We are talking about a situation where a robot that can feel, that can think, and that can function independently from humans indefinitely has been created. The issue of its rights has been raised, and you need to have an argument or accept other people's decisions.

 

 

 

 

 

It would not be "subservient to humanity right from the outset". The choice has to be made whether to wipe its memory and use it as a tool, in which case you ARE educating it into subservience, or whether it should be given protection from people who seek to do that.

 

 

 

Sorry, I find this line humorous:

 

"You do not make a hammer and then teach it how to pound nails into wood."

 

I agree with both your statements on either side of it, but this one just made me laugh: the thought that by making a hammer it would suddenly start pounding nails into wood, something it would of course not naturally do. Sorry again, but I need to lighten the tone after the last part.

 

You do not have to teach a hammer how to nail wood, but you do have to 'supervise it', 'guide it' and ultimately 'control it' into doing what you want. You can't simply tell it to hammer the nails into the wood; you have to 'force it' to do that. You do not have to teach a wheel how to roll; that is very true, a wheel is perfectly capable of rolling on its own. Once in position, which could very easily happen naturally, it could roll for eternity. Of course, the actual number of naturally occurring wheels is quite small. I don't think there are any, but there might be, so... Of course boulders, rocks, pebbles and the like do roll, and they do occur in nature. We were probably inspired to make the wheel by watching boulders.

Now, I know you are going to rip this argument to shreds, but I am going to say it anyway. We built wheels on the inspiration of boulders, and as such we treat them in the same way. If we built robots, we would have taken the inspiration from ourselves, so should we not treat them as we would treat ourselves? Now, that argument is incredibly weak, since you have decided that they will not really be like us; they will just be an extension of what we already know, no new inspiration. But supposing it was, does that change your opinion on this argument's validity at all?

 

 

 

OK, I agree. We do not need to program sentience into machines; I think it is a rather silly idea, to be frank. Creating artificial life, on the other hand, is not a silly idea. Stephen Hawking once said:

 

"I think computer viruses should be considered life forms. It says a lot about us as a people, that the only life we can create is designed to destroy."

 

Yup, ultra-naive, but creating robots that can think on their own is a major step forward for us as a species: creating life and learning to interact with something different, something that is possibly dangerous, something that isn't controllable in the same way a computer virus is. It is one of those things that we are going to have to come to terms with at some point, because there is going to be something out there that is stronger than us; if there isn't, then we are the luckiest species in existence, because it means we can cleanse the universe and never face the chance of extinction by aliens. But if there does come a race that is stronger and does not instantly destroy us, then we are going to have to learn to interact with them. Another potentially weak argument: we should create robots because it will give us a chance to learn how to interact with other races, especially since we programmed the robot. But it's not just that; it is one of the major events I was going on about. It changes our view of the world. In the same way Oliver Cromwell and George Washington changed the way we think about politics, it affects every other part of society.

 

 

 

Lastly, Zonorhc, I would like to point out that you are again 100% correct. It is a theory known as rationalism (well, probably; that's what I have heard, read and spoken it as), in which everything has a rational cause and a rational effect. You put 2 in and you get 5 out because there is a process between the two. Whereas under irrationalism, where things will happen regardless, you can put anything in and you will always get 2. Humans of the first type, or who believe in the first type, are capable of counter-thought, capable of thinking outside the box and changing society. Humans of the second type, or, again, who believe they are part of the second type, will go with the flow, and essentially that is how I assume you would design robots: so that they thought within parameters.

 

 

 

So again I must ask for your response: if they were created to be part of the first group, would that make a difference?

 

 

 

 

 

Ok then, on to Lenin.

 

 

 

Mind control versus influence: it really does depend on your interpretation. Random sequences of events, like being mugged by a black person, can and often do lead to prejudices being formed, which is a form of mind alteration, and therefore comes under the broadest meaning of mind control.

 

 

 

Actually, most of the time we just hit machines that go wrong. But I do see your logic, and the fatal flaw. 'If they're not working correctly': now, a criminal is not working how society would like them to behave, so we condition (mind-control) them to act on a more acceptable level. Now take a robot that is not 'working incorrectly', not acting as an unproductive member of society, not relying on the charity of others, simply working as best it can in whatever field it is THEY have chosen. Why should that robot be reprogrammed into being docile? Isn't that (and I know this argument will get shredded) the same as a man beating his wife (or vice versa) into submission so that he is the most important part of the house? We are going to reprogram robots, aliens and whatever else into docile second-class citizens so that we can maintain our claim to the top of the social ladder.

 

 

 

Sorry, again I have to laugh: the thought that you do not know how to train a dog. But I take your point. We know how to speak binary, so reprogramming a machine would be easy work; a robot would essentially have just a very big program. Again, something we have in common with them: essentially we have a very large, very powerful program, running on a very powerful machine, all happening inside our heads. If it was discovered tomorrow (though I doubt we would ever know if it was) that the human mind could be very easily altered, would that make you reconsider?

 

 

 

I don't think I implied that anything was better or worse with Lincoln alive; I simply stated that things were different. It doesn't make me laugh, but it is rather humorous: 'I don't think laws can be made on huge "what-if?" statements.' To think that America and Russia went through the Cold War on the sound principle that the other was preparing for war; it wasn't a what-if statement. Let's be realistic, laws are always made on what-ifs.

 

What if a man has a bike stolen; should the thief go to prison? What if that thief was actually a police officer who was chasing a villain? What if that villain had just shot someone? What if that someone was a terrorist and the villain was actually a CIA agent?

 

Laws are passed with loopholes because that is the way it is. In Britain the 'foolish' Conservatives want to remove all the laws which have been put in place without considering the what-ifs. The laws themselves are sound; it is simply their enforcement, or rather the fact that if they are not enforced so strictly, the money-hungry world will cripple the government with lawsuits. But I am digressing. The point is that laws have to be made considering all the factors, not just the immediate ones. America majorly messed that up when it banned slavery (slavery is bad, yes, I know; please let me explain). It needed to be tackled properly, not in such a high-handed and heavy-handed manner. Neither the short term nor the long term was solved; it simply looked like people were doing something.

 

 

 

Thank you for your opinion. Please see above.


While I agree that, yes, being mugged by a black man can cause certain prejudices, that is something very different from bumping into a man on the street and causing Lincoln to live. One is an actual encounter that has some lasting effect on the person, and the other is just a chance meeting out of the blue.

 

 

 

If it were discovered that the human mind could be easily manipulated, would I change my mind? About what, that we should be able to reprogram humans? Well, that's kind of the point of prison (though it doesn't usually work out), so sure. But with robots, we're the creators. It's not really the same as beating your wife, because one is caused by brute force and can only be grudgingly taken, but the robot is perfectly fine with being below humans; I'm sure we've all seen or read I, Robot, and the robots were perfectly fine as they were. I disagree with the concept of a class system, whatever species or robot it happens to be, so I'm with you there, but I still think that robots shouldn't be made sentient in the first place.


I think man can be quite arrogant when we see ourselves as 'ruining' nature, or Mother Nature, by making species extinct, creating ozone holes, etc. Through evolution we are as much a part of the natural process as any other plant or animal; this means that our actions are also part of the ongoing natural process (whether it is inevitable or not).

 

 

 

By implying that somebody is messing with or spoiling nature, we are also implying that we are superior and have the ability to ruin it. In actual fact we are just a small segment of it, and all of our actions are part of it.

 

 

 

Therefore robots are as much a part of this process as we are, and we created them as much as nature created us. We have no rules forced upon us by nature, and we turned out all right with our sentience; the same thing would occur for 'life' that we create. See where I am going? If we ever did create a truly artificial intelligence with the ability to think freely by itself, it would develop integrity and know the difference between right and wrong.


[hide=]

Sigh....I hate not being able to talk in person, I digress and end up making a fool of myself.

 

 

 

Cause and effect has nothing to do with directly influencing an individual's thought processes.

 

Ok I am a rationalist(Someone who believes that things have rational causes and follow their own logic. Where as an Irrationalist will say that things have a set plan and individuals cannot influance that plan in any major way...things will happen anyway.)

 

I believe that when we educate people to think in a certain way that is what we get, people who have been programmed to believe something. Saying that democracy, or anarchy, or communism or spice girls are good is not something that can be developed independantly of education. If robots are programmed to be subserviant then that is what will be, you are taking away the choice, in exactly the same way as educating a child that they are inferior in every aspect, that they are only here to serve takes away the choice.

 

The only difference here is that you are dealing with a machine.

 

 

 

Ok. You also make a flawed comparison. We are not talking about a hammer. A lump of metal or rock and a stick. Something that has been used for thousands of years and does not, to our knowladge, feel or think.

 

An automatic hammer is the same thing. You have a lump of metal or rock on a stick. You then have an arm equivelent and a whole range of mechanical material. You have things that are working by smashing themselves into other things. Cogs, Hammers, Spears everything humanity has developed requires something to smash itself into something else to yeild a result.

 

 

 

A robot that can have intelligence(We are assuming that you personally do not make the robot and that the person who did gave it intelligence) is therefore not something that cannot think or feel. It is not something that is limited by humanity's limited manipulatory abilities to smash itself against something else to yeild a result. It is NOT built with the expressed purpose of performing tasks. It is built with the expressed purpose of being alive. Once it is alive it needs to be given rights or subdugated. That is what this is about. Not what we personally feel a robot should be designed to do. Yes it plays an important part, a critical part in fact in the decision making process but you have to realise we are not talking about

 

"Oh well in fifty years I will have a working prototype and before I start I thought I should ask if I should put emotions in"

 

We are talking about a robot that can feel, that can think and can function independantly from humans indefinately has been created. The issue of it's rights has been asked and you need to have an arguement or accept other people's decisions.

 

 

 

 

 

It would not be "subservient to humanity right from the outset". The choice has to be made of whether to wipe it's memory and use it as a tool, in which case you ARE educating it into subservience, or if it should be given protection from people who seek to do that.

 

 

 

Sorry. I find this line humourous

 

"You do not make a hammer and then teach it how to pound nails into wood."

 

I agree with both your statements either side of it but this one just made me laugh. The thought that by making a hammer it would suddenly start pounding nails into wood. Something it would of course not naturally do. Sorry again but I need to lighten the tone after the last part.

 

You do not have to teach a hammer how to nail wood. You do have to 'supervise it', 'guide it' and ultimately 'control it' into doing what you want. You can't simply tell it to hammer the nails into the wood you have to 'force it' to do that. You do not have to teach a wheel how to roll. No that is very true, a wheel is perfectly capable of rolling on its own. Once in position, which could very easily happen naturally, it could roll for eternity. Of course the actual number of naturally occuring wheels is quite small. I do think there are, but there might be so... Of course Boulders, rocks, pebbles and the like do roll and they do occur in nature. We were probably inspired to make the wheel by watching boulders. Now I know you are going to rip this arguement to shreads but I am going to say it anyway. We built wheels on the inspiration of boulders and as such we treat them in the same way. If we built robots then would would have the inspiration from ourselves, so should we not treat them as we would. Now that arguement is incredibly weak, since you have decided that they will not really be like us, they will just be an extention of what we already know, no new inspiration. But supposing it was, does that change your opinion on this arguements validity at all?

 

 

 

Ok, I agree. We do not need to program sentience into machines; I think it is a rather silly idea, to be frank. Creating artificial life, on the other hand, is not a silly idea. Stephen Hawking once said:

 

"I think computer viruses should be considered life forms. It says a lot about us as a people that the only life we can create is designed to destroy."

 

Yup, ultra-naive, but creating robots that can think on their own is a major step forward for us as a species: creating life and learning to interact with something different, something that is possibly dangerous, something that isn't controllable in the way a computer virus is. It is one of those things we are going to have to come to terms with at some point, because there is going to be something out there that is stronger than us. If there isn't, then we are the luckiest species in existence, because it means we can cleanse the universe and never face the chance of extinction by aliens. But if there does come a race that is stronger and does not instantly destroy us, then we are going to have to learn to interact with it.

Another potentially weak argument: we should create robots because it will give us a chance to learn how to interact with other races, especially since we programmed the robot. But it's not just that; it is one of the major events I was going on about. It changes our view of the world. In the same way Oliver Cromwell and George Washington changed the way we think about politics, it affects every other part of society.

 

 

 

Lastly, Zonorhc, I would like to point out that you are again 100% correct. It is a theory known as Rationalism (well, probably; that's what I have heard, read and spoken it as), in which everything has a rational cause and a rational effect: you put 2 in and you get 5 out because there is a process between the two. Whereas under Irrationalism, where things happen regardless, you can put anything in and you will always get 2 out. Humans of the first type, or who believe in the first type, are capable of counter-thought, capable of thinking outside the box and changing society. Humans of the second type, or, again, who believe they are part of the second type, will go with the flow, and essentially that is how I assume you would design robots: so that they thought within parameters.

 

 

 

So again I must ask for your response: if they were created to be part of the first group, would that make a difference?

 

 

 

 

 

Ok then, on to Lenin.

 

 

 

Mind control versus influence. It really does depend on your interpretation. Random sequences of events, like being mugged by a black person, can and often do lead to prejudices being formed, which is a form of mind alteration and therefore comes under the broadest meaning of mind control.

 

 

 

Actually, most of the time we just hit machines that go wrong. But I do see your logic, and the fatal flaw: 'if they're not working correctly'. Now, a criminal is not working how society would like them to behave, so we condition (mind control) them to act on a more acceptable level. But consider a robot that is NOT working incorrectly: not acting as an unproductive member of society, not relying on the charity of others, simply working as best it can in whatever field IT has chosen. Why should that robot be reprogrammed into being docile? Isn't that (and I know this argument will get shredded) the same as a man beating his wife (or vice versa) into submission so that he remains the most important part of the house? We are going to reprogram robots, aliens and whatever else into docile second-class citizens so that we can maintain our claim to the top of the social ladder.

 

 

 

Sorry, again I have to laugh: the thought that you do not know how to train a dog. But I take your point. We know how to speak binary, so reprogramming a machine would be easy work; a robot would essentially just have a very big program. Again, something we have in common with them: essentially we have a very large, very powerful program, running on a very powerful machine, all happening inside our heads. If it was discovered tomorrow (though I doubt we would ever know) that the human mind could be very easily altered, would that make you reconsider?

 

 

 

I don't think I implied that anything was better or worse with Lincoln alive; I simply stated that things were different. It doesn't make me laugh, but it is rather humorous: 'I don't think laws can be made on huge "What-if?" statements.' To think that America and Russia went through the Cold War on the 'sound principle' that the other was preparing for war, as if that wasn't a what-if statement. Let's be realistic: laws are always made on what-ifs.

 

What if a man has a bike stolen: should the thief go to prison? What if that thief was actually a police officer who was chasing a villain? What if that villain had just shot someone? What if that someone was a terrorist and the villain was actually a CIA agent?

 

Laws are passed with loopholes because that is the way it is. In Britain the 'foolish' Conservatives want to remove all the laws which have been put in place without considering the what-ifs. The laws themselves are sound; the problem is simply their enforcement, or rather the fact that if they are not enforced so strictly, the money-hungry world will cripple the government with lawsuits. But I am digressing. The point is that laws have to be made considering all the factors, not just the immediate ones. America badly messed that up when it banned slavery (yes, I know slavery is bad; please let me explain). It needed to be tackled properly, not in such a heavy-handed manner. The short term was not solved and the long term was not solved; it simply looked like people were doing something.

 

 

 

Thank you for your opinion. Please see above.


 

With the robot, it would be able to use the internet, so at the last possible moment it could upload its data onto the internet. This would basically mean that the robot's body is useless: it was just using the body to perform a function, and when that function ceased to exist, it got rid of the body.

 

Also, robots don't need oxygen; all they would theoretically need is a solar panel to get electricity.

 

Unless you're gonna build a robot that needs O2.

 

 

 

EDIT

 

 

 

Also, don't you find it odd that we can't create life by piecing together humans and zapping them with electricity, like Frankenstein?

 

So, unlike robots, humans are unable to load their data onto the internet in order to be revived with a new body and all their memories intact. :shock:


Hmmmm, interesting. I suppose it could get attached to its body, you know, like we are attached to our bodies. Burying them, for example: we wouldn't make such a big deal about it if it was just a shell. Personal choice, I suppose; some robots might want to just live their life and that's all.

 

 

 

You would need more than a solar panel. While electricity would power most things, there are some things that it cannot do: repair the body, for instance, or lubricate the moving parts, or cool the system down.


But why would they be able to access the internet? They could only do so if they were built with the parts for it.


Of course, the program might not be compatible with the internet; we don't know what sort of complex streams will be created. We might only be able to transfer the memories, not the essence, like a joke that is only funny at the time. If a robot lost that, then it would lose its sentience, or at least revert to an earlier form of sentience: back to the beginning of its path, only with new knowledge this time.

 

 

 

But aliens could do it, in some undiscovered... bio-net? Or just into a higher consciousness, or use psi abilities to hide in someone else's mind...

 

 

 

I would think they would be built with access to some sort of internet, or else you would have to connect them (supposing they were enslaved) to a USB cable each time you wanted them to learn how to do something. Why bother teaching them when you can just zap it into them? Besides, if they can't cognitively progress then they cannot learn; they can only obey what they know, and what they know would have to be downloaded.

 

 

 

If they were free (not enslaved) then it would be a choice...

 

 

 

But it would not make them theoretically unenslavable: you would use a virus or something that destroyed the protocols for connecting, or you would enslave the internet and block transfers, or something.

