How petty can a God be?

Spirituality


Krackpot Kibitzer (Right behind you...) | Joined 27 Apr 02 | Moves 16879 | 25 Jul 06

Originally posted by Halitose
Your analogy is disingenuous; the robot will have to rape and murder other robots to have any valid application.
On the contrary, my robot is an equal opportunity pervert.

Krackpot Kibitzer (Right behind you...) | Joined 27 Apr 02 | Moves 16879 | 25 Jul 06

Originally posted by howardgee
Yes, but the whole point is that the inventor gave the robot free will, KNOWING that this free will would result in the death of people.
The inventor did not have to give the robot free will. It was a bad decision to give the robot free will.
Do you agree?
Suppose I don't know that the robot will rape and kill, but I know that there is a risk that he will, given his free will. Am I still responsible?

Krackpot Kibitzer (Right behind you...) | Joined 27 Apr 02 | Moves 16879 | 25 Jul 06

Originally posted by FreakyKBH
I want God to take it away, so I can be inevitably good.
Is that your will?
I am already an angel, in case you hadn't noticed, sinner.

Walk your Faith (USA) | Joined 24 May 04 | Moves 158039 | 25 Jul 06

Originally posted by howardgee
Yes, but the whole point is that the inventor gave the robot free will, KNOWING that this free will would result in the death of people.
The inventor did not have to give the robot free will. It was a bad decision to give the robot free will.
Do you agree?
So again the knowledge of right and wrong within the robot does
not matter, only the knowledge of the creator? This seems
to be the common thread of this assault on the creator: the
created must blame someone else for its own crimes.

I do not know if what the creator did can be called a bad decision;
it depends on many factors. I do believe that the actions the robot
takes with the will it has can be called bad, straight up, without a
doubt, when it does harm for the pleasure of it.
Kelly

Krackpot Kibitzer (Right behind you...) | Joined 27 Apr 02 | Moves 16879 | 25 Jul 06

Originally posted by KellyJay
Well, which is it: does the robot have free will or not? If the robot has
free will, we go after the robot; if the robot doesn't have free will, we
go after the creator of the robot. The designer either gave free will or
didn't; that which can and does make choices is responsible for the
choices made. If you claim that you can make that which can only
ma ...[text shortened]... s you are not then making a free will within your
robot; you cannot have it both ways.
Kelly
That's a point of view, clearly expressed.

So let me put it another way.

Scenario 1:

Suppose I (freely!) create 100 robots without free will. Suppose I also equip them with virtuous natural instincts that, in the absence of free will, deterministically impel them towards good.

Now, how many of these 100 can we expect, on balance, to do good, and how many, on balance, to do evil? In this case, I can expect all to be good, because I programmed them all that way.

So, if the good 100 went out into the world, and did good deeds, wouldn't I be responsible for that?

Scenario 2:

Suppose I create another 100 robots, again without free will. Suppose I also equip them with conflicting natural instincts, which, in the absence of free will, randomly impel them with equal probability to do good and to do evil.

Now, how many of these 100 can we expect, on balance, to do good, and how many, on balance, to do evil? As the matter is randomly determined, you'd expect roughly a 50-50 split.

So, if the bad 50 robots went out into the world, and did bad deeds, wouldn't I be responsible for that? After all, I could have made the robots deterministically good, as in Scenario 1.

Scenario 3:

Suppose I create yet another 100 robots, this time with free will. Suppose I also equip them with conflicting natural instincts, which, in the absence of free will, would randomly impel them with equal probability to do good and to do evil. However, now equipped with free will, they are free to pursue their good instincts and to inhibit their bad instincts, if they so choose.

Now, how many of these 100 can we expect, on balance, to do good, and how many, on balance, to do evil? I submit the answer is still 50/50. Their natural instincts are balanced; there is no reason to believe they will choose one way or the other. What effectively happens is that one level of complete unpredictability gets superimposed on another. Hence, the 50/50 figure remains unchanged.

So, if the bad 50 robots went out into the world, and did bad deeds, wouldn't I be responsible for that? After all, I could have made the robots deterministically good, as in Scenario 1.

If your answers to Scenario 2 and Scenario 3 differ, please explain why.
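
(For the probabilistically minded, here is a minimal simulation sketch of Scenarios 2 and 3. It is hypothetical code, not anything from the thread, and it assumes, as the argument does, that the free choices in Scenario 3 are themselves unbiased, i.e. a robot follows or overrides its instinct with equal probability.)

import random

def expected_good(n_robots=100, free_will=False, trials=10000):
    # Estimate how many of n_robots end up doing good, averaged over trials.
    # Instincts are balanced: each robot inclines to good or evil with
    # probability 1/2. With free_will=True, an unbiased choice layer is
    # superimposed: the robot follows or overrides its instinct, 50/50.
    total_good = 0
    for _ in range(trials):
        for _ in range(n_robots):
            inclined_good = random.random() < 0.5
            if free_will:
                follows_instinct = random.random() < 0.5
                does_good = inclined_good if follows_instinct else not inclined_good
            else:
                does_good = inclined_good
            total_good += does_good
    return total_good / trials

print(expected_good(free_will=False))  # Scenario 2: about 50 of 100
print(expected_good(free_will=True))   # Scenario 3: still about 50 of 100

Under these assumptions, layering unbiased choice on top of balanced instincts leaves the expected split at 50-50, which is the point of Scenario 3.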

Walk your Faith (USA) | Joined 24 May 04 | Moves 158039 | 25 Jul 06

Originally posted by Pawnokeyhole
That's a point of view, clearly expressed.

So let me put it another way.

Scenario 1:

Suppose I (freely!) create 100 robots without free will. Suppose I also equip them with virtuous natural instincts that, in the absence of free will, deterministically impel them towards good.

Now, how many of these 100 can we expect, on balance, to do good ...[text shortened]... ad in Scenario 1.

If your answers to Scenario 2 and Scenario 3 differ, please explain why.
1. You would be responsible; the robots didn't have any choice.
2. You would be responsible; the robots didn't have any choice.
3. You are no longer concerning yourself with free will, but with free
moral agency, a different topic altogether. Free will doesn't have
any morals involved; the core value system within, the desire to
do good or bad, will cause the will to act one way or another.

Your number 3 is a different topic of discussion altogether.
Kelly

Krackpot Kibitzer (Right behind you...) | Joined 27 Apr 02 | Moves 16879 | 25 Jul 06

Originally posted by KellyJay
1. You would be responsible; the robots didn't have any choice.
2. You would be responsible; the robots didn't have any choice.
3. You are no longer concerning yourself with free will, but with free
moral agency, a different topic altogether. Free will doesn't have
any morals involved; the core value system within, the desire to
do good or bad, will cause the will ...[text shortened]... t one way or another.

Your number 3 is a different topic of discussion altogether.
Kelly
I disagree. Free will is about moral agency, because it is about agency generally, of which moral agency is a subset.

Walk your Faith (USA) | Joined 24 May 04 | Moves 158039 | 26 Jul 06

Originally posted by Pawnokeyhole
I disagree. Free will is about moral agency, because it is about agency generally, of which moral agency is a subset.
Free will is about making choices; moral agency is about the guiding
core values one uses to make those choices. They are
not the same thing: you can make choices that have nothing to do
with morals, but morals will guide all choices to some degree.
Kelly

Krackpot Kibitzer (Right behind you...) | Joined 27 Apr 02 | Moves 16879 | 26 Jul 06

Originally posted by KellyJay
Free will is about making choices; moral agency is about the guiding
core values one uses to make those choices. They are
not the same thing: you can make choices that have nothing to do
with morals, but morals will guide all choices to some degree.
Kelly
But free will is a precondition for moral agency. Hence, it is relevant to moral agency.

Walk your Faith (USA) | Joined 24 May 04 | Moves 158039 | 26 Jul 06

Originally posted by Pawnokeyhole
But free will is a precondition for moral agency. Hence, it is relevant to moral agency.
Why?
Kelly

h (Cosmos) | Joined 21 Jan 04 | Moves 11184 | 27 Jul 06

Originally posted by KellyJay
Why?
Kelly
Well, if there is an element of coercion, then the perpetrator has diminished responsibility for his actions.

If I hold a gun to your head and force you to rob a bank, then you should not be sentenced for robbing a bank.

Do you understand?

Walk your Faith (USA) | Joined 24 May 04 | Moves 158039 | 27 Jul 06

Originally posted by howardgee
Well, if there is an element of coercion, then the perpetrator has diminished responsibility for his actions.

If I hold a gun to your head and force you to rob a bank, then you should not be sentenced for robbing a bank.

Do you understand?
That applies to the discussion we were having how? Whether you have
the will or you are being forced does not address free moral
agency or a moral core. Free will is not the same as a moral code
or free moral agency. I don't see your example as even
addressing the issues we were talking about; please explain.
Kelly

h (Cosmos) | Joined 21 Jan 04 | Moves 11184 | 27 Jul 06

Originally posted by KellyJay
That applies to the discussion we were having how? Whether you have
the will or you are being forced does not address free moral
agency or a moral core. Free will is not the same as a moral code
or free moral agency. I don't see your example as even
addressing the issues we were talking about; please explain.
Kelly
(sigh) What do you understand by the term "free will"?

Cape Town | Joined 14 Apr 05 | Moves 52945 | 27 Jul 06

I am incapable of believing in God. The design of my mind (through no doing of my own) is such that it finds the concept untenable.
Does this mean that from a Christian's point of view I am not wrong to be an atheist, as it was not a free-will decision? Can I be punished for it? Is my maker responsible?

Krackpot Kibitzer (Right behind you...) | Joined 27 Apr 02 | Moves 16879 | 27 Jul 06

Originally posted by KellyJay
Why?
Kelly
I am not sure what your "why?" refers to. Is it to free will being a precondition for moral agency, or to free will being relevant to moral agency?

I accept that an angel irresistibly inclined towards good both does good and is good. If one judges goodness in terms of dispositions and effects, then one could justly call such an angel and her actions morally good.

Nonetheless, the angel would not be responsible for her actions, being deterministically inclined towards the good (at least, she would not be on an incompatibilist interpretation of the meaning of free will and determinism). Some people, including me, choose to reserve the term "moral" for free actions that are intentionally good and/or that have good effects. I don't object to people using the term otherwise, as long as they do so knowingly.

But, anyway, this point is neither here nor there.

The point of the hypothetical scenarios I devised is that, whether or not the robots have free will, if half of them commit evil acts, then their creator must share at least some of the responsibility for creating them, because the creator could have made robots that would not behave in this way. Whether some of the robots do evil willingly or randomly doesn't affect the fact that ultimately some evil gets done, by hook or by crook, evil that could have been avoided if an alternative design had been followed.