Old 05-14-2007, 06:37 PM   #19
Spider AL
I'm back.

Originally posted by tk102:

Oh, probably not... but the syntax is concise, with the goal of eliminating all ambiguity in the terms. When you say m(x) and I say m(x), I know we're talking about exactly the same thing, for example.
Removing ambiguity is a laudable goal, but since your notation merely substitutes symbols directly for longhand words and concepts, it's not going to be any more or less ambiguous than the original terms were. We could certainly quibble over the meaning of "m(x)" to precisely the same extent as we could quibble over a longhand term like "the morality of a given action".

Concision is also a laudable goal, but I'm afraid that since every public use of a "morality equation" will have to be accompanied by a longhand key, it'll be slightly less concise than just using the longhand.

Don't get me wrong, I still think that it's an interesting exercise that might yield new insights from manipulation of the variables... but I rather think that's all it's useful for.

Originally posted by tk102:

I understand what you mean by loss. I tried to cheat it into the D(x) factor by assuming death to be the maximum distress for an individual:

I believe that was the only negative impact you cited that didn't fall under D(x). "Violations of established rights", I believe, would qualify as psychological distress. I'm open to suggestions, though, on how better to define m(x) in terms of physical distress, psychological distress, and death. Maybe each of those gets its own variable instead of being lumped under D(x).
Absolutely they need their own values. Conflating them oversimplifies the equation: if the values are joined at the hip, there's no way to properly delineate between, say, a painless murder at one end of the scale and torture without physical injury at the other.
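For what it's worth, here's one way that separation could be written out. To be clear, the component symbols (p, s, δ and the death weight L) are my own inventions for illustration, not part of tk102's notation:

```latex
% Hypothetical decomposition of D(x) -- requires amsmath.
% p_{xy} = physical distress, s_{xy} = psychological distress,
% \delta_{xy} = 1 if x kills y (else 0), L = a separate death weight.
D(x) := \sum_{y \in P} \left( p_{xy} + s_{xy} + \delta_{xy}\, L \right)
```

With the terms split like this, a painless murder registers only on the δL term and injury-free torture only on the s term, so the two cases stay distinguishable.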

Originally posted by tk102:

I'm glad you took issue with Ky. It bothered me too; it seemed quite anthropocentric. At the same time, I am surprised you wouldn't kill a mosquito. If I saw one biting my son, I wouldn't hesitate, even without the risk of disease.
Why would you kill it? Merely out of parental instinct? If so, I'm sure you'll agree that such protective instincts, while human and understandable, are not a means by which we can determine a moral course of action.

Originally posted by tk102:

But getting back to it: you are suggesting that, in order to remain objective, we simplify the equation

D(x) := Σ dxy Ky : (y ∈ P)

to

D(x) := Σ dxy : (y ∈ P)

That is quite a conservative view, considering it places the suffering of a mosquito on a par with the suffering of a human. It's not unheard of, though -- Jainism follows this belief precisely. Plus it does eliminate the rather troublesome Ky value.
Which once again highlights this point: in order to be optimally moral, we must adopt the most conservative view. By this I mean that the optimally moral individual would follow a course of action that eliminated even the RISK of behaving immorally.

Thus, while I am not a follower of Jainism, I would have a hard time arguing against the assertion that they're being optimally moral in their treatment of other creatures.
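The practical difference between the two forms of D(x) -- the one weighted by Ky and the conservative one without it -- can be sketched numerically. All the names and numbers below are illustrative assumptions of mine, not values from tk102's system:

```python
# Sketch: d_xy is the distress action x causes party y, and K_y is a
# hypothetical per-species "complexity" weight. Values are made up.
parties = {
    # y           (d_xy, K_y)
    "human":      (2.0,  1.0),
    "mosquito":   (2.0,  0.001),
}

# Weighted form:      D(x) := Σ dxy Ky : (y ∈ P)
D_weighted = sum(d * k for d, k in parties.values())

# Conservative form:  D(x) := Σ dxy : (y ∈ P)
D_unweighted = sum(d for d, _ in parties.values())

print(D_weighted)    # 2.002 -- the mosquito's suffering nearly vanishes
print(D_unweighted)  # 4.0   -- equal distress counts equally
```

The weighted form lets a small Ky discount another creature's distress almost to zero, which is exactly the anthropocentrism at issue; the conservative form removes that lever entirely.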

Originally posted by tk102:

But why does Ky trouble us? As you suggested with your alien scenario, what appears moral to an advanced alien may appear cruel and immoral to us. If we assume the universe contains a fixed number of species, it is possible there exists a scale of complexity that could be objectively defined. By inference, then, the seemingly subjective Ky would in fact be an objective value. So in that case, perhaps it is true that Khuman << Kalien. Oh yeah, that's why it troubles us.
Mmm, bit of a leap there, TK. I rather think the alien scenario shows that anthropocentric views are immoral because we know that we suffer and die, and that suffering and death are negative experiences for us. So even if the hypothetical aliens are more complex than us in both body and mind, that doesn't make their version of suffering more "valuable" than ours, and it doesn't make our lives less "important". After all, empathy is about putting yourself in the shoes of others, not weighing them against your self-image.

So I think the question of whether there's an objectively definable scale of complexity on which all higher animal life can fit... is an utter irrelevance to the question of morality and moral treatment of other creatures.

Originally posted by tk102:

I still haven't figured out how to represent the categorical imperative in this set of equations. Anyone have any suggestions?
The categorical imperative is essentially a plea for moral universality. Universality is already implied (assumed, actually) in the equation. I don't think you need to fit the CI in anywhere to properly represent the values you're trying to describe. It would take a separate (and prohibitively complex) equation to deal with the question "why should a person be moral?"
