Harm theory
Let's say (as has been proposed) that there are two broad classes of harms:
(1) qualities of experience - e.g. pain; and
(2) qualities of a life - e.g. global preference satisfaction for beings with a personal identity.
Death is a harm of the second type (if we subtract out any pain that comes with it) and so death does not harm beings without personal identity* or global preferences.
Does it make sense? Maybe - let's see what sort of hypotheticals we can build from it.
Let's say we have one person with intense and complex global preferences: he compulsively wants every object to be in a certain place and has devised a very complex strategy to put it there. Nothing this person does is 'in the moment'. Now, the average person is nowhere near this intense in their global preferences - say he is 50 times more so inclined. Is it reasonable to say his life is worth that of 50 normal people?
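To make the arithmetic behind that question explicit (a sketch of my own, assuming the theory scores the harm of a death linearly in the intensity of the victim's global preferences - nothing above strictly commits it to this):

\[ H_{\text{death}}(x) = k \cdot I(x) \]

where I(x) is the intensity of x's global preferences and k is a constant. With the average person at I = 1 and our organiser at I = 50, his death counts as 50 times the harm of an average death - which is exactly the 50-for-1 trade the hypothetical is probing.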
What if there were a planet of people who live in the moment and have no particular global preferences - would it be OK to kill them all?
What if a group of people don't follow their global preferences - so that, on this scale, their deaths would count as little harm - and suppose there is some prima facie reason to kill them (e.g. someone else's preference)? Should we threaten people with death in order to drive them to obey their global preferences?
I think the theory sounds much less plausible once we imagine we are not the beings with the strongest global preferences.
* I'm not sure personal identity is well defined (in the relevant sense here), but we can overlook that.