Utilitarianism and similar moral theories often tell us to evaluate an action based on its expected consequences. Usually, this is assumed to be equivalent to the mathematical expectation of some function or other. Isn't this quite a specific probabilistic assumption to be making about the consequences of an action? What would utilitarians do if they had to make a choice over actions whose consequences depended on a random variable with no well-defined probability measure?
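For concreteness, the "mathematical expectation" the question refers to is standardly defined as an integral against a probability measure, which is exactly where the assumption bites. A sketch in standard measure-theoretic notation (the utility function $u$ and measure $P$ are illustrative labels, not named in the question):

```latex
% Expected utility of action a, whose outcome is the random variable X_a
% on a probability space (\Omega, \mathcal{F}, P):
\mathbb{E}[u(X_a)] \;=\; \int_\Omega u\bigl(X_a(\omega)\bigr)\, \mathrm{d}P(\omega)
% The integral is well-defined only when u \circ X_a is P-measurable;
% if X_a is not measurable, the expectation, and hence the expected-utility
% ranking of actions, is simply undefined.
```

So the question can be read as asking what the utilitarian's decision rule should be when this integral fails to exist.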
