Serenity: Joss Whedon's Diatribe Against Consequentialism

While I'm on the topic of Kant, this seems like as good a time as any to give my long overdue review of Serenity.

Enough has been said on this blog in praise of Firefly, and I don't really have anything further to add about that. Serenity is a good movie. Go see it. There, finished.

There's a line in the self-referential Stephen King movie, Secret Window, that nicely describes what's wrong with Serenity.

You know, the only thing that matters is the ending. It's the most important part of the story, the ending. And this one is very good. This one's perfect.

Well, that's overstating it. The ending to a story doesn't have to be perfect. And Secret Window's ending certainly wasn't. But the ending does have to be decent. Believable. That's the only thing that matters. The ending is fragile. You can screw up anywhere else in the story, but if you screw up the ending, that's what everyone remembers, and it leaves the audience with a sense of something lacking, a bad taste in our collective mouths - a bad taste especially pronounced if the rest of the story was really good. The worst thing you can do as a writer is screw up the ending of an otherwise decent movie and, like a frustrating case of blue balls, leave the audience wanting...something, anything, different from what you've given them.

And so it is with Serenity. The television series was not known for cheerful cheesiness or happy endings or sudden changes of heart. And the movie itself was willing to kill off two major characters. So why did Whedon think it would be a good idea to have The Operative realize the error of his ways at the last possible instant, just in time to help save the day?

A brief synopsis: the Alliance government sends an operative to capture and kill River Tam, who can read minds, because she may have learned government secrets while in the presence of key state officials. What secrets? The secret that, a decade earlier, Alliance scientists experimentally released into the atmosphere of the planet Miranda a chemical intended to eliminate human aggression. The experiment failed: instead of simply eliminating human aggression, the chemical made the planet's entire populace completely docile, eradicating the will to live entirely, a la a Misesian version of Buddhism. Worse still, the chemical had the opposite effect on a smaller segment of the population, creating the bestial monsters known as Reavers.

The unnamed Operative is faithful to the Alliance cause of making the world a better place, and is willing to do anything - including murdering innocents - to achieve this goal. He accepts the fact that the world he works to create is a world that he, with his many sins, could never be a part of.

The Operative is the most interesting character in the movie. That is, until he learns of the failed government experiment and suddenly has a change of heart. Previously in the movie, he was willing to decimate entire villages and their innocent inhabitants in an effort to capture River, but now, all of a sudden - what? Did the numbers suddenly become too great? Was there a cut-off point where 30 murders was acceptable but 30,000 was not? I guess he decided the omelet he was making required too many broken eggs.

Perhaps he realized that his efforts to capture River were only for the purposes of a coverup, and therefore felt that he was not directly advancing his goal of making the world a better place. But surely, if the government is to ever try another social engineering experiment on such a large scale - and it should, for the utilitarian calculus has not changed in the slightest - it must have the trust and support of the public, which will be difficult to achieve if word gets out of the previous failure. So again, why the change of heart?

Libertarians were expecting a libertarian-themed movie, and Whedon delivered. Even the most hardened of statist hearts must crumble at lines like these:

People don't like to be meddled with. We tell them what to do, what to think, don't run, don't walk. We're in their homes and in their heads and we haven't the right. We're meddlesome.

But it's important to recognize that the major libertarian theme here is deontological, not consequentialist. Our lives are our own. Leave us alone. Don't meddle. A consequentialist libertarian should have difficulty objecting to what actually took place with the failed social experiment. The failure was not systemic in nature, a result of bad economics, misdirected incentives, the impossibility of planning. No, the problem was merely scientific. With better testing and research, perhaps the next batch of Soma will do the trick. Can we fairly rule out the possibility of heaven on earth because of a single engineering mistake?


I'm not sure the problem was

I'm not sure the problem was entirely scientific, or rather, certainly not entirely an engineering problem. The problem was that they had no idea what the "Pax" would do on a wide scale, and apparently hadn't tested it enough on individuals to know that it would have the will-sapping effects (or the extreme violent antisocial reaction) - which indicates a depraved indifference to life on the part of the engineers. It's not a function of "getting the dose/formula right" but of a whole host of other issues.

For example, it may well be that there is some sort of insuperable knowledge problem that would prevent such a chemical solution to societal aggression from ever working. This is wholly unrelated to an engineering problem or to the particular scientific one. Additionally, the meta issue of the demonstrated depraved indifference to individual or group life displayed by the social engineers is itself a threat to societal peace. It's not at all convincing that objecting to this meta problem is non-consequentialist or anti-consequentialist, since the opposition is not "just because / A is A" but that the consequences of the meta rule are bad and thus that the meta rule should not be applied or followed.

The Operative is the most

The Operative is the most interesting character in the movie. That is, until he learns of the failed government experiment and suddenly has a change of heart.

It's been a while, and I may have forgotten some things, but I don't think he had a change of heart. I think he knew about the experiment all along, but just realized that once the recording had been broadcast, there was nothing left to cover up. He was still very much on the side of the Alliance at the end of the movie.

But are the consequences of

But are the consequences of the meta rule bad? The meta rule seems to be this: it is good to make the world a better place through the use of science/technology.

Aggression is a bad thing. Making people less aggressive, either through cultural/environmental influences or genetic influences is a good thing. Thus, were we to invent a chemical solution to aggression, it would be wise to release this upon the population at large.

The problem in this specific instance seems to have been poor product testing. But that is a risk with any product. Those who promote the precautionary principle would have us place an unreasonably high burden of proof on the inventors of genetically modified foods, new pharmaceutical drugs, and other similar sorts of products. Until these substances can be proven safe, they should be avoided. But they can never be proven safe. There is always a risk that some late-acting devastating effect will occur and cause cancer or something. If we allow small possible risks to counter any conceivable progressive action, we have the ultimate form of conservative stagnation. I don't see a huge difference between these sorts of real-life issues (fluoride in the drinking water causes a big stink among some weird people) and the substance presented in the movie.

Perhaps they should have done more testing. But at what point do we conclude that a product has been thoroughly tested and is considered safe?

One objection would be that

One objection would be that the people having the drug forced upon them had no choice. Some of them may have wanted to be medicated; most would have been suspicious and waited to see the effect upon others.

As such, the consequence of forcing a drug upon a whole population is going to be contrary to the goals of the individuals involved. Even if the drug worked perfectly, it would have been tantamount to enslavement.

Micha, The meta rule is the

Micha,

The meta rule is the "kto, kogo" aspect of what took place on Miranda. The formulation "it is good to make the world a better place through the use of science/technology" completely sidesteps this problem, as it is essentially irrelevant to what went on. Your reformulation is also fairly vague about the how, to say nothing of the who/whom distinction and whether or not one should care about individual autonomy when adding up the costs and benefits of any proposed active societal change.

I disagree with this as well:

Aggression is a bad thing. Making people less aggressive, either through cultural/environmental influences or genetic influences is a good thing. Thus, were we to invent a chemical solution to aggression, it would be wise to release this upon the population at large.

1. "Aggression is a bad thing" is critically dependent on the functional definition of aggression. Mindless aggression is of course maladaptive and antisocial, but aggression against active & continuing threats can be beneficial (as well as lower the long term violence in the system).

2. It continues to ignore considerations of personal autonomy. Societies are at the very least collections of individuals, and I would say further that society is a byproduct of individual needs, aspirations, and instantiation, not vice versa. Thus a theoretical system that is not based on and grounded in that fact is incoherent: societies must by definition have and be composed of individuals, but individuals need not have society. Thus posing a question that ignores the individual and individual desires in the course of saying "this is good for society" is incomplete at best, incoherent or irrelevant at worst.

3. Regardless of 1 & 2, the statement assumes that aggression is an independent, isolatable variable that can be ratcheted down without side effect or further consequence. However, if aggression is a spandrel (i.e., a necessary consequence of a confluence of other factors that cannot be eliminated absent the elimination of the confluence), then targeting "aggression" per se is folly; any such modification scheme would have to take as a package deal the elimination of the confluence, which may be undesirable (as in the revealed case of Miranda: eliminating aggression in most people eliminated the desire to live, while in a small percentage it had the perverse opposite effect. Nothing in the text suggests this effect was a simple testing/dosage problem rather than a response inherent to the targeted system).

Ghertner: Aggression is a

Ghertner: Aggression is a bad thing. Making people less aggressive, either through cultural/environmental influences or genetic influences is a good thing. Thus, were we to invent a chemical solution to aggression, it would be wise to release this upon the population at large.

That's an interesting plan you've got there for curtailing aggression, but it has one minor flaw. Specifically, forcing drugs on innocent people against their will is a form of aggression.

Oops.

Ghertner: Perhaps they should have done more testing. But at what point do we conclude that a product has been thoroughly tested and is considered safe?

When someone freely decides that it's worth the risk and chooses to take it.

One of the chief benefits of being a libertarian is that it makes a lot of things easier. Who needs a central plan for safety testing when people are free to make their own individual decisions about risks and rewards? Or, well, their own minds?

As usual, Rad Geek finds the

As usual, Rad Geek finds the nub of the argument whilst I run all around it. :)

Hey folks, I said "a

Hey folks, I said "a consequentialist libertarian should have difficulty objecting" to this turn of events, not that such a libertarian couldn't succeed. As you've demonstrated, one certainly can. :razz:

But let me play devil's advocate for a bit longer. I'll even go a bit further and try to justify this act not just on consequentialist grounds, but on natural-rights/non-aggression grounds as well.

One objection would be that the people having the drug forced upon them had no choice.

This is true. However, assuming the drug had worked as advertised and turned the entire population into peace-loving non-aggressors, on what grounds could people object that they had had their rights violated? The only legitimate use of force is in self-defense, right? But in a universe of non-aggressors, there would be no such thing as self-defense, because there would be no such thing as offense. So the drug would be depriving people of a right they could not legitimately exercise in the first place! And a right that cannot be legitimately exercised is no right at all. Cute, eh? Sort of like preemptive self-defense, but better.

And, of course, it's even easier to defend on consequentialist grounds. After all, a consequentialist need not concede that aggression is always wrong; if aggression leads to better consequences, it is to be preferred. So there's no logical inconsistency in aggressively forcing a drug on people in order to completely eliminate aggression in the world - that's just a good consequence worth the costs. Remember the line in the movie where the Operative says that he (my paraphrase) "accepts the fact that the world he works to create is a world that he, with his many sins, could never be a part of."

Incidentally, the unintended consequences of eliminating aspects of base human nature are more interestingly demonstrated in a tale of Jewish lore I learned many years ago.

The Talmud (Yoma 69b) relates the story of how, after the return from exile in Babylonia, Ezra and the other leaders of the day, fearful of another national disaster, prayed that G-d should erase the “yetzer hara,” the evil inclination, from the hearts of Israel. After three days of fasting, their prayers were answered; the “Evil Inclination” charged out of the Holy of Holies like the fiercest of lions. They tried unsuccessfully to seize the beast, which cried so loudly that it could be heard a thousand miles away. Finally the prophet Zechariah advised them to put it in a lead pot, but not destroy it lest the entire world be destroyed along with it. For three days they held it captive, but for three days all chickens stopped laying eggs, for sexual desire had disappeared from the world. Not a single egg could be found for the sick in Eretz Yisrael. In the end they were forced to lift the lid of the pot and set it free, but not before blinding it and thereby robbing it of some of its power.

This is certainly a vivid and fascinating tale, one that can serve as a proof-text for many philosophical and psychological insights. The point I wish to emphasize here, however, is that even what we regard as our most evil or forbidden impulses have, according to Jewish tradition, an important place in our world. If a more explicit reference is required, the Midrash (Genesis Rabbah 9:7) tells us that without the so-called “evil inclination” a man would never marry, beget children, build homes or engage in commerce.

Adam Smith would be proud.

I do agree that the sudden

I do agree that the sudden "road to Damascus" moment felt forced coming from the Operative, who showed no moral qualms about anything in the service of bringing about the perfect world.

He knew they had secrets, and that people wouldn't understand the Alliance's rightness, and he knew that he had to do awful, terrible things in the course of birthing the Better World, so I agree that it doesn't make much sense for him to be disillusioned by a misstep along the way. Just this sort of thing was what he trumpeted to Mal beforehand. Birth pains, etc., breaking eggs to make omelets, etc.

It certainly seems like the wholesale nature of the death tripped his "grain of sand/heap of sand" barrier, but I agree that it doesn't seem easily understandable given the text.

And from personal

And from personal experience, people don't make life-altering ideological changes-of-mind-and-heart in a split second. They may have an eye-opening experience, but it usually takes a few sessions of deep pondering and soul searching before a change like that takes hold.

Then again, it's a movie, and lengthy conversion sequences don't sell tickets. But a flaw like that is uncharacteristic of Whedon, considering the tight storytelling of the TV show.

And from personal

And from personal experience, people don’t make life-altering ideological changes-of-mind-and-heart in a split second.

Nor do they in my experience. But they do in powerful and evocative works of fiction all the time.

Ghertner: This is true.

Ghertner: This is true. However, assuming the drug had worked as advertised and turned the entire population into peace-loving non-aggressors, on what grounds could people object that they had had their rights violated?

If you're talking about the effects of Pax in the movie, it wasn't advertised as something that would just stop people from committing rights-violations against each other. It was supposed to have pretty radical effects on people's personalities and dispositions. (And in fact it did; just not the effects that the central planners expected.) But people have a right to have any personalities and dispositions that they want, and coercively controlling the minds of a whole population through drugs involves a massive and systematic regime of aggression against lots and lots of innocent people.

If you're talking about some other hypothetical drug that somehow stopped people from ever violating anyone else's rights, and had no other effects at all, then you might have some case for claiming that it wouldn't be aggressive, in and of itself, to make people take it. Fine, but on the other hand, most of the people you force it on wouldn't ever violate anyone's rights in any serious way, so there is a question of proportionality. If the amount of illegitimate force being defended against through forcibly administering the drug is at or near 0, then the amount of force that you could legitimately use in forcing the person to take it is also at or near 0. Meaning that you effectively have no right to force most people to take it anyway.

Ghertner: This is true.

Ghertner: This is true. However, assuming the drug had worked as advertised and turned the entire population into peace-loving non-aggressors, on what grounds could people object that they had had their rights violated?

...colonies of moralists will be needing those missiles after all.

The failure was not systemic

The failure was not systemic in nature, a result of bad economics, misdirected incentives, the impossibility of planning. No, the problem was merely scientific. With better testing and research, perhaps the next batch of Soma will do the trick. Can we fairly rule out the possibility of heaven on earth because of a single engineering mistake?

No, I disagree. The failure was in fact intrinsic to the success: there was no separating the two through further testing and research, because they were one and the same. Remember, this is not real life; this is in the context of the movie. So even if you think Whedon is wrong on this point - that the aggressive impulse really is separable - that's not how it's treated in the movie. From the movie: "well, it works". (That's the woman in the hologram talking about the Pax.)

In fact the argument in the movie is precisely the same as the argument in the Jewish lore:

The point I wish to emphasize here, however, is that even what we regard as our most evil or forbidden impulses have, according to Jewish tradition, an important place in our world. If a more explicit reference is required, the Midrash (Genesis Rabbah 9:7) tells us that without the so-called “evil inclination” a man would never marry, beget children, build homes or engage in commerce.

That is precisely what Joss Whedon was arguing in the movie.

The Operative is the most interesting character in the movie. That is, until he learns of the failed government experiment and suddenly has a change of heart. Previously in the movie, he was willing to decimate entire villages and their innocent inhabitants in an effort to capture River, but now, all of a sudden - what?

All of a sudden, he realized that his dream was impossible, unachievable. Or rather, it was "achievable", but only at the expense of wiping out humanity entirely. Again, the woman in the hologram did not merely say that the Pax had certain bad side effects. She said it worked. And then she described the consequences of it working.

I didn't see the movie so

I didn't see the movie, so pardon me if I get anything movie-related wrong.
Point one: I see no reason why the Operative couldn't live in the "better world"; he need only take the drug.

My second point(s) is directed at Micha. I too was taken aback by the statement

Aggression is a bad thing. Making people less aggressive, either through cultural/environmental influences or genetic influences is a good thing. Thus, were we to invent a chemical solution to aggression, it would be wise to release this upon the population at large.

I think this is completely false, and Doss and Rad Geek did a good job pointing this out. If you meant "libertarian aggression," then as Rad Geek pointed out, this would be an instance of such aggression. If you meant just aggressive behavior in general, then there are many ways to trespass against people that are not aggressive but require an aggressive response. How exactly are you going to get the gumption up to throw your freeloading brother out of the house if you lack all aggressive tendencies? What about those tenants who decide to just passively stop paying the rent?

In a later post you clarified a little and postulated a drug that would make it such that the person would commit no libertarian aggression, and then, to counter Rad Geek's assertion, you made the claim that you were not really aggressing in administering the drug, since it would not change any individual's behavior in any way they could complain about.

I have a few objections to this. You can take my objections as either deontological or consequentialist, as pleases you. In my ethics the deontological nature of morality unfolds in a meta-consequentialist way. It's both and neither.

My objections:

1) There is no way to be sure of any drug's effect on me without first testing it on me. Each human is a unique combination of genes and experiences. Although it may be harmless to 99.999999% of the population, I may be the one person it kills. This may be a result of my genetic makeup or merely the circumstance of my unique medical history. Thus using the drug puts me at risk. Putting me at risk without my knowledge and consent is an aggressive act.

2) If the drug actually works on me, then you are in essence removing my right to [or, in consequentialist terms, "ability of"] self-defense. Again, due to fallibility, the drug may not work on every person, and you would be setting everybody else up as sitting ducks for those it didn't work on. By the rules of your own scenario you must have used the drug on yourself, thereby rendering yourself incapable of protecting me should you be wrong. Not only are you making me unwhole, but in the very process of your plan you are rendering yourself incapable of making me whole again, either directly or by proxy.

3) Forcing such a drug on the population at large would set up strong selective pressures for aggressive behavior. Even supposing that the drug worked on every genotype present in the current population, there is no such assurance for future generations. It is always possible that a mutation may arise that renders an individual immune to the drug. Such an individual would be much more selectively fit than his peers and would soon out-reproduce the competition. Such a scenario could be quite ugly.

4) I like my aggressiveness. I used to like playing aggressive games and sports, for instance. What makes your valuations superior to mine?

5) What about other societies? Are you going to drug them too? If not, big problems. What order are you going to do this in?

6) There are issues of secrecy. Suppose you only get half the population done before word gets out? How can you guarantee you can get the holdouts? Many would hold out just on the notion that no one else would, and they could reap the rewards. Suppose the neighboring society gets wind of your actions halfway through the program, as you are drugging their citizens. They are not going to be pleased, and they have every reason to consider it a weapon of aggression. After all, you could just stop when every foreigner has been drugged and then take over the world without a struggle.

In summary, all these objections are as to the consequences, and all are related to both risk and fallibility. I consider putting me at unreasonable risk against my wishes to be a form of trespass, or aggression if you like to push those concepts a little broadly. I think I would be more comfortable with libertarian ideology if it would just add a few more restrictions to the list instead of forcing everything under the "aggression" rubric.
