
December 24, 2004


No LEM, No LNC, No Buddha. So much for Hofstadter.

Over at Pandagon, in a boring "I hate ID" thread, a commenter discusses string theory, musing about how neat it would be to string all the particles together like Christmas lights. So, so elegant, ya know.

I am also reading Holbo's dissertation on Wittgenstein; today it's Russell saying late Witty was rotten because Witty had no basic propositions, and Witty saying Russell was evil because Russell wanted basic propositions.

Lawyers thinking they make law are like physicists thinking they make the world. A system, to be useful, must be simple enough to be manageable, complicated enough to be comprehensive, and ambiguous enough to be human (ethical).

The answer to your problem is to put the dude in front of a judge and/or jury. Make sure you leave enough ambiguity to make that possible.


I'm with you on the DJ question, both in ignorance and inclination. This is going to pester me all day, but such a thing is just crazy enough to have happened, and so I'll have to look it up. I'll report back . . .

Bob, of course a judge gets to hear the DJ issue as soon as the man is charged, unless he's got one of those underpaid underperforming public defenders we hear so much about.

On a cheerier note, last night I got an email from an ACLU lawyer I'm working with on a case. (Against the CIA). Closing text: "Merry Christmas."

I wish the same to all within the "sound" of my "voice."

The only thing unique about the problem is that presumably you can only murder someone once; so, logically, how can there ever be two separate offenses?

The law does not rely much on careful logical exposition of its rules; when faced with conundrums, it relies far more on the policy underlying those rules. This is usually a good thing, since precise rules frequently suffer from logical fallacies in unusual circumstances. It can also create trouble, since it provides an out from being logical about a problem and sticking to the rules (even if you are not happy with the result) -- hence the line that hard cases make bad law.

The policy underlying double jeopardy is to prevent unfair retrials, an obvious overreach of state power. Here, the unfairness was the initial wrongful conviction for murder, and no policy supports the conclusion that an earlier unfair conviction permits one to then actually commit the crime "again" without consequence.

I do not know of any actual cases on this issue, but I suspect that courts would have no difficulty in permitting the second trial for the actual murder.

Again, the only thing unique is how can there be a "second murder" for purposes of a "separate offense." But the problem would not get analyzed from that point of view. And the simple answer is that there was never a first murder -- just a wrongful conviction.

Well, I am no lawyer, nor a philosopher. But it seems to me so much here would depend on the facts of the case, and on an assessment of the characters of the principals. Justice might be a guilty verdict, with a sentence of time already served. Maybe he was an abusive husband; maybe she emptied the bank accounts before cutting off her finger. But my instincts say a jury could determine justice in a few hours, if the facts could get to them, or if instructions didn't limit their options (I hate guidelines, and am no fan of tort reform).

Sebastian, who often argues against an activist SCOTUS, could perhaps extend that into a legal philosophy. We maybe should not be trying to make law so complete and precise that juries become unnecessary.

Thought experiments are a lot of fun, but this particular one raises a few real-world questions:

Why did the police assume the wife was killed in the first place, rather than that she had simply run away? Had she told friends/relatives ahead of time that she feared for her life? Was there a history of abuse?

Someone planning to fraudulently disappear, especially someone who isn't already used to criminal behavior, tends to leave traces. They hide assets. They have plastic surgery -- which leaves either a credit card trail or a sudden disappearance of a lot of cash. They get PO boxes to divert mail. They get fake ID -- again, either a credit card trail (believe me, people are dumb enough to do that) or disappearing cash.

The finger found in the fridge would have been analyzed, and analysis would have shown it was cut off while the person was alive (it would also show whether or not she was drugged). Was the husband supposed to have dismembered her before she was dead? If he did so in the house, why weren't there blood traces elsewhere? And where's the rest of the body? If he took her away somewhere and then cut her up, how did one finger wind up in the fridge?

In detective fiction, when someone vanishes, police "always" first suspect the spouse. This may be true in RL as well. But unless the police are very lazy, very corrupt, or have it in for the guy, they don't just stop at the most obvious suspect and ignore everything else. They do look at "motive, method, opportunity."

So I'd say this thought experiment fails in its premise: the initial conviction. Assuming the police were competent, and that the guy could afford competent counsel, it is by no means a given that the guy would've been convicted in the first place.

As I said, I suspect it is an urban legend. But I simplified because I assumed most people had heard it before. The version I heard involved years of previous abuse, a freaked-out phone call to mom before the woman disappeared, and other such circumstantial evidence. Furthermore, I suspect the legend is old enough that the initial conviction predates some of the more modern CSI techniques. I don't think it is ridiculous that someone could have been convicted twenty years ago on lots of circumstantial evidence, with the only piece of physical evidence being a hacked-off finger.

Before reading about it here, I had never heard of this legend. However, the facts of the case produced interesting commentary from your respondents. The law is often so deliberately ambiguous that a guilty verdict, a not-guilty verdict, a hung jury, etc. can all be produced at the trial level. The ambiguity often shows up in jury instructions. That's why we have appellate courts and supreme courts. But, as we know from the many convicts who have been released in recent years for crimes they did not commit, the system does not always work as it should. Unfortunately, there will always be wrongful convictions as long as we have politically ambitious prosecutors who can't admit mistakes, and incompetent defense lawyers incapable of catching the mistakes at the trial level.

Interesting stuff about ambiguity. I had a friend in college whose senior thesis took a long detour into the value of legal ambiguity. Given that she was a smarty-pants, I assume that there's been some cool stuff written about it by Important Theorists. Little help?

You and others interested might want to read Lawrence Solan's book, _The Language of Judges_. I haven't read the book, but the presentations and articles of his that I've seen are very interesting. There's an article about how linguistic perceptions of causation and enablement compare with the legal framework.

I think (this time) that one of the problems in analyzing law is that it is highly self-referential, but we don't have a good method of analyzing it without resorting to further self-referential statements.

That is, in part, because there really isn't any good method of analyzing self-referential sentences. The keystone of Gödel's First Incompleteness Theorem is the Diagonalization Lemma, which (translated into lay terms) says that, for any predicate B you can "reasonably" define, there is a sentence G that asserts that B holds of G itself. Once you allow a logic/language strong enough to get the Diagonalization Lemma, you're potentially running into all kinds of logical trouble.
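For anyone who wants the lemma in symbols rather than lay terms, here is the standard textbook statement (the notation, including the corner quotes for Gödel numbering, is my gloss, not part of the original comment):

```latex
% Diagonalization (Fixed-Point) Lemma: for any formula B(x) with one
% free variable in the language of arithmetic, there is a sentence G
% such that the theory T proves G equivalent to B applied to G's own
% code number (Gödel number), written with corner quotes:
\text{For every formula } B(x), \text{ there is a sentence } G
\text{ with } \quad T \vdash G \leftrightarrow B(\ulcorner G \urcorner).
```

So G "says of itself" that it has property B; taking B to be "is not provable" yields Gödel's famous sentence.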

Ultimately, the problem is essentially a variant of Russell's Paradox, or the Cretan Paradox, or whatever you want to call it: "This sentence is false." Once you have a language that can express something like this -- and, I should note, the fact that you can't express this particular notion in standard mathematical logic is the (short) proof of Tarski's theorem on the Undefinability of Truth -- you're pretty much screwed.

You have three options. You can prevent certain kinds of statements from being grammatical (Russell-style type theory, or Quine-style stratification, being the most common). You can severely restrict the power of your language (in formal mathematics this takes the form of not being able to do both addition and multiplication; in philosophy, I'd guess you'd have to remove the notion of a predicate for "truth"). Or you can live with the fact that you can't express certain things in the object language that you can in the metalanguage (e.g., "M1 makes a sentence true iff M2 makes a sentence true"). Them's the breaks.
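The "short proof" alluded to above can be sketched in a few lines, assuming (as is standard) a consistent theory T strong enough for diagonalization:

```latex
% Suppose, for contradiction, that a formula Tr(x) defined truth,
% i.e. for every sentence S:
%   T \vdash S \leftrightarrow Tr(\ulcorner S \urcorner).
% Apply the Diagonalization Lemma to the predicate \neg Tr(x),
% obtaining a "liar" sentence L with:
%   T \vdash L \leftrightarrow \neg Tr(\ulcorner L \urcorner).
% Instantiating the truth schema at S = L and chaining the two gives:
%   T \vdash L \leftrightarrow \neg L,
% a contradiction. Hence no such Tr(x) exists: arithmetical truth
% is not definable within the language of arithmetic itself.
```

In other words, the formal liar sentence L does exactly the damage "This sentence is false" does in English, which is why truth has to live in the metalanguage.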

And a damn good thing too, or I'd be unemployeder than I already am.
