Like many personal blogs of its era, this blog is moribund, a casualty of what we might call "the Facebook effect." However, as of late 2015, two things are clear: (1) The Indie Web is a thing, and (2) the re-decentralization of the web is a thing. So who knows? ~~2016~~ **2017** (!) could be the year this blog rises from its own ashes. Stay tuned!

Sunday, 14 March 2004

From a piece by Freeman Dyson in the latest *New York Review of Books* comes this:

> Littlewood was a famous mathematician who was teaching at Cambridge University when I was a student. Being a professional mathematician, he defined miracles precisely before stating his law about them. He defined a miracle as an event that has special significance when it occurs, but occurs with a probability of one in a million. This definition agrees with our common-sense understanding of the word “miracle.”
>
> Littlewood’s Law of Miracles states that in the course of any normal person’s life, miracles happen at a rate of roughly one per month. The proof of the law is simple. During the time that we are awake and actively engaged in living our lives, roughly for eight hours each day, we see and hear things happening at a rate of about one per second. So the total number of events that happen to us is about thirty thousand per day, or about a million per month. With few exceptions, these events are not miracles because they are insignificant. The chance of a miracle is about one per million events. Therefore we should expect about one miracle to happen, on the average, every month.

I like those odds - Homer J. style.

That explanation makes too many questionable inferences, the biggest being that _every_ "thing" that happens to us _every_ second should count. Littlewood's definition says a miracle has "special" significance when it occurs, which means that instead of counting any and every event, we should only count significant events towards the million. Boo.
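
For what it's worth, Dyson's arithmetic is easy to check. The sketch below is a rough illustration, not part of the original piece: the eight waking hours and one-event-per-second figures are Dyson's, while the 1% "significant events" fraction is a purely made-up number to show how the rate changes if, per the objection above, only significant events counted towards the million.

```python
# Back-of-the-envelope check of Littlewood's arithmetic.
SECONDS_PER_HOUR = 3600
WAKING_HOURS_PER_DAY = 8   # Dyson's assumption
EVENTS_PER_SECOND = 1      # Dyson's assumption: about one noticeable event per second
DAYS_PER_MONTH = 30

events_per_day = SECONDS_PER_HOUR * WAKING_HOURS_PER_DAY * EVENTS_PER_SECOND
events_per_month = events_per_day * DAYS_PER_MONTH
print(events_per_day)    # 28800  -> "about thirty thousand per day"
print(events_per_month)  # 864000 -> "about a million per month"

# One event in a million qualifies as a miracle, so the expected rate is:
print(events_per_month / 1_000_000)  # ~0.86 miracles per month, i.e. about one

# If only "significant" events counted -- say 1% of them, a made-up figure --
# the expected rate drops to roughly one miracle per decade:
significant_fraction = 0.01
print(events_per_month * significant_fraction / 1_000_000)  # ~0.0086 per month
```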

**Your thoughts?**

© 2016 Matthew Newton, published under a Creative Commons License.