Posts Tagged ‘math’

“I tried to start a business, but it failed. It must be Obama’s fault.”

People make this sort of argument all the time. They use their own experience (i.e. one data point) to make some generalization about POTUS or Congress or the governor or whatever. But such arguments are, most of the time, complete bullshit.

I’ll go out on a limb: if your business failed, it’s probably your fault.

The fact of the matter is, it’s hard to succeed in business. After about six years, over 50% of new businesses have failed. [See here for my hard numbers, most of which come from the U.S. Bureau of Labor Statistics.] How do you know, if your business failed, that it was x, y, or z’s fault and not your own? Would your business really have survived, even in the perfect utopia of your imagining?

First of all, the composition of the “current administration” (and by that I mean POTUS and Congress and the governor and your local city council and so on) doesn’t matter jack squat when it comes to business survival rates. Check out this graph:


There’s no noticeable difference between Clinton, Bush, and Obama here. 20% to 25% of all new businesses fail in their first year, no matter who’s in charge. (This is a little hard to see from the graph directly, as the x-axis is shifted in a strange way.  I used the hard data here.)  After a decade, roughly two-thirds have failed. You can blame POTUS if you like, or alternatively, blame whoever controls Congress, but either way you’re slipping into the No True Scotsman fallacy: “Sure, 25% of businesses fail in their first year, but I am different… I would have succeeded had it not been for x, y, or z. I am not bad at business; I would have succeeded if only…” You get the idea. (By the way, and anecdotally, if I were playing the blame game, I’d be more likely to blame a governor, because in my experience state policies tend to affect businesses more than federal policies.)

Now, I’ll admit that certain laws (passed by Congress) or policies (enacted by POTUS) can hurt businesses in specific instances. If you start the Acme Widget corporation, and a huge Widget Tax is enacted, your business might fail, and you wouldn’t be wrong to blame Congress or POTUS. But I’m more interested in generalizations: given that your business failed, can we estimate whether it was your fault or someone else’s?

Here’s where Bayes’ Theorem comes to the rescue. First, let’s make some definitions. Let

P(good) = Probability that a random person is good at business;

P(suck) = Probability that a random person sucks at business;

P(fail|good) = Probability that your business fails in year 1, given that you’re good at business;

P(fail|suck) = Probability that your business fails in year 1, given that you suck at business;

P(suck|fail) = Probability that you suck at business, given that your business failed in year 1.

Bayes’ Theorem can help us calculate that final quantity, given assumptions about the previous four. Let’s start with a pretty arbitrary guess:

P(good) = 20%

P(suck) = 80%

What’s my justification for saying that 4/5 people suck at business? Well, the graph above seems to approach 20% asymptotically. Only 20% of businesses survive “for the long haul”. And 20% “feels” right: most people aren’t good at business, but there are still millions of people who are.

Now, let’s say 1000 people start a business. Based on the assumptions above, 800 of them suck at business, whereas 200 of them are good at it. We’ve already mentioned that (say) 75% of the businesses will survive their first year, and 25% will fail: 750 vs. 250. Let’s further assume that

P(fail|suck) = 30%.

I pulled that percentage out of a hat, but it seems reasonable to assume that if 25% of all businesses fail in their first year, then more than 25% of businesses run by sucky managers will fail. But not too much more: 75% of businesses still succeed in their first year, no matter who is running the show. With P(fail|suck) fixed, we’re forced to concede that

P(fail|good) = 5%.

Bad luck, that. Some 5% of businesses run by good managers will still fail in their first year. These are the people who can rightly blame the administration. (Where did the 5% come from? Well, 30% * 800 + 5% * 200 = 250 failed businesses.)

Now: on to Bayes’ Theorem! (For a discussion of how to use this handy tool, see here). We find that

P(suck|fail) = [P(fail|suck) P(suck)]/[ P(fail|suck) P(suck) + P(fail|good) P(good)]

P(suck|fail) = [(0.3) (0.8)]/[ (0.3) (0.8) + (0.05) (0.2)] = 0.96

Translated into English, this means that if your business fails in its first year, we can conclude that it was probably your fault. There’s about a 96% chance that you suck at business.
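If you’d rather check the arithmetic with a script than by hand, here’s a quick sketch in Python. The numbers are the assumed priors and failure rates from the discussion above, not real data, and the variable names are mine:

```python
# Assumed priors and conditional failure rates from the post.
p_suck, p_good = 0.80, 0.20
p_fail_given_suck, p_fail_given_good = 0.30, 0.05

# Total probability of a first-year failure (should match the ~25% figure).
p_fail = p_fail_given_suck * p_suck + p_fail_given_good * p_good

# Bayes' Theorem: probability you suck, given that your business failed.
p_suck_given_fail = p_fail_given_suck * p_suck / p_fail

print(round(p_fail, 2))             # 0.25
print(round(p_suck_given_fail, 2))  # 0.96
```

Try varying the assumptions yourself: as long as P(fail|suck) > P(fail|good), you’ll always find P(suck|fail) > P(suck).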

This is a fascinating result…and you’ll get similar results no matter what assumptions you make, as long as they are reasonable and consistent. For one thing, you’ll always get that P(suck|fail) > P(suck). If I take a person off the street, they have about an 80% chance of sucking at business. But if I take a failed business owner, someone whose business failed in its first year, then I have some extra data, and their chance of sucking at business has gone up to 96%. (There’s a 4% chance that they are good at business but got unlucky.)  A person whose business fails has no right to complain about their failure. It was probably their fault.

There are two lessons here. One, stop whining. Two, remember Bayes’ Theorem!


I suck at business, too.


Anyone who studies physics and/or mathematics has encountered the following conundrum:

How do you distinguish 18th-century French mathematicians with surnames beginning with an “L”? (I call these E.C.F.M.W.S.B.W.A.L.’s)

For example, you might recall that an E.C.F.M.W.S.B.W.A.L. invented the calculus of variations, sometime around 1760.  Was it Legendre?  Lagrange? Laplace?  Or maybe you remember that an E.C.F.M.W.S.B.W.A.L. was the father of probability theory, and worked on the Buffon needle problem.  Was that Laplace?  Legendre? Lagrange?

So as a public service, I’ve sorted this out for you.  I’ll discuss these three great mathematicians in turn, and hope to distinguish them in your mind.

Lagrange: perhaps the best mathematician of the 1700’s.

Lagrange is the oldest of the E.C.F.M.W.S.B.W.A.L.’s, born in 1736.  Some call him the greatest mathematician of the century, although I might give that title to Euler.  In any case, he’s responsible for a host of discoveries: he pretty much invented an entire branch of mathematics, the calculus of variations; he used this tool to reformulate classical mechanics (think L = T – V), making it suitable for non-Cartesian coordinates, such as polar; he invented Lagrange multipliers, an elegant way to deal with constraints in optimization problems; and he introduced the f(x), f′(x), f″(x), … (prime) notation for derivatives.

His greatest work was Mécanique analytique; all of the above achievements are found in this book.  Hamilton described the work as “a scientific poem,” for its elegance is astounding.



Lagrange was rigorous and abstract: he bragged that the Mécanique analytique did not have a single diagram.  To Lagrange, math was an art; the aesthetics of a theory took precedence over utility.

Laplace: the “applied” mathematician

Laplace was thirteen years younger than Lagrange, born in 1749.  He also is associated with classical mechanics, but unlike Lagrange, he did not reformulate the field per se.  Rather, he took Newtonian mechanics to its “apex” with his work Mécanique céleste.  This work is brilliant, but it’s also clunky and difficult.  It analyzes the orbits of all known bodies in the solar system, and concludes that there is no need of God to keep the whole mess going.  In fact, Napoleon supposedly asked why Laplace didn’t mention God in the Mécanique céleste.  Laplace reportedly replied, “I had no need of that hypothesis.”



Laplace didn’t place as much emphasis on “beauty” in mathematics.  To him, math was just a tool.  Not surprisingly, he contributed to the “applied” field of probability theory; in fact, he’s arguably the founder of probability theory as we know it today.

Legendre: the elliptic integral guy

Although highly regarded in his day, Legendre (b. 1752) is really a tier below the first two guys.  Basically, he worked out how to do certain elliptic integrals, and he introduced the Legendre transformation, which is used in many branches of physics.  For example, you can go back and forth between the Hamiltonian and Lagrangian approaches to classical mechanics by means of a Legendre transformation.  Also, such transformations are ubiquitous in thermodynamics (think U → H → A → G).

Legendre is also known for the portrait debacle.  Only a single known image of Legendre exists, and that image is not flattering:



Every other supposed portrait of Legendre is actually a picture of some obscure politician, thanks to a mistake that has propagated for 200 years.

In summary:

Lagrange: the beauty of math; reformulated mechanics in the Mécanique analytique

Laplace: math as a tool; Newtonian mechanics reaches its zenith in Mécanique céleste; probability theory

Legendre: the creepy looking elliptic integral guy

Note: I have not mentioned Lavoisier (b. 1743) because he was a chemist.  But if you really need him:

Lavoisier: a chemist who was guillotined in the French Revolution.

[Note added Dec. 4, 2014]  I could have included L’Hopital (French, died 1704) but all he did was write a textbook.  Laguerre was French, but he was born in 1834;  Lebesgue was French, but he was born in 1875.


As promised, the solutions…

1.   681472 [Um, didn’t we answer this one earlier?]

2.   3927.27272… seconds This is the amount of time it takes the minute hand of a clock to lap the hour hand.  (The minute hand gains 330° per hour on the hour hand, so the hands coincide every 360/330 = 12/11 hours = 43200/11 seconds.)  For example, the hands coincide at midnight; they next coincide 3927.27272 seconds later, or at about 1:05:27 AM.

3.   23.14069… This is just e^π.

4.   2.1656 x 10^185 This is how many cubic Planck lengths fit in the observable universe…basically, if our universe were a 3D computer, this is how many voxels you’d need.

5.   1.03 light year/year^2 This is the acceleration due to gravity g, in non-standard units.  It has the following interpretation: if you ignored relativity and accelerated at a rate of 1 g (reasonable for a starship), after a year you’d have reached the speed of light.

6.   133956 This is the number of possible combinations of two birthdays, since 133956 = 366^2.  If everyone on Earth had a significant other, there would be over 26,000 couples with the exact same two birthdays as you and your other.

7.   About 19.5 million people The number of people on Earth who share your birthday.

8.   0.739085… This is called the “Dottie number”…an irrational number that solves the equation cos x = x.

9.   1.72048 m^2 The area of a regular pentagon with sides of 1 m.

10.   0.004295 % This is what percent of Earth’s history Homo sapiens has been around.
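Several of these answers are easy to spot-check with a few lines of Python.  This is just a quick sketch (the numbered comments refer to the solutions above, and the variable names are mine):

```python
import math

lap_time = 12 * 3600 / 11    # 2: minute hand laps hour hand every 12/11 hours
e_to_pi = math.e ** math.pi  # 3: e^pi
birthday_pairs = 366 ** 2    # 6: ordered pairs of birthdays (incl. Feb 29)

dottie = 1.0
for _ in range(100):         # 8: iterate cos to find the fixed point of cos x = x
    dottie = math.cos(dottie)

# 9: area of a regular pentagon with unit sides
pentagon_area = math.sqrt(5 * (5 + 2 * math.sqrt(5))) / 4

print(round(lap_time, 5))       # 3927.27273
print(round(e_to_pi, 5))        # 23.14069
print(birthday_pairs)           # 133956
print(round(dottie, 6))         # 0.739085
print(round(pentagon_area, 5))  # 1.72048
```

(The Dottie number iteration converges because |sin x| < 1 near the fixed point; 100 iterations is far more than enough.)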


Many Worlds Puzzle #3

Today there are really 10 puzzles. Can you figure out the significance of each number below? I’ve answered the first to get you started.

1.   681472

This number has a prime factorization of 2^9 x 11^3, which indicates that it equals 88^3. There are 88 keys on a piano…so one obvious interpretation is that the number 681472 is the number of possible three-note permutations that could start any piece on a piano (not counting rests, and ignoring duration). I wonder how many of the permutations have actually ever been played over the years?

2.   3927.27272… seconds

3.   23.14069…

4.   2.1656 x 10^185

5.   1.03 light year/year^2

6.   133956

7.   About 19.5 million people

8.   0.739085…

9.   1.72048 m^2

10.   0.004295 %

Because many of these problems are challenging, I will post hints in a week or so.


Up until Oct. 7, 2013, my modest blog averaged about 18 hits per day.  Then this happened:


A post of mine, the 9 kinds of physics seminar, had gone viral.  I was shocked, to say the least.

I spent some time investigating what happened.  The original post went out on a Thursday, Oct. 3.  Nothing much happened, other than a few likes from the usual suspects (thank you, John Zande!).  I did share the post with Facebook friends, which include not a few physicists.  (Note: I don’t normally share my blog posts to Facebook.)  Then on Monday, Oct. 7, the roof caved in.

It started in India.  Monday morning, I had over 800 hits from India.  My initial thought was that I was bugged somehow.  But soon, hits started pouring in from around the world, especially the USA.

And then it kept going.

On Tuesday, Oct. 8, the Physics Today Facebook page shared the post, where (as of today) 451 more people have shared it, and 188,000 people have liked it.  (Interesting question: my blog has only had 130,000 views.  Are there really that many people who like Facebook posts without even clicking on the link?)

The viral spike peaked on Wed., Oct. 9.  I had discovered by then that my post had been re-blogged and re-tweeted numerous times, by other physicists around the world.  If you Google “The 9 kinds of physics seminar” you can see some of the tweets for yourself.

Why did the post go viral?  Who knows.  I’m not a sociologist.  I think it was a good post, but that’s not the whole story.  More importantly, the post was funny, and it resonated with a certain segment of the population.  If I knew how to make another post go viral, I’d do so, and soon be a millionaire.

What’s fascinating to me, though, as a math nerd, is to examine how the virality played out mathematically.  Here’s how it looked for October:

Chart 1

I don’t know anything, really, about viral cyberspace, but this graph totally matches my intuition.  Note that for the last few days, the hits have been around 400/day, still much greater than the pre-viral era.

After the spike, is the decay exponential?  I’m not a statistician (maybe Nate Silver could help me out?) but I do know how to use Excel.  Hence:

Chart 2

The decay constant is 0.495 per day, corresponding to a half-life of 1.4 days (half-life = ln 2/λ).  So after the peak, the number of hits/day was reduced by 1/2 every 1.4 days.

This trend didn’t continue, however.  Let’s extend the graph to include most of October:

Chart 3

Over this longer time span, the decay constant of 0.281 per day corresponds to a half-life of 2.5 days.  The half-life is increasing with time.  You can see this by noticing that the first week’s data points fall below the exponential fit line.  It’s as if you had a radioactive material whose half-life increases: the number of decays per day still goes down with time, but it goes down more slowly than a single exponential would predict.  (Calculus teachers: cue discussion about first vs. second derivatives.)

Maybe this graph will help:

Chart 4

The long-term decay constant seems to be 0.1937 per day, corresponding to a half-life of 3.6 days.  At this rate, you would expect the blog hits to approach pre-viral levels by mid-November.  I doubt that will happen, since the whole experience generated quite a few new blog followers; but in any case, the graph should level off quite soon.  What the new plateau level will be, I don’t know.
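All three half-lives quoted above come from the same relation, half-life = ln 2/λ.  A quick sketch, plugging in the three fitted decay constants:

```python
import math

# Half-life (in days) from a fitted exponential decay constant (per day):
#   hits(t) = A * exp(-lam * t)  =>  t_half = ln 2 / lam
half_lives = [round(math.log(2) / lam, 1) for lam in (0.495, 0.281, 0.1937)]

print(half_lives)  # [1.4, 2.5, 3.6]
```

So as the fit window widens, the effective half-life stretches from 1.4 to 2.5 to 3.6 days, which is exactly the slowing decay described above.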

Where’s Nate Silver when you need him?


If you enjoyed this post, you may also enjoy my book Why Is There Anything? which is available for the Kindle on Amazon.com.


I am also currently collaborating on a multi-volume novel of speculative hard science fiction and futuristic deep-space horror called Sargasso Nova.  Publication of the first installment will be January 2015; further details will be released on Facebook, Twitter, or via email: SargassoNova (at) gmail.com.


In my continuing effort to present cutting-edge research, I present here my findings on the 9 kinds of physics undergrad.

First, let’s look at a scatter plot of Ability vs. Effort for a little more than 100 students.  (This data was taken over a span of five years at a major university which will remain unnamed.  Even though it’s Wake Forest University.)

HR diagram

Student ability is normalized so that 1 is equivalent to the 100th percentile, 0 is the 50th percentile, and –1 is the 0th percentile.  [This matches the work of I. Emlion and A. Prilfül, 2007]  Ability scores below –0.5 are not shown (such students more properly belong on the Business Major H-R diagram).

On the x-axis is student effort, given as a spectral effort class [this follows B. Ess, 2010]:

O-class: Obscene

B-class: Beyond awful

A-class: Awful

F-class: Faulty

G-class: Good

K-class: Killer

M-class: Maximal

As you can see, most students fall onto the Main Sequence.

HR typical

The Typical student (effort class G, 50th percentile) has a good amount of effort, and is about average in ability.  They will graduate with a physics degree and eventually end up in sales or marketing with a tech firm somewhere in California.

HR giant

The Giant student (effort class K, 75th percentile) has a killer amount of effort and is above average in ability.  Expect them to switch to engineering for graduate school.

HR smug

The Smug Know-it-all student (effort class O, 100th percentile) is of genius-level intellect but puts forth an obscenely small amount of effort.  They will either win the Nobel prize or end up homeless in Corpus Christi.

HR grad

The Headed to grad school student (effort class B, 75th percentile) is beyond awful when it comes to work, and spends most of his/her time playing MMORPGs.  However, they score well on GREs and typically go to physics graduate schools, where to survive they will travel to the right (off the main sequence).

HR industry

The Headed to industry student (effort class F, 55th percentile) is slightly above average but has a faulty work ethic.  This will change once they start putting in 60-hour weeks at that job in Durham, NC.

HR phobe

The Hard working math-phobe student (effort class M, 30th percentile) is earnest in their desire to do well in physics.  However, their math skills are sub-par.  For example, they say “derivatize” instead of “take the derivative”.  Destination: a local school board near you.

HR super

The Supergiant student (effort class K, 100th percentile) is only rumored to exist.  I think she now teaches at MIT.

HR frat

The Frat boy student (effort class O, 50th percentile) is about average, but skips almost every class.  Their half-life as a physics student is less than one semester.  They will eventually make three times the salary that you do.

HR dwarf

The White dwarf student (effort class B, 30th percentile) is below average in ability and beyond awful when it comes to putting forth even a modicum of effort.  Why they don’t switch to another major is anyone’s guess.


If you enjoyed this post, you may also enjoy my book Why Is There Anything? which is available for the Kindle on Amazon.com.  The book is weighty and philosophical, but my sense of humor is still there!



I am also currently collaborating on a multi-volume novel of speculative hard science fiction and futuristic deep-space horror called Sargasso Nova.  My partner in this project is Craig Varian – an incredibly talented visual artist (panthan.com) and musician whose dark ambient / experimental musical project 400 Lonely Things released Tonight of the Living Dead to modest critical acclaim a few years back.  Publication of the first installment will be January 2015; further details will be released on our Facebook page, Twitter feed, or via email: SargassoNova (at) gmail.com.



Run for your lives!

A few days ago I heard a story on NPR about wildfires in Yosemite.  It turns out that something like 360 square miles of forest have burned.  Being a math geek, I immediately took the (approximate) square root of 360 in my head:

360 ≈ 19 x 19

I did this without really even thinking about it; I did it in order to be able to visualize the size of the Yosemite blaze.  I now had a picture in my head of a square, 19 miles by 19 miles.  A burning square.  That’s how big the conflagration was.  And the mental math was important because I have no intuition at all about square units.

[Disclaimer for my readers not in the USA: I use the S.I. units (m/kg/s) in my physics research, but in American culture units like miles, inches, gallons, etc. are still endemic.  Sorry about that.]

Quick: how many square feet is a baseball diamond?  If you’re like me, absolutely nothing comes to mind.

I do know that a baseball diamond is a 90 ft. x 90 ft. square.  So that’s the answer: 8100 sq. ft. (752.5 m²).  The problem is that, somehow, psychologically, 90 ft. x 90 ft. seems much smaller than 8100 sq. ft., even though they are the same.

The county I live in, Jackson County, NC, is 494 sq. mi. (1,279 km²).  Somehow, this seems big to me.  But in order to better visualize this area, take a square root: the county is like a 22 mile x 22 mile square (36 km x 36 km).  In those terms, the county seems puny (although it is still bigger than Andorra).  The area of Jackson County is less than 1% of the total area of the state of North Carolina.

What about the Yosemite fire?  360/494 = 73%.  So that fire is about three-fourths the size of the puny county that I live in.  A big fire, sure, but not apocalyptic.

The problem that all of this illustrates is one of scaling.  Most of my students know that 1 m = 100 cm.  However, very few know (initially) that 1 m² ≠ 100 cm².  Instead, 1 m² = 10,000 cm².  That’s because a square meter is a 100 cm x 100 cm square.

This fact leads people’s intuitions wildly astray.  Suppose I double the length and width of an American football field.  The area goes up by a factor of 4.  What was approximately 1 acre has become 4 acres.  Suppose I switch from a 10-inch pizza, which feeds 2, to a 20-inch pizza.  That pizza feeds 8.

It gets even stranger if you imagine the switch from length to volume.  Michelangelo’s David is almost 17 ft. tall.  Assume the real David was 5’8’’ (68 inches).  Then the statue represents a scaling factor of 3 in terms of length (3 x 68 = 204 in. = 17 ft.).  Imagine a real-life David, 17 ft. tall.  How much would he weigh?  If the life-size David weighs 160 pounds, the 17 ft. David would weigh 160 x 3^3 = 160 x 27 = 4,320 pounds.  To most people, this seems very strange.
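The rule behind all of these examples: if length scales by a factor k, area scales by k^2, and volume (hence weight) by k^3.  Here’s a tiny sketch using the David numbers from above (variable names are mine):

```python
# Length scales by k => area scales by k**2, volume (and weight) by k**3.
k = 204 / 68              # David: 204 in. statue vs. a 68 in. man -> k = 3
area_factor = k ** 2      # a 3x taller David has 9x the surface area
weight = 160 * k ** 3     # weight scales with volume: 160 * 27 = 4320 lb

print(k)            # 3.0
print(area_factor)  # 9.0
print(weight)       # 4320.0
```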


He weighs 4320 pounds. If he weren’t made of stone, that is.

But back to my original idea: I had mentioned that I had no intuition about square units.  I don’t think many people do.  What intuition I do have is based on experience, and comparing unknowns to knowns.  500 sq. miles is about the size of the county I live in.  An acre is about a football field.  1000 sq. ft. is about the area of a small house.  500 sq. in. is about the area of a modest flat screen TV.  100 fm² (a barn) is about the cross-sectional area of a uranium nucleus.  A hectare is about 2.5 football fields stuck together.  And so on.  I’m sure you have your own internal mnemonics to help you gauge area, or volume.

If not, just remember: you can also do the square root in your head.  So if that guy on NPR says there’s a fire that’s 100,000 sq. miles in area, you can visualize

100,000 ≈ 316 x 316

and since this is very similar to the size of Colorado (380 miles x 280 miles), you can start kissing your loved ones and planning for the apocalypse.
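The mental trick above is trivial to automate: take the square root of an area and read off the side of the equivalent square.  A sketch (the function name is mine):

```python
import math

def equivalent_square_side(area):
    """Side length of a square with the given area (any consistent unit)."""
    return math.sqrt(area)

print(round(equivalent_square_side(360)))      # 19  -- the Yosemite fire, in miles
print(round(equivalent_square_side(100_000)))  # 316 -- a Colorado-sized blaze
```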

