Last time, I showed a way of using a graph to model a particular kind of puzzle as a search problem. Games and puzzles provide a lot of examples of how we can use graphs
to model problems. Another example of this is state-space search, one of the most basic techniques in computer science.
Archive for: September, 2007
Much to my professional shame, PZ recently pointed out David Plaisted, a Computer Science professor at
the University of North Carolina, who has an anti-evolution screed on his university
website. Worse, it's typical creationist drivel, which anyone with half a brain should know is utter
rubbish. But worst of all, this computer scientist's article is chock full of bad math. And it's
deliberately bad math: this guy works in automated theorem proving and rewrite systems - there's no way
that he doesn't know what utter drivel his article is.
Via Atrios, I found this article at the American Prospect, which demonstrates a very
common and very serious math error that's constantly made in the media: unit errors. If you want to
compare two numbers, you need to make sure that they're actually numbers that can be compared. You can't meaningfully compare height in inches to height in centimeters; you can't compare income in dollars
to income in euros. To do a meaningful comparison, you need to convert to a common unit.
The specific error pointed out by the Prospect was in the New York Times. The Times published an article discussing politics and economics in Germany. They make some silly arguments about how the Germans don't like government economic reforms because they're all a bunch of lazy socialists who like to have the government take care of them. In support of that, they compare the
unemployment rate of Germany to the unemployment rate in the US. And that's where they make their
error: the unemployment rates that they compare don't measure the same thing. It's a slightly
subtle kind of units problem - but it is a units problem.
The German government reports official unemployment numbers using a unit which is "percentage of
employable people without full-time employment". By contrast, the US government reports official unemployment numbers in units of "percentage of people eligible to work who have no employment at all". In the German figures, if you work a part-time job 20 hours per week, you're considered unemployed. In the US, if you work a part-time job 20 hours per week, you're considered employed. A huge portion of the "unemployed" Germans would be considered employed in the US; or, put the other way, a large number of people who are considered employed in the US because they're working part-time jobs would be considered unemployed
in Germany because they don't have full-time jobs.
This difference in units is not an unknown or obscure fact. The Organisation for Economic
Co-operation and Development - which is as close as exists to an official authority on these kinds of
statistics - clearly documents the difference between how different governments report unemployment. They
further provide a standard measure of unemployment, which is almost the same as the US measure. (The
difference is that the US doesn't consider you unemployed if you're employable but not currently looking
for work; the OECD numbers consider anyone eligible for work but not working as unemployed.)
The Times cites the German rate as 9%, compared to the US rate of 5%. But that's comparing apples to oranges. The accurate comparison, using the OECD's standard measure, is 6% in Germany versus 5% in the US. Not nearly such a
big difference as the original article makes out.
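To make the units problem concrete, here's a toy calculation with made-up numbers (these are illustrative assumptions, not real labor statistics): the same population produces very different "unemployment rates" depending on whether part-time workers count as employed.

```python
# Hypothetical labor force of 100 people (illustrative numbers only).
labor_force = 100
full_time = 70   # people with full-time jobs
part_time = 20   # people with part-time jobs
no_job = 10      # people with no job at all

# German-style unit: anyone without a full-time job counts as unemployed.
german_style = (part_time + no_job) / labor_force

# US-style unit: anyone with any job at all counts as employed.
us_style = no_job / labor_force

print(german_style, us_style)  # 0.3 vs 0.1 - for the exact same population
```

Same people, same jobs - but one "rate" is three times the other, purely because of the unit of measurement.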
This kind of error is very common, particularly in reporting about economics. But it's very bad math.
In one of Jeff Shallit's recent posts on the Panda's Thumb, he mentioned that Tom Bethel, aside from being a creationist, was also a relativity denier. In general, relativity denial is a veritable mine of bad math. So I went looking - and found Bethel's anti-relativity site. As I expected, we've got extremely silly bad math. In fact, it's the worst kind of bad math - it's a lack of math masquerading as being math. It's also, sadly, full of pathetic errors.
As I've mentioned before, the real use of graphs is as models. Many real problems can be
solved using graphs as models - that is, you translate the problem into a graph, solve some problem on the graph, and then translate the result back from the graph to the original problem. This kind of
solution is extremely common, and can come up in some unexpected places.
For example, there's a classic chess puzzle called the Knight's tour.
In the Knight's tour, you have a chessboard, completely empty except for a knight on one square. You can
move the knight the way you normally can in chess, and you want to find a sequence of moves in which it
visits every square on the chessboard exactly once. There are variations of the puzzle for non-standard
chessboards - boards larger or smaller than normal, toroidal boards (where you can wrap around from the left edge to the right, or the top to the bottom), etc. So - given a particular board, how can you
(1) figure out if it's possible to do a knight's tour, and (2) find the sequence of moves in a tour
if one exists?
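The graph translation described above can be sketched in a few lines: each square is a vertex, each legal knight's move is an edge, and a tour is a Hamiltonian path in that graph. This is a minimal sketch, not a serious solver - it assumes a plain square board with 0-based (row, column) squares, and uses backtracking with Warnsdorff's heuristic (try the squares with the fewest onward moves first) to keep the search fast.

```python
# The eight knight moves, as (row, column) offsets.
MOVES = [(1, 2), (2, 1), (2, -1), (1, -2), (-1, -2), (-2, -1), (-2, 1), (-1, 2)]

def neighbors(square, n, visited):
    """Yield the unvisited squares a knight's move away on an n-by-n board."""
    r, c = square
    for dr, dc in MOVES:
        nr, nc = r + dr, c + dc
        if 0 <= nr < n and 0 <= nc < n and (nr, nc) not in visited:
            yield (nr, nc)

def knights_tour(n, start):
    """Return a list of squares visiting every square exactly once, or None."""
    path = [start]
    visited = {start}

    def extend():
        if len(path) == n * n:
            return True
        # Warnsdorff's heuristic: fewest onward moves first prunes drastically.
        for sq in sorted(neighbors(path[-1], n, visited),
                         key=lambda s: sum(1 for _ in neighbors(s, n, visited))):
            path.append(sq)
            visited.add(sq)
            if extend():
                return True
            path.pop()       # dead end - backtrack and try the next square
            visited.remove(sq)
        return False

    return path if extend() else None
```

Because the backtracking is exhaustive, this answers both questions at once: it returns a tour if one exists from the starting square, and None if none does.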
If you remember, a while back, I wrote about a British computer scientist named James Anderson, who
claimed to have solved the "problem" of "0/0" by creating a new number that he called nullity. The
creation of nullity was actually part of a larger project of his - he claims to have designed a computing
machine called the Perspex machine which is strictly more powerful than the Turing machine. If
this was true, it would mean that the Church-Turing thesis is false, overturning a huge part of the theory
of computer science.
Of course, just overturning the theory of computer science isn't grandiose enough. He also claims that this solves the mind-body problem, explains how free will works, and provides a potential
grand unified theory of physics. From Anderson's own introduction on his website:
The Book of Paragon is a web site that offers one solution to the centuries old philosophical conundrum
of how minds relate to bodies. This site shows that the perspective simplex, or perspex, is a simple
physical thing that is both a mind and a body.
The perspex can be understood in many ways. Mathematically, the perspex is a particular kind of matrix; concretely, it is simultaneously a physical shape, a physical motion, an artificial neuron, and an instruction for a machine that is more powerful than the Turing machine. In other words, a perspex is an instruction for a perspex machine that is more powerful than any theoretically possible digital computer.
The perspex machine operates in a 4D space of perspexes called perspex space. This space is related to the 4D spacetime we live in. It is claimed that the perspex machine can describe any aspect of the universe we live in, and can be built from any part of our universe. In other words, the universe can be understood as a perspex machine. And, on the materialistic assumption, our bodies and minds are physical things so they, too, can be understood as perspex machines.
This site contains mathematical formulas for the perspex machine and for properties such as feeling, consciousness, and free will. These things are described in scientific papers, books, and software that you can download and run. The site also contains news items that explain the perspex machine in a non-technical way, and it has links to old research on the perspex machine.
He also claims that the Perspex machine can prove the existence of free will, God, and original sin.
One thing you've got to give to Anderson - the guy's got ambition.
- Naftule's Dream, "Something is There": What do you get when you mix up a traditional Klezmer band with Ornette Coleman, plus just a bit of thrash? Naftule's Dream.
- Genesis, "Counting out Time": a catchy little tune from Peter Gabriel's opus with Genesis, "The Lamb Lies Down on Broadway". It's astonishing just how undated this sounds.
- Mogwai, "Glasgow Mega-Snake": one of my favorite tracks by one of my favorite post-rock bands. This one should be listened to loud to really get the full effect.
- Pink Floyd, "Another Brick in the Wall (Part 3)": Wow, now here's one I haven't listened
to in a very long time. It's as good as I remembered.
- Porcupine Tree, "Prepare Yourself": Brilliant stuff from Porcupine Tree.
- New Grange, "Going to Boston": This is what happens when Darol Anger gets goofy: a sort
of old-timey rap song based on a traditional New England fiddle tune. This is great fun to see
- Hugh Blumenfield, "Longhaired Radical Socialist Jew": the greatest gospel song of all time. "Jesus was a homeless lad with an unwed mother and an absent dad; And I don't really think he'd
have gotten that far, if Newt, Pat, and Jesse had followed that star. So let's all sing out praises to that Longhaired Radical Socialist Jew".
- Rush, "Bravest Face": It's really good to hear Rush back in full form. This is a great song, and so obviously Rush - the Peart drums, the Lifeson guitar playing, and above it all Geddy Lee's voice and amazing bass playing.
- Spock's Beard, "All That's Left": SB is a great neo-progressive band. They went through
a bit of a rough patch a few years ago, when the former leader of the band found Jesus and quit the
band. This album is the first where they really feel like they're comfortable with the new lineup. It's definitely got a different sound from a lot of their older stuff, but it's still recognizable as the same band. This is a really nice track - it's almost a ballad, but with a really solid structure, some odd rhythms, and a lot of transitions.
- Jonathan Coulton, "Todd the T1000": A very silly song from a very geeky pop-singer.
When you mention fractals, one of the things that immediately comes to mind for most people
is fractal landscapes. We've all seen amazing images of mountain ranges, planets, lakes, and things
of that sort that were generated by fractals.
Seeing a fractal image of a mountain, like the one in this image (which I found
here via a Google image search for "fractal mountain"), I expected to find that
it was based on an extremely complicated fractal. But the amazing thing about fractals is how
complexity emerges from simplicity. The basic process for generating a fractal mountain - and many other elements of fractal landscapes - is astonishingly simple.
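One of the simplest versions of that process is midpoint displacement: start with a straight line, repeatedly insert a midpoint between each pair of neighbors, and nudge it up or down by a random amount that shrinks at each level. Here's a minimal one-dimensional sketch (the heights at both ends, the halving of the roughness, and the function name are my assumptions for illustration); full 2D terrain applies the same idea on a grid, in what's known as the diamond-square algorithm.

```python
import random

def midpoint_displacement(levels, roughness=1.0, seed=None):
    """Return a list of 2**levels + 1 heights forming a fractal profile."""
    rng = random.Random(seed)
    heights = [0.0, 0.0]  # start with a flat line between two endpoints
    scale = roughness
    for _ in range(levels):
        # Insert a randomly displaced midpoint between every adjacent pair.
        refined = []
        for left, right in zip(heights, heights[1:]):
            mid = (left + right) / 2 + rng.uniform(-scale, scale)
            refined.extend([left, mid])
        refined.append(heights[-1])
        heights = refined
        scale /= 2  # smaller bumps at finer scales - this is the fractal part
    return heights
```

A few levels of this produce a convincingly jagged mountain profile - the entire "complexity" is one midpoint average plus one shrinking random nudge.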
Suppose you've got a huge graph - millions of nodes. And you know that it's not connected - so the graph actually consists of some number of pieces (called the connected components of the graph). And there are constantly new vertices and edges being added to the graph, but nothing is ever removed. Some questions you might want to ask about this graph at a particular point in time are:
- How many components are there in the graph?
- Which component is vertex X in?
- Are vertices X and Y in the same component?
All of these questions are variants of a classic computer science problem, called
union-find, which also comes up in an astonishing number of different contexts. The name comes from the two basic operations
in the standard solution: union, and find. Basically, the division of the graph into
components is also a partition of the vertices of the graph into disjoint sets: union-find
is the problem of maintaining that kind of partition into disjoint sets when the sets can
change over time.
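A minimal sketch of the standard structure, assuming vertices are hashable labels (the class and method names here are my own choices, not a fixed API): find() answers "which component is X in?", and union() merges two components when a new edge arrives. Path compression in find() plus union-by-size keeps both operations nearly constant time, amortized.

```python
class UnionFind:
    def __init__(self):
        self.parent = {}      # each vertex points toward its set's root
        self.size = {}        # size of the set rooted at each root
        self.components = 0   # current number of disjoint sets

    def add(self, x):
        """Register a new vertex as its own one-element component."""
        if x not in self.parent:
            self.parent[x] = x
            self.size[x] = 1
            self.components += 1

    def find(self, x):
        """Return the root (representative) of x's component."""
        root = x
        while self.parent[root] != root:
            root = self.parent[root]
        while self.parent[x] != root:   # path compression: repoint to root
            self.parent[x], x = root, self.parent[x]
        return root

    def union(self, x, y):
        """Merge the components containing x and y (a new edge arrived)."""
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.size[rx] < self.size[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx            # attach smaller set under larger
        self.size[rx] += self.size[ry]
        self.components -= 1
```

To use it on the huge growing graph above: call add() for each new vertex and union() for each new edge as they arrive, and "are X and Y in the same component?" is just find(X) == find(Y).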
There are a lot of very cool problems in computer science that can be solved by using
an appropriate data structure; and the data structures are often easiest to describe in terms
of graphs. And with those data structures, one thing that often comes up is amortized algorithmic complexity. Amortized complexity is something which has been occupying my thoughts lately,
because it's come up in several real problems, so I'm in the mood to write about it.
The idea of amortized complexity is that for some structures, the worst-case
cost of a series of operations is different from the worst-case cost of a single operation. In amortized analysis, you consider cases where some operation is inexpensive most of the time - but to keep it inexpensive most of the time, you need to periodically do something expensive.
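The textbook example of that pattern is an append-only array that doubles its capacity when full - a sketch, not how any particular language implements its arrays. A single append is O(n) in the worst case (when it triggers a copy), but a sequence of n appends does only O(n) total copying, so each append is O(1) amortized. The copy counter below exists purely to make the bound visible.

```python
class DynamicArray:
    def __init__(self):
        self.capacity = 1
        self.length = 0
        self.data = [None]
        self.copies = 0  # total element copies, to observe the amortized bound

    def append(self, value):
        if self.length == self.capacity:
            # The occasional expensive step: double capacity, copy everything.
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.length):
                new_data[i] = self.data[i]
                self.copies += 1
            self.data = new_data
        # The cheap common case: write into the next free slot.
        self.data[self.length] = value
        self.length += 1
```

After n appends, the doublings copy 1 + 2 + 4 + ... elements - a geometric series summing to less than 2n, which is exactly why the expensive step stays cheap on average.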