Archive

Posts Tagged ‘history’

ANZAC Protests

April 26, 2017

About two years ago I commented on an incident where an Australian sports commentator was fired for making some tweets critical of Australia’s record in past wars. The tweets were made on ANZAC Day, a day observed in Australia and here in New Zealand to commemorate the sacrifice of our military personnel in past wars, especially the Gallipoli campaign in World War I.

The general conclusion I reached then was that the tweets were (probably deliberately) provocative and somewhat insulting, and in most cases not particularly accurate, but also did raise some valid issues related to that country’s participation in war.

On ANZAC Day this year in New Zealand we had a controversy which was also related to criticism of our past war record. This time, a group was protesting the lack of a serious investigation into allegations that New Zealand troops may have been involved in war crimes in Afghanistan.

They held a sign protesting civilian war deaths (which read “Lest We Remember: No NZ support for war”) and attempted to place a wreath on a war memorial to remember the civilians allegedly killed in a botched raid led by New Zealand military personnel in Afghanistan in 2010.

At that point they were verbally attacked, especially by a minor political official of New Zealand’s populist party, New Zealand First, and his particularly loud and obnoxious 12-year-old son. Up until then any protest had been minimal and the solemnity of the occasion had hardly been disturbed.

They approached the protesters and shouted that they should not be there. The boy then said “Do it tomorrow, do it the day before, do it any day – but today it is wrong, wrong, wrong” and “You are so inappropriate, I just cannot believe this.”

Note that word “inappropriate”, which I have commented on before. This word is used (and I must admit to being guilty of this occasionally myself) as a way to say that you don’t like something but want to make it seem like your dislike is based on something more universal or objective.

So instead of saying “I don’t like that” a person will say “that is inappropriate”, because whether something is inappropriate or not is, in most cases, a matter of opinion. And it certainly is in this case.

In fact a poll run by NZ news organisation Newshub showed 67% of respondents thought protests on ANZAC Day were OK. I do need to emphasise this wasn’t a scientific poll and (at the time I voted) only had 2500 votes, but it did seem to correlate with the majority of comments I saw on the subject.

The protesters were peaceful and reasonable, and the only time the subdued mood of the occasion was broken was when they were shouted at. Even then, they replied in a quiet and reasonable way.

I should also say that if someone is going to criticise another group, especially in a context like this, then they should expect to be criticised in return, and I do think it is good that people make their strongly held opinions known. But it is really a matter of how these things are done, and the vigorous, loud, and seemingly tactless attack on the protesters was unacceptable (see how easy it is to use that word?)

Many people think New Zealand’s official national day, Waitangi Day, has been spoiled by protest (and I have blogged about this in the past) and it might be that another important day for this country is heading that way too.

I don’t think that is necessarily bad, but the protests have to be reasonable and they shouldn’t be over-done. That is bad for two reasons: first, too much protest spoils the event for others; and second, too much protest loses any meaning and just becomes background noise.

One of the claims made about our country’s past reasons for going to war was to protect our freedoms, such as the ability to speak out against injustice and to protest. It is sort of ironic now if those freedoms are being denied. And it is also ironic if a protest about a protest is more disruptive than the original protest!

If people would just settle down a bit, recognise that there are alternative views on every topic, including the way that our military personnel have acted, and just talk about these things reasonably instead of shouting mindlessly, then everyone would benefit. Will that ever happen? Probably not.

The Meaning of Easter

March 29, 2016

For God so loved the world that he gave his one and only Son, that whoever believes in him shall not perish but have eternal life (John 3:16, New International Version).

This is possibly the best-known verse in the Bible and many would say it is the central message of Christianity. It is also particularly relevant during Easter, of course, since that is when the sacrifice described by John supposedly happened.

Clearly how much importance you attach to these words will depend on your perspective on the Christian religion. Those who take it seriously will probably find the idea both inspirational and highly relevant. Others might find it inconsequential or even bizarre.

Let’s have a quick look at what this is all about…

Supposedly the only way God could solve the problems the world was experiencing at the time (mainly to do with sin, but more on that later) was to send his son, who was in some way both a man and a god, to a primitive and isolated part of the world so that he could spread the message of how to make things better.

And in addition to that God’s son had to be sacrificed in order for this new way of thinking (which in fact wasn’t new and had been discussed by philosophers for years) to become possible. The people who were being saved had to submit to God’s will by accepting his Son as their source of guidance, and if they didn’t they were likely to be punished horribly.

To complicate matters God’s son didn’t write down any of his ideas and no one else around him bothered to either, so there are various versions of what his real intent was.

Then, just to make things worse still, most interpretations of the new message were quite different from the existing one, and about 600 years after all this happened a lot of people think God changed his mind again and sent another messenger down (who really was going to be the last one, this time).

Finally, to add insult to injury, the followers of all God’s various messengers have caused unmatched death, misery, and destruction over the entire planet in a presumably misguided effort to follow their god’s wishes.

At this point you might wonder whether God really knew what he was doing. This really doesn’t sound like the way a competent omnipotent being would operate. And it doesn’t sound like things overall were much better after the sacrifice described in John 3:16 than they were before. In fact, you would really have to wonder, what was the point of it all?

But according to many commentators the world is gradually getting better. It is more peaceful, people live longer, they are happier and more free, and they are more healthy. Is this because of the sacrifice? Well, no, probably not. In fact, many of the worst aspects of the modern world are as a result of people still misinterpreting (presumably) God’s messengers (mainly the more recent one this time) and the real improvements have only happened through ignoring religion and following rationality, and especially the scientific method.

So despite the reverence believers have for this verse – and the message it imparts – to many it reads more as a condemnation of God’s incompetence. Maybe a better message would be this: For God was so incompetent that we would all be better off to ignore his inept bungling and just get on with improving the world ourselves!

A Reason for the Season

December 27, 2015

Well, Christmas is over for another year so I guess it’s about time I spoiled the holiday spirit with one of my curmudgeonly blog posts. We are often asked by the more traditional groups in society to remember the “reason for the season” but what is this and does a reason even exist?

Well no, I don’t think so. I think several reasons exist – one of which is the one the traditionalists are thinking of – but there’s no longer just one reason (and maybe there never was).

So let’s get it out of the way now: the most commonly cited reason for the season is the celebration of the birth of Jesus Christ, the symbolic founder of the Christian Church. As you might have guessed, I have a few comments to make about this particular reason…

First, no one really knows whether Jesus even existed. In fact I believe there are very good reasons to say he didn’t; however I realise that the majority of historians disagree with me on this one. The big problem is that it’s not a simple case of him existing or not existing. The idea that Jesus existed in the way described in the Bible is ridiculous and most historians agree that didn’t happen, but there are some reasons to think the myths might be based on a real person or maybe several people. So if the Jesus myth described in the Bible is very loosely based on real events does that mean he existed or not? It’s somewhere in between.

Second, the birth story is hopelessly confused and contradictory. Prophecy indicated Jesus should be born in Bethlehem but the story already placed him in Nazareth, so a non-existent census had to be invoked to try to reconcile the two. There’s also the non-existent star mentioned in only one gospel, the contradictory virgin myth, the fact that no one knows the day, month or even year of the birth, etc, etc. So choosing December 25 seems to be totally arbitrary (or is it? see below).

Third, Christmas, along with all the other known traditions, dogma, and myths associated with Christianity, only appeared decades or centuries after the alleged events occurred (or, in most cases, didn’t occur) and the special days all seem to be borrowed from earlier traditions. Christmas is clearly a mid-winter celebration, for example, and Easter originally came from a spring or fertility ritual.

But if the birth of Jesus isn’t the reason then what is? In most countries the number of people reporting that they think of Christmas in the traditional, religious sense is shrinking. Christmas for many is about a break from work, time with family, an excuse to buy stuff, or just a summer (southern hemisphere) holiday.

So there is not just one reason, there are many: traditional, modern, religious, family related, consumerist, etc. Many Christians arrogantly assume theirs is the only reason but that isn’t true – it isn’t even the first. If we want to celebrate the original reason let’s go back to pagan rituals like Saturnalia; in fact the descriptions of those sound pretty cool (lots of drinking and sex).

Christians are welcome to their reason, no matter how silly it is, and I’ll stick to mine (enjoying summer, relaxation, drinking, etc) if they don’t mind. At least mine is based on reality.

The Enigma

November 4, 2015

I seem to have had a theme of blogging about people recently. First it was Grace Hopper, then Richard Feynman, and today I’m discussing Alan Turing, the famous computer pioneer, mathematician, and World War II code breaker.

I am currently listening to an audio book biography of his life called “Alan Turing: The Enigma” (a reference to his enigmatic personality and the German code he broke: Enigma) and the thing which I have found most interesting is the way he advocated for, and in some cases invented, many of the ideas and technologies we have today. He did this work after the code breaking he is probably best known for.

So I’ll list a few of his innovations here. I should say that he can’t claim sole credit for some of these because he often worked by himself and “reinvented” ideas (often in a better form) which other early theoreticians and inventors had already found. For example, some ideas go back to Charles Babbage who didn’t have the theory or the technology to exploit them at the time.

Anyway, here’s the list…

Instructions and data should both be stored in main memory.

Many people see these two as being quite separate, and on many early machines the instructions would be read linearly from paper tape or cards while data was stored in fast, random access memory. By putting the code in memory too it could be accessed much more quickly, plus there were two other benefits: any instruction could be accessed at any time so conditional jumps and loops could be done, and instructions could be modified by other instructions (see below for details).

It’s just taken for granted today that code is loaded into working memory (RAM). That’s (mainly) what’s happening when you launch a program (the program’s instructions are being copied from the disk to main memory) or boot your system (the operating system is being loaded into memory) but in the early days (the 1940s and 1950s) this wasn’t obvious.
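
To make the stored-program idea concrete, here is a minimal sketch in Python (the instruction names, encoding, and program are invented for illustration and are not Turing’s design) of a machine where instructions and data sit in one memory, and the processor simply fetches whatever is at the current address.

```python
# A toy stored-program machine: instructions and data live in the same memory.
# The instruction set, encoding, and program are invented for illustration only.
memory = [
    ("LOAD", 8),     # 0: load the value at address 8 into the accumulator
    ("ADD", 9),      # 1: add the value at address 9 to the accumulator
    ("STORE", 10),   # 2: store the accumulator at address 10
    ("HALT", None),  # 3: stop
    None, None, None, None,  # 4-7: unused
    40,              # 8: data
    2,               # 9: data
    0,               # 10: the result goes here
]

def run(memory):
    accumulator = 0
    pc = 0  # program counter: the address of the next instruction
    while True:
        op, addr = memory[pc]  # fetch an instruction from memory, like any data
        pc += 1
        if op == "LOAD":
            accumulator = memory[addr]
        elif op == "ADD":
            accumulator += memory[addr]
        elif op == "STORE":
            memory[addr] = accumulator
        elif op == "HALT":
            return

run(memory)
print(memory[10])  # 42
```

Because the program lives in ordinary memory, nothing stops an instruction from jumping to, reading, or even overwriting another instruction, which is exactly what the next two sections rely on.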

Programs stored in main memory allow conditional jumps and loops.

Conditional statements and loops allow a lot of efficiency and flexibility. A conditional statement allows the computer to run a set of instructions if a certain condition occurs. For example it could test if a bank balance is less than zero and show a warning if it is. Loops allow a chunk of code to be executed multiple times. For example, survey results for 100 participants could be analysed one at a time by skipping back to the start of the analysis code 100 times.

Any modern programmer would find it bizarre not to have access to conditional statements and loops, but some early machines didn’t have these abilities.
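
Both ideas are trivial to express in any modern language; this hypothetical Python snippet simply mirrors the two examples above: a conditional warning on a negative balance, and a loop that runs the same analysis over 100 survey responses.

```python
# Conditional statement: run some code only when a condition holds.
balance = -25
if balance < 0:
    print("Warning: account overdrawn")

# Loop: run the same analysis code once for each of 100 survey responses.
responses = [50 + (i % 7) for i in range(100)]  # stand-in data for 100 participants
total = 0
for response in responses:
    total += response  # the "analysis" done on each response
print("Average response:", total / len(responses))
```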

Code in memory allows self modifying code.

If the programming code is in main memory it can be read and written freely, just like any other data. This allows instructions to be modified by other instructions. Turing used this for incrementing memory locations and other simple stuff but potentially it can be used for far more complex tasks.

I can remember being told, when I did computer science, that self-modifying code was a bad thing because it made programs hard to understand and debug, but it has its place and I use it a lot in modern interpreted languages.
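
In an interpreted language, the usual modern equivalent of self-modifying code is generating code at run time and then executing it. Here is a small, purely illustrative Python sketch of that idea (the field name and the generated function are made up for the example).

```python
# Build code as data at run time, then execute it: the interpreted-language
# analogue of self-modifying code. Everything here is illustrative only.
field = "price"  # imagine this arrives at run time, e.g. from a config file
source = f"def total(rows):\n    return sum(row['{field}'] for row in rows)\n"

namespace = {}
exec(source, namespace)      # compile and run the generated code
total = namespace["total"]   # the freshly created function

rows = [{"price": 10}, {"price": 32}]
print(total(rows))  # 42
```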

Simple, fast processing units and more memory are the best strategy.

Some early American computer designs tried to provide a lot of complex operations built into the main processing unit. This made them more complicated and required more valves (this was before transistors or integrated circuits, of course) for the main processor and fewer for memory. Turing advocated simpler instruction sets which would allow for more memory and more efficient execution, and the programmer could write the complex code using simpler instructions.

This sounds very much like the modern concept of RISC (reduced instruction set computing) processors, which provide a limited range of very fast, efficient instructions and use the extra space on the CPU for cache memory. The compiler generates the more complex operations by combining simpler instructions.
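
As a tiny illustration of the trade-off, here is how multiplication by a constant can be composed from shift and add, two of the simplest and fastest operations a processor can offer (a hypothetical Python example, not taken from any real compiler’s output).

```python
# Multiplying by 10 without a multiply instruction: compose it from two shifts
# and an add, all of which a simple, fast processor can execute directly.
def times_ten(x):
    return (x << 3) + (x << 1)  # 8*x + 2*x == 10*x

print(times_ten(7))  # 70
```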

Microcode and pipelines.

Turing’s computer, the ACE, ran at 1 MHz (one million cycles per second) which was the fastest of any machine at the time. But interpreting each instruction (figuring out what it meant, like where the data should come from) took several cycles and actually carrying out the function took several more. To make things go faster he interpreted the next instruction while the current one was being executed.

Modern processors have a “pipeline” where several stages of processing can be performed simultaneously. Today we also have deep, multiple pipelines (multiple steps of several streams of code being processed at once) and branch prediction (figuring out which instruction will be executed next) but the basic idea is the same.
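
A toy model makes the benefit obvious: with two one-cycle stages, a strictly serial machine needs two cycles per instruction, while an overlapped one needs only one extra cycle in total, because the next instruction is decoded while the current one executes. This Python sketch (with invented instruction names) just prints the overlap, cycle by cycle.

```python
# A toy two-stage pipeline: in each cycle one instruction is being executed
# while the next one is already being decoded.
program = ["LOAD A", "ADD B", "STORE C", "JUMP 0"]

decoding, executing = None, None
cycle = 0
fetched = iter(program)
while True:
    executing, decoding = decoding, next(fetched, None)
    if executing is None and decoding is None:
        break
    cycle += 1
    print(f"cycle {cycle}: executing={executing!r:12} decoding={decoding!r}")

# 4 instructions finish in 5 cycles instead of 8, because the stages overlap.
```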

Subroutines and libraries.

Most CPUs could only do very basic things. For example they could add whole numbers (like 42) but not multiply at all or work with real numbers (those with decimals, like 3.33). But many programs needed these operations, so instead of reinventing them over and over for each new program Turing created libraries of subroutines.

A library is a collection of useful chunks of code to do particular things, like work with real numbers. Modern processors have this function built in but more complex tasks, like reading a packet of data from the internet, still require long sequences of code.

Today computers typically have hundreds of libraries of thousands of subroutines (a rather old term for a chunk of code which can perform a task then return to what the computer was doing before it was run) and in many ways that is mostly what a modern operating system is: a collection of useful libraries.
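
A library in this sense is just a collection of subroutines written once and reused by every program. Here is a deliberately simple, hypothetical Python sketch in that spirit: multiplication and “real number” addition built out of nothing but whole-number addition, the one operation the machine already has.

```python
# A tiny, invented "library" of reusable subroutines built from the one thing
# the machine can already do (add whole numbers), so each new program doesn't
# have to reinvent them.
def multiply(a, b):
    """Multiply two non-negative whole numbers using only addition."""
    total = 0
    for _ in range(b):
        total += a
    return total

SCALE = 100  # fixed-point trick: represent 3.33 as the whole number 333

def to_fixed(x):
    return round(x * SCALE)

def fixed_add(a, b):
    return a + b  # adding the scaled integers adds the real numbers

def from_fixed(n):
    return n / SCALE

print(multiply(6, 7))                                          # 42
print(from_fixed(fixed_add(to_fixed(3.33), to_fixed(1.17))))   # 4.5
```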

Computers could be accessed remotely.

Turing thought that since there were so few computers around it made sense to allow people to access a computer they needed by remote control. He thought this could be done with special devices attached to the phone system.

Simple modems and other serial interfaces allowed this, and now we have the internet. Even though computers are no longer rare (I have 14 conventional computers at home plus many other devices, like iPads and iPhones, which are effectively computers) it is still useful to be able to access other computers easily.

Computers for entertainment.

Turing thought that “ladies would take their computers to the park and say ‘my little computer said something so funny this morning'” (or something similar to this).

I couldn’t help but think of this when my daughter showed me an amusing cat video on her iPhone today. Yes, ladies carry their computers everywhere and are constantly entertained by the funny little things they say.

No one’s perfect.

So what did he get wrong? Well a few things, actually. For example, he wasn’t a great enthusiast for making the computer easy to use. It was necessary to enter input and read output in base 32 expressed using a series of obscure symbols, plus he used binary but with the least significant bit first.
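
For the curious, here is roughly what least-significant-digit-first base 32 looks like (a hypothetical Python sketch; Turing’s actual symbol set isn’t reproduced, plain digit values are shown instead).

```python
# Writing a number in base 32 with the least significant digit first, roughly
# the convention used on the ACE. Digit values stand in for the real symbols.
def base32_lsb_first(n):
    digits = []
    while n:
        digits.append(n % 32)  # the least significant digit comes out first...
        n //= 32
    return digits or [0]       # ...and is written first, left to right

print(base32_lsb_first(1000))  # [8, 31] because 1000 == 8 + 31*32
```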

Perhaps the most important change in the last thirty years has been making computers easier to use. Turing can’t claim much credit in this trend. Still, I see where he was coming from: if it’s hard to build a computer and hard to write code, it should be hard to use them too!

Amazing Grace

October 6, 2015

There is no doubt that in the past (and to a lesser extent in the present) women have been treated unfairly in many situations, such as when they want to become scientists. There are some obvious cases where a Nobel Prize should have been awarded to a woman but that didn’t happen or it was awarded to a man who made a lesser contribution. At one time it was virtually impossible for a woman to get an advanced education. And there are cases where women couldn’t contribute to science at all, or were only allowed to under disadvantageous conditions, such as no pay!

On the other hand I am a bit offended by some of the attempts at redressing this imbalance. Many people produce lists of female scientists who were ignored or who have been forgotten but fail to acknowledge that a similar number of men who made a similar level of contribution have also been forgotten. Unfortunately, except in areas where the person worked, it is all too common to forget about pioneering scientists of either gender.

So there is a bit of political correctness involved in this phenomenon and I don’t like political correctness. However, I’ll put that aside and discuss one of my favourite women scientists, from my area of work (computing), Grace Hopper.

Rear Admiral Grace Murray Hopper (how cool is that) lived from 9 December 1906 to 1 January 1992 and not only made some important contributions to the early development of computer software but also sounded like she was a really interesting character.

She was one of the first programmers of the Harvard Mark I computer, and she developed the first compiler for a computer programming language. Compilers are fiendishly complex programs which convert a program written in a “high level” language to the code a computer can execute.

The instructions computers execute are very simple and do very specific things, such as adding two numbers together. But to add two numbers the computer first has to retrieve them from memory, add them, check for overflow and other conditions, then put the result back into another part of memory. So a simple operation might involve a sequence of obscure instructions such as “MOV AL, NUM1” and “ADD AL, BL”. Remember that these are human readable words for individual machine code instructions.

Humans tend to like to use more sensible instructions like “total = price + tax” which might translate to 10 or 20 machine code instructions like those above.

So a compiler is simply a program which takes the human readable code (which itself can be obscure to non-programmers) and turns it into (even more obscure) instructions which the computer can execute. It sounds simple but it’s not. The compiler has to take potentially complex strings of instructions, check that they make sense, and turn them into machine instructions (possibly hundreds just for one line of high level code) and do it perfectly. Every time.
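
A minimal sketch of the idea (this is not Hopper’s compiler, and the parsing and instruction names are invented purely for illustration) is a routine that turns one human-readable statement into a sequence of simple instructions like those above.

```python
# A minimal sketch of what a compiler does: turn one human-readable statement
# into a sequence of simple machine-level instructions. The parsing and the
# instruction names are invented for illustration only.
def compile_addition(statement):
    target, expression = [s.strip() for s in statement.split("=")]
    left, right = [s.strip() for s in expression.split("+")]
    return [
        f"LOAD  {left}",    # fetch the first operand from memory
        f"ADD   {right}",   # add the second operand
        f"STORE {target}",  # put the result back into memory
    ]

for instruction in compile_addition("total = price + tax"):
    print(instruction)
```

A real compiler does the same kind of translation, but for arbitrarily complicated programs, with syntax checking and optimisation, and it has to get it right every time.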

The high level language COBOL (COmmon Business Oriented Language) was developed from an earlier language called FLOW-MATIC created by Hopper. Back in the day I programmed in COBOL – amongst a lot of other languages – and I hated it because it was too inflexible and awkward. But at least it was a lot easier than programming in assembly language (a slightly simplified version of machine code) which I also did in the past.

So in my opinion that was Hopper’s greatest contribution but there are other details and anecdotes about her I would like to share here.

In 1969 she won the first “man of the year” award from the Data Processing Management Association. Yes, I believe it was called “man” of the year. Sort of ironic, I think.

Attribution of the famous quotation “It’s easier to ask forgiveness than it is to get permission” is often given to her. This is one of my favourite quotes and a principle I often live by too! Like many quotes it’s not certain if she really used it first but it did reveal a certain rebellious part of her personality.

She also allegedly said she would come back to haunt anyone who said “we have always done it that way” in reference to why something was done a certain way. Sure, sometimes there’s a good reason why something has been done a particular way in the past but I think there’s also room to ask why and explore alternatives. That’s how she achieved what she did.

Finally there is the “bug” anecdote. Even non-specialists know that a problem with a computer, especially in software, is often known as a bug, but why? In 1947, while working on the Mark II computer at Harvard University, an associate discovered a moth stuck in a relay which stopped the computer running (yes, mechanical relays were used back then). Hopper remarked that they were “debugging” the system.

Yes, moths aren’t bugs in the technical sense, although they are insects which some people refer to as bugs. Also the term cannot be definitively attributed to Hopper, but she did at least make it popular. We don’t need to worry about that kind of bug (an insect) much any more but we sure still have plenty of the computer type!

So yes, I think “Amazing” Grace Hopper (as she became known) was pretty cool, and I hesitate to say this, but the fact that she was a woman made her even cooler!

Easter Suckers

April 6, 2015

I have a little cartoon depicting a person who might be meant to resemble the traditional appearance of Jesus (which is far from his actual, most likely appearance, assuming he even existed) saying “there’s a sucker born again every minute.” Of course, this is an allusion to the classic phrase attributed to American showman, P. T. Barnum.

I should make one comment before I continue with the main point of this blog post: according to Wikipedia the phrase was “most likely spoken by David Hannum, in criticism of both P. T. Barnum … and his customers. The phrase is often credited to Barnum himself. It means: many people are gullible, and we can expect this to continue.” I always assumed the phrase was from Barnum himself referring to his customers so I have learnt something new already.

But I should get back to the main subject here. The phrase “born again” is often used by Christian nutters to refer to some revolution in their life after conversion to whatever (especially Evangelical) sect of Christianity they have currently got involved with. So the cartoon is suggesting that anyone who believes this is a sucker. It’s hard to disagree.

We have just completed Easter, perhaps the most important event in the Christian calendar, so this blog post is a comment on that. We all know that Easter is just another pagan celebration (like Christmas) hijacked by Christianity and that most of the symbolism of the event (eggs, bunnies, etc) has nothing to do with Christianity but I’ll let that pass this time and move on.

The purported reason for Easter is the alleged crucifixion and resurrection of Jesus Christ so let’s have a look at the authenticity of this event which is so central to Christian mythology and substance. Well to summarise, it’s bullshit. Thank you, that is the end of this post.

But seriously, I need to provide some detail…

What evidence do we have of any of the events of Jesus’ life, including the great supernatural ones, like his resurrection? Well, basically none, if you really want to know. Here are some of the reasons I can make this claim…

1. Absolutely no one who would have witnessed the events bothered to record them. And I agree that records weren’t as good at the time and could have been lost but is this a credible excuse? I don’t think so. Note that the gospels were written by unknown people many years after the events they describe, and people like Paul never met Jesus (not to mention the fact that many of the writings attributed to him have now been shown to be from other unidentified authors).

2. The stories are conflicting and significant details in one are entirely missing from others. For example, only the Gospel of Matthew mentions the Star of Bethlehem. Not only is it not mentioned in any other Biblical story but it isn’t mentioned anywhere else either. Why invent a story like that? And since it almost certainly was invented what else might also be fiction?

There’s another example related to Easter too. Three of the four canonical gospels mention a darkness just after the crucifixion (“From noon on, darkness came over the whole land [or earth] until three in the afternoon”). But John doesn’t bother to mention it and neither does anyone else. Maybe they didn’t notice? And there is a similar problem with the rather silly story about the dead rising. This obviously didn’t happen because no one else mentions it. It’s pure fiction.

3. The stories we have today are just a small selection chosen by a committee hundreds of years after the events supposedly happened. Other gospels have completely different stories from the four in the canonical gospels most people know about. What makes these four so special? Well they suited the purposes of the early church, I guess.

4. The events which would reasonably be expected to be recorded (whether they had supernatural significance or not) weren’t. It is fair to expect that one crucifixion might not have been recorded by the Romans (even though they were good record keepers) or the records might have been lost. But the big events which everyone must have been aware of – the star, the darkness, the dead rising – would surely have been written about so many times that records would have survived. But we have nothing.

Clearly the whole Jesus story is largely fiction. I probably wouldn’t go so far as to say that there was no person that the stories are based on, but there is no resemblance of any of the supernatural mythology (including the resurrection story) to reality. It is so unsupported that you really do have to be a sucker to take it seriously. But many people do, I guess because there’s a sucker born again every minute!

Influential People

November 14, 2014

A recent episode of the excellent podcast Skeptics’ Guide to the Universe presented the following Skeptical Quote of the Week: “Biographical history, as taught in our public schools, is still largely a history of boneheads: ridiculous kings and queens, paranoid political leaders, compulsive voyagers, ignorant generals, the flotsam and jetsam of historical currents. The men who radically altered history, the great creative scientists and mathematicians, are seldom mentioned if at all.”

The quote is by Martin Gardner (1914-2010), a mathematician and writer who had a prominent place in the skeptical community. It’s quite a strong statement and I think it makes a fair point, but just how accurate and justified is it?

First I should look at whether history classes (and by extension our society in general) really do concentrate on political and military leaders. I found an interesting list of important figures at Time magazine where they used a computational process to analyse credible sources, including scanned historical books, to establish a list of the most influential people in history. This isn’t really what the original quote was about but I think it is still worth commenting on.

Here’s how Time described their process: “we evaluated each person by aggregating millions of traces of opinions into a computational data-centric analysis. We ranked historical figures just as Google ranks web pages, by integrating a diverse set of measurements about their reputation into a single consensus value.” and “By analyzing traces left in millions of scanned books, we can measure just how fast this decay occurs, and correct for it.”

Anyway, here’s the list…

1 Jesus
2 Napoleon
3 Muhammad
4 William Shakespeare
5 Abraham Lincoln
6 George Washington
7 Adolf Hitler
8 Aristotle
9 Alexander the Great
10 Thomas Jefferson
11 Henry VIII of England
12 Charles Darwin
13 Elizabeth I of England
14 Karl Marx
15 Julius Caesar
16 Queen Victoria
17 Martin Luther
18 Joseph Stalin
19 Albert Einstein
20 Christopher Columbus
21 Isaac Newton
22 Charlemagne
23 Theodore Roosevelt
24 Wolfgang Amadeus Mozart
25 Plato
26 Louis XIV of France
27 Ludwig van Beethoven
28 Ulysses S. Grant
29 Leonardo da Vinci
30 Augustus
31 Carl Linnaeus
32 Ronald Reagan
33 Charles Dickens
34 Paul the Apostle
35 Benjamin Franklin
36 George W. Bush
37 Winston Churchill
38 Genghis Khan
39 Charles I of England
40 Thomas Edison
41 James I of England
42 Friedrich Nietzsche
43 Franklin D. Roosevelt
44 Sigmund Freud
45 Alexander Hamilton
46 Mohandas Karamchand Gandhi
47 Woodrow Wilson
48 Johann Sebastian Bach
49 Galileo Galilei
50 Oliver Cromwell
51 James Madison
52 Gautama Buddha
53 Mark Twain
54 Edgar Allan Poe
55 Joseph Smith, Jr.
56 Adam Smith
57 David, King of Israel
58 George III of the United Kingdom
59 Immanuel Kant
60 James Cook
61 John Adams
62 Richard Wagner
63 Pyotr Ilyich Tchaikovsky
64 Voltaire
65 Saint Peter
66 Andrew Jackson
67 Constantine the Great
68 Socrates
69 Elvis Presley
70 William the Conqueror
71 John F. Kennedy
72 Augustine of Hippo
73 Vincent van Gogh
74 Nicolaus Copernicus
75 Vladimir Lenin
76 Robert E. Lee
77 Oscar Wilde
78 Charles II of England
79 Cicero
80 Jean-Jacques Rousseau
81 Francis Bacon
82 Richard Nixon
83 Louis XVI of France
84 Charles V, Holy Roman Emperor
85 King Arthur
86 Michelangelo
87 Philip II of Spain
88 Johann Wolfgang von Goethe
89 Ali, founder of Sufism
90 Thomas Aquinas
91 Pope John Paul II
92 René Descartes
93 Nikola Tesla
94 Harry S. Truman
95 Joan of Arc
96 Dante Alighieri
97 Otto von Bismarck
98 Grover Cleveland
99 John Calvin
100 John Locke

It’s easy to point out how bizarre some of this list is and plenty of people did that in the comments. Of course it wouldn’t make any difference who was in the list and in what order because someone would find it strange. But I will make a few points about the list and quote a few of the better judgements made by commenters…

While there is good reason to believe Jesus didn’t even exist I think the (probably fictitious) character should be near the top because there is no doubt that the religion his followers founded has been incredibly influential: in both good and bad ways. The same applies to Muhammad.

Similar points were made by commenters, such as this one: “There’s zero [I disagree with the word zero here] evidence that Jesus ever actually existed, therefore he should not be included in this list. If you’re going to include Jesus then you may as well also include Superman and Batman.”

A similar point might be made about King David (position 57) and King Arthur (who is almost certainly a fictitious figure) at position 85. And what about Socrates (at position 68)? There’s some question regarding whether he really existed as well. Still, you could make an argument to say that idealised characters can be even more influential than real people.

The list is clearly western (and especially American) focussed. George W. Bush at 36? Really? The inclusion of so many other American presidents in general is totally ridiculous. To the world as a whole most of these people were completely irrelevant.

Here’s a comment about Bush I liked: “If you guys believe that G W Bush belongs in such a list of greats then you should be fair and give a shot to Homer Simpson and Sponge Bob!”

As well as being very western-centric the list is also very male-centric. You might say that until recently women have had little chance to make big contributions but surely we could at least have had Marie Curie who is usually listed amongst top scientists.

And then the women rulers who are there (Elizabeth I of England at 13, Queen Victoria at 16) are absurdly listed ahead of the greatest scientist ever, Isaac Newton (at 21). Are they for real? Political leaders might have had a lot of influence at the time they were in power but in the long term scientific progress is far more important.

In fact I find the lack of scientists, mathematicians, and engineers bizarre. Does anyone really believe that Richard Nixon (at 82) was more important than people like Turing, Euler, or Maxwell (who aren’t even on the list)? Surely not! And what about King Henry VIII at 11? A fat buffoon who started a series of pointless wars, murdered his wives and political opponents, and created a religion for his own benefit is important but the originator of quantum theory isn’t?

Maybe the most ironic thing of all is that none of the inventors of the computer which made the creation and distribution of the list possible are actually on it.

A commenter said: “A pretty weak list all around. Not enough scientists/inventors (Pasteur, James Watt, Faraday, Maxwell, Clausius, Lavoisier, Kepler, Hutton, Heisenberg, for starters). Way too many presidents and heads of state who didn’t do anything unique.”

I hope the algorithm used to create the list was faulty, because if these really are the most important people to our society then there really is no hope!

Maybe Martin Gardner really did have a point, after all.