Wednesday, April 22, 2015

The earliest reference to a telescope: England 1551?

“And hereof came it that Fryer Bakon was accompted so greate a negromancier, whiche never used that arte (by any coniecture that I can fynde) but was in geometrie and other mathematicall sciences so expert, that he coulde dooe by theim suche thynges as were wonderfall in the syght of most people.

“Great talke there is of a glasse that he made in Oxforde, in which men myght see thynges that were doon in other places, and that was iudged to be done by power of euyll [evil] spirites. But I knowe the reason of it bee good and naturall, and to be wrought by geometrie (sythe [since] perspective is a parte of it) and to stande as well with reason as to see your face in common glasse.”

The quotation above comes from the preface to a textbook on geometry called The Pathway to Knowledge, published in London in 1551. It was written by a doctor and mathematician called Robert Record (1512 – 58). His arithmetic textbook, Ground of the Arts, first published in 1543, was popular with students by virtue of being in English. It went through over 40 editions right up to the end of the seventeenth century.  Record is probably best known for his invention of the equals sign.  However, despite his relative success as an author, he died in a debtors’ prison.

Of course, the quotation is most interesting because it describes a device that sounds much like a telescope, sixty years or so before the telescope was supposed to have been invented.  “Fryer Bakon” is Roger Bacon OFM, the Franciscan scholar of the thirteenth century, famous for his Opus Maius and Opus Minus, who lectured at the Universities of Oxford and Paris.  He may have been (but probably wasn’t) imprisoned for a time for his adherence to the ultra-ascetic wing of the Franciscans.  His reputation for necromancy was a not uncommon trope in sixteenth-century England, where he was a famous historical figure.  However, the accusation of black magic is almost always found in the context of a denial that he was, in fact, a magician.

Record’s preface is a good example of this kind of defence of Roger Bacon.  The specific charge is that Bacon had a device that allowed him to see what was going on in other places.  Record says the device was not magical but used Bacon’s knowledge of perspectiva, what we would call geometrical optics.  Bacon was indeed familiar with this subject and wrote a treatise on it.  In this treatise, he mentions magnifying glasses and, as it happens, spectacles were invented in Italy shortly thereafter.
To be clear, there is no evidence that Bacon had any device to see different places, magical or otherwise.  What interests me is what Record thought Bacon had invented.  The device mentioned in the quotation does not sound like a magnifying glass or spectacles.  In any case, if that was what Record had in mind, he would just have said so (probably calling a magnifying glass a “perspective glass”, which confusingly was also an early term for the telescope). 

I think there are three possible interpretations of Record’s words:

  • Record has no idea what he thinks Bacon’s device was.  He just wants to reassure his readers that it would have been built on mathematical and not necromantic principles.  This is possible.  Record is making a point that geometry is jolly useful.  But the passage reads as if he knew what the device was supposed to be and how it worked.

  • Record thinks the device was a periscope.  These had been invented a hundred years before Record wrote by Johannes Gutenberg, who lost money on the venture.  Gutenberg later had more success as a pioneer printer.  The trouble is, periscopes don’t really show you what is going on in another place.  But they do allow you to see around corners, so this interpretation is a possibility.

  • Record has in mind a device like a telescope that really does let the user see things where he isn’t.  Hans Lippershey famously applied for a patent on the first telescope in 1608, but several others claimed to have invented it.  This certainly best fits the context of the passage but would require that Record knew what a telescope was before it was supposedly invented.

So could Record really have known about a telescope as early as 1551?  Astronomer Colin Ronan has claimed that it was invented by a Kentish mathematician and astronomer called Leonard Digges (d. 1559).  The claim is actually made by Digges's son Thomas (d. 1595) on his father’s behalf.  Digges Jr produced an edition of his father’s book on practical geometry called Pantometria, which was published in 1571.  In the introduction, he notes:

“… my father by his continual painful practices, assisted with demonstrations Mathematical, was able, and sundry times hath by proportional Glasses duly situate in convenient angles, not only discovered things far off, read letters, numbered pieces of money with the very coin and superscription thereof, cast by some of his friends of purpose upon downs in open fields, but also seven miles off declared what hath been done at that instant in private places.”

This does sound a lot like the device mentioned more briefly by Record in the preface to The Pathway to Knowledge.  It would be great to be able to link Record and Digges directly.  Unfortunately, Record was based in Cambridge in the 1540s, Digges in Kent.  But Record was reasonably well-known after 1543 thanks to his arithmetic book Ground of the Arts.  Among the small community of English mathematicians, Digges and Record, both avid Protestants, could have met.


Overall, I think there is a good chance that Record is referring to a telescopic device in 1551 and, if so, this is most likely to be the same one that Digges had invented.  At least, Record seems aware that a telescope exists even if he has not seen one.  If this is the case, it is evidence that Digges really did create a telescope before 1551 and makes Record’s preface the earliest reference to it.

Discuss this post at the Quodlibeta Forum

Thursday, January 29, 2015

Islam and science have problems with their relationship

In August 2013, Richard Dawkins set off one of his periodic bouts of controversy by tweeting that Trinity College, Cambridge had produced many more Nobel Laureates than the entire Muslim world.  While no one could deny that his tweet was objectively correct, any serious point he might have been making was drowned out by the condemnation of his quasi-racist language.  This is a shame because, unfortunately, science in much of the Muslim world really is in crisis.

Nidhal Guessoum (who is associated with this website), a professor of Physics and Astronomy at the American University of Sharjah in the United Arab Emirates, likes to ask his students and colleagues about their scientific beliefs.  Despite living just down the road from the multinational entrepôt of Dubai, he has found that only about ten per cent of Muslims at the university accept that human beings evolved from other animals.  We should bear in mind that Guessoum is asking only undergraduates and members of the university faculty.  The population at large is likely to be even more dismissive of Darwin.  In comparison, Gallup polls of Americans have consistently found that half of those surveyed accept that humans are descended from apes.  We tend to think of the United States as a haven for creationists, but it has nothing on the UAE.
If the problem of science among Muslims were confined to rejecting Darwin, we would at least be confronting an opponent familiar from debates with Christian creationists.  But, as Professor Guessoum explains in Islam’s Quantum Question, science in the Islamic world has further problems.  He’s written the book to help explain Muslim attitudes towards science and discuss ways that the situation can be improved.  Unfortunately, besides creationism, there are several other serious threats to a harmonious relationship between Islam and modern science.
The first threat is the claim that science is an imperialist cultural artefact with no objective claim to truth.  This leaves people in Islamic countries free to reject science as a colonial imposition.  The situation is made worse by left-wing professors in the West.  Muslim intellectuals in the United States and United Kingdom, who have drunk deep of the beguiling draught of postmodernism, have attempted to build an Islamic science better suited to their co-religionists.  Thinkers like Ziauddin Sardar and Seyyed Hossein Nasr are not household names, but their rejection of western science as incompatible with Islam has become conventional wisdom in many Muslim countries. Of course, their attempts to create an alternative natural philosophy have been an abject failure and they have been reduced to bickering among themselves.  In Islam’s Quantum Question, Guessoum is always impeccably polite, but it is clear he despairs of views like these.  His ridicule of Sardar, Nasr and their fellow travellers is all the more devastating for being so gently expressed.  Guessoum knows perfectly well that science is universal.  Its truths are the same everywhere. The idea of a specifically Islamic science makes as little sense as a Christian or atheist one.  This is why Abdus Salam, who won the Nobel Prize for Physics in 1979 for his work on the weak nuclear force, is one of Guessoum’s heroes.  Salam saw himself as an ambassador for science to the developing world.  It goes without saying that he found no conflict between his own Muslim devotion and his epochal work in nuclear physics.
A second threat is the low status of science in Muslim-majority countries.  For example, at the beginning of his book, Guessoum mentions two competitions for the pupils at his son’s school in the UAE.  One was for a science project that he was asked to judge.  It is fair to say he was less than impressed by the quality of many of the entries, but attendance was so sparse that there were few people to notice.  Three days later, the school held its annual Koranic memorisation competition.  Hundreds of parents, the local media and various guests of honour crammed into the school hall to witness prizes totalling $20,000 being handed out to the pupils.  With educational priorities like these, it is hardly surprising that the Muslim world has to import scientists and engineers from the West or send its own sons and daughters to be trained there.
The third threat is even more insidious.  The school of I’jaz teaches that the findings of modern science have been miraculously present in the Koran all along.  The verse “the Originator of the heavens and the earth! When he decrees a thing, he says only: ‘Be!’ And it is.” (Q 2:117) is taken as a reference to the Big Bang.  But proponents of I’jaz make even more surprising claims.  They claim to derive the speed of light, roughly 300,000,000 metres per second, from the verse “He directeth the ordinance from the heaven unto the earth; then it ascendeth unto Him in a Day, whereof the measure is a thousand years of that ye reckon.” (Q 32:5).  Guessoum devotes one of the appendices of his book to refuting this “calculation” of the speed of light.  But there are many other examples taken extremely seriously and he is clearly angered that I’jaz is so influential.  Unfortunately, despite having much in common with the Bible Code craze of a few years back, I’jaz is fast becoming mainstream in Muslim countries.  Guessoum found that 80% of the Muslim faculty and students at his university in the UAE believed that the Koran contains explicit statements now known to be scientific facts.
Guessoum himself suffers from none of these misapprehensions.  He is a believing Muslim but his scientific views are similar to those of most of his Western colleagues: he wholeheartedly accepts Darwin’s theory (rejecting intelligent design) and sees science as universal rather than local.  He rejects I’jaz but does see the merit, like many Christians, of the fine-tuning argument and theistic evolution.

Guessoum’s experience shows that reconciling Islam and science is a problem that has already been solved.  Islam’s awesome thirteen hundred years of scholarship has already furnished the answers in this debate.  All that is needed is to retool the arguments developed centuries ago to make them fit for the modern era.  The original debate between Islamic and foreign sciences took place in the ninth to twelfth centuries when ancient Greek natural philosophy and mathematics were first translated into Arabic.  Admittedly, back then, in a high-scoring game, the mystics eventually prevailed with a late winner from Al-Ghazzali (d. 1111).  Warning Muslims against the work of Euclid and Ptolemy, Al-Ghazzali said they are “the preliminary to the sciences of the ancients, which contain wrong and harmful creeds.”  He was probably talking specifically about astrology but, as his influence has waxed, the sciences in the Islamic world have waned.

That is not to say that the traditional picture of Al-Ghazzali snuffing out Golden Age science is accurate.  The astronomical work of Nasir al-Din al-Tusi (d. 1274) and Ibn al-Shatir (d. 1375) alone refutes that theory.  The mathematical models of both these scholars were used, unacknowledged, by Nicolaus Copernicus (d. 1543) in his Revolutions of the Heavenly Spheres.  Nidhal Guessoum’s own favourite Islamic thinker is Ibn Rushd, known as Averroes in the West.  He took on the challenge of Al-Ghazzali and has been a pariah among conservative Muslims ever since.  Only in the West is Averroes hailed as one of the most important thinkers in history.

So obviously, science in the Islamic world did not break the mould in the way that it did in the West.  But, as George Saliba notes in his Islamic Science and the Making of the European Renaissance, the question of why modern science didn’t arise in the Muslim world is the wrong one to ask.  It didn’t arise in all sorts of advanced civilisations including China or India; ancient Greece and Rome; or Sassanid Persia and its great antagonist Byzantium.  Instead, we should be wondering why a recognisably modern science had arisen in the West by the end of the nineteenth century.  That this didn’t happen elsewhere isn’t because of the deficiencies of other societies.  It’s just that there was a unique conjunction of historical contingencies in one place and time.  Exactly what those contingencies were remains a matter of much debate.

What, then, is the solution to Islam’s quantum question?

There are some clues in Guessoum’s book.  One element is the need to ensure that any discussion of science is grounded in the Koran.  The esteem in which this book is held among Muslims is well known.  Since it is full of injunctions to observe and understand nature, there is strong support for science to be found within its pages.  It also supports a philosophy of the unity and predictability of nature, which accords well with the axioms of modern science.

Obviously, Koranic literalism can be unhelpful.  Luckily, there is a history of interpretation that allows the Koran to be read in a figurative rather than literal way where necessary.  A passage that looks like a straightforward statement of fact is likely to also have a range of metaphorical and religious interpretations.  Guessoum warns of some pitfalls in this approach.  For instance, the Arabic word commonly translated as science today, ‘ilm, has the wider meaning of “knowledge” in classical Arabic.  Nonetheless, the essential lesson is that revering the Koran as the word of God does not also mean having to treat it as a scientific textbook.

To a great extent, the relationship between science and Christianity is of academic interest only.  Readers of this blog might find the subject fascinating but it only rarely impinges on public life.  When it does, the issue in question is almost always creationism which most scholars in the field regard as one of the subject’s least interesting manifestations.  The situation among Muslims is different.  For them, the question of how to reconcile science to Islam is of epochal importance.  The best-case scenario could well see them in a better place than the West – a science that recognises its ethical boundaries and rejects the naïve utilitarianism of so many western scientists.  But for the present, the story is much less encouraging.  Nidhal Guessoum is in no doubt that the relationship between science and Islam is highly problematic and that this is holding back the development of Muslim societies.  Sadly, there is little that western Christians can do about this.  

Given the importance of its subject-matter, it is unfortunate that Islam’s Quantum Question is such a poorly organised and written book.  Even the title is a misnomer – Guessoum tells us early on he’s got hardly anything to say about quantum mechanics.  The book was originally written in French and, as far as I can tell, Professor Guessoum translated it into English himself.  The result is difficult to read and even harder to follow.  For most readers, the amount of new material is more than can be easily swallowed.  Muslim thinkers come thick and fast, sometimes referred to by their surnames and sometimes by their given names.  Keeping track of who is who and what they all think becomes a serious challenge.  It’s not even clear whom the book is for.  There is a lot of material which looks like it is aimed at an audience of western non-Muslims.  But Guessoum also spends a great deal of space elucidating the basic philosophy of science and presenting evidence that evolution is true.

Deep within his book there is an essential text fighting to get out.  There is no doubting the significance or the urgency of the issues it raises.  Thus, despite its faults as a piece of writing, it is something that everyone interested in the interface between science and religion should read.

This article is a much expanded version of a review originally published in Science and Christian Belief 26(2) 2014.

Discuss this post at the Quodlibeta Forum

Friday, January 09, 2015

We know less about the ancient world than we think we do

On 15 June 763BC, a near total eclipse of the sun was visible over a swathe of the Near East.  As luck would have it, the event was noted in the official list of Assyrian high officials.  This record provides the earliest absolute and uncontroversial date in ancient history.  Using lists of kings and the chronicles of events, historians have counted the years back from this date to construct the chronology of ancient history.  
Radiocarbon analysis (which measures the decay of carbon 14, an unstable isotope) and the predictable styles of pottery found in digs both provide corroborating evidence.  Dating the layers of archaeological remains from the artefacts found within them is called stratigraphy and can yield quite precise results.  The vast number of potsherds that have been unearthed allows archaeologists to use statistical methods to screen out random noise and anomalous samples that have found their way into the wrong strata.  Of course, pottery and radiocarbon methods need to be calibrated to produce absolute dates.  This has been done using samples of wood whose age can be determined by matching patterns of tree rings, a technique called dendrochronology.  We can count back sequences of tree rings from the present day, all the way to 2000BC.  By carbon dating the oldest samples of wood, we can tie the tree ring record to the results from carbon 14 decay.
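For readers who want the arithmetic behind radiocarbon dating, here is a minimal sketch in Python of how a raw radiocarbon age follows from the fraction of carbon 14 a sample still contains.  The 5,730-year half-life is the standard modern figure; the sample fraction is invented purely for illustration, and a real laboratory would still calibrate the raw result against the tree-ring record described above.
    import math

    HALF_LIFE_C14 = 5730.0                        # years; modern half-life of carbon 14
    DECAY_CONSTANT = math.log(2) / HALF_LIFE_C14

    def radiocarbon_age(fraction_remaining):
        # Years elapsed for a sample retaining this fraction of its original carbon 14.
        return math.log(1.0 / fraction_remaining) / DECAY_CONSTANT

    # A hypothetical sample retaining 64% of its original carbon 14:
    print(round(radiocarbon_age(0.64)))           # about 3,700 years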
By 1990, all these clues had yielded a multi-dimensional jigsaw which fitted together to almost everyone’s satisfaction.  There were a few heretics like Peter James, who suggested in his book Centuries of Darkness that the conventional chronology included two hundred additional years around 1000BC.  Thus remains conventionally dated to 1050BC would actually date from 850BC.  Although James’s book is an excellent read, it fails to convince.
Nonetheless, it has now turned out that the conventional chronology was not as secure as everybody else thought.  While James was convinced ancient history was two centuries too long, new evidence has begun to pile up in the opposite direction: it now looks like the conventional chronology is up to 150 years too short.  To put it another way, a cataclysm that everyone thought occurred in 1500BC actually happened before 1620BC.  The event in question was the massive eruption of the island of Thera in the Aegean Sea.  
Conventional chronology dated the end of the Minoan age in Crete to 1450BC.  Archaeologists assumed that the Thera eruption (on the modern island of Santorini) and its resulting tsunami had destroyed the Minoan fleet, leaving them vulnerable to raiders from the mainland.  Certainly, the havoc wrought by the volcano can clearly be seen across the Eastern Mediterranean.  When Thera exploded, it blasted 60 cubic kilometres of rock into the atmosphere, which settled over Asia Minor.  The resulting layer of ash and pumice is used to date the sites where it is observed.  And the eruption had other effects.  Sulphur dioxide released by the volcano spread across the northern hemisphere and fell to earth as acid rain or, more significantly, as acid snow.  At the poles, not all of that snow has yet melted and, from the 1990s, it has provided a new strand of evidence to date the eruption.
Ice cores, drilled from the icecap of central Greenland, record the depth of each annual snowfall.  The ice holds within it information on the constitution of the atmosphere going back tens of thousands of years.  Like tree rings, each layer can be counted so as to give an absolute rather than relative date.  Big volcanic eruptions show up as spikes in the sulphur content of the annual fall of snow: Krakatau in 1883; Tambora in 1815; Vesuvius in AD79.  Despite the presence of literate civilisations in Egypt, the Levant and Babylon, no written record of the Thera eruption exists, but the ice cores should overcome that deficiency and provide an absolute date for the cataclysm.
Actually, the fact that the Thera event went unrecorded is less surprising than it seems.  Mankind has been remarkably unobservant of enormous volcanic eruptions.  An event in 1257AD, less than 800 years ago, is indelibly imprinted into both the Greenland and Antarctic ice cores.  It was greater in size even than Tambora and thus the largest eruption in the last ten thousand years.  But remarkably, no one knows where it happened.  Only in 2012 did Mt Rinjani in Indonesia emerge as a likely candidate.  Another big eruption, as recent as 1809, remains unidentified.
By 2000, the Greenland ice cores had revealed that Thera could not have happened when everyone thought it had.  The most likely anomaly in the ice dated from 1640BC, but this turned out to be from a volcano in Alaska.  At the same time, carbon dating an olive tree buried in the Aegean eruption yielded a date of around 1620BC.  Sulphur traces in the ice have been found that correspond to this date, although they are not as strong as might be expected.  Now, the dendrochronologists have piled in.  The Thera eruption would have caused unusually cold weather which stunted plant growth across the globe.  Evidence from bristlecone pines in the western United States, oak trees in Ireland and Swedish pines all point to a cold snap in 1627BC.  This is consistent with what we’d expect from a big volcano blowing its top in the Mediterranean.  Evidence from the Antarctic ice cores should be in shortly, but for a northern hemisphere volcano, this is unlikely to be conclusive.
The lack of a definitive date for the Thera disaster is frustrating, but we can now be reasonably sure it occurred 120 years earlier than thought.  The implications of this for ancient history are immense.  The chronology of the New Kingdom of Egypt was thought to be rock solid.  The need to find room for a dozen more decades has so far been too disconcerting for Egyptologists to tackle.  There is a good chance that the extra years belong in a period after the well-documented New Kingdom called the Third Intermediate Period.
For historians of Babylonia, the crisis has been less existential.  Absolute dates for the second half of the second millennium are based on ancient observations of the planet Venus.  We know from modern calculations that a particular configuration of Venus recorded during the eighth year of the reign of a certain King Ammisaduqa must have occurred in 1702BC, 1646BC, 1582BC or 1550BC.  Other events in Babylonian history, such as the reign of King Hammurabi (famous for his law code) and the sack of Babylon by the Hittites are arranged around whichever absolute date is most convenient.  That some of these possible Venusian dates differ by 120 years, about the same length of time that the Thera eruption has been moved back, is highly suggestive to say the least.
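Incidentally, the reason there is a menu of candidate dates rather than a single one is Venus’s own regularity: its configurations very nearly repeat every eight years, so an ancient observation pins the date down only to within whole numbers of that cycle.  A minimal check in Python, assuming nothing beyond the familiar eight-year Venus cycle and the four dates quoted above:
    candidates = [1702, 1646, 1582, 1550]             # BC, possible dates for Ammisaduqa year 8
    gaps = [a - b for a, b in zip(candidates, candidates[1:])]
    print(gaps)                                       # [56, 64, 32]
    print(all(gap % 8 == 0 for gap in gaps))          # True: each gap is a whole number of cycles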

So, where does all this leave biblical chronology?  That remains very unclear.  But the redating of Thera shows that we know a lot less about when things happened in the ancient world than we thought we did.
Discuss this post at the Quodlibeta Forum

Friday, January 02, 2015

The British Medical Association thinks it is bad for doctors to work weekends. Why do we still treat the medical profession as special?

In this age of public cynicism, few professions remain in high public esteem. No one ever liked journalists, politicians or estate agents. But in recent years, bankers and lawyers have become much less trusted: for good reason, you might say. The teaching unions continue with their long campaign to undermine the regard which much of the public still have for their members. But doctors have bucked this trend. Even recent scandals in the National Health Service (conveniently blamed on managers) haven’t really dented the way the public see physicians, or how physicians see themselves.

That isn’t too surprising. Doctors can do amazing things to heal us. They save lives habitually. In some circles, it is sacrilege even to criticise them. But, remarkably, doctors enjoyed a healthy professional reputation even back in the days when they couldn’t really help us at all. The miracle of modern medicine is much more recent than we realise. It was only from the mid-nineteenth century that doctors were more likely to cure than kill. Our expectation that we won’t die of infectious disease dates from after the Second World War.

It’s impossible to overstate just how useless pre-modern medicine was. If you fell ill, there was nothing, and I mean absolutely nothing, that a doctor could do to cure you. Granted, he had plenty of treatments and his learning was considerable. But bleeding, purgatives and the like would do you more harm than good. In essence, doctors were charging fat fees to hasten patients towards the grave.
Actually, I was slightly exaggerating when I said doctors could do nothing. There were some drugs available, like opium, to lessen pain. But you didn’t need a doctor to access these drugs and, although opium could reduce discomfort, you wouldn’t be cured. It was palliative only. Luckily for them, doctors did have another trick up their sleeves, although they did not know it. It’s called the placebo effect.

It’s well known that when you give a patient a sugar pill, something with no active ingredients, it can have marked beneficial effects. The mere fact that the patient thinks that they are being treated with an effective medicine makes them better able to heal themselves. And this effect is even more marked if the doctor himself thinks he is doing some good. That’s why new drugs are tested using the double-blind method. Patients are divided into two groups. One group is given the drug under test and the other is given a placebo. It’s called double-blind testing because the researchers giving the drug don’t know which is which any more than the subjects do. Only a second lot of researchers, who never actually come into contact with the patients, know who has received the real drug and who has received the fake.
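To make the mechanics concrete, here is a minimal sketch in Python of a double-blind allocation, with invented patient numbers and pack labels: the randomisation key is generated and held by people who never meet the patients, while clinicians and patients see only coded packs.

    import random

    def allocate(patient_ids, seed=2015):
        # An independent statistician generates the allocation and keeps the key.
        rng = random.Random(seed)
        key = {pid: rng.choice(["drug", "placebo"]) for pid in patient_ids}
        # Clinicians and patients only ever see anonymous pack codes.
        coded_packs = {pid: "PACK-%03d" % pid for pid in patient_ids}
        return key, coded_packs

    key, packs = allocate(range(1, 9))
    print(packs[3])      # e.g. 'PACK-003': no hint of drug or placebo
    # Only after all the results are recorded is the key opened and the groups compared.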

So, a doctor in the eighteenth century, with his training and aura of competence, could help his patients cure themselves merely because all parties thought that he could. This might even offset the damage that the doctor was doing by administering dangerous drugs or ordering bleeding. Clearly, the doctors who could best help their patients were the ones who didn’t do anything besides having a reassuring bedside manner and giving out harmless placebos. That’s generally what village healers and cunning folk did. Their magical cures were less likely to hurt you than the treatments of professional doctors. Most effective of all was praying at a saint’s shrine. If you believed in it, prayer would do as much good as a visit to the doctor, and it was unlikely to do you any harm at all. Physicians made their living by cloaking themselves in learning, jargon and professional qualifications. But it was all an illusion. No matter how many long years they studied Galen and Avicenna, they couldn’t help their patients one jot.

Incidentally, that’s how homeopathy got going. It was founded by Samuel Hahnemann in 1796 while doctors were still more likely to be licensed killers than saviours. Now, I hope I won’t offend anyone when I say that homeopathic medicines do precisely nothing. They rely entirely on the power of suggestion – in other words the placebo effect. But when homeopathy was founded, doing nothing could be a huge improvement on conventional treatments. So, it appeared to work better. This meant that homeopaths gained a respected place in British medicine that they have never really relinquished. Homeopathy is still available on the National Health Service.

All this raises a slightly disconcerting question. If doctors could maintain a professional reputation back when they couldn’t help their patients, is some of the reverence in which we hold them today really just a function of good public relations? That’s not to say that today’s medical professionals don’t deserve a large measure of respect. But placing them on a pedestal doesn’t do us or them any good at all. So when the British Medical Association say that doctors are too important to work at weekends, we should treat the suggestion with the scorn it deserves.

Discuss this post at the Quodlibeta Forum

Thursday, December 11, 2014

The Perils of Earthquake Prediction and Political Hubris


Six seismologists were finally acquitted last month over rash reassurances given before the L’Aquila earthquake of 2009.  They were victims of unreasonable expectations, not of scientific ignorance.
At 3am on 6 April 2009, an earthquake measuring 6.2 on the Richter scale devastated the medieval city of L’Aquila in the Apennine Mountains of central Italy.  Over three hundred people died and the city’s cultural treasures were left in a parlous state.  But it is events that unfolded shortly before the quake that have continued to attract worldwide attention. 

Six days before the disaster, a government committee of six seismologists and a public official tried to dampen down fears that an earthquake was imminent.  In particular, the one member of the committee who was not a scientist, Bernardo De Bernardinis, stated that there was “no danger”.  In 2012, a local court convicted the committee members of involuntary manslaughter.  When they were first charged, numerous professors, decorated with the weightiest of credentials, wrote letters attacking the prosecutors.  Putting these men on trial was an affront to the dignity of science, they cried.  When the seven were found guilty, the cacophony of outrage doubled in volume.  The L’Aquila seven joined Galileo as paradigms of scientific martyrdom.
The wheels of Italian justice turn extremely slowly and only now have appeals against the decision of the local court been handed down.  The six scientists have had their convictions quashed, but that of De Bernardinis, who said there was no danger, was upheld.  Further appeals are still possible.

So what was really going on?  The world’s media misreported the 2012 trial with an even greater level of ineptitude than usual.  No prosecutor had alleged that failing to predict the earthquake was a criminal offence.  This was because predicting an earthquake is impossible.  The record of failure is long and inglorious.  We’ve only recently found out why earthquakes happen at all.  Aristotle thought they were a result of vapours escaping from the soil.  In the eighteenth century, some theorists blamed lightning strikes.  The theory of continental drift, proposed in the early twentieth century, means that we now understand what causes the ground to shake.  But mainstream geologists long derided the idea, and it only achieved widespread acceptance, in the form of plate tectonics, in the 1960s.
Prediction remains a pipedream.  Studies of animals have found that, while they can act strangely before a major quake, plenty of more innocent events set off the same behaviour.  A retired engineer claimed that an earthquake was looming at L’Aquila because he was picking up higher readings of a radioactive gas.  But again, this also happens when no earthquake is due.  Foreshocks, such as those felt at L’Aquila, occur before about half of large quakes.  In contrast, large quakes follow foreshocks on only about one occasion in fifty.  Thus, major seismic events do give some warning signs.  It’s just that those warnings don’t usually presage a serious earthquake.  In the jargon, “false positives” are far more common than true predictors.  Just imagine if scientists demanded the evacuation of Los Angeles, promising the big one was around the corner, and then nothing happened.  That is the most likely outcome given the current state of knowledge.
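A back-of-the-envelope sketch makes the false-positive problem vivid.  Taking the rough figures above at face value, only about one foreshock-like episode in fifty is followed by a big quake, so a policy of evacuating on every warning sign would cry wolf dozens of times for each genuine event (the 1,000-episode figure below is invented purely for illustration).
    episodes = 1000                                   # hypothetical foreshock-like tremor episodes
    p_quake_given_tremor = 1 / 50                     # rough figure quoted above
    true_warnings = episodes * p_quake_given_tremor   # 20 episodes actually precede a big quake
    false_alarms = episodes - true_warnings           # 980 come to nothing
    print(false_alarms / true_warnings)               # 49 false alarms for every real event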

So, the seven Italians were not convicted of failing to predict the disaster in L’Aquila.  Rather, they were accused of going about their duties negligently.  And negligence that causes death is often characterised as manslaughter.  Given that De Bernardinis assured the public they were completely safe, the failure of his appeal seems fair.  But the seismologists were in the impossible position of not knowing what the risk was, just that it probably wasn’t very great.

One question the case raises is the extent to which scientists should be held accountable.  The implication of many of the L’Aquila seven’s defenders is that scientists should be given carte blanche to say what they like.  Anything else would obstruct free enquiry.  But that can’t be right.  A scientist who carried out their work without due care or made off-the-cuff pronouncements would surely be culpable.  Given that we cannot predict earthquakes, a confident statement that no earthquake was due would be as bad as saying that it was imminent.  As in many other fields, an honest mistake is a defence, but negligence is not.  Scientists enjoy the status of latter-day sages.  Many imagine that their methods provide the only road to truth, not only in physics and biology, but in the social sciences as well.  So perhaps the message from L’Aquila is that in making those claims, scientists unintentionally erect expectations that they cannot possibly meet.
Still, there is another way of looking at the case of L’Aquila.  The government set up a committee to advise on earthquakes and the people of the city felt betrayed when it failed to protect them.  No matter that the state could no more control the ground than Canute could the tide.  Like so many westerners, the citizens of L’Aquila thought that their government was an indomitable Leviathan.  The media fuels this mood with its constant refrain that “something must be done” even when there is patently nothing that can be.  So grandiose has the rhetoric of the state become that people imagine the reality should match the words.  Even in a country like Italy, where the government is so self-evidently incompetent, it is still expected to be in control of events that are intrinsically beyond control.  Extending the scope of government to the scientifically impossible, such as earthquake prediction, is only a small step beyond expecting it to achieve the economically impossible by “kick-starting the economy” or the mathematically illogical feat of preserving generous entitlements without raising taxes or cutting other spending.

We have come to expect too much from our politicians and our scientists.  What is needed is for both professions to become more humble.  Otherwise, they can expect to be severely punished if they can’t live up to their rhetoric.
Discuss this post at the Quodlibeta Forum

Saturday, July 26, 2014

Text wanted

I'm looking for an introduction to philosophy text that meets the following criteria:

1. It's arranged topically, not historically.
2. It nevertheless deals with these topics by going over how various thinkers throughout history have addressed them -- although not exhaustively of course.
3. It's actually introductory, for people who haven't taken philosophy before.
4. It's inexpensive.

I had used Does the Center Hold? but its chapter on metaphysics was exclusively on philosophy of mind (not really dealing with space and time, causality, etc.), and the chapter on philosophy of religion was just terrible. Anyone have any ideas?

Discuss this post at the Quodlibeta Forum

Sunday, June 22, 2014

The Trace of God by Joe Hinman

“Are mystical experiences real?” asks Joe Hinman in his new book, The Trace of God: A Rational Warrant for Belief. It should not be too much of a spoiler to reveal that he concludes that they are. Joe has distilled the research on mysticism since William James to determine the commonalities of these experiences and to ask how much they can tell us about aspects of reality not readily accessible to everyday experience.

There are plenty of problems with studying mystical or “peak” experiences. For a start, they are highly subjective and extremely difficult to describe. Francis Spufford gives it a go in his excellent book Unapologetic, but reading about someone else’s experience is always second best. I am a decidedly non-mystical person and so I find the subject alien, if quite fascinating. Joe introduces us to Ralph Hood Jr’s “M scale”, which attempts to provide a measure of mystical experiences so that they can be compared and validated. In fact, it turns out that there has been a great deal more work in this area than you might imagine. This has revealed uniformities that mean we can certainly group these experiences into a single category.

But are they real? The standard rationalist response is to dismiss mysticism as something that goes on inside our heads, often aided by illicit chemicals. It has no external cause and so it is of interest to neurologists and hippies only. It certainly can’t tell us anything about God. As Joe explains, the problem with this dismissal is that all our experiences are ultimately subjective. We never enjoy unmediated access to reality, but only the most radical solipsist would claim this means that reality doesn’t exist. We know when we are awake and quite often know when we are dreaming. Mysticism isn’t like that – it feels more real than everyday life.

Still, the strength of science, says the rationalist, is that it overcomes subjectivism by insisting on repeatability. Joe marshals Thomas Kuhn and other sociologists of science to argue that scientists are just as prone to herd mentality as the rest of us. I’m not sure this goes far enough to mean a mystical experience can claim parity of subjectivity with a laboratory experiment. But Joe doesn’t want to take things that far. He just argues that the mere fact that mystical life is subjective does not rule it out of court as a valid experience from which we can extrapolate knowledge. His basic argument is that we are justified in accepting religious truths on the ground of our own experiences (what is called the “religious a priori”). Thus, mysticism can provide us with a rational warrant for religious belief.

The bulk of The Trace of God is taken up by detailed rebuttals of sceptical arguments against mysticism: that it is just emotions and feelings, caused by drugs or brain chemistry. Joe blunts these arguments, but he would be the first to admit that he has not proved that mystic experiences are not purely internal. However, by showing that rationalists cannot invalidate mysticism, he leaves the road open to his own argument: these experiences are evidence of God in the way that a footprint is evidence of a wild animal. Mysticism provides a trace that gives us a rational warrant to postulate the existence of the being that gave rise to the spoor. We’re not dealing in proof here. In that respect, Joe’s project is similar to Alvin Plantinga’s work on warrant. But whereas Plantinga floats his justifications on rarefied philosophical air, Joe builds on the solid ground of widely experienced phenomena. No one, as far as I am aware, believes in God because of philosophy. Plenty of people base their religious faith on mystical experience.

This leaves us with two difficult questions: is belief in God warranted by someone else’s mystical experience? And where does religious doctrine fit into experiences that can be wildly inconsistent? Joe doesn’t really deal with the first of these. He concludes that the evidence of mysticism provides him with warrant for knowledge about God that he has anyway. It doesn’t seem to provide much evidence for the non-believer unless that non-believer is willing to invest in a religious interpretation of these experiences. As far as apologetics goes, this is a “come on in, the water’s lovely” argument. 

And what about doctrine? Joe, like me, is an orthodox Christian of liberal persuasion. One senses that, for him, the universal aspects of mysticism are an advantage not a problem. A Christianity that damned the rest of humanity (or worse a Christian sect that damned most Christians into the bargain) is not one that Joe or I would be comfortable with. If mystical experiences provide warrant for believing in God, they also provide evidence of God’s interest in all of humanity. Joe distinguishes between knowing God “face to face” and the knowledge of doctrine. He finds evidence for the distinction in the writings of Paul: the man who had the most famous mystical experience in history. 

Overall, as a first book that breaks new ground in the philosophy of religion, The Trace of God represents a considerable achievement. Joe’s publishers, Grand Viaduct, also deserve credit for helping him overcome the disadvantage of dyslexia to communicate his ideas in a format in which they might achieve the recognition they deserve.

Discuss this post at the Quodlibeta Forum

Wednesday, March 19, 2014

Debate

Ed Feser had an online debate with Keith Parsons. Considering that they had had some minor hostility issues with each other previously, it's remarkable how respectful and cordial they are, although it started with some less-than-polite forays. Here are the elements, and the comments are worth reading too:

Before the debate:
Keith Parsons: Can the arguments of the "new atheists" be made stronger?
Ed Feser: Four questions for Keith Parsons

The debate:
Parsons: Answering Prof. Feser
Feser: An exchange with Keith Parsons, part 1
Parsons: Reply to Prof. Feser's second question
Feser: An exchange with Keith Parsons, part 2
Parsons: Reply to Prof. Feser's third question
Feser: An exchange with Keith Parsons, part 3
Parsons: Reply to Prof. Feser's fourth question
Feser: An exchange with Keith Parsons, part 4

After the debate:
Feser: Can you explain something by appealing to "brute fact"?
Parsons: Response to Prof. Feser's response, part 1 (I'm not sure why this was just posted a couple days ago as it seems to form a part of the actual debate, but so be it)

Discuss this post at the Quodlibeta Forum

Wednesday, March 05, 2014

Arctic views

With Russia's sort-of invasion of Ukraine (some say it is definitely an invasion, some say it isn't), pundits on the political right are pointing out how, during the 2012 American presidential campaign, Mitt Romney was mocked by President Obama, John Kerry, and the media for suggesting that Russia was America's number one geopolitical foe. Some have looked further back to the 2008 campaign when Sarah Palin was not taken seriously when she suggested that Obama's stance towards Russia's invasion of Georgia would only encourage Putin to invade Ukraine.

This latter case is bringing up another issue involving Sarah Palin and Russia that I've never understood. During the 2008 campaign, an interviewer asked her for her thoughts on Russia, given its proximity to Alaska, the state Palin was the governor of at the time. She responded that Russia and Alaska are neighbors, and that in fact you can see Russian territory from Alaskan territory, specifically an island in the Bering Strait.

When I first heard this, I nodded my head. I thought it was common knowledge. There are two islands about two and a half miles apart in the Bering Strait: Little Diomede Island is Alaskan and Big Diomede Island is Russian. The Alaskan island has a town facing the Russian island, and the Russian island has a military base on it. The international date line goes right between the two islands, and the space between them (actually the whole Bering Strait) was known as the Ice Curtain during the Cold War. Monty Python's Michael Palin began one of his travelogues on Little Diomede Island, and tried to finish it there as well, but couldn't quite make it. I remember in the 1980s the comic strip Bloom County had a sequence about how some ignorant hicks heard that the USSR had moved within two and a half miles of American territory and were panicking about it. Lynne Cox, an American swimmer, swam between the two islands to "ease international tensions." Etc. Again, I thought this had permeated American culture and that everyone knew it -- not necessarily the names of the islands (which I didn't know), but just that there was an American island and a Russian island a couple miles apart in the Bering Strait.

In fact, the Diomede Islands aren't the only place that Alaska and Russia are within sight of each other: "To the Russian mainland from St. Lawrence Island, a bleak ice-bound expanse the size of Long Island out in the middle of the Bering Sea, the distance is 37 miles. From high ground there or from the Air Force facility at Tin City atop Cape Prince of Wales, the westernmost edge of mainland North America, on a clear day you can see Siberia with the naked eye." St. Lawrence Island has two towns on it. And Tin City, as noted above,  is part of the North American continent, not an island. It is the mainland, and you can see Siberia from it: "The station chief at Tin City confirms that, for roughly half the year, you can see Siberian mountain ranges from the highest part of the facility."

Yet when Palin said you can see Russian territory from an Alaskan island, everyone went crazy about how stupid she was. On Saturday Night Live, Tina Fey, portraying Palin, said "I can see Russia from my house," which, incredibly, has entered the public consciousness as something Palin supposedly said. I guess if people were ignorant of the Diomede Islands -- and if they didn't realize that the proximity of mainland Russia to St. Lawrence Island and the westernmost part of the North American continent allowed an observer to see one from the other (which I was ignorant of and surprised by) -- I could understand them being skeptical of Palin's actual statement. But even if you think she's unintelligent and says foolish things in general, once you found out about these islands, why in the world wouldn't you respond by saying, "Oops, my bad, Palin was right." I mean, it's no big deal. You didn't know about a couple of islands in the Bering Sea. It's not a personality flaw. Yet Palin's statement is still held up as an example of stupidity on her part. I don't get it. Maybe it's because she supposedly used this to tout her international cred. But, again, she was asked about Russia's proximity to Alaska, and she merely confirmed it by accurately stating you could see Russian territory from Alaskan territory.

I'm not defending Palin's politics at all here. My confusion about this has nothing to do with her politics or her overall intelligence or how well-informed she is. I just don't understand why an innocuous and correct statement she made in response to an interviewer specifically asking her about this subject would cause so many people to have such a strong reaction that she must be wrong. If you think she's unintelligent, fine. If you think she's wrong about politics, great. This isn't politics, it's geography. What's the source of this reaction? It's this absurd polarization, this staking out of claims, this willful blindness that makes me avoid politics as much as possible.

Update: Here are some pictures that I obviously cannot vouch for. First, to show their proximity, are some pictures of the Diomede Islands:




Second, a picture of the Russian mainland from St. Lawrence Island, with the Alaskan town of Gambell in the foreground:


Third, a picture taken from Tin City (presumably at the part that's over 2000 feet above sea level), which, to reiterate, is the westernmost point of the North American continent. The Diomede Islands are about three-fourths up from the bottom of the picture -- Little Diomede is right in front of Big Diomede, so it's hard to distinguish them. On the upper right part of the picture is Cape Dezhnev, the easternmost point of the Asian continent, and on the upper left part of the picture is more Russia.


Another update: Here's the beginning of Michael Palin's travelogue Full Circle, which begins on Little Diomede Island with Big Diomede in the background:



Discuss this post at the Quodlibeta Forum

Wednesday, February 26, 2014

Some recent reads

Brainstorms: Philosophical Essays on Mind and Psychology by Daniel Dennett. I just added Dennett to my "Favorite Books" list on my profile page. Not because I agree with him, but because he has staked out the issues that I tend to move towards as well. The difference is in our assessment of the issues, but we agree on the meta issue of which issues should be addressed. I'm currently reading The Intentional Stance (I'm stuck in the far-too-long chapter "Beyond Belief" which defends his almost-but-not-quite eliminativism), and will move on to his new book afterwards, probably.

Physicalism, or Something Near Enough by Jaegwon Kim. You should read everything Kim writes. He is one of the most important voices in philosophy of mind. I'd like to make a website devoted to Kim's writings, analogous to the websites Andrew Bailey has made for Alvin Plantinga, Peter van Inwagen, and others. This book, which I disagree with for the most part, is his attempt to solve the mind-body problem and mental causation, which he also addressed in his excellent Mind in a Physical World. He seems to think he is successful -- and Kim is emphatically not one of those overconfident philosophers who solves deep problems with superficial analyses -- but he says he is still left with qualia. That's why it's near enough to physicalism: he's solved the most important and difficult part, and the remainder is a difficult but nowhere near as significant issue.

Reason, Metaphysics, and Mind edited by Kelly James Clark and Michael Rea. These are the proceedings from Alvin Plantinga's retirement conference. The essays aren't on Plantinga's philosophy, but rather (and I approve of this wholeheartedly) on issues that Plantinga wrote on extensively. As with any collection there are some good and some not as good, but it's definitely worth it.

The Concept of Canonical Intertextuality and the Book of Daniel by Jordan Scheetz. Jordan is one of my best friends, so I'm completely biased towards this book. We went to the same school, and we both took a class on Aramaic. This is relevant because his book is a focused commentary on Daniel, one of two Old Testament books (the other being Ezra) with a significant portion written in Aramaic rather than Hebrew. This book is a short commentary, but Jordan has mastered the languages so completely, it's incredible. Some general editor compiling a new series of Bible commentaries better contact him, because if he wrote a comprehensive commentary on Daniel, it would be amazing.

Simply Jesus by N.T. Wright. Very good. I need to start buying his Christian Origins and the Question of God series, seeing as how he's just published a fourth volume on Paul.

Jesus and the Logic of History by Paul Barnett. I've had this book for 15 years and never read it. I finally took it off the shelf a couple months ago. It's outstanding. One very interesting point he makes is how scholars start studying Jesus with the gospels, and never move on. He suggests that we start with the New Testament epistles because the information they contain about Jesus is a) tangential to the points they're making to their audience, and b) was already accepted by the original audience -- that is, the statements were meant to remind the readers about something, or put it in a specific context they may not have thought of before. Barnett suggests this makes these statements immune to many of the methodologies used to evacuate the gospels of historical validity.

The Martian by Andy Weir. My wife brought me back a pre-publication copy of this novel from a conference she attended in Chicago last month (it was officially published this month). It's about one of the first manned explorations of Mars, and one astronaut being accidentally abandoned there and struggling to survive, the hope of rescue, etc. I read a lot of science-fiction, so I'm referencing this book in lieu of a long list. My reasons for singling out this one are that a) it's a particular subject that I love: near-future exploration of the solar system; b) it threads the needle of being a very pleasant read while being nice hard science-fiction: the guy really knows the science and the technology (or at least is able to convince a layman like myself that he does). This is made all the more impressive by the fact that c) it's the author's first work. For my fellow science-fiction fans, I recommend it.

Discuss this post at the Quodlibeta Forum