Friday, 28 November 2014

You get what you pay for in publishing

The publishing world is very different now to the way it was 20 years ago. Some of us still work with traditional publishers. We appreciate the professional services they offer, from editing and typesetting to getting our books into bookshops. Others choose to do it themselves, with mixed results - the best self-published authors do superbly well; others sell to their aunty and that's about it. But there's an interesting in-between scenario.

What if you want your book professionally produced, but it's rejected by traditional publishers? It might seem there's an ideal alternative in companies that do the work a traditional publisher does, but will accept pretty well any manuscript as long as you are prepared to defray the costs. This kind of operation has been going on a long time, and is traditionally described as 'vanity publishing'. In principle there's nothing wrong with it - but a recent experience I've had with just such a publisher (I won't name them as there are plenty of their ilk) shows the dangers for the author.

We'll leave aside the prices that authors face - these are typically too high for the services offered, but I'm sure there are some competitively priced vanity publishers out there. It's more about what the author can expect to get for their money.

What started this was receiving a press release from a vanity publisher. I had no illusions about what they were, and treated it as I would a self-published book. 99 times out of 100 I ignore these. But this real world fantasy with a historical twist sounded quite intriguing, so I thought I'd give it a go. The book arrived, and you could see that the author had got some serious work done for his or her money:
  • It was a professionally produced paperback without that rather cheap look of print on demand.
  • It had clearly been proofread - there were no obvious typos.
  • The publicist had managed to get it in front of a reviewer.
All in all, not bad. Of course it's unlikely it would ever be in a bookshop, but with Amazon etc. this isn't strictly necessary. But then I started to read the book. I tried. I really did. But it was practically unreadable. Poor use of English, badly structured... it was well produced, but a very bad book. I pointed this out to the publicist, to be told that some authors are trickier to deal with than others.

Okay, I should have left it there. But I was then offered something more down my usual line - a popular maths book, written by someone who had a background that made it possible that it could be a decent text, and on a subject I find really interesting. So I thought I'd give them another go.

Again, the presentation was excellent. This was a chunky hardback produced to excellent quality - it could easily have been from one of the big publishers. And again it had clearly been proofread. There was even a lot of worthwhile content. But it was a book that was crying out for a proper edit. The text wandered here and there, had stylistic issues and, most worryingly for a popular science book, repeatedly made reference to concepts and people it had yet to introduce, so unless you already knew the subject it was baffling.

This was by no means a total disaster, so I wrote a review, concluding: 'Overall, then, the idea behind the book is excellent and there is sometimes some rather poetic, readable material, but there is a total lack of understanding of narrative flow - the writing jumps around without consideration for what the reader already knows - and the whole is in need of a serious edit. The book is handsomely produced, but from a publisher that only seems to do copy editing without any true editorial input, and it shows. I can't really recommend the book unless you like a challenge, but that's a pity because there is good material in it.'

I thought the publicist would think any publicity is good publicity, but instead got a hurt email saying 
Thank you, but the bad aspects of your review outweigh the good so I would have preferred it if you had warned me that you found the book poor as I would have asked you not to put it up on your site.

I hope the author will appreciate your review.
Taking that final remark as a veiled threat, I took the review down. I won't be taking books from them again. But this sorry debacle left me with two thoughts. First, what did they expect, putting out a book that clearly hadn't been edited? A glowing review? And secondly, and more importantly, what did the authors expect? Did they think their books had been edited, just because they had been proofread? Were they like those X-factor contestants who blithely believe in their own talent, despite all the evidence to the contrary, or did they think that, by going to an apparently respectable publisher, their handsome-looking books would be turned into something more than they originally wrote?

I have been writing books for a long time, but I still get editor's notes and make changes to the first draft. I'm just doing such an edit right now, and, for instance, the editor suggested a change to the end of the book that I think has improved it immensely. Yet these people have missed out on this essential part of working with a publisher - something needed far more with a new writer.

You do indeed get no more than what you pay for with a vanity publisher (and in some cases considerably less) - but would-be authors who are thinking of using their services should check exactly what is on offer, and whether it will indeed make their book saleable.

Thursday, 27 November 2014

I don't know much about robots, but I know what I like

Is it art?
I've always had mixed feelings about the Turing test. This is (a variant on) the mechanism proposed by Alan Turing (you know, the one who looks like Benedict Cumberbatch) to decide if computers could be considered to be intelligent. As I've pointed out previously, the way the test is administered is far too lax. And part of the problem is the requirement of a judge to decide if the entity he or she is communicating with is a person. This is inevitably a subjective decision, and highly dependent on the quality of the dialogue the judge uses.

Now, though, we've got a whole new level of silliness, with a Georgia Institute of Technology professor suggesting that in testing for machine intelligence we should also 'ask a machine to create a convincing poem, story or painting.' What remarkable twaddle. Take the 'art' aspect. We can't agree on which humans can create a convincing painting, so how could we possibly use this as a test? By the standards of modern art, any random collection of paint marks on a canvas could be considered a 'convincing painting' - it purely depends on what those judging persuade themselves is valid and/or meaningful and important. There is no standard against which to measure what the computer produces.

Let's be clear - I am not saying this because I think that art that doesn't require skill and craft is worthless (although I do think this). I am merely saying that there is no metric that could possibly be used. What, for instance, if the computer produced the image shown here? If this had been done by, say, Mark Rothko, it would be classed as a convincing painting. As it happens, I did it pretty randomly on an iPad in 2 minutes - so it's not classed as a convincing painting. The metric is not the nature of the artwork itself, but who produced it. Modern art is essentially a celebrity phenomenon. And that means the process is bound to fail.

Wednesday, 26 November 2014

Do You Still Think You're Clever? review

John Farndon, the author of Do You Still Think You're Clever?: Even More Oxford and Cambridge Questions! is, very sensibly, a believer in 'If it ain't broke, don't fix it.' In this follow up to Do You Think You're Clever? he takes exactly the same approach of collecting a series of the more bizarre questions asked in Oxbridge interviews and providing his own suggested answers.

As Farndon says, you may not always agree with his answer - but that's part of the fun, because when you're dealing with questions like 'What makes a strong woman?' in a theology interview, it's really up to you how you answer - and what the interviewer is looking for (if he or she is any good) is not so much someone who comes up with a pat answer, but someone who can demonstrate how to think through a question, and this is something that Farndon excels at.

Thankfully, the reader doesn't need to know too much about the subject. In fact I found questions like 'Was Shakespeare a rebel?' much more interesting than more science-based ones like 'Why does a tennis ball spin?' I have both taken a Cambridge entrance interview and interviewed for a company that used a fiendishly evil question in their interviews (or at least did until it got too well known) - in the latter case, it was always the interesting answers that came at the problem laterally that were considered to indicate better candidates rather than the straightforward attempts at a solution.

(As an aside, the company interviews had a senior and a junior interviewer. The first time I took part, other than being interviewed myself, the senior interviewer said to the first candidate 'If you need to know any statistical formulae don't worry, just ask Brian.' (B*st*rd.) I was taken totally off guard. I'm not good at remembering formulae, and it's a red herring - the question doesn't require it. But of course the first person said 'What's the formula for standard deviation?' and my mind went totally blank and I had to ask for help. Next interview I had a cheat sheet.)

In the end, the reader's thoughts are as interesting as Farndon's answers. I found, having put the book down part way through, that I was thinking about how I would answer the next question up - in fact probably the best way to read it is one question at a time, then put it down while you think about the next one. This makes it a great loo book - but also a great gift book (I'm sure it's no coincidence it's going on sale this time of year) and it will certainly be one I'll be giving to a few people.

I feel I ought to say something negative about any book I review - all I can really find to say here is that I hate the cover. Please don't judge the book by it.


Tuesday, 25 November 2014

Proving the irrational

When writing about science we often have to fight against irrational ideas that seem to grow on people's minds like fungi. Yet early mathematicians had the opposite problem of requiring the irrational. This was the irrational in the literal sense - a number that is not made up of a ratio. According to myth, when one of Pythagoras' merry band discovered that the length of a diagonal of a square with sides of length one (the square root of 2) was not a rational number (a fraction that's the ratio of two whole numbers), he was drowned for spreading such a malicious concept.

What's interesting, as I describe in my book A Brief History of Infinity, is that there is a remarkably simple proof that √2 is irrational. It requires little more than an understanding of odd and even numbers and goes something like this:

Let's assume √2 can be represented by a rational fraction - we'll call it top/bottom.

To keep things simple, we are assuming that top/bottom provides the simplest fraction you can get - there's nothing to cancel out, so it's like 1/2 rather than 2/4. So:

top/bottom = √2

Square roots are a bit fiddly, so let's multiply each side of the equals sign by itself. This gives us

top²/bottom² = 2

In traditional mathematical fashion, we can get rid of the division by multiplying both sides of the equation by bottom², giving us:

top² = 2 × bottom²

Next, the Greeks relied on their knowledge of odd and even numbers. They knew three things about odd and even numbers.
  1. A number that can be divided by 2 is even.
  2. If you multiply an odd number by an odd number, you get another odd number.
  3. If you multiply any number (odd or even) by an even number, you get an even number.
As the right-hand side of the equals sign is 2 × bottom², it must be even - it's the outcome of multiplying by an even number, 2. So top² must also be even. And that means top has to be even (because were it odd we would be multiplying two odd numbers together and would get an odd result).
Now here comes the twist. If top is even, then it can be divided by 2. So top² can be divided by 4. And we know that top² is the same as 2 × bottom². If 2 × bottom² can be divided by 4, then bottom² can be divided by 2. So bottom² (and hence bottom) is even. (Read that again if necessary - it makes sense.)

So both top and bottom are even. But if both are even, then top/bottom isn't the simplest fraction we could have, since we can divide both top and bottom by 2. Yet we started by saying that top/bottom was the simplest fraction we could have. We've reached an impossible contradictory situation - which means our original assumption that it was possible to represent √2 by a ratio was false.
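For the code-minded, the proof's conclusion can also be checked by brute force. This little sketch (an illustration, not part of the proof itself) searches every fraction top/bottom with a denominator up to some limit for one whose square is exactly 2 - and, just as the Greeks showed, it never finds one:

```python
from fractions import Fraction
from math import isqrt

def find_sqrt2_fraction(max_bottom):
    """Look for a fraction top/bottom whose square is exactly 2.

    For each denominator, only the two integers nearest to
    bottom * sqrt(2) could possibly work, so we test just those.
    """
    for bottom in range(1, max_bottom + 1):
        nearest = isqrt(2 * bottom * bottom)
        for top in (nearest, nearest + 1):
            if Fraction(top, bottom) ** 2 == 2:
                return Fraction(top, bottom)
    return None  # no exact fraction exists - as the proof guarantees

print(find_sqrt2_fraction(10_000))  # None
```

Of course no finite search is a proof - that's exactly why the odd-and-even argument above is needed - but it does show the best rational approximations (7/5, 41/29, 239/169...) always just missing.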

Added: Thanks to Thony Christie for pointing out that it's thought the Pythagoreans first discovered that √5 was irrational - but because √2 is based on the diagonal of a unit square, I think it makes the simplest example.

Monday, 24 November 2014

The joy of being tech support

For a while when I worked at British Airways I was in charge of the department that did all the support for PC users - and I was also one of BA's first PC programmers - so I think it's fair to say that I know more about computers than most people of my generation. This can be handy. But the downside is that the family regard me as the official PC support guy.

This came home with a bang when one of my daughters reported one of the weirdest errors I've come across. Every time she tried to save something in Word the above error box came up. She couldn't save a single file. Even with the default Document1 filename. Yet other programs - Powerpoint for instance - were fine. Word is something she uses heavily on her course, so it needed sorting, but what could possibly be happening?

At the time the laptop was at university and I was at home, so several local attempts were made to sort it out without success. This weekend I finally got my hands on it and spent a couple of hours tidying up various bits and pieces, plus fully de-installing and reinstalling Office. End result? No change.

I was under a bit of pressure, as I had a train to catch. But three minutes before I was due to leave I had a really silly idea. And 2 minutes and 50 seconds before I was due to leave, I had fixed the problem. What it comes down to is a subtle divergence between Word, with its Windows background, and the Mac's OS X operating system, which is basically a tarted up version of Unix. Windows comes from a DOS heritage where filenames were very limited. Who remembers names that had to be no more than 8 characters in length? And there were lots of forbidden characters in filenames. Windows has loosened up since then, but there are still a number of limitations on what can appear, and this proved to be the secret to fixing the problem.

It might seem this doesn't make any sense - after all I was trying out totally legitimate filenames. But the whole path that specifies where the file is located also had to meet with Word's approval. And at some point, the hard disc of the computer had been accidentally renamed ]q - which the Mac had no problems with. But this meant that file's path, which includes the name of the hard disc, had a ']' in it, which Word didn't think was possible.
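A quick sketch of the kind of check that would have found the problem instantly. The character set here is an assumption for illustration - Windows' documented reserved characters plus the square brackets Word itself objects to - not Word's actual validation logic, which isn't public:

```python
# Characters that can trip up Word when they appear anywhere in a file's
# full path. Illustrative set only: Windows' reserved characters plus
# the square brackets Word is known to dislike.
SUSPECT = set('<>:"|?*[]')

def suspect_characters(path):
    """Return any characters in the path that Word may reject."""
    return sorted(set(path) & SUSPECT)

# The accidentally renamed hard disc put a ']' into every file's path:
print(suspect_characters('/Volumes/]q/Documents/essay.docx'))  # [']']
```

Run over the renamed disc's path, it flags the ']' immediately - which is rather more helpful than Word's cryptic error box.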

So there are three problems here that the developers should have spotted and prevented. First, by default the Mac puts an icon for the hard disc on the desktop, which makes it far too easy to rename it accidentally. (Easily removed, but it's probably a mistake to have it there in the first place.) Secondly, Word, like Powerpoint, should have coped with all possible Mac file naming possibilities. And thirdly, the Word error message should have been a lot more explicit, rather than leaving you guessing just what it was complaining about.

Sigh. Computers, eh?

Friday, 21 November 2014

An old one but a good one

Thanks to Peet Morris for reminding me of this little puzzle for the weekend.

Multiple choice:
If you choose an answer to this question at random, what is the chance you will be correct?:

A) 25%
B) 50%
C) 60%
D) 25%

I'm not going to suggest a 'right' answer (though there are at least two) - I leave it up to you.
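If you want to explore it computationally, here's one way to formalise a single reading of the puzzle - a sketch only, since it assumes you pick uniformly at random from the four printed options, which is just one interpretation:

```python
from fractions import Fraction

# The four printed options, as probabilities.
options = [Fraction(1, 4), Fraction(1, 2), Fraction(3, 5), Fraction(1, 4)]

def is_consistent(value):
    """If `value` were the correct answer, would a uniform random pick
    actually hit an option equal to it with probability `value`?"""
    hits = sum(1 for v in options if v == value)
    return Fraction(hits, len(options)) == value

for v in [Fraction(1, 4), Fraction(1, 2), Fraction(3, 5), Fraction(0)]:
    print(v, is_consistent(v))
```

Under this strict reading none of the printed options is self-consistent (25% appears twice, so picking it has probability 50%, and so on), while an unlisted 0% would be. Other readings give other answers - which is rather the point.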

Thursday, 20 November 2014

The most obscure physics laureate?

We all love a good Nobel Prize, but every now and then there is a flare-up over the winners. Sometimes it is because of the arbitrary restriction to three winners, who must be alive at the time of the award. Sometimes, as when Jocelyn Bell appeared to be pushed aside for her boss Antony Hewish (much to the irritation of Fred Hoyle), it is an apparent unfairness. But most often, I suspect, in the case of the physics Prize it is due to the Prize committee's inability to decide just what physics is.

There have been a number of examples of awards that were really for inventions or technology. Admittedly these inventions were usually based on physics - but it would be tenuous to call them fundamental breakthroughs in physics itself, as the inventors were making use of an existing physical concept. So, for instance, the award for the laser (or more accurately the maser, as neither Gordon Gould nor Theodore Maiman, arguably the key names for the laser, was included) should by that logic have gone to Einstein, who came up with the theory in the first place.

But one thing the dalliance with inventions gives us is the inclusion of the man who must, surely, be the most obscure physics Nobel laureate ever: Gustaf Dalén. Without peeking below, I challenge anyone from working physicists to those with a casual interest in science to say what Dalén achieved to win the 1912 prize.

Here's his picture to consider while you work it out:

Gustaf Dalén: public domain image from
Nobel Prize website

You must admit, he looks cool. Possibly the hero of a steampunk romance.

Okay have you guessed? Have one more attempt before the reveal.

Gustaf Dalén won his prize for his 'invention of automatic regulators for use in conjunction with gas accumulators for illuminating lighthouses and buoys.' 

Not only was there no real physics here, the control of gas-lit lighthouses is not exactly going to have a long-term impact on life, the universe and... well, anything really.

Nice one, Gustaf.

Wednesday, 19 November 2014

A zap from the sun

Image by Fir0002/Flagstaffotos from Wikipedia
I've always loved the science of lightning, hence, for instance the piece I wrote for the Observer. At the time I mentioned a theory linking cosmic rays to lightning strikes as, surprisingly, we really don't know a lot about why lightning occurs. Now there's some brand new (published today) research that suggests the Sun may be playing a part in the generation of lightning strikes by temporarily ‘bending’ the Earth’s magnetic field and allowing the shower of energetic particles that makes up cosmic rays to enter the upper atmosphere.

According to the IOP, researchers at the University of Reading 'have found that over a five year period the UK experienced around 50% more lightning strikes when the Earth’s magnetic field was skewed by the Sun’s own magnetic field. The Earth’s magnetic field usually functions as an in-built force-field to shield against a bombardment of particles from space, known as galactic cosmic rays, which have previously been found to prompt a chain-reaction of events in thunderclouds that trigger lightning bolts.'

Lead author of the research Dr Matt Owens said: 'We’ve discovered that the Sun’s powerful magnetic field is having a big influence on UK lightning rates. The Sun’s magnetic field is like a bar magnet, so as the Sun rotates its magnetic field alternately points toward and away from the Earth, pulling the Earth’s own magnetic field one way and then another.'

In their study, the researchers used satellite and Met Office data to show that between 2001 and 2006, the UK experienced a 50% increase in thunderstorms when the heliospheric magnetic field pointed towards the Sun and away from Earth. This change of direction can skew the Earth’s own magnetic field and the researchers believe that this could expose some regions of the upper atmosphere to more cosmic rays.

'From our results, we propose that galactic cosmic rays are channelled to different locations around the globe, which can trigger lightning in already charged-up thunderclouds. The changes to our magnetic field could also make thunderstorms more likely by acting like an extra battery in the atmospheric electric circuit, helping to further "charge up" clouds,' Dr Owens continued. The results build on a previous study which found an unexpected link between energetic particles from the Sun and lightning rates on Earth.

'Scientists have been reliably predicting the solar magnetic field polarity since the 1970s by watching the surface of the Sun. We just never knew it had any implications on the weather on Earth. We now plan to combine regular weather forecasts, which predict when and where thunderclouds will form, with solar magnetic field predictions. This means a reliable lightning forecast could now be a genuine possibility.'

This paper can be downloaded from the IOP here.

Tuesday, 18 November 2014

Science writing one hit wonders

I'm in the process of transferring the Popular Science book review site to a new home after getting fed up with Wordpress.

The old site (about the fourth incarnation since 2004), was hosted on my own website using Wordpress, but it was a nightmare to keep up to date. They kept updating Wordpress and its plugins with nauseating regularity, and I could never get the automatic updates to work, so had to update it by hand each time. For a while it has been close to the maximum memory my ISP allocates to a virtual server, and the latest version crashed through this so that it was impossible to update the site ever again.

One advantage of moving it to a new site is that I've taken the opportunity to add a couple of features missing from the Wordpress version, notably an alphabetical set of index pages by author. And what's quite surprising is how many one hit wonders there are. If you take a look, for instance, at the S authors, one of the more popular surname initial letters, out of 51 authors, only 6 have more than one book listed. (I am still updating the site, so there may be more by the time you read this.)

One interpretation of this is that popular science writers are primarily amateurs (at writing). Another is that most aren't very good. Or are very slow writers. Or didn't earn as much as they expected. (Or hated our review and wouldn't send another book in case that one was slated too.) All of the above, I suspect, and other reasons too. But interesting nonetheless.

Incidentally, the site is up for the UK blog awards. If you've got a few moments to spare, it would be great if you could pop along and vote for it! It should only take a few seconds.

Monday, 17 November 2014

Politicians need science advisors - and not to be swayed by single interest groups

Image from BBC website
I am totally disgusted by the EU. Not in a generic UKIP fashion, but by their cancellation of the position of EU Chief Scientific Advisor, a post held by Professor Anne Glover, otherwise based at the University of Aberdeen.

There are two problems with this. The first is that politicians are in dire need of science advice. We (and the EU as a whole) have very few politicians and civil servants with a science background. It is essential that they have advisors who can explain the scientific realities of a world where science and technology are central to our everyday lives. To abolish the post is madness.

Secondly, the reason that Professor Glover seems to have got her marching orders is a result of a campaign by green groups, and specifically Greenpeace, which objected to her support for genetically modified crops. Just like they do for nuclear power, such groups have a knee-jerk reaction to GM that has no thought, no appreciation of the science, they just don't like the words.

The green blanket opposition to GM just doesn't make any sense, because it's something we've been doing for thousands of years (if you doubt this, take a look at maize and cauliflowers, both so drastically genetically modified that they can't reproduce without human intervention) - and because we can now do it in a much more controlled and beneficial fashion.

The GM debate is admittedly not simple or black and white, but it has certainly been subject to the misuse of information from both green organisations, which oppose it on principle without thinking about it in detail, and from tabloid newspapers. For example, a genetically modified variant of rice that was designed to counter vitamin A deficiency was dismissed by Greenpeace because the environmental organisation said that to obtain the required amount of vitamin A would require ‘seven kilograms a day of cooked Golden Rice’. The actual amount is 200 grams.

So shame on you Greenpeace (who have tried to weasel out by saying that 'Scrapping the CSA post was about the integrity of science advice, the clarity and independence and it's about getting the science right' - since when has Greenpeace cared about getting science right?) for engineering this highly negative move.

We need more MPs and MEPs with a science background - but even if we had them today, party politics and, yes, the malign influence of pressure groups both from industry and the greens, means that we also need good science advisors. Professor Glover will be sorely missed.

This has been a Green Heretic production

Thursday, 13 November 2014

Where were the world's first computer animations produced?

Part of one floor of the Atlas installation
(courtesy Rutherford Appleton Laboratory and
the Science and Technology Facilities Council (STFC))
We are all so used to CGI that it's not even a surprise these days when the effects on Dr Who are passable. But 50 years ago, things were very different. Usually the only computer animation you could expect was watching the punched tape or cards fly through the reader. But where was the first seed planted for the future wonders of CGI that would make practically any modern science fiction or action film possible? Was it MIT? Hollywood? No, it was Oxfordshire. In the wonderful Rutherford Appleton Laboratory.

Let me hand you over to Marion at the Science & Technology Facilities Council (based in sunny Swindon):

UK computing is today celebrating fifty years since the launch of what was at that time the largest supercomputer in the world, the Atlas 1. When built it was the size of a large detached house.  Now that same computing capacity would fit in your pocket inside your mobile phone.

In 1964, the Rutherford Appleton Laboratory (RAL) in Oxfordshire opened the UK’s first purpose-built computer laboratory to house one of the world’s first supercomputers.   Not only did this facility go on to produce the world’s first computer animated films during the mid-seventies it also contributed the 3D wire-frame model  shown on the navigation monitors in the landing sequence of the Ridley Scott film ‘Alien’ – making it the Industrial Light and Magic or Weta computer animation facility of its day.

The Ferranti Atlas 1 computer was the largest of three world leading computers built in the UK.  It cost around £3M – equivalent to about £80M in today’s currency – and was so enormous the Atlas Computer Laboratory, as it was known then, was built to fit the computer.

This week, on 13 – 14 November, the Science and Technology Facilities Council (STFC) is opening RAL’s doors to celebrate those 50 years of supercomputing, with a series of talks, tours and exhibits to highlight the importance of this computer facility to society today.

 In the 1960s and 70s, universities and other research establishments  that needed to use computing facilities had to put their program and data onto punch cards and post them to the Atlas Computing Laboratory, where their program would  be run for them.

Dr Andrew Taylor, Executive Director, STFC National Laboratories, said, “Since those early days, computing at RAL has gone from strength to strength, and the Atlas Centre is now home to Tier One – where data from the Large Hadron Collider is stored in the UK, as well as a range of other facilities such as those which process data from weather satellites. Fifty years on, the technology is so far advanced that a mobile phone is more powerful and far cheaper than the Atlas computer.”

The Atlas processor used more than 5,600 circuit boards, which would have covered an area about the size of a tennis court – around 90,000 times bigger than a modern computer chip.  One of its discs could hold just two photographs, whereas today’s equivalent, the USB stick, can store thousands of images.

The original Atlas Computer Laboratory established a national computing operation to support scientific research.  Since 1964 that UK computing operation has been a part of many technical and scientific innovations. It has contributed to the governance of the World Wide Web; it has managed the data which led to the discovery of the Higgs boson, and it continues to support major scientific experiments at facilities in the UK and internationally.

The world's first computer animations were produced at the laboratory. These included an animated model of stress-loading across an M6 motorway bridge that was being built at the time.  It was the first entirely computer-produced engineering film to be made in the UK and won the Great Britain entry in the 1976 international Technical Films Competition in Moscow. Most famously, the laboratory's facilities were used to produce the 3D wire-frame model  shown on the navigation monitors in the landing sequence of the Ridley Scott film ‘Alien’, which won the 1979 Academy Award for best visual effects.

People touring the Atlas Centre exhibits during these 50th Anniversary celebrations will discover the rich history of computing innovations at RAL, from the very beginning of supercomputers to the endless possibilities of today.

Dr Taylor added, “We are particularly excited that, in its 50th anniversary year, we are able to display the console from the original Atlas computer, together with memorabilia of the time.”

Though the Atlas computing operation has gone from strength to strength the Ferranti Atlas 1 itself closed in March 1973 and was replaced by an ICT1906A. In the eight years of operation it had run for 44,500 hours with a 97% up time. 836,000 jobs were run, 300 million cards read, 4000 million characters from paper tape read, 800 million lines of line-printer output generated and 17 million cards punched.

You can read more about Atlas at its website. Here are those groundbreaking wireframe graphics in a clip from Alien, but in French to make it more noir:

Wednesday, 12 November 2014

When scientists show their claws

The unfortunate Thomas Young
With their media image of being cool, emotionless brainboxes, it might be surprising to learn that scientists can be just as catty as anyone else. Though science is a collaborative business, where it's par for the course to tear apart other people's theories and then go out for a drink with them, it's still the case that personal dislikes sometimes triumph over rational argument.

One of the most famous scientific quotes in history, from Isaac Newton, is often thought to be a masked insult. Newton, writing to his hated arch-rival Robert Hooke, approximately quoted a line from Robert Burton when he wrote 'If I have seen further it is by standing on the shoulders of Giants.' The reason many think this was a piece of nastiness is not just that Newton was making it clear he didn't owe much to Hooke, but also that Hooke was anything but a giant physically.

The scientific claws come out in all kinds of subtle ways. I'm currently reading a new book on quantum biology by Jim Al-Khalili and Johnjoe McFadden. In one chapter, Al-Khalili (I assume it's him, as this is a reference to quantum physics) makes the effort to point out four times that quantum entanglement does not produce 'paranormal effects' (his inverted commas) like telepathy. He refers to those who come up with such theories as charlatans and uses what, since the Simon Singh/BCA affair, must now be considered the 'B' word when he says: 'despite the bogus claims of telepathy.'

You might think this is just general commentary rather than backbiting. However, if you know the quantum entanglement field, it's hard not to be aware that Nobel Prize-winning physicist Brian Josephson has publicly suggested that there might be an explanation for telepathy in quantum entanglement. Which does put these remarks in a whole new light.

However, my favourite insult is probably one I've just revisited in preparing a new edition of my first popular science book, Light Years. When Thomas Young first came up with his evidence that Newton was wrong and that light was, as Descartes, Huygens and others had suggested, a wave, he got considerable opposition from the British establishment. I want to leave you with the commentary in the Edinburgh Review from Henry Brougham, at the time a young lawyer and writer, and later Lord Chancellor. (Thanks, by the way, to John Gribbin for pointing out that this is probably a double insult, as the reference to the 'ladies of the Royal Institution' may well be a dig at the way the head of that then upstart institution, the Brian Cox of his day, Humphry Davy, had a reputation for making the ladies swoon.)

We may now dismiss for the present, the feeble lucubrations of this author, in which we have searched without success for some traces of learning, acuteness or ingenuity that might compensate his evident deficiency in the powers of solid thinking, calm and patient investigation, and successful development of the laws of nature by steady and modest observation of her operations. Has the Royal Society so degraded its publications into bulletins of fashionable theories for the ladies of the Royal Institution? Let the Professor continue to amuse his audience with an endless variety of such harmless trifles, but in the name of Science, let them not find admittance into the venerable repository which contains the names of Newton, Boyle, Cavendish...
Image from Wikipedia

Tuesday, 11 November 2014

Taking the tablet

Effortlessly editing a script in Word for iPad
I do technically have a laptop, but I hardly ever use it. Ever since I've had an iPad, the tablet has been my sturdy companion when working away from home. Why would I want to carry a heavy, delicate beast like a laptop when I've everything I need in a compact package with a battery life that means I can work on it all day?

I can touch type on the onscreen keyboard - okay, a little slower than a real one, but not much. It is the perfect working companion for a train journey or an overnight in a hotel. But there was a tiny fly in the ointment. And that was the lack of Office.

Not having Office was, frankly, a pain. I made use of a perfectly respectable alternative, which pretty much read and wrote Office files, but like all such second-bests it wasn't quite the real deal. The Word equivalent lost some of the formatting, while the PowerPoint handling didn't show animations, which practically every PowerPoint I use has.

So it was great when Office for iPad eventually came out - except for another issue. To do anything other than read a document, you needed an Office 365 subscription. Now I do intend to cut over to this - but not until they bring out the new Office for Mac, which isn't expected until the second half of 2015.

However, Microsoft has finally seen the light. The new release of Office for iPad (which also works on iPhones) is almost fully functional. There are a few small things missing that you need Office 365 to get, but nothing I regularly use. With joy, I could throw away my old compatible-ish app and it's Office all the way. So now I genuinely can say that I can't imagine ever using my laptop again.

To make the replacement seem complete, although I rarely actually want to print from my iPad, it seemed reasonable to get printing up and running, as I very occasionally need to do this at home. I don't have an AirPrint printer, and treasure my 12-year-old laser printer, which is as solid as a battleship, so I had to look for alternatives. I'm currently in the trial period for Printopia (HT to Mark Hogarth), which seems to do the job excellently.

I think I'll keep taking the tablet...

Friday, 7 November 2014

Code breaking for treasure

I'm not a great one for putting out press releases as blog posts, but this is one I can't resist sharing. So let me hand you over to the University of Manchester's press office to discover the opportunity to crack a cryptic code and win some movie goodies in honour of Alan Turing (and, yes, the new film with Mr Cumberbatch about him). Luckily, the treasure hunt won't turn out like the infamous Masquerade book, which gave cryptic clues to the burial place of a solid gold hare, and resulted in all manner of places being dug up, to the irritation of their owners and the embarrassment of the publishers. Here the treasure isn't really buried (and, sadly, isn't the real silver ingots that Turing is supposed to have hidden).

So, over to you Manchester U:

A new fiendishly-challenging online brain-teaser, featuring cryptic clues, has been launched by mathematicians at The University of Manchester.  The online cryptography competition has been designed to coincide with the launch of the film ‘The Imitation Game’, which tells the real-life story of mathematician Alan Turing, who is credited with cracking the German Enigma Code.

The cryptic conundrum is based around a true story of how in 1940, Alan Turing converted his savings into silver ingots and buried them in Bletchley Park. In real life the silver has never been found, but for the purposes of the competition, a location has been chosen and three coded clues are there to be deciphered.  The answers to the clues can then be used to find the location of the silver.  Participants submit their solution and winners, who will be drawn at random from correct solutions, receive film-related merchandise kindly donated by StudioCanal, distributors of the film.

Turing, who was a pioneer of computing, artificial intelligence and mathematical biology, had close links with the University of Manchester.  In 1948 he was appointed Reader in the Mathematics Department and soon afterwards he became Deputy Director of the Computing Laboratory at the University, working on software for one of the earliest true computers - the Manchester Ferranti Mark 1.  Actor Benedict Cumberbatch, who plays Turing in the film, also has ties to the city and the University, where he was a drama student.

Dr Andrew Hazel from the School of Mathematics said, “Having seen our annual online Alan Turing Cryptography Competition, StudioCanal contacted us to propose a one-off competition related to the release of The Imitation Game. We were delighted to take the opportunity to share our enthusiasm for mathematics and cryptography, and to highlight the close ties between the University, Alan Turing and Benedict Cumberbatch.”

The Imitation Game Cryptography Competition will close at midday on 28th November 2014.  It is free to enter and open to any resident of the United Kingdom but only one prize will be awarded per household.  Full details available online.

Thursday, 6 November 2014

Wishful thinking on the demise of supermarkets

Should I give up Asda, 5 minutes' walk away, and drive a 10 mile round trip
to get to a butcher's, greengrocer's etc?
I have a bit of a history with 'natural food' journalist Joanna Blythman. Don't get me wrong, I've never met her, and we've never argued, but I have often mentioned a quote from her in her days at the Soil Association when she came out with a statement that managed to be both an understatement and an unnecessary scare. Writing in the Guardian, she remarked:
You can switch to organic... Or you could just accept that every third mouthful of food you eat contains poison. Are you up for that?
The understatement is because practically every mouthful you eat contains poison, whether you buy organic or not. Food contains poisons both natural and artificial - usually far more natural ones, typically by a factor of around 1,000. And the unnecessary scare is because the levels of pesticide residues on non-organic food are sufficiently low that they present far less risk than the food itself - and that risk (for uncontaminated food) is minimal with almost everything except that ubiquitous poison, alcohol. The key to understanding poisons is that it's the dose that matters. However, I should move on, as this isn't the topic of this post.

In her recent article in the Observer, Ms Blythman celebrated the demise of the supermarkets. I don't disagree with her assessment of some of the issues faced by the big supermarkets, but I think she is indulging in pure wishful thinking if she thinks that in a few years they will have disappeared and we will all be good Stepford Wives/Husbands, spending the entire day trolling from butchers to greengrocers to half a dozen other shops in order to have the tea on the table when our partners come home.

Yes, it's true that we are tending to shop more frequently in small quantities, rather than doing a single big weekly shop, but most of us don't live in a fashionable London suburb or a quaint market town that still has its neat row of butchers, bakers and candlestick makers all ready for us to pop in with our hessian baskets akimbo. For good green reasons I do most of my food shopping on foot - and that means shopping at an Asda superstore or a Tesco convenience store. The Asda (pictured above) is closer, has much more range, is very friendly and has some good pricing - so that's where I go.

About once a week we do a bigger shop, though no longer the traditional 'shopping for the week' and for that we go to Waitrose. It's a new building, well-designed with great facilities in which to sip your free latte (or whatever). It is actually the most enjoyable supermarket I've ever used. Unlike the butchers, I don't have to stand in a queue - in fact not even to check out, as Waitrose have the natty check-yourself-as-you-go system.

Of course it's a dangerous trap to assume the rest of the country is like you. (Could Ms B be doing this?) Lots of people have a day job that makes it less easy to pop to the shop than mine does - and many of them will pick up food shopping in whatever they pass on the way to the bus or the station. They don't want to spend an hour browsing round six different shops, they want to quickly pick something up and get home.

So by all means enjoy the decline of the big supermarkets. They were, indeed, responsible for the kind of misleading selling that Ms Blythman mentions. But it would be wrong to assume that this means that they are going to disappear - they will change and survive - or that most people will go back to toddling round a whole range of food shops on a daily basis. It's not going to happen. It will remain a self-indulgent luxury for those who have the money to live in the right places and the time to do it.

Wednesday, 5 November 2014

Space travel is inherently risky

In the wake of the Virgin Galactic tragedy, it is worth thinking a little about the realities of risk and space travel - especially as we are so bad at handling probability, which is what risk is all about.

But I'm afraid I'll have to ask you to click here to pop over to my Huffington Post blog, where you will find the piece.

Tuesday, 4 November 2014

Nothing to lament about here

I'm not a great reader of historical fiction, with the exception of titles where it overlaps with crime. Perhaps the greatest proponent of that crossover is C. J. Sansom, and his latest novel, Lamentation, featuring the hunchbacked lawyer Matthew Shardlake, operating in the complex times of Tudor England, is to my mind his best. Oddly, this is despite - or, rather, because - it isn't much of a crime novel. Instead what we have here is a full-blown Tudor political thriller, with all the twists and turns, machinations and backstabbing (in this case sometimes literally) that you would expect in the modern equivalent.

The crime that Shardlake investigates appears simple: the disappearance of a compromising manuscript written by Henry VIII's last queen, Catherine Parr (on whom Shardlake has a long-term crush). But the setting - mixing the dangerous teetering between traditionalist, near-Catholic beliefs and 'reformer' Protestant beliefs with the political manoeuvring that grew ever stronger as Henry's death became obviously close - is fascinating, engrossing and gripping.

I admit I'm probably the ideal reader of this book. I read a lot of crime, I like political thrillers, my favourite music is from this period and I find both the religious and political battles absorbing. All three of Henry's children, who haven't featured much in the previous Shardlake books, appear towards the end, and with the prescience of foreknowledge we can feel sorry for Edward, shudder at what Mary will be responsible for, and hope for Elizabeth - all directed by Sansom's expert light touches.

It's a long book at 642 pages, and this would usually put me off, but for once it is justified as there really is never a feeling that the author is padding things out. In reality, the book ends at page 619 with the rest taken up with historical notes - don't skip these as they fill in some details that you will probably find extremely enlightening. I certainly did.

So there we have it: gripping, historically impressive, yet never overloading the detail to the extent you feel that you are being educated, a page turner and yet thoughtful too. Wonderful.


Monday, 3 November 2014

Does cocoa reduce memory loss?

I was listening to Steve Wright's show on Radio 2 the other day (I'm sorry, it was someone else's car) when an item caught my ear. They reported that a paper in Nature Neuroscience (yes, that's the kind of highbrow stuff you get on Radio 2) said that older people could reduce memory loss by drinking cocoa. Now my next book, due out in January, is all about the claims made for science (good and bad) in areas like health, diet, exercise, the brain and so forth, so it seemed worth looking into, and so I got hold of a copy of the original paper.

I'll be honest, it wasn't one of the better ones I've seen. Most scientific papers are hard work to read, but this one was fuzzier about some things that I would expect to be made explicit. As is often the case, while the paper was interesting, and highlighted something worthy of further investigation, what it demonstrated was more complex than the media report suggested, and at this stage it didn't offer substantive proof of benefits.

In the trial, a group of healthy people aged between 50 and 69 were split into four groups. Two groups spent three months on a diet that was high in cocoa, two on a low cocoa diet. At least, that's how the paper describes it in headline terms - dig in further and it seems the 'high cocoa' group took a daily supplement of 99mg of cocoa flavanols. To get this much naturally you would have to eat 25 individual chocolate bars (not recommended!) - I don't know how much that is in cups of cocoa, but I suspect it's a lot. Each group was also divided into half that were sedentary and half that took regular exercise.
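For a sense of scale, the article's own figures (99mg a day, said to equal 25 chocolate bars) imply how little flavanol an ordinary bar contains - a back-of-envelope sketch in Python, using only the numbers from the paragraph above:

```python
# Implied flavanol content per chocolate bar, from the trial's own figures:
# the 99 mg daily supplement is said to equal 25 individual chocolate bars.
daily_dose_mg = 99
bars_equivalent = 25

per_bar_mg = daily_dose_mg / bars_equivalent
print(f"{per_bar_mg:.2f} mg of flavanols per bar")  # roughly 4 mg
```

Which makes the point neatly: nobody is getting a 99mg daily dose from ordinary chocolate.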

The scientists then looked at two things - how a particular part of the brain responded in an fMRI scanner, and how well the test subjects did at two memory tests. What they found was that those on a high cocoa diet did better at one of the memory tests - the equivalent, it was claimed, of being almost 30 years younger.

This is interesting, but the results presented aren't enough to suggest we should all get out and start consuming lots of cocoa flavanols. The test groups were small, with only 8 to 11 people in each. This doesn't mean that the results are meaningless, but it does suggest further tests are required. It has also been pointed out that the claim that the result is statistically significant is doubtful: the value isn't what most scientists would consider significant, as results like these could be obtained by chance with about 50 per cent probability, which isn't good enough to be considered useful.
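The small-sample worry is easy to get a feel for with a toy calculation. Below is a sketch (Python, with made-up memory-test scores - emphatically not the paper's data) of a permutation test: shuffle the group labels many times and see how often chance alone produces a difference in means as big as the one observed. With groups of around ten, chance does this surprisingly often, which is why small trials struggle to reach significance.

```python
import random
import statistics

def permutation_p_value(a, b, n_perm=10_000, seed=1):
    """Two-sided permutation test: the fraction of label shuffles that
    produce a difference in group means at least as large as observed."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = list(a) + list(b)
    n = len(a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:n]) - statistics.mean(pooled[n:]))
        if diff >= observed:
            hits += 1
    return hits / n_perm

# Hypothetical scores for two groups of 10 - small, like the trial's 8-11
high_cocoa = [62, 58, 71, 65, 60, 68, 63, 59, 66, 64]
low_cocoa  = [60, 55, 63, 61, 58, 62, 57, 59, 60, 56]

print(f"p = {permutation_p_value(high_cocoa, low_cocoa):.3f}")
```

Run it with groups this size and modest differences between them, and the p-value bounces around alarmingly from one made-up dataset to the next - exactly the instability that makes significance claims from 8-to-11-person groups hard to trust.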

What was claimed to be observed is that the cocoa increased blood flow to the dentate gyrus region of the hippocampus in the brain, which is thought to have a role in memory handling. In the trial, the high cocoa group did better at a memory test where they had to remember whether a shape they were shown was one of 40 they had just seen in a sequence. But they didn't do any better in a test where they had to recall words from a list, 60 minutes after three attempts to learn it.

Another oddity of the trial is that no improvement was found in those who took exercise, even though a previous trial by the same experimenters, with a different subject group, had found a benefit from exercise. This doesn't rule out the findings, but it does emphasise the need to repeat the trial, several times and with bigger groups. Oh, and although the authors declared no personal interest, it wasn't strongly flagged up that the study was funded by the Mars chocolate company.

There seems to be some evidence here that this cocoa-sourced substance might help with the short-term recognition of shapes, which is something we get worse at as we get older. This can't be a bad thing if true. But it isn't a miracle cure for the way that ageing affects our memories, and taken on its own, this trial is not enough even to demonstrate that.

Without doubt it raises the question: should the media be reporting this kind of trial in the way they do, or should they wait until there is enough evidence to make a clearer statement? I'm not saying they should conceal the trial: the more reporting of science, the better. But it could have had more provisos attached. We shouldn't be too harsh on Steve Wright's show, though. They, realistically, don't have time to read such a paper in full, and the way the findings were summarised left something to be desired too.

You can see the full paper at Nature Neuroscience, though you would need a subscription to read more than a summary. It is 'Enhancing dentate gyrus function with dietary flavanols improves cognition in older adults' by Adam M Brickman, Usman A Khan, Frank A Provenzano, Lok-Kin Yeung, Wendy Suzuki, Hagen Schroeter, Melanie Wall, Richard P Sloan and Scott A Small, Nature Neuroscience (2014), doi:10.1038/nn.3850