Friday, 27 February 2015

Is being an author the most desirable job in Britain?

An article in the Independent newspaper was boldly headlined with the news that 'the three most desirable jobs in Britain are author, librarian and academic.' The article begins 'Forget dreams of a glittering career in Premier League football...' Now as an author myself, I feel really thrilled that I've got the country's dream job. But as an author who likes to look at the numbers behind the headlines, I'm a little doubtful about the validity of this story.

The article was based on a YouGov survey of an impressively large 14,294 British adults, and the chart to the left was the combined outcome.

Well, author is certainly up top. But when you look through that list, there are (at least) two strange omissions, which really surprised me: no footballers and no pop stars. Isn't that odd?

So it's important to know exactly what YouGov asked - and being a responsible polling organization they give us all the information we need here.

There are a number of fiddly pollster manipulations in the numbers. The sample is weighted, most notably giving less weight to under-forties, though it's not a particularly heavy change. And rather than being shown all the options, respondents only saw a random sample of 8 jobs, so each particular job was only offered to between 3,643 and 3,797 people - still quite chunky samples.

But there were two particularly interesting points. The data confirm that neither footballer nor pop star was offered - and as far as I can see, one thing YouGov don't say is how they chose the list. (I have asked YouGov to explain their selection criteria, but they are yet to reply.)

And here's the other thing. The respondents weren't asked 'what would you most like to do?' They were asked 'Generally speaking, please say whether you would or would not like to do each of the following for a living,' for each of the eight options.

So what the survey actually says is this:
We asked people, of the jobs offered (which don't include some of the most common aspirations, particularly of young people), which they would like to do and which they would not like to do. 'Author' is the job that most people said they would like to do.
Note that this doesn't mean that anyone most wanted to be an author from the selections offered. It could have been everyone's fifth choice, say. Just that more people included it in their 'would like to do' list than any other item from the list.
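As it happens, the survey design is easy to mimic. Below is a minimal sketch in Python - the job list and the 'would like to do' probabilities are entirely invented (the real list was evidently longer, given YouGov's per-job samples of under 3,800 from 14,294 people), but it shows the mechanics: every job gets its own subsample, every answer is a simple yes/no, and a job can top the table without being anyone's first choice.

    import random

    rng = random.Random(42)

    # Invented jobs and invented 'would like to do this for a living'
    # probabilities - purely illustrative, not YouGov's data.
    appeal = {'author': 0.60, 'librarian': 0.54, 'academic': 0.51,
              'journalist': 0.35, 'teacher': 0.40, 'lawyer': 0.30,
              'banker': 0.25, 'plumber': 0.20, 'MP': 0.18,
              'estate agent': 0.15}

    would = dict.fromkeys(appeal, 0)   # 'would like to do' counts
    shown = dict.fromkeys(appeal, 0)   # how many people saw each job

    for person in range(14294):
        # Each respondent is only shown a random 8 of the jobs
        for job in rng.sample(sorted(appeal), 8):
            shown[job] += 1
            if rng.random() < appeal[job]:
                would[job] += 1

    for job in sorted(appeal, key=lambda j: would[j] / shown[j], reverse=True):
        pct = 100 * would[job] / shown[job]
        print('%-12s %3.0f%% of %5d asked' % (job, pct, shown[job]))

Note that nothing in the loop ever asks anyone to rank the jobs - which is exactly why the published table can't tell you what anybody most wanted to be.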

I don't want to knock the poll too vigorously - but there's quite a leap from what it actually measured to how the results have been portrayed in the media.

Thursday, 26 February 2015

Structural alterations

The British physicist and astronomer Arthur Eddington was a great science populariser who came up with a lovely comment when writing about quantum mechanics in the late 1920s.

He wrote that rather than cover the theory as it stood, he really ought to 'nail up over the door of the new quantum theory a notice "Structural alterations in progress - No admittance except on business". And particularly to warn the doorkeeper to keep out prying philosophers.'

I don't think I've seen a more brilliant summary of the transformation quantum physics went through from its early implementation - a transformation that brought into the field what many physicists would continue to consider far too much agonising over interpretation and philosophy. I think it's fair to say that the notice has since come down, but whether those philosophers should have been allowed in is a different matter.

Tuesday, 24 February 2015

Apparently authors can't advertise on Facebook

Like many authors I have a toe in social media - not just this blog (and the associated Google+), but Twitter and Facebook (and LinkedIn) too. I do have some useful social interaction on Facebook, but my Facebook page is dedicated to business - in my case, letting people know about science, writing and my books.

Fair enough, and Facebook positively encourages this, providing opportunities to advertise both your page and specific posts to interested parties. I've never bothered with this - I do a bit of Google advertising in the vain hope that it will push up visibility in the search listings, but Facebook advertising seems like money down the drain. However, the other day I had a post I felt would benefit from a wider audience, so I thought I'd invest the price of a cup of coffee in a couple of days' promotion.

Off it duly went to the Facebook censors... only to be rejected fairly smartly because it 'breached guidelines'. Apparently, the image in my 'advert' had too great a percentage of its area taken up with words. Now, bear in mind I hadn't designed an ad - all I did was try to promote a post that pointed to my blogs, and Facebook had automatically picked up the image from the header of the blog. So the 'advert' looked like this:


Bear in mind, too, that I didn't choose what that image was - Facebook did. And when you think about it, any advertising showing a book's blog header, or a book's cover, is liable to have a lot of writing on it. That's what books do.

Get your act together, Facebook! (If you want to see the post about chocolate it was referring to, you can see it here.)



Saturday, 21 February 2015

You say embargo, I say lumbago

One of the fun things (well, it's sometimes fun) about my job is that I get sent interesting books to review, which I sometimes do for magazines and newspapers, but most of my reviews either appear here on my blog (if it's not a science book) or on my www.popularscience.co.uk website.

When a book arrives from a publisher, it is accompanied by an information sheet/press release. The bright-eyed and bushy-tailed view of these is that they provide useful information for the editor or review writer. (The cynical view is that they provide nice words about the book that some lazy hacks will just reproduce, in classic press release journalism. But I'm not cynical.)

I must confess, I rarely give these more than a quick glance before reading the book. Yes, I do read every book I review, almost always cover to cover, with the exception of books where I decide that my review would be so nasty that I really shouldn't do it - and usually the publisher agrees this is a good move. I don't want the publicists' words to influence my thinking about books - I want to come to each book with the same information that a casual purchaser would have.

There are really only two significant things I check - the email address of the publicist, almost inevitably down the bottom, so I can let them know the review has gone live, and the publication date, because I don't want to review a book months before publication, and sometimes I get sent them ridiculously early. (When a book is very early, it is usually a bound proof, rather than a real book, which I don't like. The only possible excuse for this is if the publisher wants me to write a nice couple of lines to go on the back of the book, otherwise they are the devil's spawn.)

When I do glance at the publication date, just occasionally I will see something like this:
From a real book information sheet
(Publisher's name hidden to conceal the guilty)
The book comes out on 5 March... but I'm not allowed to write about it until 2 March. This is an example of the dreaded press embargo. Sometimes these have an obvious point. When, for instance, the shortlist for a book prize is going out to the press, you don't want it published before the date the list is announced. But it's a bit more complicated when it comes to book reviews - and it's not clear what the best approach is.

The idea of an embargo is that readers get all the publicity at about the time the book launches, so it's fresh in people's minds, and I can see that's a good thing. But on the other hand, perhaps it's good to build up the awareness a bit earlier? Perhaps this date doesn't fit well with my publishing schedule?

It was the particular case illustrated above that gave me a bit of a pain in the backside - I wrote the review on Saturday intending to go live with it this weekend... and now I've got to sit on it for a couple of weeks. (The review. Not my backside. Well, not the whole two weeks.)

So... I can see why they do it. It sort of makes sense to get a flurry of activity and awareness around the time the book goes on sale (though if that's really what's wanted, why not embargo it until publication date?). But I honestly don't know if it works in a marketing sense. I'd be interested to hear if anyone has done research on this. And when most books are listed on Amazon for pre-order months before they are available, I do wonder if the embargo is a concept from a different era.

Friday, 20 February 2015

Two weird quantum concepts

Quantum physics is famous for its strangeness. As the great Richard Feynman once said about the part of quantum theory that deals with the interactions of light and matter particles, quantum electrodynamics:
I’m going to describe to you how Nature is – and if you don’t like it, that’s going to get in the way of your understanding it… The theory of quantum electrodynamics describes Nature as absurd from the point of view of common sense. And it agrees fully with experiment. So I hope you can accept Nature as she is – absurd.
It's interesting to compare two of the strangest concepts to be associated with quantum physics - Dirac's negative energy sea and the 'many worlds' interpretation. Each strains our acceptance, but both have had their ardent supporters.

Dirac's 'sea' emerges from his equation which describes the behaviour of the electron as a quantum particle that is subject to relativistic effects. The English physicist Paul Dirac discovered that his equation, which fits experimental observation beautifully, could not hold without one really weird implication. We are used to electrons occupying different quantised energy levels. This is bread and butter quantum theory. But all those levels are positive. Dirac's equation required there also to be a matching set of negative energy levels.
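The root of the problem is worth a line of maths. The relativistic relationship between energy and momentum has two square roots, and Dirac's equation, unlike classical mechanics, gives no licence to throw the negative one away:

    E = \pm\sqrt{p^2c^2 + m^2c^4}

In classical physics the negative root can simply be declared unphysical and ignored; in a quantum theory, where jumps between any allowed levels are possible, it can't be.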

This caused confusion, doubt and in some cases rage. Such levels had never been observed. And if they were there, you would expect electrons to plunge down into them, emitting radiation as they went. Nothing would be stable. As a mind-boggling patch, Dirac suggested that while these levels existed, they were already full of electrons. So every electron we observe would be supported by an infinite tower of electrons, all combining to fill space with his 'Dirac sea'.

As you might expect, a good number of physicists were not impressed by this concept. But Dirac stuck with it and examined the implications. Sometimes you would expect that an electron in the sea would absorb energy and jump to a higher, positive level - leaving behind a hole in the negative energy sea. Dirac reasoned that such an absence of a negatively charged, negative energy electron would be the same as the presence of a positively charged, positive energy anti-electron. If his sea existed, there should be some anti-electrons out there, which would be able to combine with a conventional electron - as the electron filled the hole - giving off a zap of energy as photons.
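The bookkeeping behind that reasoning is pleasingly simple: removing a charge of -e and an energy of -E from the sea changes its totals by +e and +E, exactly as if a positive particle had been added. And when an ordinary electron drops back into the hole, the pair's rest energy goes off as photons:

    e^- + e^+ \rightarrow \gamma + \gamma \qquad (E \geq 2m_ec^2 \approx 1.022\ \mathrm{MeV})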

It took quite a while, but in the early cloud chambers that were used to study cosmic rays it was discovered that a particle sometimes formed that seemed identical to an electron, except for having a positive charge - the positron, or anti-electron.

Weird though it was, Dirac's concept was able to predict a detectable outcome and moved forward our understanding of physics. As it happens, with time it proved possible to formulate quantum field theory in such a way that the positron was a true particle and the need for the sea was removed, although it remains as an alternative way of thinking about electrons that has proved useful in solid state electronics.

The 'many worlds' hypothesis originated in the late 1950s with the American physicist Hugh Everett. Its aim is to remove the awkward divide between the probabilistic quantum world and the 'real' things we see around us, which seem not to share the same flighty behaviour. Everett didn't like the then dominant 'Copenhagen interpretation' (variants of which are still relatively common), which said that a quantum particle would cease behaving in a weird quantum fashion and 'collapse' to having a particular value when it was 'observed'. This concept gave a lot of physicists problems, especially when it was assumed that this 'observation' had to be by a conscious being, rather than simply an interaction with other particles.

Like the Dirac sea, 'many worlds' patches up a problem with a drastic-sounding solution. In 'many worlds', the system being observed and the observer are considered as a whole. After an event that the Copenhagen interpretation would regard as a collapse, 'many worlds' effectively has a universe that combines both possible states, each with its own version of the observer. So, in effect, the process means that the universe doubles in complexity each time such a quantum event occurs, becoming a massively complex tree of possibilities.
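'Massively complex' is, if anything, an understatement. If each such event splits one branch into two, the count is simple exponential growth:

    N(n) = 2^n \qquad \text{so, for example,} \quad 2^{300} \approx 2\times10^{90}

A few hundred binary quantum events - a tiny fraction of what happens in any real object in any second - already gives more branches than there are atoms in the observable universe (roughly 10^80).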

Some physicists like losing the need for anything like the odd 'collapse', along with the distinction between small scale and large - others find the whole thing baroque in its complexity. What would help is if 'many worlds' could come up with its equivalent of antimatter - a prediction of something measurable and detectable that emerges from it but not from other interpretations. As yet this hasn't happened. Whether or not you accept 'many worlds', it is certainly a remarkable example of the kind of thinking needed to get your head around quantum physics.

Thursday, 19 February 2015

Top Gear forgets the number one rule of hybrids

On Top Gear last weekend, Jeremy Clarkson drove the rather lovely looking BMW i8 hybrid, and decided he'd rather drive it than the sporty BMW M3, as the i8 has great performance and the manufacturer claims you can get over 100 miles to the gallon - truly a win-win for greenness and petrolheads simultaneously.

However, in his excitement at driving the thing, Clarkson forgot the number one rule of hybrids, established, in part, by Top Gear. This is that hybrids are only more fuel efficient than ordinary cars in urban driving. They use more fuel than an equivalent petrol car (let alone a diesel) on motorways and country driving. Both Top Gear and rather more reliable testers have shown in the past that a BMW 3 series (sorry it's so BMW weighted - I have no affection for the things) uses less fuel than a Toyota Prius when driving outside towns. But here's the green rub (my Grandma used to have some of that) - short drives in town are exactly the conditions when a pure electric is superb. So even if all your driving is 10 mile urban tripettes, a hybrid isn't the greenest option.

It turns out that the i8, on Clarkson's 400 mile round trip, averaged around 30 mpg - these days even something pretty sporty can manage that, while I'd expect a good midrange vehicle to manage something in the 50-70 range. Now, add in the fact that building a hybrid is vastly less green than building an ordinary car, typically doubling its environmental impact, and we see once again that hybrids have no place in the green driver's vocabulary. If most of your driving is short range urban, go electric (you can always hire for the long trips). If not, go for a low consumption standard car until electric technology has improved enough to give them a practical range for longer journeys. But hybrids aren't the answer.
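To put some numbers on the mpg gap, take Clarkson's 400-mile round trip and compare the fuel actually burned:

    \frac{400\ \text{miles}}{30\ \text{mpg}} \approx 13.3\ \text{gallons} \qquad \frac{400\ \text{miles}}{60\ \text{mpg}} \approx 6.7\ \text{gallons} \qquad \frac{400\ \text{miles}}{100\ \text{mpg}} = 4\ \text{gallons}

So on that journey the i8 used twice the fuel of a decent conventional car, and well over three times what the manufacturer's headline figure would suggest.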

This has been a green heretic production.

Image from Wikipedia - click for attribution details

Wednesday, 18 February 2015

Light in treacle

I was in Waterstones, Piccadilly in London yesterday and was rather pleased to see not only that I had been allocated my own mini-section (see photo), but also that they had the new version of my book Light Years on one of the tables used to grab people's attention. Light was one of the first topics I wrote about, and it has always fascinated me.

A key characteristic of light is its dramatic speed - the universal speed limit when in a vacuum and not cheating by warping space or similar - but something I cover is the experiments Lene Hau did a few years ago, bringing light to a walking pace. I just wanted to share an extract here.

Nearly 80 years after the theory [of Bose-Einstein condensates] was developed, a Danish scientist has used a Bose–Einstein condensate to drag the speed of light back to a crawl. Her name is Lene Vestergaard Hau. In 1998, Hau’s team set up an experiment where two lasers were blasted through the centre of a vessel containing sodium atoms that had been cooled to form a Bose–Einstein condensate. 
Normally the condensate would be totally opaque, but the first laser creates a sort of ladder through the condensate that the second light beam can claw its way along – at vastly reduced speeds. Initially light was measured travelling at around 17 metres per second – 20 million times slower than normal. Within a year, Hau and her team, working at Edwin Land’s Rowland Institute for Science at Harvard University, had pushed the speed down to below a metre per second – and more was to follow, as we will discover later... 
Lene Hau’s team have not stood still since they originally slowed light to a crawl, despite accidental sabotage by a German TV team. The strange possibilities of quantum light experiments quite often attract media attention, but a modern lab is visually boring. One set of black boxes looks much like any other. The TV team decided that they could make Hau’s experiments look more impressive by bringing in a smoke machine to make the interlacing patterns of lasers visible. Unfortunately they didn’t ask permission to do this. The result was a total collapse of the experiment, which had to be shut down for days until the air could be cleared. Now a plastic curtain surrounds the table that houses the experiment to keep out interfering onlookers. 
As we saw in the first chapter, Hau’s first experiments used one laser to form a sort of ladder through the otherwise opaque Bose–Einstein condensate that allowed a second laser to claw its way through. But if that first laser, called the coupling laser, is gradually decreased in power, the team found that the second beam was swallowed up in the material. The result is a strange mix of matter and light, called a dark state. The trapped light only comes out again when the coupling laser is restarted.
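As a quick sanity check on the '20 million times slower' figure in that extract:

    \frac{c}{v} = \frac{3.0\times10^{8}\ \text{m/s}}{17\ \text{m/s}} \approx 1.8\times10^{7}

That's around 18 million, so 20 million is fair as a round number.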
There's far more about the history of our understanding of light from ancient times through to the latest quantum theory in Light Years.





Tuesday, 17 February 2015

Time for open book exams?

Reading Steve Caplan's interesting piece on cheating, I was reminded of two very different types of exam I did in my youth. (Thankfully I haven't done an exam in over 30 years and have no intention to start now.)

The first type was the traditional horror exam, where you might be tested on your expertise, but you only got a chance to use it if you could remember a whole pile of facts. And I still occasionally get nightmares where I am in exams and can't remember this or that formula.

The other type was pretty much the last exam I ever took, on my OR course at Lancaster. Called a 'jumbo', it was a six-hour-ish exam with a single question (though admittedly that question was a good few pages long). You could take in whatever books you wanted - and go out and get more if you wanted. Not only was it far more interesting to do than a traditional exam, I believe it told you far more about the candidate than any ordinary test.

I really can't see any reason exams should test memory. Surely they should be about understanding and what you can do with the equations (or history dates or whatever)? I think this also fits very well with the RSA's alternative school curriculum, which is all about giving students the tools to research and work, rather than remembering lots of facts. 

How about it, educationalistas? Can we move to a better way?

Monday, 16 February 2015

What did Descartes do for science?

Actually Lancaster University one Rag Week (infra-red shot)
According to Monty Python's Philosophers' Song, sung by the Bruces at the University of Wallamaloo* (see below for the real thing):
Rene Descartes was a drunken fart: 'I drink, therefore I am'
However, Descartes tends to be held up as a scientist just as much as a philosopher. In Steven Weinberg's book To Explain the World, which I've just reviewed, the author points out that while we owe a lot to Descartes' maths for providing the mapping between geometry and algebra, his thinking on the philosophy of science was more than a little shaky.

Specifically, Weinberg shows how Descartes, in his best bit of pure science - explaining how rainbows are seen at the angle they are - totally ignores his own method for 'Rightly Conducting One's Reason and of Seeking Truth in the Sciences'. According to this, we should be highly doubtful about information that is derived from authority or our senses, but should instead rely on the power of reason alone. This was a kind of hybrid of Ancient Greek thinking and modern - he dropped the importance of authority and rejected teleology (where things are assumed to be the way they are because they are fulfilling a purpose), but he still wanted observation and experiment kept to a minimum, merely providing a spot of data to be worked on by the power of sheer thought.

In practice, when working on the rainbow, he totally ignored his method and did something much closer to modern science. He guessed a mechanism and used that to work out the angles of incidence and refraction that would occur in a raindrop, finding they came close to what's observed. Then he did an experiment with an artificial 'raindrop' in the form of a globe filled with water and showed that the observed angles matched his predictions.

So, interestingly, at least once Descartes 'did' science in an effective manner, yet his philosophy of how science ought to be conducted was fatally flawed.

And philosophers wonder why scientists are sometimes a bit sniffy about their subject.

* I know this isn't the correct spelling, but it matches the picture

Friday, 13 February 2015

Can a fact be a stereotype?

Despite its theoretical veneer of objectivity, science - and even more so, writing about science - is subject to the cultural mores of the day. I discovered this recently when I had to modify a piece of text because what I wrote was seen as perpetuating a stereotype. I'll come back to the specifics, but this does raise a rather more important question than the issue at stake: is it acceptable to perpetuate a stereotype if it's true?

I suppose the classic example from the history of science is the way that people with different ethnic backgrounds scored in relatively predictable ways in IQ tests. Here the stereotype, which definitely isn't true, was that people of a particular ethnic background were more intelligent than others. However, this wasn't what the test actually showed. What it showed was that people of a particular ethnic background were better at doing IQ tests than others. This definitely was true, but some still considered it unacceptable, seeing it as the application of a racial stereotype. It wasn't. All it was saying is that the test was designed with a certain cultural background in mind, and someone with that background would do better. The danger here is the knee-jerk response that if a statement fits a group that is often stereotyped, then that statement must be false, offensive, lazy and disgusting.

I ought really to take one step back and ask what a stereotype is. According to my dictionary it is 'A preconceived and oversimplified idea of the characteristics which typify a person, situation, etc.; an attitude based on such a preconception.' So, by implication, a stereotype can't be totally true, because it is oversimplified. It would be a stereotype, then, to say that all people with naturally red hair (to choose a minority that a) I am a member of and b) the defenders of minorities don't care about) burn easily in the sun. I would suggest, however, that it would not be a stereotype to say that most people with naturally red hair burn easily in the sun, because there are good genetic reasons why this is likely to be the case, and because there is reasonable experiential evidence for it to be true.

In the case of my correction, I had said that chocolate seems to have a particular appeal for female consumers. Now, I would say that similarly, it would be a stereotype to say 'all women love chocolate', but that to say 'seems to have a particular appeal' is not a stereotype, because it describes the appearance - and I would challenge anyone to give me evidence that this is not the case. As a simple example, in last week's episode of Broadchurch, Olivia Colman's character said to her son 'I love you more than chocolate.' There's an obvious implication there.

So the stereotype would be 'all women love chocolate' or 'women love chocolate more than men do.' But I think it is entirely factual to suggest that women express the appeal of chocolate more than men do, just as, for instance, more men express the appeal of football than women do. The whole point is that this is not a generalisation. There are men who don't like football - I'm one. For that matter, I have a male friend who goes on about his liking for chocolate. But we are exceptions. To say that expressing the particular appeal of football is primarily a male activity, just as making comments like Colman's character's about chocolate is primarily a female activity, is not a stereotype; it is a fact.

So then we have to ask, is it okay to suppress facts because we don't like them? Well, it's certainly not a scientific thing to do. It can be a social thing to do - it's called a white lie, and I can only think this is why some people get het up about this kind of thing - but I'm really not sure that white lies have a place in science.

Thursday, 12 February 2015

That name sounds funny - I'll change it

For hundreds of years it has been the norm to give names a tweak if they sounded odd in the language being used - particularly names of people and places. So for a long time, when Latin was the go-to language of Europe, it was the norm to Latinise people's names. We now find a lot of these fiddly and they have been discarded, but some still remain - Jesus and Copernicus, to name but two.

Medieval scholars also struggled with Arabic names, which became essential when Europe was regaining its interest in science, largely spurred on by the writings of Arabic scientists and their translations of Greek books. So, for instance, Abū l-Walīd Muḥammad Ibn ʾAḥmad Ibn Rušd, or Ibn Rušd for short (whose name inspired Salman Rushdie's surname), somehow became Averroës, while Abu Mūsā Jābir ibn Hayyān became Geber.

However, the most lasting and interestingly nuanced is our current approach to place names. Traditionally we gave a name to a place, and that was its English name, even if its 'real' name subsequently changed. But now we try harder to keep up... except where we don't. So we have gone along with the change from Ceylon to Sri Lanka, and we've even tried to keep up with as arbitrary a change as an update of transliteration from some languages that don't use the same characters as us. So, for instance, Peking has become Beijing and Mao Tse Tung turns into Mao Zedong. Yet somehow what probably should be transliterated Moskva stubbornly remains Moscow.

In Europe, we are impressively confused over what to do. Most French places we seem to cope with, but we can't bring ourselves to say Paris the way we should. In Germany, Köln is still usually Cologne, and in Italy we shun Roma (it's not difficult, guys), Firenze and Venezia for Rome, Florence and Venice.

However, we are most confused of all when it comes to Wales. Here, we simply can't make our minds up, so end up with dual names for many places, causing strife, confusion and, I'm sure, road accidents, as people have to cope with twice the number of words while working out which to apply. Personally I love the Welsh names, and I'd rather we simply dropped the anglicised versions. Who wouldn't prefer Y Trallwng to Welshpool? Once you learn the basics of Welsh pronunciation (and I'm the first to admit, mine is rudimentary as it's mostly picked up from non-Welsh speakers) there's far more fun to be had with Caerdydd than Cardiff. What I wish, though, is that people would use one or the other. It really irritates me, for instance, when people use the Welsh pronunciation of, say, Aberystwyth (as happened on University Challenge), but don't then go on to use Welsh place names for towns that have dual designation.

So, frankly, names are a mess. We've gradually moved away from Latinisation and Anglicisation, but only to a point. I know we don't have a body that sorts out English, it just sort of evolves. But I wish that evolution could get a move on and head in one direction or the other, rather than taking its current drunkard's walk.

(Next week, we drag astronomers, kicking and screaming into using SI units.)

Wednesday, 11 February 2015

A dark day for Huddersfield

As someone brought up in one of the Pennine towns, I know Huddersfield reasonably well and have always thought of it as, frankly, a bit of a dump. So I was pleasantly surprised a couple of years ago when I accompanied one of my daughters on a visit as a potential student at the University of Huddersfield. Its compact campus is a really well designed, pleasant environment. And, I mean, it has Patrick Stewart as Chancellor!

However, I am decidedly concerned about a press release I received from them. It tells us 'Following a pilot study in Huddersfield, researchers feel that Reiki, as a complementary therapy, should be available to cancer sufferers on the NHS.' Hmm.

Here's what I say about reiki (I can't see why it deserves a capital letter) in Science for Life:
Like acupuncture, reiki claims to use energies unknown and undetectable to science in its cures, but where acupuncture depends on the inner human energy of ch’i, reiki, which was devised in Japan in the early years of the 20th century, makes use of something more like ‘the Force’ in Star Wars – an external universal energy which is supposed to be channelled by the healer’s hands into the body of the person being treated.
     The only positive trials of reiki seem to be those where no controls were imposed – the treatment was not compared with a placebo, and so a natural sense of wellbeing resulted from a belief in its effectiveness. Although reiki can do no harm, there is always the danger if it is used instead of a functional treatment in the case of serious illness that the individual’s health will get worse as a result of not being properly treated.
     If you want a placebo-based treatment, over-the-counter homeopathy is a cheaper way to go.
Now, to be fair to Huddersfield, they aren't claiming reiki can cure cancer (this would be illegal), but they are saying that it can make sufferers feel better. Dr Serena McCluskey, a Senior Research Fellow in the University’s Centre for Applied Psychological and Health Research, said 'Acupuncture and other techniques that were regarded as quite unorthodox are prescribed on the NHS, so we just thought that more research on Reiki was needed. We are not suggesting that we can establish scientific effectiveness, but we are adding to the body of evidence for the quality of life benefits it has for women with cancer.'

Hmm again. The research was done by Dr McCluskey and an ex-colleague, Dr Maxine Stead, who is a 'Reiki master' and 'now the owner of a holistic health spa in Huddersfield'. Triple hmm. No possibilities of vested interests here, then.

What did the research consist of? Over the course of a year, the researchers conducted detailed interviews with ten women who had received reiki therapy. They 'discovered benefits such as a release of emotional strain, “a clearing of the mind from cancer” and feelings of inner peace and relaxation.'

So it's a tiny trial using subjective interviews, and they discovered, surprise, surprise, that when these patients received a lot of attention they felt better in themselves. But this is a classic placebo reaction. There was no attempt here to control the trial. No blinding. No attempt to compare with and without reiki. No suggestion that the same effects could be achieved without paying the fees of 'Reiki masters' by buying some ten-a-penny sugar pills. No consideration of the morality of placebo-as-treatment. (It may be justified, but it at least needs considering, and if it is to be done, it should be low cost and avoid encouraging misunderstanding of science.)

Frankly, this isn't science at all, it's effectively advertising for an alternative therapy. And the University of Huddersfield has seriously declined in my estimation.

Tuesday, 10 February 2015

In search of the quadrilemma

I've just read for review Amir Aczel's book Finding Zero. A lot of the book is concerned with his challenging attempt to track down a Cambodian inscribed stone that bears what is thought to be the oldest zero so far discovered. But along the way, he speculates on the differences between Western and Eastern approaches to thought that could have led to the invention of the mathematical zero.

Specifically he points out that traditional Western logic is very much binary - something is either true or it isn't. There are two options. But the Eastern equivalent, he suggests, which sometimes goes by the name of the quadrilemma, has four options: true, false, both and neither.

Now, on shallow observation, the 'both' and 'neither' options might seem like wishy-washy useless philosophical musings. And in some cases they are. But in fact they do sometimes make sense and are, in fact, also present in Western thinking - we just don't emphasise them as much as they might be emphasised in Eastern cultures.

So, for instance, Aristotle, when discussing infinity, described it as 'potential'. And to illustrate what he meant he used the example of the Olympic Games. If a little green man came down in a flying saucer (that bit is my addition to the illustration) and asked you 'Do the Olympic Games exist?' then I think you would say 'Yes.' But if he then asked 'Can you show me these Olympic Games of which you speak?' your answer would be 'No.' Aristotle's concept of potential is, I would suggest, pretty much identical to the third possibility in the quadrilemma - it is something which is both true and false.

How about something that is neither true nor false? Now here I would say I diverge from the Eastern approach, because while the third and fourth possibilities are distinct - so there is another, different case, which I'll illustrate in a moment - I can't say for certain which way round they are to be applied. But my final and distinct possibility is something that is imaginary (not in the mathematical sense, but literally) or fictional. Does a fictional character or an imaginary notion exist? Well, no. But on the other hand, its existence isn't really false either, because we talk about them, think about them - and they make things happen. Yet this isn't the same as a potential, because a potential definitely can be, but isn't.
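For the programmers in the audience, the quadrilemma maps quite naturally onto a four-valued type - computer science has essentially the same structure in Belnap's four-valued logic. Here's a minimal sketch (the example classifications are mine, not Aczel's):

    from enum import Enum

    class Quad(Enum):
        TRUE = 'true'
        FALSE = 'false'
        BOTH = 'both'        # Aristotle's 'potential' Olympic Games
        NEITHER = 'neither'  # the fictional and the imaginary

    # In Belnap's logic, negation swaps TRUE and FALSE, while BOTH and
    # NEITHER are each their own negation.
    NOT = {Quad.TRUE: Quad.FALSE, Quad.FALSE: Quad.TRUE,
           Quad.BOTH: Quad.BOTH, Quad.NEITHER: Quad.NEITHER}

    claims = {'The Olympic Games exist': Quad.BOTH,
              'Sherlock Holmes exists': Quad.NEITHER,
              'Zero is a number': Quad.TRUE}

    for claim, value in claims.items():
        print('%s: %s (negation: %s)' % (claim, value.name, NOT[value].name))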

Who said philosophy wasn't fun?


Monday, 9 February 2015

The Museum of the Future - Review

I was a little bit wary of this collection of short stories, as it has the look of being self-published even though it isn't (specifically the paragraphs have gaps between them, like this blog, rather than the indented start you always see in a 'real' book) - but I needn't have worried because this isn't reflected in the content.

My suspicion is that these are Marmite stories - you'll love them if you like period writing. The first, for instance, is (intentionally) in the style of the wonderful M. R. James' Victorian ghost/horror stories, and several others adopt a Victorian style. I can't say every story worked for me - but that's true of pretty much every short story collection I've read, even those by masters of the art like Ray Bradbury, Gene Wolfe and Neil Gaiman. The ones that did work were genuinely engaging and intriguing.

Andrew May gives us a mix of science fiction, fantasy and mild horror. I think this is a collection that works best for those who are well versed in these genres, as you will get the references and the cleverness of stories like The Call of Cool-o, which is an H. P. Lovecraft style plot written in the style of Philip K. Dick, or The Museum of the Future, which is essentially the same story (each set in 2012), told as the story would have been if written in 1912, 1932, 1952, 1972 and 1992, employing the styles and recurring themes common in each period. Quite often 'Fortean' themes are explored, something of a speciality of May's, using a story of real life weirdness* as a setting for the fiction.

If I had to pick out a favourite it would be the relatively long story (many are only a few pages) A Case for Crane, which sees the main character watching a 1970s US crime drama, which he gets pulled into at various levels, sometimes inhabiting the mind of the character, sometimes the actor playing the character and sometimes the author, giving a strange and mind-twisting meta-view of the story. This sounds messy, but actually works really well. And I'm a sucker for period tales set in Oxford or Cambridge, of which there are several, though I should point out that all the best Cambridge colleges have a Senior Combination Room, not a Senior Common Room as mentioned in The Rendelsham Magi, with its unusual twist on the star of Bethlehem.

May's writing is assured and mostly enjoyable. His only weakness, I'd say, is that his female characters (when there are any) are straight out of the 1940s - either mousy and slightly plump or dominating Amazons. This is a very male-oriented collection of stories.

It's a mark of the effectiveness of the storytelling that there were several where I wanted to find out more, to have the story taken further. What's more, I took the book with me to read on the train, intending to put it aside for the important physics book I should have been reading last night, but instead I read on. If you like genuinely inventive and interesting short fiction, often with a period feel, it's well worth a try. You can find it on Amazon.co.uk and Amazon.com (I'd go for the Kindle edition if you use ebooks, as it's significantly cheaper, and I think short stories are amongst the best things to read in this format).

* Well, as real as you might consider the likes of a haunting or the Loch Ness Monster to be. (Not that Nessie is featured, but it's that kind of thing.)

Thursday, 5 February 2015

Remembering Adventure

I recently posted this piece of text on Facebook and asked who remembered it.
YOU ARE STANDING AT THE END OF A ROAD BEFORE A SMALL BRICK BUILDING. AROUND YOU IS A FOREST. A SMALL STREAM FLOWS OUT OF THE BUILDING AND DOWN A GULLEY.
Given I have pretty geeky friends on FB, I expected most would spot it immediately, but many didn't. Which gives me the opportunity to pop in a little extract from my upcoming book, Ten Billion Tomorrows, which looks at how science and science fiction have influenced each other (and how science fiction really isn't about predicting the future, yet manages to shape it). I had great fun writing it. If you think it'll float your boat, it's out in December and you can already pre-order it on Amazon.co.uk and Amazon.com.

So here we go with the extract:

By coincidence, 1976 was also the year when a true computer-based virtual world came to life. It was then that American computer engineer Will Crowther, who was working on ARPANET at the time, had an idea that would capture the hearts and minds of computer enthusiasts – me included. Crowther was a fan of the fantasy pen and paper role-play game Dungeons and Dragons, which came out in 1974, and was also having family issues at the time. A caver, he had mapped out a real cave system on an early computer, and used the idea (if not the actual map) as an inspiration when he wanted to have something to play with his daughters after his marriage break-up. So he put together a simple game that used the computer’s ability to respond to a series of text commands to build a virtual world, which would later be improved on by graduate student Don Woods, who strengthened the fantasy elements in the game.

When playing, the early gamers would be told their position in a series of linked caves and could ask to move in different directions. They might discover swords or treasure – or, for that matter, deadly monsters – all of which were summed up with a few, tightly conceived words. Any pictures were in the imagination of the player. Crowther called his game Adventure, set in Colossal Cave. This wasn’t the first game to make use of computer power. The stand-alone tennis game Pong, running on TV sets with a simple computerized controller to produce the signal, came onto the market in 1972. But Crowther’s Adventure was the first computerized adventure game (giving the name to the genre).

In 1976 I moved from the bustling world of Cambridge to the isolated campus of Lancaster University in the north of England to take my masters degree. Until then, my use of computers had been limited to running simple programs written in languages like Fortran on a stack of punched cards. But Lancaster had a secret computing weapon in George 3, a computer operating system running on an already antiquated 1900 series mainframe from the now long-defunct ICL company, an operating system that transformed the way that the user interacted with the computer. Instead of punching a set of cards, feeding them through a reader and waiting for the output to churn out on a line printer, users communicated with George 3 using teletypes, electronic typewriters that could both take input from a keyboard and respond by typing controlled directly from the computer. Suddenly it was possible to have a conversation with a computer in real time.

I made a small amount of use of George 3 for my coursework. But by far the most frequent command I would type after logging in to the system was ADVENT - the command to run the game Adventure, which had been ported from its original implementation on a DEC PDP-10. I would play long into the night under the stark fluorescent lights of the computer lab, immersed in a world that could only be accessed via the imagination and the clattering print-head of the teletype:

YOU ARE STANDING AT THE END OF A ROAD BEFORE A SMALL BRICK BUILDING. AROUND YOU IS A FOREST. A SMALL STREAM FLOWS OUT OF THE BUILDING AND DOWN A GULLEY.

When in response I typed GO IN, the system would respond:

YOU ARE INSIDE A BUILDING. A WELL HOUSE FOR A LARGE SPRING.
THERE ARE SOME KEYS ON THE GROUND HERE.
THERE IS A SHINY BRASS LAMP NEARBY.
THERE IS FOOD HERE.
THERE IS A BOTTLE OF WATER HERE.

The adventure had begun. I genuinely can still feel the hairs standing up on my arms when I read those nostalgic words again. There had never been anything quite like it. It was a whole world that you interacted with through text – these text adventure games have become known as interactive fiction, and the name fits well. Perhaps most extraordinarily back then, it was a computer program that responded to ordinary words, rather than the terse instructions of a command language. The game would stay with me in 1977 when I moved to work at British Airways, where we had a PDP-10 and were able to play the original in all its glory. 
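For anyone curious how little machinery such a world needs, here is a minimal sketch of an Adventure-style command loop in Python. It is emphatically not Crowther's code (that was Fortran on the PDP-10) - just an illustration of the room/exit/item structure and the terse two-word parsing described above:

    rooms = {
        'road': {
            'desc': 'YOU ARE STANDING AT THE END OF A ROAD BEFORE A '
                    'SMALL BRICK BUILDING.',
            'exits': {'in': 'building'},
            'items': [],
        },
        'building': {
            'desc': 'YOU ARE INSIDE A BUILDING. A WELL HOUSE FOR A '
                    'LARGE SPRING.',
            'exits': {'out': 'road'},
            'items': ['keys', 'lamp', 'food', 'bottle'],
        },
    }

    here, carrying = 'road', []

    while True:
        room = rooms[here]
        print(room['desc'])
        for item in room['items']:
            print('THERE IS A %s HERE.' % item.upper())
        words = input('> ').lower().split()
        if not words:
            continue
        verb, noun = words[0], words[-1]
        if verb == 'quit':
            break
        elif verb in ('go', 'in', 'out'):   # crude two-word parser
            direction = noun if verb == 'go' else verb
            here = room['exits'].get(direction, here)
        elif verb in ('get', 'take') and noun in room['items']:
            room['items'].remove(noun)      # pick the item up
            carrying.append(noun)
            print('OK')
        else:
            print("I DON'T UNDERSTAND THAT.")

The real game of course had scores of rooms, treasures, magic words and a far richer vocabulary, but the skeleton - a table of rooms, a current location and a verb dispatch - really is this small.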

Wednesday, 4 February 2015

The Critic of Wolf Hall

I type this warily, with 'Tread softly because you tread on my dreams,' in mind. And I ought to say straight up that I am enjoying the BBC's adaptation of Wolf Hall. But. I can only assume that the fervent praise for it I see on social media is from people who have read and loved the books, and who are delighted to see what I gather is generally a very good adaptation on the screen.

As someone who hasn't read anything by Hilary Mantel (in fact I've hardly read anything by any Booker Prize winner, because with a few exceptions like William Golding, I really don't get anything from reading literary fiction except a sense of worthiness), I do think that the glowing praise needs to be balanced by a little negative criticism.

Before I do, I'll get some praise in. It's very well acted, the locations are excellent, and as someone who is fascinated by Tudorbethan times (mostly because it's my favourite period for music), there's a distinct thrill of thinking, when we first meet, say, Dr Ridley, 'I know what's going to happen to you!' It's a bit like playing god. But.

So here we go with a few bullets to the heart:

  • It's a bit dark. I don't mean ominous, I mean without enough lighting. Sometimes this works wonderfully. It's hard not to think when, for instance, you see Cardinal Wolsey glowing by candlelight against a murky backdrop, 'I now understand why paintings of the period look the way they do.' But I still have two problems. One is that I'm currently re-watching the X-Files, and I've always felt they spent far too much time wandering around in the dark by torchlight. Similarly, it seems a trifle overdone in Wolf Hall - just substitute candles for torches. The other problem is that eyes don't work the same way that TV cameras do. I think with the number of candles in some scenes, because the human eye is so good at working in low light (think how well you can see by moonlight), there wouldn't be so many dark voids - you would comfortably be able to see the whole room.
  • An awful lot of the scenes have the same format. Character spends a long time walking into a room. Character exchanges a few lines with another character. Character spends a long time walking out the room. I think the series could lose about an hour of walking and benefit from it. I get it that this isn't 24, and they want to be leisurely about it, but sometimes the pace verges on somnolence.
  • If you live in Wiltshire, it is hard not to spend quite a lot of time thinking, 'That's not Greenwich, that's Bowood House... if you go through that door you get to the gift shop' or whatever. This is, of course, an unfair complaint, as they had to film somewhere, but it's hard not to get distracted by it. Oh, and I did think they could have spruced up some of the stonework, which looked as if it were over 400 years old, rather than newish.
  • Finally, I hope they'll get a bit more variety in the period music. We're only two episodes in and we've heard the mournful-sounding tune Ah, Robin three times now. Admittedly it's quite appropriate - performed as it should be, as a part song, it's essentially about two friends discussing their mistresses - and it's a piece I'm very fond of, but even so this was a very rich period musically.
So there we have it. I like it, but I can't get as excited as everyone else seems to be. I shall now retire to my bomb shelter and await the assault.

Tuesday, 3 February 2015

Are people from London and the South East physics dullards?

All together now: 'Maybe it's because he's not a Londoner, that he's a physics great...'
While walking the dog yesterday I got to thinking about Isaac Newton (the way you do) and, from him, of the other great physicists in British history. And it started me thinking that London and the South East are rather under-represented.

As a little experiment, I've listed all British Nobel Prize in Physics winners, plus the obvious individuals who would have won a Nobel if it had been around in their day.

I came up with:

  • Isaac Newton (NE)
  • Michael Faraday (born in London, but his family had just moved from NW)
  • James Clerk Maxwell (Scot)
  • 1904 Lord Rayleigh (SE)
  • 1906 J J Thomson (NW)
  • 1915 WH and WL Bragg (NW)
  • 1927 Charles Wilson (Scot)
  • 1928 Owen Richardson (NW)
  • 1933 Paul Dirac (SW)
  • 1934 James Chadwick (NW)
  • 1937 George Thomson (East Anglia)
  • 1947 Edward Appleton (NE)
  • 1948 Patrick Blackett (London)
  • 1950 Cecil Powell (SE)
  • 1952 John Cockcroft (NW)
  • 1973 Brian Josephson (Wales)
  • 1974 Martin Ryle (SE)
  • 1974 Anthony Hewish (SW)
  • 1977 Nevill Mott (NE)
  • 2003 Anthony Leggett (London)
  • 2013 Peter Higgs (NE)
So, London manages 2, and the SE manages 3. That's not a bad score, but still seems a little meagre compared with 7 from the North West.

Of course the numbers are small, and it's hard to read a lot into such statistics (though it's worth a pause for thought that we didn't get a single Nobel Laureate in Physics between 1977 and 2003). Even so, it would be interesting to compare the ratio of, say, cabinet ministers from London and the South East to those from other parts of the country since 1901 (the year of the first Physics Nobel).

My suspicion is that such a comparison might suggest that where physics greats are chosen on merit, cabinet ministers are chosen for a different reason entirely.

P.S. I couldn't be bothered to go through cabinet ministers, but I did prime ministers and it's quite interesting that a) Scotland is over-represented, b) NW is under-represented and c) the domination of London and the SE is relatively recent:

  • Arthur Balfour (Scot)
  • Henry Campbell-Bannerman (Scot)
  • Herbert Asquith (NE)
  • David Lloyd George (Wales)
  • Andrew Bonar Law (Colonies)
  • Stanley Baldwin (Midlands)
  • Ramsay MacDonald (Scot)
  • Neville Chamberlain (Midlands)
  • Winston Churchill (SE*)
  • Clement Attlee (SE)
  • Anthony Eden (NE)
  • Harold Macmillan (London)
  • Alec Douglas-Home (London)
  • Harold Wilson (NE)
  • Edward Heath (SE)
  • James Callaghan (SE-ish**)
  • Margaret Thatcher (NE)
  • John Major (SE)
  • Tony Blair (Scot)
  • Gordon Brown (Scot)
  • David Cameron (London)

  • The two starred items are because we don't have a South Midlands:
    * Oxfordshire is spiritually SE
    ** Portsmouth is not spiritually SE, but Hampshire is

Monday, 2 February 2015

I have been studied (sort of)!

I was fascinated to discover that my old book Armageddon Science has become the subject of a masters thesis. To be more precise, the experience of translating two chapters of it into Chinese has been documented. All I know about the exercise is that it is the work of one M X Xi and was finished by May 2013. I haven't seen the actual thesis, but here is the abstract for your delectation:
This paper is a report based on the author’s experience of translating two chapters of Brian Clegg’s popular science book Armageddon Science, under the guidance of her supervisor. The report consists of six parts. The first part gives a brief introduction to the task. The second part describes the translation process and translation requirements. Translation process generally includes three stages: preparation, translation and proofreading. And the translation requirements fall into two parts: format requirements and quality requirements. The third part focuses on the source text analysis, in which features of popular science are discussed and illustrated in details from three aspects, lexis, syntax and style. The fourth part is theoretical resources. A brief introduction is given to Nida’s theory of translation process, that is, analyzing, transferring, and restructuring. Then in the fifth part the focus is moved to the translation strategies under the guidance of Nida’s theory of translation process at the lexical, syntactic and textual levels: at the lexical level, the report discusses the translation of polysemy and cultural-loaded words; at the syntactic level, it talks about the translation strategies of passive voice, long and complex sentences, post-positioning of attributives and nominalization; at the textual level, it analyzes the logicality of the translation. In the last part the author summarizes the whole translation process and brings this report to an end. By describing the translation process and analyzing the source text, the author hopes that the report as well as the translation of Armageddon Science can give some enlightenment to those who translate such kind of texts.
So there you are. It's not everyone who gets Nida's theory applied to their work. If M X Xi ever sees this, please drop me an email at brian@brianclegg.net - I would love to hear a little more about why this book was chosen and how the experience went.