Tuesday, 30 September 2014

Who is in the running for the Nobels?

For the outside world, exactly who wins Nobel Prizes in the sciences is fairly academic (geddit?) - and even for those with a professional interest it may sometimes seem that the reasons for the awards are sliced pretty thin these days. The early prizes do often seem to have been for 'bigger' work than the more subtle modern ones. But having said that, we always get some goodies too.

I didn't realize it until they sent me a press release, but Thomson Reuters do an annual prediction of the likely runners and riders - useful in case you fancy a flutter. So here are this year's favourites according to TR. On the physics side, I rather fancy the quantum spin Hall effect, but that's just me...

P.S. I don't know why Economics is treated as a science either.

James E. Darnell, Jr.
Vincent Astor Professor Emeritus, Laboratory of Molecular Cell Biology, Rockefeller University
New York, NY USA


Robert G. Roeder
Arnold and Mabel Beckman Professor, Laboratory of Biochemistry and Molecular Biology, Rockefeller University
New York, NY USA


Robert Tjian
Professor of Biochemistry, Biophysics, and Structural Biology, Department of Molecular and Cell Biology, University of California Berkeley, and President, Howard Hughes Medical Institute
Berkeley, CA, and Chevy Chase, MD USA

For fundamental discoveries concerning eukaryotic transcription and gene regulation
David Julius
Morris Herzstein Chair in Molecular Biology and Medicine,
Professor and Chair of Physiology, University of California San Francisco
San Francisco, CA USA

For elucidating molecular mechanisms of pain sensation
Charles Lee
Professor and Scientific Director of the Jackson Laboratory for Genomic Medicine
Farmington, CT USA
Stephen W. Scherer
Senior Scientist and Director, The Centre for Applied Genomics, The Hospital for Sick Children, Professor and Director, McLaughlin Centre, University of Toronto
Michael H. Wigler
Professor and Head, Mammalian Cell Genetics Section, Cold Spring Harbor Laboratory
Cold Spring Harbor, NY USA
For their discovery of large-scale copy number variation and its association with specific diseases

Charles L. Kane
Class of 1965 Endowed Term Chair Professor of Physics, University of Pennsylvania
Philadelphia, PA USA
Laurens W. Molenkamp
Professor of Physics and Chair of Experimental Physics, University of Würzburg
Würzburg, GERMANY
Shoucheng Zhang
J.G. Jackson and C.J. Wood Professor of Physics, Stanford University
Stanford, CA USA

For theoretical and experimental research on the quantum spin Hall effect and topological insulators
James F. Scott
Director of Research, Department of Physics, University of Cambridge
Cambridge, UK

Ramamoorthy Ramesh
Professor, Physics and MSE, and Associate Lab Director for Energy Technologies, University of California Berkeley
Berkeley, CA USA


Yoshinori Tokura*
Director, RIKEN Center for Emergent Matter Science, and
Professor, Department of Applied Physics, The University of Tokyo
Saitama and Tokyo, JAPAN

For their pioneering research on ferroelectric memory devices (Scott) and new multiferroic materials (Ramesh and Tokura). *Tokura was previously named a Citation Laureate in 2002.
Peidong Yang
S. K. and Angela Chan Distinguished Chair in Energy, Department of Chemistry, Materials Science and Engineering, University of California Berkeley, Kavli Energy Nanoscience Institute, and Materials Science Division, Lawrence Berkeley National Laboratory
Berkeley, CA USA

For his contributions to nanowire photonics, including the creation of the first nanowire nanolaser

Charles T. Kresge
Chief Technology Officer, Saudi Aramco, Dhahran

Ryong Ryoo
Director, Center for Nanomaterials and Chemical Reactions, Institute for Basic Science and Distinguished Professor, Department of Chemistry, Korea Advanced Institute of Science and Technology (KAIST)

Galen D. Stucky
E. Khashoggi Industries, LLC Professor in Letters and Science, University of California Santa Barbara
Santa Barbara, CA USA

For design of functional mesoporous materials
Graeme Moad
Chief Research Scientist, CSIRO
Clayton, Victoria, AUSTRALIA

Ezio Rizzardo
CSIRO Fellow, CSIRO
Clayton, Victoria, AUSTRALIA


San H. Thang
Chief Research Scientist, CSIRO
Clayton, Victoria, AUSTRALIA

For development of the reversible addition-fragmentation chain transfer (RAFT) polymerization process
Ching W. Tang
Professor of Chemical Engineering and Bank of East Asia Professor, Institute for Advanced Study, University of Rochester, and Chair Professor in the Departments of Electrical and Computer Engineering, Chemistry, and Physics, Hong Kong University of Science and Technology
Rochester, NY USA and Hong Kong, CHINA
Steven Van Slyke
Chief Technology Officer, Kateeva
Menlo Park, CA USA
For their invention of the organic light emitting diode

Philippe M. Aghion
Robert C. Waggoner Professor of Economics, Harvard University
Cambridge, MA USA

Peter W. Howitt
Lyn Crost Professor Emeritus of Social Sciences and Professor Emeritus of Economics, Brown University
Providence, RI USA

For contributions to Schumpeterian growth theory
William J. Baumol
Professor of Economics and Harold Price Professor of Entrepreneurship, New York University
New York, NY USA

Israel M. Kirzner
Emeritus Professor of Economics, New York University
New York, NY USA

For their advancement of the study of entrepreneurism
Mark S. Granovetter
Joan Butler Ford Professor and Chair of Sociology, and Joan Butler Ford Professor in the School of Humanities and Sciences, Stanford University
Stanford, CA USA

For his pioneering research in economic sociology

 "Nobel Prize". Via Wikipedia

Monday, 29 September 2014

Google walks

Start of the journey - BBC Wiltshire reception
Yesterday I made my regular appearance on BBC Wiltshire, but I was without a car, so experienced the joys of a bus in, and decided to walk back, a distance of just over four miles. What made it different, and really rather fun, was that I did it with a walking sat nav.

It's not the first time I've used GPS on a phone for guidance while walking - in fact I've done it when finding my way across cities on foot for years - but what I've always done before is keep my phone in my hand, glancing at the map to see where I should go. This time, I plugged in a pair of earbuds, stuck the phone in my pocket and let the software do the talking. And it worked brilliantly.

Ms Google starts me off
One essential before getting started on this was to use Google Maps. More often than not I use Apple's mapping app - after its initial teething problems it works fine for most uses, including my strolls around cities. But for the kind of journey I was about to do it has a fatal flaw. It doesn't know about footpaths (certainly not footpaths in Swindon). Google does - and it makes a real difference to the walk.

So off I set with the slightly whiny, but assertive American woman telling me what to do in my ears. It was strangely intimate. When a car sat nav tells you what to do, it is clearly coming from that piece of kit on the dashboard, but when a voice in your ears tells you to turn left onto Regent Street, there really is quite a strong urge to respond and make it a conversation.

As always when I take these mid-range walks across Swindon it's a delight that's rather similar to the experience of travelling on a narrowboat on one of the UK's canals. There's that same mix of passing close by everything from industrial architecture to open fields at a pace where you can really look around and observe things, seeing the world from an angle you don't usually get to experience.

Ms Google did the job perfectly, though I did find it a little unnerving to follow voice commands alone. Three times I gave in and got the phone out to check the map (especially when she appeared to be directing me to cross the road and walk up a set of steps, as indeed she was), but each time what I thought she meant was correct. She even got a little cheeky.

Occasionally she would make apparently reassuring comments that were clearly intended to wind me up, as they would only be of use to a scout. I would be powering up a steep sloping bend and she would suddenly say 'Head north west.' Now, bearing in mind the phone was in my pocket, unless I had a compass in hand - or one of those pairs of shoes (Wayfinders) with a compass in the heel that I always wanted as a kid, but wasn't allowed because my mum (rightly) said the compass would be rubbish - this information was totally without value. Still, it broke the awkward silences.

All in all, a real success. I think Ms Google will be my companion on many more trips to come...

Friday, 26 September 2014

What's the best science-related quote?

I enjoy wheeling out the odd science-related quote, and would be interested to collect more. To be really great, I think a quote like this needs to be pithy, funny... and make you think. Do you have a favourite? (Please ensure they are from a reliable source.)

It's hard to go wrong with Rutherford's famous:
All science is either physics or stamp collecting.
Which is particularly useful for winding up biologists. I am also rather fond of Konrad Lorenz's advice, which rather a lot of scientists would do well to consider:
It is a good morning exercise for a research scientist to discard a pet hypothesis every day before breakfast.
But the one I'll leave you with this morning, which I shall dedicate to cosmologists and string theorists, is:
There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.

Thursday, 25 September 2014

Science facts and black holes

Chandra image of the black hole (or not)
at the centre of spiral galaxy M81
As any regular readers will know, I have a habit of banging on about the nature of science - that it isn't about establishing the 'truth' about reality, but rather about developing models that produce results as close as possible to what is observed, and that these models are inevitably provisional and could always be thrown out as new data becomes available.

This is not saying 'anything goes' or 'all theories have equal value.' We will typically have a best theory of the moment, and the only sensible thing is to use it until something is established to have better credibility. But it does mean we shouldn't treat our models as certainties.

Sometimes when the model suffers a defeat it is patched up - as in the introduction of inflation to the big bang model. This isn't always a good thing as it can lead to epicycles - effectively taking a bad model and making it more and more complex and obscure to match observation. Other times the old model is genuinely thrown away.

Different areas of science have to be more or less loose with the models they accept. Cosmology, for instance, suffers hugely from the fact you can't do experiments in the lab and there is no opportunity for repetition. Inevitably, then, cosmological models are particularly at risk of revision or rejection as new data emerges. This is why I have always been very uncomfortable with saying that the universe began with* the big bang 13.7 billion years ago, something you will generally hear stated as fact by pretty well anyone doing science presentation on TV. (Naming no names.) I understand why they do this - there is a huge temptation to over-simplify under media pressure. I've done it myself. But what they really should say at least once is 'when I say this happened, please take this as having an unsaid proviso "this is our best current theory, but it may well change in the future."'

A recent paper suggests that black holes, one of the keystones of modern cosmology, don't exist. There have been mutterings about black holes in the past, but this is a mathematical proof that they can't form. The paper hasn't been peer reviewed yet, so there's a big proviso to this, but it's entirely possible it's true. If so, in some ways it's a relief. We would still have near black holes, doing all the things currently ascribed to black holes by astrophysicists and cosmologists. We would still have spaghettification. But we wouldn't have all the uncomfortable weirdness and breakdown of theory provided by the event horizon and the singularity.

However, my point here isn't so much the implications of the proof, if true. Rather it's that here again is something that we all knew was speculative, but have spoken about far too often and too long as if we were dealing with fact. It's time scientists and science presenters were rather more, erm, scientific about the way they presented what we know - and don't know.

* Technically just before (this is inserted to keep John Gribbin happy)

Image Credit: X-ray: NASA/CXC/Wisconsin/D.Pooley & CfA/A.Zezas; Optical: NASA/ESA/CfA/A.Zezas; UV: NASA/JPL-Caltech/CfA/J.Huchra et al.; IR: NASA/JPL-Caltech/CfA

Wednesday, 24 September 2014

What to do with a fish kettle

I am always interested in books about autism, in part because, like most people with a scientific background, I share some traits with those on the spectrum. I've previously reviewed, for instance, Simon Baron-Cohen's book The Essential Difference, and most fascinatingly, if rather hard work, Richard Maguire's I Dream in Autism. So it was with real interest that I agreed to take a look at a copy of Michael Barton's A Different Kettle of Fish, in which he, a physics student with high functioning autism, describes what it's like to take a trip into London.

My first impression was that it is a very slim book at just 80 pages of large, well-spaced print, which for £10 seems a little skimpy. Nonetheless I would recommend it to get some insights into a different way of looking at the world. Our biggest difficulty in sharing the world with people on the autistic spectrum is understanding why and how they see and feel things differently. It is partly about the way we so often say things we don't really mean, and partly in the sensory overload they can feel with the need to process everything that most of us ignore or don't notice in the first place.

One thing that did irritate me slightly was the constant mention of strange sounding euphemisms and idioms. Time and again, Barton tells us he doesn't get what is being said, because it sounds weird. But we all think this the first time we hear such an expression, then we assign a meaning to it, just like any other vocabulary. 'Sausage' sounds weird if you don't know what it means. It would really have helped if Barton could have explained why someone with autism can't learn what a euphemism codes for and assign a meaning to it. Unpacking the experience and explaining would have meant so much more than coming up with more and more examples with no context. Later in the book he does admit to learning them most of the time, but notes he has to learn them first, where most people seem to pick them up naturally - I'm not sure this is true. I think we all think 'What???' the first time we hear about a red herring, say. What is more interesting and informative is his failure to understand indirect requests like 'Can you pass the salt?' (Response: 'Yes.')

I was also slightly suspicious that the author was trying to find 'funny' meanings in announcements to the extent that he at least once created one. I can absolutely understand being amused by an announcement saying 'This is an announcement,' or a sign saying 'Dogs must be carried' - plenty of people who aren't on the autistic spectrum laugh at this kind of thing too. You regularly see them on Facebook and the like. But when Barton tells us a tube announcement says 'Please let other people off the train first,' with its implication that no one will make the first move, it smacks of constructed humour - because the actual announcement is 'Let customers off the train first, please.' (You can even hear it by clicking the play button at the top of the post.) The exact wording, by not having that 'other', and the fact that the announcement is on the platform speakers, not the train speakers, makes it much clearer that this is addressed to people who aren't on the train. The announcement makes perfect sense.

Overall, the book is still quite a good way to get the message across to those who aren't familiar with people on the autistic spectrum. The bumf says the book is aimed at everyone from children aged 8 plus to adults, but I think it would be best confined to the 8-12 range (in which case, the pricing should be seriously reduced).

You can find out more about the book, or purchase it, at Amazon.co.uk and Amazon.com.

Tuesday, 23 September 2014

The New Tyson Fight

Neil deGrasse Tyson
One of the interesting aftermaths of the Scottish Referendum debate was that I have seen a number of people saying 'A lesson to learn is don't trust the traditional media, get your information from social media.' I know where they were coming from, but there are two dangers here - one is that (even more than watching, say, Fox News) you won't get information, you'll get propaganda, and the other is that even when you aren't being told what you want to hear by your friends and political allies, a lot of internet sources are unreliable. The Tyson story I want to tell you illustrates this doubly.

The Tyson in question is not Mike, but science populariser and astronomer, Neil deGrasse Tyson. I was surprised the other day to hear that Tyson was being pilloried for making up quotes to support an argument. The argument in question is that a lot of people (including many in the media and our elected representatives) are extremely ignorant about science and so (I presume) aren't well equipped to make decisions about science teaching and science funding. This is an argument I strongly support - but clearly not by making up data.

There seem to be three quotes that have caused the furore. Tyson claims that:
  • George Bush made a speech after 9/11 distinguishing 'we from they' by saying 'Our God is the God who named the stars.' Yet lots of named stars have Arabic names - Bush made a silly argument, claimed Tyson.
  • A congressman uttered the sentence 'I've changed my views 360 degrees on that issue.' Which showed basic ignorance of what 360 degrees is.
  • A newspaper headline in New York City: 'Half the schools in the district are below average.' - Tyson claims 'we have to re-think the foundations of mathematics if this were false.'
There is video evidence of Tyson's use of these quotes, so that much is pretty definitively true. But the furore is over whether any of these allegations are true, or Tyson just made up the quotes to suit his message.

Tyson's detractors claim the following:
  • The Bush quote was not made after 9/11, but after the Columbia disaster, and he actually said 'The same Creator who names the stars also knows the names of the seven souls we mourn today.' This had nothing to do with Christians versus Muslims and was simply consoling rhetoric.
  • There is no evidence of the 360 degree comment being made as quoted, though Representative Maxine Waters did say 'You have done a 360 degree turn' to someone.
  • The information on schools below average in the headline, which no one can find, could well be false, so Tyson misunderstood statistics by claiming that you would have to re-think the foundations of mathematics if it were false.
Others have weighed in claiming that this is an anti-intellectual right wing campaign against Tyson, some even suggesting that it is because he is black.

What really applies? As far as the basic facts go, the detractors mostly have it right. The Bush quote was misapplied and misquoted. The 360 degree 'quote' was not word for word. And the headline can't be found in any New York City newspaper (online, at least) - and in principle it could even be reasonable. This last is the least obvious, but the reason the headline could be reasonable is that it is not true that exactly half a population will be below average, as Tyson seemed to imply. Take for instance a room full of ordinary people and Bill Gates. Look at the average net worth of the people in that room. Chances are everyone except Bill is below the average. The number that is in the middle of the grouping is not the average, it's the median.
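The Bill Gates point is easy to check with a few lines of code (the net worth figures here are invented purely for illustration):

```python
# Ten ordinary people plus one billionaire (net worths invented for illustration)
net_worths = [50_000] * 10 + [80_000_000_000]

mean = sum(net_worths) / len(net_worths)
median = sorted(net_worths)[len(net_worths) // 2]  # middle of 11 values

print(f"mean:   {mean:,.0f}")   # about 7.3 billion
print(f"median: {median:,}")    # 50,000

# Everyone except the billionaire sits below the mean
print(sum(w < mean for w in net_worths))  # 10
```

So 'half the population is below the median' is true by construction, but 'half is below the average (mean)' needn't be.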

So what have we established? Tyson's use of the Bush quote was a real, and unpleasant, error in the way he misused it. The 360 degree quote was mis-worded, probably typed from memory - he should really have checked, but frankly it's close enough. And no one can find the 'below average' quote, but even if it were real, we would need more information to criticize it, as it isn't stupid in its own right. It's still quite likely the newspaper was misreading the information (apart from anything else, the media often call a median an average because they think the readers don't understand 'median' - see my article on this happening over 'average house prices'). So Tyson could have been making a worthwhile point, but it would have needed a lot more unpacking than he actually did to be sure.

To be honest, I don't like Tyson's approach to public speaking - it tends to the pompous and bombastic (perhaps this is just a US/UK style thing), belittling those who don't agree, which I don't think is a great way to make an argument, even if they are wrong. He made a clear mistake on the Bush quote and messed up with the 'below average' business. So he ought to clean his act up, and admit this. But frankly these errors have nothing to do with a serious and important message. So by all means consider Tyson reprimanded - but don't confuse the message and the messenger. What he was saying about the media and the political class being dangerously ignorant of science is still true.

However, those who defend Tyson saying this is a fuss over nothing are also wrong. He is not a gutter press journalist, he's a scientist. And he knows perfectly well that two of the biggest failures for a scientist are to make up data, and to rely on anecdotal evidence. And he has clearly done at least one of these here. It was a serious error of judgement, hence the need to apologise.

The reason I said at the beginning that this is a double error of trusting unverified online sources is that firstly people have been coming out pro and anti Tyson based on reports that take one or the other extreme view on what happened, and secondly because I suspect the reason Tyson got the quotes wrong in the first place was that he too relied on a dubious internet source. We all slip this way occasionally. I certainly have. But it's a good reason for taking a step back from that 'get your information from social media' suggestion.

"Neil deGrasse Tyson August 3, 2014 (cropped)" by Mingle Media TV - https://www.flickr.com/photos/minglemediatv/14849113273/. Licensed under Creative Commons Attribution-Share Alike 2.0 via Wikimedia Commons

Monday, 22 September 2014

What to name your new university town

Once again, the UK has done brilliantly in the worldwide university league tables, with four universities out of the top six. But I was interested in another phenomenon, which I haven't seen reported in the press.

Let's imagine you are an up and coming country, building a new university, and you want to rename the town it is in to give the university instant prestige. What should you call it? Look again at that top six and something fascinating pops out. Look at where the universities are located:
  1. Cambridge (Mass)
  2. Cambridge (UK)
  3. London (2=)
  4. Cambridge (Mass)
  5. Oxford
  6. London (5=)
Spot anything? So I look forward to a lot of new towns called Cambridge springing up around the world.

Friday, 19 September 2014

Sticky fun

I get sent a lot of press releases, most of which go in the electronic bin within 2 seconds. (In some ways I really miss the old days when I used to get paper press releases through the mail, some of which were really creative. Though there was a lot of fuss, as I recall, about one computing company that sent out a release with lots of tiny pieces of paper that flew all over the floor when you opened the envelope. But I digress.) Occasionally, however, I get one worthy of note.

This one was about cover stickers for Apple MacBooks (other laptops exist, but apparently they aren't worthy of stickers). Now, I ought to come out straight away in 'Bah, humbug!' mode, because I think stickers look absolutely terrible on laptops. It's fine if you're six, but if you are 16 or older, it's time you grew out of it. It just looks a mess. I don't mind a tasteful shell, but no stickers, okay?

However, if you insist on tarting up your beautiful and expensive hardware with paper or plastic tat, a 'virtual design studio' called DesignCrowd has run a contest to design 'useful stickers' and apparently Apple has officially endorsed them being used on MacBooks. So here are some of the best 'decals' (as our US cousins like to think of them), according to DesignCrowd and/or their PR agency:

Hmm... a multiplication table. Really useful if you lose the calculator app.

Ah yes, reminders of the Photoshop shortcuts. Mind you, even better if they had been printed back to front so you could read them in a mirror while you work because, guys, you can't see the back of the computer when you use it!

Periodically useful. Geddit? One teensy problem - about half the elements are missing.

Somehow, this appeals to me most. Perhaps it's the simplicity. And the thought of flinging a £1000 laptop around to use it as a ruler.

No, totally loses me, this one. Just weird.

Interesting, certainly... but surely there could be something more imaginative? How about, for instance, a sticker that makes it look like you are seeing through the lid. Or... no, stop me. I don't want any stickers, and that's the end of it.

Thursday, 18 September 2014

Come on England, have some pride!

As the Scots go to the polls, I'd like to direct attention back home for a moment. The fact is, there have been some pretty unedifying scenes down here. We've seen political leaders and celebrities begging Scotland not to leave the union. I really don't understand why these people are so worked up. Perhaps they should put some effort into having pride in being English.

After all, in the grand scheme of things, England has a lot to be proud of - whether it's in cities and countryside, culture and heritage, literary fields, science or whatever. Take universities. It's interesting that the Guardian reported on Tuesday that 'Four British institutions ranked in top six of world's universities.' This is true - but it's also true that four English institutions ranked in top six of world's universities - because those four were Cambridge, Imperial College London, Oxford and University College London.

The fact is that England has around 90% of the population of the UK. For most of the world, England is the UK. Even if the entire union split up, England has the same population now as the entire UK had in 1961. By losing Scotland we're talking less than 10% change in population. Hardly makes us a tiny nation.

Of course there would be losses if Scotland went independent, but there would be plenty of gains too - including the chance for English people (and Welsh and Northern Irish too - I just happen to be English) to feel like they have more of a proper identity. This is not about nationalism. One thing I've learned from the interviews of Scots during the run up to the vote is how many said something to the effect of 'I'm not a nationalist - but I am proud to be a Scot.' For too long we've been scared that the only people who are proud to be English are right wing thugs. But it shouldn't be like that.

It's too late to change what has happened already - but politicians and celebrities, shame on you. Let's let Scotland make the best decision for itself, and start to think about ourselves with as much pride as they do.

Wednesday, 17 September 2014

What's in a cereal?

The other morning I was staring at the back of a cereal packet on the breakfast table, as you do, and read the contents list. Nothing extraordinary, until I started to look at the numbers involved and discovered that Nestlé seems to have something in common with the X-Factor. They believe that it's possible to give 110%.

In fact there are two significant oddities in that ingredients list. One is the matter of nuts. Because it says that the product (Honey Nut Shredded Wheat, if you must know) contains 10.5% nuts when in fact it's only 0.3% - that's quite an error bar. This is because neither peanuts nor coconut are actually nuts. But we'll let them off, because there is probably some sort of convention that allows them to come under this heading. (It can't just be because they have 'nut' in their name, as 'Honey Nut Shredded Wheat' has 'nut' in its name. So if that were the rule, the contents should read '100% nuts'.)

But the more interesting oddity is the maths. You might wonder what the problem is. With 84.1% wheat, 10.5% nuts and 2.8% honey, that still allows 2.6% for the other bits and pieces. But ingredients lists don't work like that. They have to be specified in order of weight - so there is more sugar than there are nuts; the list just doesn't mention how much sugar. With a minimum of 10.6% sugar, that makes a minimum contents of 108%.

We can get some idea of the quantity of sugar from the nutritional information. We are told that 100g of the product contains 15.9g of sugar - but we can't just take this number as the missing figure, as it will also include the sugar in the honey and molasses. So reasonably we can guess that the 'sugar' percentage is in the 10.6-12% range.
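To make the sum concrete, here is the minimum-total arithmetic in a couple of lines (percentages as declared on the packet; the sugar figure is simply the lower bound implied by the weight-ordering rule):

```python
# Percentages declared on the packet
wheat, nuts, honey = 84.1, 10.5, 2.8

# Sugar is listed above nuts, so it must make up at least a little more
sugar_min = 10.6

total_min = wheat + nuts + honey + sugar_min
print(round(total_min, 1))  # 108.0 - and that's before molasses and the rest
```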

So what is going on? Thankfully, Nestlé has been helpful on the subject and told me this:
The basic maths does not add up and unfortunately this situation is replicated across many foods as they try to comply with QUID (Quantitative Ingredient Declaration) legislation. The complication comes from the requirement to list the amount of ingredients as they are added to the formula at each step. It is called the ‘mixing bowl’ rules. 
In a simple process, this works well and the ingredients add up to 100%. In a process with many steps, and where moisture is lost in intermediate drying and toasting stages, the maths becomes more complex and illogical, and 100% is hard to achieve.   Each product must be viewed in isolation, and its manufacturing method affects the final result as well as the ingredients used. 
We have to comply with 'The Food Labelling Regulations 1996' and its amendments. There are two amendments which detail how we should declare the quantities of ingredients used, and the key requirement is in the second of these Amending Regulations, which states: 'Where the food has lost moisture as a result of treatment, the indication of quantity of the ingredient or category of ingredients used shall be expressed as a percentage which shall be determined by reference to the finished product'.
So there you have it. The percentages can't really be taken as sensible detailed information, just a broad brush guide. This doesn't, of course, explain why peanuts and coconuts are nuts (no doubt another regulation), or why there is no percentage against sugar - but it does help us understand what is going on to allow Nestlé (and other food manufacturers) to give 110%.
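A toy version of the 'mixing bowl' calculation shows how the totals can exceed 100% - the ingredient weights below are invented for illustration, not Nestlé's actual recipe:

```python
# Hypothetical weights (grams) added to the mixing bowl
added = {"wheat": 95.0, "nuts": 10.5, "sugar": 11.0, "honey": 2.8}
moisture_lost = 19.3  # grams of water driven off in drying and toasting

finished_weight = sum(added.values()) - moisture_lost  # 100.0 g of cereal

# QUID rule: declare each ingredient as a percentage of the *finished* product
declared = {name: 100 * g / finished_weight for name, g in added.items()}
print(round(sum(declared.values()), 1))  # 119.3 - over 100%, with no error anywhere
```

Each percentage is honest on its own terms; it's only the sum that looks absurd, because the water that left the product isn't on the list.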

Tuesday, 16 September 2014

Central heating and the change in watching position for Dr Who

In a Facebook discussion of the most recent episode of Dr Who (yes, that's the kind of exciting social life I have), Matt Brown expressed (mock?) surprise that people didn't push the sofas against walls in the old days - and suddenly one of the greatest mysteries of the universe clicked into place. It's all about hiding behind the sofa. (If you aren't from the UK, you may need assistance from the Wikipedia page on the subject.)

When I was little, I did, genuinely, watch Dr Who from behind the couch (we weren't posh enough to call it a sofa), so that it was possible to hide when it got really scary. And I was not alone. Most of the young nation used to do this. Yet it is a practice that has pretty much entirely died out. Why?

I had assumed it was because the yoof of today is far more cynical and exposed to horrors that make Dr Who look wimpish in the extreme. But there was no doubt that this Saturday's episode, Listen, was suitable behind-the-sofa material, especially the bit with the bedspread right behind them (you have to have been there). If you haven't seen the episode and have access to BBC iPlayer, I recommend it. And then Mr Brown made that simple remark.

Because the fact is, these days, many people do push their sofas against the walls, while back then they tended not to. There could be various reasons for this - fewer squarish living rooms now, and we have much bigger TVs, for instance. But my suspicion is that it could be central heating related. Like much of the UK, we didn't have central heating when Dr Who first aired. In our case not until 1966. Before then, on a winter evening, you didn't want your sofa miles away from the fire. So the seats tended to be more advanced into the room than they now would be.

Of course, this could be rubbish. But it's a theory. And even better, it's a Dr Who related nostalgic theory. What more could you ask?

Monday, 15 September 2014

The Room - review

Sorry, games again! But this is the last of the series.

After my recent dip into the nostalgia of game playing while reading the book on the makers of Doom, I just had to have a go at a game. There was a temptation to revisit the past and fire up a copy of the Seventh Guest or Doom itself (both available on Mac, though sadly my old favourite, the X-Wing series, isn't, so I would have to make do with Wing Commander III). And I may still do so, though as I pointed out in the piece on Netflix and games, I'm not sure I could make time for serious playing any more.

However, while perusing 'best of' lists to see what's recommended on the Mac at the moment, I noticed some 'best on iPad' games and was tempted to spend the enormous sum of 69p on a game called The Room - and I am so glad I did.

If you ever played something like Seventh Guest, this is a bit like the puzzles without all the wandering around. The Room limits you to a single table - but on that table is the most gorgeous, complex puzzle box you ever saw. And if you complete it and open the box - another, even more wonderful box emerges. One, for instance, turns into a gorgeous planetarium and orrery.

It's a bit murky, but this is a part of the level 2 (or is it 3?) puzzle box. The device on the front is a complex clock that you need to get going. Every flap, button, knob and locked door will eventually contribute something. 
For me, this is the ideal game for the Netflix generation. You can do it a bit at a time (although it is extremely more-ish, and the temptation is to just do one more clue). And there's no frustrating dying and going back to the start. You can do whatever you like in whatever order it presents itself and it will either not work or take you on a step.

It's hard to describe the puzzles without giving too much away, but they range from simple physical discoveries along the lines of 'if I turn that bit it will open a door in which I will find something', through the need to build a gear chain to get some machinery running, to spotting an inscription on the back of a photograph that tells you in an obscure fashion how to position something you will discover later (and only be able to see through a special viewing glass). It is brilliant! And did I mention it was cheap? Even better, it's a couple of years old, so The Room 2 is waiting for when the first is completed.

There is a hint system, but most of the time you can make progress without it. I'm so glad I read that book...

Friday, 12 September 2014

The Toffler scorecard part 2 - weathering heavy seas

A little while ago I took a step into Alvin Toffler's bestselling 1970 book Future Shock to see how its vision of the future has held up. Here's the second instalment.

Perhaps the biggest danger was always where science is involved, and in a chapter titled 'the scientific trajectory' we start off with a pair of unlikely projections.

The first concerns the oceans. As has often been observed, there are huge opportunities in the sea, particularly as we use up more and more land-based resources - and there is far more space there than on the land - so it was common back then to assume that we would see far more sea-based industry, and even underwater cities. Toffler falls for it hook, line and sinker.

Toffler quotes Dr F. N. Spiess, head of the Marine Physical Laboratory of the Scripps Institute as saying 'Within fifty years man will move onto and into the sea - occupying it and exploiting it as an integral part of his use of the planet for recreation, minerals, food, waste disposal, military and transportation operations, and, as populations grow, for actual living space.'

That 50 years is close - but very few of these predictions are. Yes, we make more use of underwater resources like oil and gas. But living on and in the sea is generally a very expensive and restrictive way of going about things, and there is no sign of it becoming commonplace. Toffler expected 'aqua-culture' to be as frequently used a term as agriculture by now. Maybe not.

I'm not quite sure why, but Toffler links his second dubious prediction to the first when he says 'The conquest of the oceans links up directly with the advance towards accurate weather prediction and, ultimately, climate control.' He quotes Dr Walter Orr Roberts, past president of the American Association for the Advancement of Science as saying 'We foresee bringing the entire globe under continuous weather observation by the mid-1970s - and at reasonable cost. And we envision, from this, vastly improved forecasting of storms, freezes, droughts, smog episodes - with attendant opportunities to avert disaster.' What they didn't realize was that the seeds of the failure of this prediction were already sown.

While it's true that weather forecasting has got a lot better since 1970, so has the understanding that we are never going to be able to predict weather more than a few days into the future. Through the 1970s and 80s an increased understanding of the nature of chaotic systems made it obvious that it doesn't matter how good Dr Roberts' worldwide weather observation is: the weather system is just too complex, too susceptible to small changes in initial conditions producing huge changes down the line. I suppose I shouldn't be too hard on Toffler, as we still regularly see forecasts outside the 10-day window presented as 'fact', where a guess based on typical weather for the time of year is more accurate than the forecast. But the confidence in the predictions on weather forecasting and climate control vastly misunderstood both the nature and scale of the problem.
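That sensitivity to initial conditions is easy to demonstrate with the logistic map, the standard toy model of chaos - nothing to do with real weather models, of course, but it shows the principle that sank the prediction:

```python
# Two trajectories of the logistic map (in its chaotic regime, r = 4)
# starting a mere 0.0000001 apart diverge completely within a few dozen steps.
def logistic(x, r=4.0):
    return r * x * (1 - x)

a, b = 0.2, 0.2 + 1e-7    # nearly identical starting conditions
max_gap = 0.0
for step in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

# The tiny initial difference has grown to order 1: the two 'forecasts'
# now have nothing whatever to do with each other.
print(max_gap)
```

Halving the initial difference only buys you one more step of agreement, roughly - which is why no amount of better observation makes long-range forecasting work.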

Sorry Alvin - this one's a 100% fail.

Thursday, 11 September 2014

Netflix killed the video (game) star

Thanks to reading Masters of Doom, I've been in a contemplative, and probably rather nostalgic mood about games over the last few days. I've stocked up on a couple of games as a direct result, but my suspicion is that I won't be playing them much. Certainly not as much as I once would have done. Why? There's a simple, one word answer. Netflix.

Here's the thing. There are broadly two types of gamer. The teen gamer who builds his/her life around game playing and the adult gamer who plays games when they've nothing better to do. I've primarily been the latter. Apart from anything else, computer games didn't exist when I was a teen. The first time I ever played one was running Adventure on the George III ICL system at Lancaster, but by then I was already 21.

Although at my gaming peak I could spend a good few hours at a time playing (X-Wing and its offspring were particularly time-eating), as an adult, life has always had other attractions and games tended to be a way to fill in time when I had an evening to myself - a 'boy's night in', as it were. This was, in part, because the chances of there being anything captivating on the TV that night were pretty small. But these days, if I've an evening to myself, I can just delve into Netflix and consume great dollops of the binge-watch du jour. (For me, this happens to be Battlestar Galactica at the moment.)

Of course all those teens (literal teens or twenty-something plusses who are still channelling their inner teen) will still be obsessively playing. There is still a massive market for the big games, especially among those who appreciate the online multiplayer benefits. But for the less obsessive gamer, I really think that the ready availability of quality binge watches makes for strong competition. My suspicion is that it will make for more use of 'dip in, dip out' games like the excellent iPad game The Room (of which a review follows soon). But we shall see.

Wednesday, 10 September 2014

Boldly going

It's a nice coincidence that I recently wrote about Battlestar Galactica, because the whole business of being out there in space is the topic of my latest book, which I'm pleased to say is now available. In Final Frontier we discover the massive challenges that face explorers, both human and robotic, uncover the current and future technologies that could take us out into the galaxy, and take a voyage of discovery where no one has gone before... but one day someone will. In 2003, General Wesley Clark set the US nation a challenge to produce the technology that would enable new pioneers to explore the galaxy.

That challenge is tough - the greatest humanity has ever faced. But taking on the final frontier does not have to be a fantasy. In a time of recession, escapism is always popular - and what greater escape from the everyday can there be than the chance of leaving Earth's bounds and exploring the universe? With a rich popular culture heritage in science fiction movies, books and TV shows, this is a subject that I just couldn't resist and, like geeks everywhere, find fascinating.

One of the joys of writing a book like this is that you find out a lot more about a topic that has always intrigued you. It's not that I've always wanted to be an astronaut - I'm far too fond of home comforts and minimising personal risk for that - but as a real-life story you can get your heart behind, it's hard to resist. I'm old enough to have been allowed to stay up all night by my parents to watch the Apollo 11 moon landing - and it's one of the most powerful memories of my childhood. And at the same time, I've boldly gone in fiction with Dr Who, Star Trek, Star Wars and so many more, particularly in book form.

So the emotional connection was there already. But two things have really stood out for me in pulling together Final Frontier. One is the need to go beyond the traditional nationalism at the heart of early space exploration. Future manned exploration of space would benefit hugely from being an international venture, and, as recent developments have demonstrated, a mix of private and public funding.

The second is to detach space travel from science. I have always heartily agreed with those who say that having manned space vessels is a terrible way to do science. It is vastly more expensive than using unmanned probes and unnecessarily puts human life at risk. It would help enormously if we totally separated the two reasons for venturing into space. Science needs great unmanned probes. But humanity needs people out there. I'd suggest that, rather than fighting over a relatively small science budget, manned space travel should be lumped in with the defence budget. It would transfer cash from the dark side to the positive side of the human spirit, and arguably has the same goal of expanding the cause of human survival, though in a much less nationalistic fashion.

We shouldn't send people out into space to do science (although they are welcome to do some while there). Instead, such an adventure (in the literal sense) should be to fulfil the human spirit that makes us more than just animals that live to breed and die. And that's kind of important.

You can find out more about Final Frontier at its web page, or buy a copy at Amazon.co.uk and Amazon.com.

Here's what the inestimable John Gribbin said about it:
An enjoyable romp across space and time, from Cyrano de Bergerac to future space-warp driven interstellar craft, via Verne, Wells and the possibility of colonising the solar system. 

Tuesday, 9 September 2014

On the road to Doom - review

I was delighted when someone pointed out the book Masters of Doom. It's not a new title, dating back to 2003, but it covers a period that anyone of a certain age with an interest in computer games will regard with interest.

Describing the rise and fall of the two creators of id Software, John Carmack and John Romero, it is a classic Silicon Valley business/bio - with some particularly extreme characters. I knew nothing of these people at the time, but reading the book brought on waves of nostalgia, as they were responsible for three of the key milestones in gaming history. I was still programming PCs when Wolfenstein 3D came out and I remember being amazed by the effects and responsiveness they coaxed out of the early PC's terrible graphics. By the time Doom and Quake came along, I was reviewing games for a living. Though my personal tastes ran more to the X-Wing series and Seventh Guest, I was stunned by the capabilities of the id games. They were the only first person shooters I ever found interesting - and each moved the field on immensely. All the first person shooters that are popular today, from Call of Duty and Halo to Destiny, owe them so much.

So from a techie viewpoint, this was fascinating, though the author does tend to rather brush over the technical side to keep the story flowing. And from the personal side, there were plenty of fireworks too. While the book slightly overplays the traditional US business biography style of presenting disasters and triumphs to regularly fit chapter boundaries, there is no doubt this was a real roller-coaster of an existence, in a way that all those reality TV stars who overuse the term couldn't possibly understand.

Although there are plenty of other characters, the two Johns are at the book's heart - Carmack the technology wizard behind the engines that powered these worlds, and Romero the designer and flamboyant gamer. The pair inevitably clash on direction and when they split it's interesting that it's the John who doesn't go for the classic US software developer heaven of turning the offices into a playground who succeeds.

All in all, truly wonderful for anyone who was into games in that period (and should be of interest to those who have followed them since). It's a shame it stops in 2003, as things have moved on a lot since its 'how the main characters are now' epilogue - but a quick visit to Wikipedia can bring you up to speed.

You can buy Masters of Doom at Amazon.co.uk and Amazon.com.

Monday, 8 September 2014

Is £10 an hour a sensible target for the minimum wage?

I was interested to read that the Green Party of England and Wales is proposing that we should immediately raise the minimum wage from the current £6.50 to a living wage (currently £7.65 an hour outside London) and that by 2020 they say that the minimum wage should be £10 an hour.

I am generally in favour of allowing markets to set prices, and at first glance, if someone is prepared to do a job for a certain amount, then it might seem unreasonable to pay them more. But there are good reasons to have a minimum wage at what is, frankly, the very reasonable level suggested as a living wage.

Apart from anything else, if someone is paid less than a living wage, then they end up being supported by the benefit system - so that just means more taxes for the rest of us. If someone is doing a job then they ought to be able to live on the proceeds of a reasonable working week. Anything less is close to concealed slavery.  Let's have that living wage now, please, government - and why doesn't it also apply to 18-20 year olds who get a pathetic £5.13 minimum wage at the moment?

However, despite my whole-hearted support for the living wage, I can't support the Green Party policy of a £10 target, as it is entirely arbitrary. There are two suspicious things about it. One is the round number nature of £10. This shouts out that it is a number picked out of the air that sounds impressive because it has two digits. The other is having a target for 2020. Unless the Green Party has a time machine they haven't told us about, that's just too far ahead to make accurate forecasts for. We don't know what inflation will be. We don't know what the economy will be like - and to make a commitment to a specific number seems crazy.

What would be much better, but less attention grabbing than that £10 number, would be to have a target of maintaining the minimum wage at a living wage level, year on year. That would be far more practical and meaningful. And it could mean a minimum wage of more than £10 in 2020 - we can't know, of course, we just know it would be the right amount, where £10 certainly won't be. So how about it Green Party? Can you move away from PR-based politics (the driving force, sadly, of most green activity) and do something that really would be a good thing? We shall see.

"Green Party of England and Wales logo", from the http://www.greenparty.org.uk/ website. Licensed under fair use in the context of Green Party of England and Wales, via Wikipedia

Friday, 5 September 2014

The Toffler Scorecard Part 1 - Disposability

My rather battered version of Future Shock
Way back in 1970, when the world was very different, 'futurologist' (I hate that word) Alvin Toffler produced an immensely popular book called Future Shock that predicted what he believed life would be like in the twenty-first century. In a series of posts I'm looking back at some of Toffler's predictions to see how they've turned out and what that can tell us about then and now.

Reflecting the change, particularly in America, that had brought in more and more of a throw-away society, Toffler envisaged a future where this approach was taken to the extreme. Apparently, in 1970 paper dresses were all the rage (I can't say I remember this), and wear-once-then-throw-away clothes were something Toffler assumed would become the norm. I don't know if he lived in Florida or California, but realistically paper clothes were always a non-starter as anything more than a gimmick - certainly in Manchester or Scotland, say. But it is certainly true that the current young generation thinks of clothes as more short-term purchases than a generation that bought clothes and kept them until they wore out. (My raincoat is over 30 years old and still going strong.)

However, what Toffler missed is the way that an awareness of green issues would become a natural background to life. While the younger generation don't hang onto clothes the way some older folk do, they also don't just throw them away. Instead they resort to recycling, whether via charity shops or services like eBay and Depop. And the same goes for much of our everyday things. Yes, we do change some products a lot more than we used to, but equally we tend to recycle them, ideally for money. It would have seemed crazy in 1970 to change your phone every two years, say (it would, of course, have been a landline phone), but when we do make the change, we trade in the old one, or sell it.

On balance, then, this is a 50:50 prediction. Neither a hit nor a miss. We certainly do treat far more things as temporary than we used to. With technology, particularly, we feel driven to upgrade. I do have one bit of ICT kit that is over 10 years old (an HP LaserJet printer that simply does the job), but the average age of my ICT is probably about 2 years. Strangely, though, despite this, we are in a society less inclined to throw-away than Toffler's. We reuse, repurpose, recycle. Where he described a tendency to increasingly knock down old buildings, we (at least in the UK) now tend to treasure them and reuse them more than was the case in the 70s. It's ethical disposability. And that's rather interesting.

If you want to discover Toffler's predictions for yourself, you can buy Future Shock at Amazon.co.uk and Amazon.com.

Thursday, 4 September 2014

Scrubs up well

Your great grandma might not have known about phenol - but she certainly would be familiar with carbolic, the harsh soap that included carbolic acid, now properly known as phenol. This simple aromatic compound might have dropped out of our morning cleansing routine (thankfully) but it has more recent roles from the production of aspirin to Agent Orange.

Discover more in my latest Royal Society of Chemistry podcast about phenol. Take a listen by clicking play on the bar at the top of the page - or if that doesn't work for you, pop over to its page on the RSC site.

Wednesday, 3 September 2014


An eyelash mite
I had the pleasure of appearing on Radio Scotland yesterday. No, not to discuss the Independence vote, but the matter of eyelash mites.

When I wrote The Universe Inside You, which uses the human body as a starting point for exploring all kinds of science from the nature of light to evolution, I just had to include (with a title like that) the veritable zoo of creatures that call our bodies home. Of course I explored the bacteria, which, with ten times as many bacterial cells in the body as human ones, are pretty impressive. But I also included Demodex, the eyelash mite.

These tiny little arachnids - typically 1/4 to 1/3 of a millimetre in length - feed on sloughed skin and sebaceous oil, in effect clean-up scavengers. They are transparent and pretty well impossible to see, mostly living at the base of eyelashes and eyebrow hair. What I said in UiY is that it was thought that around half of adults have them, but the reason they had become news, featured in national newspapers and on Radio Scotland, was that a study had shown that all adults had them. (Or at least, that's how it was interpreted. More on this in a moment.)

There was some interesting psychology as to why this change made them news. I suspect it is because it went from feeling like something like head lice that other people have (until there's an outbreak at your children's school) to something you have.

In fact the study is both more interesting and more limited than the reporting suggested. The PLOS One paper does not actually say that mites were discovered on 100% of adults - in fact they were only spotted on 14% of adults, as they are hard to find. But what the researchers did was to take a sample of sebum and search for Demodex DNA. They found it on 100% of adults over 18 and 70% of eighteen-year-olds. Admittedly this isn't a perfect determinant, but as the paper puts it 'Though it is possible Demodex 16S rDNA could be found on the face of an individual without mites, the likelihood that we detect such transferred DNA in our limited sampling area would be low.'

So an interesting development. One of the conclusions was 'The diversity of D. brevis 18S rDNA found on individual humans suggests that not only do all adult humans have Demodex mites but that colonization is likely to occur more than once.' This is the interpretation that I'm a little worried about. The study is based on DNA testing of 19 adults, all from Raleigh NC. I'm not convinced that this provides sufficient data to make the sweeping statement that all adult humans have Demodex mites - which then led to the news flurry. It may well be true, but this seems a very small sample to build that conclusion on - though it's clear that the mites are significantly more prevalent than previously expected.
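For a feel of how little 19-for-19 actually pins down, a quick binomial confidence bound helps. This is a rough sketch of my own, not a calculation from the paper:

```python
# If all n sampled adults test positive, what is the lowest true prevalence
# still consistent with the data at 95% confidence?
# Exact bound: solve p**n = 0.05 - below this p, seeing n positives
# out of n would happen less than 5% of the time.
n = 19
exact_lower = 0.05 ** (1 / n)   # exact binomial 95% lower bound
rule_of_three = 1 - 3 / n       # the usual 'rule of three' approximation

print(round(exact_lower, 3))    # 0.854
print(round(rule_of_three, 3))  # 0.842
```

So 19 positives out of 19 is statistically consistent with a true prevalence as low as about 85% - high, certainly, but some way short of proving 'all adult humans'.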

A bit of fun, though. Got itchy eyebrows? I thought so.

Image "Haarbalgmilbe". Licensed under Creative Commons Attribution-Share Alike 3.0 via Wikimedia Commons

Tuesday, 2 September 2014

A question of waves

A tide, earlier
Every now and then someone sends me an email with a question in it, and I try to answer.*

Sometimes these questions are rather silly - and that's fine. That's how we learn, by asking silly questions. (I do it myself all the time with real scientists.) Sometimes the questions are pretty straightforward, or mind-bogglingly wacky. But just occasionally you get one that's really interesting - and I had such a one the other day about the tide.

As we all know, the tides are primarily influenced by the Moon, though there is also some input from the Sun. But my questioner wondered why this would be the case, as the Sun has a much bigger gravitational pull on the water than the Moon does.

Two questions, then. Was he right, and if so, why does the Sun's influence appear so understated? This is one of those areas where a few back-of-an-envelope calculations can give you a useful feel for what's going on. Thanks to Mr Newton (we don't need general relativity for this, thankfully), we know that gravitational force is proportional to the mass of the body producing it divided by the square of the distance. Plug in the numbers for the Sun and the Moon and you'll find that the Sun does indeed out-pull the Moon, by a factor of nearly 180. (This shouldn't be surprising - the Earth's orbit would be distinctly scary if the Moon beat the Sun.)

However, tides are not about absolute gravitational pull. Working out the actual details of tides is messy indeed, but we can get a feel by thinking that the key factor in producing them is the difference between the pull the water feels on the surface of the Earth and the pull it would feel if it were at the centre of the planet. (There are other factors, including the spin of the Earth and the fluid nature of water, that add the horrendous complexity of the real calculations, but this gives us a feel.) That distance, the Earth's radius, is significant compared with the distance to the Moon, but makes very little difference compared with the distance to the Sun. It's the amount of variation in gravitational pull that matters for tides, not the absolute value. This results in the tidal effect being approximately dependent on the inverse of the distance cubed, not the distance squared as in the usual gravitational calculation. And hence the Moon becomes the big cheese. As it were.
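Both scalings are easy to check with standard astronomical values (rounded; the tidal ratio is the back-of-the-envelope version, ignoring all the messy fluid dynamics):

```python
# Sun vs Moon: direct gravitational pull (goes as M/d^2)
# versus tidal influence (goes as M/d^3).
M_sun  = 1.989e30   # kg
M_moon = 7.342e22   # kg
d_sun  = 1.496e11   # m, mean Earth-Sun distance
d_moon = 3.844e8    # m, mean Earth-Moon distance

# Direct pull ratio (G and the mass of the water cancel out)
pull_ratio = (M_sun / d_sun**2) / (M_moon / d_moon**2)
print(round(pull_ratio))      # 179: the Sun out-pulls the Moon handily

# Tidal ratio: difference in pull across the Earth's radius ~ M/d^3
tide_ratio = (M_sun / d_sun**3) / (M_moon / d_moon**3)
print(round(tide_ratio, 2))   # 0.46: the Sun's tide is under half the Moon's
```

Same bodies, same distances - but swap a square for a cube and the Sun goes from dominating by a factor of nearly 180 to contributing less than half the Moon's tide.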

* If it's about something in one of my books. I reserve the right not to answer questions about, say, One Direction or fashion or many other subjects.