As 2014 draws to a close, my thoughts have turned to the greatest neuroscience discoveries of the year. I’ve been particularly struck by several developments in an area of biomedical science that, for most of my lifetime, has been considered beyond the powers of medicine to remedy.
Ever since Christopher Reeve (the actor who played Superman in the much-loved films of the late 70s and 80s) was paralysed from the neck down in an equestrian accident in 1995, the plight of people who suffer traumatic spinal damage has seemed utterly hopeless, despite the huge sums of money various benefactors have ploughed into research. This year, however, we have seen huge scientific leaps that have enabled previously wheelchair-bound people to stand up and take some small but important steps forward under their own volition.
A paralysed person kicked off the 2014 World Cup in Brazil during the opening ceremony using an EEG-controlled robotic exoskeleton. But given that the person in question had to be carried onto the pitch on a golf buggy, as opposed to rising up out of their wheelchair as promised, that feat should only really be considered a drop in the ocean compared to the much more remarkable progress in paralysis rehabilitation we’ve seen over the course of 2014.
At the beginning of the year I was invited to make an appearance on “Newsround” – the Children’s BBC channel’s daily news show – to explain a totally unexpected and extraordinary breakthrough in rehabilitation research with paralysed army veterans in the USA. A chip was surgically implanted over the spinal cord, below the sites of damage, to apply weak currents of electricity in an effort to reinvigorate the involuntary spinal reflexes that enable us to maintain our balance whilst standing (no input from the brain necessary).
This unexpected development occurred when, after a few weeks of further intensive rehabilitation exercises, several people regained voluntary movement of their legs for the first time in 2-4 years. Can you imagine how good that must have felt for the people in question? As someone who personally spent three weeks of 2014 with an almost completely paralysed arm after a complication during routine surgery, it brings tears to my eyes to think how amazing it must have been to regain control over legs that had seemed utterly useless for so many long months. It seems that the current injected by the chip had unexpectedly boosted signal strength across the area of damaged spinal cord sufficiently for the electrical messages (action potentials) to get all the way down to the leg muscles.
In 2004, whilst I was doing my PhD at University College London, I attended a talk by Prof Geoff Raisman, now chair of Neural Regeneration at the Institute of Neurology in Queen Square. He presented brand-new data that he was clearly extremely excited about, depicting new neuronal growth across the site of a spinal lesion. I cannot remember whether the experiment involved rodents or non-human primates, but he made it clear that it would be many years before this pioneering research could ever be used to help paralysed humans. Today, in 2014, this dream is a reality.
Darek Fidyka was paralysed from the chest down for several years after a knife attack severed his spinal cord. The 8mm gap that prevented messages from his brain reaching the muscles of his legs, penis and bladder was bridged using cells extracted from his brain. Mr Fidyka first underwent surgery to remove one of his two olfactory bulbs – the antenna-like structures that extend forwards from the brain’s limbic system, running above each nasal cavity and receiving input from the smell receptors that reach through the skull into the nasal epithelium. Because the olfactory receptors come into contact with so many volatile compounds (just think how potent the gases are that get into your nostrils when you’re downwind of a bonfire), a fair amount of damage is done to these cells and so they must be constantly replenished. This means that the olfactory bulbs and the neurons of the nasal epithelium are a great source of stem cells.
Once sufficient numbers of Olfactory Ensheathing Cells (OECs) had been cultured and several million of them injected into the gap in his spinal cord, a period of intensive rehabilitation exercises got underway: six hours per day, five days per week. A few no-doubt-frustrating weeks later he graduated from walking with the assistance of parallel bars in the rehabilitation gym to walking with a frame outside the hospital in Wroclaw, Poland, where the surgery took place. Perhaps as importantly, he regained some bladder control and sexual function. An incredible achievement for Mr Fidyka, but an absolute triumph for Prof Raisman and the hundreds of people who contributed to the groundwork that led to this unbelievable feat of brilliance.
This story was covered in episode 10 of the podcast Geek Chic’s Weird Science – co-presented by yours truly and the gorgeous Lliana Bird – which you can subscribe to on iTunes, absolutely free of charge, by clicking here.
For daily news on the latest advances in neuroscience research you can follow me on Twitter by clicking here.
As a neuroscientist who spends much of his working life giving brain talks at events all around the country (at schools, conferences and science festivals) I’ve noticed that one theme catches public imagination over and over again: How does caffeine work? What does it do to my brain? How long does it stay in my system? Is it really that bad for me? This is one reason why it became one of the key topics in the “Smart” Drugs chapter of my book: Sort Your Brain Out. In this blog I’ll cover some of the most regularly asked questions.
How long does caffeine take to leave your system?
It depends on what other drugs you’re taking. If you’re on the contraceptive pill it can take up to twice as long for your liver to remove caffeine from your system. So people “on the pill” can find themselves particularly sensitive to its effects because consecutive doses stack up and are not cleared out as swiftly as in everyone else. But if you’re a smoker it is the other way around: caffeine is removed from your system at double the speed it is in a non-smoker.
If you’re neither a smoker nor on the contraceptive pill the concentration of caffeine in your bloodstream is halved every 5-6 hours, but it really does depend on the individual as this “half-life” varies greatly from person-to-person.
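For the numerically minded, the half-life idea above can be sketched as a simple exponential decay. This is only a back-of-the-envelope illustration – the 5.5-hour half-life is just a mid-point of the 5-6 hour range quoted above, and real clearance rates vary widely from person to person:

```python
# Rough sketch of caffeine clearance assuming simple exponential decay.
# The 5.5-hour half-life is illustrative only; individual values vary
# widely (and are altered by the pill, smoking, etc., as described above).

def caffeine_remaining(dose_mg, hours, half_life_hours=5.5):
    """Milligrams of caffeine left in the bloodstream after `hours`."""
    return dose_mg * 0.5 ** (hours / half_life_hours)

# A single 100mg cup of brewed coffee:
print(round(caffeine_remaining(100, 5.5)))  # half is still circulating
print(round(caffeine_remaining(100, 11)))   # a quarter remains ~11 hours later
```

So a mid-afternoon coffee can still leave a meaningful dose in your bloodstream at bedtime – which is exactly why the timing advice later in this post matters.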
Is it beneficial to have caffeine before a meeting / presentation / to improve concentration?
Caffeine blocks the receptors of an inhibitory neurotransmitter called adenosine, which under normal circumstances reduces overall activity across the brain. By blocking these receptors and removing the dampening influence on brain activity, caffeine increases activity across brain pathways involved in alertness, focusing attention and initiating body movements. This is why people dosed up on caffeine can get quite jittery.
Whether or not caffeine is beneficial for you in a meeting / presentation or to improve concentration whilst working depends on how much you’ve already had. There’s a sweet spot where you will feel more alert and switched on at moderate levels, but beyond that you can become so wound up that it has effects that are deleterious to performance (see description of caffeinism below).
However the increase in feelings of alertness and ability to focus attention only gets regular coffee drinkers up to the levels that non-caffeine drinkers enjoy every day. This is because once you’re a caffeine addict the brain tends to increase the number of adenosine receptors to compensate for the fact that there’s loads of caffeine swimming around in your brain on a daily basis. This means that your average coffee drinker has more inhibitory receptors in their brain dampening activity levels to a greater degree – so they will feel more sluggish whenever they don’t have caffeine in their system.
Is caffeine good or bad for you in the long run?
There seem to be some long-term benefits to drinking caffeine even if the short-term benefits don’t amount to a whole hill of (coffee) beans. It has been observed that regular drinkers of moderate amounts of caffeine (3 cups / day) have a lower incidence of Parkinson’s, Alzheimer’s, liver and heart diseases. This may be due to the increased numbers of inhibitory receptors, triggered by ever-present levels of caffeine, dampening activity levels in body and brain. The decreased activity across the brain caused by these extra inhibitory receptors may relieve the pressure on the dopamine neurons that are compromised in Parkinson’s disease and the acetylcholine neurons that get clogged up with various proteins in Alzheimer’s disease. In other words caffeine seems to slow down the process of cell death so that symptoms of these diseases kick in several years later than in your average non-caffeine drinker. For now this mechanism is purely speculative – the jury’s still out on precisely what accounts for these observations – but the evidence supporting the concept of moderate amounts of caffeine having a neuroprotective influence on the brain is steadily increasing.
Is it important to control and monitor your caffeine intake?
A dose of 10g is deadly – as a typical cup of brewed coffee contains 100mg of caffeine, anyone who somehow managed to drink 100 cups may well find themselves popping their clogs. (NB you may notice that in the above video from the lovely people at ASAPscience they say 1 cup of coffee has 150mg – presumably they brew it stronger over in Canada!) For the non-coffee drinkers out there, here are the average caffeine contents of some other popular drinks: 80mg in a can of Red Bull, 75mg in a cup of instant coffee, 50mg in a cup of tea and 30mg in a can of Coca-Cola.
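The cups-to-a-lethal-dose arithmetic above generalises to any drink: divide the 10g (10,000mg) threshold by the caffeine content per serving. A tiny sketch, using only the per-drink figures quoted above (which are rough averages in the first place):

```python
# How many servings of each drink would reach the ~10g lethal threshold?
# Figures are the rough averages quoted in the post above.

LETHAL_DOSE_MG = 10_000  # 10g

caffeine_mg_per_serving = {
    "brewed coffee": 100,
    "Red Bull": 80,
    "instant coffee": 75,
    "tea": 50,
    "Coca-Cola": 30,
}

for drink, mg in caffeine_mg_per_serving.items():
    servings = LETHAL_DOSE_MG // mg
    print(f"{drink}: ~{servings} servings to reach a lethal dose")
```

In other words you would need around 100 brewed coffees or over 300 cans of cola – which is why death by coffee is vanishingly rare, while caffeinism (below) is not.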
Very high but not deadly doses can lead to a quite severe psychiatric condition known as caffeinism, which is “characterised by restlessness, agitation, excitement, rambling thought and speech, and insomnia” (Winston et al, 2005). It is important to control and monitor caffeine intake because too much can interfere with appetite, make people anxious or depressed, not to mention the fact that anything that interferes with sleep will have a deleterious effect on the brain. Everyone’s sensitivity to caffeine is slightly different, but if you have trouble sleeping then you’d be well advised to avoid caffeine for at least 5-6 hours before bedtime – for your brain’s sake.
3 cups of coffee per day is considered a “moderate dose” for most people. Get these in early enough to avoid any potential for them to interfere with sleep and you should get the apparent long-term brain benefits without the negative consequences associated with excessive consumption (DISCLAIMER: this should not be interpreted as medical advice – it is just the science-based opinion of the author who has a Ph.D. in neurobiology i.e. not a medical degree!).
I don’t like tea or coffee, are there any other sources of caffeine?
Caffeine is also found in kola nut (one of the original ingredients of Coca-Cola) and guarana – a wonder berry from the Brazilian rainforest; it’s also found in small quantities in chocolate. Caffeine is included as a stimulant in many cold and flu remedies too – so beware what you reach for when you wake up in the middle of the night with a bunged-up nose!
By the way: if you study the picture on the left very carefully you’ll find a face amongst the coffee beans – can you find it?
Keep looking… he’s definitely there and you’ll kick yourself for doubting me when you find him!!
If you liked this you’ll love my daily brain tweets so please follow me on Twitter by clicking here.
Parks and open spaces improve health and quality of life by incentivising people to get out and take some exercise, which is extremely good for brain health. Just being within eyeshot of some greenery can accelerate healing – so even if you can’t get outside, all you need is a room with a view! If it weren’t for the armies of parkies and council cleaning staff who clean up after members of the public who routinely leave their litter behind, these green spaces would soon become the last place you would want to spend your spare time. The question is: why do people leave their litter behind for somebody else to clear up in the first place?
All human behaviours are governed, more-or-less, by the brain’s predictions of reward and punishment. We are subconsciously guided towards actions that maximise rewards whilst minimising punishments. The pleasure pathways of the brain, in particular the nucleus accumbens, are involved in attaching a reward prediction to a certain course of action based on past experience. Drinking water when thirsty or eating food when hungry are examples of behaviours hardwired to produce powerful sensations of pleasure because they help to keep us alive. However the sense of pleasure that people get from putting rubbish in the bin is not innate, like drinking and eating, but instead it must be learned.
Nonetheless, even in the absence of a sense of reward from putting rubbish in the bin, if littering is consistently punished then that too can steer people away from anti-social and towards pro-social behaviours. Whilst most parents are still apt to discipline their children for littering, which provides valuable experience of the punishments that follow such anti-social behaviour, parents aren’t always around. In the past adults felt at liberty to scold, or even physically punish, any child that they happened to see dropping litter, but in the modern climate of political correctness this has become a thing of the past. Young people no longer learn that punishment reliably follows the act of dropping litter. Their brains therefore do not generate the sense of discomfort, anxiety or unease (generated, if you’re interested, by the anterior insula) that would normally precede an act of anti-social behaviour known through experience to attract punishment. So in the absence of any negative emotions associated with the act of littering, or positive emotions associated with the act of putting litter in the bin, rubbish ends up being lobbed around willy-nilly, even when a bin is conveniently located just a few steps away.
When children are brought up with a strong sense of social responsibility then in later life they may get sensations of what might be called “righteous” pleasure from doing the “right thing.” The point is that to get a feeling of satisfaction from performing pro-social behaviours you must have been trained over prolonged periods of time by parents, carers, teachers and/or peers in order to get a kick out of it. If society wants to encourage pro-social behaviours we’ve either got to praise young people more for putting litter in the bin, or make them very uncomfortable when they just drop it for someone else to deal with. Or, take a leaf out of the Texans’ book. They had great success in reducing littering on the highway (after many years of failure with several different approaches) by adopting a campaign that appealed to young men’s sense of pride and bravado (see left).
A fascinating study, again from the journal Science (Keizer et al, 2008), indicates that evidence of other people’s anti-social behaviour can make others more likely to be anti-social themselves. This suggests that the problem with litter goes beyond just rubbish on the streets and in our parks. In one of their experiments they demonstrated that environments in which anti-social behaviour is evident – e.g. litter strewn around on the pavement, graffiti sprayed on the walls or fire crackers set off in the background – not only make people more likely to litter themselves, but also to commit more serious anti-social behaviours like theft. It seems that people modulate their own behaviour according to cues regarding the degree of anti-social behaviour committed by others. So if you really want to stop other people dropping litter, you might consider reducing the evidence of other people’s anti-social behaviour by picking it up yourself!
I tweet the latest neuro-breakthroughs, hot off the scientific press on a daily basis (and have been doing so for the past 5 years!) so if you’re keen click here to follow me on Twitter.
Last week I gave a talk on body language for post-graduates at Middlesex University. I promised I’d write up a blog about it as a reference for all those lovely people in the audience who listened so attentively and had so many interesting questions for me afterwards (for 2 hours!). So here’s the gist of the main points…
The brain produces many thoughts during any interaction.
Every thought generates a feeling.
Human feelings are spontaneously expressed in body language.
Thus it is possible to work backwards along this chain of events in the following way:
A person’s body language can give you an insight into what they are feeling.
Knowledge of what a person is feeling can be used to infer their thoughts.
But only if you have given the P. I. C. process careful consideration:
- PERSPECTIVES – bear context of each situation in mind: crossed arms = feeling defensive? Or just cold?
- INCONGRUENCE – when the words don’t match the voice and/or body language, the words will be discounted
- CLUSTERS – of body language cues are MUCH more reliable than individual ones
The key thing to bear in mind when thinking about one’s own body language is to try to avoid postures / gestures that raise the suspicion that you are feeling anxious, guilty, uncertain etc. If you know what other people might be looking out for body language-wise then you can take measures to avoid accidentally giving out the wrong message.
In 1971 Prof Albert Mehrabian, together with colleagues at UCLA, published work indicating that when we communicate our feelings and attitudes, the literal meaning of our words accounts for only 7% of the information conveyed. Visual signals (body posture, facial expressions, eye contact etc) accounted for a whopping 55% of the message and acoustic signals in the voice (volume, tempo, rhythm etc) accounted for 38%. Amazingly, given how unlikely these figures seem when we first hear them, the idea has more or less stood the test of time – though strictly speaking it applies to messages about feelings, particularly when the different signals conflict.
Visual > Auditory > Linguistic - In Communication Signals We Trust (most > least)
Mehrabian et al’s work indicated that if what is being said somehow doesn’t fit with the rhythm, speed, volume of voice and/or facial expressions, eye movements and body posture displayed by the speaker, we become suspicious of the words and tend to ignore them. So if we wish to communicate clearly then we must take measures to ensure that these are all aligned. It is vitally important to ensure that you do not inadvertently send mixed messages into the outside world that might cause people to be confused by, angered by or distrustful of the words we speak. This is particularly important when making a first impression in a job interview, business meeting or on a date.
Two Way Street
When we feel happy we smile, when we are sad or angry we frown. Not only do these facial expressions helpfully communicate how we’re feeling to others so that they might use this information to guide their behaviour, it also affects the way we feel ourselves (facial feedback hypothesis).
If you pull a smiling expression, it might feel fake, but it will send a torrent of sensory messages to the brain about the position of your face. This, in turn, triggers activity in the emotional pathways to create feelings that match the facial expression. The same goes for the negative emotions. If you pull a sad face – with bottom lip protruding as if you’re going to start blubbing – eventually you will start to feel melancholy and thoughts of things you really are sad about will start to flit around in your head. The principle works in reverse too: people who have had Botox for cosmetic purposes – to remove frown lines in their forehead (making them physically incapable of frowning) – even report increased ratings of happiness!
The critical point of all this is that it’s a two way street:
• Emotions spontaneously generated by your brain can automatically induce a facial expression
• Facial expressions commanded voluntarily by your brain can automatically induce an emotion
When somebody smiles at you, you will instinctively smile back. That is because in our species a smile indicates that the smiling person in question means no harm – it says: I am friendly, you have nothing to fear. If you think about the two way street of facial expressions / emotions in the context of our innate tendency to mimic the facial expressions of the people around us – when you smile at someone it makes them smile, and their own smiling face makes them feel ever so slightly happier. Never underestimate the power of the smile. Your own happiness can be infectious and people like to spend time around people who make them feel good.
Part two describes body language evolution, leakage and Dr Jack’s A-H of body language, so please CLICK HERE.
In Sept 2013 I gave my “Brain Coach” talk at both Dulwich College and Sydenham High School. That’s the second consecutive year that Sydenham girls entering their GCSE exam year will get my crash course in applied neuroscience. The talk is summarised here on the Girls’ Day School Trust website. It covers changes that take place in their brains as they learn and various neuroscience-informed strategies to manage stress better, stabilise mood, boost problem solving and enhance exam performance. It’s the third year in a row that I’ve shared these insights with Dulwich lads about to embark on their A-levels (and I’ve just been invited back to speak to the Year 11s in Sept 2014!). Nothing quite like repeat business to confirm you have a product that is highly valued and well received!
I’d jump at the chance to give this talk at schools all around the country. Feedback from teachers year on year indicates that students really do benefit from a better understanding of what is going on within their skulls as they learn and acquire new skills. Understanding that all their effort and hard work actually leads to physical changes in the brain is highly motivating – the audience is left to connect the dots themselves – there’s no need to ram it down their throats. Realising that feeling stressed is a sign that body and mind are being mobilised to deal with the cause of the stress turns a negative into a positive – simply by pointing out the common misunderstanding. And advice on how to reduce levels of the stress hormone cortisol when it all starts to become too much to bear gives the students a sense of control over their state of mind. Mnemonic techniques to help them retain important information not just for exams, but for a lifetime – surely the whole point of education after all – have a completely transparent utility. Here’s some feedback from Lisa Cornell, a teacher who invited me to speak at Sydenham High School:
“The talk … was inspirational for staff and students alike. The students enjoyed your informal yet informative style. You made difficult concepts easy to grasp. They especially liked how you applied these high-level ideas to their everyday lives and studying. You were witty and, most importantly, not in the slightest bit patronising. You managed to use an array of high-level language and technical terms [yet] alienated nobody. I particularly liked how you broke down the Latin of long words (eg explaining adrenal).
From a teacher point of view you were engaging, entertaining and a very safe pair of hands for our students to work with. A very good litmus test for any speaker is if students stay behind to speak with you. That you had a ten strong audience of Y11s for half an hour after home time says a lot. Some of those students who stayed I have never seen so enthusiastic about anything!”
I would love to get up on stage in front of many more schools each year as I genuinely feel it is one of the best uses of my broad knowledge of neuroscience and aptitude for conveying it in plain english. If you would like me to speak at your or your teenager’s school then please do drop me a line.
You might also consider following me on Twitter. I flag at least 3 interesting pearls of wisdom from the world of neuroscience and psychology research every day.
As we progress through life we inevitably find ourselves becoming increasingly forgetful. It is not as if bouts of forgetfulness never occurred when we were younger. It’s just that it begins to happen more and more frequently – to the point where it becomes much more noticeable; even troublesome. From our mid-twenties onwards we lose more neurons (brain wires) and synapses (connections between the brain wires) than we build. The long-term end point of this perfectly natural, gradual process of brain ageing is dementia. By which I mean that if we all lived to the impossibly grand old age of 200, every single one of us would have developed dementia of one description or another along the way. In reality very few of us will even make it to the grand old age of 100, let alone 200. Of those that do, not everyone will have been plunged into the amnesic fog of dementia. So what is the difference between the individuals who do and don’t develop dementia well into their senior years? Is it blind luck? Or is there something we can do to prolong our dementia-free status?
Earlier this year I interviewed some of the world’s leading neuroscience researchers at the 2013 British Neuroscience Association annual conference at the Barbican Centre. This video is a short extract of my interview with the lovely Prof Irene Tracey, who is not just an expert in how the brain creates and modulates pain; she is also a brilliant communicator. Here we discuss some basic brain mechanisms involved in reducing pain when the situation demands it. This is one of sixteen interviews with some amazing scientists which I’m slowly but surely editing into a short film. So watch this space…
I reviewed the first Brain Training title on the Nintendo DS a couple of years ago and, to be perfectly honest, the sequel “MORE Brain Training” a.k.a. “Brain Age 2” is not a great deal different. Dr Kawashima’s floating head is still there in its chunky pixelated glory; guiding, encouraging and chiding you throughout. Even the constant repetition that X, Y or Z game is “great for giving your prefrontal cortex a good workout” is ever-present. I had hoped he’d get a bit more specific in this sequel about which task was working out which part of the prefrontal cortex. Especially given that, if the crinkly outer surface of the brain were increased to the size of planet Earth, the prefrontal cortex would cover an area the size of North and South America put together (at least!). Still, there are a few new games; many bear a striking resemblance to the old ones, some are plain dull, but others are really quite novel / clever. Overall I would say it is a bit tougher on the old synapses than its predecessor; which is a good thing…
You may be aware of the fierce debate about whether these games can positively influence cognitive abilities in ways that:
- have a long-term impact
- transfer beyond improved performance on the specific games being played to other cognitive functions useful in daily life
I would argue that, purely in terms of short-term arousal (Steady! In your brain.. not your pants), it is really quite effective. Based on personal experience I have found that 10-15 minutes spent taxing various mental abilities with the higher levels of any of these games is a more effective way of getting going in the morning than a slug of strong coffee. So even if the evidence does not mount up to support the claims of Lumosity, Cognifit and Torkel Klingberg regarding long term cognitive benefits for everyday people that might help them in their daily life, I think it would be pretty hard to refute the claim that challenging your brain to solve a few puzzles first thing in the morning can really help you hit the ground running each and every day.
Anyway, I digress (again). What I like about Brain Age 2 is that it is really hard; punishingly hard at times. In one game you have to keep track of a stickman’s position in a running race as other runners are overtaken / overtake you. In another your task is to keep track of blocks that pile up on each other as they fall behind a screen recalling the height of one particular column. Both are good solid working memory training games (and thus have the best potential to boost IQ; read this book for full explanation) and have a nice progression to them in that they start easy on the earlier levels, build the difficulty gradually, but soon end up challenging even the sharpest of brains.
Other new games are not so challenging. “Days and Dates” and “Correct Change” are clearly built with the aim of developing cognitive skills that have an obvious practical application in everyday life. I suspect these might have been included to address criticisms levelled at the brain-training market – namely, that the games only help people get better at the specific task being tackled. Either way, figuring out what the day was 4 days after 2 days ago, or figuring out the correct coins to give as change if a £/€/$1.40 bill was paid with a ten pound/euro/dollar note, are pretty dull ways to pass the time, if you ask me.
“Missing symbols” – adding the appropriate plus, minus, multiplication or division sign to make the sum work – verges on the dull, but the speed element keeps it challenging. You can always go faster. “Memory addition” takes mental arithmetic to the next level by having to perform a calculation but then keep one of the numbers in mind to use it in the next sum. I must admit to hissing the to-be-remembered number under my breath (recruiting the “phonological loop” aspect of working memory) so as not to get confused with the correct answer for the current sum. “Word Scramble” is cute. Solving an anagram where the letters are not just shuffled but are presented in a ring that slowly rotates. Surprisingly tough, particularly with the longer letter strings!
Anyone who has read my review of Beat City will know I am a fan of games that involve making music. So it will come as no surprise that I think “Masterpiece Recital” is brilliant. A little bit pointless for people who actually play the piano, but great for the rest of us. You have to hit the right note on a piano keyboard as the musical score scrolls past. And you don’t have to be able to read music as it labels both the keyboard and the musical notation with the appropriate letter (see left). The reason I found it so satisfying is that in the later levels the tunes are really beautiful pieces of classical music (and I’m no classical music buff, that’s for sure), plus the accompanying backing music makes even the most amateurish efforts sound pretty good; even if you’re a bit late hitting the notes. You get marked down for this at the end, but whilst you’re in the game it is very enjoyable to feel like you are actually creating such pretty music.
“Word Blend” is a good idea, but poorly executed. It’s loosely based on the dichotic listening test (usually different information is presented to each ear) – straight out of the psychology textbooks – whereby 2 or 3 voices simultaneously say a single word and your job is to recognize the words and write them down. Personally I just found this game irritating. Despite having the option of hearing them repeat it five times or so (but you only score points for words identified without hitting the repeat button) it can sometimes be quite impossible (for me at least) to hear one voice over the other. I suspect it is the fault of the game rather than the player because there was no improvement. So I’m either acoustically challenged, or this particular game is a bit crap.
The game I liked the most, despite thinking upon first encountering it that it was a bit remedial, is one that seems to be inspired by exercises developed to help people overcome learning disabilities. “Determine The Time” is reminiscent of a cognitive development technique invented by Barbara Arrowsmith-Young (whose book “The Woman Who Changed Her Brain” is as amazing as it is inspirational). She developed this simple clock-reading task first to help overcome her own difficulties in learning relationships between symbols (like the relationship between the big hand and little hand of a clock) and then rolled it out as an entry-level exercise for kids and adults with learning disabilities (making a dazzling impact on their cognitive abilities).
It quite literally involves reading the time from a clock, but the twist in this particular game is that the clockface is rotated. This requires you to perform a “mental rotation task” – imagining in your mind’s eye what the clock would look like if it were the right way up – so that you can give the right answer. Such spatial rotation tasks stimulate the parietal cortex (finally something that benefits a brain area other than the prefrontal cortex!!) and, presumably, improvements in these mental rotation tasks will enable the parietal cortex to manipulate all sorts of other information in space.
Incidentally, Einstein’s brain had a larger-than-normal parietal cortex and, given that this lobe is also critically involved in mathematical abilities, it is thought to account (in part at least) for his tremendous contributions to physics. In the harder levels Dr K becomes particularly devious, mirror-reflecting the clockface as well as rotating it. So your parietal cortex has to perform two sequential transformations: reflecting it back and then rotating it the right way up again. It is a very simple idea but, genuinely, in my opinion, a tremendous workout for the parietal cortex.
I am aware that so far the brain games I’ve reviewed are all on the Nintendo DS. I am also conscious that it may seem that I am in some way biased in favour of the Nintendo DS. Both are perfectly reasonable observations. For the record, the true reasons that, so far, I have only reviewed titles on the Nintendo DS are quite simply that a) I happen to own one, b) positive outcomes from brain training are only possible if you play regularly and intensively and c) the smartphone I happen to own is not optimized for gameplay.
Convenience Lends Itself To Regular Training:
For brain training to have even the slightest chance of yielding genuine benefits it must be undertaken regularly, intensively and for long periods of time. In my opinion convenience is therefore a prerequisite of any good brain training game, so I favour options that enable people to fill dead time in their daily routine with gameplay wherever they happen to be. I am aware that there are many home computer-based brain training games, but I personally feel that when I’m at my computer I should be working, not playing games – and I suspect others feel the same way. This is why I haven’t reviewed the various online brain training offerings, instead focusing on those that enable you to brain train on the move. Not only is the Nintendo DS extremely portable and therefore convenient, I also happen to own one, so it is currently my device of choice for gameplay on the move (the only time I personally get the chance to get stuck in).
Why No Smartphone Based Brain Training Reviews?:
I’ve been using a Blackberry for the last few years purely for the slide-out keyboard, which enables me to type without looking at the buttons. Once I’ve got over my distaste for touchscreen smartphone technology (I’m nearly there) I’ll start reviewing iOS / Android brain games. In light of this avowed intent I would be grateful if anybody out there would suggest any games marketed as Brain Training so I can give them the once-over (please drop me an email by clicking here rather than leaving a comment).
I’ve been meaning to read this book for a very, very long time. I spied it on a friend’s bookshelf and wasted no time in negotiating the borrow. I’d heard about it long ago from a mate from university who studied psychology and now works for a major record label. He had left me with low expectations, saying that the hype that greeted its launch onto the market was unwarranted. In fact, he said, Mr Gladwell contradicts the point he makes throughout the book, right at the end.
I can see how this book might be a bit “Marmite” for some people (Marmite is a spread that we Brits either LOVE or HATE to put on our toast – it really does split people into two distinct camps at opposite ends of the spectrum). It takes an interesting point – that the expert brain is capable of making very accurate judgements in a blink (or two) of the eye – and then hammers it to death with examples from many different walks of life.
Personally, having read hundreds of pop science books now with an eye on writing my own (in fact I have finally scored a book deal – only took 5 years!), I thought Blink was great. In my talks at business conferences and training employees in the worlds of Public Relations, Market Research and Advertising, I find myself discussing the neuroscience of decision making at great length. The role of instinct – gut feeling, call it what you will – was long overlooked by economics, and the brain sciences have been filling in the gaps over the last decade or so. What I’ve found is that lay audiences NEED concrete examples to really drive the message home. And Malcolm Gladwell is not only a great storyteller but has also found many wonderful examples to put flesh on the scientific skeleton.
From art experts evaluating the authenticity of a priceless statue, to police evaluating the threat posed to them by a potential criminal, the power and weakness of instant judgements are thoroughly dissected in a very compelling manner. This book may not be to the taste of those already well versed in neuroeconomics and psychology, but for the layperson my instinct tells me this is a must read.
What I like most about this book is that, admittedly with a fair degree of repetition, it makes one point clear and true: if you have developed considerable expertise then you can make sound judgements in the blink of an eye, but if you haven’t got much experience then your instincts will probably misguide you, potentially with catastrophic consequences.
I have many more book reviews in the works for this blog and recently reviewed Phil Barden’s “Decoded” elsewhere. In the meantime, you might also consider catching my daily brain twitterings on Twitter.
A short film describing neuroscientist Dr Jack Lewis’s first 3 years in television. As well as science consultancy for the Emmy award winning documentary “The Living Body” on National Geographic / Channel 4 and primetime quiz show Britain’s Best Brain on Channel 5, Jack has presented several TV series. His most recent roles do not feature in this showreel, but are described briefly in italics below. This showreel features highlights from Dr Jack’s broadcast output up to and including 2010:
- Dr Jack co-presented a prime-time SkyOne series called Body Language Secrets (aired 2010-2011) exploring the themes of selling, attraction, winning, laughter, power, lying and money.
- Jack’s first big break as a presenter came with People Watchers (aired 2008), a BBC2 series exploring the quirks of social psychology via a wide variety of different hidden camera experiments set throughout London.
- In his role as the Face of Faraday 2008 Dr Jack presented 4 short films which aired on Teacher’s TV and were centred around the theme Technology for Life. These films were specifically created to be played during science lessons across the whole United Kingdom in an effort to encourage 12-16 year old pupils to pursue careers in STEM subjects (i.e. science, technology, engineering and maths.)
- The scenes of Jack hanging out with David Ginola and at the modelling shoot are from Naked Britain – another prime-time SkyOne series that took a lighthearted look at British attitudes to nudity. Nothing to do with science, but a good opportunity to hone the old interviewing skills.
[Coming soon: Dr Jack is now the resident neuroscientist on ITV1’s flagship magazine show This Morning where he presents live monthly items for the brand new “Don’t Be A Slave To Your Brain” strand. Jack has also recently appeared in two Channel 4 documentaries presented respectively by Tom Dyckhoff (Secret Life of Buildings) and Tony Robinson (Superstition). THE TECH SHOW on Discovery Science is Jack’s first solo presenting gig to air across multiple continents; beaming out across Europe, Africa and the Middle East throughout 2011]