As the sun sets over a desert riddled with enemy fighters, a solitary Special Forces officer slips on his helmet. He chooses his next move, and his thoughts are transmitted silently to the rest of his unit. As he picks up his weapon, his helmet reads his thoughts, directing an autonomous combat robot positioned nearby to cover his advance. A continuous data stream from his brainwave-monitoring helmet transmits feedback on his stress levels and the information load on his brain to his commanding officer. Thanks to a skin patch delivering drugs into the bloodstream, his stress levels are in check, and, despite sleeping for just two of the past 72 hours, he’s still thinking clearly. Besides, when he enrolled for duty he was screened for his ability to manage highly stressful situations, and his brain scans showed him to be a calm, fast-thinking risk-taker – the perfect soldier.
This scenario sounds futuristic, and to some extent it still is. But recent advances in neuroscience are bringing many of these concepts closer to reality, and prototypes of some of these systems are available now. While the 20th century saw a revolution in weaponry, military engagements of the 21st century could be transformed by a new focus on the brain.
Brain-scanning helmets, neural stimulation and drug enhancement are part of this revolution. “We’re on the threshold of all these things,” says Rod Flower, a biochemical pharmacologist at Queen Mary, University of London. “In the future, the scope for neuroscience to give a military advantage could be very large.”
In early 2012, Flower chaired a British Royal Society group that wrote a report on neuroscience’s potential for military uses. A few years earlier, the U.S. Army commissioned the U.S. National Research Council (NRC) to produce a report on the topic. Featuring predictions and hopes for the future, the report concluded that “a growing understanding of neuroscience offers huge scope for improving soldiers’ performance and effectiveness on the battlefield”. Right now, research groups around the world, either run or funded by military organisations, are conducting research and field trials on everything from drugs that boost reaction times to binoculars that tap into their wearer’s unconscious mind.
The potential of neuroscience to aid the military begins at the screening stage. Soldiers applying to join the Australian Special Air Service Regiment, for example, have to undergo basic checks of their psychological and physical fitness and then a three-week selection course during which their commanding officers assess their individual attributes – such as their leadership skills, their response to stress and their ability to work well in a small team. Only 16% of applicants are ultimately accepted. Brain scanning, such as functional magnetic resonance imaging (fMRI), could quickly and cheaply weed out those who are unlikely ever to make it, and pick up potential future problems in candidates who are selected – as well as identifying soldiers who are most likely to excel.
So what characteristics might these candidates possess? They need to be able to deal well with stress and to make instant decisions in complex, dangerous situations. Fighter jet pilots must monitor things like air speed and fuel availability while thinking about the location of their potential enemy and simultaneously firing missiles. Foot soldiers, meanwhile, may be faced with situations where they have to make decisions that can result in life or death – or at least the destruction of multimillion-dollar pieces of machinery.
When soldiers get this wrong and respond badly to stress, the results can be tragic – ranging from incidents of friendly fire to war crimes. The U.S. soldier who broke into three homes and killed 16 civilians, including women and children, in March 2012 in Kandahar, Afghanistan, was reportedly being treated for stress, suffering from severe nightmares, flashbacks of war scenes and persistent headaches after multiple combat tours in Iraq. In 2008, the RAND Corporation found that nearly 20% of military service members returning to the U.S. from Iraq and Afghanistan reported symptoms of post-traumatic stress disorder or major depression. In 2009, U.S. Army sergeant John Russell was charged with shooting five colleagues dead. He had completed a 15-month tour of Iraq and was being treated for stress before the attacks.
Finding ways to filter out people who don’t cope with the stresses of war would be invaluable for any military organisation, says Flower. “At the moment, it’s done really by a process of trial and error – you put soldiers, sailors and airmen through stressful situations to see how they behave and remove the ones you don’t think can cut it and perhaps give them another job,” for example as a peacekeeper in a civilian area, he says. “If there was a way you could do that simply with a half-hour test, that would save a lot of time and money as well as, hopefully, improving the success rate.”
There’s already progress towards this goal. Research by the Uniformed Services University in Bethesda, Maryland, on members of the U.S. Navy applying to become Navy SEALs has found that cadets less likely to complete the training are those whose scans reveal an intense stress reaction in the hypothalamic–pituitary axis – a brain system central to the stress response. Meanwhile, Robert Ursano, a psychiatrist and neuroscientist also at the Uniformed Services University, suggests that a person’s total number of serotonin receptors might help predict their response to stress. Serotonin – a neurotransmitter, or chemical that transfers signals between nerve cells – is involved in the brain’s reward circuit.
Brain scans might also detect the fastest learners. Scott Grafton, a neuroimaging specialist at the University of California Santa Barbara, has found that fMRI can identify people with more flexible brain networks, and that this flexibility helps to predict how quickly they can learn a new task.
For some military roles – the Special Forces, for example – being a natural risk-taker is highly desirable. In an experiment published in the Journal of Neuroscience in 2008, David Zald, a psychologist at Vanderbilt University in Nashville, Tennessee, and colleagues scanned the brains of healthy people and found that those who were most novelty-seeking or risk-taking process dopamine in a different way to other people. They had fewer autoreceptors, which normally damp down the production of dopamine when levels are rising. Dopamine is responsible for the ‘high’ you feel after winning a race or skiing down a steep slope. The brains of people with fewer dopamine autoreceptors are more steeped in the chemical, predisposing them to keep taking risks and seeking out that high.
More research is needed before brain imaging can reliably be used to identify personality traits and screen soldiers for success, according to Paul Zak of Claremont Graduate University in Claremont, California, an economist turned neuroimaging specialist. But there’s significant potential here, he says, and not just as a way to speed up the selection process. Questionnaires used to assess risk-taking can be inaccurate, but it’s hard to fake data from a brain scan. “In selection situations, an individual may misrepresent themselves… making themselves more fit for a particular job or assignment than they really are. Measures such as dopamine autoreceptor levels cannot be faked.”
By about 2018, it should also be possible to predict how a potential recruit is likely to respond psychologically to environmental stresses like extreme heat or cold, say the authors of the NRC report. A quick brain-screening test to nut out a potential soldier’s response to the myriad of conditions they may face in war would be a valuable tool, says Flower.
Neuroscience could also speed up and intensify training of recruits, picking out those likely to excel in a particular area. Compared with novices, expert marksmen, for example, show increased alpha-wave activity in the left temporal regions of their brains and less communication between different parts of the brain, according to the Royal Society report. Both of these factors could, in theory, be assessed using electroencephalography (EEG), which measures electrical activity in the brain.
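The alpha-wave marker mentioned above is, in principle, straightforward to compute: EEG is a time-varying voltage signal, and alpha activity is simply its spectral power in the 8–12 Hz band. The sketch below shows one minimal way to estimate that with a Fourier transform; the sampling rate, amplitudes and synthetic signal are illustrative assumptions, not parameters from any military system.

```python
import numpy as np

def alpha_band_power(signal, sample_rate_hz, band=(8.0, 12.0)):
    """Mean spectral power of an EEG trace inside the alpha band (8-12 Hz)."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    power = np.abs(spectrum) ** 2 / len(signal)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return power[in_band].mean()

# Synthetic one-second trace: a 10 Hz alpha rhythm buried in noise.
rate = 256
t = np.arange(rate) / rate
rng = np.random.default_rng(0)
alpha_trace = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(rate)
noise_trace = 0.2 * rng.standard_normal(rate)

# The alpha-rich trace shows far more band power than noise alone.
print(alpha_band_power(alpha_trace, rate) > alpha_band_power(noise_trace, rate))  # True
```

A real assessment would average such estimates over many trials and electrodes (here, hypothetically, the left temporal channels) and compare experts against novices.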
One of the most potentially important applications of neuroscience in training is in the field of direct brain stimulation. Transcranial direct current stimulation – known as tDCS – is a technique that involves passing weak electrical currents through the skull via electrodes attached to the head. Vince Clark, a neuroscientist at the University of New Mexico, Albuquerque, leads research on tDCS that has achieved some astonishing results.
In work published in the journal Neuroimage in January 2012, Clark and his colleagues asked volunteers to learn the location of hidden targets in a virtual reality environment. Brain imaging showed that two regions of the brain received input during this learning process. The researchers then used electrodes to pass two milliamps of current over the right temple of a further group of volunteers to stimulate these two regions while this second group learned the location of the targets in the virtual reality environment. Clark’s team found that the second group performed twice as well as those who didn’t get the tDCS. “When combined with imaging techniques, tDCS may prove to be the much sought-after tool to enhance learning in a military context,” the Royal Society report states.
Two milliamps is tiny – just one 500th of the amount needed to power a 100-watt bulb. So exactly how and why does this spark an improvement in learning? Clark thinks it probably works in two ways. First, tDCS increases levels of the neurotransmitter glutamate – thought to increase brain ‘plasticity’ – which in turn facilitates learning. Also, it may alter brain networks to focus perception and cognition on the task at hand. “You learn more when you are focussed on the information you’re trying to learn,” says Clark, who is set to become the director of a new centre being built at the University of New Mexico that will study the effects of electromagnetic stimulation on the brain. “In one sense, tDCS is like a new form of drug. You take it and it changes your behaviour and how you think. There are a lot of advantages over most drugs, though. It doesn’t circulate through your body, so it won’t affect other organs that many drugs can damage. It’s not addictive. If there’s any problem, you can turn it off in a few seconds. It’s also cheap and portable.”
If tDCS can make people generally smarter, as Clark thinks, it may help not only in facilitating success in war, but in avoiding it in the first place. “Smarter societies might have an easier time finding options other than war to work out their differences – if that is what they want,” he says. But the military potential is clear.
Of course, drugs can also boost a soldier’s performance. Sleep is one of the first casualties of prolonged combat, and one of the prime areas of research is ‘wakefulness enhancers’. These are drugs that allow severely sleep-deprived people to work well, without the usual detriments to mental functioning that may lead to potentially disastrous errors. Modafinil, discovered by French scientists in the 1970s, does exactly this (though researchers don’t know precisely how), and seems to have few side effects. Various studies have found that modafinil, which enhances memory and concentration, also prevents the mental deterioration associated with a lack of sleep, improving people’s reaction times and their ability to recognise patterns.
The U.S. Air Force has investigated modafinil, while Britain’s Ministry of Defence has also reportedly commissioned research into the potential of the drug in combat situations. It’s not clear to what extent modafinil may already be used in military situations, says Flower. But “there is an enormous amount of modafinil produced every year – far more than can possibly be accounted for by prescriptions for ADHD (attention deficit hyperactivity disorder), which is the clinical usage of it,” he points out.
Another drug that has provoked strong interest is one naturally produced by the human body – oxytocin. Nicknamed the ‘moral molecule’ – or “the ‘cuddle hormone’, as it is rather nauseatingly called,” says Flower – oxytocin is produced when a woman breastfeeds her baby, during childbirth, and during orgasm. It seems to promote bonding between people.
Research has shown that giving oxytocin to people placed into competitive groups promotes trust and cooperation between members of the group, while provoking defensive, but not offensive, aggression towards the other groups. Oxytocin could increase social cohesion within a military unit, enhancing the high levels of trust and cooperation needed during combat, according to the authors of the British report.
This could be particularly useful in highly stressful situations, which inhibit the release of oxytocin. “Stress in battle is high, so empathy is low,” says Zak, who pioneered research on oxytocin. As Zak explains, “when the battle is over, soldiers engage in ‘humanising’ behaviours, such as hugs and back rubs to reduce stress and reconnect with others,” stimulating the production of oxytocin.
Soldiers have used drugs to boost their performance since at least World War II, when amphetamine use among German soldiers was reportedly widespread. Fighter pilots in the Korean War of 1950 to 1953 used amphetamines to keep them alert on long sorties. There are ethical concerns in offering such drugs to soldiers, however. They may face strong peer pressure to take the drugs, or blame them if things go awry, says Flower. In 2002, two American pilots who killed four Canadian soldiers in Afghanistan in a case of friendly fire had taken amphetamines about an hour before the incident, an inquiry found. According to the pilots, the U.S. Air Force had given them the drugs, which, they said, was then routine practice.
“Incidents where stimulant use may have led to friendly fire accidents might be a reason why many militaries are moving towards modafinil rather than amphetamines as ‘go pills’,” says Anders Sandberg, a philosopher at the Future of Humanity Institute at the University of Oxford in Britain. “Scheduling proper sleep would be better, but that’s unlikely to work in battle situations.”
Some of the most fascinating neuroscience techniques aimed at enhancing soldiers’ performances don’t involve changing their brains, however, but simply monitoring them. To some extent, brain imagers may even boost the performance of military personnel without their conscious knowledge.
One crucial task for any military effort is to scour images to spot changes. Analysts scan satellite photographs and highlight shifts in missile encampments or new ammunition stockpiles, for example. While this work is vital, it’s also mentally taxing, and minor changes are easily missed. At least, it’s easy to miss these minuscule changes consciously, because your brain simply doesn’t bother your conscious awareness with small and apparently insignificant components of a scene. But what if you could identify these kinds of changes by tapping into a target spotter’s unconscious?
The U.S. Defence Advanced Research Projects Agency (DARPA) has tested the idea, using EEG caps to monitor analysts’ brainwaves. Analysts were shown a rapid series of images. If the image contained something that they were looking for, they produced what’s called a P300 brainwave, indicating their brain had picked out the presence of something interesting in the image before they were even consciously aware of it.
DARPA’s prototype system takes any image that provokes a P300 response and instantly files it for more thorough checking. The agency estimates that this kind of system, tapping into an analyst’s unconscious mind, can triple the detection of potentially interesting targets, according to the Royal Society report. Flower says he expects a lot more work into the military potential of subconscious knowledge in the future. DARPA would like smart binoculars that combine brainwave detection with advanced optics to improve a soldier’s ability to spot a threat, for instance. And there will undoubtedly be many more examples of the military use of brain-scanning caps or helmets, not only to improve performance but to connect directly with the wearer’s brain, Flower says.
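The triage logic described above is simple at its core: present images rapidly, watch for a P300-like response in the EEG shortly after each one, and queue any image that evoked one for human review. The toy sketch below captures that flow; the amplitude values, threshold and image names are invented for illustration and bear no relation to DARPA’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    image_id: str
    p300_amplitude_uv: float  # peak EEG amplitude ~300 ms post-stimulus (microvolts)

def triage(frames, threshold_uv=5.0):
    """Flag images whose presentation evoked a P300-like response.

    Threshold and amplitudes are illustrative assumptions, not real parameters.
    """
    return [f.image_id for f in frames if f.p300_amplitude_uv >= threshold_uv]

stream = [
    Frame("sat_001", 1.2),  # nothing of interest registered
    Frame("sat_002", 7.8),  # strong response: the analyst's brain spotted something
    Frame("sat_003", 2.4),
    Frame("sat_004", 6.1),
]
print(triage(stream))  # ['sat_002', 'sat_004']
```

The appeal of the design is that the flagging happens before the analyst is consciously aware of anything; only the flagged subset needs slow, deliberate scrutiny.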
Such helmets could monitor brain activity to flag any soldier showing signs of fatigue or information overload – prototypes made by U.S. technology company Honeywell in Morristown, New Jersey, already exist. Use such a helmet to link a soldier to a piece of hardware, like a robot or a weapon, and you have the true beginnings of a cyborg.
Given that the human brain can process images much faster than it can consciously register them, a weapons system with a neural interface could offer significant advantages in speed and accuracy over other methods of triggering weapons, such as physically pushing a button.
This kind of system would build on research into the control of artificial limbs for people with disabilities, and the way gamers navigate through their virtual worlds. Games that involve using an EEG cap to move a ball through a maze, for example, are becoming increasingly popular. “They are not terribly sophisticated,” says Flower, “But once the games industry gets hold of an idea like this, because of the amount of cash involved, they can take it forward very quickly. I wouldn’t be surprised if something much more sophisticated emerges over the next few years.”
DARPA, for one, has made its desires public. In the short-to-medium term, it envisages brain–computer interfaces as sensory enhancers. A magnetic implant in a soldier’s fingertip could respond to data from an infrared sensor to allow them to ‘feel’ their way in the dark, for instance.
DARPA also envisions a device that would interpret signals from the brain to control an avatar moving through a virtual environment – or one that would allow the remote control of a drone aircraft or a robot or gun by thought alone. This may be futuristic now, but neurobiologist Miguel Nicolelis of Duke University in Durham, North Carolina, has used brain implants to allow monkeys to control a robot walking on a treadmill, and to ‘feel’ virtual objects using a virtual arm controlled by their brain.
Again, ethical concerns become paramount. If a thought-controlled drone dropped a weapon that killed civilians by accident, would the operator or the machine be responsible? “The problem is a diffusion of responsibility – and that people might do impulsive things,” says Sandberg.
When you act, your intention to do so proceeds through different brain regions until it turns into a pattern that actually makes your muscles move. Various brain systems can put in a veto at different steps – which is why we don’t act on all the random intentions we have all the time, and why people do stupid things when they’re drunk. “If the action is picked up at an early stage, there is less chance to veto it – the machine will start implementing it quickly and I might be unable to stop it,” says Sandberg. “This makes a brain–computer interface controller riskier than a joystick from a moral standpoint.”
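One engineering answer to Sandberg’s worry is to build the veto back in: hold each brain-decoded command for a short window before executing it, so the operator (or a safety layer) can still cancel. The sketch below is a purely hypothetical design – real neural interfaces deal in continuous signals, not discrete commands – but it shows the trade-off between responsiveness and the chance to veto.

```python
class VetoQueue:
    """Hold decoded commands for a confirmation window before releasing them.

    An illustrative design sketch, not a description of any real system:
    a longer window gives more chance to veto, at the cost of speed.
    """

    def __init__(self, veto_window_s=0.5):
        self.veto_window_s = veto_window_s
        self.pending = {}  # command id -> earliest release time (seconds)

    def submit(self, cmd_id, now_s):
        self.pending[cmd_id] = now_s + self.veto_window_s

    def veto(self, cmd_id):
        self.pending.pop(cmd_id, None)  # cancel a command still in its window

    def release(self, now_s):
        """Return commands whose window expired without a veto."""
        ready = [c for c, t in self.pending.items() if t <= now_s]
        for c in ready:
            del self.pending[c]
        return ready

queue = VetoQueue(veto_window_s=0.5)
queue.submit("fire", now_s=0.0)
queue.submit("advance", now_s=0.0)
queue.veto("fire")              # cancelled inside the window
print(queue.release(now_s=1.0))  # ['advance']
```

Shrink the window towards zero and the system behaves like the early-pickup case Sandberg describes: fast, but with commands that are effectively impossible to stop.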
These kinds of complex ethical considerations will need to be worked out, says Flower. But the potential for neuroscience and related technology to transform the soldiers of the future seems clear. Even the ability to keep soldiers awake and mentally strong would be a huge advantage, he says.
These kinds of developments may also be essential as military personnel numbers drop but technology moves forward. In many countries around the world, armed forces are shrinking. “This means the people we have are going to have to perform at a higher level,” says Flower. “In the future, the potential for neuroscience to grant a military advantage will become much more important.”