Telepathy has traditionally been conceived of as a spontaneous human ability to communicate one person's thoughts and images directly to another, sometimes in a two-way conversation, without any technology. It has been a mainstay of science fiction and fantasy stories, and many charlatans, frauds, spiritualists and other hopefuls have claimed the ability too. However, extensive and repeated tests have in every case found that natural telepathy is not real.
Yet the curtains appear to be opening on a new stage for potential telepathy, with the use of intermediate technology: "telepathy is now the subject of intense research at universities around the world"1. It is worth first looking at what is already possible. People who are severely immobile have a few technological gadgets available to them which enable them to control computers and devices, and so to communicate, by measuring eye movements or nerve outputs. Neurologists also use tools such as the electroencephalogram (EEG) to measure general levels of activity in the brain, but these are useless for capturing any specific thoughts, words or images. Even if we could sensibly detect others' brainwaves, the actual meaning would be lost to us, as there is no consistent correlation between brainwaves and meanings from one person to another1. So how might we develop the technology to allow actual person-to-person telepathic communication? Read on!
In my original text (1999), I thought that with technological help, we would be able to train our minds to communicate with computers whilst wearing scalp sensors, and that if one particular sensor technology became popular enough, we would end up with the situation where many people use the same brainwaves to communicate the same bits of information. In other words, we would intentionally train ourselves to talk the same language, electromagnetically. The next development would be that one human brain would be able to react and communicate via magnetic fields with nearby people who have undergone similar training/learning. I was wrong: we now know enough to state that this is impossible.
Even if all the neurones in the brain fired at the same time, the magnetic influence would still extend only a fraction of a millimeter beyond the head. Brain activity can be detected in general and in aggregate, but the individual signals cannot be differentiated electromagnetically. The firings of single neurones are far too minute to be detectable by the most advanced and powerful scanners, let alone by another human1. The best readings with on-the-scalp electrodes still require patients to sit in a room specially shielded from the Earth's natural level of magnetic noise. And even if we could detect others' brainwaves precisely, the meaning would still be lost to us, as there is no consistent correlation between brainwaves and meanings from one person to another1. The obstacle preventing us from reading others' thoughts is not a matter of sensitivity training or of skill, but of basic physics.
“[Impulses] sent by a neuron can be picked up by a microelectrode within 100 micrometers of it in the cortex. These impulses have amplitudes of several millivolts (Brecht, et al. 2004). Electrodes on the surface of the brain are, however, too distant to detect them, but can record synchronized fluctuations of electrical potentials in masses of cortical neuronal dendrites. [...] These electrical potentials, like all moving electric charges, generate magnetic fields, which pass through the skull. They are, however, extremely weak (about 5×10⁻¹⁵ Tesla), some nine orders of magnitude less than the Earth's magnetic field and as much as six orders below ambient magnetic noise (Volegov, et al. 2004). [...]
(Retired) Prof. of Neurology, David C. Haas (2007)2
Electromagnetic signals from the brain start out very weak. Professor Kaku, a theoretical physicist, notes that the loss of signal strength is very pronounced. Those who have some physics knowledge might expect that the power drops by the square of the distance from the source, as with gravity - "but magnetic fields diminish much faster than the square of the distance. Most signals decrease by the cube or quartic of the distance"3.
[Thinking] includes repetitive firing of action potentials in neural circuitry containing millions of cortical and sub-cortical neurons (Hart 1993). Even if this immediate activity could be captured, seemingly insurmountable difficulties would prevent its translation into thoughts. To begin with, the translation would need to be simultaneous with the flow of thoughts as well as in the language of the thinker [... and] the neural patterns underlying any thought [are] unique for every individual. Thus, generic translations from neural patterns to verbal thoughts in any language would be impossible.”
(Retired) Prof. of Neurology, David C. Haas (2007)2
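Professor Kaku's inverse-cube point above can be made concrete with a little arithmetic. A minimal sketch, assuming a dipole-like 1/r³ falloff and borrowing the 5×10⁻¹⁵ tesla scalp figure from the Haas quote; the distances are purely illustrative:

```python
# Illustrative only: rough inverse-cube falloff of a dipole-like
# magnetic field. The 5e-15 tesla starting figure comes from the Haas
# quote above; the reference distance and ranges are hypothetical.

def dipole_field(b_ref, r_ref, r):
    """Field strength at distance r (metres), given b_ref tesla at
    r_ref metres and assuming 1/r^3 falloff."""
    return b_ref * (r_ref / r) ** 3

B_SCALP = 5e-15   # tesla, assumed at roughly 0.1 m from the sources
for r in (0.1, 0.2, 0.5, 1.0):
    print(f"{r:4.1f} m : {dipole_field(B_SCALP, 0.1, r):.2e} T")
# Doubling the distance cuts the field to an eighth; at one metre the
# already-tiny signal has dropped by a further factor of a thousand.
```

The point of the sketch is just scale: a signal that is already six orders of magnitude below ambient magnetic noise at the scalp becomes a thousand times weaker again by arm's length.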
But slowly these problems are being chipped away at through increasingly clever gadgets, and one day we may well develop the advanced science and technology required to record a person's thoughts and transmit them to another person, enabling direct communication on a whim.
The technology of reading our minds began some time ago, with the placing of electromagnetically sensitive devices on and around the head. As the technology has matured, we have come to discern particular points of activity in the brain, resolved as "voxels" - the three-dimensional pixels of brain scans.
It was a landmark occasion, over two decades ago now, when a computer read a particular part of a patient's brain and responded with either a "positive" or a "negative". The patient was first shown a ball on a screen which he had to move to the top of the screen. He learnt to communicate "positive" to the electrode to move it up, and "negative" to move it down. After the software was changed so that he could select letters from the alphabet, the patient wrote a letter thanking the scientists for giving him an opportunity to talk.
“People have already learnt to control computers using their brain waves, read by electrodes on the scalp, or implanted electrodes into which neurones were encouraged to grow.”
New Scientist (1999)4
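The letter-selection scheme described above amounts to a binary search over the alphabet: each "positive"/"negative" signal halves the remaining candidates, so one letter needs at most about ⌈log₂ 26⌉ = 5 decisions. A minimal sketch (the splitting rule is hypothetical, not the scheme any particular lab used):

```python
import string

# Sketch of a binary "positive"/"negative" speller: each detected
# signal halves the remaining alphabet. The exact splitting rule is
# invented for illustration.

def select_letter(target, alphabet=string.ascii_uppercase):
    """Simulate the decisions a user would make to pick `target`."""
    candidates = list(alphabet)
    decisions = 0
    while len(candidates) > 1:
        half = len(candidates) // 2
        # The user signals "positive" if the target is in the first half.
        if target in candidates[:half]:
            candidates = candidates[:half]
        else:
            candidates = candidates[half:]
        decisions += 1
    return candidates[0], decisions

print(select_letter("T"))   # → ('T', 4)
```

Even in this idealised form, spelling a short sentence takes dozens of reliably detected brain signals, which is why the article calls the granularity "far too poor".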
But such simplistic communication is not good enough - the training takes too long, and the granularity of the information is far too poor. Since then, progress has been made. A company called Emotiv has produced a prototype EEG headset that has only 18 electrodes (as opposed to as many as 120 in medical EEGs) and requires no gel:
“Emotiv claims that its system can detect brain signals associated with facial expressions such as smiles and winks, different emotional states such as excitement and calmness, and even conscious thoughts such as the desire to move a particular object. [...] It seems to work well enough to make a virtual character in a game mimic a player's own facial expression, as well as permitting that player to move things around just by thinking about it.”
The problem is that we all think differently, often in different languages, and our brains are wired up differently. Many things are the same - the bits of our brains that light up when we see faces, or trees, or vertical lines, may well be the same from person to person. But the meanings and detail get recorded in our brains through our long lives of personal experience, and so there is no consistency between what "thoughts" look like electromagnetically. It does not matter how amazing our sensors get - we still need to train computers to understand what each individual's thoughts look like. The learning process (for the computer) is possibly as endless as the combinations of patterns and meanings in our brains. But scientists have been pushing in this direction to see how much can be achieved.
“Subjects would put on a helmet with EEG sensors and concentrate on certain pictures - say, the image of a car. The EEG signals were then recorded for each image and eventually a rudimentary dictionary of thought was created, with a one-to-one correspondence between a person's thoughts and the EEG image. Then, when a person was shown a picture of another car, the computer would recognize the EEG pattern as being from a car. [...] This method can tell if you are thinking of a car or a house, but it cannot re-create an image of the car. [...]
[Researchers headed by Dr Gallant worked hard to translate such murky images into a video stream representing a person's thoughts whilst they watched a simple video.] Watching it was like viewing a movie with faces, animals, street scenes, and buildings through dark glasses. Although you could not see the details within each face or animal, you could clearly identify the kind of object you were seeing.”
The results are less clear when measuring general visuals generated by the imagination.
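The "dictionary of thought" described in the quote is essentially template matching: record feature vectors for each labelled image, then classify a new recording by its nearest stored template. A toy sketch with invented numbers - the real features, channels and labels are all hypothetical:

```python
import math

# Toy sketch of the "dictionary of thought": store an average EEG
# feature vector (centroid) per concept, then classify a new recording
# by nearest centroid. All numbers are invented for illustration.

def centroid(vectors):
    return [sum(col) / len(vectors) for col in zip(*vectors)]

# Training: several noisy recordings per concept.
recordings = {
    "car":   [[0.9, 0.1, 0.2], [1.0, 0.2, 0.1], [0.8, 0.0, 0.3]],
    "house": [[0.1, 0.9, 0.8], [0.2, 1.0, 0.7], [0.0, 0.8, 0.9]],
}
dictionary = {label: centroid(vs) for label, vs in recordings.items()}

def classify(sample):
    return min(dictionary, key=lambda lbl: math.dist(sample, dictionary[lbl]))

print(classify([0.85, 0.15, 0.2]))   # → car
```

Note what the sketch makes obvious: the classifier can only ever output labels it was trained on, which is exactly the limitation the quote states - it can say "car", but it cannot re-create an image of the car.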
Electrocorticography and Open-Brain Surgery: With on-the-brain meshes of electrodes (electrocorticography, which requires open-brain surgery first), more detail can be obtained because the sensors are much closer to the weak signals from the neurones. Doctors can tell what numbers patients are thinking of. With training, where the patient thinks of particular things and the computer records the associated brain activity, much more can be told, and the computer can even tell which particular words are being thought of.7 But there is, as yet, no way to record and understand "messages" from the brain that have not previously been "played" to the computer. The technology is not ready for any kind of spontaneous or free-flowing transmission of thought.
With some specific improvements to our technology, telepathy can be made partially real. We can record our brain activity and have it sent to another person's mobile phone or tablet, where it is displayed on the screen. We could tap out letters (slowly) using a worn hood full of electromagnetic sensors, sending simple messages in an encoded language, or we could (if Emotiv's EEG hoods turn out to be practical) communicate general emotions and particular facial expressions. There is no way for these messages to be beamed into another person's brain from the outside, however. Devices such as special glasses which display the messages seem a workable solution to that issue, but such a solution isn't "telepathy".
We can do better. With open-brain surgery, we can place a mesh of sensors on the surface of the brain and obtain a much richer repertoire of information - general concepts, numbers and types of objects can be sent from one person to another, albeit without much detail. With surgically implanted electrodes, we can also stimulate the brain. We probably don't want to put these into the genuinely emotional or social parts of the brain, but we could develop technology that delivers coded communication somewhere along a recipient's sensory pathways, such as the optic nerve. Once we have a complete cycle - pure thoughts being sensed, transmitted, and then displayed "magically" for another person to understand - we will have achieved basic, slow and clunky telepathy.
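The cycle just described - thoughts sensed, encoded, transmitted, then displayed to the recipient - can be sketched as a pipeline of stages. Everything below is a hypothetical stub; no real brain-sensing or stimulation API is implied:

```python
# Hypothetical sketch of the sense -> encode -> transmit -> display
# cycle. Every stage is a stand-in stub for illustration only.

def sense():
    """Stub for an ECoG-style reading, e.g. a decoded concept label."""
    return "hello"

def encode(concept):
    """Encode the decoded concept for transmission (here: UTF-8 bytes)."""
    return concept.encode("utf-8")

def transmit(payload):
    """Stand-in for a network hop - the stage where interception and
    tampering become possible once a real link is involved."""
    return payload

def display(payload):
    """Stub for presenting the message to the recipient, e.g. on
    glasses or via a coded stimulus."""
    return payload.decode("utf-8")

message = display(transmit(encode(sense())))
print(message)   # → hello
```

The value of sketching it this way is that each stage is a separate engineering problem - and the transmit stage is where the security concerns of the next paragraph enter.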
Once this open-brain-surgery method becomes well-practised and more popular, the true benefits and risks will present themselves. The transmission and reception of data will start off as a very close-range affair, but the possibility of using the Internet and/or the general radio spectrum as a medium will be immediately tempting. The confidentiality of our very thoughts will be at risk of interception, and hackers could tamper with the integrity of the data. All of a sudden, we will realize we have opened up our very brains to the kinds of security risks that computers face. Imagine the horror once we realize that advertising companies have a way of directly blasting us with images and messages about products whenever they want!
So progress needs to be careful. Luckily, no-one involved in this research thinks we are anywhere near achieving anything close to telepathy, and the researchers are well aware of the risks involved.