The “racial intelligence” bulldozer that James Watson set rolling a couple of weeks back is still whirring in our backyards, and now a new row over superior Jewish IQ scores is begging for media attention.

This time the perpetrator is Charles Murray of Bell Curve fame, along with his American Enterprise Institute. I am not going into much detail about it. You can read the whole thing here.

The book ‘Bell Curve’ by Charles Murray and Richard J. Herrnstein had the fate of being at once the most celebrated and the most shunned book on the sociology of intelligence. It set the scene for serious debate on whether IQ tests could be used as real measurements of people’s intelligence.
The popularized consensus regarding the average intelligence of various races ran as follows: Ashkenazi Jews 115; East Asians 105; Whites 100; American Indians and most Hispanics 90; African Americans 85; and sub-Saharan Africans 70. Accumulation of wealth, level of income, academic success and so on were popularized as tracking closely, in the same order, with the average IQs of these six racial groupings.

What do we actually know about Intelligence Quotient?

The results of large studies and their meta-analyses can be filtered down to the following concrete points:

  • IQ tests aspire to assess only 4 areas of human intelligence:

Verbal comprehension, Processing Speed, Working (arithmetic) memory and Perceptual organization (visuo-spatial).

  • People who are good at one area of intelligence (as defined by the test) tend to be good at other areas too (positive correlation).
  • IQ tests are always constrained by cultural, linguistic, socio-economic and other contextual biases.
  • IQ tests don’t assess attributes like

Creativity, Personality, Practical sense, Social sensitivity, Leadership and Altruism.

  • IQ tests can be consistently and precisely interpreted only when the scores are very low or very high.
  • General Intelligence or ‘g’, as measured by IQ scores, is an attribute of the psyche (the software), rather than any macroscopic characteristic of the brain (the hardware).
  • IQ test scores are positively correlated to genetic make up as evident from large twin studies.
  • The effects on IQ scores attributed to environmental factors tend to fade as test-takers enter adulthood and, later, old age.
  • Even so, there is no definitive evidence yet linking higher IQ test scores to brain size or to any particular genotype.
  • Training for IQ tests can certainly improve IQ scores.
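The positive cross-subtest correlation noted in the list above is what statisticians summarize as ‘g’. A toy simulation (an illustration only, not a model of any real IQ battery; the loadings and sample size are made up) shows how a single latent ability produces positive correlations among every pair of subtests:

```python
import numpy as np

rng = np.random.default_rng(0)
n_people = 5000

# One latent ability ('g') feeding four subtests, each with its own noise.
g = rng.normal(size=n_people)
loadings = [0.8, 0.7, 0.6, 0.7]   # how strongly each hypothetical subtest taps g
subtests = np.column_stack([
    w * g + np.sqrt(1 - w**2) * rng.normal(size=n_people) for w in loadings
])

# Correlation matrix across subtests: every off-diagonal entry comes out positive.
corr = np.corrcoef(subtests, rowvar=False)
print(corr.round(2))
```

Running a principal-component analysis on such a matrix recovers one dominant factor, which is essentially how ‘g’ is extracted in practice.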

Genetic basis for intelligence

The animal brain plays the role of hardware on which the software called “mind” runs. Stretching that metaphor a little suggests that there need not be a one-to-one correspondence between the hardware and the software for every behavioral trait. And studies have only shown IQ to be genetically linked; they have NOT shown that intelligence is solely a genetic quality. Having the right environment is essential for the full expression of such “intelligence” genes (if there are any).
The higher average IQ scores of certain human groups may hence be more cultural than biological. Cultures that pressure their children to achieve more in academics tend to produce better IQ scores. Just as average human height has increased over generations thanks to better nutrition, the expression of intelligence too can hopefully improve in encouraging environs.

Is intelligence an essential survival quality?

It may be difficult to swallow without a pinch of salt, but it’s true: intelligence does not confer any significant evolutionary survival advantage on a species.

Survival in the natural world, in a reductionistic sense, means advantage in numbers, a disease-free and adaptable gene pool, and flexibility with nature. Social achievement and academic profiles, the direct correlates of high IQ scores, have nothing to do with gene propagation and adaptability, even though various human societies emphasize these aspects more with every generation.

If we extrapolate the selfish gene concept to the macro evolutionary scenario, we find that the mind, consciousness, intelligence and all the emergent behavioral patterns are just a few of the myriad ways of “selfish” genes to propagate themselves.

This “rule” becomes clearer when one steps further back and looks at the evolution of life on planet earth as a whole.





The past week saw the year’s greatest controversy ignited by the seemingly racist comments of the 79-year-old Nobel Laureate James Watson.

Watson, who shared his Nobel with Francis Crick and Maurice Wilkins for the discovery of the double-helical structure of DNA, has always been at the centre of controversies, mostly because of his fascination for eugenics.

The latest of his comments on intelligence and race came in an interview in the Sunday Times on Oct 14.

He says that he is “inherently gloomy about the prospect of Africa” because “all our social policies are based on the fact that their intelligence is the same as ours – whereas all the testing says not really”, conceding that this “hot potato” is going to be difficult to address. His hope is that everyone is equal, but he counters that “people who have to deal with black employees find this not true”.

He adds that you should not discriminate on the basis of colour, because “there are many people of colour who are very talented, but don’t promote them when they haven’t succeeded at the lower level”. He writes that “there is no firm reason to anticipate that the intellectual capacities of peoples geographically separated in their evolution should prove to have evolved identically. Our wanting to reserve equal powers of reason as some universal heritage of humanity will not be enough to make it so”.

Watson was quick to apologize for his biased and baseless comments once the interview became controversial and his credibility was challenged not just by the general public but by the scientific community as well.

He said: “To all those who have drawn the inference from my words that Africa, as a continent, is somehow genetically inferior, I can only apologize unreservedly. That is not what I meant. More importantly from my point of view, there is no scientific basis for such a belief.”

But whatever he says now, the damage has been done. Watson’s claims of “evidence” regarding the intellectual levels of races have given right-wing neo-conservatives a chance to reiterate their anti-reservationist, anti-pluralistic demands, toppling political balance everywhere.

Watson knows very well that no study has yet produced statistically significant evidence of genetic differences in intelligence among human races. In fact, the last part of his quoted statement (about the possible differences in the evolution of intellectual capacities of peoples geographically separated) may be quite plausible, but reading Watson in the context that he himself has set up through his eugenic rantings in his “post-Nobel” years reveals a prejudiced man.

The funniest part lies in two simple but powerful facts of neurobiology and cognitive science:

  1. there is as yet no IQ test that is perfectly culture-free and context-independent and can be administered universally to all sects of Homo sapiens;
  2. no gene identified so far has been directly linked to any kind of animal intelligence.

Without satisfying these two ideal conditions, it is scientifically impossible to extract objective evidence regarding intellectual inferiority and superiority at the genetic level.

Page 3 Tag: Watson’s own reputation had long been under question after revelations about the role of Ms. Rosalind Franklin in the ‘DNA helix discovery’ and how her data was used without giving her any credit. The story goes that Ms. Franklin was ungratefully excluded from the Nobel-winning work after Watson and team “stole” her data. Ever since the allegations came up, Watson has kept attacking Ms. Franklin (even after her death), passing comments about her being intellectual but autistic, and so on.

The Sunday Times interview is here.

See also :

Intelligence – Myths and reality

Evolutionary Socialism – a cause for affirmative action.


Sex differences in cognition and behavior–such as increased aggression in males–are usually thought to involve hormones, which can “masculinize” or “feminize” a brain temporarily or permanently. But now, a mouse study shows that some sex-linked genes don’t need hormones to shape male and female behavior.


The Y chromosome in males has long been known to contain the gene SRY (Sex-determining Region Y), which triggers the formation of testes, and by the early ’90s scientists had learned how to breed mice whose genes and hormones function independently. Since human SRY is similar to mouse SRY, a model of SRY function has been developed in mice.

By knocking out the testes-determining SRY gene, on the Y chromosome, researchers made XY mice that churn out estrogen; and by adding SRY to females, they produce XX mice that manufacture male hormones.

With the help of such mice, it was shown that genes unrelated to hormone production also play an independent role in aggression and nurturing behaviors. This was a new revelation because, until then, it was believed that hormones alone determined such behaviours.

A team led by neuroscientist Jane Taylor of Yale University was interested in habit-forming behaviors, in which gender differences have also been documented. She and her colleagues trained these mice, as well as normal male and female mice, to poke their noses through one of three holes in order to obtain a food pellet. Then some of the mice were subjected to “conditioned taste aversion”: after eating the food, they were injected with a chemical that made them sick (something like what we use in alcoholic patients to help them quit drinking). Ordinarily, mice will quickly learn to avoid the food, but they will still eat it if they have developed an automatic habit. That happened more often for the XX mice, regardless of whether they produced male or female hormones. Thus, the researchers say, the sex difference must have something to do with genes that are not involved in the production of sex hormones.

Neurobiologist Lawrence Cahill of the University of California, Irvine, says that the study “relates very well to established sex differences in the acquisition of addictive habits.” For example, women progress from casual drug-taking to a drug habit faster than men do–a phenomenon some have attributed to hormones. Taylor says that the work also implies that women can be good multitaskers–by quickly forming habits that leave their higher brain functions free for other chores.

Blogger’s Post Script: The last piece about the study implying that women can be good multitaskers seems to be a hasty extrapolation to me, as the quoted study doesn’t provide results of any of that sort, although estrogen has been linked to many skilled activities and greater attention span. It is also important to keep in mind that the precise mechanisms of sex differentiation are still unknown and that gender differentiation is accomplished through a cascade of gene activations. Further factors are involved, then, before as well as after the SRY expression – for example the WT-1, SF-1, DAX-1, SOX-9 genes. (I’m not being chauvinistic here – just bringing to notice a bad practice in science reporting, that’s all.)

Report in Nature Neuroscience. (21 October).

News modified from Sciencenow.


And it goes to:
Mario R. Capecchi, Martin J. Evans and Oliver Smithies for their discoveries of “principles for introducing specific gene modifications in mice by the use of embryonic stem cells”

Mario R. Capecchi, born 1937 in Italy, US citizen, PhD in Biophysics 1967, Harvard University, Cambridge, MA, USA. Howard Hughes Medical Institute Investigator and Distinguished Professor of Human Genetics and Biology at the University of Utah, Salt Lake City, UT, USA.

Sir Martin J. Evans, born 1941 in Great Britain, British citizen, PhD in Anatomy and Embryology 1969, University College, London, UK. Director of the School of Biosciences and Professor of Mammalian Genetics, Cardiff University, UK.

Oliver Smithies, born 1925 in Great Britain, US citizen, PhD in Biochemistry 1951, Oxford University, UK. Excellence Professor of Pathology and Laboratory Medicine, University of North Carolina at Chapel Hill, NC, USA.

Their discoveries led to the creation of an immensely powerful technology referred to as gene targeting in mice. It is now being applied to virtually all areas of biomedicine – from basic research to the development of new therapies.

Our DNA is packaged in chromosomes, which occur in pairs – one inherited from the father and one from the mother. Exchange of DNA sequences within such chromosome pairs increases genetic variation in the population and occurs by a process called homologous recombination.
Mario R. Capecchi demonstrated that homologous recombination could take place between introduced DNA and the chromosomes in mammalian cells. He showed that defective genes could be repaired by homologous recombination with the incoming DNA.

Oliver Smithies, who worked on blood diseases, initially tried to repair mutated genes in human cells by correcting the disease-causing mutations in bone marrow stem cells. (Bone marrow stem cells give rise to all blood cells.) In these attempts Smithies discovered that endogenous genes could be targeted and modified by homologous recombination.

The cell types initially studied by Capecchi and Smithies could not be used to create gene-targeted animals. This required another type of cell, one which could give rise to germ cells. Only then could the DNA modifications be passed on from the parent cell to the daughter cells.

Martin Evans worked on genetically modifying mouse embryonic stem cells, and for this purpose chose retroviruses, which have the machinery to integrate their genes into the chromosomes of the cells they infect.
He demonstrated the transfer of such retroviral DNA from embryonic stem cells into the mouse germ line. Evans also applied gene targeting to develop mouse models of human diseases. He developed several models of the inherited human disease cystic fibrosis and has used them to study disease mechanisms and to test the effects of gene therapy.

Capecchi and Smithies had demonstrated that genes could be targeted by homologous recombination in cultured cells, and Evans had contributed the necessary vehicle to the mouse germ line – the ES-cells. The next step was to combine the two.

A “KNOCK-OUT” mouse is one in which a particular gene has been selectively inactivated. The inactivation is achieved by homologous recombination between the DNA of mouse embryonic stem cells and a small, artificially constructed segment of genetic material introduced into them.

How do we benefit?

The technology opens up opportunities to selectively shut down mutated genes that are known to cause diseases in mammals, which helps us study what exactly the “disease gene” does. By creating mouse models of human diseases in the lab, gene targeting has helped us understand the roles of many hundreds of genes in mammalian fetal development.

Gene targeting has already produced more than five hundred different mouse models of human disorders, including cardiovascular and neuro-degenerative diseases, diabetes and cancer.

Source: nobelprize.org

Overzealous Science Journalism.

Scientists keep complaining that the public doesn’t understand science. Yet university and lab newsroom reports of the latest research, each claiming a “break-through”, are becoming glaring examples of how sober scientific facts get contorted into flashy news totally detached from reality. Quite often the researchers themselves indulge in this unbecoming activity as they prepare lay reports about their research. Maybe it’s their craving for the limelight, or maybe it’s pressure from the funding groups or the universities themselves.

Overzealous science reporting exhibits two basic kinds of flaw: first, where the interpretation of the research findings and their extrapolation are themselves far-fetched; second, where the journalist’s understanding of the research data is imperfect.

The chief problem when researchers report their own study is that they hardly bother to contain their explanations within the limits of their research data. Instead, there is a more-than-needed emphasis on the broader implications of the new study and on why it is “so important”. This leads to unrealistic extrapolation of research data – a menacing issue especially in social psychology and the behavioral sciences.

Take, for example, the story from the Beckman Institute, University of Illinois, regarding the re-running of the famous Duncker Fortress/Tumor Problem: “Researchers Find Eye Movement Can Affect Problem-solving, Cognition.”


They report in the current (Aug., 2007) issue of Psychonomic Bulletin and Review that by occasionally guiding the eye movements of participants with a tracking task unrelated to the problem, they were able to “substantially affect their chances of problem-solving success” to the point where those groups outperformed every control group at solving the problem. These results, they conclude, demonstrate that “it is now clear that not only do eye movements reflect what we are thinking, they can also influence how we think”.

A quick run through the original paper tells us that such generalizations were too hasty, while even the basic question of whether the problem-solvers really used the visual clues offered by eye-tracking remains unanswered. Remember that even with very explicit visual and analogical cognitive clues, Gick and Holyoak could not produce satisfactory results in their 1983 ‘modified re-run’ of Duncker’s original Fortress/Tumor Problem experiment.

The trends in sociobiology are much more deplorable than this. Darwinian principles in evolutionary psychology and sociobiology have almost (or at least in popular appeal) become synonymous with genetic determinism. The world is trained to ask “Did my genes make me do that?” and the media promptly replies quoting a new research: “Blame it on your genes, baby!” And then there is the new idea of “neuro-marketing”, where the detection of a flurry of regional brain activity in an fMRI on seeing a brand is “branded” as “BRAND PREFERENCE”.

Newsroom boys found it less exciting to call it “learned preference”, which had been the more appropriate explanation for the behavior anyway.

There are more instances of such absurd reporting in other realms of science. The recent “Soliton theory of Nerve impulse conduction” of the Copenhagen University researchers and the “Faster than Light Signal transmission Experiment” of the NEC research institute group in Princeton are reasonably good experimental designs whose results were totally contorted to look outlandish.

The Soliton Theory of Nerve Conduction was revolutionary on one account: it suggested a broader perspective in which the established ionic conduction theories could be viewed. All it sought was to answer the perplexing age-old question: why isn’t much resistive heat generated in nerves as a result of electrical conduction? Whatever the scientific plausibility of the findings, they weren’t even close to what the news headlines shouted: “nerves use sound, not electricity!”

Similar is the story of the recent claims of sending electromagnetic impulses at superluminal velocities. The story came, as usual, in the popular media under roaring headlines such as “Speed of light barrier broken” and “Time travel becoming a reality”. Astonishingly unscientific interpretations of the “Relativity questioned” kind were also not uncommon.
What the NEC Research Institute group really did was create an anomalously dispersive medium and achieve propagation of a pulse at a group velocity above that of light in vacuum, through early re-phasing of the component waves. Only the group velocity exceeded light speed; no signal or information travelled faster than light. There is nothing NEW here as far as relativity or time travel is concerned, notwithstanding the brilliance of the experimental setup.
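The distinction can be seen in a small numerical sketch (a toy dispersion profile, not the NEC group’s actual medium): the group velocity v_g = c / (n + ω·dn/dω) exceeds c as soon as the refractive index falls steeply enough with frequency, which is exactly what “anomalous dispersion” means.

```python
import numpy as np

c = 1.0                       # work in units where c = 1
omega0, n0 = 10.0, 1.0        # centre frequency, background index

def n(omega, slope):
    # toy anomalous-dispersion profile: the index *falls* with frequency
    return n0 + slope * (omega0 - omega)

for slope in (0.0, 0.05):     # ordinary medium vs anomalous region
    domega = 1e-6             # numerical derivative dn/domega at omega0
    dn_domega = (n(omega0 + domega, slope) - n(omega0 - domega, slope)) / (2 * domega)
    v_group = c / (n(omega0, slope) + omega0 * dn_domega)
    print(f"slope={slope}: v_group = {v_group:.2f} c")
```

For slope = 0 the pulse envelope travels at c; for the anomalous slope it comes out at 2c. A detector downstream still receives no information faster than light; only the pulse envelope is re-shaped forward.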

Science reporting, like any other reporting, is fast yielding to sensationalism. Flashy headlines, digressions peppered with quotes from veterans, unrealistic extrapolation of lab data, unnecessary links to science fiction and fantasy, hasty generalizations and an overemphasis on genetic determinism are tricks regularly used in newsrooms to ensnare the uninformed reader. What finally spreads are half-truths that can boomerang on the scientific community itself.

A recently published Florida State University study points to the evolutionary psychology of attractive faces.

The paper, “Can’t Take My Eyes Off You: Attentional Adhesion to Mates and Rivals,” by Jon Maner, an assistant professor of psychology at FSU, is one of the first to show how strongly, quickly and automatically we are attuned to attractive people, he said. FSU graduate students Matthew Gailliot, D. Aaron Rouby and Saul Miller co-authored the study.

In a series of three experiments, Maner and his colleagues found that the study participants, all heterosexual men and women, fixated on highly attractive people within the first half of a second of seeing them. Single folks ogled the opposite sex, of course, but those in committed relationships also checked people out, with one major difference: They were more interested in beautiful people of the same sex.

“If we’re interested in finding a mate, our attention gets quickly and automatically stuck on attractive members of the opposite sex, but if we’re jealous and worried about our partner cheating on us, attention gets quickly and automatically stuck on attractive people of our own sex because they are our competitors,” Maner said.

Maner’s research is based on the idea that, through processes of biological evolution, our brains have been designed to strongly and automatically latch on to signs of physical attractiveness in others in order to both find a mate and guard him or her from potential competitors.

“These kinds of attentional biases can occur completely outside of our conscious awareness,” he said.

The insecurities of romance?

Biology or not, this phenomenon is fraught with potential romantic peril. For example, even some people in committed relationships had difficulty pulling their attention away from images of attractive people of the opposite sex. And fixating on images of perceived romantic rivals could contribute to feelings of insecurity.

Modern technology has enhanced these pitfalls. Although there are people of striking beauty in real life, singer Frankie Valli’s pronouncement that “you’re just too good to be true” may be the case when it comes to images in movies and magazines or on the Internet.

“It may be helpful to try to minimize our exposure to these images that have probably been ‘doctored,’” Maner said. “We should pay attention to all of the regular-looking people out in the world so that we have an appropriate standard of physical beauty. This is important because too much attention to ultra-attractive people can damage self-esteem as well as satisfaction with a current romantic partner.”

“Women paid just as much attention to men as men did to women,” he said. “I was also surprised that jealous men paid so much attention to attractive men. Men tend to worry more about other men being more dominant, funny or charismatic than they are. But when it comes to concerns about infidelity, men are very attentive to highly attractive guys, because presumably their wives or girlfriends may be too,” Maner said.

sources: FSU news room; http://content.apa.org/journals/psp

Gravity, Relativity and Quantum experiences: searching for a consensus.

“Quantum Mechanics is very impressive. But an inner voice tells me that it is not yet the real thing. The theory produces a good deal but hardly brings us closer to the secret of the Old One…I am at all events convinced that He does not play dice.”

– Albert Einstein
(in a reply to one of Max Born’s Letters in 1926)

Every time the word ‘quantum’ is used to imply some surrealistic idea, I quiver with apprehension. So much has been said about the so called ‘uncertainty’ lying deep within it, the ‘nonlocal’ connivance of the non-classical world and the weird notions of ‘probability’ and ‘wave function collapse’ that the shroud of mystery around the theory thickens by the moment.

Countless half-baked hypotheses claiming to link animal consciousness and the concept of ‘soul’ to principles of quantum physics run across the information highway, and a casual mind is easily attracted to one or the other.

Despite being the most validated theory in all of physics, Quantum Mechanics is still viewed by many as something that essentially needs adjustment in order to conform to our common-sense view of the classical world. But there is a larger majority of physicists and cosmologists out there who are convinced that it’s our classical world-view that needs revision – the worldwide struggle for a unified theory of the forces tells the tale. Roger Penrose, celebrated mathematician and theoretical physicist, belongs to the first group. He suspects that the cause of failure in unifying the theories lies in a difference of perspective.

QM and Relativity

There are four fundamental forces in the universe: electromagnetism, the strong and weak nuclear forces, and finally gravity. General Relativity attributes gravity to the effect of matter on the space-time fabric. Imagine a stretched sheet of rubber representing the fabric of spacetime, with a few iron balls placed on it representing stars, planets and other massive celestial bodies. The dents made in the rubber sheet by the iron balls can be regarded as the geometric alterations caused by the presence of matter in spacetime. A smaller iron ball set rolling over this sheet moves uniformly forward until it falls into one of the deeper dents caused by balls larger than itself. Analogously, the orbits of celestial bodies are due to the curved or bent space surrounding the larger body. This effect is ‘gravity’, says Einstein.
Quantum Mechanics (QM) and Quantum Field Theory (QFT) have been able to explain all the fundamental forces except gravity in terms of particle interactions; attempts to extend the quantum scheme to gravity picture it as an attractive force mediated by the exchange of hypothetical gravitons.
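The rubber-sheet picture is the intuitive face of Einstein’s field equations, which equate the curvature of spacetime on the left-hand side with its matter-energy content on the right (G_{μν} is the Einstein curvature tensor, T_{μν} the stress-energy tensor):

```latex
G_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```

Dents in the sheet correspond to a nonzero left-hand side wherever the right-hand side (matter) is present.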

“Although this tension between relativity and quantum mechanics may be mostly dormant at the energies that are currently experimentally accessible, there are situations where the interaction between the matter fields and quantum fields and the gravitational field becomes relevant. For example, physically realistic models of the universe predict an initial singularity. At this singularity, classical physics breaks down and it is assumed that a quantum theory of gravity, i.e. a theory combining general relativity with quantum physics, will be necessary to probe the physics of the early universe.”

Nevertheless, both Quantum Mechanics and Relativity give excellent testable predictions and are widely accepted as useful but unproven and ‘incomplete’ models of some deeper reality.

Penrose’s views

Penrose derives his inspiration from Einstein, who believed that a theory incorporating the relativistic nature of gravity and the non-classical nature of Quantum world data would be possible only by correcting Quantum mechanical notions rather than Relativity. Einstein was definitely disturbed by the anti-relativistic findings arising in the quantum scheme of things and the indeterminacy popularly called Heisenberg’s uncertainty principle.
But Penrose is bothered by two things about QM; in fact, the first takes root in the second:

The first is about the ability of a quantum particle to be simultaneously present at two different locations, even though larger chunks of matter don’t seem to do that despite being made up of the very same quantum particles. Simply speaking, why is it that an electron, in the famous double slit experiment, appears to be present simultaneously at two places, while a person or a chair does not appear to do that? The second is about taking the ‘mathematical’ process of wave function collapse (‘State-vector reduction’) for real.

Collapsing Wave functions: real or mathematical?

A photon is emitted from a source in the direction of a receiver, and a half-silvered mirror is placed in its path. In non-mathematical language, we can say that the probability of the photon passing through the half-silvered mirror and hitting the receiver is 50%, and the probability of it reflecting off the mirror is also 50%. But the mathematical representation of the states of the photon reveals a subtle but relevant problem: the probabilities of whether the photon hits the receiver or not spring up only after the photon encounters the mirror. Before it hits the mirror, nothing can be said about the route of the photon in terms of probabilities; the photon is said to be in a combination of the states “it will pass through the mirror” and “it will bounce off the mirror”. (Other situations can also be imagined, where more than just these two states exist.)
Such “combination states” are not just products of thought experiments; they have been demonstrated for real on countless occasions, the most intriguing being the double slit experiment, where a single photon appears to pass through two adjacent slits simultaneously!

The real fate is decided after the interaction of the photon with the mirror. What is “spooky” about this interaction is that only after this can we calculate the probable fates mathematically. The generally accepted interpretation of this scenario is that the mirror represents a part of the experimental apparatus and the interaction of the photon with the mirror is equivalent to a “measurement process” which causes the split-up states (fates) of the photon to abruptly collapse into a single state. Before “measurement” there is only “it will pass through the mirror” and “it will bounce off the mirror”. After measurement there is either “it will pass through the mirror” or “it will bounce off the mirror”. The interaction of the photon with the mirror resulting in collapse of split-states into a single state is termed “decoherence”.
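The half-silvered-mirror story can be written down in a few lines of linear algebra (a standard textbook sketch; the particular phase convention chosen for the beam splitter is a modelling assumption): before the mirror the photon is in one definite state, after the mirror it is an equal-weight combination of both routes, and the Born rule turns those amplitudes into the 50/50 probabilities only upon measurement.

```python
import numpy as np

# State basis: |transmit>, |reflect>. The photon starts headed at the mirror.
psi_in = np.array([1.0, 0.0], dtype=complex)

# 50/50 beam splitter as a unitary matrix (one common convention;
# the relative phase 'i' on reflection is a modelling choice).
bs = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                  [1j, 1]])

psi_out = bs @ psi_in          # superposition of both routes
probs = np.abs(psi_out) ** 2   # Born rule: probabilities appear only here

print(probs)                   # each outcome equally likely
```

Until the final line, the state is genuinely both routes at once; the probabilities are not a property of the photon but of the measurement.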

This view, called the Copenhagen Interpretation or, formerly, the Bohr-Heisenberg interpretation (after its most famous patrons Niels Bohr and Werner Heisenberg), is more a colloquial representation of a mathematical statement. Bohr himself suggested on many occasions that physical properties can be meaningfully ascribed to an object only in relation to some actual experimental result. He also held that the quantum scheme of wave equations is a mere symbolic representation, useful for making predictions, that doesn’t directly depict any aspect of reality whatsoever. Note that this is starkly different from the widely held misconception that the Copenhagen Interpretation demands a conscious observer to perform the act of measurement for the “mysterious” collapse of states to occur (the “collapse” part was actually a later addition by John von Neumann).

Penrose acknowledges that his approach is that of a realist – one who maintains that every physical theory worded in mathematics corresponds to some aspect of objective reality out there, however limited the accuracy of our experimental observations may be.
The standard interpretations of the quantum experiments do not say what exactly happens during decoherence, not even in theory. Penrose takes “decoherence” to be a real process in which relativistic gravity could come into action. He also believes that the experimental evidence of finding the same quantum particle (electron, photon etc.) at two different places reflects a real phenomenon – something which happens out there.

His approach can be summarized in a very simplified form as below:

General Relativity proposes that if a piece of matter exists in a region of spacetime for a sufficiently long duration, the geometry of that region of spacetime is altered accordingly and gravity is a result of this alteration of spacetime (recall the iron balls and rubber sheet analogy).

It follows therefore that if an electron or any quantum particle really exists simultaneously in two regions of spacetime, then each of the duplicates should possess mass and therefore alter the geometry of its regional spacetime, resulting in gravitational fields of their own.

But to keep one duplicate of the particle away from the gravitational influence of the other requires energy. The interacting gravitational fields destabilize the split-states causing them to “decay” (collapse) into one or the other classical alternative.

If the whole system is left totally undisturbed by other environmental factors, then the split-up (duplicated) state of the quantum particle will remain stable for a time period inversely proportional to the energy needed to prevent the gravitationally induced decay.
This decay process is conveniently named Objective Reduction (OR for short), meaning that the process is not an artifact of the ‘act of observation’ or the ‘experimental apparatus’. Thus gravity is nicely woven into the scheme of quantum experiences.
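Penrose attaches a concrete estimate to this decay time in his published Objective Reduction proposal: the lifetime τ of the superposition is roughly the reduced Planck constant divided by E_G, the gravitational self-energy of the difference between the two mass distributions:

```latex
\tau \approx \frac{\hbar}{E_G}
```

For an electron, E_G is minuscule and τ astronomically long, while for a macroscopic object τ is practically instantaneous – which matches the observation that chairs are never caught in two places at once.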

As we can see, to take the ‘wave-collapse’ concept in the literal sense of the word or as a symbolic representation of reality is a matter of choice, so long as we can generate mathematical expressions that predict experimental outcomes. And for a physical theory, that’s what matters the most.

The important thing to note, as Penrose pointed out in several later interviews and lectures, is that on most occasions mass movements in the environment itself result in the (gravitationally induced) collapse of states. The decay-time relationship with gravitational self-energy works perfectly only when the particle under observation is kept isolated from all environmental influences.
Thus, as per Penrose’s interpretation, every piece of matter, whether macroscopic or microscopic, can exist simultaneously in two or more split states, like the electron in the slit experiments. What causes matter to be seen as a single deterministic mass in our classical world is gravity.

Where do consciousness and mind come into all this? How do brain cells utilize spacetime to generate something as bewildering as ‘consciousness’?


More of that in part 3 of Quantum Consciousness: How physics changes the way we look at mind.