Cartesian science posits that data is everything. But history is full of examples where intuition created major knowledge breakthroughs.
There are intricate and complex connections, and it is not the case that a single linear, monolithic approach is the only way to scientific progress.
Stanford University’s professor of biology and neuroscience and MacArthur “Genius” Fellow Robert Sapolsky has written a fascinating new book on human behaviour, Behave: The Biology of Humans at Our Best and Worst.
Sapolsky is justly famous for his research in primate biology, and in this book, he distils the wisdom of his years of research into the human condition.
Surprisingly, what Sapolsky has come up with is a notion of the connectedness of things. There has long been a debate about whether it is nature or nurture that determines our behaviour, and Sapolsky’s answer would be disappointing to those who expect a clear-cut answer, one way or the other. Because the answer, as we in the East could have told Cartesian Westerners, is, “It’s complicated, and it’s neither alone, and it’s both.” He also cautions us that almost all the factors he considers are pliable and plastic, much as epigenetics can modify the expression of your own genes.
In a recent conversation on the KQED Forum, Sapolsky said: “Often due to ideology, there’s a pull towards saying ‘Aha, here’s the area of the brain that explains everything, the gene, the hormone, the evolutionary mechanism, the childhood trauma, the special pair of socks you were wearing that morning, that explains everything.’ Historically, there’s been a pull towards trying to make sense of the biology of the human being saying ‘Aha, it’s all about this brain region,’ but what you see instead is not only do you have to incorporate all these perspectives to make sense of what just happened, you have to incorporate the neurobiology of what happened one second before, and the endocrinology of what happened the day before, and adolescence, and the epigenetics of fetal life, and genetics and culture.”
Genes and evolution are intertwined. Brain function and childhood are intertwined. Neurons and behaviour are intertwined. The culture of your pastoral ancestors from thousands of years ago and the level of stress hormones in your blood are intertwined. In a sense, the more we learn, the more humble we should become because it is so evident that we know so little. Biology, in particular, may be far too complicated for the neat little theories that Western science is so fond of.
I was reminded of a talk I heard from a Nobel-prize-winning researcher who is totally enamoured of “telomeres” – compound structures at the ends of chromosomes – and their supposedly singular role in signalling the effects of aging and cancer. No, there are probably a hundred such indicators, not one.
But then, physics, the quintessential Cartesian science, has already warned us: there is the Heisenberg Uncertainty Principle, which tells us that there are fundamental limits to knowledge. It might be some kind of celestial joke from nature: it is impossible for us to know everything, period. Quantum effects impose probability rather than certainty.
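For reference, the usual mathematical statement of the principle, with Δx and Δp the uncertainties in a particle’s position and momentum and ħ the reduced Planck constant:

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

No refinement of instruments gets around that lower bound; the uncertainty is built into nature itself.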
The connectedness that Sapolsky talks about suggests that Western Cartesian “science” is myopic. As with American political scientist Francis Fukuyama’s much-criticised phrase “the end of history”, we have not actually reached the end of science and discovery, however persuasive the dogma of the day may seem.
One of the dogmas du jour is that data is everything, and that unless you can quantify it, well, it doesn’t exist. There is truth to the idea that without data and quantification, you open things up to subtle biases based on your preconceptions. But it is also the case that a lot of the data we are creating is, in a sense, “junk data”, because it is not converted into information, much less knowledge or insight.
The volume of junk data will go up manifold because all those Internet of Things devices will be spewing out terabytes per second. We have already seen how the signal-to-noise ratio of the Internet has gone down dramatically, with highly believable “fake news” now being peddled with deadly effect. That subtle concept – discrimination, or judgement – is coming into the picture more and more.
(Incidentally, in an era where machine learning is threatening more and more jobs, its Achilles heel may well be its inability to explain how it made a particular decision. And thus human judgement – along with creativity – may command a high premium.)
As an example, I just read an article on Medscape, “Derailing the ‘Inevitable’ Onset of Diabetes”, which mentions that the best predictor of colon cancer in the state of Delaware is a failure to finish high school. A simple-minded reading of that correlation would suggest that the best way to reduce colon cancer is to force everyone to complete high school, and lo! there would be no more colon cancer.
Of course, a more insightful approach would be to analyse what makes people not complete school, and the consequences thereof: do they come from lower-income families? Do they go into low-paying jobs? Are they more liable to eat sugary but cheap processed food and lots of meat, and to smoke? Do they become stressed out due to higher unemployment? And so on. This kind of analysis is what converts data into information, as the sketch below illustrates.
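Here is a toy simulation of that confounding, with the caveat that the variable names and numbers are invented purely for illustration and have nothing to do with the actual Delaware data: a hidden factor such as income drives both school completion and cancer risk, so the two end up correlated even though neither causes the other.

```python
# Toy illustration of a confounder producing a spurious correlation.
# All names and coefficients are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

income = rng.normal(size=n)                          # hidden confounder
finished_school = (income + rng.normal(size=n)) > 0  # more likely with higher income
cancer_risk = -0.5 * income + rng.normal(size=n)     # risk falls as income rises

# Education and cancer risk end up correlated, though neither causes the other.
r = np.corrcoef(finished_school.astype(float), cancer_risk)[0, 1]
print(f"Correlation between finishing school and cancer risk: {r:.2f}")  # clearly negative
```

In this toy model, controlling for income makes the apparent education–cancer link vanish, which is precisely the step that turns a raw correlation into usable information.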
There is another danger with data crunching: the inherent belief most of us have in the pleasingly symmetric Bell Curve, or normal distribution. (This is in addition to the widespread confusion between correlation and causation.) If you assume that the phenomenon you are studying has a normal distribution, but it actually has a fat-tailed distribution, you will seriously underestimate the risks associated with unforeseen events: hence the entire Black Swan scenario from Nassim Nicholas Taleb. You may assume something to be a very rare event (a five-sigma outlier, say) and therefore ignore it, but it turns out not to be so rare after all; thus, disaster.
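A minimal sketch of that gap, assuming a Student’s t distribution with three degrees of freedom as a stand-in for the fat-tailed world, rescaled to unit variance so the comparison with the normal is fair:

```python
# Compare the chance of a "five-sigma" event under a normal distribution
# versus a fat-tailed one (Student's t with 3 degrees of freedom).
import numpy as np
from scipy import stats

sigma_level = 5
df = 3

# Tail probability beyond 5 standard deviations under the normal assumption
p_normal = stats.norm.sf(sigma_level)

# Same threshold under the t distribution, rescaled to unit variance
# (the variance of a t distribution with df degrees of freedom is df / (df - 2))
p_fat = stats.t.sf(sigma_level * np.sqrt(df / (df - 2)), df)

print(f"Normal tail:     P(X > 5 sigma) ≈ {p_normal:.1e}")  # about 3e-07
print(f"Fat-tailed tail: P(X > 5 sigma) ≈ {p_fat:.1e}")     # about 2e-03
print(f"Ratio: roughly {p_fat / p_normal:,.0f} times more likely")
```

Under the normal assumption a five-sigma event is roughly a one-in-3.5-million occurrence; under the fat-tailed alternative it is thousands of times more likely, which is exactly the gap in which Taleb’s Black Swans live.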
There is also a tendency to become so enamoured of one’s pet theory that you would rather twist yourself into a pretzel to explain new information than discard the theory. An excellent example is the complex system of epicycles that Ptolemaic astronomers and their medieval Church successors built to explain the retrograde motion of the planets, given their theory that everything revolved around the earth. When Galileo Galilei championed the simpler, more elegant heliocentric model, the Church hauled him before the Inquisition rather than abandon its theory.
On 7 June, there was a news item in the scientific journal Nature: “Oldest Homo Sapiens Fossil Claim Rewrites Our Species’ History.” The discovery of human remains in Morocco dating back some 315,000 years pushes back our species’ origins by 100,000 years. This also means that pet theories about a single, East African origin of our species – and by extension the idea that a mitochondrial Eve exists and was the equivalent of Eve from the biblical Genesis story – now need to be rewritten. Similarly, we need to be prepared for the eventuality that a significant chunk of what we currently believe to be “scientific truth” may well be flawed and will need to be discarded.
Interestingly, this also throws another light on a hypothesis: that the so-called vanaras in the Ramayana were not actually apes but a hominid species such as the Neanderthals. It is now clear that Homo sapiens and Neanderthals (and Denisovans too) overlapped for tens of thousands of years and interbred, until somehow the more imaginative Homo sapiens prevailed and survived.
One final point about epistemology – the theory of knowledge, especially with regard to its methods, validity and scope, and the distinction between justified belief and opinion. There is a certain hallowed idea about the “scientific method” and about philosopher of science Karl Popper’s notion of “falsifiability”. Yes, in the normal course of events, one comes up with a (falsifiable) hypothesis, designs experiments to test it, and then refines or discards it as additional data comes in. But how do you come up with the hypothesis in the first place? That’s where human creativity and intuition play a big role.
The fact is that there are many things that may well be true in the natural world even though we have not been able to prove them. For practical purposes, the fact that they work in general may be quite sufficient. For instance, Fields Medal winner Manjul Bhargava once talked about the so-called Pythagoras theorem. Who gets the credit for it?
Well, the Egyptians had a good inkling of it, Indians provided the exact statement, and the Chinese actually proved it. But the theorem was useful even in the absence of a formal proof. Exactly the same holds true for almost 100 per cent of the software that we use every day. Almost none of it is formally proved, partly because that would be an immense exercise: each line of code can take around 10 pages of mathematics to prove formally. But you know that your SAP enterprise software or Windows OS or Android app is (generally speaking) not going to blow up on you, and that is good enough for practical purposes.
And there is also the extraordinary story of Srinivasa Ramanujan. He arrived at his astonishing results through intuition (his best explanation was that the Goddess of Namakkal spoke to him in his sleep), and many of his staggeringly non-obvious but beautiful equations were laboriously proven true only many years later. Similarly, we have heard of many other flashes of intuition (for instance, the structure of the benzene ring that came to German chemist August Kekule in a dream, one of the key discoveries that laid the foundations of organic chemistry).
Of course, there have been many instances of people dreaming up entirely absurd things, or hearing voices in their heads, or believing gods spoke to them, all of which was probably delusion or psychosis. I am not suggesting that anything anybody dreams up is inspired. All I am saying is that, as Sapolsky suggests, there are intricate and complex connections, and it is not the case that a single linear, monolithic approach is the only way to scientific progress. Knowledge, being complex, progresses by leaps and bounds, chaotically; and as science fiction author Arthur C Clarke memorably said, “Any sufficiently advanced technology is indistinguishable from magic.”
A sufficiently advanced bit of knowledge, intuited, may well be indistinguishable from magic, too. Indeed, in an era where artificial intelligence may take over many of the mundane tasks in our lives, there could be a tendency towards a society in which a very few creative people dominate a mass of “useless” people, as per Israeli historian Yuval Noah Harari’s dystopian vision in Homo Deus: A Brief History of Tomorrow.
What, then, is the average person to do? I suggest that they cultivate the arts, and their creativity. In teaching Design Thinking, I have come to realise that everyone is creative in their own way, but we self-censor our creativity.
In a world where quotidian knowledge is handled by machines, we should concentrate on the things that machines cannot do as well as humans can: judgement, creativity, discrimination, beauty, ethics. That may be the key to survival.