hubris as high mutational burden

Checkpoint inhibitors seem to work best against tumor types and cancers with lots of genetic mutations. Because it is unusual in the body, this heavy mutational load seems to be easier for the immune system to identify as not belonging to ‘self’. Lung cancers triggered by smoking are generally loaded with mutations, and smokers respond to the checkpoint-inhibition therapies better than those who have never smoked. One strategy is to use combination therapies — such as chemotherapy plus a checkpoint inhibitor — to trigger mutations that will make it easier for the immune system to recognize tumor cells.

The Quest to Extend the Reach of Checkpoint Inhibitors in Lung Cancer (Weintraub)

Stronger cancers are easier to defeat. (Who would have thought that smoking can increase the odds of survival?) Strategically speaking, this outrageously counter-intuitive conclusion is actually quite generalizable.

Making your enemy stronger makes sense in many different contexts. Once the ego inflates and hubris kicks in, your enemy inevitably starts making mistakes, just like a highly mutated cancer cell giving itself away to the immune system. The trick is to reach this state as quickly as possible so that you still have enough energy to act with fury when your enemy makes the fatal mistake. (Remember that you do not need to win every battle to become the final victor.)

Complex systems exhibit phase transitions. Making your enemy stronger can tilt the equilibrium, helping you initiate a favorable phase transition. For instance, as a young adult growing up, you need to rebel against your parents, and friendly parents make this maturation process harder. Similarly, as you dump plastic into it, nature needs to learn how to turn this waste into food, and eco-friendly policies make the adaptation process harder. Just as you cannot expect to grow up via trivial adversities, you cannot expect nature to come up with plastic-eating bacteria via occasional exposures.

PS: On a similar note, see the post Against Small Doses, which argues in favor of (low-frequency) high doses within the (positive) pleasure domain, whereas the current post is focused on the (negative) pain domain.

biology as computation

If the 20th century was the century of physics, the 21st century will be the century of biology. While combustion, electricity and nuclear power defined scientific advance in the last century, the new biology of genome research - which will provide the complete genetic blueprint of a species, including the human species - will define the next.

Craig Venter & Daniel Cohen - The Century of Biology

It took 15 years for technology to catch up with this audacious vision that was articulated in 2004. Investors who followed the pioneers got severely burned by the first hype cycle, just like those who got wiped out by the dot-com bubble.

But now the real cycle is kicking in. The cost of sequencing, storing and analyzing genomes has dropped dramatically. Nations are finally initiating population-wide genetics studies to jump-start their local genomic research programs. Regulatory bodies are embracing the new paradigm, changing their standards, approving new gene therapies, curating large public datasets and breaking data silos. Pharmaceutical companies and new biotech startups are flocking in droves to grab a piece of the action. Terminal patients are finding new hope in precision medicine. Consumers are getting accustomed to clinical genomic diagnostics. Popular culture is picking up as well. Our imagination is being rekindled. Skepticism from the first bust is wearing off as more and more success stories pile up.

There is something much deeper going on too. It is difficult to articulate, but let me give it a try.

Mathematics did a tremendous job at explaining physical phenomena. It did so well that all other academic disciplines are still burning with physics envy. As the dust settled and our understanding of physics became increasingly abstract, we realized something more, something that is downright crazy: Physics seems to be just mathematics and nothing else. (This merits further elaboration of course, but I will refrain from doing so.)

What about biology? Mathematics could not even scratch its surface. Computer science on the other hand proved to be wondrously useful, especially after our data storage and analytics capabilities passed a certain threshold.

Although currently a gigantic subject on its own, at its foundations computer science is nothing but constructive mathematics with space and time constraints. Note that one cannot even formulate a well-defined notion of complexity without such constraints. For physics, complexity is a bug, not a feature, but for biology it is the most fundamental feature. Hence it is no surprise that mathematics is so useless at explaining biological phenomena.

The fact that analogies between computer science and biology are piling up gives me the feeling that we will soon (within this century) realize that biology and computer science are really just the same subject.

This may sound outrageous today, but that is primarily because computer science is still such a young subject. Just like physics converged to mathematics over time, computer science will converge to biology. (The younger subject converges to the older one. That is why you should always pay attention when a master of the older subject has something to say about the younger, converging subject.)

The breakthrough moment will happen when computer scientists become capable of exploiting the physicality of information itself, just like biology does. After all, hardware is just frozen software, and information itself is something physical that can change shape and exhibit structural functionalities. Today we freeze because we do not have any other means of control. In the future, we will learn how to exert geometric control and thereby push evolution into a new phase that exhibits even more teleological tendencies.

A visualization of the AlexNet deep neural network by Graphcore

If physics is mathematics and biology is computer science, what is chemistry then?

Chemistry seems to be an ugly chimera. It can be thought of as the study of either complicated physical states or failed biological states. (Hat tip to Deniz Kural for the latter suggestion.) In other words, it is the collection of all the degenerate in-between phenomena. Perhaps this is the reason why it does not offer any deep insights, while physics and biology are philosophically so rich.

necessity of dying

Cancer is agelessness achieved at the cellular level. We want to defeat it in order to achieve agelessness at the bodily level.

How ironic.

What we do not see is that agelessness achieved at the bodily level will in turn destroy the agelessness we have achieved at the societal level, by destroying the most important circuit breaker of societal positive feedback loops. Without death, we will have power concentrations of catastrophic magnitude. Intergenerational transmission mechanisms will become pointless as the need to hand over anything to younger generations disappears. We will become like cancer cells, endangering the survival of our very society by refusing to die.

How tragic.

machine learning revolution

Take agriculture, for example. In agriculture you plant seeds, you fertilize them, you spray them, and then you wait six months and hope for the best. Once upon a time, people really understood the land, they understood the plants, and they had this intuitive feel for what would happen. Then farms got bigger and there were fewer people on them. We didn't have the ability to walk the land, and feel it, and close the loop. In the absence of information, we had to resort to monoculture. We closed the loop on an industrial scale. We went to predictability: If I use this Monsanto seed, spray it with this chemical, in this very well understood soil, I'm likely to get a pretty good outcome. In the absence of information, we went with predictability, simplicity, and a lot of chemicals.

- Closing the Loop (Chris Anderson)

We literally rationalised the shit out of the environment after we successfully formalised the inner workings of consciousness. (We basically got super excited once we found out how to exert scalable control over nature.)

Now we are living through another massive revolution. This time we have discovered the inner workings of the unconsciousness, and it is a lot better at dealing with the sort of complexities and non-linearities exhibited by ecosystems. Intuitive control has finally become scalable.

The worst is behind us. Our relationship with the environment will slowly recover. Machine learning will allow control structures to stay beautiful and organic even as they scale.


Artificial intelligence (AI) is a misnomer. Intelligence is a notion we associate with consciousness (which works rationally / analytically), not unconsciousness (which works intuitively / statistically).

This is not surprising, since the actual historical evolution of the idea of AI started off as an analytical project and ended up as a statistical one. (The analytical approach to automating human-level intelligence failed.)

This too is not surprising, since nature itself invented unconsciousness first. (In other words, the straightforward-looking consciousness has to be deeper and more complex than unconsciousness.)

Side Note: Here, by unconsciousness, I am referring to the intuitional part of the reasoning mind, not to the lower-level, older brain parts that are responsible for things like the automation of bodily functions. Hence the remarks are not in contradiction with Moravec’s Paradox.

Notice that there is a strange alternation going on here. Human reason was built on top of human intuition. Now machine intuition is being built on top of (advanced) human reason. In the future, machine reason will be built on top of machine intuition.


The reason why our unconscious mind copes better with complexity has to do with its greater capacity to deal with many variables simultaneously. The rational mind, on the other hand, extracts heuristics and stories out of patterns. This process results in a drastic reduction in the number of variables, which in turn results in a loss of accuracy.

Take, for example, the problem of predicting the distribution of trees in a forest using only map data, such as elevation, slope, sunlight, and shade. Jock Blackard and Denis Dean did the original calculations in 1999, and left behind an enormous public database for other mathematicians to use. According to Vapnik, when computers were trained using 15,000 examples, their predictions were 85 to 87 percent accurate. Pretty good. But when they were fed more than 500,000 examples, they developed more complex rules for tree distribution and boosted the accuracy of their predictions to more than 98 percent.

“This means that a good decision rule is not a simple one, it cannot be described by a very few parameters,” Vapnik said. In fact, he argues that using many weak predictors will always be more accurate than using a few strong ones.

- Teaching Me Softly (Alan S. Brown)
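
The Covertype data from that study is publicly available, so the effect is easy to reproduce in spirit. Here is a minimal sketch assuming scikit-learn's copy of the dataset; a random forest stands in for the models discussed in the quote, so the exact accuracy numbers will differ, but the accuracy-versus-training-size trend is the point:

```python
# Sketch of the forest-cover experiment described above, using the public
# Covertype dataset (Blackard & Dean) bundled with scikit-learn.
# Assumption: a random forest stands in for the original models; the point
# is only that accuracy keeps climbing as the training set grows.
from sklearn.datasets import fetch_covtype
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = fetch_covtype(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=50_000, random_state=0)

for n in (15_000, 500_000):
    n = min(n, len(X_train))
    model = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
    model.fit(X_train[:n], y_train[:n])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:>7,} examples -> test accuracy {acc:.3f}")
```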


What distinguishes excellent service (and maintenance) from very good service (and maintenance) is anticipatory power. Machine learning techniques are increasing our predictive capacity. We will soon be able to fix problems before they even occur. All problems, including those related to customer unhappiness and mechanical breakdowns, exhibit early signs which can be picked up by machine learning frameworks.
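
As a toy illustration of that anticipatory idea (entirely synthetic data, and IsolationForest is just one of many reasonable detectors), an anomaly detector trained only on healthy sensor readings starts flagging the slow drift that precedes an outright breakdown:

```python
# Toy anticipatory-maintenance sketch: fit an anomaly detector on "healthy"
# vibration readings, then watch it flag the slow drift before a failure.
# All data is synthetic and the model choice is just one reasonable option.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
healthy = rng.normal(loc=1.0, scale=0.05, size=(1000, 1))      # normal operation
drift = (1.0 + 0.002 * np.arange(200).reshape(-1, 1)
         + rng.normal(scale=0.05, size=(200, 1)))               # developing fault

detector = IsolationForest(random_state=0).fit(healthy)
labels = detector.predict(drift)                                # -1 marks an anomaly
print("anomalies in first half of drift window :", int((labels[:100] == -1).sum()))
print("anomalies in second half of drift window:", int((labels[100:] == -1).sum()))
```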

Life is about to become significantly happier and safer for our children.

fundamentality as nonlinearity

In mathematics, the fundamental things are obvious. They are the axioms and the definitions. You play with them and the entire edifice changes. A single additional condition in a definition can cause a chain reaction, resulting in a tremendous number of revisions in the proofs that depend on it.

What is fundamental in product design is not that obvious. Features like Facebook's feed and Tinder's swiping unleashed immense creative activity, resulting in thousands of new analogous startups. Sometimes small UX changes, like Snapchat's ephemerality, can cause drastic changes in behaviour.

In essence, what is fundamental can only be recognised when you nudge it. In other words, fundamentality is a perturbative notion: the greater the nonlinearity, the greater the fundamentality.

This interpretation works even in areas outside of mathematics, where there is no observable derivational depth. Large nonlinearities may be manifestations of aggregations of many small nonlinearities (as in mathematics and physics) or of single "atomic" instances (as in the social sciences, where the human mind can short-circuit the observable causality diagrams).

hopeless climate

For all misanthropes out there, here are some reasons to be hopeless about climate change initiatives. (This is a very belated post that has been sitting in my drafts.)

- It may be too late to take action. The tipping point may be long gone.

- A rescue plan in the form of a drastic shift in global energy policy may be impossible to execute. Monitoring may be hard. Nations may pretend to commit but in reality defect. For economic reasons, developing nations may choose to go through their own industrialisation periods by burning cheap coal. They may justify this behaviour by pointing out that the carbon dioxide build-up is essentially due to developed nations' accumulated past carbon emissions.

- In every drastic change there are winners and losers. Perhaps climate change will turn Europe into a desert but Central Asia into fertile land. This unevenness makes it even harder to cooperate.

- Major policy shifts may entail harsh economic consequences today, and the losses experienced by the current generation may be greater than the discounted value of the gains experienced by future ones, as the toy calculation below illustrates.
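
A toy discounting calculation (all numbers invented for illustration) shows how even a large future gain can be outweighed by a smaller present loss once it is discounted back to today:

```python
# Toy present-value comparison: a loss of 1.0 (in arbitrary units) today
# versus a gain of 3.0 realized 50 years from now, discounted at 3% per year.
# All figures are made up purely for illustration.
discount_rate = 0.03
loss_today = 1.0
gain_future = 3.0
years = 50

present_value_of_gain = gain_future / (1 + discount_rate) ** years
print(round(present_value_of_gain, 2))   # ~0.68, i.e. less than the loss of 1.0
```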

fractal turbulences

You never feel the whirls that are too large or too small. Turbulence affects you at your own scale.

What a beautiful fact, a rich reservoir of metaphors!

- Every level in a social hierarchy has its own dilemmas and conflicts. You should not hope to reach a plateau of happiness by climbing the social ladder.

- Every scale in physics has its own dynamics. For instance, the dynamics here on Earth are almost completely independent from those of the Milky Way.

- Periods of stillness can be deceptive. At any time, the imperceptibly small changes boiling underneath may combine to form the next chaotic period that will swallow you up like a rogue wave.

- The source of your unhappiness may be the turbulence you feel at a level that you do not really belong to. Try adjusting your conditions and expectations down or up.

- Millions of trillions of microbes live on Earth, minding their own business. They are completely oblivious to our aspirations, sufferings, breakthroughs, disappointments etc. We live on completely different levels, yet we are part of the same ecosystem.

types of instabilities

Misanthropes thrive on physical instabilities. Catastrophic events of all sorts are welcome. If there are no such events happening now, then the misanthropes will forecast one to happen sometime soon.

Intellectuals thrive on political instabilities. Popular uprisings, wars and massacres are all welcome. If such events no longer occurred, then intellectuals would switch careers and become historians.

Traders thrive on economic instabilities. Any event that introduces volatility into the markets is welcome. Even if there is no such event happening now, sooner or later the susceptibility of traders to herd behavior will create one spontaneously.

subjective randomness

People are extremely good at finding structure embedded in noise. This sensitivity to patterns and regularities is at the heart of many of the inductive leaps characteristic of human cognition, such as identifying the words in a stream of sounds, or discovering the presence of a common cause underlying a set of events. These acts of everyday induction are quite different from the kind of inferences normally considered in machine learning and statistics: human cognition usually involves reaching strong conclusions on the basis of limited data, while many statistical analyses focus on the asymptotics of large samples. The ability to detect structure embedded in noise has a paradoxical character: while it is an excellent example of the kind of inference at which people excel but machines fail, it also seems to be the source of errors in tasks at which machines regularly succeed. For example, a common demonstration conducted in introductory psychology classes involves presenting students with two binary sequences of the same length, such as HHTHTHTT and HHHHHHHH, and asking them to judge which one seems more random. When students select the former, they are told that their judgments are irrational: the two sequences are equally random, since they have the same probability of being produced by a fair coin. In the real world, the sense that some random sequences seem more structured than others can lead people to a variety of erroneous inferences, whether in a casino or thinking about patterns of births and deaths in a hospital.

- Griffiths & Tenenbaum - From Algorithmic to Subjective Randomness

We perceive the more orderly pattern HHHHHHHH to be a less likely outcome of the coin-tossing experiment, while in reality it is as likely as the other pattern HHTHTHTT.

Why do we expect all random uniform processes (i.e. experiments with uniform probability distributions) to generate visually disordered outcomes? Recall that the second law of thermodynamics dictates that the entropy of an isolated system tends to increase over time. In other words, an isolated system constantly evolves towards its more-likely-to-happen states. Since such states are often the more-disorderly-looking ones, it is not surprising that we developed this expectation. Most of what is perceived to be random (i.e. entropy-driven) in nature does in fact result in visual disorder.
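
A quick way to see why the disorderly-looking outcomes dominate is to count them. For 8 fair coin tosses (a toy calculation of my own, not from the paper), every individual sequence is equally likely, but the number of sequences per head-count is wildly uneven:

```python
from math import comb

# Each specific 8-toss sequence has probability (1/2)**8, but the number of
# sequences in each "macrostate" (total number of heads) varies enormously:
for heads in range(9):
    print(f"{heads} heads: {comb(8, heads):>2} sequences")
# 4 heads: 70 sequences vs. 8 heads: 1 sequence. The mixed-looking macrostates
# are overwhelmingly more likely even though no single sequence is special.
```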

The paper (from which I extracted the above quotation) suggests that our subjective interpretation of randomness is more in line with what is called algorithmic complexity (i.e. greater complexity is equated with greater randomness). This observation is not surprising either. Why? Because the more-disorderly-looking patterns tend to have higher algorithmic complexity. (I emphasize "tend" because a pattern may be algorithmically simple but nevertheless look visually disordered.)
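
Kolmogorov complexity is uncomputable, but compressed length is a crude, commonly used proxy for it. The 8-toss strings are too short for a compressor to say anything, so this toy sketch stretches the two kinds of sequence to 800 tosses:

```python
import random
import zlib

# Both specific 800-toss sequences below have probability (1/2)**800 under a
# fair coin, yet one admits a far shorter description. Compressed length is
# only a rough stand-in for algorithmic complexity, but the gap is telling.
random.seed(0)
orderly = "H" * 800                                          # "all heads"
typical = "".join(random.choice("HT") for _ in range(800))   # a disorderly draw

print("orderly sequence compresses to", len(zlib.compress(orderly.encode())), "bytes")
print("typical sequence compresses to", len(zlib.compress(typical.encode())), "bytes")
```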

There is a small caveat though. In some rare cases, the more-likely-to-happen states that an isolated system evolves towards may not look disorderly at all. In fact, the final equilibrium state may have a lot of visual structure. Here is a nice example:

Individual particles such as atoms often arrange into a crystal because their mutual attraction lowers their total energy. In contrast, entropy usually favors a disordered arrangement, like that of molecules in a liquid. But researchers long ago found, in simulations and experiments, that spheres without any attraction also crystallize when they are packed densely enough. This entropy-driven crystallization occurs because the crystal leaves each sphere with some space to rattle around. In contrast, a random arrangement becomes "jammed" into a rigid network of spheres in contact with their neighbors. The entropy of the few "rattlers" that are still free to move can't make up for the rigidity of the rest of the spheres.

Read this for further details.

structural information inside DNA

I had always thought that structural symmetry was strictly a product of evolution, owing to its phenotypical advantages. Most animals, for example, have bilateral symmetry. Plants, on the other hand, exhibit other types of symmetry. In nature one rarely encounters structures that are devoid of such geometrical patterns.

While reading an article on algorithmic complexity, it immediately dawned upon me that there may be another important reason why symmetry is so prevalent.

First, here is a short description of algorithmic complexity:

Given an entity (this could be a data set or an image, but the idea can be extended to material objects and also to life forms) the algorithmic complexity is defined as the length (in bits of information) of the shortest program (computer model) which can describe the entity. According to this definition a simple periodic object (a sine function for example) is not complex, since we can store a sample of the period and write a program which repeatedly outputs it, thereby reconstructing the original data set with a very small program.
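
To make that definition concrete with a toy example of my own (the numbers are invented): a long periodic signal can be regenerated from one period plus a repeat count, which stands in for the "very small program" in the quote.

```python
# A long periodic "entity" regenerated from a tiny description:
# one period plus a repeat count, a crude stand-in for the shortest program.
period = [0, 3, 7, 3, 0, -3, -7, -3]   # one sampled period of a wave
signal = period * 1000                 # the full entity: 8,000 values

print("values stored explicitly  :", len(signal))
print("values in the description :", len(period), "plus a repeat count")
```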

Geometrical patterns allow economization. The presence of symmetries can drastically reduce the amount of information that needs to be encoded in DNA to orchestrate the biochemical processes responsible for the structural development of the organism. The same may be true for more complicated morphological shapes that are nevertheless mathematically simple to describe. An example:

Researchers discovered a simple set of three equations that graphed a fern. This started a new idea - perhaps DNA encodes not exactly where the leaves grow, but a formula that controls their distribution. DNA, even though it holds an amazing amount of data, could not hold all of the data necessary to determine where every cell of the human body goes. However, by using fractal formulas to control how the blood vessels branch out and the nerve fibers get created, DNA has more than enough information.
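
The fern in question is presumably of the kind popularized by Barnsley, where a handful of affine maps (four in the classic version, rather than three) plus their selection probabilities stand in for an explicit list of points. A minimal sketch of that idea, using the standard published coefficients:

```python
import random

# Barnsley-style fern: the whole shape is encoded by four affine maps and
# their selection probabilities, roughly thirty numbers instead of an
# explicit list of tens of thousands of points.
# Each map sends (x, y) to (a*x + b*y + e, c*x + d*y + f), picked with probability p.
MAPS = [
    #   a      b      c     d     e     f     p
    ( 0.00,  0.00,  0.00, 0.16, 0.00, 0.00, 0.01),
    ( 0.85,  0.04, -0.04, 0.85, 0.00, 1.60, 0.85),
    ( 0.20, -0.26,  0.23, 0.22, 0.00, 1.60, 0.07),
    (-0.15,  0.28,  0.26, 0.24, 0.00, 0.44, 0.07),
]

def fern_points(n=50_000):
    x, y = 0.0, 0.0
    points = []
    for _ in range(n):
        r, acc = random.random(), 0.0
        for a, b, c, d, e, f, p in MAPS:
            acc += p
            if r <= acc:
                break
        x, y = a * x + b * y + e, c * x + d * y + f   # last map is the fallback
        points.append((x, y))
    return points

# Coarse text rendering, just to show that the "formula" really draws a fern.
width, height = 60, 30
grid = [[" "] * width for _ in range(height)]
for x, y in fern_points():
    col = int((x + 2.7) / 5.4 * (width - 1))   # x roughly spans [-2.7, 2.7]
    row = int(y / 10.0 * (height - 1))         # y roughly spans [0, 10]
    if 0 <= col < width and 0 <= row < height:
        grid[height - 1 - row][col] = "*"
print("\n".join("".join(line) for line in grid))
```

The design point mirrors the quote: the DNA-like description here is the short table of coefficients, not the cloud of points it unfolds into.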