waves of decentralization

Evolutionary dynamics always start off well-defined and centralized, but over time (without any exception) mature and decentralize. Our own history is full of beautiful examples of this fact. In historical order, we went through the following decentralization waves:

  • Science decentralized Truth away from the hegemony of the Church.

  • Democracy decentralized Power.

  • Capitalism decentralized Wealth.

  • Social Media decentralized Fame away from the media barons.

Today, if you are not powerful, wealthy or famous, there is no one to blame but yourself. If you do not know the truth, there is no one to blame but yourself. Everything is accessible, at least in theory. This of course inflicts an immense amount of stress on the modern citizen. In a sense, life was a lot easier when there was not so much decentralization.

Note how important the social media revolution really was. Most people do not recognize the magnitude of change that has taken place in such a short period of time. In terms of structural importance, it is on the same scale as the emergence of democracy. We no longer distinguish a sophisticated judgment from an unsophisticated one. Along with “Every Vote Counts”, now we also have “Every Like Counts”.

Of course, the social media wave was built on another, even more fundamental decentralization wave, which is the internet itself. With the rise of the internet, communication became completely decentralized. Today, in a similar fashion, we are witnessing the emergence of blockchain technology, which is trying to decentralize trust by creating neutral trust nodes with no centralized authority behind them. For instance, you no longer need to be a central bank with a stamp of approval from the government to launch a currency. (Both the internet and blockchain undermine political authority and in particular render national boundaries increasingly irrelevant.)

Internet itself is an example of a design, where robustness to communication problems was a primary consideration (for those who don't remember, Arpanet was designed by DARPA to be a communication network resistant to nuclear attack). In that sense the Internet is extremely robust. But today we are being introduced to many other instances of that technology, many of which do not follow the decentralized principles that guided the early Internet, but are rather highly concentrated and centralized. Centralized solutions are almost by definition fragile, since they depend on the health of a single concentrated entity. No matter how well protected such central entity is, there are always ways for it to be hacked or destroyed.

Filip Piekniewski - Optimality, Technology and Fragility

As pointed out by Filip, evolution favors progression from centralization to decentralization because it functionally corresponds to a progression from fragility to robustness.

Also, notice that all of these decentralization waves initially overshoot due to the excitement caused by their novelty. That is why they are always criticized at first, for good reasons. Eventually they all shed their lawlessness, structurally stabilize, go completely mainstream and institutionalize themselves.

science vs technology

  • Science (as a form of understanding) gets better as it zooms out. Technology (as a form of service) gets better as it zooms in. Science progresses through unifications and technology progresses through diversifications.

  • Both science and technology progress like a jellyfish moves through the water, via alternating movements of contractions (i.e. unifications) and relaxations (i.e. diversifications). So neither science nor technology can be pictured as a simple linear trend of unification or diversification. Technology goes through waves of standardization for the sake of achieving efficiency and de-standardization for the sake of achieving a better fit. Progress happens because each new wave of de-standardization (magically) achieves a better fit than the previous wave, thanks to an intermittent period of standardization. The opposite happens in science, where each new wave of unification (magically) reaches a higher level of accuracy than the previous wave, thanks to an intermittent period of diversification.

  • Unification is easier to achieve in a single mind. Diversification is easier to achieve among many minds. That is why the scientific world is permeated by the lone genius culture and the technology world is permeated by the tribal team-work culture. Scientists love their offices, technologists love their hubs.

“New scientific ideas never spring from a communal body, however organised, but rather from the head of an individually inspired researcher who struggles with his problems in lonely thought and unites all his thought on one single point which is his whole world for the moment.”
- Max Planck

  • Being the originator of widely adopted scientific knowledge makes the originator powerful, while being the owner of privately kept technological knowledge makes the owner powerful. Hence, the best specimens of unifications quickly get diffused out of the confined boundaries of a single mind, and the best specimens of diversifications quickly get confined from the diffused atmosphere of many minds.

  • Unifiers, standardizers tend to be more masculine types who do not mind being alone. Diversifiers, de-standardizers tend to be more feminine types who can not bear being alone. That is why successful technology leaders are more feminine than the average successful leader in the business world, and successful scientific leaders are more masculine than the average successful leader in the academic world. Generally speaking, masculine types suffer more discrimination in the technology world and feminine types suffer more discrimination in the scientific world.

  • Although unifiers play a more important role in science, we usually give the most prestigious awards to the diversifiers who deployed the new tools invented by the unifiers on tangible, famous problems. Although diversifiers play a more important role in technology, we usually remember and acknowledge only the unifiers who crystallized the vast efforts of diversifiers into tangible, popular formats.

  • Technological challenges lie in efficient specializations. Scientific challenges lie in efficient generalizations. You need to learn vertically and increase your depth to come up with better specializations. This involves learning-to-learn-new, meaning that what you will learn next will be built on what you learned before. You need to learn horizontally and increase your range to come up with better generalizations. This involves learning-to-relearn-old, meaning that what you learned before will be recast in the light of what you will learn next.

  • Technology and design are forms of service. Science and art are forms of understanding. That is why the intersection of technology and art, as well as the intersection of science and design, is full of short-lived garbage. While all our “external” problems can be traced back to a missing tool (technological artifact) or a wrong design, all our “internal” problems can be traced back to a missing truth (scientific fact) or wrong aesthetics (i.e. wrong ways of looking at the world).

  • Scientific progress contracts the creative space of religion by outright disproval of certain ideas and increases the expressive power of religion by supplying it with new vocabularies. (Note that the metaphysical part of religion can be conceived as “ontology design”.) Technological progress contracts the creative space of art by outright trivialization of certain formats and increases the expressive power of art by supplying it with new tools. (Think of the invention of photography rendering realistic painting meaningless and the invention of synthesizers leading to new types of music.) In other words, science and technology aid respectively religion and art to discover their inner cores by both limiting the domain of exploration and increasing the efficacy of exploration. (Notice that artists and theologians are on the same side of the equation. We often forget this, but as Joseph Campbell reminds us, contemporary art plays an important role in updating our mythologies, and keeping the mysteries alive.)

  • Scientific progress replaces mysteries with more profound mysteries. Technological progress replaces problems with more complex problems.

  • Both science and technology progress through hype cycles, science through how many phenomena the brand new idea can explain, technology through how many problems the brand new tool can solve.

  • Scientific progress slows down when money is thrown at ideas rather than people. Technological progress slows down when money is thrown at people rather than ideas.

  • Science progresses much faster during peacetime, technology progresses much faster during wartime. Scientific breakthroughs often precede new wars, technological breakthroughs often end ongoing wars.

dire need for social reform

Look at the history of all mass social traumas. (Rise and fall of feudalism etc.) You will see that they are all preceded by transformative technological and economic disruptions and followed by transformative social and spiritual reforms.

We are going through a similar trauma at the moment. These structural changes can be hard to see while you are inside them, since they manifest themselves in a myriad of details. However, when you go back to evaluate what happened, the picture is always crystal clear. (This evaluation can not be conducted right after the dust settles. You literally need some distance to see what really happened.)

Today we have entered a new phase in the development of the next layer of complexity within the grand narrative of life. (To understand what I mean, read the Emergence of Life post.) This new technological wave is slowly unfolding, but it is probably on par with the industrial revolution, perhaps even a couple of orders of magnitude more powerful. Long story short, our centralized digital brain has finally emerged (i.e. the multi-cloud layer linking up all cloud-based computation and storage resources). This development has already started to have massive effects on our psyches via the infiltration of social media and the penetration of artificial intelligence into our everyday lives. Artists and writers have felt the zeitgeist and are responding to it by writing books and shooting movies to raise social awareness about the possible consequences of the new technologies.

Clearly, the emergence of the next life forms is a vastly complicated, non-linear process. Nature is giving birth to something new through us and naturally we are the ones who are most affected by this traumatic unfolding.

Today, society as we know it is literally falling apart:

  • Friendship has evolved into an unrecognizable form.

  • Our lives have become so complex (a natural side effect of the emergence) and we expect so much from our life partners that the notion of marriage has morphed into an all-or-nothing form. Divorce rates are skyrocketing, and the whole institution is crumbling under immense weight.

  • Our schools are extremely out-of-date and nobody seems to have the balls, persistence and the vision to reform them. (Hint: Handing out more screens will not solve the problem.) We are not preparing our kids for the challenges they will be facing when they grow up. In fact, we are not even preparing them for today’s challenges. The situation is so ridiculous that I sincerely believe that we would be better off by turning the entire thing off.

  • Our economic and social safety nets are insufficient to cope with the oncoming technological wave. People are feeling left behind and depressed, especially since our current macro structures are implicitly asking them to derive the meaning of life from their jobs. (Hint: Handing out more money will not solve the problem.) Only after the epic rise of China (with its top-down, long-term-thinking, centralized, globally-optimized decision making mechanisms) have the business elites in the United States recognized that they actually have social obligations beyond maximizing shareholder value.

And the list goes on…

We need to speed up, otherwise our social reforms will not be able to catch up with the increasing speed and magnitude of technological changes. Make no mistake, technology will not slow down for us. The emergence of the next level of life forms is an unstoppable process. If this process collapses, we, as humanity, will collapse along with it. In other words, if we can not give rise to these new life forms, evolution will promptly get rid of us and try again. (Human-level minds will re-emerge somehow, somewhen, somewhere.)

So what are we doing now? Are we reforming?

No.

What type of leaders do you need for preaching social progress and propagating social reform? You need liberal leaders. What have our liberal leaders done? They fucked up badly, really really badly. Now conservatism is coming back with full force everywhere. People are fleeing back to safety, falling back onto old notions, closing down on themselves, against each other and towards new ideas. And they have every right to do so, because they feel betrayed. They can not pinpoint exactly what went wrong, but they feel that the elites have not done their jobs. And they are absolutely right. The elites chose to mind their own business and think of their own pockets. Most still feel no sense of duty towards society. If they felt any, we would not be in this shit situation today, regressing back in time while technology is marching ahead with no stop in sight.

It will probably take another 20 years before society gives another chance to liberal progressives and opens up to new social reforms. Again, make no mistake, liberals have done this to themselves. They can not cry their way out of it. They need to change. In a world where a substantial majority of the graduates of the most revered university (Harvard College) choose to pursue careers in investment banking and consulting, in a world where the most revered technology leader (Elon Musk) sees the salvation of humanity in a fantasy colonization of Mars, common people will obviously feel betrayed. Our best brains need to be socially conscious. Our best leaders need to be morally sensible. If they will not do the job, society will look elsewhere, just as it is doing now.

There is immense psychological distress at the moment. The people who are supposed to save us are clueless. They do not have the spiritual strength to deal with this new (self-induced) massive attack on our social infrastructure and well-being.

  • Most define their lives through their work, which will soon mostly be rendered irrelevant by artificial intelligence. These ones are hopeless.

  • Some define their lives through their children. These ones will be sacrificing the spiritual health of the children to salvage their own, by making the children serve their own psychological needs.

  • Some, as expected, seek help from science. But the psychologists are clueless about questions of meaning. They have even less of an idea about the deep structural evolutionary causative factors that have led to this mess.

As I said at the beginning, all technological shocks have to be followed by spiritual transformations. We literally need to ask ourselves again what it means to be a human being. To do this at scale, we need a new set of spiritual leaders who can guide us through this new mess we created. Religion should evolve to stay relevant. Our educated elite is no longer governed by any higher values simply because they can not find any religious doctrines they can resonate with. (Doctrines addressed to uneducated masses living two thousand years ago will not do the job.)

What may be salvaging us today is a few glimmers of basic humanistic instinct, here and there, a few good people with good common sense in some high-level offices. But this is clearly not enough. You can not solve the greater social challenges we are facing today simply by throwing more love at them. Of course, empathy is necessary for revising and building the superstructures we need, but it is not enough by itself. (It is not even present in sufficient quantity at the moment.)

Salvation will not happen by going to Mars. It will happen through a deep understanding of how evolution works, and through a guided progressive social reform that is not out-of-touch with the new challenges of our times.

My biggest worry is that we are slowing down too much today. Do you know what happens when spiritual guidance and principles of social self-governance fail to keep up with technological progress? Bad decisions and eventually wars! Darkness takes over, and humanity gets hammered until it realigns its values and understands its real priorities.

“The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom.”

- Isaac Asimov

Why do we always have to learn the hard way? We need to understand that this game is getting exponentially more dangerous. We are not playing with swords any more. After the next world war, there may not be another “phoenix rising from its ashes” story. Of course, as I said before, nature will always rise from its ashes and keep constructing greater complexities and autonomies, but that does not necessarily have to involve us.

truth as status quo

We now have the science that argues how you're supposed to go about building something that doesn't have these echo chamber problems, these fads and madnesses. We're beginning to experiment with that as a way of curing some of the ills that we see in society today. Open data from all sources, and this notion of having a fair representation of the things that people are actually choosing, in this curated mathematical framework that we know stamps out echoes and fake news.

The Human Strategy

Fads and echo chambers provide the means to break positive feedback loops (by helping us counter them with virtual positive feedback loops) and get out of bad equilibria (by helping us cross the critical thresholds necessary to initiate change). Preventing illusion is akin to preventing progress. Every new truth starts as an untruth. The future will be in conflict with today. Today’s new reality is yesterday’s false belief.

It is startling to realize how much unbelief is necessary to make belief possible.

Eric Hoffer - The True Believer (Page 79)

We are constructors of our social world as well as receivers. That is why companies like Facebook should never be involved in this war against “fake news”. Truth is inherently political. Algorithms for sniffing it out will inevitably end up defending the status quo.

nonsensically high valuation of uber

Uber will be the single largest value collapse in technology history. Here are the reasons why:

  • The service is a commodity. Users do not care about which driver or car picks them up as long as the driver is not crazy and the car is not filthy. (It is hard to preserve even such basic qualities at large scale. Generally speaking, you can not perform above average when you become almost as big as the market itself. This in turn increases your insurance cost per transaction.)

  • The technology is a commodity. It is no longer hard to build the basic application from scratch. (Even municipalities have started doing it themselves.)

  • Neither drivers nor users are loyal to the company. Uber is fundamentally a utility app with very low switching costs. A lot of drivers and users utilize the rival apps as well. (Drivers are looking for more rides and users are looking for cheaper prices.) In fact, there are even aggregator apps that help drivers juggle more easily between the different networks.

  • It is not a winner-takes-all market as was imagined. Any network that is dense enough that the average waiting time for the user is below 5 minutes is good enough. (A toy waiting-time simulation is sketched after this list.) Killing competitors through predatory pricing does not change the basic market structure. If the market allows an oligopolistic structure, it will sooner or later (i.e. once Uber runs out of all the stupid money in the world to finance every ride) converge on one.

  • Unit economics is not improving. (In fact, as mentioned above, insurance cost per transaction is getting worse.) Rides do not scale since most of the costs are variable. (Cars are owned by the drivers, and efficiency gains from their greater utilization quickly max out, especially since they are used for personal purposes as well. So there is not much for Uber to suck away.) The underlying (evil) hope is that once Uber becomes a monopoly, it will be able to relax and dictate prices. This is a false hope, however, since governments do not tolerate in-your-face physical monopolies, especially if they create negative externalities like luring people away from public transportation and increasing congestion. (They seem to be more lenient with abstract digital ones.)

  • With the arrival of autonomous cars, the whole industry will change. Any gains from building a driver network will be gone, making it easier to launch Uber-like services with sheer capital. (Make no mistake, there will be a LOT of new capital coming in. The Germans will push in especially hard, old money will form alliances, etc.) Autonomy itself will quickly become commoditized and centralized around a few intermediary technology companies who will be training all the models and centralizing all the data. Also, at some point (as they do in the hospitality industry), users will start placing greater value on consistency of quality. Hence, instead of riding in random cars, they will prefer to become members of the fleets owned by the car manufacturers themselves. God knows what else will happen… Autonomy will be a big disruptive wave with a lot of currently-unforeseeable consequences.
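
Here is the toy waiting-time simulation referred to above, a minimal Monte Carlo sketch with entirely hypothetical numbers (city size, driver speed, driver counts). It only illustrates the diminishing-returns argument: once a network is dense enough that waits drop below roughly 5 minutes, additional density buys the rider very little, which is why density past a threshold stops being a moat.

```python
import math
import random

# Toy sketch: average pickup wait vs. number of idle drivers in a 10x10 km city.
# All numbers are invented for illustration; this is not a model of any real market.

def avg_wait_minutes(n_drivers, trials=2000, speed_km_per_min=0.5, city_km=10.0):
    total = 0.0
    for _ in range(trials):
        rx, ry = random.uniform(0, city_km), random.uniform(0, city_km)
        # distance from the rider to the nearest idle driver
        nearest = min(
            math.hypot(rx - random.uniform(0, city_km),
                       ry - random.uniform(0, city_km))
            for _ in range(n_drivers)
        )
        total += nearest / speed_km_per_min
    return total / trials

for n in (5, 20, 50, 200, 1000):
    print(f"{n:>5} drivers -> ~{avg_wait_minutes(n):.1f} min average wait")
```

Running it shows the wait time collapsing quickly and then flattening out, which is the sense in which a second dense-enough network is "good enough" for users.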

Nevertheless, do not short-sell Uber when it IPOs this year. It is backed by an aggressive giant called SoftBank, which is ruled by an old man who is backed by an infinite amount of blood money. As Keynes said, markets can remain irrational longer than you can remain solvent, and that will surely be the case for Uber.

classical vs innovative businesses

As you move away from zero-to-one processes, economic activities become more and more sensitive to macroeconomic dynamics.

Think of the economy as a universe. Innovative startups correspond to quantum mechanical phenomena rendering something from nothing. The rest of the economy works classically within the general relativity framework where everything is tightly bound to everything else. To predict your future you need to predict the evolution of everything else as well. This of course is an extremely stressful thing to do. It is much easier to exist outside the tightly bound system and create something from scratch. For instance, you can build a productivity software that will help companies increase their profit margins. In some sense such a software will exist outside time. It will sell whether there is an economic downturn or an upturn.


In classical businesses, forecasting the near future is extremely hard. The noise clears out when you look a little further into the future. But the far future is again quite hard to talk about, since you start feeling the long-term effects of the innovation being made today. So the difficulty hierarchy looks as follows:

near future > far future > mid future

In innovative businesses, forecasting the near future is quite easy. In the long run, everyone agrees that transformation is inevitable. So forecasting the far future is hard but still possible. However, what is going to happen in the mid term is extremely hard to predict. In other words, the above hierarchy gets flipped:

mid future > far future > near future

Notice that the mid future is actually quite hard to define. It can move around with the wind, so to speak, just as intended by the goddesses of fate in Greek mythology.

In Greek mythology the Moirae were the three Fates, usually depicted as dour spinsters. One Moira spun the thread of a newborn's life. The other Moira counted out the thread’s length. And the third Moira cut the thread at death. A person’s beginning and end were predetermined. But what happened in between was not inevitable. Humans and gods could work within the confines of one's ultimate destiny.

Kevin Kelly - What Technology Wants

I personally find it much more natural to just hold onto the near future and the far future, and let the middle inflection point dangle around. In other words, I prefer working with innovative businesses.

Middle zones are, generally speaking, always ill-defined, presenting another high-level justification for the barbell strategy popularized by Nassim Nicholas Taleb. The mid-term behavior of complex systems is tough to crack. For instance, short-term weather forecasts are highly accurate and long-term climate changes are also quite foreseeable, but what is going to happen in the mid term is anybody’s guess.

The far future always involves “structural” change. Things will definitely change, but the change is not of a statistical nature. As mentioned earlier, innovative businesses are not affected by short-term statistical (environmental / macroeconomic) noise. Instead they suffer from mid-term statistical noise of the type that phase-transition states exhibit in physics. (Think of the turbulence phenomenon.) So the above two difficulty hierarchies can be seen as particular manifestations of the following master hierarchy:

statistical unpredictability > structural unpredictability > predictability


Potential entrepreneurs jumping straight into tech without building any experience in traditional domains are akin to physics students jumping straight into quantum mechanics without learning classical mechanics first. This jump is possible, but also pedagogically problematic. It is much more natural to learn things in the historical order in which they were discovered. (Venture capital is a very recent phenomenon.) Understanding the idiosyncrasies and complexities of innovative businesses requires knowledge of how the usual, classical businesses operate.

Moreover, just like quantum states decohere into classical states, innovative businesses behave more and more like classical businesses as they get older and bigger. The word “classical” just means the “new” that has passed the test of time. Similarly, decoherence happens via entanglements, which is basically how time progresses at the quantum level.

By the way, this transition is very interesting from an intellectual point of view. For instance, innovative businesses are valued using a revenue multiple, while classical businesses are valued using a profit multiple. When exactly do we start to value a mature innovative business using a profit multiple? How do we tell that it has matured? When exactly does a blue ocean become a red one? With the first blood spilled by the death of competitors? Is that an objective measure? After all, it is the investors’ expectations themselves which sustain innovative businesses, which burn tons of cash all the time.
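
To make the two valuation conventions concrete, here is a minimal sketch with purely illustrative numbers (the multiples, revenue and margins are made up). It shows the same company valued on a revenue multiple versus a profit multiple; the point at which one switches between them is exactly the ambiguity raised above.

```python
# Toy comparison of the two valuation conventions discussed above.
# All figures are hypothetical and only illustrate the mechanics.

def revenue_multiple_valuation(annual_revenue, multiple=10.0):
    # innovative-stage convention: value scales with top line
    return annual_revenue * multiple

def profit_multiple_valuation(annual_revenue, profit_margin, multiple=15.0):
    # classical-stage convention: value scales with bottom line
    return annual_revenue * profit_margin * multiple

revenue = 100e6  # hypothetical annual revenue in USD
for margin in (-0.20, 0.05, 0.25):  # burning cash -> thin margin -> mature margin
    rv = revenue_multiple_valuation(revenue) / 1e6
    pv = profit_multiple_valuation(revenue, margin) / 1e6
    print(f"margin {margin:+.0%}: revenue-multiple {rv:.0f}M vs profit-multiple {pv:.0f}M")
```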

Also, notice that, just as all classical businesses were once innovative businesses, all innovative businesses are built upon the stable foundations provided by classical businesses. So we should not think of the relationship as one way. Quantum may become classical, but quantum states are always prepared by classical actors in the first place.


What happens to classical businesses as they get older and bigger? They either evolve or die. Combining this observation with the conclusions of the previous two sections, we deduce that the combined predictability-type timeline of an innovative business becoming a classical one looks as follows:

  1. (Innovative) Near Future: Predictability

  2. (Innovative) Mid Future: Statistical Unpredictability (Buckle up. You are about to go through some serious turbulence!)

  3. (Innovative) Far Future: Structural Unpredictability (Congratulations! You successfully landed. Older guys need to evolve or die.)

  4. (Classical) Near Future: Statistical Unpredictability (Wear your suit. There seems to be radiation everywhere on this planet!)

  5. (Classical) Mid Future: Predictability

  6. (Classical) Far Future: Structural Unpredictability (New forms of competition landed. You are outdated. Will you evolve or die?)

Notice the alternation between structural and statistical forms of unpredictability over time. Is it coincidental?


Industrial firms thrive on reducing variation (manufacturing errors); creative firms thrive on increasing variation (innovation).
- Patty McCord - How Netflix Reinvented HR

Here Patty’s observation is in line with our analogy. She is basically restating the disparity between the deterministic nature of classical mechanics and the statistical nature of quantum mechanics.

Employees in classical businesses feel like cogs in the wheel, because what needs to be done is already known with great precision and there is nothing preventing the operations from being run with utmost efficiency and predictability. They are (again, just like cogs in the wheel) utterly dispensable and replaceable. (Operating in red oceans, these businesses primarily focus on cost minimization rather than revenue maximization.)

Employees in innovative businesses, on the other hand, are given a lot more space to maneuver because they are the driving force behind an evolutionary product-market fit process that is not yet complete (and in some cases will never be complete).


Investment pitches too have quite opposite dynamics for innovative and classical businesses.

  • Innovative businesses raise money from venture capital investors, while classical businesses raise money from private equity investors who belong to a completely different culture.

  • If an entrepreneur prepares a 10-megabyte Excel document for a venture capital investor, he will be perceived as delusional and naive. If he does not do the same for a private equity investor, he will be perceived as entitled and preposterous.

  • Private equity investors look at data about the past and run statistical, blackbox models. Venture capital investors listen to stories about the future and think in causal, structural models. Remember, classical businesses are at the mercy of the macroeconomy, and a healthy macroeconomy displays maximum unpredictability. (All predictabilities are arbitraged away.) Whatever remnants of causal thinking are left in private equity are mostly about fixing internal operational inefficiencies.

  • The number of reasons for rejecting a private equity investment is more or less equal to the number of reasons for accepting one. In the venture capital world, rejection reasons far outnumber the acceptance reasons.

  • Experienced venture capital investors do not prepare before a pitch. The reason is not that they have a mastery over the subject matter of the entrepreneur’s work, but that there are far too many subject-matter-independent reasons for not making an investment. Private equity investors on the other hand do not have this luxury. They need to be prepared before a pitch because the devil is in the details.

  • For the venture capital investors, it is very hard to tell which company will achieve phenomenal success, but very easy to spot which one will fail miserably. Private equity investors have the opposite problem. They look at companies that have survived for a long time. Hence future-miserable-failures are statistically rare and hard to tell apart.

  • In innovative businesses, founders are (and should be) irreplaceable. In classical businesses, founders are (and should be) replaceable. (Similarly, professionals can successfully turn around failing classical companies, but can never pivot failing innovative companies.)

  • Private equity investors with balls do not shy away from turn-around situations. Venture capital investors with balls do not shy away from pivot situations.

pharma vs diagnostics

The bioinformatics industry is bifurcating into two categories defined by the two extreme value-generation endpoints, namely drug development and data creation.

  • Drugs come with patent protection and therefore create defensible sources of revenue. Data usually suffers from diminishing returns, and data generation can not sustain value indefinitely, but this is not true in the case of biology, which is (almost by definition) the most complex subject in the universe. (The fact that biological data seems to have a shorter half-life makes the situation even worse.)

  • Pharma companies develop the drugs and (the volume driven) diagnostics companies generate (the majority of) the data.

Pharma companies love to dip into data because it drives their precision medicine programs forward by enabling

  • the targeting of the right patient cohorts for existing drugs, and

  • the generation of novel drug targets.

Better precision medicine generates more knowledge about genetic variants and more drugs targeting them, which in turn render diagnostics tests respectively more accurate and more useful. In other words, more data eventually leads to an increase in the demand for diagnostics tests and therefore results in the generation of even more data. (This positive feedback cycle will greatly accelerate the maturation of the precision medicine paradigm in the near future.)
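
A toy dynamical sketch of this loop, with coefficients invented purely for illustration: more data improves precision medicine, which makes tests more useful, which increases test volume, which generates yet more data.

```python
# Toy feedback-loop sketch; the functional forms and constants are made up.
data = 1.0  # arbitrary starting stock of genomic/clinical data
for year in range(1, 6):
    usefulness = data ** 0.5        # diminishing returns on the data stock
    tests_run = 10.0 * usefulness   # demand for diagnostics grows with usefulness
    data += 0.1 * tests_run         # every test feeds new data back into the stock
    print(f"year {year}: data stock ~{data:.1f}, tests run ~{tests_run:.0f}")
```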

Pharma companies and diagnostics companies behave very differently (as summarized in the table below) and this creates a polarity in the product and business model configuration space for the bioinformatics industry whose primary customers (in the private domain) are these two types of companies.

[Table: Pharma vs Diagnostics]

The last two rows are very important and worth explaining in greater detail:

  • Pharma companies do basic research and therefore want to tap into all types of data sets. (They also have a greater tendency to use all types of analytical applications, while diagnostics companies ignore the long tail.) These datasets are generally huge and may be residing in a private cloud or with some public cloud provider. So pharma companies have to be able to connect to all of these datasets and run computation-heavy analyses that seamlessly weave through them. (When you are dealing with big data, computation needs to go to the data rather than the other way around; see the toy dispatch sketch after this list.) In other words, they naturally belong to the multi-cloud paradigm. Diagnostics companies, on the other hand, belong to the cloud paradigm, since they are optimizing cost and will just choose a single cloud provider based on price and convenience. (Read this older blog post to better understand the difference and polarity between the multi-cloud and cloud paradigms.)

  • Pharma companies are looking for help to solve their complex problems. Hence they are primarily focused on solutions. This pushes the software layer behind the services layer. In other words, the software is still there, but it is the service provider who is mostly using it. Diagnostics companies, on the other hand, focus on their unit economics. They do not need much consulting, since they just optimize the hell out of their production pipelines and leave them alone most of the time.
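
Here is the toy dispatch sketch mentioned above. It is a minimal illustration of the “compute goes to the data” pattern, with an invented Dataset type and invented location strings; it is not the API of any real SDK. A multi-cloud scheduler keys execution off the dataset’s location, whereas a cost-optimizing single-cloud setup always picks the one provider chosen on price and convenience.

```python
from dataclasses import dataclass

# Hypothetical types and locations for illustration only.

@dataclass
class Dataset:
    name: str
    location: str  # e.g. "aws:us-east-1", "gcp:europe-west4", "private:on-prem"

def dispatch_multi_cloud(task: str, dataset: Dataset) -> str:
    # run wherever the data already lives, so the big data never moves
    return f"run '{task}' on {dataset.location}"

def dispatch_single_cloud(task: str, dataset: Dataset, home: str = "aws:us-east-1") -> str:
    # always run on the single chosen provider, copying the data in if needed
    return f"copy '{dataset.name}' to {home}, then run '{task}' there"

cohort = Dataset("tumor_wgs_cohort", "gcp:europe-west4")
print(dispatch_multi_cloud("joint-variant-calling", cohort))
print(dispatch_single_cloud("joint-variant-calling", cohort))
```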

thoughts on cybersecurity business

  • Cybersecurity and drug development are similar in the sense that neither can harbor deep, long-lived productification processes. The problems are dynamic. Enemies eventually evolve protection against productified attacks.

  • Cybersecurity and number theory are similar in the sense that they contain the hardest problems of their respective fields and are not built on a generally-agreed-upon core body of knowledge. Nothing meaningful is accessible to beginner-level students since all sorts of techniques from other subfields are utilized to crack problems.

Hence, in its essence, cybersecurity is an elite services business. Anyone claiming the opposite (that it is a product business, that it does not necessitate the recruitment of the best minds of the industry) is selling a sense of security, not real security.

future of pharmaceutical industry

What will the future of the pharmaceutical industry look like?

It is clear that we are reaching the end of a paradigm, but what most people still do not get is how big the oncoming changes will be. We are on the cusp of a great intellectual revolution, on par with the revolution in 20th-century physics. Computer science is unlocking biology, just like mathematics unlocked physics, and the consequences will be huge. (Read this older post for a deeper look at this interesting analogy between analogies.)

For the first time in history, we are engineering solutions from scratch rather than stumbling into them or stealing them from nature. Western medicine is only now truly taking off.

Not only will this transformation be breathtaking, but it will also unfold at a speed much faster than we expect. As biology becomes more information-theoretical, the pharmaceutical industry will become more software driven and will start displaying more of the typical dynamics of the software industry, like faster scaling and deeper centralization and modularization.

Of course, predicting the magnitude of change is not the same thing as predicting how things will actually unfold. (Sometimes I wonder which one is harder. Remember Paul Saffo: “We tend to mistake a clear view of the future for a short distance.”) Let us give it a try anyway.


1. Splitting and Centralization of the Quantitative Brain

Just like the risk analytics layer is slowly being peeled out of big insurance companies as it is becoming more quantitative (small companies could not harbor such analytics departments anyway), the quantitative layer of the drug development process will split out of the massive pharmaceutical companies. (Similarly, in the autonomous driving space, companies like Waymo are licensing out self-driving technologies to big car manufacturers.)

Two main drivers of this movement:

  • Soft Reason. Culturally speaking, traditional (both manufacturing and service) companies can not nurture software development within themselves. Big ones often think that they can, but without exception they all end up wasting massive resources only to realize that it is not a matter of resources. Similarly, they always end up suffocating the technology companies they acquire.

  • Hard Reason. Unlike services and manufacturing, software scales perfectly. In other words, the cost of reproduction of software is close to nil. This leads to centralization and winner-takes-all effects. (Even within big pharmas, bioinformatics and IT departments are centralized.) Software developed in-house can never compete with software developed outside, which serves many customers, takes as input more diverse use cases and improves faster. (A toy cost comparison follows this list.)
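
The toy cost comparison referred to above, with invented numbers, only illustrates the hard reason: once the fixed development cost is paid, reproducing software for one more user costs close to nothing, so average cost keeps falling with scale, while a service or manufacturing business stays dominated by its variable cost.

```python
# Toy average-cost comparison; all figures are hypothetical.

def average_cost(fixed_cost, variable_cost_per_unit, units):
    return (fixed_cost + variable_cost_per_unit * units) / units

for users in (100, 10_000, 1_000_000):
    software = average_cost(5_000_000, 1, users)    # near-zero marginal cost
    service  = average_cost(100_000, 500, users)    # labor-dominated marginal cost
    print(f"{users:>9} users: software ~{software:,.0f}/user, service ~{service:,.0f}/user")
```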

Study of complex systems (which biology is an example of) is conducted from either a state centric or process centric perspective, using either statistical (AI driven) or deterministic (algorithm driven) methods. (Read this older post for a deeper look at the divide between state and process centric perspectives.)

In other words, the quantitative brain in biology will be centralized around four different themes:

  1. Algorithm Driven + State Centric

  2. AI Driven + State Centric

  3. Algorithm Driven + Process Centric

  4. AI Driven + Process Centric

Xtalpi is a good example of the 4th category. Seven Bridges in its current form belongs to the 1st category. There are other examples out there that fit neatly into one of these categories or cut across a few. (It is tough to cut across both the state centric and process centric perspectives, since the latter is mostly chemistry and physics driven and taps into a very different talent pool.)


2. Democratization and Commodification of Computation

Big pharma companies could afford to buy their own HPC clusters to run complex computations and manage data. Most are still holding onto these powerful clusters, but they are all realizing that this is not sustainable, for two main reasons:

  • They either can not accommodate bursty computations or can not keep the machines busy all the time. So it is best for the machines to be aggregated in shared spaces where they are maintained centrally.

  • Since data size is exploding doubly exponentially, it is becoming harder to move and more expensive to store. (Compute needs to go where data is generated.)

Cloud computing took off for reasons entirely unrelated to biomedical data analysis, which will soon be the biggest beneficiary of this revolution as biomedical data sizes and computation needs surpass everything else. (It is not surprising that the centralized disembodied brain is developing in the same way as our decentralized embodied brains did. It got enlarged for social reasons and deployed later for scientific purposes.) Small biotechs can now run complex computations on massive data repositories and pay for computation just like they pay for electricity, only for the amounts they use. Big pharmas too are migrating to the cloud, finally coming to terms with the fact that the cloud is both safer and cheaper. They are no longer uncomfortable parting with their critical data and no longer ignorant about the hidden costs of maintaining local hardware.

Long story short, democratization of computation is complete (aside from some big players with sunk cost investments) and the industry has already moved on to its next phase. Today we are witnessing a large scale commoditization of cloud services, driven by the following two factors:

  • Supply Side. Strong rivals arriving and catching up with Amazon Web Services.

  • Demand Side. Big players preferring to be cloud agnostic and supporting multi-cloud.


3. Democratization, Uniformization and Centralization of Data

Democratization. Big pharmas are hoarding data. They are entering into pre-competitive consortiums and forming partnerships with, or outright buying, diagnostics companies. Little pharmas (startup biotechs) are left out of this game, just as they were left out of the HPC game. But just like Amazon democratized computing, the National Institutes of Health (NIH) is now trying to democratize data. (Amazon and the NIH are playing parallel roles in this grand story. Interesting.) Sooner or later public data will outstrip private data, simply because health is way too important from a societal point of view.

Uniformization. NIH is also trying to uniformize data structures and harmonize compliance and security standards across the board, so that data can flow around at a higher speed.

Centralization. The NIH not only wants to democratize and uniformize data, it also wants to break data silos. Data is a lot more useful when it all comes together. (The fragmentation problem is especially acute in the US.) Similarly, imagine if everyone could hold all of their health data on a blockchain that they can share with any pharma in return for compensation. This is another form of centralization, radically bringing everything together at the individual level. All pharma companies would need to do is take a cross section across the cohorts they are interested in.
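
A toy data model of that idea, with invented class and field names and no real blockchain machinery, just to make the "cross section across cohorts" concrete: each individual holds their own records and grants access, and a pharma company selects the cohort it cares about from whoever has granted access.

```python
from dataclasses import dataclass, field

# Hypothetical, illustrative data model; not a real ledger implementation.

@dataclass
class HealthLedger:
    owner: str
    records: dict = field(default_factory=dict)   # e.g. {"BRCA1": "pathogenic variant"}
    shared_with: set = field(default_factory=set)

    def grant(self, company: str):
        # the individual, not a central authority, decides who gets access
        self.shared_with.add(company)

def cohort_cross_section(ledgers, company, marker):
    # only individuals who granted access and carry the marker enter the cohort
    return [l.owner for l in ledgers if company in l.shared_with and marker in l.records]

alice = HealthLedger("alice", {"BRCA1": "pathogenic variant"})
bob = HealthLedger("bob", {"EGFR": "L858R"})
alice.grant("pharma_co")
print(cohort_cross_section([alice, bob], "pharma_co", "BRCA1"))  # -> ['alice']
```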

With its top-down centralized policy making and absence of incumbent (novel drug developing) big pharmas, China will skip all of the above steps just as Africa skipped grid-based centralized electricity distribution and is jumping straight into off-grid decentralized solar power technologies.


4. Streamlining and Cheapening of Clinical Trials

It is extremely time consuming and expensive to get a drug approved. In the 2000s, only 11 percent of drugs entering phase 1 clinical trials ended up being approved by the FDA. Biotech startups that can make it to phase 3 usually end up selling themselves completely (or partially, on a milestone basis) to big pharma companies, simply because they can not afford the process. In other words, the final bottleneck for these startups in getting to the market on their own is clinical trials.

This problem is much more multi-dimensional and thorny, but there is still hope:

  • Time. Regulations are being streamlined, thereby making the processes faster.

  • Cost. Genomics and real world data are enabling better targeting (or - in the case of already approved drugs - retargeting) of patients and resulting in better responding cohorts and thereby driving costs down.

  • Risk. As we get better at simulating human biology on hardware and software, parallelizability of experimentation will increase and thereby the number of unnecessary (sure to fail) experiments on human beings will decrease. In other words, just as in the software world, experiments will fail faster.


5. Democratization and Decentralization of Drug Development

As some of the largest companies in the world, big pharmas are intimidating, but from an evolutionary point of view they are actually quite primitive. Their existing fatness is not due to some incredible prowess or sustained success; it is entirely structural, in the sense that the industry itself has not fully matured and modularized yet. (In fact, there is little hope that they can execute the necessary internal changes and evolve a contemporary data-driven approach to drug development. That is why they seek acquisitions, outside partnerships etc.)

If you split open a big pharma today, you will see a centralized quantitative brain (consisting of bioinformatics and IT departments) and a constellation of independent R&D centers around this brain. This is exactly what the whole pharma industry will look like in the future.

Once the quantitative brain is split off and centralized, computation is democratized and commoditized, data is democratized, uniformized and centralized, and clinical trials are streamlined and cheaper, there will be no need for biotech startups to merge themselves into the resource-rich environments of big pharma companies. Drugs will be developed in collaboration with the brain and be co-owned. (We have already started seeing partnerships between the brain and big pharma. Such partnerships will democratize and become commonplace.)

Biology will start off in independent labs and stay independent, and the startups will not have to sell themselves to the big guys if they do not want to, just as in the software world.

Biology is way too complex to allow repeat successes. Best ideas will always come from outsiders. In this sense, pharma industry will look more like the B2C software world rather than the B2B software world. Stochastic and experimental.

We have already started to see more dispersed value creation in the industry:

“Until well into the 1990s, a single drug company, Merck, was more valuable than all biotech companies combined. It probably seemed as if biotech would never arrive—until it did. Of the 10 best-selling drugs in the US during 2017, seven (including the top seller, the arthritis drug Humira) are biotech drugs based on antibodies.”

- MIT Tech Review - Look How Far Precision Medicine Has Come

(I did not say anything about the manufacturing and distribution steps since the vast majority of these physical processes is already being outsourced by pharma companies. In other words, these aspects of the industry have already been modularized.)

[Figure: Future of Pharma]

cloud vs multi-cloud

In the cloud world,

  • software wants to be free. Cloud providers are incentivized to offer all sorts of free goods to drive more data and compute usage, because that is basically how they make money. They are high volume / low margin infrastructural businesses.

  • hardware wants to be virtualized. Just like sequencing centers aggregate, centralize and virtualize sequencers and offer sequencing as a service, cloud providers do the same for PCs and offer data storage and computing as a service. Users do not directly interact with the machines themselves.

In other words, cloud providers commoditize the stack above them via free-ization and the stack below them via virtualization, and thereby increase the percentage of the value they capture in the value chain.

Thinking pictorially we have the following situation:

 
[Figure: Cloud World / Meat Strategy]

Here, the stacks composed of small squares represent commoditized competitive markets with many players, and the monolithic stack represents a monopolistic market.

Thinking of the whole figure as a hamburger, we can say that the cloud world is “pro-meat”.

Notice that all stacks’ incentives are aligned horizontally, in the sense that they all want the entire industry to grow and the bottlenecks (wherever in the value chain they may arise) to be eliminated. (i.e. Think of industry growth as the horizontal expansion of all stacks.) But the stacks’ incentives are not necessarily aligned vertically, in the sense that one stack capturing more of the surplus generated by the entire value chain often implies another stack capturing less. (i.e. Dynamics among the stacks are often governed by zero-sum games rather than non-zero-sum games.) Hence each stack wants to democratize (i.e. commoditize) the neighboring stacks that it interacts with. (Read this older post for a deeper look at such stack dynamics.)
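
A toy arithmetic sketch of this horizontal/vertical distinction, with invented surplus totals and share splits: growing the whole industry raises every stack’s absolute take (non-zero-sum horizontally), while reshuffling the split of a fixed surplus is zero-sum among the stacks (vertically).

```python
# Toy numbers only; the shares and totals are made up for illustration.

def payouts(total_surplus, shares):
    return {stack: round(total_surplus * share, 1) for stack, share in shares.items()}

before = {"software": 0.30, "cloud": 0.50, "hardware": 0.20}
after  = {"software": 0.40, "cloud": 0.30, "hardware": 0.30}  # multi-cloud commoditizes the middle stack

print(payouts(100.0, before))  # vertical split: the cloud layer captures half
print(payouts(100.0, after))   # same pie, different split -> zero-sum vertically
print(payouts(150.0, before))  # industry growth: everyone gains -> non-zero-sum horizontally
```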

Now, a multi-cloud software strategy weakens the cloud layer (middle stack) by commoditizing cloud providers and thereby releases the tension on the hardware layer (bottom stack). Thinking pictorially we have the following situation:

 
[Figure: Multi-Cloud World / Bread Strategy]

This is essentially why IBM (after missing the cloud wave due to short-sightedness) recently ended up buying Red Hat for 34 billion USD:

This acquisition brings together the best-in-class hybrid cloud providers and will enable companies to securely move all business applications to the cloud. Companies today are already using multiple clouds. However, research shows that 80 percent of business workloads have yet to move to the cloud, held back by the proprietary nature of today’s cloud market. This prevents portability of data and applications across multiple clouds, data security in a multi-cloud environment and consistent cloud management.

IBM and Red Hat will be strongly positioned to address this issue and accelerate hybrid multi-cloud adoption. Together, they will help clients create cloud-native business applications faster, drive greater portability and security of data and applications across multiple public and private clouds, all with consistent cloud management. In doing so, they will draw on their shared leadership in key technologies, such as Linux, containers, Kubernetes, multi-cloud management, and cloud management and automation.

- IBM Newsroom - IBM to Acquire Red Hat, Completely Changing the Cloud Landscape