quitting at the right time

Every exponential rise in nature sooner or later loses momentum and turns into a decline. That is why it is hard to become a legend. But if you quit while at the top, if you cut your graph at the right point, you can become a legend. People will then keep extending your graph in their imagination (unnaturally) at an exponential rate and say things like “who knows what else he would have done.”

Yet hardly anyone has the guts to leave a career at its peak and retire, or to take up some other pursuit. That is why legends are always born out of untimely deaths.

Suicide is not a solution either, by the way, because, as Cioran said, we can never get the timing right.

It is not worth the bother of killing yourself, since you always kill yourself too late.
- Emil M. Cioran

hypothesis vs data driven science

Science progresses in a dualistic fashion. You can either generate a new hypothesis out of existing data and conduct science in a data-driven way, or generate new data for an existing hypothesis and conduct science in a hypothesis-driven way. For instance, when Kepler was looking at the astronomical data sets to come up with his laws of planetary motion, he was doing data-driven science. When Einstein came up with his theory of General Relativity and asked experimenters to verify the theory’s prediction for the anomalous rate of precession of the perihelion of Mercury's orbit, he was doing hypothesis-driven science.

Similarly, technology can be problem-driven (the counterpart of “hypothesis-driven” in science) or tool-driven (the counterpart of “data-driven” in science). When you start with a problem, you look for what kind of (existing or not-yet-existing) tools you can throw at the problem, and in what combination. (This is similar to thinking about what kind of experiments you can run to generate data relevant to a hypothesis.) Conversely, when you start with a tool, you try to find a use case where you can deploy it. (This is similar to starting off with a data set and digging around to see what kind of hypotheses you can extract from it.) Tool-driven technology development is much more risky and stochastic. It is taboo for most technology companies, since investors do not like random tinkering and prefer funding problems with high potential economic value and entrepreneurs who “know” what they are doing.

Of course, new tools allow you to ask new kinds of questions of existing data sets. Hence, problem-driven technology (by developing new tools) leads to more data-driven science. And this is exactly what is happening now, at a massive scale. With the development of cheap cloud computing (and storage) and deep learning algorithms, scientists are equipped with some very powerful tools to attack old data sets, especially in complex domains like biology.


Higher Levels of Serendipity

One great advantage of data-driven science is that it involves tinkering and “not really knowing what you are doing”. This leads to fewer biases and more serendipitous connections, and thereby to the discovery of more transformative ideas and hitherto unknown interesting patterns.

Hypothesis-driven science has a direction from the beginning. Hence surprises are hard to come by, unless you have exceptionally creative intuitive capabilities. For instance, the theory of General Relativity was based on one such intuitive leap by Einstein. (There has not been such a great leap since then. So it is extremely rare.) Quantum Mechanics, on the other hand, was literally forced on us by experimental data. It was so counterintuitive that people refused to believe it. All they could do was turn their intuition off and listen to the data.

Previously, data sets were not huge, so scientists could literally eyeball them. Today this is no longer possible. That is why scientists now need computers, algorithms and statistical tools to help them decipher new patterns.

Governments do not give money to scientists so that they can tinker around and do whatever they want. So a scientist applying for a grant needs to know what he is doing. This forces everyone into a hypothesis-driven mode from the beginning and thereby leads to fewer transformative ideas in the long run. (Hat tip to Mehmet Toner for this point.)

Science and technology are polar opposite endeavors. For governments to fund science the way investors fund technology is a major mistake, and it is also an important reason why today some of the most exciting science is being done inside closed private companies rather than open academic communities.


Less Democratic Landscape

There is another good reason why the best scientists are leaving academia. You need good quality data to do science within the data-driven paradigm, and since data is so easily monetizable, the largest data sets are being generated by private companies. So it is not surprising that the most cutting-edge research in fields like AI is being done inside companies like Google and Facebook, which also provide the necessary compute power to play around with these data sets.

While hypothesis generation gets better when it is conducted in a decentralized, open manner, the natural tendency of data is to be centralized under one roof where it can be harmonized and maintained consistently at a high quality. As they say, “data has gravity”. Once you pass certain critical thresholds, data starts generating strong positive feedback effects and thereby attracts even more data. That is why investors love it. Using smart data strategies, technology companies can build a moat around themselves and render their business models a lot more defensible.

In a typical private company, what data scientists do is throw thousands of different neural networks at massive internal data sets and simply observe which one gets the job done best. This, of course, is empiricism in its purest form, no different from blindly screening millions of compounds during drug development. As they say, just throw it against a wall and see if it sticks.
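
To make this concrete, here is a minimal sketch of such a screening loop. The dataset, the search space and the candidate count are all made up for illustration; real pipelines differ mainly in scale.

```python
# Minimal sketch of purely empirical model screening: generate candidate
# networks at random, keep whichever scores best on held-out data.
# The dataset and the search space are invented for illustration.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
best_score, best_config = -np.inf, None

for _ in range(20):  # in practice this loop runs over thousands of candidates
    config = {
        "hidden_layer_sizes": tuple(int(u) for u in rng.choice([32, 64, 128], size=rng.integers(1, 4))),
        "alpha": float(10.0 ** rng.uniform(-5, -1)),  # L2 penalty, sampled log-uniformly
    }
    model = MLPClassifier(max_iter=500, random_state=0, **config)
    model.fit(X_train, y_train)
    score = model.score(X_test, y_test)  # the only selection criterion is "does it stick"
    if score > best_score:
        best_score, best_config = score, config

print(best_config, best_score)
```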

This brings us to a major problem about big-data-driven science.


Lack of Deep Understanding

There is now a better way. Petabytes allow us to say: "Correlation is enough." We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.

Chris Anderson - The End of Theory

We can not understand the complex machine learning models we are building. In fact, we train them the same way one trains a dog. That is why they are called black-box models. For instance, when the stock market experiences a flash crash we blame the algorithms for getting into a stupid loop, but we never really understand why they do so.

Is there any problem with this state of affairs if these models get the job done, make good predictions and (even better) earn us money? Can scientists not adopt the same pragmatic attitude as technologists, focus only on results, content themselves with the successful manipulation of nature and leave true understanding aside? Are the data sets not already too huge for human comprehension anyway? Why do we expect machines to be able to explain their thought processes to us? Perhaps they are the beginnings of a higher-level life form, and we should learn to trust them with the activities they are better at than we are?

Perhaps we have been under an illusion all along, and our analytical models have never really penetrated that deeply into nature anyway?

Closed analytic solutions are nice, but they are applicable only for simple configurations of reality. At best, they are toy models of simple systems. Physicists have known for centuries that the three-body problem or three dimensional Navier Stokes do not afford closed form analytic solutions. This is why all calculations about the movement of planets in our solar system or turbulence in a fluid are all performed by numerical methods using computers.

Carlos E. Perez - The Delusion of Infinite Precision Numbers

Is it a surprise that as our understanding gets more complete, our equations become harder to solve?

To illustrate this point of view, we can recall that as the equations of physics become more fundamental, they become more difficult to solve. Thus the two-body problem of gravity (that of the motion of a binary star) is simple in Newtonian theory, but unsolvable in an exact manner in Einstein’s Theory. One might imagine that if one day the equations of a totally unified field are written, even the one-body problem will no longer have an exact solution!

Laurent Nottale - The Relativity of All Things (Page 305)

It seems like the entire history of science is a progressive approximation to an immense computational complexity via increasingly sophisticated (but nevertheless quite simplistic) analytical models. This trend is obviously not sustainable. At some point we should perhaps just stop theorizing and let the machines figure out the rest:

In new research accepted for publication in Chaos, they showed that improved predictions of chaotic systems like the Kuramoto-Sivashinsky equation become possible by hybridizing the data-driven, machine-learning approach and traditional model-based prediction. Ott sees this as a more likely avenue for improving weather prediction and similar efforts, since we don’t always have complete high-resolution data or perfect physical models. “What we should do is use the good knowledge that we have where we have it,” he said, “and if we have ignorance we should use the machine learning to fill in the gaps where the ignorance resides.”

Natalie Wolchover - Machine Learning’s ‘Amazing’ Ability to Predict Chaos

Statistical approaches like machine learning have often been criticized for being dumb. Noam Chomsky has been especially vocal about this:

You can also collect butterflies and make many observations. If you like butterflies, that's fine; but such work must not be confounded with research, which is concerned to discover explanatory principles.

- Noam Chomsky as quoted in Colorless Green Ideas Learn Furiously

But these criticisms are akin to calling reality itself dumb, since what we feed into the statistical models are basically virtualized fragments of reality. Analytical models conjure up abstract epiphenomena to explain phenomena, while statistical models use phenomena to explain phenomena and turn reality directly onto itself. (The reason why deep learning is so much more effective than its peers among machine learning models is that it is hierarchical, just like reality is.)

This brings us to the old dichotomy between facts and theories.


Facts vs Theories

Long before the computer scientists came onto the scene, there were prominent humanists (and historians) fiercely defending fact against theory.

The ultimate goal would be to grasp that everything in the realm of fact is already theory... Let us not seek for something beyond the phenomena - they themselves are the theory.

- Johann Wolfgang von Goethe

Reality possesses a pyramid-like hierarchical structure. It is governed from the top by a few deep high-level laws, and manifested in its utmost complexity at the lowest phenomenological level. This means that there are two strategies you can employ to model phenomena.

  • Seek the simple. Blow your brains out, discover some deep laws and run simulations that can be mapped back to phenomena.

  • Bend the complexity back onto itself. Labor hard to accumulate enough phenomenological data and let the machines do the rote work.

One approach is not inherently superior to the other, and both are hard in their own ways. Deep theories are hard to find, and good quality facts (data) are hard to collect and curate in large quantities. Similarly, a theory-driven (mathematical) simulation is cheap to set up but expensive to run, while a data-driven (computational) simulation (of the same phenomena) is cheap to run but expensive to set up. In other words, while a data-driven simulation is parsimonious in time, a theory-driven simulation is parsimonious in space. (Good computational models satisfy a dual version of Occam’s Razor. They are heavy in size, with millions of parameters, but light to run.)
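
Here is a toy sketch of that trade-off, contrasting the two routes on the same made-up phenomenon (a damped pendulum): the theory-driven route is a few lines of equations that must be integrated at every query, while the data-driven surrogate pays a one-time cost of generating data and fitting parameters, after which each prediction is a cheap forward pass. All names and numbers are illustrative.

```python
# A toy contrast between a theory-driven and a data-driven simulation of the
# same phenomenon (a damped pendulum). Parameters and sizes are illustrative.
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.neural_network import MLPRegressor

def pendulum(t, y, b=0.25, g=9.81, L=1.0):
    """Theory-driven model: two coupled ODEs derived from Newton's laws."""
    theta, omega = y
    return [omega, -b * omega - (g / L) * np.sin(theta)]

# Theory-driven route: a tiny "model" (a few lines of equations), but every
# prediction requires running a numerical integration.
def predict_by_theory(theta0, t_final=5.0):
    sol = solve_ivp(pendulum, (0.0, t_final), [theta0, 0.0])
    return sol.y[0, -1]  # angle at t_final

# Data-driven route: expensive setup (generate data, fit parameters; millions
# of them in realistic settings), but each later prediction is cheap.
rng = np.random.default_rng(0)
theta0_train = rng.uniform(-1.5, 1.5, size=500)
y_train = np.array([predict_by_theory(th) for th in theta0_train])  # costly, done once

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(theta0_train.reshape(-1, 1), y_train)

# Cheap inference: no integration, just a forward pass.
print(predict_by_theory(0.7))
print(surrogate.predict([[0.7]])[0])
```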

Some people try to mix the two philosophies, injecting our causal models into the machines to enjoy the best of both worlds. I believe that this approach is fundamentally mistaken, even if it proves to be fruitful in the short run. Rather than biasing the machines with our theories, we should just ask them to economize their own thought processes and thereby come up with their own internal causal models and theories. After all, abstraction is just a form of compression, and when we talk about causality we (in practice) mean causality as it fits into the human brain. In the actual universe, everything is completely interlinked with everything else, and causality diagrams are unfathomably complicated. Hence, we should be wary of pre-imposing our theories on machines whose intuitive powers will soon surpass ours.

Remember that, in biological evolution, the development of unconscious (intuitive) thought processes came before the development of conscious (rational) thought processes. It should be no different for the digital evolution.

Side Note: We suffered an AI winter for mistakenly trying to flip this order and asking machines to develop rational capabilities before developing intuitional capabilities. When a scientist comes up with a hypothesis, it is a simple, effable distillation of an unconscious intuition which is of an ineffable, complex statistical form. In other words, it is always “statistics first”. Sometimes the progression from the statistical to the causal takes place out in the open among a community of scientists (as happened in the smoking-causes-cancer research), but more often it just takes place inside the mind of a single scientist.


Continuing Role of the Scientist

Mohammed AlQuraishi, a researcher who studies protein folding, wrote an essay exploring a recent development in his field: the creation of a machine-learning model that can predict protein folds far more accurately than human researchers. AlQuraishi found himself lamenting the loss of theory over data, even as he sought to reconcile himself to it. “There’s far less prestige associated with conceptual papers or papers that provide some new analytical insight,” he said, in an interview. As machines make discovery faster, people may come to see theoreticians as extraneous, superfluous, and hopelessly behind the times. Knowledge about a particular area will be less treasured than expertise in the creation of machine-learning models that produce answers on that subject.

Jonathan Zittrain - The Hidden Costs of Automated Thinking

The role of scientists in the data-driven paradigm will obviously be different, but not trivial. Today’s world champions in chess are computer-human hybrids. We should expect the situation for science to be no different. AI is complementary to human intelligence and in some sense only amplifies the already existing IQ differences. After all, a machine-learning model is only as good as the intelligence of its creator.

He who loves practice without theory is like the sailor who boards ship without a rudder and compass and never knows where he may cast.

- Leonardo da Vinci

Artificial intelligence (at least in its present form) is like a baby. It can either be spoon-fed data or left to gorge on everything. But, as we know, what makes great minds great is what they choose not to consume. This is where the scientists come in.

Deciding which experiments to conduct and which data sets to use is no trivial task. Choosing which portion of reality to “virtualize” is an important judgment call. Hence all data efforts are inevitably hypothesis-laden and therefore non-trivially involve the scientist.

For 30 years quantitative investing started with a hypothesis, says a quant investor. Investors would test it against historical data and make a judgment as to whether it would continue to be useful. Now the order has been reversed. “We start with the data and look for a hypothesis,” he says.

Humans are not out of the picture entirely. Their role is to pick and choose which data to feed into the machine. “You have to tell the algorithm what data to look at,” says the same investor. “If you apply a machine-learning algorithm to too large a dataset often it tends to revert to a very simple strategy, like momentum.”

The Economist - March of the Machines

True, each data generation effort is hypothesis-laden and each scientist comes with a unique set of biases generating a unique set of judgment calls, but at the level of society, these biases eventually get washed out through (structured) randomization via sociological mechanisms and historical contingencies. In other words, unlike the individual, society as a whole operates in a non-hypothesis-laden fashion, and eventually figures out the right angle. The role (and the responsibility) of the scientist (and of scientific institutions) is to cut this search period as short as possible by simply being smart about it, in a fashion not too different from how enzymes speed up chemical reactions by lowering activation energy costs. (A scientist’s biases are actually his strengths since they implicitly contain lessons from eons of evolutionary learning. See the side note below.)

Side Note: There is this huge misunderstanding that evolution progresses via chance alone. Pure randomization is a sign of zero learning. Evolution, on the other hand, learns over time and embeds this knowledge in all complexity levels, ranging all the way from genetic to cultural forms. As evolutionary entities become more complex, the search becomes smarter and the progress becomes faster. (This is how protein synthesis and folding happen incredibly fast within cells.) Only at the very beginning, in its simplest form, does evolution try out everything blindly. (Physics is so successful because its entities are so stupid and comparatively much easier to model.) In other words, the commonly raised argument that evolution could not have achieved so much based on pure chance alone is correct. As mathematician Gregory Chaitin points out, “real evolution is not at all ergodic, since the space of all possible designs is much too immense for exhaustive search”.

Another venue where scientists keep playing an important role is in transferring knowledge from one domain to another. Remember that there are two ways of solving hard problems: diving into the vertical (technical) depths and venturing across horizontal (analogical) spaces. Machines are horrible at venturing horizontally precisely because they do not get to the gist of things. (This was the criticism of Noam Chomsky quoted above.)

Deep learning is kind of a turbocharged version of memorization. If you can memorize all that you need to know, that’s fine. But if you need to generalize to unusual circumstances, it’s not very good. Our view is that a lot of the field is selling a single hammer as if everything around it is a nail. People are trying to take deep learning, which is a perfectly fine tool, and use it for everything, which is perfectly inappropriate.

- Gary Marcus as quoted in Warning of an AI Winter


Trends Come and Go

Generally speaking, there is always a greater appetite for digging deeper for data when there is a dearth of ideas. (Extraction becomes more expensive as you dig deeper, as in mining operations.) Hence, the current trend of data-driven science is partially due to the fact that scientists themselves have run out of sensible falsifiable hypotheses. Once the hypothesis space becomes rich again, the pendulum will inevitably swing back. (Of course, who will be doing the exploration is another question. Perhaps it will be the machines, and we will be doing the dirty work of data collection for them.)

As mentioned before, data-driven science operates stochastically in a serendipitous fashion and hypothesis-driven science operates deterministically in a directed fashion. Nature, on the other hand, loves to use stochasticity and determinism together, since optimal dynamics reside - as usual - somewhere in the middle. (That is why there are tons of natural examples of structured randomness, such as Lévy flights.) Hence we should learn to appreciate the complementarity between data-drivenness and hypothesis-drivenness, and embrace the duality as a whole rather than trying to break it.
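
As a small illustration of such structured randomness (with arbitrarily chosen parameters), compare a plain Gaussian random walk with a Lévy flight, whose heavy-tailed step lengths mix many short moves with occasional very long jumps:

```python
# A small sketch of "structured randomness": a Lévy flight mixes many short,
# random steps with occasional very long jumps, unlike a plain Gaussian walk.
# The tail exponent and step counts are arbitrary choices for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_steps = 10_000

# Brownian-like walk: step lengths drawn from a Gaussian.
gauss_steps = rng.normal(size=(n_steps, 2))
brownian = np.cumsum(gauss_steps, axis=0)

# Lévy flight: heavy-tailed (Pareto) step lengths, uniformly random directions.
lengths = rng.pareto(a=1.5, size=n_steps) + 1.0   # power-law tail
angles = rng.uniform(0.0, 2.0 * np.pi, size=n_steps)
levy_steps = lengths[:, None] * np.column_stack([np.cos(angles), np.sin(angles)])
levy = np.cumsum(levy_steps, axis=0)

# The heavy tail shows up as a few steps that dwarf the typical one.
print(gauss_steps.max(), lengths.max())
```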


If you liked this post, you will also enjoy the older post Genius vs Wisdom where genius and wisdom are framed respectively as hypothesis-driven and data-driven concepts.

pain and learning

FAAH is a protein that breaks down anandamide, also known as the “bliss molecule,” which is a neurotransmitter that binds to cannabinoid receptors. These are some of the same receptors that are activated by marijuana. With less FAAH activity, this patient was found to have more circulating levels of anandamide, which may explain her resistance to feeling pain.

... Dr. James Cox, another author and senior lecturer at the Wolfson Institute for Biomedical Research at University College London, said, “Pain is an essential warning system to protect you from damaging and life-threatening events.” Another disadvantage to endocannabinoids and their receptor targets is that poor memory and learning may be unwanted byproducts. Researchers said the Scottish woman reported memory lapses, which mirrors what is seen in mice missing the FAAH gene.

Jacquelyn Corley - The Case of a Woman Who Feels Almost No Pain Leads Scientists to a New Gene Mutation

Pain is needed to register what is learned. As they say, no pain no gain.

You can easily tell that you are not learning much if everything is flowing too smoothly. You take notice only upon encountering the unexpected and the unexpected is painful.

I advise mature students to stay away from well-written textbooks. They are like driving on a wide and empty highway. Typos keep you alert, logical gaps sharpen your mind and bad arguments force you to generate new ideas. You should generally make the reading process as hard for yourself as possible.

Educational progress can be achieved by making either the content or the environment more challenging. If you can perform well under constraints, you will perform even better when the environment normalizes.


Engagement enhances learning not because it increases focus but because it increases grit. Struggle is necessary. If the teaching is not engaging, the student will more easily give up on the struggle. The goal is not to eliminate the struggle.

The more confident a learner is of their wrong answer, the better the information sticks when they subsequently learn the right answer. Tolerating big mistakes can create the best learning opportunities.

David Epstein - Range (Page 86)

So the harder you fall, the better. The more wrong you turn out to be, the more unforgettable the experience will be. As they say, never waste a good crisis.

People usually go into defensive mode when their internal reality clashes with the external reality. That is basically why persuasion is such a hard art form to master. The radicalized easily become even more radicalized when you try to lay a convincing path to moderation.

Of course, there are times when you need to close up, refuse to learn and stick with your beliefs. The world is complex, situations are multi-faceted, refutations are never really that clear. In some sense, every principle looks stupid in certain contexts. The principled man knows this and nevertheless takes the risk, because he thinks that looking stupid sometimes is better than looking like an amorphous mass of jelly all the time. Someone who is constantly learning and therefore constantly in revision mode runs the danger of becoming jelly-like. Sometimes one may need to prefer the pain of resisting to the pain of learning.


The essence of the neuromatrix theory of pain is that chronic pain is more a perception than a raw sensation, because the brain takes many factors into account to determine the extent of danger to the tissues. Scores of studies have shown that along with assessing damage, the brain, when developing our subjective experience of pain perception, also assesses whether action can be taken to diminish the pain, and it develops expectations as to whether this damage will improve or get worse. The sum total of these assessments determines our expectation as to our future, and these expectations play a major role in the level of pain we will feel. Because the brain can so influence our perception of chronic pain, Melzack conceptualized it as more of "an output of the central nervous system.”

Norman Doidge - The Brain’s Way of Healing (Page 10)

Pain is not an objective factor. As with everything else, it is gauged in an anticipatory manner by the mind. If you implicitly or explicitly believe that the associated costs will be greater, your pain will be greater.

Since pain is necessary for learning, this means that learning too is done in an anticipative manner. That is why proper coaching is so essential. The student needs to have some idea about what he desires for the future so that his cost function becomes better defined.

When one has no expectation from the future, one is essentially dead and floating, and has reverted back to basic-level survival mode. You need to make yourself susceptible to higher forms of pain. Some of the greatest minds I have met had mastered the art of getting mad and pissed-off. They were extremely passionate about some subject and had cultivated an exceptional level of emotional sensitivity in that area.

weaknesses and biases

All weaknesses arise from certain extremes and all successes are traceable to certain extremes. So defend your extremes, and in order not to suffer from the accompanying weaknesses, choose the environments you walk into carefully. All weaknesses manifest themselves contextually. Learn how to manage the context, not the weakness.

Similarly, your biases are your strengths. Defend them fiercely. They are what differentiates you from others. Thinking is methodological. Creativity is overrated. (Both can be learned.) What matters is the input and input is shaped by biases.

complexity and failure

Complex structures that are built slowly over time via evolutionary processes (e.g. economies, companies, buildings, species, reputations, software) tend to be robust, but when they collapse, they do so instantly.

In the literature, this asymmetry is called the Seneca Effect, after the ancient Roman Stoic philosopher Lucius Annaeus Seneca, who said "Fortune is of sluggish growth, but ruin is rapid".

Some remarks:

  • That is why only highly educated people can be spectacularly wrong. Only with education can one construct contrived, highly complex arguments of the kind that can fail on several different levels and lead to a spectacular failure. (Remember, the most outrageous crimes in history were carried out in the name of complex ideologies.)

  • That is also why good product designers think hard before beginning a design process that is sure to complexify over time. Complex designs collapse in their entirety and are very difficult to salvage or undo. Similarly, good businessmen think hard before opening a new business, since the decision to close one later is a much harder process.

  • Once entrepreneurs start building a business, they immediately start to suffer from sunk-cost and negativity biases, which are specific manifestations of the much more general asymmetry between construction and destruction. We tend to be conservative with respect to complex structures because they are hard to build but easy to destroy. (Unsurprisingly, these psychological biases look surprising to theoretical economists who have never really built anything complex and prone to failure in their lives.)

physics as study of ignorance

Contemporary physics is based on the following three main sets of principles:

  1. Variational Principles

  2. Statistical Principles

  3. Symmetry Principles

Various combinations of these principles led to the birth of the following fields:

  • Study of Classical Mechanics (1)

  • Study of Statistical Mechanics (2)

  • Study of Group and Representation Theory (3)

  • Study of Path Integrals (1 + 2)

  • Study of Gauge Theory (1 + 3)

  • Study of Critical Phenomena (2 + 3)

  • Study of Quantum Field Theory (1 + 2 + 3)

Notice that all three sets of principles are based on ignorances that arise from us being inside the structure we are trying to describe. 

  1. Variational Principles arise due to our inability to experience time as a continuum. (Path information is inaccessible.)

  2. Statistical Principles arise due to our inability to experience space as a continuum. (Coarse graining is inevitable.)

  3. Symmetry Principles arise due to our inability to experience spacetime as a whole.  (Transformations are undetectable.)

Since Quantum Field Theory is based on all three principles, it seems like most of the structure we see arises from these sets of ignorances themselves. From the hypothetical outside point of view of God, none of these ignorances are present and therefore none of the entailed structures are present either.
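
As a schematic illustration (not a derivation), the path-integral formulation of a quantum field theory displays all three ingredients in a single expression:

```latex
% Schematic generating functional of a quantum field theory:
%   - the action S[\phi] carries the variational principle (1),
%   - the integral over all field configurations \mathcal{D}\phi carries the
%     statistical, sum-over-histories principle (2),
%   - both S and the measure are required to be invariant under the theory's
%     symmetry group (3).
Z = \int \mathcal{D}\phi \; e^{\, i S[\phi]/\hbar}
```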

The study of physics is not yet complete, but its historical progression suggests that its future depends on us discovering new aspects of our ignorances:

  1. Variational Principles were discovered in the 18th Century.

  2. Statistical Principles were discovered in the 19th Century.

  3. Symmetry Principles were discovered in the 20th Century.

The million dollar question is what principle we will discover in the 21st Century. Will it help us merge General Relativity with Quantum Field Theory or simply lead to the birth of brand new fields of study?

waves of decentralization

Evolutionary dynamics always start off well-defined and centralized, but over time (without exception) they mature and decentralize. Our own history is full of beautiful exemplifications of this fact. In historical order, we went through the following decentralization waves:

  • Science decentralized Truth away from the hegemony of the Church.

  • Democracy decentralized Power.

  • Capitalism decentralized Wealth.

  • Social Media decentralized Fame away from the media barons.

Today, if you are not powerful, wealthy or famous, there is no one to blame but yourself. If you do not know the truth, there is no one to blame but yourself. Everything is accessible, at least in theory. This of course inflicts an immense amount of stress on the modern citizen. In a sense, life was a lot easier when there was not so much decentralization.

Note how important the social media revolution really was. Most people do not recognize the magnitude of the change that has taken place in such a short period of time. In terms of structural importance, it is on the same scale as the emergence of democracy. We no longer distinguish a sophisticated judgment from an unsophisticated one. Along with “Every Vote Counts”, we now also have “Every Like Counts”.

Of course, the social media wave was built on another, even more fundamental decentralization wave: the internet itself. Together with the rise of the internet, communication became completely decentralized. Today, in a similar fashion, we are witnessing the emergence of blockchain technology, which is trying to decentralize trust by creating neutral trust nodes with no centralized authority behind them. For instance, you no longer need to be a central bank with a stamp of approval from the government to launch a currency. (Both the internet and blockchain undermine political authority and in particular render national boundaries increasingly irrelevant.)

Internet itself is an example of a design, where robustness to communication problems was a primary consideration (for those who don't remember, Arpanet was designed by DARPA to be a communication network resistant to nuclear attack). In that sense the Internet is extremely robust. But today we are being introduced to many other instances of that technology, many of which do not follow the decentralized principles that guided the early Internet, but are rather highly concentrated and centralized. Centralized solutions are almost by definition fragile, since they depend on the health of a single concentrated entity. No matter how well protected such central entity is, there are always ways for it to be hacked or destroyed.

Filip Piekniewski - Optimality, Technology and Fragility

As pointed out by Filip, evolution favors progression from centralization to decentralization because it functionally corresponds to a progression from fragility to robustness.

Also, notice that all of these decentralization waves initially overshoot due to the excitement caused by their novelty. That is why they are always criticized at first, for good reasons. Eventually they all shed their lawlessness, structurally stabilize, go completely mainstream and institutionalize themselves.

science vs technology

  • Science (as a form of understanding) gets better as it zooms out. Technology (as a form of service) gets better as it zooms in. Science progresses through unifications and technology progresses through diversifications.

  • Both science and technology progress like a jellyfish moving through the water, via alternating movements of contraction (i.e. unification) and relaxation (i.e. diversification). So neither science nor technology can be pictured as a simple linear trend of unification or diversification. Technology goes through waves of standardization for the sake of efficiency and de-standardization for the sake of a better fit. Progress happens because each new wave of de-standardization (magically) achieves a better fit than the previous wave, thanks to an intermittent period of standardization. The opposite happens in science, where each new wave of unification (magically) reaches a higher level of accuracy than the previous wave, thanks to an intermittent period of diversification.

  • Unification is easier to achieve in a single mind. Diversification is easier to achieve among many minds. That is why the scientific world is permeated by the lone genius culture and the technology world is permeated by the tribal team-work culture. Scientists love their offices, technologists love their hubs.

“New scientific ideas never spring from a communal body, however organised, but rather from the head of an individually inspired researcher who struggles with his problems in lonely thought and unites all his thought on one single point which is his whole world for the moment.”
- Max Planck

  • Being the originator of widely adopted scientific knowledge makes the originator powerful, while being the owner of privately kept technological knowledge makes the owner powerful. Hence, the best specimens of unifications quickly get diffused out of the confined boundaries of a single mind, and the best specimens of diversifications quickly get confined from the diffused atmosphere of many minds.

  • Unifiers, standardizers tend to be more masculine types who do not mind being alone. Diversifiers, de-standardizers tend to be more feminine types who can not bear being alone. That is why successful technology leaders are more feminine than the average successful leader in the business world, and successful scientific leaders are more masculine than the average successful leader in the academic world. Generally speaking, masculine types suffer more discrimination in the technology world and feminine types suffer more discrimination in the scientific world.

  • Although unifiers play a more important role in science, we usually give the most prestigious awards to the diversifiers who deployed the new tools invented by the unifiers on tangible, famous problems. Although diversifiers play a more important role in technology, we usually remember and acknowledge only the unifiers who crystallized the vast efforts of the diversifiers into tangible, popular formats.

  • Technological challenges lie in efficient specializations. Scientific challenges lie in efficient generalizations. You need to learn vertically and increase your depth to come up with better specializations. This involves learning-to-learn-new, meaning that what you will learn next will be built on what you learned before. You need to learn horizontally and increase your range to come up with better generalizations. This involves learning-to-relearn-old, meaning that what you learned before will be recast in the light of what you will learn next.

  • Technology and design are forms of service. Science and art are forms of understanding. That is why the intersection of technology and art, as well as the intersection of science and design, is full of short-lived garbage. While all our “external” problems can be traced back to a missing tool (technological artifact) or a wrong design, all our “internal” problems can be traced back to a missing truth (scientific fact) or wrong aesthetics (i.e. wrong ways of looking at the world).

  • Scientific progress contracts the creative space of religion by outright disproval of certain ideas and increases the expressive power of religion by supplying it with new vocabularies. (Note that the metaphysical part of religion can be conceived as “ontology design”.) Technological progress contracts the creative space of art by outright trivialization of certain formats and increases the expressive power of art by supplying it with new tools. (Think of the invention of photography rendering realistic painting meaningless and the invention of synthesizers leading to new types of music.) In other words, science and technology aid respectively religion and art to discover their inner cores by both limiting the domain of exploration and increasing the efficacy of exploration. (Notice that artists and theologians are on the same side of the equation. We often forget this, but as Joseph Campbell reminds us, contemporary art plays an important role in updating our mythologies, and keeping the mysteries alive.)

  • Scientific progress replaces mysteries with more profound mysteries. Technological progress replaces problems with more complex problems.

  • Both science and technology progress through hype cycles: science through how many phenomena the brand-new idea can explain, technology through how many problems the brand-new tool can solve.

  • Scientific progress slows down when money is thrown at ideas rather than people. Technological progress slows down when money is thrown at people rather than ideas.

  • Science progresses much faster during peacetime, technology progresses much faster during wartime. Scientific breakthroughs often precede new wars, technological breakthroughs often end ongoing wars.

dire need for social reform

Look at the history of all mass social traumas. (Rise and fall of feudalism etc.) You will see that they are all preceded by transformative technological and economic disruptions and followed by transformative social and spiritual reforms.

We are going through a similar trauma at the moment. These structural changes can be hard to see while you are inside them, since they manifest themselves in a myriad of details. However, when you go back to evaluate what happened, the picture is always crystal clear. (This evaluation can not be conducted right after the dust settles. You literally need some distance to see what really happened.)

Today we have entered a new phase in the development of the next layer of complexity within the grand narrative of life. (To understand what I mean, read the Emergence of Life post.) This new technological wave is slowly unfolding, but it is probably on par with the industrial revolution, perhaps even a couple of orders of magnitude more powerful. Long story short, our centralized digital brain has finally emerged (i.e. the multi-cloud layer linking up all cloud-based computation and storage resources). This development has already started to have massive effects on our psyches via the infiltration of social media and the penetration of artificial intelligence into our everyday lives. Artists and writers have felt the zeitgeist and are responding to it by writing books and shooting movies to raise social awareness about the possible consequences of the oncoming new technologies.

Clearly, the emergence of the next life forms is a vastly complicated, non-linear process. Nature is giving birth to something new through us and naturally we are the ones who are most affected by this traumatic unfolding.

Today, society as we know it is literally falling apart:

  • Friendship has evolved into an unrecognizable form.

  • Our lives have become so complex (a natural side effect of the emergence) and we expect so much from our life partners that the notion of marriage has morphed into an all-or-nothing form. Divorce rates are skyrocketing, and the whole institution is crumbling under immense weight.

  • Our schools are extremely out-of-date and nobody seems to have the balls, persistence and the vision to reform them. (Hint: Handing out more screens will not solve the problem.) We are not preparing our kids for the challenges they will be facing when they grow up. In fact, we are not even preparing them for today’s challenges. The situation is so ridiculous that I sincerely believe that we would be better off by turning the entire thing off.

  • Our economic and social safety nets are insufficient to cope with the oncoming technological wave. People are feeling left behind and depressed, especially since our current macro structures implicitly ask them to derive the meaning of life from their jobs. (Hint: Handing out more money will not solve the problem.) Only after the epic rise of China (with its top-down, long-term-thinking, centralized, globally-optimized decision making mechanisms) have the business elites in the United States recognized that they actually have social obligations beyond maximizing shareholder value.

And the list goes on…

We need to speed up, otherwise our social reforms will not be able to catch up with the increasing speed and magnitude of technological change. Make no mistake, technology will not slow down for us. The emergence of the next level of life forms is an unstoppable process. If this process collapses, we, as humanity, will collapse along with it. In other words, if we can not give rise to these new life forms, evolution will promptly get rid of us and try again. (Human-level minds will re-emerge somehow, somewhen, somewhere.)

So what are we doing now? Are we reforming?

No.

What type of leaders do you need for preaching social progress and propagating social reform? You need liberal leaders. What have our liberal leaders done? They fucked up badly, really really badly. Now conservatism is coming back with full force everywhere. People are fleeing back to safety, falling back onto old notions, closing in on themselves, turning against each other and against new ideas. And they have every right to do so, because they feel betrayed. They can not pinpoint exactly what went wrong, but they feel that the elites have not done their jobs. And they are absolutely right. Elites chose to mind their own business and think of their own pockets. Most still feel no sense of duty towards society. If they felt any, we would not be in this shit situation today, regressing back in time while technology is marching ahead with no stop in sight.

It will probably take another 20 years before society gives another chance to liberal progressives and opens up to new social reforms. Again, make no mistake, liberals have done this to themselves. They can not cry it out. They need to change. In a world where a substantial majority of the graduates of the most revered university (Harvard College) choose to pursue careers in investment banking and consulting, in a world where the most revered technology leader (Elon Musk) sees the salvation of humanity in a fantasy colonization of Mars, common people will obviously feel betrayed. Our best brains need to be socially conscious. Our best leaders need to be morally sensible. If they will not do the job, society will look elsewhere, just as it is doing now.

There is immense psychological distress at the moment. The people who are supposed to save us are clueless. They do not have the spiritual strength to deal with this new (self-induced) massive attack on our social infrastructures and well-being.

  • Most define their lives through their work, which will soon mostly be rendered irrelevant by artificial intelligence. These ones are hopeless.

  • Some define their lives through their children. These ones will be sacrificing the spiritual health of the children to salvage their own, by making the children serve their own psychological needs.

  • Some, as expected, seek help from science. But the psychologists are clueless about questions of meaning. They have even less of an idea about the deep structural evolutionary causative factors that have led to this mess.

As I said at the beginning, all technological shocks have to be followed by spiritual transformations. We literally need to ask ourselves again what it means to be a human being. To do this at scale, we need a new set of spiritual leaders who can guide us through this new mess we have created. Religion should evolve to stay relevant. Our educated elite is no longer governed by any higher values, simply because they can not find any religious doctrines they can resonate with. (Doctrines meant to address uneducated masses living two thousand years ago will not do the job.)

What may be salvaging us today is a few glimmers of basic humanistic instinct, here and there, a few good people with good common sense in some high-level offices. But this is clearly not enough. You can not solve the greater social challenges we are facing today simply by throwing more love at them. Of course, empathy is necessary for revising and building the superstructures we need, but it is not enough by itself. (It is not even available in sufficient quantity at the moment.)

Salvation will not happen by going to Mars. It will happen through a deep understanding of how evolution works, and through a guided progressive social reform that is not out-of-touch with the new challenges of our times.

My biggest worry is that we are slowing down too much today. Do you know what happens when spiritual guidance and principles of social self-governance fail to keep up with technological progress? Bad decisions and eventually wars! Darkness takes over, and humanity gets hammered until it realigns its values and understands its real priorities.

“The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom.”

- Isaac Asimov

Why do we always have to learn the hard way? We need to understand that this game is getting exponentially more dangerous. We are not playing with swords anymore. After the next world war, there may not be another “phoenix rising from its ashes” story. Of course, as I said before, nature will always rise from its ashes and keep constructing greater complexities and autonomies, but that does not necessarily have to involve us.

gender inequalities

There are many inequalities between the genders. Males, for instance, exhibit greater variation in many traits. Whatever you measure, the top and bottom percentiles always end up being male dominated. (e.g. The smartest as well as the dumbest people - as measured by IQ - tend to be male.) Nature is taking a greater risk with males, who - as a population - pay for this greater variance in the form of a lower average lifespan. (Generally speaking, in complex systems, there is a correlation between the experience of stress and the generation of variance.)
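
The purely statistical part of this claim - that equal means with unequal variances put the higher-variance group in both tails - can be checked with a quick simulation. The numbers below are arbitrary and not calibrated to any real trait.

```python
# Toy check: two groups with the same mean but different variances.
# The higher-variance group dominates both the top and bottom percentiles.
# All numbers are arbitrary illustrations, not estimates of any real trait.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

group_a = rng.normal(loc=100.0, scale=15.0, size=n)  # higher variance
group_b = rng.normal(loc=100.0, scale=12.0, size=n)  # lower variance

pooled = np.concatenate([group_a, group_b])
top_cut = np.percentile(pooled, 99)
bottom_cut = np.percentile(pooled, 1)

share_a_top = (group_a > top_cut).sum() / ((group_a > top_cut).sum() + (group_b > top_cut).sum())
share_a_bottom = (group_a < bottom_cut).sum() / ((group_a < bottom_cut).sum() + (group_b < bottom_cut).sum())

print(f"share of group A above the 99th percentile: {share_a_top:.2f}")
print(f"share of group A below the 1st percentile:  {share_a_bottom:.2f}")
```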

Here is another interesting inequality pattern that has to do with desirability rankings:

Desirability of women is mostly biological and keeps declining. Desirability of men, on the other hand, is socio-biological and peaks around 45. Whereas reproductive capacity declines after a certain age, status accumulation has no theoretical limit.

Here are some further observations:

  • Societies compensate for the drop in biological desirability by providing a status boost for entering motherhood, which is considered a holy institution in many cultures.

  • No wonder grooming for women is such a huge industry. Anything that helps postpone the perception of biological decline is appreciated. But what is the corresponding industry for men? It is get-rich-quick schemes, promising quick sociological boosts. Men are much more susceptible to spam emails, gambling of all sorts, etc. (Tragically, this susceptibility increases with age. Generally speaking, everyone should be de-risking their financial portfolios as they get older.) Status consciousness for men is like beauty obsession for women.

  • The average marriage age happens to be around the time when the desirability of men equals the desirability of women. This is another indirect proof of the generally accepted fact that it is actually women who make the marriage choices. They wait until the last second, making unconscious estimations of the future status trends of their potential husbands. Males often feel duped afterwards, but do not really understand why. At some point some of them quit and return to the marriage pool with much higher status points. (Of course, making such a return is harder for women. That is probably why they cheat less, although they feel an equal amount of temptation.)

  • Majority of germ line mutations come from sperms. (Men go through way too many sperm production cycles in their lifetimes.) In other words, the “quality” of a gene pool gets affected a lot by the older men deciding to have kids late in life. I put the word quality in quotation marks since evolution actually desires to create more variation and try out new things. It is just happens to be easier to do so through the male side. (As pointed out at the beginning of this post, most risk taking is conducted through the males.) Hence, if evolution is any guide, sexiness of older men will keep increasing over time. (Variation wants to increase and will exploit all mechanisms to do so.) Good news for the George Clooney’s of the world.