What Tech tells us about corporate culture

Tech culture is an object of fascination for the public, even though there is little evidence that high-profile Tech companies share a homogeneous set of values. Until 2014, Facebook's internal motto was "move fast and break things". However, like all companies beyond a certain size, Tech giants can't afford to break too many things when they try to move fast, and that is precisely what a corporate culture helps achieve. In reality, there is more to learn about the culture and ingredients of success of an organization from the science underpinning technology than from trying to pin down what Tech culture is.

Earlier this month, I attended a training organized by the HR department of my employer, a large Private Equity firm, on Tech culture and the ingredients of success of high-profile Tech companies. The speaker, Bruce Daisley, who has experience at YouTube and Twitter, walked the audience through countless anecdotal examples to illustrate the different shapes that corporate culture can take at some of the most successful Tech companies, formerly known as the FAANGs. Although there was no real recommendation as to which key ingredients a PE firm should introduce into its cultural recipe, the event was an undeniable success judging by the employee turnout and attention throughout the presentation. This reflects the general hype around the Tech sector despite recent headwinds, and it clearly met a demand for something different from the typical trainings that employees receive during their careers.

The presentation was not so much about disrupting, but rather about remaining relevant (i.e. avoiding disruption) and efficient as a company scales. Regarding scaling, the example of Amazon's internal communication certainly deserved an in-depth focus: according to Daisley, communication between teams must be limited, PowerPoint presentations are banned, and decision-makers receive four-to-six-page memos which they all read at the opening of "silent meetings". As with Google employees' 70/20/10 time allocation, which had limited practical reality internally, Amazon's internal communication may not actually comply with the stated rules; and silent meetings have not prevented large-scale scandals. Yet these guidelines shape Amazon's corporate culture with a view to avoiding distraction from operations.

Amazon is interesting because communication is an explicit constituent of corporate culture whereas, for most organizations, the relationship between culture and communication is implicit. However, in all cases, the stronger the culture, the more efficient the organization. If you are familiar with Information Theory, you probably see how culture drives efficiency in communication.

Information Theory is a sub-field of mathematics developed by the US mathematician and engineer Claude Shannon in the late 1940s, which studies the quantification, storage and communication of information. It has found applications in data compression, cryptography, and error detection/correction in communication. It has been central to the development of all modern communication methods, including communication with objects millions of miles away in space.

If Shannon is sometimes referred to as the father of Tech, it is probably not in recognition of his Information Theory, but of a more fundamental contribution to computing: he came up with the idea that switches in electrical circuits could be used to encode information (with 0 or 1 depending on the switches' positions). He is effectively the inventor of the bit, and the master's thesis in which he described this use-case in 1937 was later dubbed "the most important master's thesis of the twentieth century".

Claude Shannon (1916-2001)
Source: MIT Museum

At the origin of Information Theory, Shannon’s great insight was to establish a link between probabilities and the quantity of information necessary to convey a message. Most people apply Information Theory every day without knowing it: when you send or receive a text message skipping letters (often vowels), the message remains understandable because you can mentally fill the gaps. The gap-filling process is reasonably simple because of the limited number of letters to choose from, and even fewer options to build actual words. The higher the probabilities, the lower the quantity of information needed; and this is exactly how culture improves efficiency. Not only does this apply to corporate culture but also to any common framework, including explicit rules or laws, for the formation of ideas, for communication and ultimately for actions.
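Shannon's insight can be sketched in a few lines of Python: the information content (or "surprisal") of an event is minus the base-2 logarithm of its probability, so highly predictable symbols, like the vowels your reader can mentally fill in, carry almost no information. The probabilities below are illustrative assumptions, not measured letter frequencies.

```python
import math

def surprisal(p: float) -> float:
    """Information content, in bits, of an event with probability p."""
    return -math.log2(p)

# A symbol that context makes almost certain (e.g. the skipped vowel)
# carries a tiny fraction of a bit; a genuinely unexpected one carries many.
print(round(surprisal(0.99), 3))  # predictable symbol: ~0.014 bits
print(round(surprisal(0.01), 3))  # surprising symbol: ~6.644 bits
```

The same formula underpins the whole argument that follows: the higher the probability a shared culture assigns to a message, the fewer bits are needed to convey it.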

In 1950, the US mathematician and philosopher Norbert Wiener stated in his visionary book The Human Use of Human Beings: Cybernetics and Society that "we are not stuff that abides but patterns that perpetuate themselves. A pattern is a message", supporting the idea that humanity is better described by culture(s) than by "each molecule in it", and that Information Theory therefore applies to groups of people.

Wiener is considered the originator of cybernetics, the science of communication as it relates to living things and machines. He investigated Information Theory independently of Shannon around the same time. Wiener was a member of the MIT faculty when Shannon was a student there and, although they had very different approaches, Shannon credited Wiener for some of his work on entropy. Today, Wiener is primarily famous for his work on feedback mechanisms in neural networks with John von Neumann in the late 1940s. This machine-learning approach - highly theoretical at the time - was dropped during the second half of the 20th century due to technical constraints and the unavailability of large training datasets, but big data and technical leaps at the beginning of the 21st century (semiconductors, cloud computing…) allowed the approach to prevail.

Norbert Wiener (1894-1964)
Source: Tekniska Museet, via Flickr

Pushing Information Theory to the extreme, if culture is so strong that you know in advance what is going to happen, no information needs to be conveyed, so you don't even need a message or a meeting. Individuals may have other motivations to keep producing as much data, but no one would learn anything from it, so its information-content would be virtually nil (there would still be information about the issuers' other motivations…).

At this point, you would be right to highlight that using culture to minimize the amount of information processed - basically increasing the odds of certain events over others - contradicts the very idea of diversity. For a Human Resources department, choosing the Tech industry as the backdrop of a training on corporate culture is an interesting bet, because this sector is probably where the tension between efficiency and diversity is the most obvious.

On the one hand, the Tech industry churns out tools that can reduce the expected value of information-content - called entropy in Information Theory - to levels that no corporate culture could ever achieve (the concepts of "masters" and "slaves" are still commonly used to describe interactions in programming). APIs can provide structured data in standard ways, algorithms can encode causal chains and manage workflow processes, models can support decision-making… Technology can make organizations very lean and scalable, so Tech players have many reasons to be tempted to implement some of these techniques to streamline their own operations. The risk is always that, over time, the information processing capacity of both "senders" and "receivers" in the communication chain decreases, either because employees are replaced by others with lower learning capabilities or because existing learning capabilities are simply not nurtured: as high-information-content events (i.e. surprises) become so rare, it is easy to lose sight of the necessity to maintain spare capacity.

On the other hand, given the relatively low barriers to entry and high scalability of many Tech solutions, the fear of falling victim to disruption must keep incumbent Tech players awake at night. The core concern is not to prevent the development of disruptive technologies, but to identify innovations early on, through internal or third-party research, and to integrate new technologies as they mature in order to remain relevant in the long run. By definition, innovations can hardly be expected, so their information-content is high, all the more so if they have disruption potential. The higher the surprise, the larger the information-content.

Two opposite forces are therefore at play: an urge to reduce entropy to increase efficiency, and a need to process the high-information-content events that randomly occur. Although they may be particularly strong in Tech, these two forces affect any organization, community or population. As Wiener put it in Cybernetics and Society: "in control and communication we are always fighting nature's tendency […] for entropy to increase". Here, Wiener refers to the original meaning of entropy in Physics - before Shannon adopted the term for his science of communication - which measures disorder, randomness or uncertainty. Entropy is critical in thermodynamics, for example. The second law of thermodynamics explains that, without external input (i.e. for a given internal energy), the entropy of an isolated system only evolves one way: it increases over time until an equilibrium is reached at maximum chaos.
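In Shannon's framework, "maximum chaos" has a precise counterpart: entropy is highest when every outcome is equally likely, and a strong culture that skews the odds pulls it down. A minimal sketch, with illustrative distributions:

```python
import math

def entropy(dist: list[float]) -> float:
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A strong culture skews the odds of what happens next: low entropy.
skewed = [0.97, 0.01, 0.01, 0.01]
# "Maximum chaos": every outcome equally likely, entropy at its maximum.
uniform = [0.25, 0.25, 0.25, 0.25]

print(round(entropy(skewed), 2))   # ~0.24 bits
print(round(entropy(uniform), 2))  # 2.0 bits (log2 of 4 outcomes)
```

The two forces of the paragraph above are visible here: efficiency pushes towards the skewed distribution, while surprises, by definition, live in the flatter one.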

Time and entropy

Incidentally, increasing entropy may be the only evidence of a direction of time in Physics - at least according to the theory of relativity, which implies that our sense of the passage of time is just "an illusion", to use Albert Einstein's words. This link between time and entropy also seems relevant in the context of Culture, notably in the organization of populations or communities. Was it Hobbes' intuition when he described his Leviathan (1651) as an artificial man whose limbs "move by springs and wheels as [does] a watch"?

In any case, this echoes another reality of Tech giants, who can accelerate time or slow it down according to their own "master" clocks driving "slaves" across the world to synchronize processes (database updates, script runs in sequences…) before serving end-users¹.

Note that we are juggling two concepts of entropy - entropy in Physics and entropy in Information Theory - that both speak to diversity but are not at all equivalent. We used Culture as a bridge between the two; a holistic approach that Wiener was comfortable with, whilst Shannon was reluctant to generalize. As long as mere particles are concerned, the two concepts are almost synonymous, but living organisms' ability to learn leads to a major contradiction: once something completely unexpected - meaning a perceived probability of 0 - has been experienced, it is no longer completely unexpected. Even if the new probability assigned is 0.000001% instead of 0%, the information-content decreases, so entropy, in the Information Theory sense, decreases. And this is incompatible with the second law of thermodynamics.
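The learning effect can be made concrete: once a learner revises a probability upward after a first occurrence, any repeat of the same event carries fewer bits. The probabilities below are illustrative assumptions about a "black swan" event, not data.

```python
import math

# Illustrative probabilities a learner might assign to an unexpected event
# before ever observing it, and after observing it once.
p_before = 1e-9   # effectively "cannot happen"
p_after = 1e-6    # rare, but now part of experience

bits_before = -math.log2(p_before)
bits_after = -math.log2(p_after)

# The same event, repeated, now carries fewer bits of information:
print(round(bits_before, 1))  # ~29.9 bits
print(round(bits_after, 1))   # ~19.9 bits
```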

Although most people would immediately associate intelligence with the ability to learn, the latter is only a necessary condition for the former; it is not a sufficient one. The ability to learn and Life are intrinsically related, as Biology shows, even in very unintelligent forms. Bacteria constitute one of the first and simplest forms of life, as basic as a single cell without a nucleus and often with a single chromosome. They reproduce through binary fission, effectively producing clones. Yet they "learn", as evidenced by their genome, which acts as a record of the "surprises" they have encountered in the past: CRISPR-Cas9, the genome-editing technology now deployed in the Pharma industry, is a natural mechanism used by bacteria to integrate portions of DNA from viruses that infected them, thereby developing immune defenses for the future. Immune systems that rely on dynamically encoding new information in DNA in order to produce specific proteins the next time a similar threat comes up demonstrate an ability to "learn" that is common to most forms of life, although the practical mechanisms can vary substantially.

Anyone interested in information technologies must stand in awe of Nature's DNA/RNA machinery. But this machinery is prone to errors that continuously alter the code and keep Life on Nature's path towards chaos (according to thermodynamics). Random mutations modify existing genes - a relatively small number of sequences of nucleotides that encode proteins - and de novo genes emerge out of random mutations on non-coding DNA sequences, which represent the vast majority of DNA. In addition, there is the interesting case of "horizontal" transfers, whereby genetic information is transferred from one species to another via intermediaries, generally bacteria or [retro]viruses that can pick up material from - and insert material into - their hosts' genomes. It is estimated that up to 8% of the human genome is made up of leftovers of retroviruses, although humans do not [naturally] use CRISPR-Cas9.

Organism is opposed to chaos, to disintegration, to death, as message is to noise. To describe an organism, we do not try to specify each molecule in it, and catalogue it bit by bit, but rather to answer certain questions about it which reveal its pattern: a pattern which is more significant and less probable as the organism becomes, so to speak, more fully an organism

Norbert Wiener, 1950

Walking in Wiener's philosophical footsteps, the direction of travel is set by Nature, and the complexity of any form of life may be measured by how much it can slow the travel down. The intensity of resistance is the pattern that Wiener described. To that end, Life's toolbox is not limited to various degrees of ability to learn; sexual reproduction has also proved very efficient at reducing the probability that a mutation is passed on to the next generation unless it brings significant advantages. By analogy with attractive and repulsive forces in planetary systems, one idea of sustainability could be an equilibrium between the two opposite forces affecting entropy: if it increases too much, the group may not hold together; but if it decreases too much, the group could collapse on itself.

Biology provides numerous examples of extremely low entropy, which characterizes species that are ripe for disruption. Among them, one fruit perfectly illustrates the risks associated with the maximization of efficiency. The banana is a peculiar fruit: it is parthenocarpic, which means that no pollination is needed to produce fruit. This makes banana production relatively attractive, as visibility on yield is higher than for typical crops.

However, in their original forms, most banana varieties produce fruits with large seeds when pollinated, which makes them inedible. So bananas have been bred to produce very small seeds (today, most people don't even realize that there are seeds in bananas), and these small seeds largely prevent the species from propagating naturally: seeds rarely germinate. Producers get around this issue by creating clones of a tree using tissue from the rhizome, a type of underground stem. Cloning trees has the additional benefit of producing consistent fruits, which merchants love (most people also don't realize that all the bananas they eat, even several months or years apart, are clones). If, on top of that, you manage to select a variety that grows dense bunches of bananas with thick peel, you also make the bananas resistant and easy to ship. As a result of this domestication process, one type of banana called Gros Michel became dominant during the 19th century and, for over 100 years, genetically identical bananas were consumed across the world.

Banana Seeds
Source: Mkumaresa, via Wikimedia Commons

Clones all share the same strengths… and the same weaknesses. This is the reason why it is highly unlikely that you know what a Gros Michel banana looks like, let alone what it tastes like. The entire variety was wiped out by a fungal disease - the Panama disease - starting at the very end of the 19th century. Producers switched to another variety, called Cavendish, with similar characteristics (small seeds, thick peel, but a different taste) and apparently resistant to Panama disease. Representing over 40% of global production and the majority of exports to developed countries, the Cavendish has been the dominant banana since the 1950s. And again, all these bananas are virtually genetically identical. But the rise of the Cavendish was only a temporary relief for the $8bn banana industry: a new strain of Panama disease started attacking the Cavendish in Asia in 2008, and the fungus could not be contained. It reached Latin America in 2019 and is effectively present on all continents, with no known treatment. Commercial extinction of the Cavendish banana is now considered a likely outcome.

To the question of optimal corporate culture, the story of the Gros Michel and Cavendish bananas should serve as a reminder that diversity is not only about the inclusion of those who were previously left out; it is also about strengthening the foundations of long-term survival. In other words, the right amount of diversity is needed, and this principle has been a core ingredient of complex life: sexual reproduction does not only prevent mutations from spreading, it also mixes genetic information, adding randomness as each parent provides only half of their own chromosomes.

For a community, an organization or a population, a strong culture undoubtedly contributes to increasing efficiency, even more so if it is supported by explicit rules and/or technology. As for Life, it seems that the more an organization can demonstrate its ability to control time, i.e. to control entropy, the more advanced it is. But getting exposure to "surprises" that carry a lot of information seems essential, and it cannot be a one-time thing: as we humans are pretty good at learning, re-introducing diversity must be a constant effort. Facilitating constant exposure of employees to randomness in the workplace justifies initiatives like Google's 70/20/10 time allocation. But it is not enough: the "receivers'" capacity to process messages with high information-content, i.e. the ability to understand and learn, must also be maintained and nurtured over time, otherwise the diversity efforts would be pointless. That is an important caveat as we enter a technological era where more and more of the learning will be outsourced to machines.


¹ In Atlas of AI (2021), Kate Crawford describes how Google's TrueTime creates a proprietary universal time that can be adjusted to account for network latency.


Julius Stratton, Norbert Wiener, Claude Shannon
Source: MIT Museum