Physics explains why there is no information on social media

Physics dictates machines should minimize entropy, and humans are complying on TikTok, Facebook, Twitter, and other platforms.
Written by Tiernan Ray, Contributing Writer

Anyone who has watched a dozen videos on TikTok with the same dance moves, or read innumerable tweets with the same canned expressions knows that there's very little information on social media. 

That is not an accident -- it is by design. Social media apps are communications channels, but communications of a particular kind. They are designed to transmit an aggregate signal of all the things people are saying, and in so doing, boost advertising revenue. To do so, social media seeks to minimize what is known as entropy, which is basically equivalent to minimizing information. 


Austrian physicist Ludwig Boltzmann was the first to interpret entropy in statistical terms. Over time, a system spreads across its many possible energy states, making it harder and harder to predict the state of matter.  

It all goes back to physics. The Second Law of Thermodynamics, formulated in the nineteenth century, says that the entropy of a closed system increases over time. The Austrian physicist Ludwig Boltzmann gave the first statistical interpretation of the Second Law: over time, the particles in matter spread across their many possible energy states, so that it becomes more and more difficult to predict the state of the matter with certainty.

The classic illustration of entropy is a glass falling and breaking. A broken glass doesn't put itself back together. Things break down and become less certain -- the universe becomes less organized, not more. That is time's arrow, and our experience of life. 

In 1948, the famous Bell Labs scientist Claude Shannon applied Boltzmann's statistical theory to information. In the now famous paper, "A Mathematical Theory of Communication," Shannon wrote that, just like particles in a gas, a message in English can have entropy, by which he meant the many possible ways that the letters of the alphabet can be assembled into words, and words into phrases forming a message.

For example, half of anything we write in English is formed by the laws of English, including grammar, syntax, and spelling, wrote Shannon. The other half is what we freely choose. That free choice is entropic, or information-rich. Information, and entropy, is freedom, wrote Shannon -- our ability to exercise choice within a system of rules. 

"[I]nformation is a measure of one's freedom of choice when one selects a message," as Shannon put it.

Information, in this sense, was a balance -- the right amount of entropy. Too little entropy, and the message to be communicated would be trivial. Too much entropy, and the message could be a chaotic mess.
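Shannon's measure can be made concrete with a few lines of Python. This is an illustrative sketch in modern terms, not Shannon's own notation: it computes the empirical entropy of a string from its character frequencies, in bits per character, and shows that a repetitive message scores lower than varied text.

```python
from collections import Counter
from math import log2

def entropy_bits_per_char(text: str) -> float:
    """Empirical Shannon entropy of a string, in bits per character."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A message with almost no choice in it carries little entropy per symbol...
low = entropy_bits_per_char("aaaaaaab")

# ...while freely varied text carries more.
high = entropy_bits_per_char("the quick brown fox")

print(low < high)  # True
```

Too little entropy and the score approaches zero (a string of one repeated letter scores exactly zero); the more evenly the symbols are spread across the alphabet, the higher the score climbs.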

In an ideal communications channel, said Shannon, the entropy of the message would be received perfectly, and the information would be transmitted. Think of this as the "good" entropy, the exercise of freedom.

But there is a problem in any communications system, said Shannon. In transit, the message is subject to noise in the communications channel: letters can get garbled, and whole phrases can be scrambled as letters jump position or drop out.

The received message, called the "signal," acquires a new kind of entropy, a new uncertainty, and that blocks the message and reduces communication. Call this the "bad" entropy. 

As Shannon put it, and as is all too often forgotten by those who quote Shannon, "It is therefore possible for the word information to have either good or bad connotations.

"Uncertainty which arises by virtue of freedom of choice on the part of the sender is desirable uncertainty. Uncertainty which arises because of errors or because of the influence of noise is undesirable uncertainty."

Boltzmann's statistical approach provided a solution to Shannon's concern. If entropy is a statistical fact, then in theory, entropy could be reduced, and order could be increased. Time's arrow could be reversed. The glass could put itself back together. 

To Shannon, there were two ways to reduce the bad entropy in communications terms. Either one could limit the possible messages that can be sent, or one could apply codes that impose redundancy, such as doubling each character in a word, so the uncertainty is reduced.

Flash forward to social media. Social media is also trying to recover a signal from noise in a communications channel. But what kind of communications channel is social media? 


Bell Labs scientist Claude Shannon founded the theory of information on Boltzmann's statistical interpretation of entropy. Messages contain entropy in a person's free choice of combinations of elements within a set of rules. A received signal contains parasitic entropy in the form of noise, which can complicate the effort to transmit the intended message.  

Konrad Jacobs, Erlangen

It's not a communications channel between people; the problem of how to send a message from one person to another was solved over the seventy years in which Shannon's coding techniques were applied. You rely on them every time you send a text message on your phone. And web pages already let anyone tell people what they think and thereby communicate information. Person-to-person communication was solved long before social media showed up. 

Instead, social media is a communications channel to recover the signal of the messages in aggregate, the totality of messages people send. If all two hundred million active users on Twitter are tweeting all day, or the nearly two billion active users on Facebook are posting, what is the signal that is supposed to come out of all of that?

All the many messages form signals, the prevalence of themes, the amplification of gestures. The total signal could be progressive politics in some cases, conservative politics in another, or football, or computer programming styles -- just about anything. 

The content is not important; what's important is that it amounts to an increasingly clear signal. Whatever the signal is, it is the totality, not the individual messages. Social media is the next-order derivative, if you will, of human communication -- the emergent signal of mass behavior. 

And that's where entropy comes into play. Seen from social's point of view, the good entropy -- the unpredictability of lots of people doing stuff on the Internet -- is also the bad entropy, in the sense that it may make the received signal highly uncertain.

People chattering away are like Boltzmann's ideal gas, where the particles become increasingly hard to predict. Something has to reduce entropy. The glass has to put itself back together. 

As Shannon suggested, there are two choices. Either the messages can become more redundant and predictable, or a coding system can be applied that transforms the messages into something redundant and predictable. 


Shannon conceived of a complete communications system, where messages, chosen freely by people from among a set of rules, go in the system as the input at the source, and are received on the other end as a signal. Shannon was concerned with how to preserve the good entropy of the freely chosen message from being overwhelmed at the receiver in the parasitic entropy of noise.

Claude Shannon, 1948.

Both approaches are in use on social media. Coding is explicitly applied by things such as "like" buttons. In the form of the social graph on Facebook, or Twitter's interest graph, users cluster in patterns that supposedly reveal their true interests. As you click "like" on a post, retweet something, or share a TikTok video, your behavior averages out into a less random set of choices.

Thus, the explicit information -- here's a photograph of my vacation, or, here's my view on the mayoral candidates -- is not the important component on social. What's important is the bucket of interests into which such behavior falls. It's the unconscious, hidden signal behind the individual messages.

But explicit coding is not the only element at play. Humans on social media understand on some level that reducing entropy is important. That's why they voluntarily work with the system to reduce entropy. 

Mindful of likes and follows, humans will choose behavior that reinforces predictability. Verbal tics of the sort "I can't even" on Twitter, conveying an attitude, or the perfect dance move replicated on TikTok, are ways for a person to put themselves in accord with the dominant signal on these social networks. They are examples of people making every message redundant so that the signal comes through the noise. 

People will, of their own choosing, reduce their entropy and align with the machine. Every time someone on Facebook chooses to recirculate something known to produce oohs and ahs, and every time someone prepares the perfect sunset beach photo for Instagram that can be assured to receive "likes," if accompanied by the proper hashtag, it is an instance of self-shaping behavior, the voluntary reduction of entropy, and, hence, the reduction of information.

Memes, the use of a single, recognizable image, are a form of data compression, and also the quintessence of entropy reduction. All the possible ways to speak can be reduced to a visual utterance that is already in circulation among most people. The meme conveys no information precisely because it gives all who see it the predictable behavioral signal they already possess.  
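The compression analogy can be checked directly. In the sketch below (my example, using Python's standard zlib compressor), a repetitive, meme-like message squeezes down to far fewer bytes than varied text of the same length, precisely because it contains less entropy.

```python
import random
import zlib

random.seed(0)

# The same canned phrase, over and over: low entropy, like a meme.
repetitive = ("I can't even. " * 50).encode()

# Varied text of the same length, drawn nearly uniformly at random: high entropy.
alphabet = "abcdefghijklmnopqrstuvwxyz ,."
varied = "".join(random.choice(alphabet) for _ in range(len(repetitive))).encode()

# The predictable message compresses to far fewer bytes than the varied one.
print(len(zlib.compress(repetitive)) < len(zlib.compress(varied)))  # True
```

A compressor and a receiver share the same logic: whatever is already predictable need not be transmitted at all, which is the sense in which the meme conveys no new information.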

Understanding social's role in reducing entropy makes clear some misconceptions about social media.

Numerous people have pointed out that social doesn't just reveal preferences, it shapes them. Roger McNamee, in Zucked, a deeply critical book about Facebook, describes the manipulation of user behavior as central to the company's design. McNamee draws upon the insights of those intimately familiar with the industry, such as Tristan Harris, the former Google design ethicist. 

The terms "growth hacking" and "data voodoo dolls" describe the use of algorithms by Facebook and others to induce people to behave in certain predictable ways.

As powerful as such examinations are, they suffer from a misconception. Such critiques suggest that manipulation is a departure from the true mission of social -- namely, to allow people to come together and communicate.

Seen through the lens of Shannon and Boltzmann, the reduction of entropy via redundant, and therefore predictable, behavior is not an aberration of the communications channel, it's the whole point of the communications machine. As a machine to transmit a signal of intent, social is designed to reduce entropy -- to promote predictability. 

Again, the content is not important. What is important is that content of any kind increasingly forms a clear signal.

And that reveals another important misconception. The typical conception is that social media is a communications channel for people, first and foremost, supported by advertising dollars as a kind of necessary evil. 

But, as profit-driven enterprises, the most important signal social media can recover is not human expression but rather the buying signal of advertisers. When ad buyers, such as large brands, buy "promoted tweets," or Facebook banner ads, the advertisers themselves are revealing their preferences.

Again, with maximum entropy, ad buyers would be randomly placing ads all over the place with about equal frequency. Presented with repetitive, predictable content, ad buying becomes sorted into predictable buckets of the most likely spending. 


The information-rich diversity of chattering humans, left, must be reduced to more organized buckets, right, on social in order to sort out a clear signal for advertising purposes.

Tiernan Ray for ZDNet

If an advertiser wants to place ads against a trending topic, the expressed preference is for that kind of narrowly defined material. Social reduces the uncertainty of advertiser preferences. 

Although ad buyers think of themselves as knowing what they want, social media engineers know that most ad buying is haphazard and scattershot. Most advertisers have a budget, and they press buy and ask questions later. They have very little idea what they're doing. Social provides a way to bring order to that chaotic intent. 

Do advertisers get anything out of all this? Based on the claims of Facebook, Twitter, Snap, and Pinterest, they get greater transparency: advertisers can see their return on investment in metrics such as "reach," and in implied or explicit user intent -- intent to buy, such as an actual purchase, or intent at least to learn more about a product. 

However, to the extent advertisers are being induced into a predictable set of buying buckets, it may be difficult for advertisers themselves to tell what is their own choice and what is their compliance with the machine, just like the users. 

This is especially true during political advertising season, a big period for ad buying on social media, just as it is in offline ads. In the US, social media mainly helps transmit the buying intent behind boosting Republican or Democratic candidates, whether or not it has any actual effect on voting. 

Do humans get anything in the bargain? The signal that is transmitted by social media is not meant for human consumption. It is meant to be plugged into another machine, the advertising buying machine, especially the machine known as programmatic ad buying, which responds reflexively to data. Whether humans enjoy social media, or learn anything from social media, is irrelevant. 

Of course, humans don't feel that way. Anyone who has posted a vacation picture on Facebook feels that they are not merely participating in collective activity, but sharing information, and also conveying meaning. And that may be true on some level.

Without getting into the philosophical implications of shaped, and self-shaping, behavior -- "See how great my vacation is!" -- whether such acts are information-rich is irrelevant to the machine if those utterances don't monetize. Because then they don't help to recover the buying signal of advertising. 

To social media, most human behavior, including your vacation pictures, is just noise.   
