Newsletter Natural Selection

Apparently, Substack wants to destroy newspapers. And maybe that would be good — maybe it would be good for journalism to be democratized, for bloggers to inherit the earth. Of course we’re bloggers and not newspapers, so maybe we’re biased.

Obviously it would be great if someone came up with a set of blogging and newsletter tools that were just amazing, that were the clear front-runner, that outperformed every other platform. We’d love it if the technical problems were all solved and we just had a perfect set of blogging tools.

But if everyone ends up on the same platform, well, that’s kind of dangerous. If one company controls the whole news & blogging industry, they can blacklist whoever they want, and can squeeze users as much as they want.

Even if you think Substack has a good track record, there’s no way they can guarantee that they won’t squeeze their writers once they control the market. Even if you trust the current management, at some point they will all retire, or all die, or the company will be bought out, and then you’re shit outta luck.

Substack just can’t make a credible commitment that makes it impossible for them to abuse their power if they get a monopoly. You have to take them at their word. But since management can change, you can’t even really do that. They just can’t bind their hands convincingly.

But there may be some very unusual business models that would fix this problem. 

On the Origin of Substacks

Imagine there’s a “Substack” company that commits itself to breaking in half every time it gets 100,000 users (or something), creating two child companies. Each company ends up with 50,000 users. All the blogs with even-numbered IDs go to Substack A, and all the blogs with odd-numbered IDs go to Substack B. The staff gets split among these two companies, and half of them move to a new office. Both companies retain the same policy of breaking in half once they hit that milestone again — an inherited, auto-trust-busting mechanism.

(Splitting into exactly two companies wouldn’t have to be a part of the commitment. They could equally choose to break up into Substack Red, Substack Blue, and Substack Yellow: Special Pikachu Edition.)
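The split mechanism is simple enough to sketch in a few lines. This is a toy illustration only — the threshold, the two-way split, the data shape, and the function name are all our own inventions, not anything Substack actually does:

```python
def split_platform(blogs, threshold=100_000):
    """Toy sketch of the auto-trust-busting commitment: once the
    platform hits the threshold, partition blogs by the parity of
    their numeric IDs into two child companies.

    Returns None if the platform isn't big enough to speciate yet,
    otherwise a (substack_a, substack_b) pair of blog lists."""
    if len(blogs) < threshold:
        return None  # keep growing
    substack_a = [b for b in blogs if b["id"] % 2 == 0]  # even IDs
    substack_b = [b for b in blogs if b["id"] % 2 == 1]  # odd IDs
    return substack_a, substack_b
```

Each child company would then run the same rule on itself, so the commitment is inherited down every branch of the family tree.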

In addition, a core part of the product would be high-quality, deeply integrated tools to switch from one of these branches to another. Probably this would involve an easy way to export all your posts and a list of your subscribers to some neutral file format (maybe a folder full of markdown, css, and csv files), and to import them from the same format into a new blog. If you end up in Substack B and you want to be in Substack A instead (your favorite developer works there or something), the product would make it very easy to switch, maybe to the point of being able to switch at the push of a button.
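A neutral export format along these lines might look something like the sketch below — one markdown file per post plus a subscribers CSV. The field names and file layout here are hypothetical, invented for illustration; no real platform uses exactly this format:

```python
import csv
import pathlib

def export_blog(posts, subscribers, out_dir="export"):
    """Hypothetical export to a neutral folder format: one markdown
    file per post, plus subscribers.csv. A sister platform could
    import from the same layout to make switching one-click."""
    out = pathlib.Path(out_dir)
    out.mkdir(exist_ok=True)
    for post in posts:
        md = f"# {post['title']}\n\n{post['body']}\n"
        (out / f"{post['slug']}.md").write_text(md)
    with open(out / "subscribers.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["email"])
        for email in subscribers:
            writer.writerow([email])
```

The point of keeping the format this dumb is that it belongs to no one: any branch of the family tree can read it, which is what makes the switching threat credible.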

To help with this, the third and final commitment of the company, and all child companies, would be to keep the software open-source. Unlike biological evolution, software evolution isn’t siloed. If Substack Air implements a great feature, and the team over at Substack Earth likes it too, they can just go to the open-source code of their sister company, snip that AAAGTCTGAC, and copy it over to their branch.

Each of these child companies would go on to develop their tool from the same starting point, but of course the companies would speciate over time. More competitive branches would get to 100,000 users first and would split again, so there would be more descendants of successful branches. Bad branches would die off or just never grow enough to speciate. 

Because it’s easy to switch, branches that make a bad decision will also face an exodus of users to different branches that don’t suck. Of course, one species of Substack might choose to remove the feature that allows you to switch easily, but this seems like evolutionary suicide. Faced with the prospect of being locked in, most users will switch away if there’s any hint of removing this feature. It’s ok if people decide to stay, of course, things might just get weird.

After several generations of isolation from the main line, bloggers will look like this.

Many branches would die — nature is red in tooth and claw, after all — but many companies die off in the normal course of the economy anyways. And it’s reassuring that there would be an ecosystem of similar, related companies that would be ready to hire on any deserving refugees.

Ghosts Undergoing Mitosis

This would keep one company from taking over the blogging ecosystem and imposing terrible conditions. Or rather, if one lineage did dominate the blogging ecosystem, that could be a good thing, not a danger to free thought and free expression. That lineage would be split up across multiple companies with different leadership styles and different values, and would lack the kind of monopoly that tempts men to evil.

If Substack were our company, we would not only implement this idea, we would emphasize it a lot in our marketing and recruitment — not least because our target demo, bloggers, are smart and paranoid. They want this kind of freedom, ownership, and control, and they’re worried about the fact that current platforms sometimes seem a little power-hungry.

It would take Substack a minute to make this pivot, but other companies could do it right now. In fact, the web publishing platform Ghost is already planning to do something along these lines.

Ghost is already open-source, which is the big requirement to get started immediately. If they developed some quality Ghost-to-Ghost migration tools (uhhh… G2G tools?) and started branching, they could do this tomorrow. But probably they don’t want to follow our plan, for in an amusing display of convergent evolution, they have come up with a very similar plan of their own: they plan to stop growing at 50 employees and let other companies take on the growth from there.

(For more about Ghost’s fascinating business model, see here.) 

John O’Nolan, the founder of Ghost, who apparently lives on a sailboat (mad props), was interviewed by the Indie Hackers Podcast, where he said (go to 41:49 or so):

Interviewer: I think with you, we were just talking about this, I think, a few months ago, you have this other arbitrary constraint, I’m not sure why you have it, it might be like a side effect of the fact that you’re a nonprofit, but you can’t hire more than fifty people, was it? At which point you’re constrained and you have to figure out how to grow and become bigger and better without hiring a single more person than fifty. So where’s that constraint come from? 

John: I love that you brought this up, because it’s something I think more and more about nowadays. We’re coming up on, I think 27 people, so more than halfway there, and the rate at which we’re hiring is increasing, so the kind of fifty-sixty number is very much on the horizon, it’s within sight. And the constraint comes from, I have never worked at a company bigger than that which didn’t have office politics, or disconnection from the mission, or where things kind of stopped being fun. And from all the people we’ve hired over the years, there’s a remarkable amount of refugees, who were at startups, they passed the kind of sixty, seventy person mark, things stopped being fun, middle management came in, the founders sort of left the early team behind, and started pursuing growth goals at the cost of people, and everything just sort of like *sigh* lost what made the journey special, around about that point. 

And there are just so many people who have the exact same story, at a certain point we just said, ok, well, what if we just don’t grow bigger than that, we’ll just stick like not bigger than fifty-ish. Fifty, sixty, somewhere around there. Not going to like, say, be really belligerent about a fixed number, but around that point, what if we just put a line, say, “ok no more”. And… what will that do? 

So, first of all, the same as what I was talking about earlier, it keeps Ghost as a company I’m happy to be stuck with. I want to have a group of fifty or sixty people where I know every single person well — not a large group of strangers who are all just working to a common economic incentive, but a team, a group of people who really know each other deeply and meaningfully, which I think you can still achieve around that size. 

But then, the logical question that follows is, ok, what are the goals of the company once you have fifty or sixty people and you still have ambition? How do you fulfill whatever goals you have that kind of don’t fit into the model of that size of company? And the answer is, you have to change your ambition, or you have to change the model with which you approach your goals.

So, a lot of how I think about Ghost now is less about growing one company — one centralized company — and more about growing a large, decentralized ecosystem. So whereas many-slash-most companies will try to grow bigger, and absorb smaller companies, and kind of be this big blob, consuming more and more of the market to become the holy grail of what everyone wants to become, which is a monopoly that dominates a market, kind of think about the opposite, how can we make Ghost, the products, a really strong and stable core, and then spin off all the other things for which there is demand from the market, but that we don’t have a big enough team to build. 

So maybe that’s community features, or maybe it’s video and media that integrates with Ghost really well, or maybe there’s an enterprise hosting option of people who DO love to get those emails from large companies with a big procurement process and close those deals. If we can have our smaller team make a tight core that enables lots of businesses to exist around Ghost, and around that open-source core, then an ecosystem will evolve around it of multiple economic dependents, and it will probably function similarly to a large company, except that I won’t control all of it, and that’s actually very appealing to me, I don’t want to control all of it, I don’t want to have the final say in how everything should evolve. 

This sounds a lot like the speciation idea we describe above. He even uses the term “ecosystem”! 

There are a few core differences. Limiting the company by the number of employees rather than the number of users might be the better way to go. So in a different version of our proposal, the company could be organized into several teams and the teams could become separate companies once the company has hit 60 employees or something. 

O’Nolan envisions an ecosystem more like Darwin’s finches — related companies that spread out to fill different niches, one for blogging, one for comments, one for video, one for different hosting models, etc. This seems like it would be relatively easy to do, and you can see how a successful company would draw related companies into existence, like a coral reef.

In contrast, we imagine an ecosystem of different companies competing (hopefully friendly competition, but still competition) for the same major niche, like birds and mice all competing for the same nuts and seeds. This seems good because competition will lead to better products, especially given built-in features that let bloggers vote with their feet. It also seems uniquely good in that, if Ghost or Substack or anyone does come to dominate the blogging world, this system will keep them from monopolizing it. 

So we think Ghost should consider not stopping at 50 employees, but undergoing mitosis instead, and splitting into Ghost Day and Ghost Night; or Ghost Sweet and Ghost Sour; or Ghost To and Ghost Fro; or Ghost Claw and Ghost Fang; or Ghost Sound and Ghost Fury; or Ghost Charm and Ghost Strange; or Ghost Video and Ghost Radio; or Ghost Milk and Ghost Honey; or Ghost Rosencrantz and Ghost Guildenstern; or Ghost Migi and Ghost Hidari; or Ghost Ale and Ghost Lager (and Ghost Lambic); or X-Mas Ghosts Past, Present, and Future; or 

*cane reaches out from the wings and pulls us off stage*

Special thanks to our friend Uri Bram for enlightening discussions about the world of online publishing.

Predictions for 2050

Erik Hoel makes a list of predictions for 2050.

This may seem like the far-flung future, but as Hoel points out, it’s only 28 years away. Making predictions for 2050 based on what we see today is just like sitting in the early ‘90s and predicting what the world will look like in the 2020s.

Hoel makes his predictions based on a simple insight: change is incremental, and the minor trends of today are the institutional changes of tomorrow. If you want to know what 2050 will look like, think about the nascent trends of the early 2020s and project them into the future:

If you want to predict the future accurately, you should be an incrementalist and accept that human nature doesn’t change along most axes. … To see what I mean more specifically: 2050, that super futuristic year, is only 29 years out, so it is exactly the same as predicting what the world would look like today back in 1992. … what was most impactful from 1992 were technologies or trends already in their nascent phases, and it was simply a matter of choosing what to extrapolate. For instance, cellular phones, personal computers, and the internet all existed back in 1992, although in comparatively inchoate stages of development. … The central social and political ideas of our culture were established in the 1960s and 70s and took a slow half-century to climb from obscure academic monographs to Super Bowl ads. So here are my predictions for 2050. They are all based on current trends.

We think this approach is really smart. In fact, we like it so much that we wanted to take it for a test drive. In this post, we make our own set of predictions for 2050, using Hoel’s method of picking out trends that we suspect will go on to shape the 2020s, 2030s, and 2040s.

Projects are more fun when you do them with friends, so we invited a bunch of other bloggers to make their own predictions for 2050, using the same approach of extrapolating trends that they think are important today. So far we have predictions from:

Here at SMTM, we’re going to add something to Hoel’s original method of extrapolating “technologies or trends already in their nascent phases”: regression to the mean. What we mean by this is, well — the 20th century was very unusual in many different ways.

A lot of things that we take for granted are really, really new — like 401ks (invented in 1978), Traditional (1974) and Roth (1997) IRAs, and modern credit scores (1989). Indexes like the Dow Jones and the S&P 500 run back several decades, but index funds that track them only appeared in 1972. In 1940, only 5% of US adults over 25 had a college degree and only 25% had a high school diploma. Even income tax wasn’t a permanent part of the US tax system until 1913 — we had to do a whole amendment to the Constitution to make it happen.

Some of these may be here to stay, but looking back from 2050, a lot of 20th century “institutions” will look like a flash in the pan. The trends that are holding will probably hold, but any 20th century abnormalities that seem to be reversing are likely to go back to the way they were for most of human history. A nascent trend that looks like regression to the historical mean is much more likely to be a trend that will continue on to 2050.

Hoel’s Predictions

We agree with a lot of Hoel’s predictions. A Martian colony, or crewed missions to Mars at least, are looking pretty likely as the price of space travel drops (and he’s not the only one predicting this). We’re also reminded of the recent increasing interest in charter cities and Georgism — Mars would be a great location for your wacky new city and it’s the closest we’re going to get to making all-new land, at least any time soon. 

Hoel is clearly right that we will move away from stores, but this might also look like more business done out of people’s homes (like was done historically) or like more business done in something like a marketplace with semi-permanent stalls (like was done historically). 

Genetic engineering of embryos to avoid disease is already being done and it does seem like this will happen more and more. Similarly, anti-aging technology is already here and will just keep getting better and cheaper, especially given that Peter Thiel is involved. But this is sort of hard to square with Hoel’s final prediction, that 2050 will be “the winter of my life”, at the age of only 62. It seems a little pessimistic on Hoel’s part. Didn’t you hear that in 2050, 62 will be the new 25?  

Sometimes we agree with the general picture, but not with the details. Education will indeed be mostly online (again, it already is), but it will look more radical than what Hoel imagines. The real education giant today is not Harvard, or even MITx, but YouTube, and we will see more of THAT in the future. 

Hoel is right that AI will be impactful in day-to-day life, but we think this is true only in the obvious ways. You will still have Siri and Alexa, but you won’t have Data, or even Bender. We might have better image classifiers and even decent chatbots. Strong AI may be a possibility by 2050 (a topic for another time), but by the “extrapolate the future from current trends” technique, in 2050 many classifiers will still have a hard time telling the difference between a dragonfly and a bus. GPT-29 will be able to churn out a movie script as formulaic as that of the average Hollywood scriptwriter (and may well replace them) but it won’t be replacing writing that requires anything as complicated as “themes” or “meaning”. 

Hoel predicts wild changes in family structure — specifically, the decline of the family and the rise of the throuple. We agree family dynamics will change, but again we disagree on the specifics. More on this in a minute.

A few predictions we disagree with in general. Hoel predicts that the online mob will create an endless culture war, and that “the future really is female”. But the current culture war is amusingly soft compared to many cultural conflicts in living memory, and the fact that women get a majority of all degrees means very little when you believe that university degrees are worth less and less all the time!

Finally, we disagree that people and culture will become boring. Thanks for reading a pseudonymous mad science blog called SLIME MOLD TIME MOLD.

SMTM Predictions


This first one is less an original prediction than an elaboration on Hoel, who says: 

Buzzing drones of all shapes and sizes will be common in the sky (last year Amazon won FAA approval for its delivery drone service, opening the door for this). Small robots will be everywhere, roving the streets in urban areas, mostly doing deliveries.

We agree. Robots will stay dumb but you will see a lot of them, possibly in delivery. It’s hard to look at work from Boston Dynamics and not expect that in 30 years we’ll have lots of quadrupedal robots trotting around our streets, carrying goods and generally acting as porters, footmen, and stevedores. 

If you get the price point low enough, small robots might even replace backpacks and suitcases. Boston Dynamics’ robot Spot currently costs $74,500, but 30 years of R&D can do a lot. Let’s take computers — in the early 90s, 1 GB of storage cost about $10,000. But these days you can get a 2 TB drive for about $50, which puts 1 GB at only a few cents. If the same thing happened to Spot, it would cost less than a dollar. We don’t expect anything this drastic, but similar forces could turn quadrupedal robots into household goods. Our bet is on robotic palanquins.
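The storage analogy works out like this as a back-of-the-envelope calculation, using the post’s own rough figures rather than precise market data:

```python
# Rough numbers from the text, not precise market data.
cost_per_gb_1992 = 10_000.0        # ~$10,000 per GB in the early 90s
cost_per_gb_now = 50.0 / 2000.0    # a $50 2 TB drive ≈ $0.025 per GB
decline_factor = cost_per_gb_1992 / cost_per_gb_now  # ≈ 400,000x

spot_price_today = 74_500.0        # Boston Dynamics' Spot
spot_price_extrapolated = spot_price_today / decline_factor
print(f"{decline_factor:,.0f}x cheaper -> Spot at ${spot_price_extrapolated:.2f}")
# A storage-style price decline really would put Spot under a dollar.
```

Of course, robots are atoms and storage is (mostly) a story about shrinking transistors, so nobody should expect the full 400,000x — the point is just how much headroom three decades of cost declines can open up.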

The Witch of the Waste: robotics thought leader.

It also seems very likely that with 30 more years of R&D, we’ll have ironed out all the last problems with self-driving cars, so expect that kind of robot as well.

More Infectious Disease

For most of human history, infectious disease was a fact of life. As in so many things, the 20th century was an aberration. We developed antibiotics, improved hygiene, even eliminated some diseases altogether. But this pleasant moment in the sun is over. Someone writing in December 2019 might be forgiven for thinking that with our medical knowledge and scientific might, we could defeat any disease that might rise up. But evidently not.

This XKCD from 2015 aged kind of poorly. The alt text even says, “at the time of writing it was not readily apparent that the old dog still has some teeth”.

This means more pandemics. Many will become endemic, as will probably happen with COVID. Some existing diseases will become resistant to our best antibiotics. If we’re really unlucky, we will see the return of smallpox or some horrible mystery plague released by the thawing permafrost. (Hoel is also concerned about this.) 

We still have germ theory, so we won’t be sent back to the state of things in 1854. But the future will look more like the past, and we’ll have to start paying attention to disease in the way our ancestors did. As historian Ada Palmer describes, “I have never read a full set of Renaissance letters which didn’t mention plague outbreaks and plague deaths, and Renaissance letters from mothers to their traveling sons regularly include, along with advice on etiquette and eating enough fennel, a list of which towns to avoid this season because there’s plague there.” Embrace tradition with this delicious recipe for Fenkel in soppes.

Citizen Research

These days, big universities and medical centers and stuff are responsible for most research. But this is a big deviation from the historical norm. In the past, random haberdashers and architects and patent clerks and high school teachers, or just rich people with too much time on their hands, were the ones doing most of the cutting-edge research. 

There are already many signs of regression to the mean on this. Anonymous 4channers are publishing proofs of longstanding superpermutation problems on anime boards. The blog Astral Codex Ten (and predecessor blog Slate Star Codex, by the same author) publishes major reviews (“much more than you wanted to know”) on a wide variety of topics — disease seasonality, links between autism and intelligence, melatonin, you name it. Sometimes he even does empirical work — case studies on the effect of CO2 on cognition, large nootropics usage surveys, studies of SSRI usage, etc.

Pseudonymous internet besserwisser Gwern writes long articles on everything from Gaussian expected maximums to generating anime faces with neural networks. Wikipedia, the largest and most-read reference work in history, is written entirely by volunteers. And of course there’s us, Slime Mold Time Mold, creating a book-length original work where we argue for a new theory of the obesity epidemic. 

This is only going to speed up. The 2020s will see a lot more research from people who aren’t in the academy, and by 2050, most of the best scholarship will be done by laypeople.

Elective Chemistry

At some point in the near future, the trends of plastic surgery, nootropics, psychedelic legalization, trans hormone therapy, and bodybuilding will collide, with spectacular results. 

Doing things to reshape your body and mind is an idea as old as dirt, but with recent advances in technology, and breakdowns in cultural taboos, the practice of what could be called “elective chemistry” is going to take off, probably in the next 10 or 20 years. 

Why let nature be the only one who has any say over the chemicals affecting your mind and body? It’s already common to use caffeine, alcohol, and tobacco to reshape your mind. If you’re willing to go out of your way, you can get a psychiatrist to prescribe any number of mind-altering chemicals, and many people today are on lexapro or modafinil or adderall or wellbutrin full-time. And while this is easy enough to do legally, it’s even easier outside the law — many people use psychedelics, steroids, or hormone therapies illegally, to change their minds or bodies as they see fit.

This won’t just become more acceptable for people on the margins of society, it will become mainstream. Cis people are already the largest consumers of hormone therapy and other medical procedures normally associated with trans healthcare (largely because of base rates, but even so). Cis men sometimes go on androgen replacement therapy as they age, and cis women often go on hormone replacement therapy after menopause, which sometimes includes testosterone. And it’s equally easy to use them as mind-altering substances, since they have psychological effects as well as physical ones.

Working out, getting plastic surgery, and taking steroids or hormones are all just forms of body modification. We’ve already come to accept piercings and tattoos, to the point where they’re practically boring. In the near future, most forms of body modification will be unremarkable, in the literal sense that you cannot be bothered to remark on them.  

(This may be extended even further by the development of better prosthetics, like the extra thumb or connecting your brain directly to social media — wait that last one seems like a bad idea.)

Europe becomes less important, regional politics more important, and the world less globalized

Europe was a technological and cultural backwater for most of history. Then, in the 16th century, Europe began a period of explosive growth and development, sometimes called the Great Divergence. There’s a lot of interesting debate as to why this happened, but it definitely did happen. 

It was also definitely a historical anomaly, and there are already signs of things going back to the way they were. There was a crunch in favor of Europa and her direct offshoots up to the middle of the 20th century, but since 1950 things have been turning around:

The fastest-growing economies in the world are all countries like Bangladesh, Ethiopia, Vietnam, Turkey, and Iran. Brazil is already the 13th-largest economy in the world, Indonesia the 16th, and Nigeria the 27th — all ahead of countries like Ireland (29th), Norway (31st), Denmark (37th), and Portugal (49th). It’s hard to predict who the big winners will be, but it’s clear that Europe will become less and less important, as countries in the rest of the world become major powers.

As wealth and power gets more distributed, supply chains will get shorter and less global. Measures of globalization used to increase year after year, but they sputtered in the financial crash of 2008 and never really recovered. COVID has provided another shock, a disruption that is far from over. There isn’t really a trend away from globalization yet, but the trend in favor of globalization has definitely stalled. 

There may also be regression to the mean in protectionism. Historically, many states have supported themselves largely through tariffs (see e.g. the history of tariffs in the US), and protectionism may be good for growing economies. If globalization really has stalled for the long-term, and certainly if it starts to reverse, we may see more and more tariffs, even a shift in how governments fund themselves. Russia and India have already begun taking steps in that direction, and other countries may follow.

Non-nuclear families

Historically, most people lived in large extended families. The nuclear family, at least as we know it today, is largely an artifact of the unusual circumstances of the 20th century. As income inequality and the cost of buying a home increase, more people will live in large groups — be that group houses, “adult dorms”, or multigenerational homes. COVID has accelerated this trend. More young adults (18 to 29) are living at home now than they were at any point since 1900. The future doesn’t look like Leave it to Beaver, or even The Simpsons.

Part of this will be transitioning back to a system where familial wealth is more important than personal wealth. Historically if your family disowned you, you were screwed. This is why a mainstay of 19th century literature is killing your brother for an inheritance.

And as much as the “kill your brother for the inheritance” thing was a pattern of the upper classes, familial wealth was more important than personal wealth even for peasants (though for peasants, it was sort of more communal wealth than familial).

This is why we agree with Hoel’s prediction of major changes in family structure. We agree that “normal families” are on their way out. But we disagree on nearly all the specifics. We don’t expect to see lots of single-parent homes — we expect more multi-generational homes, group homes, or other arrangements, with lots of adults co-raising children. See e.g. Kelsey Piper’s experience, her main conclusions being “I have no idea how people with two parent households manage” and “I wish we had even more breastfeeding parents”. Put that on a t-shirt: Even More Breastfeeding Parents by 2050.

And instead of seeing a rise in throuples, we expect to see a return of that very old-fashioned arrangement, the Harem — where a person of means has multiple wives, one wife and multiple concubines, etc. etc. 

Wage labor becomes less common 

Tying yourself to a major employer is still the norm today, but this is changing. Some people will be paid on retainer (i.e. a salary), and some jobs where you really are being paid for your time (e.g. security guard) will still be hourly, but more and more people will be paid to complete specific tasks or deliver a particular result, with no questions asked as to how fast they did it.  

We think the gig economy is coming for the rest of the marketplace, but instead of everything being chopped up into little tasks and ruled by corporations (à la Uber), we expect it to look like more contract workers and fewer full-time employees. More people will be self-employed, or will form small companies to deliver goods or services on demand. 

We expect this is (mostly) a good thing. People benefit from being their own boss and being able to do the work however they want, as long as they get it done. Being paid to stand around and look busy isn’t good for anyone. 

To a historical person, wage labor would be one of the strangest things about the modern world, and the idea of a steady job with benefits would be even stranger. Most people were farmers and almost never had any reason to handle money. Even if you were a potter or a blacksmith, you were paid for your product, the actual bowl or knife you were selling, not for your labor or the hours you worked.

Antibodies to the Outrage Economy

Once upon a time, clickbait was a major annoyance, but it was mostly a problem because people fell for it. The term was invented in 2006, and clickbait was the scourge of the internet for a few years, but by 2014 the cultural immune response was in full swing. The Onion launched ClickHole that year, Facebook started taking steps to squash clickbait, blah blah blah.

Now, no one reads clickbait because we’ve learned better. People are learning again. Hoel is worried that “Social media will ensure an endless culture war and internal social upheaval.” But we’re not worried, because soon we will develop cultural antibodies to the outrage economy, just like we developed cultural antibodies to clickbait, or to evolution vs. creationism debates, or to whatever was blowing up the internet in the 1990s (arguments about Microsoft?).

In fact, we’re already getting there. There was a time when we used to click on outrageous political stories. Now I think, “They’re rifting me”, and move on without clicking. No one has written the definitive piece on it yet, but “don’t read the news” is a meme that’s gaining steam. We hear it from our friends all the time. People are waking up to the fact that the news will do almost anything to raise your blood pressure, and that freaking out about “the issues of the day” does no one any good.

There will always be some new brainworm that we have to develop cultural antibodies to. And it might be fun to speculate which stupid argument will threaten to tear us apart next. But the outrage economy is on its way out, and the divisions of 2050 will look very, very different from the divisions we see today.

Identity and Anonymity Online

In the early days of the internet, everyone was anonymous — as the old saying goes, “on the Internet, nobody knows you’re a dog.”

But today the assumption is that everyone uses their real name. This is Mark Zuckerberg’s fault, for pushing real names on Facebook. “You have one identity,” he says in David Kirkpatrick’s 2010 book, The Facebook Effect. “The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly. Having two identities for yourself is an example of a lack of integrity.”

But this will be even more of a flash in the pan than fighting about politics online. Internet anonymity is already coming back into style (hello) and this trend will continue into the future. Most people will have a mix of public and private accounts, pseudonyms, alts, and pen names. As with many of our other “predictions”, this is pretty much true already — what will change is that there will eventually be widespread acknowledgement and acceptance.

This is also a regression to the historical mean. Public anonymity and pseudonymity are a long and esteemed tradition — just ask Voltaire, George Sand, Mark Twain, Lewis Carroll, George Orwell, or Dr. Seuss.

During the American Revolution, practically everyone was writing under a pseudonym. Many of these guys were already famous public figures, and ALSO writing pseudonymous letters. They were having it both ways — they had alts!

Alexander Hamilton, James Madison, and John Jay wrote The Federalist Papers under the name “Publius”. But Ben Franklin was the real master of this — his pseudonyms included not only Richard Saunders of Poor Richard’s Almanack, but also “Silence Dogood”, “Caelia Shortface”, “Martha Careful”, “Anthony Afterwit”, “Miss Alice Addertongue”, “Polly Baker”, and “Benevolus”. We shudder to think what he would have done with even a dial-up connection.

Advances in crypto, VR, AR, and social networking will splinter the web, not unite it. More virtual locations means more places for different identities to thrive — just like how your family group text is different from the Discord channel you have with your friends, is different from your Reddit comments, is different from your LinkedIn profile, is different from the messages you send on Tinder.

Loss of the Distinction Between “Lowbrow” and “Highbrow”

In 2018, Kendrick Lamar’s DAMN. became the first rap album to win a Pulitzer; before that, the prize had only ever gone to classical or jazz. At the Tokyo 2020 Olympics (held in 2021), skateboarding made its Olympic debut, and most of the medals went to teenagers. The game Hades from Supergiant Games recently won a Hugo Award. It’s the first video game to do so, but it’s unlikely to be the last.

In fact, the Hugo Awards themselves may be a good example of this trend. Back in 1953 when the Hugos started, fantasy and science fiction (and everything nerdy) were fringe stuff, totally marginal. Today, comic book superheroes dominate the box office and Targaryens are household names.

This trend shows every sign of continuing. Things that are fringe, lowbrow, and popular will continue getting more and more official recognition, to the point that we will eventually lose the distinction between lowbrow and highbrow art altogether. Olympic fencing is already on the same plane as Olympic surfing, and soon there will be no social difference between comic games like Untitled Goose Game by House House, and comic operas like Le nozze di Figaro by Wolfgang Amadeus Mozart. If that seems impossibly flippant, remember that Mozart once composed a canon in B-flat major called “Lick me in the ass”.

High culture

This is part of why we’re not concerned that people and culture will become boring — cultural forces are constantly driving bizarre, fringe works towards the mainstream, and this trend shows no signs of stopping. Among other things, this will be really good for social mobility.

Minorities as Minorities

Saying that America will be a majority-minority country by 2050 is the wrong way of thinking about it. By 2050, we won’t think about minorities in the same way at all — we’ll give up the blunt minority-nonminority framing in favor of something more specific.

The categories that are important now won’t be important in 30 years. Concepts that we take for granted — the idea of being Italian, or German, or even just European — didn’t exist until pretty recently. We expect a return to a sort of negative multiculturalism — everyone sort of fighting with everyone else, like how all the cities on the Italian peninsula used to go at it without much sense of a shared Italian identity.

Legacy media struggles to keep up, but race and gender already compete with other minority identities, like subculture and political leaning. Your identity comes from being a goth or a furry, from wearing hiking clothes to the office, from wearing a $1,200 Canada Goose jacket on the New York subway in October, from your favorite sports team, from the websites you frequent, from which author or podcaster you won’t shut up about, from which YouTubers you reference, from being a progressive or a libertarian or an ACAB commie. In many contexts, your status as one of these minorities already matters more than your race, gender, or even sexuality — and online, your meatspace traits barely matter at all.

The True Uses of the Internet Are Not Yet Known

Johannes Gutenberg invented the printing press in 1440, but Martin Luther didn’t publish his 95 theses until 1517. If it takes a new technology 77 years to come into its full strength, we shouldn’t be surprised.

There are a number of dates we could choose for the invention of the internet — the first ARPANET connections in 1969, the TCP/IP standard in 1982, or the first web pages in 1991. Maybe 1993 is the right choice, the year of the Mosaic browser, the first draft HTML specification, and Eternal September, though formal standards for URLs and HTTP didn’t come until a few years later!

If we do go with 1993, then 77 years later would be the nice round 2070. Maybe the modern world moves a little faster than the Protestant Reformation, but anyone who thinks the internet hasn’t lived up to expectations in terms of changing the world should wait a minute. The cutting-edge developments of the early 2020s will come to seem like Jacobus de Voragine’s Legenda aurea — which you’ve probably never heard of; that’s the point. We haven’t seen the internet’s real face yet.