Early hearing aids sucked. Your options were pretty much limited to asking people to shout, or using one of those giant ear trumpets. The first major advancement seems to have been smaller ear trumpets, shaped like seashells and worn on a headband:
Digital hearing aids started appearing in the 1980s, though early models still required a bulky processor strapped to your chest. But things slowly got better and better with behind-the-ear devices and eventually in-canal hearing aids.
One of our family members wears hearing aids full-time, and modern hearing aids, while still expensive, are pretty impressive. They’re almost invisible, and they sit deep in the ear — you can use them to boost or block out certain frequencies, and if you turn them down, they essentially function as earplugs. They have Bluetooth, so you can listen to music or have your phone calls go straight to your ears. They’re basically just a slightly fancier kind of earbud.
Apple introduced AirPods in 2016. While they are still cheaper than a high-quality pair of hearing aids, that won’t last forever. Eventually these two devices will meet in the middle, and it won’t take until 2050.
This is an especially clear case, but the same thing will happen with lots of assistive technology. Inventions that are meant to restore our senses or abilities will begin to surpass them, and then everyone will benefit from using them, not just people with disabilities. It will happen with hearing aids first, but it’s easy to imagine a world where AR glasses become better than unassisted eyesight, or robotic leg braces end up better than your knees. You can already buy basic assistive exoskeletons for about $900; it’s coming.
Medical Science Realizes that Women are People too
More women’s health problems will be solved, and this will lead to greater understanding of how the human body works in general, since women and men are basically the same except for small differences in the amounts/ratios of their hormones. An obvious example is the role of hormones in thermoregulation — women usually feel colder than men, despite having slightly higher core temperatures and slightly more body fat, and hot flashes are the stereotypical side effect of menopause. This seems kind of weird but everyone just takes it for granted.
(For what it’s worth, men have hormonal cycles too, which are if anything even less well understood.)
Certainly this covers everything about menopause and hormonal rhythms, but women are also more likely to have IBS, arthritis, and celiac disease, and twice as likely to have migraines. About 2/3 of Americans with Alzheimer’s are women. Figuring out why women are more likely to have these diseases will help us treat everyone more effectively, and lead to medical breakthroughs.
Everything Will be on Video
For a long time no one really knew what a tsunami looked like. They strike rarely and without warning, so there isn’t much time for you to send your local landscape painter or a camera crew to the scene. They don’t tend to leave a ton of eyewitnesses — if you’re close enough to get a good look at what’s happening, you’re probably dead. So for a long time, most people imagined a huge cresting wave like the ones you see at a surfing beach, just ten or a hundred times bigger.
The real picture is slightly more complicated (Randall goes into more detail here) but in general he’s right. Do a Google image search for “tsunami” and you’ll see a lot of Photoshopped images of giant cresting waves rising up above major cities.
Giant squid have long been monsters of legend, but the whole 20th century came and went without anyone photographing a giant squid alive. All this changed in (also) 2004, when a Japanese team managed to capture a photograph of a giant squid using a lure. Not long after that, we had video — first on the surface in 2006, and then in its natural habitat in 2012.
The 2020 Beirut explosion caught everyone by surprise. But there were still multiple videos and images available immediately, within minutes, to anyone on Twitter:
You probably heard about the recent volcanic eruption near Tonga. Like Beirut, we immediately had multiple videos within hours. Unlike Beirut, some of these were satellite videos. Partially we point this out to say, you can see this shit from space. But partially we want to emphasize that even satellite video now ends up on Twitter and Reddit in a matter of hours, if not minutes.
This is the world we’re living in. Almost everyone has a video camera in their pocket at all times. This isn’t entirely true in the developing world, but it’s getting more true there all the time. And when the event is something that can’t be captured on your cellphone, like a volcanic eruption visible from space, the footage will make its way to Twitter in a few minutes anyways.
From here on out, anything interesting will be captured on video, and usually that video will be publicly available. When we were looking into the leanest and fattest cities in the US, and learned about the explosion at the Chemtool lithium grease factory in Rockton, IL, we were able to find not one but several videos of the explosion publicly available on the internet. We didn’t even have to look that hard.
“Never seen anything like this” is right, and that’s the byword of the next several decades. This will probably be humdrum by 2050, but between now and then there will be a lot of firsts. Like the first (decent) video of a tsunami in 2004, and the first video of a giant squid in 2006, there will soon be the first video of Halley’s Comet, maybe the first video of an asteroid impact, and of course the first video of the Loch Ness Monster.
So unless we have a total civilizational collapse, from now on expect that all important historical events will be captured on video. By 2050, expect them all from multiple angles, in glaring HD. If Napoleon is brought back to life through the power of cloning, and marches across Europe in 2034, expect to be able to count the pores on his nose in the newsreel footage.
Hoel makes his predictions based on a simple insight: change is incremental, and the minor trends of today are the institutional changes of tomorrow. If you want to know what 2050 will look like, think about the nascent trends of the early 2020s and project them into the future:
If you want to predict the future accurately, you should be an incrementalist and accept that human nature doesn’t change along most axes. … To see what I mean more specifically: 2050, that super futuristic year, is only 29 years out, so it is exactly the same as predicting what the world would look like today back in 1992. … what was most impactful from 1992 were technologies or trends already in their nascent phases, and it was simply a matter of choosing what to extrapolate. For instance, cellular phones, personal computers, and the internet all existed back in 1992, although in comparatively inchoate stages of development. … The central social and political ideas of our culture were established in the 1960s and 70s and took a slow half-century to climb from obscure academic monographs to Super Bowl ads. So here are my predictions for 2050. They are all based on current trends.
We think this approach is really smart. In fact, we like it so much that we wanted to take it for a test drive. In this post, we make our own set of predictions for 2050, using Hoel’s method of picking out trends that we suspect will go on to shape the 2020s, 2030s, and 2040s.
Projects are more fun when you do them with friends, so we invited a bunch of other bloggers to make their own predictions for 2050, using the same approach of extrapolating trends that they think are important today. So far we have predictions from:
Here at SMTM, we’re going to add something to Hoel’s original method of extrapolating “technologies or trends already in their nascent phases”: regression to the mean. What we mean by this is, well — the 20th century was very unusual in many different ways.
A lot of things that we take for granted are really, really new — like 401ks (invented in 1978), Traditional (1974) and Roth (1997) IRAs, and modern credit scores (1989). Indexes like the Dow Jones and the S&P 500 run back several decades, but index funds that track them only appeared in 1972. In 1940, only 5% of US adults over 25 had a college degree and only 25% had a high school diploma. Even income tax wasn’t a permanent part of the US tax system until 1913 — we had to do a whole amendment to the Constitution to make it happen.
Some of these may be here to stay, but looking back from 2050, a lot of 20th century “institutions” will look like a flash in the pan. The trends that are holding will probably hold, but any 20th century abnormalities that seem to be reversing are likely to go back to the way they were for most of human history. A nascent trend that looks like regression to the historical mean is much more likely to be a trend that will continue on to 2050.
We agree with a lot of Hoel’s predictions. A Martian colony, or crewed missions to Mars at least, are looking pretty likely as the price of space travel drops (and he’s not the only one predicting this). We’re also reminded of the recent increasing interest in charter cities and Georgism — Mars would be a great location for your wacky new city and it’s the closest we’re going to get to making all-new land, at least any time soon.
Hoel is clearly right that we will move away from stores, but this might also look like more business done out of people’s homes (as was done historically) or like more business done in something like a marketplace with semi-permanent stalls (as was done historically).
Genetic engineering of embryos to avoid disease is already being done and it does seem like this will happen more and more. Similarly, anti-aging technology is already here and will just keep getting better and cheaper, especially given that Peter Thiel is involved. But this is sort of hard to square with Hoel’s final prediction, that 2050 will be “the winter of my life”, at the age of only 62. It seems a little pessimistic on Hoel’s part. Didn’t you hear that in 2050, 62 will be the new 25?
Sometimes we agree with the general picture, but not with the details. Education will indeed be mostly online (again, it already is), but it will look more radical than what Hoel imagines. The real education giant today is not Harvard, or even MITx, but YouTube, and we will see more of THAT in the future.
Hoel is right that AI will be impactful in day-to-day life, but we think this is true only in the obvious ways. You will still have Siri and Alexa, but you won’t have Data, or even Bender. We might have better image classifiers and even decent chatbots. Strong AI may be a possibility by 2050 (a topic for another time), but by the “extrapolate the future from current trends” technique, in 2050 many classifiers will still have a hard time telling the difference between a dragonfly and a bus. GPT-29 will be able to churn out a movie script as formulaic as that of the average Hollywood scriptwriter (and may well replace them) but it won’t be replacing writing that requires anything as complicated as “themes” or “meaning”.
Hoel predicts wild changes in family structure — specifically, the decline of the family and the rise of the throuple. We agree family dynamics will change, but again we disagree on the specifics. More on this in a minute.
A few predictions we disagree with in general. Hoel predicts that the online mob will create an endless culture war, and that “the future really is female”. But the current culture war is amusingly soft compared to many cultural conflicts in living memory, and the fact that women get a majority of all degrees means very little when you believe that university degrees are worth less and less all the time!
Finally, we disagree that people and culture will become boring. Thanks for reading a pseudonymous mad science blog called SLIME MOLD TIME MOLD.
This first one is less an original prediction than an elaboration on Hoel, who says:
Buzzing drones of all shapes and sizes will be common in the sky (last year Amazon won FAA approval for its delivery drone service, opening the door for this). Small robots will be everywhere, roving the streets in urban areas, mostly doing deliveries.
We agree. Robots will stay dumb but you will see a lot of them, possibly in delivery. It’s hard to look at work from Boston Dynamics and not expect that in 30 years we’ll have lots of quadrupedal robots trotting around our streets, carrying goods and generally acting as porters, footmen, and stevedores.
If you get the price point low enough, small robots might even replace backpacks and suitcases. Boston Dynamics’ robot Spot currently costs $74,500, but 30 years of R&D can do a lot. Let’s take computers — in the early 90s, 1 GB of storage cost about $10,000. But these days you can get a 2 TB drive for about $50, which puts 1 GB at only a few cents. If the same thing happened to Spot, it would cost less than a dollar. We don’t expect anything this drastic, but similar forces could turn quadrupedal robots into household goods. Our bet is on robotic palanquins.
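The arithmetic behind that extrapolation is simple enough to sketch. The prices are the rough figures from above, used purely for illustration, not as a forecast:

```python
# Back-of-the-envelope extrapolation using the rough storage prices above.
cost_per_gb_1992 = 10_000.00      # ~$10,000 per GB in the early 90s
cost_per_gb_now = 50.00 / 2_000   # a $50 drive holding 2 TB (2,000 GB)

# How much cheaper storage got over ~30 years
decline_factor = cost_per_gb_1992 / cost_per_gb_now  # ~400,000x

spot_price_today = 74_500.00  # Boston Dynamics' Spot
# Hypothetical: what Spot would cost if robot prices fell like storage prices
spot_price_2050 = spot_price_today / decline_factor

print(f"Storage got ~{decline_factor:,.0f}x cheaper per GB")  # ~400,000x
print(f"Spot at the same rate: ${spot_price_2050:.2f}")       # $0.19
```

Of course, robots are not hard drives, so the real curve will be far shallower; the point is just how much room there is between $74,500 and "household good."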
It also seems very likely that with 30 more years of R&D, we’ll have ironed out all the last problems with self-driving cars, so expect that kind of robot as well.
More Infectious Disease
For most of human history, infectious disease was a fact of life. As in so many things, the 20th century was an aberration. We developed antibiotics, improved hygiene, even eliminated some diseases altogether. But this pleasant moment in the sun is over. Someone writing in December 2019 might be forgiven for thinking that with our medical knowledge and scientific might, we could defeat any disease that might rise up. But evidently not.
We still have germ theory, so we won’t be sent back to the state of things in 1854. But the future will look more like the past, and we’ll have to start paying attention to disease in the way our ancestors did. As historian Ada Palmer describes, “I have never read a full set of Renaissance letters which didn’t mention plague outbreaks and plague deaths, and Renaissance letters from mothers to their traveling sons regularly include, along with advice on etiquette and eating enough fennel, a list of which towns to avoid this season because there’s plague there.” Embrace tradition with this delicious recipe for Fenkel in soppes.
These days, big universities and medical centers and stuff are responsible for most research. But this is a big deviation from the historical norm. In the past, random haberdashers and architects and patent clerks and high school teachers, or just rich people with too much time on their hands, were the ones doing most of the cutting-edge research.
Doing things to reshape your body and mind is an idea as old as dirt, but with recent advances in technology, and breakdowns in cultural taboos, the practice of what could be called “elective chemistry” is going to take off, probably in the next 10 or 20 years.
Why let nature be the only one who has any say over the chemicals affecting your mind and body? It’s already common to use caffeine, alcohol, and tobacco to reshape your mind. If you’re willing to go out of your way, you can get a psychiatrist to prescribe any number of mind-altering chemicals, and many people today are on Lexapro or modafinil or Adderall or Wellbutrin full-time. And while this is easy enough to do legally, it’s even easier outside the law — many people use psychedelics, steroids, or hormone therapies illegally, to change their minds or bodies as they see fit.
This won’t just become more acceptable for people on the margins of society; it will become mainstream. Cis people are already the largest consumers of hormone therapy and other medical procedures normally associated with trans healthcare (largely because of base rates, but even so). Cis men sometimes go on androgen replacement therapy as they age, and cis women often go on hormone replacement therapy after menopause, which sometimes includes testosterone. And it’s equally easy to use them as mind-altering substances, since they have psychological effects as well as physical ones.
Working out, getting plastic surgery, and taking steroids or hormones are all just forms of body modification. We’ve already come to accept piercings and tattoos, to the point where they’re practically boring. In the near future, most forms of body modification will be unremarkable, in the literal sense that you cannot be bothered to remark on them.
It was also definitely a historical anomaly, and there are already signs of things going back to the way they were. There was a crunch in favor of Europa and her direct offshoots up to the middle of the 20th century, but since 1950 things have been turning around:
The fastest-growing economies in the world are all countries like Bangladesh, Ethiopia, Vietnam, Turkey, and Iran. Brazil is already the 13th-largest economy in the world, Indonesia the 16th, and Nigeria the 27th — all ahead of countries like Ireland (29th), Norway (31st), Denmark (37th), and Portugal (49th). It’s hard to predict who the big winners will be, but it’s clear that Europe will become less and less important, as countries in the rest of the world become major powers.
As wealth and power get more distributed, supply chains will get shorter and less global. Measures of globalization used to increase year after year, but they sputtered in the financial crash of 2008 and never really recovered. COVID has provided another shock, a disruption that is far from over. There isn’t really a trend away from globalization yet, but the trend in favor of globalization has definitely stalled.
There may also be regression to the mean in protectionism. Historically, many states have supported themselves largely through tariffs (see e.g. the history of tariffs in the US), and protectionism may be good for growing economies. If globalization really has stalled for the long term, and certainly if it starts to reverse, we may see more and more tariffs, even a shift in how governments fund themselves. Russia and India have already begun taking steps in that direction, and other countries may follow.
Historically, most people lived in large extended families. The nuclear family, at least as we know it today, is largely an artifact of the unusual circumstances of the 20th century. As income inequality and the cost of buying a home increase, more people will live in large groups — be that group houses, “adult dorms”, or multigenerational homes. COVID has accelerated this trend. More young adults (18 to 29) are living at home now than they were at any point since 1900. The future doesn’t look like Leave it to Beaver, or even The Simpsons.
Part of this will be transitioning back to a system where familial wealth is more important than personal wealth. Historically if your family disowned you, you were screwed. This is why a mainstay of 19th century literature is killing your brother for an inheritance.
And as much as the “kill your brother for the inheritance” thing was a pattern of the upper classes, familial wealth was more important than personal wealth even for peasants (though for peasants, it was sort of more communal wealth than familial).
This is why we agree with Hoel’s prediction of major changes in family structure. We agree that “normal families” are on their way out. But we disagree on nearly all the specifics. We don’t expect to see lots of single-parent homes — we expect more multi-generational homes, group homes, or other arrangements, with lots of adults co-raising children. See e.g. Kelsey Piper’s experience, her main conclusions being “I have no idea how people with two parent households manage” and “I wish we had even more breastfeeding parents”. Put that on a t-shirt: Even More Breastfeeding Parents by 2050.
And instead of seeing a rise in throuples, we expect to see a return of that very old-fashioned arrangement, the Harem — where a person of means has multiple wives, one wife and multiple concubines, etc. etc.
Wage labor becomes less common
Tying yourself to a major employer is still the norm today, but this is changing. Some people will be paid on retainer (i.e. a salary), and some jobs where you really are being paid for your time (e.g. security guard) will still be hourly, but more and more people will be paid to complete specific tasks or deliver a particular result, with no questions asked as to how fast they did it.
We think the gig economy is coming for the rest of the marketplace, but instead of everything being chopped up into little tasks and ruled by corporations (à la Uber), we expect it to look like more contract workers and fewer full-time employees. More people will be self-employed, or will form small companies to deliver goods or services on demand.
We expect this is (mostly) a good thing. People benefit from being their own boss and being able to do the work however they want, as long as they get it done. Being paid to stand around and look busy isn’t good for anyone.
To a historical person, wage labor would be one of the strangest things about the modern world, and the idea of a steady job with benefits would be even stranger. Most people were farmers and almost never had any reason to handle money. Even if you were a potter or a blacksmith, you were paid for your product, the actual bowl or knife you were selling, not for your labor or the hours you worked.
Antibodies to the Outrage Economy
Once upon a time, clickbait was a major annoyance, but it was mostly a problem because people fell for it. The term was coined in 2006, and clickbait was the scourge of the internet for a few years, but by 2014 the cultural immune response was in full swing. The Onion launched ClickHole that year, Facebook started taking steps to squash clickbait, blah blah blah.
Now hardly anyone falls for clickbait, because we learned better. And people are learning again. Hoel is worried that “Social media will ensure an endless culture war and internal social upheaval.” But we’re not worried, because soon we will develop cultural antibodies to the outrage economy, just like we developed cultural antibodies to clickbait, or to evolution vs. creationism debates, or to whatever was blowing up the internet in the 1990s (arguments about Microsoft?).
In fact, we’re already getting there. We used to click on outrageous political stories. Now we think, “They’re rifting me”, and move on without clicking. No one has written the definitive piece on it yet, but “don’t read the news” is a meme that’s gaining steam. We hear it from our friends all the time. People are waking up to the fact that the news will do almost anything to raise your blood pressure, and that freaking out about “the issues of the day” does no one any good.
There will always be some new brainworm that we have to develop cultural antibodies to. And it might be fun to speculate which stupid argument will threaten to tear us apart next. But the outrage economy is on its way out, and the divisions of 2050 will look very, very different from the divisions we see today.
But today the assumption is that everyone uses their real name. This is Mark Zuckerberg’s fault, for pushing real names on Facebook. “You have one identity,” he says in David Kirkpatrick’s 2010 book, The Facebook Effect. “The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly. Having two identities for yourself is an example of a lack of integrity.”
But this will be even more of a flash in the pan than fighting about politics online. Internet anonymity is already coming back into style (hello) and this trend will continue into the future. Most people will have a mix of public and private accounts, pseudonyms, alts, and pen names. As with many of our other “predictions”, this is pretty much true already — what will change is that there will eventually be widespread acknowledgement and acceptance.
This is also a regression to the historical mean. Public anonymity and pseudonymity is a long and esteemed tradition — just ask Voltaire, George Sand, Mark Twain, Lewis Carroll, George Orwell, or Dr. Seuss.
During the American revolution, practically everyone was using a pseudonym. Many of these guys were already famous public figures, and ALSO writing pseudonymous letters. They were having it both ways — they had alts!
Alexander Hamilton, James Madison, and John Jay wrote The Federalist Papers under the name “Publius”. But Ben Franklin was the real master of this — his pseudonyms included not only Richard Saunders of Poor Richard’s Almanack, but also “Silence Dogood”, “Caelia Shortface”, “Martha Careful”, “Anthony Afterwit”, “Miss Alice Addertongue”, “Polly Baker”, and “Benevolus”. We shudder to think what he would have done with even a dial-up connection.
Advances in crypto, VR, AR, and social networking will splinter the web, not unite it. More virtual locations means more places for different identities to thrive — just like how your family group text is different from the Discord channel you have with your friends, is different from your Reddit comments, is different from your LinkedIn profile, is different from the messages you send on Tinder.
Loss of the distinction between “Lowbrow” and “Highbrow”
In fact, the Hugo Awards themselves may be a good example of this trend. Back in 1953 when the Hugos started, fantasy and science fiction (and everything nerdy) were fringe stuff, totally marginal. Today, comic book superheroes dominate the box office and Targaryens are household names.
This trend shows every sign of continuing. Things that are fringe, lowbrow, and popular will continue getting more and more official recognition, to the point that we will eventually lose the distinction between lowbrow and highbrow art altogether. Olympic fencing is already on the same plane as Olympic surfing, and soon there will be no social difference between comic games like Untitled Goose Game by House House and comic operas like Le nozze di Figaro by Wolfgang Amadeus Mozart. If that seems impossibly flippant, remember that Mozart once composed a canon in B-flat major called “Lick me in the ass”.
This is part of why we’re not concerned that people and culture will become boring — cultural forces are constantly driving bizarre, fringe works towards the mainstream, and this trend shows no signs of stopping. Among other things, this will be really good for social mobility.
Minorities as minorities
Saying that America will be a majority-minority country by 2050 is the wrong way of thinking about it. By 2050, we won’t think about minorities in the same way at all; we’ll give up the minority/non-minority framing in favor of something more specific.
The categories that are important now won’t be important in 30 years. Concepts that we take for granted — the idea of being Italian, or German, or even just European — didn’t exist until pretty recently. We expect a return to a sort of negative multiculturalism — everyone is sort of fighting with everyone else, like how all the cities on the Italian peninsula used to go at it without much of a sense of shared Italian identity.
Legacy media struggles to keep up, but race and gender already compete with newer minority identities like subculture and political leaning. Your identity comes from being a goth or a furry, from wearing hiking clothes to the office or a $1,200 Canada Goose jacket on the New York subway in October, from your favorite sports team, the websites you frequent, the author or podcaster you won’t shut up about, the YouTubers you reference, or from being a progressive or a libertarian or an ACAB commie. In many contexts, your status as one of these minorities already matters more than your race, gender, or even sexuality — and online, your meatspace traits barely matter at all.
The True Uses of the Internet are not yet Known
Johannes Gutenberg invented the printing press in 1440, but Martin Luther didn’t publish his 95 theses until 1517. If it takes a new technology 77 years to come into its full strength, we shouldn’t be surprised.
There are a number of dates we could choose for the invention of the internet — the first ARPANET connections in 1969, the TCP/IP standard in 1982, or the first web pages in 1991. Maybe 1993 is the right choice, the year of Mosaic (the first widely used web browser) and of Eternal September, though formal specifications for basic technologies like URLs and HTTP didn’t come until a few years later!
If we do go with 1993, then 77 years later would be the nice round 2070. Maybe the modern world moves a little bit faster than the Protestant Reformation, but anyone who thinks the internet hasn’t lived up to expectations in terms of changing the world should wait a minute. The cutting-edge developments of the early 2020s will come to seem like Jacobus de Voragine’s Legenda aurea — which you’ve probably never heard of; that’s the point. We haven’t seen the internet’s real face yet.