A Chemical Hunger – Part II: Current Theories of Obesity are Inadequate

[PART I – MYSTERIES]

Current theories of the obesity epidemic are inadequate. None of them hold up to closer scrutiny, and none can explain all of the mysteries mentioned in Part I. Yet those mysteries are real: genuine, puzzling facts about the obesity epidemic that any successful theory will have to account for.

You’re probably familiar with several theories of the obesity epidemic, but there is strong evidence against all of them. In this section, we focus on the case against a couple of the most popular theories. 

2.1    Calories In, Calories Out

A popular theory of obesity is that it’s simply a question of calories in versus calories out (CICO). You eat a certain number of calories every day, and you expend some number of calories based on your metabolic needs and physical activity. If you eat more calories than you expend, you store the excess as fat and gain weight, and if you expend more than you eat, you burn fat and lose weight.

This perspective assumes that the body stores every extra calorie you eat as body fat, and that it doesn’t have any tools for using more or less energy as the need arises. But this isn’t the case. Your body has the ability to regulate things like its temperature, and it has similar tools to regulate body fatness. When we look closely, it turns out that “calories in, calories out” doesn’t match the actual facts of consumption and weight gain.

“This model seems to exist mostly to make lean people feel smug,” writes Stephen Guyenet, “since it attributes their leanness entirely to wise voluntary decisions and a strong character. I think at this point, few people in the research world believe the CICO model.”

It’s not that calories don’t matter at all. People who are on a starvation diet of 400 calories per day will lose weight, and as we will see in this section, people who eat hundreds of calories more than they need will usually gain weight. The problem is that this ignores how the body accounts for the calories coming in and going out. If you don’t eat enough, your body finds ways to burn fewer calories. If you eat too much, your body doesn’t store all of the excess as fat, and compensates by making you less hungry later on. Calories are involved in the math but it’s not as simple as “weight gain = calories in – calories out”.
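The gap between the naive arithmetic and a model with compensation is easy to sketch numerically. The numbers below are purely illustrative assumptions (the classic ~3,500 kcal-per-pound figure and an invented 80% compensation fraction), not measured values:

```python
# Naive CICO: every surplus calorie is stored as fat (~3500 kcal per lb).
def naive_gain(surplus_per_day, days, kcal_per_lb=3500):
    return surplus_per_day * days / kcal_per_lb

# With compensation: the body offsets most of the surplus by burning more
# and eating less later. The 80% fraction is an illustrative assumption.
def compensated_gain(surplus_per_day, days, compensation=0.8, kcal_per_lb=3500):
    return (1 - compensation) * surplus_per_day * days / kcal_per_lb

print(naive_gain(500, 365))        # ~52 lbs per year of surplus eating
print(compensated_gain(500, 365))  # ~10 lbs per year
```

The naive model predicts that a modest daily surplus compounds into enormous weight gain, which is not what overfeeding studies observe; a compensating model comes much closer to the data.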

[Edit: We’ve added an interlude clarifying this section on CICO in response to reader questions and objections. If you have objections to anything below, read the interlude here because we probably address it!]

2.1.1    Common Sense

First, we want to present some common-sense arguments for why diet and exercise alone don’t explain modern levels of obesity.

Everyone “knows” that diet and exercise are the solution to obesity. Despite this, rates of obesity continue to increase, even with all the medical advice pointing to diet and lifestyle interventions, and a $200 billion global industry devoted to helping people implement these interventions. It’s not that no one is listening. People are exercising more today than they were 10 or even 20 years ago. Contrary to stereotypes, more than 50% of Americans meet the HHS guidelines for aerobic exercise. But obesity is still on the rise.

It’s true that people eat more calories today than they did in the 1960s and 70s, but the difference is quite small. Sources have a surprisingly hard time agreeing on just how much more we eat than our grandparents did, but all of them agree that it’s not much. Pew says calorie intake in the US increased from 2,025 calories per day in 1970 to about 2,481 calories per day in 2010. The USDA Economic Research Service estimates that calorie intake in the US increased from 2,016 calories per day in 1970 to about 2,390 calories per day in 2014. Neither of these are jaw-dropping increases.

If we go back further, the story actually becomes even more interesting. Based on estimates from nutrient availability data, Americans actually ate more calories in 1909 than they did in 1960.

Finally, there are many medical conditions that cause obesity: Prader-Willi syndrome, a genetic disorder characterized by intense hunger and resulting obesity; hypothyroidism, an endocrine disorder in which people experience loss of appetite yet still gain 5-10 pounds; and lesions to the hypothalamus, which often lead to intense weight gain, sometimes accompanied by great hunger but often not.

2.1.2    Scientific Evidence

In addition to these common-sense objections, decades of research suggests that diet and exercise are not to blame for rising rates of obesity.

Studies of controlled overfeeding — you take a group of people and get them to eat way more than they normally would — reliably find two things. First, a person at a healthy weight has to eat huge amounts of calories to gain even a couple pounds. Second, after the overfeeding stops, people go right back to the weight they were before the experiment.

The great-grandaddy of these studies is the Vermont prison experiment, published in 1971. Researchers recruited inmates from the Vermont State Prison, all at a healthy weight, and assigned some of them to eat enormous amounts of food every day for a little over three months. How big were these meals? The original paper doesn’t say, but later reports state that some of the prisoners were eating 10,000 calories per day.

On this Olympian diet, the prisoners did gain considerable weight, on average 35.7 lbs (16.2 kg). But following the overfeeding phase of the study, the prisoners all rapidly lost weight without any additional effort, and after 10 weeks, all of them had returned to within a couple pounds of their original weight. One prisoner actually ended up about 5 lbs (2.3 kg) lighter than before the experiment began!

Inspired by this, in 1972, George Bray decided to conduct a similar experiment on himself. He was interested in conducting overfeeding studies, and reasoned that if he was going to inflict this on others, he should be willing to undergo the procedure himself. First he tried to double each of his meals, but found that he wasn’t able to gain any weight — he simply couldn’t fit two sandwiches in his stomach at every sitting.

He switched to energy-dense foods, especially milkshakes and ice cream, and started eating an estimated 10,000 calories per day. Soon he began to put on weight, and gained about 22 lbs (10 kg) over 10 weeks. He decided this was enough and returned to his normal diet. Six weeks later, he was back at his original weight, without any particular effort.

In both cases, you’ll notice that even when eating truly stupendous amounts of food, it actually takes more time to gain weight than it does to lose it. Many similar studies have been conducted and all of them find basically the same thing — check out this recent review article of 25 studies for more detail.

Overfeeding in controlled environments does make people gain weight. But they don’t gain enough weight to explain the obesity epidemic. If you eat 10,000 calories per day, you might be able to gain 20 or 30 pounds, but most Americans aren’t eating 10,000 calories per day.

We can compare these numbers to the increases in average calories per day we reviewed earlier. Sure, consumption in the US went from 2,025 calories per day in 1970 to 2,481 calories per day in 2010, a difference of 456 calories. But consider Poehlman et al. (1986), where researchers fed a group of 12 men 1,000 extra calories a day for 22 days. On average the men gained about 5 lbs (2.2 kg), but some of them actually lost weight instead.

And it’s not as though these participants are eating 1,000 extra calories of celery and carrots. In one study, the extra calories came from “sherbet, fruit juices, margarine, corn oil, and cookies”. But the content doesn’t seem to matter very much. Another study compared overfeeding with carbohydrates (mostly starch and sugar) and overfeeding with fat (mostly dairy fat like cream and butter). The two groups got their extra calories from different sources, but they were overfed by the same amount. After two weeks, both groups gained the same amount of fat, 3.3 lbs on average. A similar study overfed volunteers by 1,194 calories on either a high-carb or a high-fat diet for 21 days. Both groups gained only about 2 lbs of fat.

The fact that many of these are twin studies provides even more evidence against CICO. In groups of twins all overfed by the same amount, there is substantial variation in how much weight individual participants gain. Some people gain a lot of weight; others gain almost none. But each person gains (or loses!) about the same amount of weight as their twin. In some cases these correlations are substantial, as high as r = 0.90. This strongly suggests that genetics plays a large role in determining how the body responds to overfeeding.

The story with exercise is the same as with overeating — it makes a difference, but not much. One randomized controlled trial assigned overweight men and women to different amounts of exercise. More exercise did lead to more body fat loss, but even in the group exercising the most — equivalent to 20 miles (32.0 km) of jogging every week for eight months — people only lost about 7 lbs.

You might think that hunter-gatherers have a more active lifestyle than we do, but this isn’t always true. The Kitavans examined in 1990 by Staffan Lindeberg were only slightly more active than westerners, had more food than they knew what to do with, and yet were never obese. “Many Westerners have a level of physical activity that is well within the range of the Kitava population,” he wrote. “Hence, physical activity does not seem to explain most of the differences in disease pattern between Kitava and the Western world.”

A recent meta-analysis of 36 studies compared the effects of interval training exercise with more traditional moderate-intensity continuous training. The authors call interval training “the magic bullet for fat loss” (this is literally in the title) and trumpet that it provides 28.5% greater reductions in total absolute fat mass than moderate exercise. But what they don’t tell you is that this is a difference between a loss of about 3 lbs and about 4 lbs, for an exercise program running 12 weeks. Needless to say, this difference isn’t very impressive. Other meta-analyses find similar results: “neither short-term HIIT/SIT nor MICT produced clinically meaningful reductions in body fat.”

Maybe diet and exercise together are worth more than the sum of their parts? Sadly this doesn’t seem to be the case either. If anything, when combined they are worth less than the sum of their parts. One meta-analysis comparing interventions based on diet, exercise, and diet plus exercise found that people lost about 23.5 lbs (10.7 kg) on diets, 6.4 lbs (2.9 kg) on an exercise regime, and 24.2 lbs (11.0 kg) on diet plus exercise. After a year, diet plus exercise was down to 18.9 lbs (8.6 kg). Other meta-analyses are more tempered, for example, finding a loss of about 3.6 lbs (1.6 kg) after two years of diet plus exercise interventions. Again this is more weight loss than zero, but it clearly rules out diet plus exercise as an explanation for the obesity epidemic. People in 1950 were a lot leaner than they are now, but it’s not because they ate less and exercised more.

2.2    Good Calories and Bad Calories

Ok, calories themselves may not be the villain here. But maybe it’s not that we’re eating more than we used to — maybe it’s that we’re eating differently. Maybe one particular macronutrient or source of calories is to blame.

2.2.1    Dietary Fat

Dietary fat seems like a possible culprit. After all, fat makes you fat, right? Turns out it’s not so simple.

To begin with, fat consumption has actually fallen over the past few decades, while obesity has skyrocketed. This isn’t consistent with an explanation where dietary fat leads to obesity.

Plenty of cultures eat extremely high-fat diets and remain very lean indeed. You’ll remember that the Maasai diet is about 3,000 calories per day, and 66% of that is from fat. But the Maasai don’t suffer from obesity. In fact, Kalahari Bushmen love fat and apparently wax poetic about it.

Clinical results agree: dietary fat doesn’t have much of an impact on long-term weight. Putting people on a low-fat diet reduces their weight in the short term, but in trials lasting for longer than one year, they tend to return to normal. When they are directly compared, low-fat and high-fat diets have about the same impact on weight loss.

This is a little difficult to square with animal studies that find that a high-fat diet reliably leads to obesity in monkeys, dogs, pigs, hamsters, squirrels, rats, and mice. It could just be that humans are not monkeys, dogs, pigs, hamsters, squirrels, rats, or mice, and that while dietary fat has an adverse effect on these species, it doesn’t do much to us. Some of the hamster studies, for example, induced obesity simply by giving the hamsters extra sunflower seeds, a phenomenon not observed in humans. Pigs, in particular, will become obese even on low-fat diets when given the opportunity.

We even see differences within a specific kind of animal. The same high-fat diet will make one species of hamster (Syrian hamsters) obese and leave another species of hamster (golden hamsters) merely chubby. If the findings can’t generalize between different species of hamsters, we shouldn’t expect them to generalize to humans.

It could also be that dietary fat leads to obesity in mammals held in captivity, possibly due to factors like stress. Metabolic ward studies restrict your movement, but it’s not exactly like living your whole life in a laboratory cage. And it’s worth noting that about 10–15% of macaque and rhesus monkeys in captivity become obese when they reach middle age, despite the fact that they are fed a relatively low-fat (10% of energy) diet.

In any case, it’s hard to square a fat-based explanation for the obesity epidemic with the fact that fat consumption hasn’t increased in step with the rise of obesity and the fact that low-fat diets don’t lead to much weight loss.

2.2.2    Carbohydrates

Ok, maybe fat doesn’t make you fat. How about carbohydrates? All this bread can’t be good for us.

This theory is dead on the starting line, though, because as obesity has gone up, consumption of carbohydrates has gone down (see figure).

Overall carbohydrate intake from 1980 to 2010 in the United States. Figure prepared by Stephan J Guyenet, reproduced with permission.

This is enough to make it clear that carbohydrate consumption isn’t driving the obesity epidemic, but we can take a slightly closer look anyways, just to be sure.

Eating lots of carbs can actually make you lose weight. High-carbohydrate diets cause weight loss, even when not restricting calories. A study from 2003 examined low-fat diets in 16 overweight people. Naturally, this low-fat diet was high in carbohydrates. When patients started the low-fat diet and were told to eat as much as they wanted, they actually ate 291 calories less per day.

But their carbohydrate intake increased, from 253 grams per day to 318 grams per day. On this diet they lost 8 lbs (3.8 kg) on average over a 12-week period. In the DIETFITS randomized controlled trial, 609 people fed a whole-food, high-carbohydrate diet lost 12 pounds (5.3 kg) over one year, not significantly different from the 13 pounds (6.0 kg) of weight lost on a whole-food low-carbohydrate diet. The high-carbohydrate diet also supplied about 1.5 times as much sugar as the low-carbohydrate diet.

The residents of Kitava, mentioned earlier, have a diet of starchy roots and tubers. Almost 70% of their calories come from carbohydrates, but they don’t suffer from obesity, diabetes, or heart disease. 

(Lindeberg also says: “The long primate history of fruit eating, the high activity of human salivary amylase for efficient starch digestion, and some other features of human mouth physiology … suggest that humans are well prepared for a high carbohydrate intake from non-grain food sources. … in contrast to most other animals including non-human primates, humans have an exceptional capacity to produce salivary amylase in order to begin hydrolysis of starch in the mouth.”)

In general, cultures with very high intakes of carbohydrate tend to be lean. Most agricultural societies around the world have a diet that is high in carbohydrates and low in fat. Agricultural societies are different from industrialized ones in many ways, of course. But even in those agricultural cultures with abundant food, people are typically lean, with low rates of diabetes and cardiovascular disease.

This is true even if the carbohydrate is white rice. In Japan, white rice is a primary staple food (p. 338), and has been for a long time. About 62% of the Japanese diet is carbohydrates, and most of this is white rice. Despite this, Japanese rates of obesity have been, and continue to be, the lowest of any industrialized nation.

In fact, people who move from Japan to the US and begin eating less white rice become much heavier. This suggests that the difference isn’t simply genetic. These immigrants do end up eating a diet much higher in fat — but of course, from the previous section, we’ve seen that fat can’t be responsible for this change.

Nor is it likely to be some other carbohydrate staple. Wheat consumption, for example, has been falling for a century. People in the US ate almost twice as much wheat (primarily in the form of bread) in the 1880s as they do today. If wheat were responsible, people would have been massively obese during Reconstruction and entirely lean today. Obviously that is not what we observe.

If the historical data isn’t enough for you, there are entire reviews devoted to the health impacts of wheat, pretty conclusively showing that it isn’t a cause of obesity.

2.2.3    Sugar

Everyone knows that added sugar is the real villain, right? Wrong again.

Sugar consumption has been declining for 20 years in the US, while obesity and diabetes rates have increased. The sugar data in the figure below includes all added sugars such as honey, table sugar, and high-fructose corn syrup, but doesn’t include sugars naturally occurring in fruits and vegetables.

 Overall sugar intake from 1980 to 2013 in the United States. Figure prepared by Stephan J Guyenet, reproduced with permission.

We see something similar in what has been called The Australian Paradox: obesity in Australia nearly tripled between 1980 and 2003, while sugar consumption dropped 23%.

Multiple lines of evidence confirm that sugar consumption is falling worldwide. In the US, consumption of sugary beverages dropped between 1999 and 2010. We see the same trend in longitudinal studies of a particular cohort tracked from 1991 to 2008. It’s not that consumers can’t find the sugar they crave, of course — there have been no major changes in the availability of sugary foods.

We see that public health efforts to reduce sugar consumption have worked. In fact, they’ve worked very well. But they don’t seem to have made any difference to the obesity epidemic.

Tightly-controlled metabolic ward studies also show that the sugar content of a diet doesn’t matter much. One study of 17 men compared a 25 percent sugar, high-carbohydrate diet to a 2 percent sugar, very-low-carbohydrate (ketogenic) diet of equal calories. After four weeks, they found that the high-carbohydrate diet caused slightly more body fat loss than the very-low-carbohydrate (ketogenic) diet, despite the fact that the two diets differed more than tenfold in sugar content. We see similar results in mice and in rats: “Animals fed a low-fat, high-sucrose (LH) diet were actually leaner than animals fed a high-complex-carbohydrate diet.”

We can further cite the fact that many cultures, such as the Hadza of Tanzania, the Mbuti of the Congo, and the Kuna of Panama, all eat diets relatively high in sugar (sometimes as high as 80%), and yet none of these cultures have noticeable rates of obesity, diabetes, cardiovascular disease, etc.

2.3    Diet in General

Over the past 40 years, there hasn’t been much of a change in where people get their calories from. Americans get about 50% of their calories from carbohydrates, 30% from fat, and 20% from protein, and they have for years. At the same time obesity continues to go up and up. Comparing these two trends, it’s hard to imagine that macronutrients have anything to do with the obesity epidemic.

You’ll recall that Mystery 8 is that all diets work about equally well. It doesn’t matter which diet you choose — you lose about the same number of pounds regardless.

All diets work. The problem is that none of them work very well. Stick to just about any diet for a couple weeks and you will probably lose about 10 pounds. This is ok, but it isn’t much comfort for someone who is 40 lbs overweight. And it isn’t commensurate with the size of the obesity epidemic.

Systematic comparison of weight-loss diets with different compositions of fat, protein, and carbohydrates finds that across many different reduced-calorie diets, people lose about 13.2 lbs (6 kg) over six months, and that in all cases people began to gain weight back after 12 months. It’s not just weight loss, either. Satiety, hunger, satisfaction with the diet, and adherence to the protocol are similar for all diets.

There are too many diets to review in full, of course, but we see the same pattern in every diet that has been extensively studied. Let’s look at just a few.

2.3.1    Ketogenic Diet

We’ve already mentioned a few ketogenic diets, and as we’ve seen, they don’t work much better than other diets do.

There is one meta-analysis of ketogenic diet studies, comparing very-low-carbohydrate ketogenic diets to low fat diets in overweight and obese adults. Across thirteen randomized controlled trials, ketogenic diets only caused 2 pounds (0.9 kg) more weight loss than the traditional low-fat diets after 12 months.

2.3.2    Low-Glycemic Diet

Study after study finds that low-glycemic diets don’t work for weight loss.

One study from 2007 randomly assigned 203 women to either a high-glycemic or low-glycemic diet. The difference in glycemic index was considerable, with the high-glycemic diet having an index twice as high as the low-glycemic diet. The groups consumed the same amount of calories and reported similar levels of hunger.

Despite this, there was no difference between the groups. After two months the LGI group had lost 1.6 lbs (0.72 kg) and the HGI group had lost 0.7 lbs (0.31 kg), but this difference wasn’t sustained. After 18 months on the diet, the LGI group had lost 0.9 lbs (0.41 kg) and the HGI group had lost 0.6 lbs (0.26 kg), and this difference was statistically indistinguishable (p = .93). Large differences in glycemic index have no meaningful long-term (or even short-term) effect on calorie consumption or body weight.

Another 18-month randomized trial compared a low-glycemic load (40% carbohydrate and 35% fat) vs low-fat (55% carbohydrate and 20% fat) diet in 73 obese young adults in the Boston, Massachusetts area. In both diets, participants were largely eating whole foods; vegetables, beans, and fruit were major components of both diets. In both diets, people were allowed to eat as much as they wanted.

Both groups reported similar levels of hunger and consumed similar amounts of calories. The two diets were rated equally easy to stick to and equally tasty. Both groups lost about 4-5 lbs after 6 months. But both groups started to gain weight back soon after. In fact, the trajectory of weight loss is so identical, we simply have to show you the graph:

Trajectory of weight loss on two diets.

Note the p-value of 0.99, which indicates that the two trajectories are about as statistically indistinguishable as is mathematically possible.

We find this in study after study. Meta-analysis also finds that low-glycemic diets don’t do any better than other diets when it comes to weight loss. When the reviewers pick out the studies that show the best performance for low-glycemic diets, they still find a difference of only 4 lbs (1.8 kg). If that’s a success, we have to wonder what failure would look like.

2.3.3    Future Dietary Explanations

Eating fewer calories will lead most people to lose a couple pounds, and it doesn’t really matter what calories they restrict. Cutting back on fat works about as well as cutting back on carbs. In both cases, a couple pounds isn’t enough to explain the obesity epidemic.

Over the past 50 years, medical science has looked at diet from practically every angle. But none of these diet-based explanations have gone anywhere. People are still getting fatter. They got fatter over the last decade. And they got fatter over the decade before that. And the one before that. Every country in the world is growing more obese. And the trend has never once been reversed.

You could certainly cook up another diet-based explanation. But there’s no reason to expect that this explanation would do any better than any of the others.

It’s time to start looking for explanations outside the world of calories, macronutrients, and exercise. At this point, we should assume that the obesity epidemic isn’t caused by our diet.

2.4    Lifestyle

Could it be a lifestyle difference? Possibly, but signs point against it. Smoking is more prevalent in Japan than among Japanese-Americans, yet Japanese-Americans have much higher rates of hypertension. Similarly, many hunter-gatherers are heavy smokers, including the Kitavans (76% of men and 80% of women) and the Bushmen of South Africa, but these societies have no sign of heart disease.

2.5    Lipostat

There is one theory of obesity which is almost entirely satisfying, based around the body’s ability to regulate its adiposity.

A house has a thermostat. The owner of the house sets the temperature to 72°F. The thermostat detects the temperature of the house and takes action to drive the temperature to the set point of 72°F. If the house is too cold, the thermostat will turn on the furnace. If the house is too warm, the thermostat will turn on the air conditioning.

The human body has a lipostat (from the Greek lipos, meaning fat). Evolution and environmental factors set body fatness to some range — perhaps a BMI of around 23. The lipostat detects how much fat is stored and takes action to drive body fatness to the set point of a BMI of 23. If your body is too thin, the lipostat will drive you to eat more, exercise less, sleep more, and store more of what you eat as fat. If your body is too fat, the lipostat will turn on the air conditioning. Just kidding, the lipostat will drive you to eat less, move and fidget more, and store less of the food you eat as fat.

According to this theory, people become obese because something has gone wrong with the lipostat. If the owner of a house sets the thermostat to 120°F, the house will quickly become too hot, and it will stay that way until the set point is changed or the furnace explodes. Something similar is happening in obesity. The set point has been moved from a healthy and natural level of adiposity (BMI of about 23) to an unusually high level (BMI 30+), and all the regulatory systems of the body are working in concert to push adiposity to that level and keep it there.

The lipostat model is supported by more than a hundred years of evidence. By the 1970s, Dr. Michel Cabanac and collaborators were publishing papers in the journal Nature on what they called the “ponderostat” (pondero = weight). This was later revised to the adipostat (adipo = fat), and eventually, as we call it here, the lipostat.

Modern neuroscience and medical review articles (those are three separate links) overwhelmingly support this homeostatic explanation. In animals and humans, brain damage to the implicated areas leads to overeating and eventual obesity. These systems are well-understood enough that by targeting certain neurons you can cure or cause obesity in mice. While we don’t approve of destroying neurons in human brains with hyperspecific chemical techniques, the few weight-loss drugs approved by the FDA largely act on the brain (hopefully without destroying any neurons).

The lipostat explains why diet and exercise work a little, why they don’t work well enough to reverse obesity, and why even people who lose weight on diets generally end up gaining that weight right back.

In a house where the thermostat has been set to 120°F, there are a lot of things we can do to lower the temperature. We can open all the doors and windows. We can open the icebox. We can order mountains of dry ice off of the internet. All of these things will lower the temperature of the house a little, but even with these measures, the house will still be hotter than the healthy temperature of 72°F. The furnace will work double-time to push the temperature back up to 120°F, if it’s not redlining already. And as soon as you relax any of your heat-dissipation measures, the temperature will go right back up to where it was before.

(We can also go down into the basement and hit the furnace with crowbars until it doesn’t work very well anymore. This is a pretty extreme solution and also, incidentally, why gastric bypass surgery works so great.)

When people intentionally overeat, as in the overfeeding studies we reviewed, they temporarily gain a little weight, but when they stop overeating, they quickly return to their original weight. When people intentionally undereat, as they do on a diet, they temporarily lose a little weight, but when they stop undereating they quickly return to their original weight. In fact, they usually return to near their original weight even if they keep undereating. The lipostat has a target weight and, when not actively opposed, it will push your body weight to that weight and do its best to keep it there.
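This push-back dynamic can be sketched as a toy feedback loop. Everything here is an illustrative assumption (the gain constant, the size of the daily “diet” push, the units), not a physiological model:

```python
# Toy lipostat: each day, a corrective force proportional to the gap between
# the set point and current weight acts on weight. `intervention` is a
# constant daily push from dieting (negative) or overfeeding (positive).
def simulate(set_point, start, days, intervention=0.0, gain=0.05):
    w = start
    for _ in range(days):
        w += gain * (set_point - w) + intervention
    return w

# Dieting pushes weight below the set point while the diet lasts...
dieting = simulate(set_point=100, start=100, days=90, intervention=-0.2)

# ...but once the diet stops, weight drifts right back to the set point.
rebound = simulate(set_point=100, start=dieting, days=180, intervention=0.0)
```

The diet only moves weight a few units below the set point, and the corrective force fights it the whole way; remove the intervention and the system settles back where it started. That is the lipostat account of diet rebound in miniature.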

There are many signals that the brain uses to measure how much fat the body is carrying. One of the most important is the hormone leptin, which is naturally produced by fat cells. Part of the action of the lipostat is making sure that leptin levels are kept within a desired range, which helps keep us at a desired weight.

Very rarely, people are born with a genetic mutation that makes it so their fat cells no longer produce leptin. The lipostat notices that it isn’t detecting any leptin, and assumes that the body has no fat stores at all, with predictable results. Usually these children are of normal birth weight, but from the first weeks of their lives, they are insatiably hungry. By age two, they weigh more than 50 pounds, and may be as high as 60% fat by weight. They have a truly incredible drive to eat:

. . . leptin-deficient children are nearly always hungry, and they almost always want to eat, even shortly after meals. Their appetite is so exaggerated that it’s almost impossible to put them on a diet: if their food is restricted, they find some way to eat, including retrieving stale morsels from the trash can and gnawing on fish sticks directly from the freezer. This is the desperation of starvation [. . . ] they become distressed if they’re out of sight of food, even briefly. If they don’t get food, they become combative, crying and demanding something to eat.

The lipostat account is extremely convincing. The only weakness in the theory is that it’s not clear what could cause the lipostat to be set to the wrong point. In leptin-deficient children, their body simply can’t detect that they are obese. But most people produce leptin just fine. What is it that throws this system so totally out of balance?

While the lipostat perspective does in a sense explain why people become obese (their lipostat is out of alignment), it’s not really a theory of the obesity epidemic, since it doesn’t explain why our lipostats began getting more and more out of balance around 1980.

Even advocates of the theory are perfectly willing to admit this. In The Hungry Brain, Stephan Guyenet writes:

Many researchers have tried to narrow down the mechanisms by which [diet] causes changes in the hypothalamus and obesity, and they have come up with a number of hypotheses with varying amounts of evidence to support them. Some researchers believe the low fiber content of the diet precipitates inflammation and obesity by its adverse effects on bacterial populations in the gut (the gut microbiota). Others propose that saturated fat is behind the effect, and unsaturated fats like olive oil are less fattening. Still others believe the harmful effects of overeating itself, including the inflammation caused by excess fat and sugar in the bloodstream and in cells, may affect the hypothalamus and gradually increase the set point. In the end, these mechanisms could all be working together to promote obesity. We don’t know all the details yet…

Guyenet favors a “food reward” explanation, where eating “highly rewarding food” causes a mild form of brain damage that turns up the set point of the lipostat. He’s even gone so far as to propose (as an April Fools joke) a collection of boring recipes called The Bland Food Cookbook.

You’ll notice that in all these theories, the factors that damage the lipostat are related to diet. But as we’ve just argued above, the persistent failure to find a solution in our diets strongly suggests that we should start looking elsewhere for the explanation.

2.6    What, Then?

We should start seriously considering other paradigms. If diet and exercise are out as explanations for the epidemic, what could possibly explain it? And what could possibly explain all of the other bizarre trends that we have observed?


[Next Time: A CHEMICAL HUNGER]


A Chemical Hunger – Part I: Mysteries

The study of obesity is the study of mysteries.

Mystery 1: The Obesity Epidemic 

The first mystery is the obesity epidemic itself. It’s hard for a modern person to appreciate just how thin we all were for most of human history. A century ago, the average man in the US weighed around 155 lbs. Today, he weighs about 195 lbs. About 1% of the population was obese back then. Now it’s about 36%.

Back in the 1890s, the federal government had a board of surgeons examine several thousand Union Army veterans who fought in the Civil War. This was several decades after the end of the war, so by this point the veterans were all in their 40s or older. This gives us a snapshot of what middle-aged white men looked like in the 1890s. When we look at their data, we find that they had an average BMI of about 23 (a BMI of 25 or more is overweight; 30 or more is obese). Only about 3% of them were obese. In comparison, middle-aged white men in the year 2000 had an average BMI of around 28. About 24% were obese in early middle age, increasing to 41% by the time the men were in their 60s.

(Most experts consider measures like body fat percentage to be better measures of adiposity than BMI, and we agree. Unfortunately, nearly every source reports BMI, and most don’t report body fat percentage. Here, we use BMI so that we can compare different sources to one another.)
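Since BMI does all the work in these comparisons, it's worth spelling the formula out: weight in kilograms divided by height in meters squared, with a conversion factor of 703 when using pounds and inches. A quick sketch (the 5'9" height below is just an illustrative average, not from any of the sources above):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms over height in meters, squared."""
    return weight_kg / height_m ** 2

def bmi_imperial(weight_lb, height_in):
    """The same formula in US units; 703 is the standard conversion factor."""
    return 703 * weight_lb / height_in ** 2

# The century-long shift in average male weight, assuming a height of 5'9":
print(round(bmi_imperial(155, 69), 1))  # ~155 lb a century ago → BMI 22.9
print(round(bmi_imperial(195, 69), 1))  # ~195 lb today → BMI 28.8
```

At a constant height, the 40 lb shift in average weight moves the average man from comfortably "normal" territory to well into "overweight", which matches the survey averages cited above.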

It’s not just that we’re a little fatter than our great-grandparents — the entire picture is different.


People in the 1800s did have diets that were very different from ours. But by conventional wisdom, their diets were worse, not better. They ate more bread and almost four times more butter than we do today. They also consumed more cream, milk, and lard. This seems closely related to observations like the French Paradox — the French eat a lot of fatty cheese and butter, so why aren’t they fatter and sicker?

Our great-grandparents (and the French) were able to maintain these weights effortlessly. They weren’t all on weird starvation diets or crazy fasting routines. And while they probably exercised more on average than we do, the minor difference in exercise isn’t enough to explain the enormous difference in weight. Many of them were farmers or laborers, of course, but plenty of people in 1900 had cushy desk jobs, and those people weren’t obese either.

Something seems to have changed. But surprisingly, we don’t seem to have any idea what that thing was.

Mystery 2: An Abrupt Shift 

Another thing that many people are not aware of is just how abrupt this change was. Between 1890 and 1976, people got a little heavier. The average BMI went from about 23 to about 26. This corresponds with rates of obesity going from about 3% to about 10%. The rate of obesity in most developed countries was steady at around 10% until 1980, when it suddenly began to rise.

Trends in adult overweight, obesity, and severe obesity among men and women aged 20–74: United States, 1960–1962 through 2015–2016. SOURCES: NCHS, National Health Examination Survey and National Health and Nutrition Examination Surveys.

Today the rate of obesity in Italy, France, and Sweden is around 20%. In 1975, there was no country in the world that had an obesity rate higher than 15%.

This wasn’t a steady, gentle trend as food got better, or diets got worse. People had access to plenty of delicious, high-calorie foods back in 1965. Doritos were invented in 1966, Twinkies in 1930, Oreos in 1912, and Coca-Cola all the way back in 1886. So what changed in 1980?

Common wisdom today tells us that we get heavier as we get older. But historically, this wasn’t true. In the past, most people got slightly leaner as they got older. Those Civil War veterans we mentioned above had an average BMI of 23.2 in their 40s and 22.9 in their 60s. In their 40s, 3.7% were obese, compared to 2.9% in their 60s. We see the same pattern in data from 1976-1980: people in their 60s had slightly lower BMIs and were slightly less likely to be obese than people in their 40s (see the table below). It isn’t until the 1980s that we start to see this trend reverse. Something fundamental about the nature of obesity has changed.

Distribution of BMI and obesity prevalence, non-Hispanic white men in the US by time period and age group. Adapted from Helmchen & Henderson, 2003.

Mystery 3: The Ongoing Crisis 

Things don’t seem to be getting any better. A couple decades ago, rising obesity rates were a frequent topic of discussion, debate, and concern. But recently the issue has received much less attention; from the lack of press and popular coverage, you might reasonably assume that if we aren’t winning the fight against obesity, we’ve at least fought it to a stalemate.

But this simply isn’t the case. Americans have actually gotten more obese over the last decade. In fact, obesity increased more than twice as much between 2010 and 2018 as it did between 2000 and 2008.

Rates of obesity are also increasing worldwide. As The Lancet notes, “unlike other major causes of preventable death and disability, such as tobacco use, injuries, and infectious diseases, there are no exemplar populations in which the obesity epidemic has been reversed by public health measures.”

All of this is, to say the least, very mysterious.

1.1    Weird Mysteries

Then there are the weird mysteries.

Mystery 4: Hunter-Gatherers 

A common assumption is that humans evolved eating a highly varied diet of wild plants and animals, that our bodies still crave variety, and that we would be better off with a more varied diet. But when we look at modern hunter-gatherers, we see this isn’t true. The !Kung San of the Kalahari get about 40% of their calories from a single food source, the mongongo nut, with another 40% coming from meat. But the !Kung are extremely lean (about 110 lbs on average) and have excellent cardiovascular health.

Of course, variety isn’t everything. You might also expect that people need to eat the right diet: a balanced one, with the right mix of macronutrients. But again, this doesn’t seem to be the case. Hunter-gatherer societies around the world have incredibly different diets, some of them very extreme, and almost never suffer from obesity.

Historically, different cultures had wildly different diets — some hunter-gatherers ate diets very high in sugar, some very high in fat, some very high in starch, etc. Some had diets that were extremely varied, while others survived largely off of just two or three foods. Yet all of these different groups remained lean. This is strong evidence against the idea that any particular dietary profile (high-fat, high-sugar, high-starch, low-variety, high-variety, and so on) causes obesity.

The Hadza, a Tanzanian hunter-gatherer society, get about 15% of their calories from honey. Combined with all the sugar they get from eating fruit, they end up eating about the same amount of sugar as Americans do. Despite this, the Hadza do not exhibit obesity. Another group, the Mbuti of the Congo, eat almost nothing but honey during the rainy season, when honey can provide up to 80% of the calories in their diet. These are all unrefined sugars, of course, but the Kuna of Panama, though mostly hunter-gatherers, also obtain white sugar and some sugar-containing foods from trade. Their diet is 65% carbohydrate and 17% sugar, which is more sugar than the average American currently consumes. Despite this, the Kuna are lean, with average BMIs around 22-23.

The Inuit, by contrast, traditionally ate a diet consisting primarily of seal meat and blubber, with approximately 50% of their calories coming from fat. This diet is quite low in fruits and vegetables, but obesity was virtually unknown until the arrival of western culture. The Maasai are an even more extreme example, subsisting on a diet composed “almost exclusively of milk, blood, and meat”. They drink “an average of 3 to 5 quarts/day of their staple: milk supplemented with cow’s blood and meat”. This adds up to about 3000 calories per day, with 66% of those calories coming from fat. (They also sometimes eat honey and tree bark.) But the Maasai are also quite lean, with the average BMI for both men and women again in the range of 22-23, increasing very slightly with age.

Kitava is a Melanesian island largely isolated from the outside world. In 1990, Staffan Lindeberg went to the island to study the diet, lifestyle, and health of its people. He found a diet based on starchy tubers and roots like yam, sweet potato, and taro, supplemented by fruit, vegetables, seafood, and coconut. Food was abundant and easy to come by, and the Kitavans ate as much as they wanted. “It is obvious from our investigations,” wrote Lindeberg, “that lack of food is an unknown concept, and that the surplus of fruits and vegetables regularly rots or is eaten by dogs.”

About 70% of the calories in the Kitavan diet came from carbohydrates. For comparison, the modern American diet is about 50% carbohydrates. Despite this, none of the Kitavans were obese. Instead they were in excellent health. Below, you’ll see a photo of a Kitavan man being examined by Lindeberg.

Kitavans didn’t even seem to gain weight in middle age. In fact, BMI was found to decrease with age. Many lived into their 80s or 90s, and Lindeberg even observed one man who he estimated to be 100 years old. None of the elderly Kitavans showed signs of dementia or memory loss. The Kitavans also had no incidence of diabetes, heart attacks, stroke, or cardiovascular disease, and were unfamiliar with the symptoms of these diseases. “The only cases of sudden death they could recall,” he reports, “were accidents such as drowning or falling from a coconut tree.”

Mystery 5: Lab Animals and Wild Animals 

Humans aren’t the only ones who are growing more obese — lab animals and even wild animals are becoming more obese as well. Primates and rodents living in research colonies, feral rodents living in our cities, and domestic pets like dogs and cats are all steadily getting fatter and fatter. This can’t be attributed to changes in what they eat, because lab animals live in contained environments with highly controlled diets. They’re being fed the same foods as always, but for some reason, they’re getting fatter.

This seems to be true everywhere you look. Our pets may eat scraps from the table, but why would zoo animals, being fed by professionals, also be getting fatter? Even horses are becoming more obese. This is all very strange, and none of it fits with the normal explanations for the obesity epidemic.

Mystery 6: Palatable Human Food 

Lab rats gain some weight on high-fat diets, but they gain much more weight on a “cafeteria diet” of human foods like Froot Loops [sic] and salami (see also here).

It used to be that if researchers needed obese rats for a study, they would just add fat to normal rodent chow. But it turns out that it takes a long time for rats to become obese on this diet. A breakthrough occurred one day when a graduate student happened to put a rat onto a bench where another student had left a half-finished bowl of Froot Loops. Rats are usually cautious around new foods, but in this case the rat wandered over and began scarfing down the brightly-colored cereal. The graduate student was inspired to try putting the rats on a diet of “palatable supermarket food”; not only Froot Loops, but foods like Doritos, pork rinds, and wedding cake. Today, researchers call these “cafeteria diets”.

Sure enough, on this diet the rats gained weight at unprecedented speed. All this despite the fact that the high-fat and cafeteria diets have similar nutritional profiles, including very similar fat/kcal percentages, around 45%. In both diets, rats were allowed to eat as much as they wanted. When you give a rat a high-fat diet, it eats the right amount and then stops eating, and maintains a healthy weight. But when you give a rat the cafeteria diet, it just keeps eating, and quickly becomes overweight. Something is making them eat more. “Palatable human food is the most effective way to cause a normal rat to spontaneously overeat and become obese,” says neuroscientist Stephan Guyenet in The Hungry Brain, “and its fattening effect cannot be attributed solely to its fat or sugar content.”

Rodents eating diets that are only high in fat or only high in carbohydrates don’t gain nearly as much weight as rodents eating the cafeteria diet. And this isn’t limited to lab rats. Raccoons and monkeys quickly grow fat on human food as well.

We see a similar pattern of results in humans. With access to lots of calorie-dense, tasty foods, people reliably overeat and rapidly gain weight. But again, it’s not just the contents. For some reason, eating more fat or sugar by itself isn’t as fattening as the cafeteria diet. Why is “palatable human food” so much worse for your waistline than its fat and sugar alone would suggest?

Mystery 7: Altitude 

People who live at higher altitudes have lower rates of obesity. This is the case in the US, and also seems to be the case in other countries, for example Spain and Tibet. When US Army and Air Force service members are assigned to different geographic areas, they are more at risk of developing obesity in low-altitude areas than in high-altitude ones. Colorado is the highest-altitude US state and also has the lowest incidence of obesity.

If you look at a map of county-level obesity data in the United States, the Rockies, the Sierra Nevada, and the Appalachians stand out quite clearly: 

County-Level Estimates of Obesity among Adults aged 20 and over, 2009. Map from the CDC.

Similarly, there is a condition called “altitude anorexia” where individuals who move to a high-altitude location sometimes lose a lot of weight all at once (see also here, here, and weight loss results here). This effect also seems to apply to lab rats who are moved to labs at higher altitudes.

In addition, there is some evidence for a similar relationship between altitude and the rate of diabetes, with people living at a higher elevation having lower rates of diabetes than those living near sea level, even when statistically adjusting for variables like age, BMI, and physical activity.

We know that oxygen and carbon dioxide vary with elevation, so you might expect that this is attributable to these differences. But the evidence is pretty thin. Combined with a low-calorie diet, exercise in a low-oxygen environment does seem to reduce weight more than exercise in normal atmospheric conditions, but not by much. Submarines have CO2 levels about 10 times higher than usual, but a US Navy study didn’t find evidence of consistent weight gain. The atmosphere itself can’t explain this.

One paper, Hypobaric Hypoxia Causes Body Weight Reduction in Obese Subjects from Lippl et al. (2010), claims to show a reduction in weight at high altitude and suggests that this weight loss is attributable to differences in oxygen levels. However, there are a number of problems with this paper and its conclusions. To begin with, there isn’t a control group, so this isn’t an experiment. Without an appropriate control, it’s hard to infer a causal relationship. What they actually show is that people brought to 2,650 meters lost a small amount of weight and had lower blood oxygen saturation, but this is unsurprising. Obviously if you bring people to 2,650 meters they will have lower blood oxygen, and there’s no evidence linking that to the reported weight loss. They don’t even report a correlation between blood oxygen saturation and weight loss, even though that would be the relevant test given the data they have. Presumably they don’t report it because it’s not significant. In addition there are major issues with multiple comparisons, which make their few significant findings hard to interpret (for more detail, see our full analysis of the paper).
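The multiple-comparisons worry deserves a word of explanation. If you run many tests at the conventional p < 0.05 threshold and none of the outcomes has a real effect, the chance of at least one spurious "significant" result grows quickly. A back-of-the-envelope calculation (generic statistics, not specific to the Lippl data):

```python
def prob_any_false_positive(n_tests, alpha=0.05):
    """Probability that at least one of n independent null tests comes up
    'significant' at level alpha: 1 - (1 - alpha) ** n."""
    return 1 - (1 - alpha) ** n_tests

# One test behaves as advertised; twenty tests do not:
print(round(prob_any_false_positive(1), 3))   # 0.05
print(round(prob_any_false_positive(20), 3))  # ~0.64

# The classic (conservative) fix is the Bonferroni correction,
# which shrinks the per-test threshold to cap the family-wise error:
def bonferroni_alpha(n_tests, family_alpha=0.05):
    """Per-test significance threshold under Bonferroni."""
    return family_alpha / n_tests

print(round(bonferroni_alpha(20), 4))  # 0.0025
```

With twenty uncorrected comparisons, you'd expect roughly a 64% chance of at least one false positive, which is why a handful of scattered significant results in an uncorrected analysis doesn't tell you much.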

Mystery 8: Diets Don’t Work 

There’s a lot of disagreement about which diet is best for weight loss, and people spend a lot of time arguing about it. We’re sure people have come to blows over whether you lose more weight on keto or on the Mediterranean diet, but meta-analyses consistently find that there is little difference between different diets.

Some people do lose weight on diets. Some of them even lose a lot of weight. But the best research finds that diets just don’t work very well in general, and that no one diet seems to be better than any other. For example, a 2013 review of 4 meta-analyses said:

Numerous randomized trials comparing diets differing in macronutrient compositions (eg, low-carbohydrate, low-fat, Mediterranean) have demonstrated differences in weight loss and metabolic risk factors that are small (ie, a mean difference of <1 kg) and inconsistent.

Most diets lead to weight loss of around 5-20 lbs, with minimal differences between them. Now, 20 lbs isn’t nothing, but it’s also not much compared to the overall size of the obesity epidemic. And even if someone does lose 20 lbs, in general they will gain most of it back within a year.


Hello! If you’re just joining us, check out achemicalhunger.com; the table of contents makes the series easier to navigate!

[Next Time: CURRENT THEORIES ARE INADEQUATE]


Links for June 2021

Antidepressant, or Tolkien Character? Not all that hard, but I made mistakes on more than I would like to admit. Don’t mix up Haldir and Haldol.

Look at two things — are they the same or different? We can do it, you can (hopefully) do it, ducklings can do it, BEES can do it — convolutional neural networks, “one of the most powerful classes of artificial intelligence systems”, can’t really do it.

Naturally this is but one of many ways you can confuse modern “artificial intelligence”. We’re particularly fond of “anti-face” dazzle makeup intended to disrupt facial-recognition software. We’d be very interested to see research on the effectiveness of different kinds of dazzle makeup at fooling these algorithms.

Some medical history we didn’t know — when the medical journal The Lancet was founded, it was intensely anti-establishment, which is part of why it is named after a cutting implement. Sadly, this no longer seems to be the case.

In other medical developments, consider this story of a patient who suspected doctors were using too high a dose of dexamethasone, convinced them to run studies on the effectiveness of a lower dose, and ended up saving a lot of lives.

On the even more extreme end of patient involvement, Benjamin Stecher describes what it’s like to be awake while a team of doctors is drilling into your skull so that they can install hardware for deep brain stimulation.

And now for something completely different: John Cleese referencing Thomas Kuhn’s The Structure of Scientific Revolutions on twitter. The crossover I didn’t even know I needed.

Designing antennas by hand is hard and takes a long time. Designing antennas with evolutionary algorithms is fast and easy, and as a little bonus, the antennas look really weird:

A new book (we haven’t read it) argues that getting hammered — and in particular, getting hammered together — made civilization possible. (h/t Roger’s Bacon) We’re paraphrasing. Naturally related to some of our previous work. We also feel obliged to reference Chinese poet Li Bai.

Maybe you heard that George W. Bush was a cheerleader. What you may not know is that he is not alone among US Presidents in this distinction.

I’m sure you’ve all heard by now about the dangerous wildlife in Australia, but did you hear about the mouse plague? (warning: not for the faint of heart) They’re in the middle of an especially bad mouse plague now. “The horror lurking in the darkness,” reports the New York Times, “is a throng of thousands of mice swarming above.” Another choice quote from the article: “They used to say once they start eating each other, it’ll be over, but they’ve been eating each other since December, and it’s not stopping.”

Links for May 2021

A recent trend has people posting their ideal political platform. But no platform can beat Angelyne for Governor — she has the key to California. What is her secret? A lady never tells, but we suspect she is reading the Chinese classics. In The Way of the General, Zhuge Kongming says, “One person can teach ten, ten people can teach a hundred, a hundred people can teach a thousand, a thousand can teach ten thousand.” Angelyne says, “California will hire social workers to select individuals and help them one by one … Once a person has been homed/established, California will hire that person to help relocate another. And the social workers shall continue on, as well. And that becomes exponential! Voila! Everybody has a home and the streets are clean!”  Truly Angelyne is a 仁人.

Cinnamon and Avocados are closely related to each other. Bananas are more closely related to lilies and skunk cabbage than they are to papayas. Aloe vera is closely related to asparagus and onions, but not at all related to cacti, which are closely related to beets. There is no such thing as a tree.

A little birdie on twitter told me that there’s a group on discord that is trying to breed quail back into dinosaurs. We wish them success but also, uh… be careful guys.

Also from twitter: “I cannot stress enough just how much you need to stop whatever it is you are doing and watch the opening titles of this documentary. I promise you’re not ready for what’s coming.”

We love stories about what questionable research practices look like from the inside, not only because it humanizes the people involved (most of them don’t mean to do anything wrong) but also because it helps make it obvious how easy it can be to slip into this kind of behavior if you’re not careful. Here’s a new one from Devon Price, who was in graduate school right in the middle of the revolution.

Joe Hilgard, famous (or infamous) psychology research methods gadfly, recently left academia for a job as a data scientist, and writes about it in a post titled “Smell you later”. While we disagree with the author’s cocktail choices, his candor is striking: “I feel that a lot of the research that we do doesn’t matter.” We suspect many academics feel similarly, but it’s very rare to hear someone say it, even when they’re leaving.

One of the most interesting things about scams and grifts is just how audacious the perpetrators can be and still get away with it. Enter the Quadro Tracker, a bomb / drug / person / weapon / lost golf ball-detection device from the 90s that was sold at prices of up to $8,000 and turned out to be a box of dead ants. This does make cops and school districts look like a bunch of suckers, but I wonder if the real mark wasn’t everyone else. It seems like cops and school officials would love an excuse to accuse or search anyone they like, backed up by an $8,000 price tag. If the real nefarious purpose of the device was to manufacture reasonable suspicion and probable cause, I’m afraid to say that it may have worked (for a while).

New in cognitive science: If I fits I sits: A citizen science investigation into illusory contour susceptibility in domestic cats (Felis silvestris catus). In this project, Gabriella Smith and colleagues show that cats want to sit in imaginary boxes as well as real ones. The study is also notable for crowdsourcing citizen science, i.e. using bored cat-owners in pandemic lockdown to collect data.

Also new: researchers at University College London gave people a third thumb to see what it would do to their brain. The brain results are meh but you have to see the videos of this thumb in action. [Insert joke about being “all thumbs”.]

Finally, some news about us, the blog: We’re looking for an agent who can represent nonfiction trade books. We already have one manuscript written which early readers have called a “fascinating, comprehensive review” and “a wild new paper that blew my mind”. If you are an agent or know an agent who might be interested in representing us, please contact us!

Cheating at School is a Better Idea Than Ever

With absolutely no apologies to The Wall Street Journal.


A year of absolute, unprecedented bullshit has spurred an eruption of cheating among students, from grade school to college. With many students isolated at home over the past year—and with the pointless grind of school revealed for what it truly is—academic dishonesty has never been such an obviously reasonable choice.

Some pedants fear the new generation of cheaters will be loath to stop even after the pandemic recedes. “Students have finally found a way to avoid my bullshit, and worse, they know it works,” said Phineas Whateley, senior teaching fellow in soup calculation at Royal University College in London, who has studied academic integrity issues for more than two decades, though apparently without learning anything of value. He said cheating sites number in the thousands, from individuals to large-scale operations.

Concerned about his West Carolina State University students cheating in a statistics class, Richard Penistone launched a plan.

Rather than writing a more reasonable exam, or spending time helping students master the material, Mr. Penistone, a course coordinator, wasted countless hours writing a computer program that generated a unique set of questions for each student. Those questions quickly showed up on a for-profit homework website that helped him to identify who posted them. 

About 200 students were caught cheating—one-fourth of the class. Yet somehow Mr. Penistone was more concerned about punishing these 19-year-olds than he was that he had created a class that 25% of his students decided was such bullshit that they couldn’t be arsed to even attempt the final exam honestly. 

We note that Mr. Penistone is a course coordinator, not faculty. We assume he’s not tenured; he doesn’t even have an advanced degree. What is his deep-seated loyalty to West Carolina State University, exactly? Do they pay him so especially well that he is roused to stay up late writing code to generate and distribute 800 totally unique exams?

Overall, cases of academic dishonesty more than doubled in the 2019-20 academic year at WC State, with the biggest uptick as students were forced into the absurdity that is the Zoom classroom, according to the school.

Educators say stress and pressure, possibly related to the global pandemic maybe???, are a big reason why students cheat. “Especially in a time of stress, they realize that there are more important things than the rote memorization and regurgitation we force them to do on exams,” said Myra Capwell, president of the International Global Center for Academic Honor, Security, and Integrity, and director of the Kansas-Nebraska-Indiana Interstate University Dignified Honor and Integrity System.

Lucien Hoyt, an 18-year-old freshman at Mamimi University in Cambridge, Ohio, said he knows students who have used homework help sites for studying—and (brace yourself, dear reader) for cheating. He said he hasn’t cheated himself, but then again, he knows we’re narcs, so he would say that. 

He said students, including himself, are frustrated with virtual learning because it throws into stark relief how artificial these courses are, and how none of it matters. “I haven’t struggled this way with learning material, ever,” he said. “In the classroom I had the vague sense I might actually be getting an education. But with the trappings stripped away, it just becomes so clear that what they’re asking us to do is total busywork.”

At the K-12 level, schools are free to indulge their whimsy to become miniature police states, and many block a range of homework help websites from district computers to prevent cheating. Ultimately even this exercise in authoritarianism is pointless, however, since this doesn’t stop a student from visiting the site from a different device. 

Middle-school teacher Aurora Zimmer in Lake, California, has put less emphasis on testing during online learning because it is also dawning on her, if somewhat slowly, that this is an exercise in futility. “We have no control of what is going on when you’re on a computer,” she said. “We can’t even force you to ask us to go to the bathroom. It really makes you think.”

Measures taken in the name of online cheating have spawned a new kind of comforting-and-not-at-all-draconian industry: surveillance-type companies that hire online randos to actually watch students take tests from home. I don’t know about you, but I find the idea of a faceless company hiring an online stranger to watch my 19-year-old child take a test in their bedroom very reassuring.

The internet strangers hired by these companies look for suspicious behavior (this is good because they are presumably experts in suspicious behavior), such as a student disappearing from camera view (going to the bathroom) or being slipped answers (eating chips). Some use “facial-detection” “software” to automatically penalize students who glance, however briefly, out of frame, or make “unusual movements”. This allows universities not only to be pedantic at heretofore unimaginable speeds, but to outsource the pedantry as well.

Proctorio, based in Scottsdale, Arizona, said it monitored 21 million exams in 2020 world-wide, up from 6 million exams in 2019.

ProctorU, based in Hoover, Alabama, notes worrying displays of basic desires for respect, freedom, and privacy. “Some of these students must have accidentally been paying attention in their American History classes,” says one ProctorU drone with a sneer, “But we have a leg up on King George III. These latter-day George Washingtons and Patrick Henrys don’t stand a chance.” 

Here are some “funny stories” about cheating to make it sound amusing and disguise the human cost and egregious civil rights violations inherent to this kind of in-home surveillance: Some of the busts include a student suspected of trying to use a drone’s camera to take images of a test to possibly share with others; another who was trying to cheat by using information on sticky notes on his dog; and a female student who sneezed and disappeared from view, to suddenly be replaced by a male wearing a blond wig, impersonating her. Dogs and crossdressing! Isn’t that funny? Now go back to bed, America. 

Among the newer ways to cheat are homework auction sites, which give students a say in who does their work and at what price. Students post their assignment on a website, along with a deadline; the website acts as a marketplace for bidders who offer to do the assignment.

The bidders, who often refer to themselves as tutors, can tout degrees and other credentials. Some companies allow students to rate their work and post reviews online.

Stella Walker, a blogger and content strategist for Buy4Cheating.com, a site where students can auction out writing assignments, said the site’s terms prohibit academic fraud and plagiarism. She said she supposes cheating can happen, but it would be on a student’s conscience. “I know you’re a snitch, bonehead,” she told us.

One self-described independent tutor listed as Seymour Butz in a Craigslist ad said in an interview by text message that business was booming during the pandemic. The Craigslist ad noted services such as doing students’ math work. The tutor disavowed the label of a cheater for students, and said that the tutor helps students learn by providing written tutorials and explanations for math problems.

“No way would any student use my cheating service to avoid doing their work,” said Mr. Butz. “Boy do I ever disavow that label, you can put THAT in your article.”

Mr. Butz touted bachelor’s and master’s degrees from Pinto University in the ad, but the university said it was unable to locate such information in its records. We at The Wall Street Journal are beginning to suspect that “Seymour Butz” may be an alias.

Other popular websites that students use to get help—by submitting a question for an expert to quickly answer, or by searching a database of previous answers—include Chegg and Brainly, which said they have seen a big increase in users during the pandemic.

Chegg, a publicly held company based in Santa Clara, Calif., prides itself on a willingness to be a big squealer, and help institutions determine the identities of those who cheat. “We really like to play both sides. It gives us a deep, almost visceral pleasure, to serve as a sort of giant honeypot sting operation for entrapping helpless students,” they told us by greeting card, despite the fact that we specifically did not ask them for comment, and don’t know how they found our home address. On the basis of such scummy practices, Chegg saw total net revenue of $644.3 million in 2020, a 57% increase year over year. Subscribers hit a record 6.6 million, up 67%, and students are charged between $9.95 and $19.95 per month for the privilege of letting Chegg stab them in the back.

Mr. Penistone at WC State said Chegg helped identify the 200 students who used its website to avoid taking his exhausting final exam. Some students posted exam questions to get answers while others accessed the information, all traceable through users’ email addresses, IP addresses, and the times of access.

Another website that students were suspected of using to cheat on the exam to a lesser extent showed actual moral fiber and didn’t cooperate with the university, Mr. Penistone said bitterly.

The students were given three options: meekly accept their punishment, join Mr. Penistone in what we can only imagine must be an excruciatingly awkward Zoom call to “review the evidence”, or dispute the accusation with the Office of Student Conduct. This office, designed and staffed by the university and tasked with enforcing its rules, is certain to give them all a fair hearing, we are sure.

“A lot of the students responsible said, ‘It’s unfair to put us through this, because we’re going through a pandemic,’ ” Mr. Penistone said. “Fortunately these complaints fell on deaf ears. I had no choice because there was a zero-tolerance policy. I mean, I’m the one who designed the class, and the exams, and the zero-tolerance policy. But really, I had no choice.”

Even after the bust, the cheating didn’t stop. This is unsurprising, because the issue is not cheating, but unrealistic expectations in tests and exams. A close analogy might be, “Even after the floggings, the attempts to mutiny didn’t stop.” I wonder why.

“In the fall semester, of 1,000 students, I still had attitude problems, er, academic integrity issues with 70 or 80,” he said. “I still don’t understand the basic issues at play here — probably they have just gotten better at cheating, but fortunately I am blissfully unaware of all things happening in my classroom.”

The real tragedy of course is how this all contributes to greater societal alienation — that cheating is now being outsourced to faceless corporations, rather than being a way to build community with fellow classmates. What kind of America are we building for our children?

Sobering Drug Tales that didn’t make it into “Higher”


The following are three drug tales and a few Vin Mariani ads that we cut from our recent post Higher than the Shoulders of Giants, because they were too long or didn’t fit.


From Was Dr. Carl Koller driven from Vienna in 1885?

After his colleague, Dr. Fritz Zinner, called him an impudent Jew in public in the General University Hospital of Vienna, Koller reacted by hitting the man in the face. A duel with heavy sabres was the outcome; Koller was unharmed, whilst his opponent received two deep gashes. Such duels were strictly forbidden at that time already, but were nonetheless still carried out. In consequence, Koller’s hopes of obtaining a position in the Eye Department, for which he was very well qualified, and of an academic career in Vienna were dashed and he had to emigrate.



Tidbits on the history of MDMA per Wikipedia

American chemist and psychopharmacologist Alexander Shulgin reported he synthesized MDMA in 1965 while researching methylenedioxy compounds at Dow Chemical Company, but did not test the psychoactivity of the compound at this time.

While not finding his own experiences with MDMA particularly powerful, Shulgin was impressed with the drug’s disinhibiting effects and thought it could be useful in therapy. Believing MDMA allowed users to strip away habits and perceive the world clearly, Shulgin called the drug “window”. Shulgin occasionally used MDMA for relaxation, referring to it as “my low-calorie martini”, and gave the drug to friends, researchers, and others who he thought could benefit from it. One such person was Leo Zeff, a psychotherapist who had been known to use psychedelic substances in his practice. When he tried the drug in 1977, Zeff was impressed with the effects of MDMA and came out of his semi-retirement to promote its use in therapy. Over the following years, Zeff traveled around the United States and occasionally to Europe, eventually training an estimated four thousand psychotherapists in the therapeutic use of MDMA. Zeff named the drug “Adam”, believing it put users in a state of primordial innocence.

In the late 1970s and early 1980s, “Adam” spread through personal networks of psychotherapists, psychiatrists, users of psychedelics, and yuppies. Hoping MDMA could avoid criminalization like LSD and mescaline, psychotherapists and experimenters attempted to limit the spread of MDMA and information about it while conducting informal research.

In an early media report on MDMA published in 1982, a Drug Enforcement Administration (DEA) spokesman stated the agency would ban the drug if enough evidence for abuse could be found. By mid-1984, MDMA use was becoming more noticed. Bill Mandel reported on “Adam” in a 10 June San Francisco Chronicle article, but misidentified the drug as methyloxymethylenedioxyamphetamine (MMDA). In the next month, the World Health Organization identified MDMA as the only substance out of twenty phenethylamines to be seized a significant number of times.

After a year of planning and data collection, MDMA was proposed for scheduling by the DEA on 27 July 1984 with a request for comments and objections. The DEA was surprised when a number of psychiatrists, psychotherapists, and researchers objected to the proposed scheduling and requested a hearing. 

Sensational media attention was given to the proposed criminalization and the reaction of MDMA proponents, effectively advertising the drug. In response to the proposed scheduling, the Texas Group increased production from 1985 estimates of 30,000 tablets a month to as many as 8,000 per day, potentially making two million ecstasy tablets in the months before MDMA was made illegal. By some estimates the Texas Group distributed 500,000 tablets per month in Dallas alone. According to one participant in an ethnographic study, the Texas Group produced more MDMA in eighteen months than all other distribution networks combined across their entire histories. 

… 

Urged by Senator Lloyd Bentsen, the DEA announced an emergency Schedule I classification of MDMA on 31 May 1985. The agency cited increased distribution in Texas, escalating street use, and new evidence of MDA (an analog of MDMA) neurotoxicity as reasons for the emergency measure. The ban took effect one month later on 1 July 1985 in the midst of Nancy Reagan’s “Just Say No” campaign.

As a result of several expert witnesses testifying that MDMA had an accepted medical usage, the administrative law judge presiding over the hearings recommended that MDMA be classified as a Schedule III substance. Despite this, DEA administrator John C. Lawn overruled and classified the drug as Schedule I. Later Harvard psychiatrist Lester Grinspoon sued the DEA, claiming that the DEA had ignored the medical uses of MDMA, and the federal court sided with Grinspoon, calling Lawn’s argument “strained” and “unpersuasive”, and vacated MDMA’s Schedule I status. Despite this, less than a month later Lawn reviewed the evidence and reclassified MDMA as Schedule I again, claiming that the expert testimony of several psychiatrists claiming over 200 cases where MDMA had been used in a therapeutic context with positive results could be dismissed because they weren’t published in medical journals.



Kary Mullis describes his first time taking LSD in his autobiography, Dancing Naked in the Mind Field:

At Georgia Tech, I had a wife and a little girl. I had short hair and I studied all the time. My senior year I made perfect grades. I studied physics and math and chemistry to the point where I would never have to study them again. And all I knew about drugs was what I read in magazines like Time and Life. I learned that marijuana was a dangerous addictive drug and that I should stay away from it. On the other hand, I learned that LSD was a miracle that just might enable scientists to understand the workings of the brain, could be the cure for alcoholism, and, just incidentally, might prevent World War III. Psychiatrists were prescribing it for their patients. In 1966 LSD had not yet been made illegal. Respected, well known people were admitting that they had experimented with LSD. The Luce family, the publishers of Time and Life, were so intrigued by the scientific potential of LSD that they funded the research of Harvard professor Timothy Leary.

A person who loved playing with chemicals as much as I did just couldn’t help but be intrigued by LSD. The concept that there existed chemicals with the ability to transform the mind, to open up new windows of perception, fascinated me. I considered myself to be a serious scientist. At the time it was still all very scholarly and still legal. There was no tawdry aura over it. People weren’t blaming their kids’ problems on it yet. Hippies had just started to differentiate themselves from beatniks and the difference seemed to be fewer years and more hair on the hippies. And they stayed in college.

In 1966 I wanted to try LSD. My wife, Richards, helped me pack up the Impala, we put our daughter Louise in the back seat, and we drove to Berkeley for graduate school. It was the first time I’d been to California and it surprised me. I had not expected that the trees would be different. I didn’t know that the Pacific Ocean was always cold. I didn’t expect San Francisco to be foggy in the summer. I thought there would be naked girls. I certainly didn’t know that I would be changed so profoundly.

I didn’t want to take LSD alone. I had learned that from magazines. The first week of class I became friendly with the only guy in my class with long hair, Brad. I figured he would have LSD. Brad was smart. He appreciated the fact that I could calculate how long it would take the moon to fall to the Earth. He had graduated from Oberlin College, where they knew it was possible to do such a calculation but they wouldn’t be so crass as to actually learn how.

Brad had experimented with psychedelic drugs and agreed to guide me through my first trip. He suggested that before I took LSD, I should smoke some marijuana because it might give me some idea of how my consciousness would be changed. Marijuana scared me, I told him. Everything I’d read about it said that it was a bad drug, an addictive drug — one toke and you’re a slave for life.

He persuaded me to smoke a “joint,” as he called it. Within moments my fear disappeared. I was laughing. Brad and I talked about wise things for hours. At some point, Brad left. I looked at Richards, my wife, with new eyes. She was the same Richards, but not to me. I grabbed her in a primitive way, rolled her onto our enhanced bed, and felt the surging power of bliss.

A week later I said, “Brad’s going to come over tonight. I’m taking acid.” Richards said she would make a nice dinner.

During dinner, Brad gave me what was called a double-domed 1000 microgram Owsley. He had bought it for five dollars. It was soon to become illegal. I didn’t finish dinner. I started laughing. I got up from the table and realized, on the way to the couch, that everything I knew was based on a false premise. I fell down through the couch into another world.

Brad put Mysterious Mountain by Hovhaness on the stereo and kept playing it over and over. It was the perfect background for my journey. I watched somebody else’s beliefs become irrelevant. Who was that Kary Mullis character? That Georgia Tech boy. I wasn’t afraid. I wasn’t anything. I noticed that time did not extend smoothly—that it was punctuated by moments—and I fell down into a crack between two moments and was gone.

My body lay on the couch for almost four hours. I felt like I was everywhere. I was thrilled. I’d been trapped in my own experiences—now I was free. The world was filled with incredibly tiny spaces where no one could find me or care what I was doing. I was alone. My mind could see itself.

Brad had given me 1000 micrograms because he wanted me to have a thorough experience. I think he said “blow your ass away.” With 100 micrograms you feel a little weird, you might hallucinate, and you can go dancing, but you know you’re on acid. You’re aware that you’re having a trip and the things that you see are hallucinations. You know that you should not respond to them. When you take 1000 micrograms of LSD, you don’t know you’ve taken anything. It just feels like that’s the way it is. You might suddenly find yourself sitting on a building in Egypt three thousand years ago, watching boats on the Nile.

After four hours Brad told me we were going to take a ride in the car. I didn’t know what a car was. We got inside this thing and it started moving and I started to panic. I didn’t want to be in a car. I didn’t like movement, I just wanted to find a quiet place. Eventually we stopped in Tilden Park by a fountain. I got some water. It was cold and fluid but it wasn’t the water I knew. It left trails and it was alive. I didn’t know Brad, I didn’t know my wife. When they got me back in the car I understood I was inside a vehicle. I knew it had a key that made it work, but I didn’t want it to. I was sitting in the back seat, and we started down Marin Avenue, which drops 800 feet in four blocks. Berkeley was below and I was dizzy. I reached over from the back seat and pulled out the key. Brad took back the key, told me to behave, and drove home.

About five o’clock in the morning I began to come back to earth. The most amazing aspect of the entire experience was that I landed back in the middle of my normal life. It was so sweet to hear the birds, to see the sun come up, to watch my little girl wake up and start playing. I appreciated my life in a way I never had before.

On the following Monday I went to school. I remember sitting on a bench, waiting for a class to begin, thinking, “That was the most incredible thing I’ve ever done.”

I wrote a long letter to my mother. I often wrote to my mother to tell her what I was thinking about. As I was writing the letter, I began to realize that for the first time in my life, there were some things that I might not be able to explain to her. But I tried.

My mother responded by sending me an article she’d torn from the Reader’s Digest. It said that taking LSD was bad for your brain and will cause flashbacks for the rest of your life. She entreated me not to do it anymore. I wrote back that it was too late. It had already changed me.

I wanted to understand what had happened. How could 1000 micrograms—one thousandth of a gram—of some chemical cause my entire fucking sensorium to undergo such incredible changes? What mechanisms inside my brain were being so drastically affected? What did these chemicals do to my visuals? I wanted to know how it worked. I wanted to know more about neurochemistry.

Berkeley had a classic biochemistry department, meaning it consisted of professors who specialized in the chemical mechanisms underlying all life. They didn’t know much about mammals, besides their wives and students, and they weren’t interested in neurotransmitters. I was on my own. I knew that my brain was behind my eyes. I learned that no one knew very much about how it functions. We knew which parts of the brain controlled certain things, but we didn’t know how or why. It seemed pretty obvious to me that neuroactive drugs might help us find out. These chemicals caused a really interesting interaction between psychology, biochemistry, and anatomy, but we didn’t know why. There was good reason to expect that we might learn something about mental illnesses, which might be caused by an imbalance in the chemistry of the brain.

Drug laws don’t have much to do with science or health. Opium was made illegal in California because Chinese dock workers in San Francisco were taking jobs away from Irish dock workers who preferred to be drunk than opiated. Opium dens were raided and Chinese workers were arrested. Conveniently, they couldn’t report for work in the morning. They moved north.

Marijuana was declared illegal after the end of Prohibition in 1938 because the opium/alcohol cops needed something to police or they’d lose their jobs. To gain public support, marijuana was depicted as a dangerous drug that caused black and Mexican men to lust after white women. It wasn’t the drug. Black men and Mexican men didn’t suddenly develop a need for white women; white men suddenly developed a need, after 1938, for jobs. Alcohol was back in; marijuana was shortly going to be out. People who wanted to be into prohibition would now prohibit marijuana. The same people, and maybe their children, would be happy to make a living prohibiting LSD.

LSD somehow got connected with the anti–Vietnam War movement. Drugs had to be the reason that the youth of America had long hair, wore beads, enjoyed sex, and didn’t think it was a good idea to go to a foreign country and kill the locals. Psychedelic drugs were made illegal.

I Want to Believe the Sexy van Leeuwenhoek is Out There

Old friends and doctoral students in the sciences, MENO and SALVIATI, are discussing the state of science funding. 


Meno: There are lots of ways to have a good life. Research isn’t even an easy way.

Salviati: No, it’s a pretty shit way, and would be even if you were divinely gifted. There are interesting side-effects, but it’s not exactly the good life. Antonie van Leeuwenhoek had the right idea — first he became a rich merchant in the sale of cloth, then while fiddling with lenses in his spare time, was the first to discover microorganisms, and more or less invented the field of microbiology.

Meno: Yeah, what’s the modern day version of that?

Salviati: We know that Dr. H [a scholar of their mutual acquaintance, who became very wealthy from oil investments before pursuing graduate school — Ed.] could have done it, and you wouldn’t need to be nearly as wealthy as he was. I do wonder if the option used by Leibniz and Leonardo is still open, of finding a rich patron. It would be easier today than it was for them — you wouldn’t need to find a king or a duke, there are plenty of people today rich enough to support a scientist.

Meno: A science sugar daddy.

Salviati: Yes, but you can’t say that. Hm – Actually, I think you could probably do very well by finding the smartest and most capable camgirls, and doing some kind of OnlyFans science accelerator setup.

Meno: Once you’re doing sex work to fund your research, maybe just teach instead?

Salviati: Camgirls can make like $100,000 a month.

Meno: Really??

Salviati: It’s true, though only the most successful. More usual is around $40/hour, but even that’s quite a good wage. And camgirls, or successful sex workers more generally, might be expected to be especially gifted scientists.  Camgirls are not the only profession that this would work for, of course. Any job with good pay and flexibility could do the same. But camgirls are notable for the flexibility of their profession, and their abilities are especially likely to be underestimated by society at large. You’re selecting heavily for people who are open to doing something rather unorthodox, and this kind of free-thinking is essential to the scientist. Further, by going to the most successful camgirls, you’d be selecting people who have already demonstrated ability to set up what is essentially a very successful small business.

Meno: I gotta get into that.

Salviati: If you try, I wouldn’t hold it against you.

Meno: You first.

Salviati: Most of them, of course, do not make that much. But also most people do not win major international scholarships, as you have, my friend.

Meno: And only I shall do both.

Salviati: The noted camgirl / philosopher / hetaira known as Aella, a sort of latter-day Phryne or Aspasia, has collected much data that may be relevant. Here she finds that on OnlyFans, those with a college degree make more money than those without — and those with a master’s degree or higher do even better. Note that the bars are by percent ranking — as in, who is in the top 5% of people on the site? — and so a shorter bar means more success, and more money.

Salviati: Here she finds that while most people on OnlyFans do not make much money, the most successful take in several thousand dollars per month. She herself has made $100,000 in a month on at least one occasion. 

Salviati: She says “the big change is around 0.7-0.8%”, and with around a million people on OnlyFans, that is about seven or eight thousand camgirls making a small fortune every month. If only one in a thousand of these women also have the ability to be great scholars, then there are several of them; and I would wager, it is more than one in a thousand. 

[Images: van Leeuwenhoek and Aella in exactly the same pose, and very nearly the same outfit.] Ok, has anyone seen both of them at the same party?

Meno: I suppose the first step is to get hot, is it not?

Salviati: If you want to become a successful cammer, Meno, I can hardly stand in your way, but this is not what I had in mind. I was thinking you could recruit talented camgirls, bring them together, and offer them scientific training. They’re fully-funded, having no need of patronage — they have patronage already. 

Meno: In exchange for them funding your research?

Salviati: No, sort of as a science accelerator.

Meno: What do you mean?

Salviati: Well, would you say that society overvalues or undervalues the scientific ability of camgirls?

Meno: Society undervalues their scientific ability, certainly.

Salviati: I could hardly agree more. Society clearly undervalues their ability. When society misses ability in this way, it means there is unappreciated value waiting to be created. There’s value just lying around. The greatest amount of untapped value belongs to the camgirls, of course, and it would be their place to gain from it. But when something is undervalued, it also means that there is a space for those who, unlike the rest of society, recognize skill and virtue when they see it, and invest in it when no one else will. You and I have little money, but we do have the benefit of our skills, experience, and scientific training that is unavailable to many. We could — for example — go to top camgirls who also seem gifted and capable, especially those with science backgrounds. We could connect them with camgirls of similar interests and abilities and, for a modest fee, offer them whatever scientific training we can give, and begin collaborations with them. The camgirls could do it themselves, naturally, but someone would need to coordinate this and it could equally well be us, since we could also offer them some training (especially statistics training). Together you could get a house or two in a remote area with good internet. They can keep camming to make money — van Leeuwenhoek didn’t quit his job as a cloth merchant! — and then you all collaborate to cure cancer or something.

Meno: This sounds like more work and less payoff than getting an academic job.

Salviati: Look, you’re the one who called patrons “science sugar daddies.” I’m just agreeing with you.

Meno: Hahaha – I appreciate the idea. It’s important to ideate on ways to reach the goal, which is sufficient money and freedom to pursue science.

Salviati: Yes

Meno: There’s no guarantee that an academic job provides the best tradeoffs, but it does seem to provide a reasonable standard to beat.

Salviati: I just wonder if the kind of guy who would pay a woman $20/month to look at her nudes wouldn’t pay her $100/month to look at her nudes AND support her research. There will be some people who will pay to see nudes, and some who will pay to support their favorite scientist or artist. There will also be people who do both, and I wonder if they won’t put up even more money than they would in either case alone. To put it in stupid stats terms, I look at Patreon and OnlyFans and wonder if there might be a significant interaction.

Meno: I think you may be optimistic about the scientific skill of camgirls. I’m sure they’re better scientists than people might expect, but good scientists are rare.

Salviati: All that needs to happen is for it to be undervalued, I think, and for there to be a lot of them, which there are. I guess I’m a big believer in the effect of what can be called “scenes”. If there are many untapped geniuses, and we bring them together, what they produce together will be much greater than what they could ever produce apart. Paul Graham at one point wrote about what he calls the case of the Milanese Leonardo. He says, “Practically every fifteenth century Italian painter you’ve heard of was from Florence, even though Milan was just as big. People in Florence weren’t genetically different, so you have to assume there was someone born in Milan with as much natural ability as Leonardo. What happened to him?” Or we can consider “The Martians”, the nickname for a group of scientists from Hungary. They all grew up in the same parts of Budapest, went to the same high schools, and eventually went to America where among other things they led the Manhattan Project.

Meno: I’m not sure I see why they only have to be undervalued in science, rather than good at science.

Salviati: That might be the weak link. It depends on how seriously we take the “Milanese Leonardo” argument. Now, you and I agree that the two of us have at least some chance to make a great discovery. Where does that promise come from in us? Are you and I in the top .01% in terms of natural ability? Or in the top 5% plus our connections, the scenes we are a part of?

Meno: Regardless, being near the top is what matters, right? If I think someone’s IQ is 70 but it’s actually 80, I’ve undervalued them but they’re not going to produce good science.

Salviati: You’re right that it’s not strictly about undervaluation. But there are thousands of sex workers, in fact hundreds of thousands. Not everyone has it in them to be a great scientist, but a great scientist can come from anywhere. If we believe that camgirls are undervalued for their scientific skills — a safe assumption — then there are likely camgirls with the potential to be great scientists, and who society has missed. Another PhD student I know recently told me (though I omit their name in case they are hesitant), “sex workers know more about the human condition than any psychologist, change my mind” … Oh yeah that was after I said “Time to start my own journal, with blackjack and hookers”

Meno: I think I’m going to do the postdoc I was recently offered, rather than try to turn thousands of sex workers into scientists, but I see the appeal.

[Crowd boos]

Salviati: This seems more like a good opportunity for sex workers who want to be van Leeuwenhoek than it does for us. But I would be happy to consult for sexy van Leeuwenhoek, so maybe there is some profit to be made on our side.

Meno: The real key… is to become the sexy van Leeuwenhoek.

Salviati: Sadly I am not sexy enough. But I want to believe the sexy van Leeuwenhoek is out there.

FIN

Links for April 2021

“THOUGH TOMMY IS A MALE SQUIRREL HE HAS TO WEAR FEMININE CLOTHES BECAUSE TAIL INTERFERES WITH HIS WEARING PANTS,” Life Magazine reported defensively in 1944. “The Bullis family … took him on the road in their Packard automobile, where he … gave uninspiring radio interviews. … At the height of his fame his fan club numbered 30,000 members. … When he died in 1949 he was stuffed and mounted … and his nightmarish fate pursued him even into the grave.” Apparently the interview was along with FDR, so perhaps it was more entertaining than they say! Unfortunately we can’t find it. We also learned that in his journeys Tommy was “accompanied by a bulldog that had gold teeth and wore a fez. ‘He may very well have gotten the dog stuffed, too,’ Jim said. If so, its whereabouts are unknown.” The article also calls avocado “avocado pear” — truly the past is a foreign country.

In case you were wondering: yes, mice CAN hallucinate, and they hallucinate more than average when you give them ketamine. Apparently this is the kind of finding you need to get published in Science these days.

All you need to do to get into Nature, on the other hand, is brew cannabis using hacked beer yeast. The resulting beer is probably illegal but we’re guessing the yeast itself is not. Can you make sourdough with this stuff? What are the limits on this discovery? ARE there any limits??? 

The kids are more than all right, they’re playing 5-Dimensional Chess With Multiverse Time Travel. If you’re not afraid of the next generation yet, you should be.

Totally unrelated, Kelsey Piper at Vox argues compellingly that voting rights should be extended to young people regardless of age. How young? To “every American citizen who can successfully fill out a ballot.” We’ve made similar arguments about age in the past, but this is even more radical, and we approve. 

In 1997 Jim Kardach of Intel was reading Frans G. Bengtsson’s historical novel about 10th-century vikings The Long Ships in between working on wireless data transfer. When it came time to name the technology, he named it after King Harald Bluetooth and invented a logo based on the runes for H and B. Similarly, the peace symbol was originally for Nuclear Disarmament and is a combination of the flag semaphore characters for N and D.  

Also at the intersection of new and very old is the 🧿 Nazar Amulet Emoji, representing an eye-shaped amulet believed to protect against the evil eye.

You remember all those stories about the vibrator being invented to treat hysteria? Probably outright lies, invented in the ‘90s. The NYT reports: Everything You Know About the Invention of the Vibrator Is Wrong 

Werner Herzog interviewed by Jenkem, a skateboarding magazine. “I am puzzled because I am not familiar with the scene of skateboarding,” he says. “At the same time, I had the feeling that, ‘Yes, that is kind of my people.’ … You have to accept trial and error. And I see them doing a certain jump or trying to slide on a metal rail, and they do it 25 times and fail. The 26th time, they fail. The 30th time, they fail. It’s good that you accept failure and you don’t give up, and finally you land the right jump and you keep sliding and screeching down a handrail.”

Did you hear about the garden gnome shortage?

In other international news, a Russian man ‘trapped’ on Chinese reality TV show has finally been voted off, and the story is truly pitiable. “His lack of enthusiasm played out in half-hearted singing, rapping and dancing alongside the other, more eager contestants. … he urged the public to vote him out, saying he did not want to be among the 11 winners of the show, who are contractually obliged to form a boy band. ‘Don’t love me, you’ll get no results,’ he said on one episode. But viewers took to his dour persona and kept him in the running for nearly three months. … Fans, some earnest and some ironic, dubbed him ‘the most miserable wage slave’ and celebrated him as an icon of ‘Sang culture’, a popular concept among Chinese millennials referring to a defeatist attitude towards everyday life. ‘Don’t let him quit,’ one viewer commented on a video of a dejected-looking Mr Ivanov performing a Russian rap. ‘Sisters, vote for him! Let him 996!’ another fan commented, using the Chinese slang for the gruelling work schedule that afflicts many young employees, especially in digital start-ups.”

Sometimes it feels like everything has already been discovered, like there’s no possibility of progress in the modern world. But progress is alive and well, at least in the realm of pro NES Tetris, where a new technique has been discovered that is faster than Hypertapping!

Peer review doesn’t always work great. But you know what does work great, all the time, with definitely no problems ever? That’s right — Tinder! This is the pitch for papr.io, which describes itself as, “Tinder, but for papers & preprints.” We can’t tell if this is brilliant or moronic. 

Higher than the Shoulders of Giants; Or, a Scientist’s History of Drugs


I. 

The United States used to introduce new constitutional amendments all the time. But after the 26th Amendment in 1971, we stopped coming up with new amendments and haven’t added any since. (The 27th Amendment doesn’t really count — while it was ratified in 1992, it was proposed all the way back in 1789. It’s also only one sentence long and really boring.)

Global GDP used to grow faster and faster all the time — the time it took the global economy to double in size showed a pretty clear linear trend. This was the rule until about 1960-1980, when economic growth suddenly stagnated. Global GDP is still going up, but it’s now growing at a more or less constant rate, instead of accelerating. 
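To make the distinction concrete: a constant growth rate implies a constant doubling time, while accelerating growth means each successive doubling takes less time. This is just standard rule-of-thumb arithmetic, not a figure from the sources above — a minimal sketch:

```python
import math

def doubling_time(annual_growth_rate):
    """Years for an economy to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# At a steady ~3% per year, world GDP doubles roughly every 23 years,
# and keeps doubling every ~23 years -- constant, not accelerating:
print(round(doubling_time(0.03)))  # 23
```

Under the pre-1960 regime, each doubling arrived faster than the last; under the post-1970 regime, the doublings just keep arriving on the same schedule.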

Productivity and hourly wages used to be tightly linked — if you created more value for your employer, they were willing to pay you more. However, around 1970, these two trends suddenly decoupled. You may have seen graphs like this: 

There used to be less than 1 lawyer per 1000 Americans, though that number was slowly increasing. That is, until about 1971, when it suddenly shot up. Now there are about 4 lawyers for every 1000 Americans. In some parts of the country, the ratio can be as high as 10 per 1000. This is (unsurprisingly) true in New York but also unexpectedly true in our home state of Vermont, which has 5.8 lawyers per 1000 people. It’s ok though, I hear they can’t enter your home unless you invite them in. 

It used to be that about 100 out of every 100,000 people in the population were in prison. That is, until about 1971, when that rate started climbing. Now about 700 out of every 100,000 Americans are incarcerated.

There are even signs that scientific progress has been slowing down since — you guessed it! — about 1970 (see also this paper). 

This is only a small selection of the many things that seem to have gone terribly wrong since about 1970. For a more complete picture, check out the excellent Wake Up, You’ve Been Asleep for 50 Years and WTF Happened In 1971?, which are our sources for most of the trends described above. 

So yeah, what the F did happen in the early 1970s? When dozens of unexplained trends all start in the same year, it seems like more than coincidence — you start wondering if there might be a monocausal event.

“The break point in America is exactly 1973,” says economist Tyler Cowen, “and we don’t know why this is the case.” One possible culprit is the 1973 oil embargo, because many of these trends have to do with energy. But Cowen doesn’t think this holds water. “Since that time, the price of oil in real terms has fallen a great deal,” he says, “and productivity has not bounded back.” 

Another possible culprit is the US going off the gold standard in 1971, part of the set of measures known as the Nixon shock (also the name of our new Heavy Metal band). This makes some sense because many of these trends have to do with the economy. But it’s not clear if this is a good explanation either, as many of these trends seem to be global, and most of the world is not on the US dollar. The US is admittedly a pretty big deal, but we’re not the only economy in the world.

But it’s also possible that all this comes from a different policy that Nixon signed into law the year before: the 1970 Controlled Substances Act.

II. 

The early history of coffee is shrouded in mystery. Legends of its discovery date as far back as the 9th century CE, but whenever it was discovered, it’s clear that it came from Africa and had reached the Middle East by 1400. The first coffeehouse in Istanbul opened around 1554, and word of coffee began reaching Europe in the middle 1500s. Even so, it took Europeans about a hundred more years to really take note — the first coffeehouse in Christendom didn’t open until 1645, when one popped up in Venice.

Only five years later, in 1650, the first coffeehouse in England opened in Oxford. There is nothing new under the sun, so unsurprisingly it was very popular with students and intellectuals. Early patrons included Christopher Wren and John Evelyn, and later additions included Hans Sloane, Edmund Halley, and Isaac Newton, who according to some stories, “are said to have dissected a dolphin on a table in the coffeehouse before an amazed audience.” Coffee is a hell of a drug. 

The first coffeehouse in London opened in 1652 in St. Michael’s Alley in Cornhill, operated by a Greek or Armenian (“a Ragusan youth”) man named Pasqua Rosée. The coffeehouse seems to have been named after Rosée as well, and used him as its logo — one friend who wrote him a poem addressed the verses, “To Pasqua Rosée, at the Sign of his own Head and half his Body in St. Michael’s Alley, next the first Coffee-Tent in London.”

The Royal Society, the oldest national scientific institution in the world, was founded in London on 28 November 1660. The founding took place at the original site of Gresham College, which as far as we can tell from Google Maps, was a mere three blocks from Rosée’s coffeehouse. Some accounts say that their preferred coffeehouse was in Devereux Court, though, which is strange as that is quite a bit further away. But this may be because Rosée’s coffeehouse was destroyed in the Great Fire of 1666.

In 1661, Robert Boyle published The Sceptical Chymist, which argues that matter is made up of tiny corpuscles, providing the foundations of modern chemistry. In 1665, Robert Hooke published Micrographia, full of spectacularly detailed illustrations of insects and plants as viewed through a microscope, which was the first scientific best-seller and coined the biological term cell. By 1675, there were more than 3,000 coffeehouses in England. In 1687, Newton published his Principia.

As the popular 1667 broadside News from the Coffe House put it: 

So great a Universitie
I think there ne’re was any;
In which you may a Schoolar be
For spending of a Penny.

This trend continued into the following centuries. As just one example, Voltaire (1694-1778) reportedly consumed a huge amount of coffee per day. No, REALLY huge. Most sources seem to suggest 40 to 50 cups, but The New York Times has it as “more than 50 cups a day.” Perhaps the cups were very small. Wikipedia says “50-72 times per day”, but we can’t tell where they got these numbers. I ask you, what kind of drugs would this man be on, if he were alive today?

Do we really think this mild stimulant could be responsible for the Scientific Revolution? Well to be entirely clear, we aren’t the first ones to make this argument. Here’s a Huffington Post article reviewing several books and essays on the same idea, including one by Malcolm Gladwell. And in Weinberg and Bealer’s The World of Caffeine, the authors tell us that the members of the Royal Society, “had something in common with Timothy Leary, the Harvard professor who experimented with LSD, in that they were dabbling in the use of a new and powerful drug unlike anything their countrymen had ever seen. Surviving recorded accounts confirm that the heavily reboiled sediment-ridden coffee of the day was not enjoyed for its taste, but was consumed exclusively for its pharmacological benefits.”

Today we tend to take coffee in stride, but this stimulant didn’t seem so mild at the time. In 1675, King Charles II briefly banned coffeehouses in London, claiming they had “very evil and dangerous effects.” We don’t know the exact details of the public response, but it was so negative that the king changed his mind after only eleven days! Ten years later, coffee houses were yielding so much tax revenue to the crown that banning them became totally out of the question. 

Merchants panicked over an imagined danger to the economy, one writing, “The growth of coffee-houses has greatly hindered the sale of oats, malt, wheat, and other home products. Our farmers are being ruined because they cannot sell their grain; and with them the landowners, because they can no longer collect their rents.” The owner of the second coffeehouse in London, James Farr, was prosecuted by his neighbors in 1657, “for making and selling a sort of liquor called coffe, as a great nuisance and prejudice to the neighborhood, etc.”

On the less official side of things, the 1674 anonymous WOMEN’S PETITION AGAINST COFFEE REPRESENTING TO PUBLICK CONSIDERATION THE Grand INCONVENIENCIES accruing to their SEX from the Excessive Use of that Drying, Enfeebling LIQUOR (which possibly deserves to be read in full, if only for the 1674 use of “cuckol’d” and “dildo’s”) declared, among other things:

Never did Men wear greater Breeches, or carry less in them of any Mettle whatsoever. There was a glorious Dispensation (’twas surely in the Golden Age) when Lusty Ladds of seven or eight hundred years old, Got Sons and Daughters; and we have read, how a Prince of Spain was forced to make a Law, that Men should not Repeat the Grand Kindness to their Wives, above NINE times in a night … the Excessive Use of that Newfangled, Abominable, Heathenish Liquor called COFFEE …has…Eunucht our Husbands, and Crippled our more kind Gallants, that they are become as Impotent, as Age, and as unfruitful as those Desarts [sic] whence that unhappy Berry is said to be brought.

It’s not like these concerns disappeared as people got used to it. As late as the early 1900s, physicians were still raving about the dangers of this terrible drug. As the wonderful (and sadly defunct) site History House reports:

In the spectacularly titled Morphinism and Narcomanias from Other Drugs (1902), one T. D. Crothers, M.D. tells a few tales of delirium induced by coffee consumption. He also remarks, not unlike analogies to marijuana made by current drug crusaders, that, “Often coffee drinkers, finding the drug to be unpleasant, turn to other narcotics, of which opium and alcohol are the most common.” Similarly, in A System of Medicine (1909), edited by the comically degreed Sir T. Clifford Allbutt (K.C.B., M.A., M.D., LL.D., D. Se., F.R.C.P., F.R.S., F.L.S., F.S.A., Regius Professor of Physic [Internal medicine] in the University of Cambridge), some contributors announce their distaste for caffeine: “We have seen several well-marked cases of coffee excess… the sufferer is tremulous, and loses his self-command… the speech may become vague and weak. By miseries such as these, the best years of life may be Spoilt.”

High doses of caffeine cause odd behavior in test animals. Rats will bite themselves enough to die from blood loss, prompting Consumers Union to observe, “Some readers may here be moved to protest that the bizarre behavior of rats fed massive doses of caffeine is irrelevant to the problems of human coffee drinkers, who are not very likely to bite themselves to death.”

Neither did the science-coffee connection disappear with Newton and Hooke. Researchers still consume more coffee than any other profession. The mathematician Alfréd Rényi quipped, “A mathematician is a machine for turning coffee into theorems,” and he and his colleagues, including Paul Erdős, drank copious amounts. At one point, when trying to explain why Hungary produces so many mathematicians, one of the reasons Erdős gave was, “in Hungary, many mathematicians drink strong coffee … At the mathematical institute they make particularly good coffee.” 

The first webcam, great ancestor to all those Zoom calls you’ve been having, was developed by University of Cambridge computer scientists so they could watch the coffee pot without having to leave their desks.  

And while it’s very popular, coffee isn’t the only way to get your sweet, sweet caffeine fix. Consider the connection between Tea and the British Empire. [cue Rule, Britannia!]

Hey Jared we have a piping hot tip for you

Caffeine in one form or another continued to be the stimulant of choice until the middle of the 19th century, when the Germans made an even more exciting discovery.

III. 

When the Spanish arrived in South America, they noticed that some of the natives had the refreshing habit of chewing on the leaves of a local plant, “which make them go as they were out of their wittes.” At first the Spaniards were concerned but then they realized it was pretty great, and started using it themselves — for medicinal purposes, of course. 

Even so, chemistry was not fully developed in the 1600s (they needed to wait for the coffee to hit), so despite many attempts it took until 1855 for the active ingredient to be purified from coca leaves. This feat was accomplished by a German named Friedrich Georg Carl Gaedcke. With this success, another German chemist (Friedrich Wöhler) asked a German doctor who happened to be going on a round-the-world trip (Carl Scherzer) to bring him back more of these wonderful leaves. The doctor came back a few years later with a trunk full of them, which the second chemist passed on to yet a third German chemist, Albert Niemann, who developed a better way of purifying the new substance, which he published as his dissertation. (Sadly he never got to enjoy the substance himself, as he discovered mustard gas the same year and died the year after that, probably from working too closely with mustard gas.)

And with this series of developments, pure cocaine was injected directly into the German nervous system.

A typical example of the effects of cocaine on the German scientific body can be found in a man you might have heard of — Sigmund Freud, who has the same birthday as one of the authors. Having recently moved on from his earlier interest in trying to find the testicles and/or ovaries of eels (don’t laugh, it was a major scientific question of the day!), he found himself VERY EXCITED by the possibilities of this new treatment, which had just become available to physicians.

“Woe to you, my Princess, when I come,” wrote Sigmund Freud to his future wife, Martha Bernays, on June 2, 1884. “I will kiss you quite red and feed you till you are plump. And if you are forward, you shall see who is the stronger, a gentle little girl who doesn’t eat enough, or a big wild man who has cocaine in his body. In my last serious depression I took cocaine again and a small dose lifted me to the heights in a wonderful fashion. I am just now collecting the literature for a song of praise to this magical substance.”

He didn’t just use cocaine to intimidate (???) his fiancée, though. Freud also found that it had professional applications. “So I gave my lecture yesterday,” he wrote in a letter a few months earlier, “Despite lack of preparation, I spoke quite well and without hesitation, which I ascribe to the cocaine I had taken beforehand. I told about my discoveries in brain anatomy, all very difficult things that the audience certainly didn’t understand, but all that matters is that they get the impression that I understand it.” We see that not much has changed since the 1880s. 

Freud wasn’t the only one who was excited by this new discovery, of course. Only two years later, a bedridden Robert Louis Stevenson wrote The Strange Case of Dr Jekyll and Mr Hyde, a 30,000-word novella that he completed in about three days. Many accounts suggest that Stevenson was high on cocaine during this brief, incredibly productive period, possibly recreationally, or possibly because it was simply part of the medicine he was taking. This claim is somewhat contested, but we’re inclined to believe it — you try writing 30,000 words in three days, by hand, while bedridden, without the help of a rather good stimulant.

One Italian, Paolo Mantegazza, was so enthusiastic about the new substance that he actually developed a purification process of his own in 1859. Over the next several decades, he founded the first Museum of Anthropology in Italy, served in the Italian parliament, published a 1,200-page volume of his philosophical and social views, at least three novels, and several scientific books and papers (this paper from 2008 claims that he founded the field of sexual medicine), including one in which he wrote:

“I sneered at the poor mortals condemned to live in this valley of tears while I, carried on the wings of two leaves of coca, went flying through the spaces of 77,438 words, each one more splendid than the one before. An hour later, I was sufficiently calm to write these words in a steady hand: God is unjust because he made man incapable of sustaining the effect of coca all lifelong. I would rather have a life span of ten years with coca than one of 10,000,000,000,000,000,000,000 centuries without coca.”

We should note that while Mantegazza was very productive in these decades, he was also a vivisectionist and a racist. Clearly not everyone should have access to cocaine of this quality.

A different Italian looked at cocaine and saw not poor mortals condemned to live in the valley of tears, but economic opportunity. He happened to read a paper by Mantegazza on the substance, and was inspired. This man was Angelo Mariani, and in 1863 he “invented” cocawine, by which we mean he put cocaine in wine and then sold it. 

Apparently this was more than just a good idea. Cocaine.org, a reputable source if ever we’ve seen one, tells us, “If cocaine is consumed on its own, it yields two principal metabolites, ecgonine methyl ester and benzoyleconine. Neither compound has any discernible psychoactive effect. Cocaine co-administered with alcohol, however, yields a potent psychoactive metabolite, cocaethylene. Cocaethylene is very rewarding agent in its own right. Cocaethylene is formed in the liver by the replacement of the methyl ester of cocaine by the ethyl ester. It blocks the dopamine transporter and induces euphoria. Hence coca wine drinkers are effectively consuming three reinforcing drugs rather than one.”

Mariani is notable less for taking cocaine himself, and more for being possibly the most influential drug pusher of all time. His enticing product, called Vin Mariani, soon became a favorite of the rich, powerful, and highly productive, unleashing the creative potential of cocaine on the world.

YOU CAN TELL SHE’S VERY EXCITED ABOUT THIS WINE FOR SOME REASON

A good catalogue of its influence can be found in the literally thousands of celebrity endorsements it received, and which were proudly displayed in its ads. “Testimonials from eminent personages were so numerous that Mariani, as great a public relations man as he was a chemist, published them in handsome leather-bound volumes—replete with portraits and biographical sketches of the endorsers.” Many of these names and endorsements seem to have been lost to time, but here are a few you might recognize. 

Presumably you have heard of the Pope. Pope Leo XIII and Pope Pius X both enjoyed Vin Mariani, and Pope Leo XIII liked it so much that he often carried a hip flask of the wine. He even awarded Mariani a Vatican Gold Medal, “to testify again in a special manner his gratitude.” He also appeared on a poster advertisement endorsing the wine, and later called Mariani a “benefactor of humanity”. AP news reports that the chief rabbi of France liked it too. 

Sarah Bernhardt, famous actress and subject of the most entertaining Wikipedia entry of all time, said, “My health and vitality I owe to Vin Mariani. When at times unable to proceed, a few drops give me new life.” Jules Verne, one of the fathers of science fiction, wrote, “Vin Mariani, the wonderful tonic wine, has the effect of prolonging life.” Frédéric Auguste Bartholdi, who you will know as the sculptor of the Statue of Liberty, wrote, “this precious wine will give me the strength to carry out certain other projects already formed.” Alexandre Dumas is said to have enjoyed it as well, but we can’t find a quote. 

In 1892, Thomas Edison contributed the almost maddeningly vague note, “Monsieur Mariani, I take pleasure in sending you one of my photographs for publication in your album.” Edison was already quite famous by this point, and it’s not clear how long he had been enjoying the effects of Vin Mariani, but we can make an educated guess. 

Vin Mariani was invented in 1863, and we know that by 1868, Edison had a reputation for working “at all hours, night or day”. His famous Menlo Park lab was built in 1876, and soon began producing inventions at a steady rate — the phonograph in 1877, his work on electric lights about 1880, motion picture devices in 1891, and so on. 

In 1887, one writer noted, “he scarcely sleeps at all, and is equally as irregular concerning his eating”. The same account quotes a “co-laborer” of Edison’s as saying, “he averaged eighteen hours [of work] a day. … I have worked with him for three consecutive months, all day and all night, except catching a little sleep between six and nine o’clock in the morning.” In 1889, when he was 42, he told Scientific American that he slept no more than four hours a night. Given that we know he enjoyed Vin Mariani, we think this is good evidence of just how much he must have been drinking. 

Mariani claimed to have collected over four thousand such endorsements from various celebrities. It’s only natural that he also collected endorsements from physicians. In one of his ads, he trots out the following: “In cases of morphinomania, Dr. Dujardin-Beaumetz has pointed out the advantage to be obtained with the Vin Mariani, and following him, Dr. Palmer, of Louisville, and Dr. Sigmaux Treaux [sic] of Vienna, have obtained excellent results with this therapeutic agent.” Yes, you saw that right — that last name there is a botched attempt to spell “Dr. Sigmund Freud”. Maybe Mariani was high on his own supply after all.

While Mariani deserves credit as the man who got cocaine to the masses, the Germans were the ones who first purified the cocaine, and the ones who undoubtedly put it to the best scientific and medical use.

[content warning for the next several paragraphs: descriptions of 19th-century medical experimentation]

It’s easy for a modern person to miss the fact that aside from alcohol and getting held down by surgical assistants, there were few anaesthetics at this point in history. Laughing gas (nitrous oxide) was discovered in 1776, but the Americans took a long time to figure out that it could be used for anything other than killing animals and getting high, and were still struggling with the idea that it might have medical applications. 

Furthermore, laughing gas is a general anaesthetic, not a local anaesthetic, and a weak one at that. It was totally unsuitable for delicate operations like eye surgery. 

People had already noticed that a dose of cocaine will numb your nose, lips, or tongue. Even so, it took the combined powers of Sigmaux Treaux (that is, Sigmund Freud) and his friend Karl Koller, an ophthalmology intern, to make this breakthrough. Koller was interested in finding a local anaesthetic for eye surgery, and he had already tried putting various chemicals, including morphine, into the eyes of laboratory animals, with no success. Separately, Freud was convinced that cocaine had many undiscovered uses. So in 1884, when Freud left to go pay a visit to Martha, he left Koller some cocaine and encouraged him to experiment with it. 

While Freud was away, Koller made his discovery. Amazingly, in his papers Koller describes the exact moment when he made the connection:

Upon one occasion another colleague of mine, Dr. Engel, partook of some (cocaine) with me from the point of his penknife and remarked, “How that numbs the tongue.” I said, “Yes, that has been noticed by everyone that has eaten it.” And in the moment it flashed upon me that I was carrying in my pocket the local anesthetic for which I had searched some years earlier.

Dr. Gaertner, an assistant in the lab where Koller worked, continues the story in more detail:

One summer day in 1884, Dr. Koller, at that time a very young man … stepped into Professor Stricker’s laboratory, drew a small flask in which there was a trace of white powder from his pocket, and addressed me … in approximately the following words: “I hope, indeed I expect that this powder will anesthetize the eye.” 

“We’ll find out that right away”, I replied. A few grains of the substance were thereupon dissolved in a small quantity of distilled water, a large, lively frog was selected from the aquarium and held immobile in a cloth, and now a drop of the solution was trickled into one of the protruding eyes. At intervals of a few seconds the reflex of the cornea was tested by touching the eye with a needle… After about a minute came the great historic moment, I do not hesitate to designate it as such. The frog permitted his cornea to be touched and even injured without a trace of reflex action or attempt to protect himself, whereas the other eye responded with the usual reflex action to the slightest touch. The same tests were performed on a rabbit and a dog with equally good results. … 

Now it was necessary to go one step further and to repeat the experiment upon a human being. We trickled the solution under the upraised lids of each other’s eyes. Then we put a mirror before us, took a pin in hand, and tried to touch the cornea with its head. Almost simultaneously we could joyously assure ourselves, “I can’t feel a thing.” We could make a dent in the cornea without the slightest awareness of the touch, let alone any unpleasant sensation or reaction. With that the discovery of local anesthesia was completed. I rejoice that I was the first to congratulate Dr. Koller as a benefactor of mankind.

The final proof came on August 11, 1884, when Koller performed the first successful cocaine-aided cataract surgery. Koller was only 25 when he made this discovery, a Jewish medical student so poor that he had to ask a friend to present the findings for him, since he could not afford the train fare to go to the ophthalmology conference in Heidelberg that year. 

The finding was received with worldwide amazement and enthusiasm. “Within three months of this date,” says one paper, “every conceivable eye operation had been attempted using cocaine, in every part of the world.” The idea spread “not just into ophthalmology, but wherever mucous membranes required surgery—in gynecology, proctology, urology, and otolaryngology.”  Encyclopedia Britannica says that this finding “inaugurated the modern era of local anesthesia.”

In fact, cocaine got such an amazing reputation as a local anaesthetic that the suffix “-caine” was back-formed from the name and used to form the names of new local anaesthetics as they were discovered, like amylocaine, lidocaine, bupivacaine, prilocaine, and procaine (aka novocaine).

[content warning: more descriptions of 19th-century medical experimentation]

As the technique developed further, people started using cocaine as an anaesthetic in spinal operations. The first was an American named James Leonard Corning, who also happened to be a big fan of Vin Mariani. In 1885, he performed a spinal injection of cocaine on a dog (why?), and found that this left the dog temporarily unable to use its legs. 

Encouraged by this finding, he soon decided to give a similar injection to a patient who had recently been referred to him for “addiction to masturbation”. Corning gave the man cocaine as a spinal injection of some sort (there is scholarly debate over what sort!). After 20 minutes, he noticed that “application of [a wire brush] to the penis and scrotum caused neither pain nor reflex contraction.” Whether this was a successful treatment for the unfortunate patient is not recorded.

A German surgeon named August Bier independently came up with the idea in 1898. He and his assistant August Hildebrandt performed the procedure on several patients as part of routine surgeries, until one day in August 1898, when for reasons that remain unclear, they decided to experiment on each other.

“Hildebrandt was not a surgeon and his ham-fisted attempts to push the large needle through Bier’s dura proved very painful,” begins one account, not at all what you would expect from the rather dry-sounding volume Regional Anaesthesia, Stimulation, and Ultrasound Techniques. It continues, “The syringe of cocaine and needle did not fit well together and a large volume of Bier’s cerebrospinal fluid leaked out and he started to suffer a headache shortly after the procedure.” Probably because of the flawed injection, Bier was not anaesthetized at all.

Bier of course was a surgeon, and so when it was his turn to give Hildebrandt the injection, he performed it flawlessly. Soon Hildebrandt was very anaesthetized. To test it, reports Regional Anaesthesia, “Bier pinched Hildebrandt with his fingernails, hit his legs with a hammer, stubbed out a burning cigar on him, pulled out his pubic hair, and then firmly squeezed his testicles,” all to no effect. In a different account, this last step was described as “strong pressure and traction to the testicles”. They also pushed a large needle “in down to the thighbone without causing the slightest pain”, and tried “strong pinching of the nipples”, which could hardly be felt. They were thrilled. With apparently no bad blood over this series of trials, the two gentlemen celebrated that evening with wine and cigars, and woke up the next morning with the world’s biggest pair of headaches, which confined them to bed for 4 and 9 days, respectively. You can read the account in its thrilling original German here.

(Why genital flagellation has such a central role in the climax of both of these stories is anyone’s guess.)

Despite the wild tale of its discovery, this represented a major medical advance, one which made many new techniques and treatments possible. Spinal anaesthesia is now a common technique, used in everything from hip surgery to Caesarean sections. Soon Bier and others had developed various forms of regional anaesthesia, which made it safe to perform new and more delicate operations on the arms and legs.

A more prosaic discovery, but no less important, was made by Richard Willstätter in 1898. At the time there was some academic debate about the chemical structure of cocaine, with a couple of competing theories. Willstätter proved that they were both wrong, came up with the correct structure, and demonstrated it by synthesizing cocaine in the lab. This was not only the first artificial synthesis of cocaine, but, as far as we're aware, one of the first total syntheses of a complex natural product.

We’re tempted to wink and ask why he was so motivated to develop a synthetic cocaine, but we’ve looked through Willstätter’s autobiography, and he very clearly states at one point, “although I always possessed cocaine from my youth on, I never knew the temptation to experience its peculiar effects myself.” Maybe this was because by 1894 they had discovered that cocaine had some side effects (even the diehard Freud was off it by 1904), or maybe because he was a nice Jewish boy who wouldn’t mess around with that sort of thing (though Dr. Karl “pins-in-the-eyes” Koller was also Jewish). In any case, his early fame was closely related to the rise of cocaine, and he went on to win the 1915 Nobel Prize for Chemistry.

Just as England was the center of learning in the Enlightenment, Germany was the center of scientific advancement in the second half of the 19th century, especially in the natural sciences. Anyone who wanted to study biology, chemistry, or physics had to learn German, because that was the language all the best volumes and journals were printed in.

Around 1897, the great Spanish neuroscientist Santiago Ramón y Cajal wrote, “it must be admitted that Germany alone produces more new data than all the other nations combined when it comes to biology. … A knowledge of German is so essential that today there is probably not a single Italian, English, French, Russian, or Swedish investigator who is unable to read monographs published in it. And because the German work comes from a nation that may be viewed as the center of scientific production, it has the priceless advantage of containing extensive and timely historical and bibliographic information.”

“We can only speculate as to how twentieth century history would be different if the Germans had discovered marijuana instead of cocaine,” writes History House (they wrote about the history of drugs a lot, ok?).

This persisted until the two World Wars, when German scientific dominance ended. In a footnote to the 1923 edition of his book, Ramón y Cajal notes that other countries had begun, “competing with, and in many cases surpassing, the work of German universities, which for decades was incomparable.” 

One explanation is the obvious one: that the wars destroyed Germany’s ability to do good science. (Also they kicked out all the scientists who were Jewish, gay, communists, etc.) But another explanation is that America began to discover new drugs of her own. 

IV.

There were other drugs, of course, to fill the gap between German scientific dominance and the third drug revolution of the 1950s and ’60s. Cocaine had already become illegal in the United States in 1914, so people were on the lookout for alternative highs.

In contrast to his rival Edison, Nikola Tesla didn’t drink cocaine wine. Tesla didn’t smoke — he didn’t even take tea or coffee. “I myself eschew all stimulants,” he once told Liberty magazine in 1935. “I am convinced that within a century coffee, tea, and tobacco will be no longer in vogue.” Perhaps this was because of his amazing, and apparently substance-unaided, ability to visualize designs in his mind’s eye. Tesla said elsewhere that when he first designed a device, he would let it run in his head for a few weeks to see which parts would begin to wear out first.

Tesla did, however, LOVE to drink. “Alcohol … will still be used,” he said. “It is not a stimulant but a veritable elixir of life.” When Prohibition came around in the United States, Tesla did break the habit, but he wrote that the law would, “subject a citizen to suffering, danger and possible loss of life,” and suggested that damages from the resulting lawsuits against the government would soon exhaust the treasury. 

(And what was the worst of these vices according to Tesla, the one more dangerous than rum, tobacco, or coffee? Nothing less than chewing gum, “which, by exhaustion of the salivary glands, puts many a foolish victim into an early grave.”)

Obviously Tesla was wrong about the cost of reparations from Prohibition. But is it a coincidence that Prohibition was the law of the land for the decade running up to the Great Depression? Was it a coincidence that the Great Depression began to turn around in March 1933, the same month that President Roosevelt signed the first law beginning the reversal of Prohibition? Probably it is, but you have to admit, it fits our case surprisingly well. 

While alcohol is a depressant, perhaps it stimulates the curious spirit in some number of our fellow creatures, as it seems to have done for Tesla. Again from History House:

Washington’s taste for Madeira wine shows up [in his accounts] with mindnumbing regularity: from September 1775 to March 1776, Washington spent over six thousand dollars on booze. … Revolutionary War-era persons drank a phenomenal amount. We have here an account of a gentleman’s average consumption: “Given cider and punch for lunch; rum and brandy before dinner; punch, Madeira, port and sherry at dinner; punch and liqueurs with the ladies; and wine, spirit and punch till bedtime, all in punchbowls big enough for a goose to swim in.”

The other drug as old as time has also been associated with scientific productivity. One contributor to the 1971 book Marihuana Reconsidered, who wrote under the pseudonym “Mr. X”, said that he often enjoyed cannabis, found that it improved his appreciation for art, and even made him a better scientist. In the late ‘90s, after his death, Mr. X was revealed to be Carl Sagan. On the topic of his professional skills, he said: 

What about my own scientific work? While I find a curious disinclination to think of my professional concerns when high – the attractive intellectual adventures always seem to be in every other area – I have made a conscious effort to think of a few particularly difficult current problems in my field when high. It works, at least to a degree. I find I can bring to bear, for example, a range of relevant experimental facts which appear to be mutually inconsistent. So far, so good. At least the recall works. Then in trying to conceive of a way of reconciling the disparate facts, I was able to come up with a very bizarre possibility, one that I’m sure I would never have thought of down. I’ve written a paper which mentions this idea in passing. I think it’s very unlikely to be true, but it has consequences which are experimentally testable, which is the hallmark of an acceptable theory.

Marijuana doesn’t help everyone be a better scientist — some people just get paranoid, or just fall asleep. But it’s especially interesting that Sagan found it hallucinogenic, because the third drug revolution was all about hallucinogens. 

The history of hallucinogens is pretty weird, even by the standards of how weird drug history normally is. Hallucinogens are relatively common, and in theory we could have discovered them at any point in the past several thousand years. But aside from occasional mishaps involving ergot poisoning, hallucinogens didn’t play much of a role in human history until the middle of the 20th century. 

Like the coca plant, psilocybin mushrooms (“shrooms”) grow in the dirt and have been around forever. Unlike the coca plant, they grow all over the world, and have always been readily available. Indigenous groups have long used them in ceremonies and rituals, but they weren’t used as a recreational drug until 1955.

Europeans certainly had access to these shrooms for thousands of years, but the first well-documented report of psilocybin consumption in Europe was a case described in the London Medical and Physical Journal in 1799, of a man who picked Psilocybe semilanceata (“liberty cap”) mushrooms in London’s Green Park and had them for breakfast with his four children. First the youngest child, “was attacked with fits of immoderate laughter, nor could the threats of his father or mother refrain him.” Then the father, “was attacked with vertigo, and complained that every thing appeared black, then wholly disappeared.” Soon all of them were affected. The doctor who made the report didn’t see this as a potential good time, or a way to expand the mind — he refers to the effect as “deleterious”.

While it has been enjoyed by many people, we can’t find much evidence of mercantile, economic, or scientific discoveries associated with the use of shrooms. This may not be the drug’s fault, since it was banned so soon after being brought to popular attention. 

But there is one major cultural development linked to psilocybin. In his book Mycelium Running: How Mushrooms Can Help Save the World, Paul Stamets describes a discussion he had with Frank Herbert, author of Dune, in the 1980s. Herbert showed him a new method he had developed for growing mushrooms on newly-planted trees, which at the time everyone thought was impossible. They kept talking, and:

Frank went on to tell me that much of the premise of Dune — the magic spice (spores) that allowed the bending of space (tripping), the giant worms (maggots digesting mushrooms), the eyes of the Freman (the cerulean blue of Psilocybe mushrooms), the mysticism of the female spiritual warriors, the Bene Gesserits (influenced by tales of Maria Sabina and the sacred mushroom cults of Mexico) — came from his perception of the fungal life cycle, and his imagination was stimulated through his experiences with the use of magic mushrooms.

Dune is the best-selling science fiction novel of all time, winner of the Hugo and the very first Nebula award, and one of our personal favorites. Even if this were the only thing shrooms had inspired, it would be a pretty big deal.

The other major naturally-occurring hallucinogen seems to have had a wider impact, and has a laundry list of famous users and associated creations. This drug is mescaline, the active ingredient in peyote cactus. As with cocaine, the Germans were the first to discover mescaline, but unlike cocaine, they didn’t seem to do anything with it. Possibly this was because they thought of it as a poison. The chemist who first isolated it wrote, “mescaline is exclusively responsible for the major symptoms of peyote (mescal) poisoning.” Well, he was almost right.

The first recreational use of the drug we found was by Jean-Paul Sartre, who took mescaline in 1935. He had a bad trip, during which he hallucinated various sea creatures. When he came down, he found that the hallucinations persisted, though he didn’t seem to be very worried by this:

Yeah, after I took mescaline, I started seeing crabs around me all the time. They followed me in the streets, into class. I got used to them. I would wake up in the morning and say, “Good morning, my little ones, how did you sleep?” I would talk to them all the time. I would say, “O.K., guys, we’re going into class now, so we have to be still and quiet,” and they would be there, around my desk, absolutely still, until the bell rang.

[Interviewer asks: A lot of them?]

Actually, no, just three or four.

He was eventually treated for this by Jacques Lacan, who suggested the crabs represented loneliness. When he was feeling depressed, Sartre would instead get the “recurrent feeling, the delusion, that he was being pursued by a giant lobster, always just out of sight… perpetually about to arrive.”

This experience seems to have influenced Sartre’s work — for example, in his play “The Condemned of Altona,” one of the characters claims to communicate with people from the thirtieth century, who have become a race of crabs that sit in judgment of humanity. Is this a precursor to the Carcinization Meme?


Other authors have had similar experiences, except more positive, and without the crustaceans. Aldous Huxley took mescaline in 1953, and wrote his book The Doors of Perception about the experience. From then on he was a proponent of psychedelics, and they came to influence his final book, Island, published in 1962. Sadly the mescaline cannot be responsible for his most famous novel, Brave New World, because it was published decades earlier, in 1932. It also can’t be held responsible for his 1940 screenplay adaptation of Pride and Prejudice.

But mescaline clearly deserves some credit for Ken Kesey’s 1962 book, One Flew Over the Cuckoo’s Nest, and for Ken Kesey in general. Kesey was working as an orderly at a psych hospital and decided to make some money on the side by testing drugs for the CIA as part of Project MKUltra, which gave him both mescaline and LSD (we’ll get to this drug in a second, don’t you worry). The combination of these drugs and his job as an orderly led him to write One Flew Over the Cuckoo’s Nest, which was an instant smash hit — there was a play the next year, with Gene Wilder in a major role, and the film adaptation in 1975 won five Oscars.

Ken Kesey went on to basically invent modern drug culture, hippie culture, and Bay Area California. Ken Kesey and his drugs were also largely responsible for Jerry Garcia and the Grateful Dead, and thus indirectly responsible for the Ben & Jerry’s flavor Cherry Garcia, “the first ice cream named for a rock legend”.

Mescaline was also a force behind Philip K. Dick’s 1974 Hugo- and Nebula-nominated novel, Flow My Tears, The Policeman Said. In a letter that is more than a little reminiscent of the cocaine-driven Robert Louis Stevenson, he says:

At one point in the writing I wrote 140 pages in 48 hours. I have high hopes for this. It is the first really new thing I’ve done since EYE IN THE SKY. The change is due to a change that overtook me from having taken mescalin [sic], a very large dose that completely unhinged me. I had enormous insights behind the drug, all having to do with those whom I loved. Love. Will love.

If you want to REALLY understand this story, you probably have to read Dick’s undelivered speech, How to Build a Universe That Doesn’t Fall Apart Two Days Later. It doesn’t mention the mescaline but it certainly captures… something. 

Most of his other famous works — The Man in the High Castle, Do Androids Dream of Electric Sheep? (aka Blade Runner), We Can Remember It for You Wholesale (aka Total Recall), Minority Report, etc. — were written before this, and so probably were not affected by mescaline. That’s ok though, because we know that up to 1970 Dick was on amphetamines nearly full-time.

And finally of course there is the great king of the psychedelics, LSD, which started to become prominent around the same time. LSD was actually invented some decades earlier. It was first synthesized in 1938 by Swiss (but notably, German-speaking) chemist Albert Hofmann. He was looking for a new respiratory and circulatory stimulant, but when he tested the new chemical in lab animals, it showed none of the desired effect — though the animals did become “restless” — and was abandoned for five years. 

But Hofmann had a “peculiar presentiment” that there might be more to LSD than met the eye, and so in 1943 he synthesized some more. On April 19th, he arranged to take what he thought would be a tiny dose, in case the substance was poisonous, a mere 250 micrograms. Instead, he went on the mother of all trips, and had his famous bicycle ride home. Subsequent tests showed that a fifth of that original dose was sufficient to produce strong trips in lab assistants — LSD had arrived.

The inventor had no question about what his discovery meant, or what it was for. In a speech on his 100th (!!!) birthday, Hofmann said, “I think that in human evolution it has never been as necessary to have this substance LSD. It is just a tool to turn us into what we are supposed to be.” Okie dokie.

For a drug that got only a couple decades in the sun, LSD has a pretty impressive track record. Francis Crick, one of the people who discovered the structure of DNA, probably took LSD and may have been tripping when he was doing some of his DNA work, though this isn’t well-attested. Douglas Engelbart, inventor of the mouse and the guy who gave The Mother of All Demos, took LSD sometime in the early ’60s. Time magazine wrote approvingly of LSD’s ability to treat mental illnesses as early as 1955.

The Beatles were already extremely popular before they first took acid in 1965, but it clearly influenced their music from then on. This in turn influenced much of the music made in the second half of the 20th century. You may be surprised to learn that they took it for the first time by accident; to be more precise, someone dosed them without their consent. You see…

In the spring of 1965, John Lennon and George Harrison, along with their wives Cynthia Lennon and Patti Boyd, were having dinner over their dentist’s house when they were first “dosed” with LSD.

Dentist John Riley and his girlfriend, Cyndy Bury, had just served the group a great meal, and urged their distinguished guests to stay for coffee, which they reluctantly did…

Riley wanted to be the first person to turn on the Beatles to acid, so the couples finished their coffee, and then Riley told Lennon that the sugar cubes they used contained LSD, a powerful new drug with incredible hallucinogenic effects.

Lennon said, “How dare you fucking do this to us!”

As George remembered, “The dentist said something to John, and John turned to me and said, ‘We’ve had LSD.’ I just thought, ‘Well, what’s that? So what? Let’s go!'”

Eventually they escaped their dentist and ended up at George’s house. John “was beginning to reconsider his attitude toward acid,” in part because he was excited that “George’s house seemed to be just like a big submarine.”

Once they came down, John and George decided the other two Beatles needed to try LSD as well. “John and I had decided that Paul and Ringo had to have acid,” said George Harrison, “because we couldn’t relate to them any more. Not just on the one level, we couldn’t relate to them on any level, because acid had changed us so much.” 

This was easier said than done — Paul didn’t want to try it — but they threw a big house party with Peter Fonda, David Crosby, and various others where they all (except Paul) dropped acid, George fell in the swimming pool, they watched Cat Ballou (with a laugh track), they all got in the shower and passed around a guitar, normal party stuff. Paul didn’t take LSD that night but he took it shortly after, at which point he said it “explained the mystery of life.” The resulting insights helped form their next albums: Revolver, and of course, Sgt. Pepper’s Lonely Hearts Club Band.

The Beatles are just one example, of course. Pink Floyd, the Doors, Jefferson Airplane, and many other bands were all trying out LSD at around the same time. Bob Dylan took LSD (“Who smokes pot any more?” he asked in 1965) and he went on to win a Nobel Prize. The new drug influenced culture in many ways. The real question here is, who has dinner at their dentist’s house?

Another question is, why didn’t we discover how to use psychedelics earlier? Shrooms, at least, have been available for a long time. Why weren’t Leibniz, Galileo, and Shakespeare all tripping out of their minds?

We think there might be two reasons. Unlike stimulants, which have a pretty reliable effect, hallucinogens often have different effects on different people. And unlike stimulants, it seems you often have to use hallucinogens in just the right way in order to unlock their creative potential. Coffee or cocaine makes you more focused and more productive, even more creative, in the moment. But it’s very rare to be able to produce anything while high on psychedelics.

In an interview in 1960, Aldous Huxley said:

But you see (and this is the most significant thing about the experience), during the experience you’re really not interested in doing anything practical — even writing lyric poetry. If you were having a love affair with a woman, would you be interested in writing about it? Of course not. And during the experience you’re not particularly in words, because the experience transcends words and is quite inexpressible in terms of words. So the whole notion of conceptualizing what is happening seems very silly. After the event, it seems to me quite possible that it might be of great assistance: people would see the universe around them in a very different way and would be inspired, possibly, to write about it.

The same insight was discovered by the Beatles. “We found out very early,” said Ringo Starr, “that if you play it stoned or derelict in any way, it was really shitty music, so we would have the experiences and then bring that into the music later.”

LSD helped Doug Engelbart come up with the idea of the computer mouse, but he had the idea when he was down — the only thing he invented while actively tripping seems to have been a potty training tool.

Even CNN Business, the most unlikely of sources, says: “The last thing [a programmer should do] is take LSD and then code. It’s more subtle: ‘if you have issues in your life or anything, you’re going to think about them [while high], and think about them in a different perspective.’”

So much, so usual, right? “Drugs help you be creative” — you’ve heard this one before. By itself, it’s not very original as a thesis.

THEN CAME 1970

… and what can we say, but that science and the economy never recovered? 

The 1970 Controlled Substances Act invented five “schedules” or categories for regulating drugs. The most extreme level of regulation was Schedule I, for drugs that the feds decided had high potential for abuse, no accepted medical uses, and that were “not safe to use, even under medical supervision”. Into Schedule I went LSD, marijuana, mescaline, psilocybin, and many others. 

The next level of regulation was Schedule II, for drugs that the feds felt also had high potential for abuse, limited medical uses, and high risk of addiction. Into Schedule II went cocaine and amphetamines. 

Less exciting (for the most part) drugs went into Schedules III, IV, and V. 

Leaving out caffeine and alcohol was the only thing that spared us from total economic collapse. Small amounts of progress still trickle through; drugs continue to inspire humanity. This mostly happens with LSD, it seems, probably because the potential of that drug has not been as exhausted as the potential of cocaine and coffee. 

Steve Jobs famously took LSD in the early ’70s, just as the crackdown was revving up. “Taking LSD was a profound experience, one of the most important things in my life,” he said. “LSD shows you that there’s another side to the coin, and you can’t remember it when it wears off, but you know it. It reinforced my sense of what was important — creating great things instead of making money, putting things back into the stream of history and of human consciousness as much as I could.”

Bill Gates has been more coy about his relationship with acid, but when an interviewer for Playboy asked him, “ever take LSD?” he pretty much admitted it. “My errant youth ended a long time ago,” he said in response to the question. “There were things I did under the age of 25 that I ended up not doing subsequently.”

So it seems like LSD had a small role in the lead-up to both Apple and Microsoft. These aren’t just two large companies — these are the two largest publicly-traded companies in the world. Apple is so big that its market value equals almost 10% of the GDP of the United States (!!!), and about 7% of the value of the S&P 500. That is very big.

Economic growth is not objectively good by itself. But part of the question here is, “what happened to economic growth around 1970?” When the companies in the global #1 and #2 positions were both founded by people who used LSD, it makes you want to pay attention. It makes you wonder what Jeff Bezos, Larry Page, and Sergey Brin might have tried (though it might not be LSD).

It isn’t just the guys at the top, of course. In 2006, Cisco engineer Kevin Herbert told WIRED magazine that he “solved his toughest technical problems while tripping to drum solos by the Grateful Dead.” According to WIRED, Herbert had enough influence at Cisco that he was able to keep them from drug testing their employees. “When I’m on LSD and hearing something that’s pure rhythm,” says Herbert, “it takes me to another world and into another brain state where I’ve stopped thinking and started knowing.” We’re not sure where he is now, but he was still giving interviews advocating for LSD in 2008.

This is all business, but the impacts are not strictly economic. The big scientific breakthrough made on LSD after the drug shutdown of 1970 is perhaps the most important one of all: Kary Mullis’s invention of polymerase chain reaction (PCR) in 1983.

PCR is basically the foundational method of all modern biochemistry/biomedicine. The New York Times called it, “highly original and significant, virtually dividing biology into the two epochs of before PCR and after PCR.” The scientific community agrees, and Mullis was awarded the Nobel Prize in Chemistry in 1993 for his invention, only ten years after he originally demonstrated the procedure.

Everyone knew that Mullis was big into psychedelics. “I knew he was a good chemist because he’d been synthesizing hallucinogenic drugs at Berkeley,” said one of his colleagues. And Mullis himself makes it pretty clear that LSD deserves a lot of the credit for his discovery. “Would I have invented PCR if I hadn’t taken LSD? I seriously doubt it,” said Mullis. “I could sit on a DNA molecule and watch the polymers go by. I learnt that partly on psychedelic drugs.” If this is even partially true, most progress in bioscience in the past 40 years was made possible by LSD. It may also have inspired Jurassic Park.

(We also want to mention that Mullis was really weird. In addition to being a psychology and sociology denialist, HIV/AIDS denialist, and global warming denialist, he also claims he was visited by a fluorescent “standard extraterrestrial raccoon”, which spoke to him and called him “doctor”. Maybe this is because the first time he took acid, he took a dose of 1,000 micrograms, four times Hofmann’s original monster dose of 250 micrograms and about 10-20 times a normal dose. It really is possible to take too much LSD.)
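For what it’s worth, the dose arithmetic above is internally consistent. A quick sketch (note that the 50–100 microgram “normal dose” range is our assumption, a commonly cited figure rather than something stated here):

```python
# Sanity-check the dose figures from the passage above.
hofmann_dose_ug = 250    # Hofmann's accidental 1943 dose, in micrograms
mullis_dose_ug = 1_000   # Mullis's reported first dose
# Assumed typical recreational dose range (commonly cited, not from the text):
typical_low_ug, typical_high_ug = 50, 100

print(mullis_dose_ug / hofmann_dose_ug)  # 4.0 -- "four times Hofmann's original dose"
print(mullis_dose_ug / typical_high_ug)  # 10.0
print(mullis_dose_ug / typical_low_ug)   # 20.0 -- "about 10-20 times a normal dose"
```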

Drugs continue to influence culture as well, of course, but none of those impacts seem to be as big as the Beatles. Michael Cera is a good actor, but we don’t know if his taking mescaline on-camera for the film Crystal Fairy & the Magical Cactus counts as a major discovery. We do appreciate that they included a crab, however. 

V.

Some accounts of scientific progress suggest that it happens based on foundational technologies, sometimes called “General Purpose Technologies”. For example, Tyler Cowen and Ben Southwood say: 

A General Purpose Technology (GPT), quite simply, is a technological breakthrough that many other subsequent breakthroughs can build upon. So for instance one perspective sees “fossil fuels,” or perhaps “fossil fuels plus powerful machines,” as the core breakthroughs behind the Industrial Revolution. Earlier GPTs may have been language, fire, mathematics, and the printing press. Following the introduction of a GPT, there may be a period of radical growth and further additional innovations, as for instance fossil fuels lead to electrification, the automobile, radio and television, and so on. After some point, however, the potential novel applications of the new GPT may decline, and growth rates may decline too. After America electrified virtually all of the nation, for instance, the next advance in heating and lighting probably won’t be as significant. Airplanes were a big advance, but over the last several decades commercial airliners have not been improving very much.

… [An] alternate perspective sees general technological improvement, even in such minor ways as ‘tinkering’, as more fundamental to the Industrial Revolution – and progress since then – than any individual ‘general purpose’ breakthroughs. Or, if you like, the General Purpose Technology was not coal, but innovation itself.

So the foundational technologies driving innovation can be either literal technologies, new techniques and discoveries, or even perspectives like “innovation.”

When we cut off the supply and discovery of new drugs, it’s like outlawing the electric motor or the idea of a randomized controlled trial. Without drugs, modern people have stopped making scientific and economic progress. It’s not a dead stop, more like an awful crawl. You can get partway there by mixing Red Bull, alcohol, and sleep deprivation, but that only takes you so far.

There have been a few discoveries since 1970. But when we do develop new drugs, they get memory-holed. MDMA was originally discovered in 1912, but it didn’t start being used recreationally until about the mid-1970s. Because of this, it originally escaped the attention of the DEA, and for a while it remained legal. By 1985, the DEA had made sure it was criminalized.

Of course, people do still do drugs. But the question is who can do drugs, and who has access to them. When coffee was introduced, any student or lowlife in London could get a cup. Cocaine was more expensive, but doctors seem to have had relatively easy access, and Vin Mariani made the substance available to the masses. LSD has always been pretty cheap, and otherwise broke grad students seem to have had no trouble getting their hands on literally mindbending amounts. For a while, the CIA was paying people to take it!

Now that drugs are illegal, only a small percentage of the population really has reliable access to them — the rich and powerful. This is a problem because drugs only seem to unlock a great creative potential in a small number of people. “I don’t think there is any generalization one can make on this,” said Aldous Huxley. “Experience has shown that there’s an enormous variation in the way people respond to lysergic acid. Some people probably could get direct aesthetic inspiration for painting or poetry out of it. Others I don’t think could.” If we want drugs to help drive our economy and our scientific discovery, we need to make them widely available to everyone, so anyone who wants to can give them a try.

Not everyone needs drugs to have great breakthroughs. “I do not do drugs,” said Salvador Dalí, “I am drugs.” (Though Freud was one of his major influences, so drugs were in his lineage nonetheless.) Einstein doesn’t seem to have done drugs either, but like Dalí, he probably was drugs. 

But right now, we are losing the talent of people in whom drugs would unlock genius. A small number are still rich enough and privileged enough to both take drugs and get away with it. Anyone who has that potential, but who is currently too poor or too marginalized, will never get access to the drugs they need to change the world. Even the rich and well-connected may not be able to get the amount of drugs they need, or get them often enough, to finish their great works. Not everyone is Kary Mullis, able to synthesize their own LSD. Who knows what discoveries we have missed over the last 50 years.

We’ve heard a lot of moral and social arguments for legalizing drugs. Where are the scientific and economic arguments? Drugs are linked with great scientific productivity. Genome sequencing is the last big thing to happen in science, and it happened courtesy of LSD.

Drugs are also an enormous market. Commodity trading in drugs was so important to the origin of modern investing that today the ceiling of the New York Stock Exchange is decorated with gold tobacco leaves. Right now the markets for illegal drugs are not only unregulated, they’re untaxed. They’re probably immensely inefficient as well. We can more or less guarantee that your new cocawine startup will have a hard time getting VC backing. 

“It’s very hard for a small person to go into the drug importing business because our interdiction efforts essentially make it enormously costly,” said conservative economist Milton Friedman in 1991. “So, the only people who can survive in that business are these large Medellin cartel kind of people who have enough money so they can have fleets of airplanes, so they can have sophisticated methods, and so on. In addition to which, by keeping goods out and by arresting, let’s say, local marijuana growers, the government keeps the price of these products high. What more could a monopolist want? He’s got a government who makes it very hard for all his competitors and who keeps the price of his products high. It’s absolutely heaven.”

We’ll also note that America’s legal system is infamously slow and backed up. It’s easy to imagine that this is because the legal system is choking itself, trying to swallow all these drug cases, leaving no room to deal with anything else. In 1965, annual marijuana arrest rates were about 18,000. By 1970 they had increased tenfold, to 180,000. By 2000 the number was about 730,000 annually. As a result, we no longer have a functioning legal system. 

So maybe things began to crawl in 1970, when we began to take the steam out of our engine of progress. The first big shock was the Controlled Substances Act, but it wasn’t the last. 

VI.

Above, we quoted economist Tyler Cowen on foundational technologies. “The break point in America is exactly 1973,” he says elsewhere, “and we don’t know why this is the case.” Well, we may not know for sure, but we have a pretty good guess: The Drug Enforcement Administration, or DEA, was founded on July 1, 1973.

Before the DEA, enforcement of drug laws was sort of jumbled. According to the DEA’s own history of the period, “Previous efforts had been fragmented by competing priorities, lack of communication, multiple authority, and limited resources.” Nixon called for “a more coordinated effort,” and a few years later the DEA was born. Now there was a central authority enforcing the new laws, so perhaps it is not surprising that 1973, rather than 1970, was the break point. 

What about other countries? The trends since 1970 are global, not limited to the US. It’s not like the DEA is running around the rest of the world enforcing our drug laws on other countries, right? Well, first of all, the DEA is running around the rest of the world enforcing our drug laws on other countries.

Perfectly normal US law enforcement agents in… Afghanistan

Second, the rest of the world has largely followed the United States in criminalizing recreational drug use. This is regulated by a number of United Nations treaties. As a result of these treaties, most of the drugs that are illegal in the US are also illegal in most members of the United Nations.

Cocaine is illegal in most countries, including Canada, New Zealand, China, India, Japan, and Thailand. In Saudi Arabia, you can be executed for it. In Singapore, importing or exporting many drugs carries a mandatory death sentence.  

Friendly Singapore warning card about the death penalty for drug traffickers!

LSD was made illegal by the 1971 UN Convention on Psychotropic Substances, and it remains illegal in all 184 states that are party to the convention. 

The Netherlands has a reputation for being very drug-friendly, but this is largely undeserved. While they do tolerate some drugs (a policy known as gedoogbeleid), most drugs technically remain illegal. “Soft drugs” like marijuana, hash, and “magic truffles” (NOT shrooms — apparently these are different) are tolerated. Note the exact wording from this government website, though: “Although the sale of soft drugs is a criminal offence, coffee shops selling small quantities of soft drugs will not be prosecuted.” 

“Hard drugs”, including cocaine, magic mushrooms, and LSD, are still very much illegal. Even for soft drugs like marijuana, you can’t possess more than a small amount for personal use. Producing any amount of any drug — including marijuana! — remains illegal. So even in this notorious drug haven, most drugs are still illegal and heavily restricted.

Any country that broke from this pact and really legalized drugs would see an explosion in their economy, and soon we expect, breakthroughs in their arts and sciences. But the UN wouldn’t like that, and you might wake up to find the DEA burning product in your backyard. So for now, with a small number of exceptions, these substances remain illegal. 

VII.

We hear a lot of talk these days about decriminalizing marijuana. This is the right thing to do, but it won’t be enough. Legalizing marijuana is not going to cut it.

Legalizing other drugs is more like it. When asked how he thought America would change if drugs were legalized, Milton Friedman said:

I see America with half the number of prisons, half the number of prisoners, ten thousand fewer homicides a year, inner cities in which there’s a chance for these poor people to live without being afraid for their lives, citizens who might be respectable who are now addicts not being subject to becoming criminals in order to get their drug, being able to get drugs for which they’re sure of the quality. …

I have estimated statistically that the prohibition of drugs produces, on the average, ten thousand homicides a year. It’s a moral problem that the government is going around killing ten thousand people. It’s a moral problem that the government is making into criminals people, who may be doing something you and I don’t approve of, but who are doing something that hurts nobody else. 

Friedman was a conservative’s conservative. He was an advisor to Reagan and to Thatcher. You can hardly get more impeccable conservative credentials than that! But when he looks at drug prohibition, he literally calls it socialism.

Everyone knows that hippies love drugs and want to legalize them. That much is not surprising. What is surprising is that conservatives are so firmly against drugs. It just doesn’t make any sense. Judge Juan Torruella of the First Circuit U.S. Court of Appeals was appointed by Ronald Reagan in 1984. In 1996, he said:

Prohibition’s enforcement has had a devastating impact on the rights of the individual citizen. The control costs are seriously threatening the preservation of values that are central to our form of government. The war on drugs has contributed to the distortion of the Fourth Amendment wholly inconsistent with its basic purposes. …

I detect considerable public apathy regarding the upholding of rights which have been cherished since this land became a constitutional Republic, when it comes to those accused of drug violations. Now I will grant you that people that sell drugs to children and the like are not very nice people, and I do not stand here or anywhere in defense of such heinous conduct. However, we must remember that we do not, and cannot, have one constitution for the good guys and another for the bad ones.

Paul Craig Roberts, an economist who served as Assistant Secretary of the Treasury for Economic Policy under Reagan, said in The Washington Times in 2001:

The conservatives’ war on drugs is an example of good intentions that have had unfortunate consequences. As often happens with noble causes, the end justifies the means, and the means of the drug war are inconsistent with the U.S. Constitution and our civil liberties.

Think about it. In the name of what other cause would conservatives support unconstitutional property confiscations, unconstitutional searches, and Orwellian Big Brother invasions of privacy? …

It is a personal tragedy for a person to ruin his life with alcohol, drugs, gambling or any other vice. But it is a public tragedy when government ruins the lives of millions of its citizens simply because it disapproves of a product they consume.

The “war on drugs” is, in truth, a war on the Constitution, civil liberties, privacy, property, freedom and common sense. It must be stopped.

Legalizing drugs is the right thing to do — from a moral point of view, from an economic point of view, from a scientific point of view. But legalizing drugs won’t be enough. We need new drugs. We need to taste drugs that no one has ever heard of, mysterious new combinations of drugs that no one’s ever tried before. Scientific and economic progress — great discoveries and major companies — comes on the heels of drug discovery. 

Is the Controlled Substances Act really responsible for the general decline since 1970? We’re not sure, but what is clear is that drugs are foundational technologies, like the motor, combustion engine, semiconductor, or the concept of an experiment. New drugs lead to scientific revolutions. Some of those drugs, like coffee, continue to fuel fields like mathematics and computer science, even some hundreds of years later. With apologies to Newton, “If I seem higher than other men, it is because I am standing on the shoulders of giants.”

Investigation: Were Polish Aristocrats in the 1890s really that Obese? by Budnik & Henneberg (2016)

I. 

A friend recently sent us a chapter by Alicja Budnik and Maciej Henneberg, The Appearance of a New Social Class of Wealthy Commoners in the 19th and the Early 20th Century Poland and Its Biological Consequences, which appeared in the 2016 volume Biological Implications of Human Mobility.

A better title would be, Were Polish Aristocrats in the 1890s really that Obese?, because the chapter makes a number of striking claims about rates of overweight and obesity in Poland around the turn of the century, especially among women, and especially especially among the upper classes.

Budnik & Henneberg draw on data from historical sources to estimate height and body mass for men and women in different classes. The data all come from people in Poland in the period 1887-1914, most of whom were from Warsaw. From height and body mass estimates they can estimate average BMI for each of these groups. (For a quick refresher on BMI, a value under 18.5 is underweight, over 25 is overweight, and over 30 is obese.) 
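For readers who want the refresher made concrete, BMI is just body mass in kilograms divided by height in meters squared. Here’s a minimal sketch of the calculation and the standard cutoffs (our own illustration, not code from the chapter):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def category(b: float) -> str:
    """Standard BMI cutoffs: under 18.5 underweight, 25+ overweight, 30+ obese."""
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"
    return "obese"

# A hypothetical 75 kg person who is 1.70 m tall:
print(category(bmi(75, 1.70)))  # just over 25, so "overweight"
```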

They found that BMIs were rather high; somewhat high for every class but quite high for the middle class and nobility. Peasants and working class people had average BMIs of about 23, while the middle class and nobles had average BMIs of just over 25.

This immediately suggests that more than half of the nobles and middle class were overweight or obese. The authors also estimate the standard deviation for each group, which they use to estimate the percentage of each group that is overweight and obese. The relevant figure for obesity is this: 

As you can see, the figure suggests that rates of obesity were rather high. Many groups had rates of obesity around 10%, while about 20% of middle- and upper-class women were obese. 

This is pretty striking. One in five Polish landladies and countesses were obese? Are you sure?

To begin with, it contradicts several other sources on what baseline human weight would be during this period. The first is a sample of Union Army veterans examined by the federal government between 1890 and 1900. The Civil War was several decades before, so these men were in their 40s, 50s, and 60s. This is in almost the exact same period, and this sample of veterans was Caucasian, just like the Polish sample, but the rate of obesity in this group was only about 3%.

Of course, the army veterans were all men, and not a random sample of the population. But we have data from hunter-gatherers of both genders that also suggests the baseline obesity rate should be very low. As just one example, the hunter-gatherers on Kitava live in what might be called a tropical paradise. They have more food than they could ever eat, including potatoes, yams, fruits, seafood, and coconuts, and don’t exercise much more than the average westerner. Their rate of obesity is 0%. It seems weird that Polish peasants, also eating lots of potatoes, and engaged in backbreaking labor, would be so much more obese than these hunter-gatherers.

On the other hand, if this is true, it would be huge for our understanding of the history of obesity, so we want to check it out. 

Because this seems so weird, we decided to do a few basic sanity checks. For clarity, we refer to the Polish data as reported in the chapter by Budnik & Henneberg as the Warsaw data, since most (though not all) of these data come from Warsaw.

II.

The first sanity check is comparing the obesity rates in the Warsaw data to the obesity rates in modern Poland. Obesity rates have been rising since the 1890s [citation needed], so people should be more obese now than they were back then.

The Warsaw data suggests that men at the time were somewhere between 0% and 12.9% obese (mean of categories = 7.3%) and women at the time were between 8.8% and 20.9% obese (mean of categories = 16.2%). In comparison, in data from Poland in 1975, 7% of men were obese and 13% of women were obese. This suggests that obesity rates were flat (or perhaps even fell) between 1900 and 1975, which seems counterintuitive, and kinda weird. 

In data from Poland in 2016, 24% of men were obese and 22% of women were obese. This also seems weird. It took until 2016 for the average woman in Poland to be as obese as a middle-class Polish woman from 1900? This seems like a contradiction, and since the more recent data is probably more accurate, it may mean that the Warsaw data is incorrect.

There’s another sanity check we can make. Paintings and photographs from the time period in question provide a record of how heavy people were at the time. If the Warsaw data is correct, there should be lots of photographs and paintings of obese Poles from this era. We checked around to see if we could find any, focusing especially on trying to get images of Poles from Warsaw.

We found a few large group photographs and paintings, and some pictures of individuals, and no way are 20% of them obese.

We begin with Sokrates Starynkiewicz, who was president of Warsaw from 1875 to 1892. He looks like a very trim gentleman, and if we look at this photograph of his funeral from 1902, we see that most of the people involved look rather trim as well:

In addition, a photograph of a crowd from 1895:

And here’s a Warsaw street in 1905: 

People in these photographs do not look very obese. But most of the people in these photographs are men, and the Warsaw data suggests that rates of obesity for women were more than twice as high. 

We decided to look for more photographs of women from the period, and found this list from the Krakow Post of 100 Remarkable Women from Polish History, many of whom seem to have been decorated soldiers (note to self: do not mess with Polish women). We looked through all of the entries for individuals who were adults during the period 1887-1914. There are photographs and/or portraits for many of them, but none of them appear to be obese. Several of them were painters, but none of the subjects of their paintings appear obese either. (Unrelatedly, one of them dated Charlie Chaplin and also married a Count and a Prince.)

If rates of obesity were really 20% for middle and upper class women, then there should be photographic evidence, and we can’t find any. What we have found is evidence that Polish women are as beautiful as they are dangerous, which is to say, extremely.

Anna Iwaszkiewicz with a parrot in 1914

III.

If we’re skeptical of the Warsaw data, we have to wonder if there’s something that could explain this discrepancy. We can think of three possibilities. 

The first is that we have a hard time imagining that whoever collected this data got all these 19th-century Poles to agree to be weighed totally naked. If they were wearing all of their clothes, or any of their clothes, that could explain the whole thing. (It might also explain the large gender and class effects.) 

Clothing weighed a lot back then. Just as one example, a lady’s dolman could weigh anywhere between 6 and 12 pounds, and a skirt could weigh another 12 pounds by itself. We found another source that suggested a lady’s entire outfit in the 1880s (though not Poland specifically) would weigh about 25 lbs.

As far as we can tell, there’s no mention of clothes, clothing, garments, shoes, etc. in the chapter, so it’s quite possible they didn’t account for clothing at all. All the original documents seem to be in Polish and we don’t speak Polish, so it’s possible the original authors don’t mention it either. (If you speak Polish and are interested in helping unravel this, let us know!)

Also, how did you even weigh someone in 1890s Poland? Did they carry around a bathroom scale? We found one source that claims the first “bathroom” scale was introduced in 1910, but they must have been using something in 1890. 

Sir Francis Galton, who may have come up with the idea of weighing human beings, made some human body weight measurements in 1884 at London’s International Health Exhibition. He invited visitors to fill out a form, walk through his gallery, and have their measurements taken along a number of dimensions, including colour-sense, depth perception, sense of touch, breathing capacity, “swiftness of blow with fist”, strength of their hands, height, arm span, and weight. (Galton really wanted to measure the size of people’s heads as well, but wasn’t able to, because it would have required ladies to remove their bonnets.) In the end, they were given a souvenir including their measurements. To take people’s weights, Galton describes using “a simple commercial balance”.

Some of the “anthropometric instruments” Galton used.

Galton also specifically says, “Overcoats should be taken off, the weight required being that of ordinary indoor clothing.” This indicates he was weighing people in their everyday clothes (minus only overcoats), which suggests that the Polish data may also include clothing weight. “Stripping,” he elaborates, “was of course inadmissible.”

Card presented to each person examined. Note “WEIGHT in ordinary-in-door clothing in lbs.” in the lower righthand corner.

Also of interest may be Galton’s 1884 paper, The Weights of British Noblemen During the Last Three Generations, which we just discovered. “Messrs. Berry are the heads of an old-established firm of wine and coffee merchants,” he writes, “who keep two huge beam scales in their shop, one for their goods, and the other for the use and amusement of their customers. Upwards of 20,000 persons have been weighed in them since the middle of last century down to the present day, and the results are recorded in well-indexed ledgers. Some of those who had town houses have been weighed year after year during the Parliamentary season for the whole period of their adult lives.”

Naturally these British noblemen were not being weighed in a wine and coffee shop totally naked, and Galton confirms that the measurements should be, “accepted as weighings in ‘ordinary indoor clothing’.” This seems like further evidence that the Warsaw data likely included the weight of individuals’ clothes. 

Another explanation has to do with measurements and conversions. Poland didn’t switch to the metric system until after these measurements were made (various sources say 1918, 1919, 1925, etc.), so some sort of conversion from outdated units has to be involved. This chapter does recognize that, and mentions that body mass was “often measured in Russian tsar pounds (1 kg = 2.442 pounds).” 

We have a few concerns. First, if it was “often” measured in these units, what was it measured in the rest of the time? 

Second, what is a “Russian tsar pound”? We can’t find any other references for this term, or for “tsar pound”, but we think it refers to the Russian funt (фунт). We’ve confirmed that the conversion rate for the Russian funt matches the rate given in the chapter (409.5 g, which comes out to a rate of 2.442 in the opposite direction), which indicates this is probably the unit that they meant. 

But we’ve also found sources that say the funt used in Warsaw had a different weight, equivalent to 405.2 g. Another source gives the Polish funt as 405.5 g. In any case, the conversion rate they used may be wrong, and that could also account for some of the discrepancy.
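To gauge how much the choice of funt matters, here’s a quick sketch comparing the two conversion rates (the recorded weight and height below are hypothetical, chosen by us for illustration):

```python
FUNT_RUSSIAN_G = 409.5  # grams per funt, the rate the chapter reports
FUNT_WARSAW_G = 405.2   # grams per funt, per one source for the Warsaw funt

def to_bmi(recorded_funt: float, grams_per_funt: float, height_m: float) -> float:
    """Convert a weight recorded in funt to kilograms, then to BMI."""
    kg = recorded_funt * grams_per_funt / 1000
    return kg / height_m ** 2

recorded_funt = 180  # a hypothetical recorded body weight
height_m = 1.65      # a hypothetical height

bmi_russian = to_bmi(recorded_funt, FUNT_RUSSIAN_G, height_m)
bmi_warsaw = to_bmi(recorded_funt, FUNT_WARSAW_G, height_m)
print(f"Russian funt: BMI {bmi_russian:.2f}")
print(f"Warsaw funt:  BMI {bmi_warsaw:.2f}")
```

For this example the gap is only about 0.3 BMI points, so a wrong conversion rate could nudge the averages but probably can’t, on its own, turn a 3% obesity rate into 20%.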

The height measurements might be further evidence of possible conversion issues. The authors remark on being surprised at how tall everyone was — “especially striking is the tallness of noble males” — and this could be the result of another conversion error. Or it could be another side effect of clothing, if they were measured with their shoes on, since men’s shoes at the time tended to have a small heel. (Galton measured height in shoes, then the height of the heel, and subtracted the one from the other, but we don’t know if the Polish anthropometers thought to do this.)

A third possibility is that the authors estimated the standard deviation of BMI incorrectly. To figure out how many people were obese, they needed not only the mean BMI of the groups, they needed an estimate of how much variation there was. They describe their procedure for this estimation very briefly, saying “standard deviations were often calculated from grouped data distributions.” (There’s that vague “often” again.) 

What is this technique? We don’t know. To support this they cite Jasicki et al. (1962), which is the book Zarys antropologii (“Outline of Anthropology”). While we see evidence this book exists, we can’t find the original document, and if we could, we wouldn’t be able to read it since we don’t speak Polish. As a result, we’re concerned they may have overestimated how much variation there was in body weights at the time.
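An overestimated standard deviation matters a lot here, because the obesity rate is driven by the upper tail of the distribution. A minimal sketch of the tail calculation, assuming BMI is normally distributed (the mean is taken from the chapter; the candidate SDs are our own illustrative values):

```python
from statistics import NormalDist

def obese_fraction(mean_bmi: float, sd: float, cutoff: float = 30.0) -> float:
    """Fraction of the population above the obesity cutoff,
    assuming BMI is normally distributed."""
    return 1 - NormalDist(mean_bmi, sd).cdf(cutoff)

# Mean BMI just over 25, as reported for middle-class women:
for sd in (3.0, 4.0, 5.0):
    print(f"SD {sd}: {obese_fraction(25.3, sd):.1%} obese")
```

Going from an SD of 3 to an SD of 5 roughly triples the implied obesity rate (from around 6% to around 17%), so even a modest overestimate of the spread could account for much of the reported 20%.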

These three possibilities seem sufficient to explain the apparently high rates of obesity in the Warsaw data. We think the Warsaw data is probably wrong, and our best guess for obesity rates in the 1890s is still in the range of 3%, rather than 10-20%.