"One of the most important cognitive tradeoffs we make is how to process information, and perhaps more specifically, the deluge of information that bombards us every day. A study out of UCSD in 2009 estimated that Americans read or hear more than 100,000 words a day — an increase of nearly 350% over the prior three decades (and that was before Slack and Substack!) It would seem logical that more information is always better for decision-making, both for individuals and for societies. Yet, that’s precisely the tradeoff: humans and civilizations must balance greater and better information with the limits of their rationalities." - Danny Crichton
Episode Produced by Christopher Gates
Transcript
Chris Gates:
Yeah. Let's pick up with you walking us through how you got to marginal stupidity. Ready? Three, two, one.
Danny Crichton:
So this week, I was very interested in... Collapse is always one of my topics. I love collapse. I love the idea of societies all blowing up because I'm looking around our own society and being like, "How close are we to the precipice?" And then I also love politics and predictions and forecasting. And so I read two books sort of haphazardly side by side. One was Joseph A. Tainter's The Collapse of Complex Societies, and the other was Philip E. Tetlock's Expert Political Judgment.
Chris Gates:
Those are two very Danny books.
Danny Crichton:
They're both academic monographs, so the kind of books that no one should theoretically read except for people in these fields who care about all of the fighting and infighting and academic stuff. But that's what happens when you're a PhD dropout: every once in a while you forget why you don't read the research in the first place. But nonetheless, I read these two monographs this week. And what happened was they sort of converged on the same subject. So Tetlock's main thesis is really that no amount of additional knowledge, additional degrees, or specialization actually improves expert decision making.
And instead, the people who do the best at prognosticating the future tend to just read widely. Many of his best predictors in his laboratory experiments are just people who read the newspaper almost cover to cover. They just read a wide variety of sources, they're watching the arts, they're watching television shows. They're pulling in a bunch of information. And the key is that, in Isaiah Berlin's concept of the hedgehog and the fox, they are foxes. They're people who draw flexibly from a lot of different domains of knowledge to apply to new scenarios.
Chris Gates:
So in comparison to being hyper-specialized or hyper-focused in one area.
Danny Crichton:
Yeah. So you would think, let's say you wanted to predict the future of China. You would think that someone who has a PhD in Chinese modern history or Chinese studies, who reads [inaudible 00:01:59] and the major publications from China every day, would do far better than a general purpose reader reading the Wall Street Journal. And actually, it's not the case. And the reason, not that he tries to exactly figure this out, is there's sort of a network knowledge problem. Oil prices affect China. And if you're sort of just focused on what's happening in Beijing, you're not seeing what's happening in Saudi Arabia, what's happening in France. You miss the Russia-Ukraine dynamic. You've missed that there's stuff with resources in Brazil. Basically, there are all these sources of change that are going to hit China or any topic.
And so if you're only focused on one small smidgen of the map, you're not seeing the whole picture. And it's far better to have the whole picture than one tiny part of it which you know really well. And so his point, and this will relate to Tainter's book, is there's a huge diminishing return to more information, that at some point you know enough. And the analogy I gave is the first time you read a textbook in a subject, think biology, you learn a lot. You knew nothing about biology, now you read a textbook, now you know a lot. But reading 20 textbooks on biology is not going to make you 20 times smarter than just reading one. And so what he gets at is it's probably better to read 20 textbooks in 20 fields than to read 20 textbooks in one field.
Chris Gates:
I mean, it makes me think of generalist investing. I mean, I've been spending more time now than I ever have before thinking about investment strategies. And this seems to be a case potentially for not being too focused in one particular area.
Danny Crichton:
Well, right. So I think one of the lessons there is it's easy to dive deep into a space and look at 500 companies and be the world's expert on biotech or whatever the case may be. But then you look at what happened in the biotech markets the last couple years. Well, it was a global pandemic. There's a huge public health crisis. There are all these political dynamics. FDA regulatory approvals and what's happening in Congress drastically affect the market price for any biotech company. So you can know all the science, you can actually bet very correctly in some ways, but you could be horrifically wrong because you're not integrating all these other domains of knowledge that actually have large effects on the industry.
You're also seeing this in Flexport, which is a company that's not a Lux company, but a company that's gotten a lot of profile and attention these days because supply chains are out of whack. You didn't have to be an expert on supply chains to, A, realize that Flexport might be important, or, B, that supply chains need to be modernized and digitized in order to improve global efficiency. So there is an incredible amount of power around generalism, and specifically, being flexible about how you use that knowledge. Or to use Tetlock's phrase, improvising dissonant combinations of ideas, which I feel like is an amazing tagline for life and venture and everything else.
Chris Gates:
And what was the second book you read and how does it tie in?
Danny Crichton:
Yeah. So okay, so that was Tetlock, Expert Political Judgment. Good book, long, very, very detailed. He's done literally hundreds of experiments over two, three decades. Because it's a predictions book, he literally had to do this work 15 years in advance and then wait to see if the predictions came true. It was like, ask people in 1992 what they think the GDP of China is going to be, and then fast-forward to 2005: were they right? Because you have to wait, because it's a prediction. So the work is sort of a magnum opus. And then we get to Joseph Tainter's book, so that's from 1988, The Collapse of Complex Societies. And he's interested in this question of why dozens of societies have just collapsed. And one of the arguments is, well, barbarians show up. This is the classic Roman example. Barbarians show up at the gates and then the society collapses.
And he's like, "Well, that's ridiculous because Rome is a complex, well-adapted, sophisticated economy that's highly productive. Shouldn't that be precisely the civilization and society most likely to be able to fight off a barbarian invasion? Shouldn't they be able to handle an earthquake? Shouldn't they be able to handle Vesuvius or whatever the case may be?" And so what he theorizes, and it is a theory, and it's a theory that can be attacked, is that there is a diminishing marginal return on investing in complexity. So when you're developing a civilization, you start as hunter-gatherers, you're going around, everyone's finding berries and hunting whatever large game is available in the region. And then you start to specialize. So you start to domesticate, you build up agriculture. Now you have a surplus of food, so other people can do other things. They can go and make pottery. They can go and get water in a more sophisticated way.
You can build public works, you can build an aqueduct. If you build an aqueduct, now instead of having 40% of people getting water half the time, 10% of the population is maintaining the aqueduct and everyone has water. And so as our economy gets more and more complicated, we have more and more roles. In Tainter's world of 1988, there's an estimate that we have a hundred thousand different types of jobs in the United States. That actually blew my mind, but if you start to think about it, there are a lot of unique jobs. But there's a point at which there's a diminishing marginal return on that complexity, where the system becomes so complicated that you're not actually going to get any more value from creating more roles, more jobs, more infrastructure, more management. We've sort of run out of things to specialize in to improve the productivity of society.
Chris Gates:
So what do you do then? Do you just stop developing? Help me see through the other side here.
Danny Crichton:
Well, I think the point of the diminishing marginal returns is obviously there's a lot of value to complexity. I mean, I think in modern discourse we're really negative on it: oh, it's complicated, it's complex. But it's a good thing. People can specialize. We can get productivity. It creates efficiency. But it starts to diminish. We can't necessarily continually build up more land. Some farms can't produce more food than they already do. We're as specialized, as knowledgeable, as good as we can be with that piece of land. And so when a disaster strikes, a climate shift, a flood, a pandemic, a barbarian invasion, probably less common these days than it was 2,000 years ago, you're stuck because you have no slack in the system. Everything's running at a hundred percent.
So think about the supply chain crisis right now. We're in a very complex world and there's no way to deliver these goods because we have hit the limits of what we can coordinate given the resources we have available. And so the combination of these two books, and the reason they kind of merged together, is that they're both sort of arguing, one from psychology, one from anthropology, that there are these diminishing returns to more information, more knowledge, more specialization. And I'm sort of interpreting here, and that's why I'm calling it marginal stupidity, there is a point where it becomes negative, that in fact we get so smart that we can no longer predict the future. And societies get so complicated that they can no longer build basic public works.
Chris Gates:
Well, we end where we end almost every Danny post, which is we are fucked. We're fucked.
Danny Crichton:
One of the things I loved about this was, in Tetlock's world, he's talking about these foxes who are very smart and are able to take in all this information. But they can get very befuddled, because if you ask them for a prediction and then you say, "Well, what about this? What about this other factor you're not considering?" They'll be like, "Oh, okay. You're right. That's a fact I wasn't considering. I need to update my prediction and now I believe this." And then if you do it again, "Well, what about this other fact that doesn't fit into that prediction?" And they're like, "Ah, I need to correct myself again. You're right. That's another fact." So, actually, foxes can get extremely befuddled if you keep giving them more information. In fact, the more information you give them, the worse their predictions become.
Chris Gates:
And it also makes me think about gut feelings, and how right a gut feeling can sometimes be without actually understanding all the reasons why you feel that way.
Danny Crichton:
Well, yes. Absolutely. And look, a gut feeling is not just from the gut. It's more that if you take the information you have available, assess it, put it together in your head, and sort of come up with a prediction, that's probably the most accurate thing you're ever going to do. If you try to research it more, you will confuse yourself because you're no longer balancing the information. You have all this new information coming in that you're trying to fit. And oftentimes, the best answer is not to use it at all. And I think that, to relate this back to venture, one of the biggest challenges when you're making investments is this sort of information abundance, right? We're constantly getting new information. You're looking at a deal in, let's say cybersecurity, and it's super interesting. And then there's a report that day in the Wall Street Journal that says, "Oh my God, cybersecurity is going to go down the next two years."
And you're like, "I'm never going to do that deal now," because it's top of mind, I've seen this new piece of information that's right in front of me. How could I do a cybersecurity deal when literally I'm reading a headline in a prominent paper that this is a bad market? But your first hunch, that this is a great founder and a great company, is actually probably more likely to be accurate, because you're balancing your entire wisdom developed over decades of thinking against something immediately in front of you that's coming in from a news source or whatever the case may be. And so I think there's always a balance. The world is always changing. You need to update your information. We always have to learn and keep fresh, so to speak. But you always have to balance that with the fact that you do have this well of wisdom that you have to keep drawing from, and not forget all of your previous knowledge to fit the new stuff into the box.
Chris Gates:
And read a lot of books, diversify the information that is being put into your brain.
Danny Crichton:
If you want to understand science, read science, but also read fiction, watch movies, watch documentaries, TV shows, because it's our ability to process hugely divergent pieces of information together, that synthesis, that makes good decision making. And I think we just don't see that a lot in society. Most of our degree programs are about getting you specialized, building skills. They're not about developing the ability to do synthesis. And so I think that's one of the biggest challenges we've seen in venture, and one of the things we're constantly striving to do here at the firm, particularly given a lot of these intersecting zones, bio and machine learning, or space and manufacturing, the list goes on and on. How do you have an opinion here, A, when it's never been done? And B, when there's so much evidence that you shouldn't do it? And yet, there are these crazy outcomes that prove you can do it, that all the information we knew was wrong. And to me, that's actually the magic and the alchemy of this whole business.