Riskgaming

Can we be optimistic about America’s future?

Description

If we had to rebuild American politics to be more positive, could we do it? And what would a positive or even optimistic politics look like? What would be its program, and how could we all be galvanized to join in a world and at a time when it seems as though every day brings another dampener to human enthusiasm? Those are just some of the questions that ⁠James Pethokoukis⁠ approaches in his recent book, ⁠The Conservative Futurist⁠.

James emphasizes that optimism and pessimism don’t exist on the traditional left-right axis of political analysis (named for the seating arrangement of politicians during the French Revolution). Instead, he divides the world into an “up wing” — people who believe in expanding the bounds of human ingenuity — and those who belong to a “down wing,” which might be simply summarized as degrowthers and others who see limits in all aspects of science, technology and the human condition.

James and I talk about his book, and then I quiz him on just how realistic his futurism is. Is his vision actually possible, or is it the sort of slapstick fantastical science fiction that is great as entertainment but has long since died out as a way to govern? He’s got better evidence marshaled than I expected, and you can be the judge if optimism can guide your thoughts.

Produced by ⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠Christopher Gates⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠

Music by ⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠George Ko⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠

Transcript

This is a human-generated transcript; however, it has not been verified for accuracy.

Danny Crichton:
Let's just dive into the book. You wrote this work just in the last couple of months, The Conservative Futurist. Give a little bit of a precis, a little bit of a summary, of why you wrote the book, what was going on in the world that triggered it, and what the thesis is that you wrote about.

James Pethokoukis:
I think just about everybody who had a book come out in 2023 and into this year, we had extra time on our hands, nobody was commuting, and with the pandemic we decided to write that book we've been wanting to write. That was certainly the case for me.
But it worked out that the pandemic also raised some questions, or highlighted some questions, that I had been thinking about for some time, which is, "Why aren't things better? Why have we had this growth and progress and tech slowdown in a couple of different stages over the past half century?" And then those issues were really highlighted by the pandemic. I can say that I thought the book was going to be a gloomy book when I first started writing it, but as I was writing it, things really started popping. AI started popping, and SpaceX took Americans into orbit on one of our own rocket ships for the first time since the end of the space shuttle. So I really began seeing lots of tangible reasons for optimism, which was reflected in the book, because ultimately I didn't want the book to just be a "what went wrong." I wanted it to be a "what went wrong and how can we fix it" kind of book.

Danny Crichton:
So let's talk about the thesis. So you have this concept of up wing versus down wing, as well as a conservative futurist. Maybe give a little bit of a summary of that concept, because it sort of suffuses the whole narrative.

James Pethokoukis:
Yeah. Yeah, the book's called The Conservative Futurist, because I want to create a better future, and I'm a right-of-center person. I work at what is widely thought of as a conservative think tank. So I put the truth right there on the cover: that's where I'm coming from.
That said, a lot of what I write about is that spectrum and those labels, are they really saying the most important thing we can about people's views and their policy ideas? And I don't think so. I think the important way to think about where we're coming from isn't left or right, but up or down. Up, meaning there are problems, but we can fix them if we apply human intelligence and technology, and there may be some differences on the exact formula here, but broadly that is a direction we need to go. We want abundance. If you think that's true, then you're probably an up winger. You may also be a Democrat or Republican. That sort of pro-technosolutionist belief makes an up winger, and you can find them on both sides of the aisle.
But then there's the down wingers, which are the opposite: better safe than sorry. If we do come up with great new inventions, the only people that benefit are the people in Silicon Valley. And by the way, those new inventions will probably just really disrupt our society, and robots are going to kill us, take all our jobs, maybe the reverse order, I don't know. And they're worried about the environment: "Gosh, maybe AI will work, but we can't give it enough clean energy, so we shouldn't have AI, just like we shouldn't have nuclear."
And again, on both sides you can find those people, what I call the down-wingers. And if I know your view about progress and problem-solving, to me that's really what I need to know. That's what tells me where you're coming from. Maybe not just that, but that more than anything really is part of the thesis.

Danny Crichton:
We see this not just in science; we see this in real estate with NIMBY, not in my backyard. It could be conservative or Democratic, it could be historical preservation, or maintaining the stasis of a community. But it crosses left and right, and what I liked about this framework is it refocuses away from this balance between two parties where, in many cases, the policies are the same, particularly at the local level.

James Pethokoukis:
Right. Something like wanting to have a local zoning and housing environment where you can build housing for people who want to live there. Boy, I realize in our world today, everything tries to get shoved into the culture war, but that should not break down along left and right, because what is it about? It's about opportunity and abundance and, really, freedom, so people are free to live where they want to live, because there's housing they can afford. You should not be able to guess someone's position on that based on whether there's an R or a D next to their name, and I think there are quite a few issues like that.
To me, nuclear power shouldn't fall along that traditional spectrum. As shorthand goes, it's a tough one. I talk about polling and asking people their views of the future in the book, but to me, if you ask someone, "What's your view of nuclear power," that will tell you a lot about how they view risk, how they think about technology as a solution, and what they think is the best way to have a better environment. So the fact that the polling seems to be showing Americans more willing to accept nuclear power, and I think we're seeing that in the politics as well, to me, that's a great up wing sign.

Danny Crichton:
Well, I think what's interesting is you start the book off focused on Tomorrowland at Disneyland. Tomorrowland was Walt Disney's focus on the future, and the argument, I'm just going to make it for you, was that Tomorrowland was a look into the future for American society. But American society was progressing so fast that it became obvious to Walt Disney that he was going to have to update this ride and this entire theme park every five to 10 years, because things were actually changing so rapidly. And then there's this freeze in the 1970s.
What I think is interesting is, it's in Disneyland, and all the parks as well, but I think of the story of California, a story of being on the frontier, ambitious entrepreneurs building the future, Silicon Valley, and again, it goes up and down. And at this point in time, it feels like we are in a period of stasis. That California, this massive dynamic economy on the West Coast, today, we hear more and more about regulations.
Just this last week there's a very controversial bill up in the state Senate to put more regulations around AI, which would essentially shut down much of the California AI industry. And so there's this local-state dynamic in one of the most important, dynamic economies in the United States, where we used to progress towards the future, and all of a sudden, that is the state holding things back. It has the highest housing prices, it's among the most NIMBY in the country, and to me, that dynamic has spread all across the United States. I'm based here in New York City. It's almost the exact same thing.

James Pethokoukis:
Given the importance of California since World War II: it's been a cultural leader, it's where all the big cultural trends start; it's been a political leader, it's where the entire tax-cut Reagan revolution really began, even though Reagan wasn't governor any more. That's where that began. And then of course the entire information technology revolution.
California has been on the frontier of what America has been in the 20th century, especially the second half, and into the early 21st century. And with all due apologies to New Yorkers and Texans, that's been California's place. So if you look at California and it seems to be a different kind of trendsetter, a trendsetter in ill-advised regulations, in losing control of its cities and not appreciating this amazing happenstance that the heart of the tech sector is in California, and trust me, all these other countries are trying to build their own Silicon Valley, not so easy, and California has one, that's a bad sign. It doesn't mean America cannot succeed without a successful California, but not only is it harder; to me, it's also indicative of something that's gone deeply wrong with that state, and I think you can say the same of America over the past half century, which is ... listen, I'd rather live now than in 1974, but it could be so much better.

Danny Crichton:
So when you brought up the question of nuclear as a litmus test, you can test on people's perception of risk, and this gets at ... obviously there's a podcast called Risk Gaming. We build simulations of war games, for individual folks to understand complicated science technology, policy challenges, and risk comes out of a lot of these, right? That there's positive risks, that if we have a new technology, it could make things much more productive, much more efficient, save a lot of time, and ultimately if you want to make citizens' lives better, productivity is the magic number to make that happen.
And then there's negative risks. Obviously a nuclear power plant can provide abundant clean power, which would be really amazing in a world of climate change. It could also potentially melt down, though you emphasize that in the two largest cases, relative to the perception of what went wrong, just how bad they actually were was quite minimal.
But on the general question of risk, to my perception, it seems that the United States and a lot of our municipalities have backed away from taking risks. And we started this conversation with COVID, and there's no greater example than public schools, where even after we had data and knew that people under the age of 18 were relatively immune to COVID ... relatively ... we kept schools closed, in some cases for months, setting kids back almost a year or two in grade school. They slowed down their learning and regressed.
What happened with risk tolerance in the United States over the last 50 years?

James Pethokoukis:
There's a lot of cognitive biases that people have which make us risk averse, risk intolerant, make it difficult for us to update our views. We anchor on the past, what we already knew, and even if the new information is empirical, we have the numbers, very difficult to change that.
So we're always fighting that, and most of us will always probably be a little bit, maybe, too risk intolerant. But I think one thing that happened was that, broadly, America got richer, and when countries get richer, they stop taking as many risks. They don't think they need to. They begin to focus a lot more on the downsides, such as pollution. "Oh boy, all this economic growth is great, but things are better now: kids aren't dying in childbirth, I have heating. Now I'm going to think about the air, and I'm going to think about lead in fuel."
So you start focusing in on the downsides. It happened in the United States, it happened in other countries. That was always going to happen. We were always, for instance, going to have, I think, an environmental movement, which I write about. But did it have to become, let's say in the case of the environmental movement, such an anti-growth movement? "We need to push away all risk. Tomorrow is going to be bad, so we cannot go to tomorrow, we need to retrench."
Did we have to have that kind of movement? I don't think so. You can imagine an environmental movement which views technology as a solution rather than a problem. But of course we had this natural impulse because we got richer, combined with lots of weird one-off things, whether it's books like Silent Spring, or some natural disasters in the 1960s ... manmade disasters, oil spills, the Vietnam War. All these people who were saying, "Listen, we've gone off track. Big science and big government, and they have nuclear weapons and they're ruining the environment."
And then all these predictions got wrapped up into the Vietnam War: "See, we're doing it wrong. We're dropping napalm, and look at what the corporations are doing." All of that happening at once really set us on the wrong path.
Then Hollywood picked it up, and it's become a staple of our culture, certainly of our science fiction. And it became, literally, a real doom loop. When I wrote the book, nuclear was a great example of this. Then we saw it with AI: when they rolled out ChatGPT, there was about a week where you got fun stories about ChatGPT. You can make crazy images, you can have a rap song written like it was being rapped by Winston Churchill or something. And then quickly we got to, "Oh, by the way, it's going to take all our jobs and then it's going to kill us."
And to me that's just indicative of the sort of, I think, I guess down wing pessimism and risk aversion that we've been soaking in for a half century.

Danny Crichton:
I think it's interesting you talk about this doom loop, because to me the growth loop is very, very powerful, right? At a fundamental level, the reason why we grew the suburbs post-war is ... [inaudible 00:12:21] had millions of people coming back from the fronts in both the European and Pacific theaters, but there was also real money to be made. Real estate developers made money, investors made money. You, as a middle-class family moving into these neighborhoods, made money, and it was very hard to beat that growth loop, because there were so many people who had their hands in that pie and were making cash.
And what's been interesting to me is that NIMBYism and all these regulations, all these blocks, the growth loop doesn't seem to be able to overcome them. Even on AI regulation, we're seeing this in Europe, which has already passed its AI Act. We see this in the UK, which convened the Bletchley Declaration last year, and California is debating a law right now in its Senate that might very much curtail the entire industry. This is an industry of trillion-dollar companies, and so I find it ironic that at the same time we're talking about the peak of corporate power, even the most powerful Magnificent Seven companies in the United States, in California quite frankly ... most of them are in California ... are not capable of pulling us back from this Terminator mentality about future technologies.
And to me, if you come at the end of a 40-year cycle of the internet and computing, and this very conversation could not have happened even a decade ago, and now we're able to have a conversation. We didn't have to fly to each other. We didn't have to bring mics and all the equipment to make this work. To me, it's so empowering and how do we not take that optimistic message of the technology we just built, when we have something new that's right in front of us?

James Pethokoukis:
Yeah, I think it's remarkable how many people will say, "The internet, was it really a good idea? Has it helped us?" To me, that you would rethink the entire information technology revolution, or think that maybe it deserves a rethink, again, shows how deep this goes.
Let's remember that statistically, looking at the economic statistics, you can date what I call the downshift in growth, to the early '70s ... almost, if you want to do that, almost to the exact quarter. But when that happened, and US productivity growth slowed down and that's a good shorthand way of looking at innovation. People were still so confident that this miracle of growth was so powerful, we couldn't screw it up, that they didn't take it seriously. They looked for all these other temporary reasons, why, "I'm sure it's just like the oil shocks, and once the oil shocks are over, it's going to be back to the go-go '60s."
And it took a number of years before they realized, "Well, it's not going to happen. So what else has gone wrong? Does it say something about our character? Is it all the regulations?" Yes, big part of it, and that assumption that the growth machine could not be broken. Unfortunately it could, and it was, and even to this day, you see with new technologies such as AI, the belief that it will do what it's going to do, and all these companies will still get big and powerful, that American companies will always be the leaders regardless of what else happens.
I work in a think tank, so I hear it all the time: regardless of what else happens in public policy, it doesn't matter how we regulate, it doesn't matter what we spend on research. The disconnect there, which was certainly a disconnect you saw half a century ago, remains, despite the lesson that we can break it, we can disrupt it, in the worst possible way.

Danny Crichton:
And let me ask you, one of the questions I've had ... so obviously all these movements come out in the early 1970s. We had Paul Ehrlich and The Population Bomb, Silent Spring by Rachel Carson, the '60s environmental movement through NEPA and a couple other pieces of legislation in the early or later Nixon years. All of this comes to a head around 1972.
One of the questions I've always had though is, look, one of the challenges with risk is there are risks. A nuclear power plant can melt down. If you build a factory, you can leak chemicals and leach them into the soil, you can leach them into the water. And so we built these rules that say, "Well, we want to make sure that doesn't happen. We want to make sure that doesn't happen," and over time we've gotten really smart, we just didn't know.
In some cases this technology was so new. You go from Atoms for Peace in the '50s to "Wow, there's problems with this technology" 10 years later. To me, there's never been a correction. I always ask this question in the context of deregulation and libertarianism; I'm also at a right-of-center think tank, the Manhattan Institute, and I'm always like, "Look, but we have so much more information now. What do you give up?" Take clean air: yes, we don't want lead poisoning. We now know that it was horrific for populations, for cognitive development, particularly for kids. Clearly we don't want to go back to a world where we're dumping lead into the air, but where does that cut line go? Because when I think about risk, at some point we're saying, "Look, there's a trade-off, there's a cost-benefit," and yes, there are risks that we probably should just take, when we know that they may be bad, but are sufficiently low that it is probably better to build whatever new thing versus minimize the downside.
But I'm curious when you think about that balance, because we do have more information than we did 50 years ago.

James Pethokoukis:
And I wonder, just taking nuclear power, what happens, even after the '70s, what happens if oil stays really expensive, because the oil ... because we had this collapse in the oil market, which among other things, helped undermine the Soviet Union. If oil had stayed super expensive, would there have been a rethink? But certainly, the early days of the Reagan administration wanted to do a rethink on nuclear.
But I think part of the problem is that if you're going to say, "Okay, we need to accept these risks, because there's good stuff that we're missing out on," so people have to believe that, and you have to be able to paint a compelling picture of the good stuff.
It's the same problem that people face when they make the case for, "We need faster economic growth and tech progress." Great, but the downsides are obvious to that. The downsides are companies rise and fall. There may be communities, even whole cities, that don't do well due to that dynamism. So it has to be worth it. There has to be a lot of growth on the other side to offset that. People have to see tangible benefits in their life, people need to see miracle cures, curing kinds of cancer and all that.
So at the same time all this cultural stuff is happening and all this pessimism, growth slowed down. We went through this terrible [inaudible 00:18:25] period after the global financial crisis where, again, very slow growth, rising inequality, and not only was that bad for, I think, people's view of capitalism, but then they started thinking, "Is it worth it? Is the disruption," people have to believe the disruption is worth it, and when the only image ... so when they don't see it in their own lives, and the image of the future is dystopian science fiction everywhere, it's not surprising people are terrible risk-takers.
If you look at the '90s, and we can talk about the '60s, but if you look at the '90s, the '90s were a period where, for instance, there was rising inequality, the great 21st-century problem, but there were no riots. There was no Occupy Wall Street in the 1990s, because growth was so strong, people saw it in their own lives across the income spectrum. And I have to believe that, were we to get that period again, people would again be focusing on the benefits of growth rather than the downsides, and I think there are a couple other tailwinds as well.

Danny Crichton:
It reminds me, we cover China a lot in this show as well, given my background, and it's interesting, because once you get into the Deng Xiaoping era, you get to opening up China, the argument was some people have to get ahead first, and then the rest will follow behind them. And it really was true for 20 to 30 years, up until the current era under Xi Jinping, where yes, some people got ridiculously rich really fast. They became billionaires in a communist system, and it was very shocking to people, but unlike Russia, everyone said, "Yes, they're billionaires, but I will be a billionaire in 10 to 15 years as well. This is a dynamic economy. There's opportunities. I can feel it."
I wasn't old enough for the dot-com boom; it was early grade school for me. So I only have little memories of the Dow Jones hitting numbers, and people getting very excited about it. But I think of it as the same period where you could literally get on a plane and go to the Bay Area. The 101 and 280 freeways were completely jammed, to the point that the San Jose airport had a flight to the San Francisco airport, because traffic was so bad at the peak of the bubble that it was easier to fly the 20 miles than it was to drive them. Also, pre-9/11, so security was a little bit faster at the airports.
But there was that moment where, even if you felt, "I don't have access in Detroit," or, "I was born in Youngstown, Ohio, I'm not going to have access here," you had the ability to move, and there were opportunities lying all over the economy, and it didn't feel like they were exclusive to one in 10,000 folks. If I just step up, it's available to me.
And I do think that that's lost in the current economy. We see that in some of the migration numbers: people don't switch cities as often as they used to 30, 40 years ago. We see that in terms of jobs: people don't switch industries as often as they used to. We see that in education and upskilling, where we're actually seeing a retrograde: people are not learning new skills, they're not adapting to the economy, they're hunkering down. And to me, hunkering down is not a way you create growth. It's just small-c conservative: hold onto what you have, hold onto the house as long as you can, try to get to retirement and die off before the whole thing falls apart. And I just don't know how you get out of that narrative, because once you believe it, it's self-reinforcing. You see every negative story, and we can blame the media, but I honestly think that's what readers read. I was an editor for five years. I can tell you, that's what people read.

James Pethokoukis:
The mood you just described, interestingly, that was the mood right before that 1990s boom. We had this really nasty recession that people never talk about, back in 1990 and the early '90s, so there was slow growth, and people were worried about the deficits. One of the things I write about in the book is a great article during the 1996 election in the New York Times, where the New York Times economics writer berated Bob Dole and Bill Clinton for raising expectations about how fast the US economy could grow. Instead, he argued, what those candidates should have been doing was lowering the American people's expectations.
Now that article was written just as the economy was absolutely exploding. So what changed people's attitudes ... maybe never at the New York Times, but everybody else's attitudes ... was that the economy started just absolutely growing, and sometimes I think you need to get a bit of a lucky break.
I always think of the example of Seattle, which in the '70s was like a rust-belt place that wasn't in the Midwest. There was a billboard: "Will the last person leaving Seattle turn out the lights." And the break they caught was that Paul Allen and Bill Gates moved back from Albuquerque to Seattle, and that break helped launch Seattle as a tech hub. And maybe we're going to get lucky, that just when we need it the most, AI will prove to be a productivity-enhancing general purpose technology that breaks us out of this cycle. And if we can break ourselves out of this cycle, then I think you change the conversation. You change the negative conversation about trade. You change the negative conversation about what democracy can do. You change it about immigrants, because people start thinking, "Oh, I see why the disruption could be worth it, because my life will be better, or my children's lives will be better."
So that's why I write a lot about AI in my newsletter, because I want to hopefully be that lever, because it would accelerate this entire process a lot, if we're about to see a sustained boom like the 1990s and beyond. That would be extraordinarily helpful. So I don't want to screw it up with bad regulation among other things.

Danny Crichton:
Well, one of the things ... you have a bunch of big suggestions and small suggestions ... but there were a couple of novel ones that I enjoyed. One was this idea of an AP Progress class, so maybe talk a little bit about this. The second, which I hadn't heard of before, and I don't know if it was invented by you or by someone else and you borrowed it, is this idea of a proactionary principle versus a precautionary principle.
So I'd love to talk about those two, and how you think that might change the culture to be a little bit more optimistic and growth-minded.

James Pethokoukis:
So much of our regulation, especially on emerging technologies, and boy, with AI, people almost use the exact language of the precautionary principle, which is, "Better safe than sorry. Risks aren't worth it. The upside will be far exceeded by the downside." It has infused our culture and our regulation for decades. It's why, even in the worst nuclear disasters in rich, well-run countries, so we're setting aside the Soviet Union ... which means the United States and Japan, Three Mile Island and Fukushima ... despite the fact that you did not have extraordinary casualties from the accidents, none from the accidents themselves, maybe some from the evacuation in Japan, we decided to basically move away from nuclear. In Japan's case, to absolutely shut it down. That is the precautionary principle.
The proactionary principle says that never taking a risk, that's the big risk. That is absolutely the risk, because then you're unprepared: it will mean slower progress, it'll mean less growth, and then you aren't ready to deal with big problems. And boy, oh boy, if there's anything we learned during the pandemic, it's that it helps to be a rich, technologically advanced country.
So it is a different framework, just like one we can live by in our everyday lives, and that's what I preach to my kids: "By never taking a risk, you're actually taking a tremendous risk." But it's also, hopefully, a different way for regulators to think.
And the other one, the AP Progress class, is a small idea, but it's amazing, and again, I have a lot of kids, and I see how they view the world, how many basic improvements in human life people just don't know about. They think it's just the opposite. They think there are more poor people in the world every year, and there aren't, and they think pollution is worse, and it's not.
So I would like not only a class that would be a myth-busting class, but one that actually shows how we got from everyone living on $2 a day to the median family income in the United States being like $80,000. How did that happen? So it'd be a class that's fact-checking, but it would also be, I think, a class of pretty interesting economic history, because if you ask people how that happened, maybe they'll say the industrial revolution, and that's about as far as it goes, but it's far more interesting. So it'd also be a story of science, and a story about business, which hardly gets covered at all in high school, other than to talk about robber barons.
So I think it would reframe the debate. We have people who don't know how we got here, and it's going to be very hard to go forward.

Danny Crichton:
In Robert Caro's LBJ series, early on, Lyndon and his wife, Lady Bird, are in the Hill Country. The Hill Country is extremely rural. It's before FDR's Rural Electrification Act. There's no power, there are no appliances. Laundry takes 12 to 14 hours a day. Cooking takes another five to six hours a day. It's incredibly lonely, and he has an amazing two chapters where, over 30, 40 pages, you really get into the heads of the people, the misery, the complete catastrophic misery, of this place before the electricity comes in, and the appliances later on. And it was one of the first times that ... I always knew that appliances were extremely important, that dishwashers and laundry machines radically improved life for families, and particularly for women, who were expected to do this work in these families, and liberated them from that time sink to be able to do work or whatever they wanted to do, and pursue their passions.
But it was only in those paragraphs, one after the other, that you really get into the head of how awful it was, and how much better it is today. And there was no history book that ever said that. The washing machine alone deserves a chapter in a history textbook. So maybe it goes not in US history, because we can't fit any more into APUSH; we'll just talk about tariffs and robber barons and the Gilded Age. But nonetheless, in maybe an AP Progress class, you get a sense of, look, it was manual labor, dozens of hours a week, to do laundry for a family of 12 with a lot of kids.

James Pethokoukis:
Yeah, there's a book ... and I might be goofing up the title a little bit. It's something like The Hundred Most Important Inventions Ever Made. It's not just the light bulb; it includes the cargo container ship. Maybe that's not particularly sexy to people versus, I don't know, the laser, but the cargo ship and those containers, super important.
If you understood the history of the most important inventions, you'd have a great foundation for understanding what created the modern world. And kids don't know this. They may have some general idea, but they don't have a deep understanding, both of how we got here and of how we can get to the next place, because it's going to require technological progress. It's going to require people taking risks. It's going to require entrepreneurs. It will require lots of people doing science, maybe with their AI research assistants. That's what it's going to require.
If we don't understand what it's going to require, then you really can't judge what your politicians are doing, whether that's actually moving you forward, or just addressing niche problems, or ones that really won't move the ball forward in a significant way.

Danny Crichton:
When you get into the history of science, in particular, that is a subject that's just not taught in grade school, but it is really the history of both success and, ultimately, failure. Many experiments fail, and when you get into the lives of even a Thomas Edison or a Newton, yes, these are incredibly successful scientists, but then you get into their actual work, and you realize that one of the reasons they're successful is that they tried thousands of things. These were not folks who did two experiments, got lucky on both, and discovered calculus, discovered electricity, discovered how to move it around. They invented, which meant that there were many failures along the road.
And I think in our failure-avoidance model of education, we don't allow students to experiment. We don't have them go to a maker space. They have to memorize math and reading to try to pass tests every year. There's not a lot of space to realize that that age of discovery, finding new tools, finding new products, really takes a lot of tenacity, a lot of grit, to use Angela Duckworth's framework. And we just aren't building that into the curriculum.

James Pethokoukis:
And then you end up with a society that is unable to grasp the fact that the business model of SpaceX is failure. It is to launch rockets that fail, learn, iterate, and launch again. It has been very difficult to get that message through to the media, to business reporters. SpaceX is intentionally trying to do that. That doesn't mean this is a failed company. What do all those investors know? They understand the model, and that total rejection of the role of failure, I get it.

Danny Crichton:
I would say, from the Lux Capital perspective, I always describe it this way: a lot of folks take on financial risk in the venture capital industry. But one of the things that we do ... not uniquely, but we're part of a very small group ... is take on technical risk. We take on categories in biology that have never been done by scientists outside of a laboratory before. And we're going, "Look, we think it'll work, we think it'll be scalable. And if it does, it's amazing, because we will solve an entire type of disease."
Think of the mRNA vaccine from 10 years ago. Not a company we invested in, but we would've loved to at the time. Moderna took years, and in that case, they came up with the technology before we even had an application for it. They came up with the mRNA vaccine, and only later were they like, "Look, we can connect the dots here, and suddenly we have this platform that works perfectly, and we can use Operation Warp Speed and do it in a couple of months." But it had to be on the shelf. It took 10 years to get it on the shelf. And then once we had that, there were all the applications that came after.
And I try to remind people that we always have to take on technical risk in our little part of the venture industry, where, if we're not willing to take the science out of the lab and start to commercialize it, none of this gets built. And most VCs aren't willing to take on that risk, quite frankly. It's much easier to look at a spreadsheet, look at an Excel model, and say, "Well, new accounting software, we know that market is 20 billion, and if we get 10%, that's two billion. That's a much safer proposition."

James Pethokoukis:
"We'll sleep well."

Danny Crichton:
Yeah, exactly, "We'll sleep well."
But as you say, with the proactionary principle, not as well as we should, given the alternative universe we could be in, and that's what we're always looking for.

James Pethokoukis:
That reinforces it, because when you were mentioning the vaccine ... and I do talk about economics and culture in the book ... I always think of the opening scene of the film I Am Legend, one of the great zombie takeovers, where there's a scientist saying, "Hey, good news, we've cured cancer," and of course, the point of the movie is that the cure turns us all into zombies. We got to enjoy the cure for about a week and a half, and then we started turning into zombies, so-

Danny Crichton:
Exactly.

James Pethokoukis:
Not helpful. Those movies are great. I love them. But those can't be the only images of science and the future that we get, and I think, to a great extent, those are the only images we have.

Danny Crichton:
Absolutely. But let me ask you: you obviously want to create more of this up-wing mentality. We want more science, more progress across the board, more growth. In the center-right universe that we're both in, I feel like there are always two directions. One is, this is just about more dollars, so I call this the economic model. The National Science Foundation needs more funding, NIST needs more funding, NIH needs more funding. If they had more dollars, we'd get more progress. And there's a lot of evidence from research and development studies saying that every dollar we put into basic science comes back as wealth generation for the country at a certain rate.
And then there's another angle, which I would call the sociology model, which is to say, look, it's not just about dollars, it's also how we allocate them. It's the structures and institutions that we're building, and many of those structures and institutions are not as effective as they used to be. So there are more rules at the NSF. There are more requirements, more paperwork. Lab directors are spending 30, 40% of their time filling out paperwork for the central government, or whatever the case may be.
And so I'm just curious from your perspective, is it more of the economics? Is it more of the sociology? Is it both? How do you reconcile that, as part of your program?

James Pethokoukis:
If it wasn't clear to me when I wrote the book, it's certainly clear to me now: I want more R&D, public and private. But if you think that we should be doing more public R&D, and that there's going to be a big return on investment, then that absolutely needs to be accompanied by reform to get the most bang for the buck, to make sure that good ideas don't get suffocated, that there are mechanisms in place so that the one person with a good crazy idea can't be outvoted by the nine people who think it's a terrible idea.
It's going to be extremely hard to spend that kind of money, as we're seeing, because the new funding that was authorized for science in the CHIPS and Science Act is not getting fully appropriated. There's always going to be a lot of resistance, especially if you have big budget deficits.
So one, I think you need to push for that funding, but it has to be accompanied by the kinds of reforms ... so you can tell Congress, "Give us more, but trust me, we're going to squeeze every bit of innovation potential out of every dollar." You can't just say, "The institutions are in place, we're fine. Just cut the check. We'll turn on the innovation machine."
I think, just as a practical matter, if you believe the budget constraints exist, any new spending needs to be predicated on that reform also happening. I think both are super important, and I don't think they can be separated.

Danny Crichton:
I always say the hardest part for Congress is that a lot of this, with discoveries, is a power law. That's why we're in the venture industry. So the biggest challenge is, look, thousands of projects all add a little bit to normal science, in the Thomas Kuhn, Structure of Scientific Revolutions sort of way. But one of those is so fundamental that it's worth so much more than everything else.
Whether that is the mRNA vaccine, whether that is CRISPR or optogenetics, there are so many of these examples where, "Look, that's probably worth all of the last 10 years of the budget combined." To get that vaccine, that is, at the NSF scale, about a hundred billion bucks. And I think most of us would say that for a hundred billion, in a multi-trillion-dollar annual budget for the US government, to have a vaccine that we can all use and go back to regular life during COVID, that's a steal.
And so the challenge is always that you have so many of these projects and you just don't know in advance which one it is, but one of them is worth that huge amount of money. And that is why a lot of venture capitalists generally say, "Look, the government can't do this, because the power law is so hard to explain to a government bureaucrat, that not every case is equal."

James Pethokoukis:
That will always be a problem. Maybe you can make it less of a problem, and again, that depends on leadership. That depends on the people we elect to Congress. That depends on our president. That's why you really need the private sector to also be involved in doing R&D. It's never going to do as much, or nearly as much, of the fundamental R&D that government does, but that's why you need both sides fulfilling their roles.
But that kind of risk aversion is the danger: the attitude that if we go and spend a lot more money on R&D and there isn't a miracle cure tomorrow, or something that seems like it's going to be great ends up failing, then the whole thing was a failure.
We already see it with AI: people already think the AI revolution is over, that it's a failure.

Danny Crichton:
Yes, right.

James Pethokoukis:
And there was literally ... and I just wrote about it ... there was literally a New York Times op-ed saying, "It's a failure, and we're spending all this money on data centers, and it's all polluting, or using too much energy. We're using too many watts." That attitude is an absolute American dream killer.

Danny Crichton:
Couldn't agree more. Let me do one more ... we have a little bit more time, and we're recording this in early July. Last week there was a major Supreme Court decision overturning Chevron deference, which is going to change the regulatory state quite a lot. It got lost in the aftermath of the Biden-Trump debate, but Chevron deference gave regulatory control to government agencies, where they got to interpret the rules they administer themselves. So if we want to know, under the Clean Air Act, what the definition of clean air is, that gets sent to the EPA, to EPA experts, and now the Supreme Court says, "Look, this has to come from Congress or the courts. Government regulatory agencies can't regulate themselves and choose their own rules. That's not how the branches of government work."
How does that affect the future of progress going forward? Obviously, the national news media was extremely negative on the decision that came out of the Supreme Court, but I'm curious if you're in line with that, or if you think it might open up a window to the future for your conservative futurist view.

James Pethokoukis:
Yeah, man, I can see it opening up a window, especially if we decide Congress needs to be Congress, which means doing the boring job of writing laws that aren't totally vague, rather than leaving it up to agencies to fill in the gaps in a vague law. "Now I'm going to go give a speech, or go on MSNBC, or go on Fox." Hey, that's fun. I love going on TV. It's a great time. It's easy. Actually doing the hard work of a legislator is hard.
So one, we need a Congress that does the hard work, and we need to make sure they have big, well-funded staffs to help them do that hard work. But the media, which is freaking out over this, it is really a case of the seen and the unseen. They're focusing on what we got: cleaner air, absolutely. Cleaner water, absolutely. That's good, but there's no focus on what we don't have, because it doesn't exist. And there's been very little interest in showing people the world that we don't have: a world where, I don't know, there's no debate about climate change, because there are a thousand nuclear reactors in this country, and had we been pushing that technology for half a century, maybe we'd already have fusion reactors.
The inability to focus on the path not taken, I think, is absolutely crippling. That path not taken is a better path. It's a healthier country. It's a far richer country. It's a country where ... I'm not saying there wouldn't be problems, and there'd probably be new problems. I'll take my chances with that, rather than just hoping and muddling through, lowering expectations, and telling people that we need to have far fewer people on Earth, and that they all need to live in homes made out of organic mushrooms. That's the future.

Danny Crichton:
So you have this Robert Frost theme, the path not taken. You published your book, The Conservative Futurist, a couple of months ago. The feedback has been great. What's next? What's the path not taken for you? What other writing are you working on? What's top of mind, seeing as we're into the second half of 2024?

James Pethokoukis:
So I have my Substack newsletter, Faster, Please! Please subscribe as quickly as possible. Faster.
But one thing I mentioned in the book is something called the Genesis Clock. If you're familiar with the Doomsday Clock put out by the Bulletin of the Atomic Scientists, they're constantly moving it closer to, or further away from, the midnight of humanity, depending on any number of subjective factors. Originally it was nuclear war, then it became the environment, then inequality, then AI, and it's always a big deal.
I want to create just the opposite. I want to create, hopefully, something more empirical, so it's not just random, based on what a bunch of people think that day: a symbolic representation of how close we are getting to solving so many big problems that you could say it's the dawn of a new time for humanity. That is actually something I'm actively working on right now, the Genesis Clock. The Doomsday Clock, every time they move it 30 seconds, it's a big news story. Well, I think there needs to be another side to that trade, and that's what I'm trying to generate.

Danny Crichton:
Well, that's awesome, James. James Pethokoukis, author of The Conservative Futurist. Thank you so much for joining us.

James Pethokoukis:
Danny, thank you.
