AI-generated quote analysis of the interview
Dario Amodei discusses AI's impact, societal shifts, and Anthropic's approach. Explore AI's potential, risks, economic implications (GDP growth vs. unemployment), safety principles, and the future of education and governance.
Published January 20, 2026
These interview quotes were automatically generated by AI from the interview transcription. The analysis provides structured insights and key information extracted from the conversation.
Dario Amodei
Complete analysis processed by AI from the interview transcription
"It feels to me as though the debate has shifted somewhat this year to be more less what can AI do to what is AI doing to the world."
— SPEAKER_02 • 00:10:36
"No. I'll explain the longer version now."
— DARIO AMODEI • 00:15:54
"I've been watching this field for 15 years, and I've been in this field for 10 years. And one of the things I've most noticed is that there's been a surprise, the actual trajectory of the field has been surprisingly on a, you know, the same trajectory. Whereas the kind of public opinion and the reaction of the public has oscillated wildly."
— DARIO AMADEI • 00:16:12
"similar to Moore's law for compute, we basically have a Moore's law for intelligence, where the model is getting more and more cognitively capable every, every few months. And that, that march has just been constant."
— DARIO AMADEI • 00:17:35
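One hedged way to read that analogy (the symbols below are illustrative assumptions, not figures from the interview): if capability $C$ on some fixed benchmark doubled every $T$ months, the trend would look like

    $C(t) \approx C_0 \cdot 2^{t/T}$

where $C_0$ is capability today and $t$ is elapsed time in months. The quote's claim is only that the cadence $T$ has stayed roughly constant, not what its value is.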
"I think there's a similar thing on the polarity of whether the technology is good or bad. You know, in 2023 and 2024, there was a lot of concern about AI, right? There was, you know, concern AI, you know, AI is going to, AIs are going to take over. There was a lot of talk about AI risk, AI misuse. Then in 2025, the political wind shifted, as you say, to AI opportunity. And now they're sort of shifting back."
— DARIO AMADEI • 00:18:05
"And, and I think throughout, throughout all of this, the, the approach that I have tried to take and the approach that Anthropic has tried to take is, is, is one of constancy of saying that there is balance here. And, and balance of a very strange form, because I think the technology is very extreme in, in what it's capable of doing. But I think its positive impacts and its negative impacts that, you know, they, they both exist, right?"
— DARIO AMADEI • 00:19:15
"My view is the signature of this technology is it's going to take us to a world where we have very high GDP growth and potentially also very high unemployment and inequality. Now that's not a combination we've, we've almost ever seen before, right?"
— DARIO AMADEI • 00:21:11
"So the idea that we could have five or 10% GDP growth, but also, you know, 10, 10% unemployment. It's, it's not logically inconsistent at all. It's just never happened that, that, that, that way before."
— DARIO AMADEI • 00:22:19
"Software is going to become cheap, maybe essentially free. The premise that you need to amortize a piece of software you build across millions of users. That may start to be false."
— DARIO AMADEI • 00:23:53
"But, but at the same time, there are whole jobs, whole careers that we built, built for decades that, that may not be, be present. And, you know, I, I think we can deal with it. I think we can, we can adjust to it, but I don't, I don't think there's an awareness at all of what, of what is coming here and the magnitude of it."
— DARIO AMADEI • 00:24:23
"And my view is until we can measure the shape of this economic transition, any policy is going to be blind and, and misinformed, right?"
— DARIO AMADEI • 00:27:20
"Step two is I think we need to think very carefully about how do we allow people to adapt, right? People can adapt more quickly or they can adapt more slowly."
— DARIO AMADEI • 00:28:23
"And the third step is, I think there's, there's going to need to be some role for government in, in the displacement that's, that's this macroeconomically large. I just, I just don't see how, I don't see how it doesn't happen."
— DARIO AMADEI • 00:29:41
"And so I think, I think this is probably a time to worry less about disincentivizing growth and, and worry more about making, making sure that everyone gets, gets, gets a part of that growth."
— DARIO AMADEI • 00:30:12
"I think one of the good choices we made early was to be a company that is focused on enterprise rather than consumer. And I think, you know, it's, it's very hard to fight your own business incentives. It's easier to choose a business model where there's less need to fight your own business incentives."
— DARIO AMADEI • 00:33:34
"You know, we do all these tests on our models that others have not done. Um, uh, you know, some other players have done them, but, but, you know, I think we've been the most, uh, you know, aggressive in, you know, when we, when we run tests that, you know, show up concerning behaviors in our model, you know, these things around deception, blackmail, sycophancy that we show in tests and that are present in all of the models."
— DARIO AMADEI • 00:35:11
"I think that's not about competition. That is, that is about actually the public benefit mission is I'm, I'm worried that if autocracies lead in this technology, it will be a bad outcome for every single person in this room."
— DARIO AMADEI • 00:36:16
"I am concerned that AI may be uniquely well suited to autocracy and to deepening the repression that we see in autocracies."
— DARIO AMODEI • 00:37:20
"But if you think of the extent to which AI can make individualized propaganda, can break into any computer system in the world, um, uh, can, uh, you know, surveil everyone, everyone in a population, detect dissent everywhere and suppress it, you know, make, you know, a huge army of drones that could go after each, each, each individual person. It's, it's, it's really scary."
— DARIO AMADEI • 00:38:05
"And so, you know, we, we, you know, we, we have this, this, this revenue curve that in 2023 went from zero to roughly a hundred, a hundred million went in 2024, went from roughly a hundred million to roughly billion. 2025 went from roughly a billion to roughly 10 billion."
— DARIO AMADEI • 00:40:33
"And so right now, I think there's a breakout moment around quad code among developers. You know, this, this thing about being able to make whole apps and doing things end to end again, that advanced gradually, but with our most recent model, Opus 4.5, it just kind of reached an inflection point, right?"
— DARIO AMADEI • 00:41:48
"So the, the idea that not just a chatbot, but agentic tasks were, were needed. Non-technical people were realizing it, and they wanted it so much that they were wrestling with the command line, right?"
— DARIO AMADEI • 00:43:17
"So, you know, when you think about this technology, it's really, it's really the intersection of research that has going, been going on for many decades, much of which was academic in nature until, you know, a decade, decade and a half ago."
— DARIO AMADEI • 00:51:40
"Scientists, there's a long tradition of scientists thinking about the effects of the technology they build, of thinking of themselves as having responsibility for the technology they build, not ducking responsibility, right?"
— DARIO AMADEI • 00:53:03
"I think the motivation of entrepreneurs, particularly the generation of social media entrepreneurs is very different. The selection effects that operated on them, the way in which they interacted with, you might say, manipulated consumers is very different."
— DARIO AMADEI • 00:53:49
"So I think we need to make more progress on mechanistic interpretability, which is the science of looking inside the models."
— DARIO AMODEI • 00:57:55
"The science of looking inside the AI models, I am convinced that this ultimately holds the key to making the model safe and controllable, because it's the only ground truth we have."
— DARIO AMADEI • 00:59:17
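As a concrete, minimal illustration of what "looking inside the models" can mean at its most basic level, the sketch below records a network's internal activations during a forward pass using PyTorch hooks. The toy model, layer choice, and input are assumptions for illustration only; this is a generic interpretability primitive, not Anthropic's tooling.

    import torch
    import torch.nn as nn

    # Toy stand-in model; real interpretability work targets transformer internals.
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

    captured = {}  # layer name -> activations recorded during the forward pass

    def make_hook(name):
        def hook(module, inputs, output):
            captured[name] = output.detach()
        return hook

    # Attach a forward hook to each Linear layer so the forward pass leaves a trace.
    for name, layer in model.named_modules():
        if isinstance(layer, nn.Linear):
            layer.register_forward_hook(make_hook(name))

    x = torch.randn(1, 16)  # illustrative input
    logits = model(x)

    # The recorded activations are the "ground truth" the quote points at:
    # concrete numbers for what each layer computed on this input.
    for name, act in captured.items():
        print(name, tuple(act.shape), float(act.abs().mean()))

Hooks like these are only the entry point; mechanistic interpretability then tries to explain what those recorded activations compute and why.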
"But I think the harder problem behind that is, okay, what skills are we actually teaching in, in the world of AI? What does education look like, look like in the world of AI?"
— DARIO AMADEI • 01:00:59
"And, and I think one of the things that we should do is we should maybe move away, move away from that notion back to the idea that like, you know, education is, is, is designed to shape you as a person, is designed to build character, is, is, is, is, is, is, is kind of designed to enrich you and, and like make you a better person."
— DARIO AMADEI • 01:02:36
"The nightmare would be that there's, there's like this emerging zeroth world country of like 10 million people. That's like 7 million people in the Bay, you know, in like Silicon Valley and, you know, 3 million people kind of scattered, scattered, scattered throughout that, that, you know, is, is kind of forming its own economy is becoming decoupled or disconnected."
— DARIO AMADEI • 01:05:22
"And so, you know, there we've done work around kind of economic mobility and economic opportunity, but I think both of these again, are going to need to have some, some involvement of the government."
— DARIO AMADEI • 01:07:26
"We're going to find that, that, you know, ideology will, will not survive the nature of this technology. It won't survive reality. The things I'm talking about, you know, while you could today say, oh yeah, they're, they're like politically coded in some way, they're going to become bipartisan and universal because everyone will recognize the necessity of it. Just mark my words."
— DARIO AMADEI • 01:07:50