Palmer Luckey—Publicly advocated for autonomous lethal weapons, opposing UN Secretary-General's call for ban
In a May 2025 60 Minutes interview, Palmer Luckey publicly defended autonomous weapons that use AI to operate without human control, arguing 'it is too morally fraught an area, it is too critical of an area to not apply the best technology available.' He directly opposed UN Secretary-General Guterres' call for a treaty banning autonomous lethal weapons by 2026, dismissing the concern, and added, 'There's no moral high ground to making a land mine that can't tell the difference between a school bus full of children and Russian armor.' Anduril's systems include weapons that can identify, select, and engage targets autonomously.
Scoring Impact
| Topic | Direction | Relevance | Contribution |
|---|---|---|---|
| AI Safety | -against | secondary | -0.50 |
| Military & Defense Contracts | +toward | primary | -1.00 |
| Overall incident score | | | -0.745 |
Score = avg(topic contributions) × significance (high ×1.5) × confidence (0.66)
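To make the arithmetic explicit, here is a minimal Python sketch of the formula above. The function and variable names are illustrative assumptions, not the tracker's actual implementation; only the numeric inputs (contributions of -0.50 and -1.00, the high-significance multiplier of 1.5, and confidence 0.66) come from this report. Note that these inputs yield roughly -0.74, so the published -0.745 likely reflects a confidence value that was rounded for display.

```python
from statistics import mean

def incident_score(contributions, significance_multiplier, confidence):
    """Overall score = avg(topic contributions) * significance multiplier * confidence."""
    return mean(contributions) * significance_multiplier * confidence

# Topic contributions from the table above (AI Safety, Military & Defense Contracts).
contributions = [-0.50, -1.00]
score = incident_score(contributions, significance_multiplier=1.5, confidence=0.66)
print(f"{score:.4f}")  # -0.7425; slightly off the published -0.745, consistent with rounding of the confidence factor
```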
Evidence (2 signals)
Luckey advocated US should shift from 'world police' to 'world gun store'
Benzinga reported Palmer Luckey's statement: 'I've always said that we need to transition from being the world police to being the world gun store,' a remark that further articulates his vision for defense exports and the proliferation of autonomous weapons.
Luckey defended autonomous weapons on 60 Minutes, dismissed UN Secretary-General's ban call
In a CBS 60 Minutes interview, Palmer Luckey defended autonomous lethal weapons, arguing that AI should be applied to life-and-death military decisions. He dismissed UN Secretary-General Guterres' call for a treaty banning autonomous lethal weapons by 2026.