coming soon to a battlefield near you
In this post, we will focus on one particular aspect of the global race toward Armageddon: the deployment of fully autonomous, AI-piloted, weaponized drones and drone swarms. Specifically, low-cost, “kamikaze”-style drones which, once deployed, fly entirely on their own, either individually or (more likely) in swarms, armed with the purchaser’s choice of munitions.
The available ammo for your drone of choice, in order of popularity:
- a 1kg payload of C4 explosive
- a 9mm long-gun connected to a 90-shot auto-feed clip
- a single RPG (rocket-propelled grenade)
While there has obviously been ample “visualization” of this scenario in the sci-fi movie canon (most notably, if at a much larger scale, the massive “squid” invasion of Zion in The Matrix Revolutions), we no longer have to rely on special effects. The age of mass-produced, inexpensive, fully autonomous, weaponized drones is upon us. The first documented case of a fully autonomous AI drone kill (no human operator) took place in Libya in March 2020… it will most certainly not be the last.
Obviously the US has a strong lead in lethal, hardened, militarized UAV development. That’s not comforting, because the 2nd and 3rd places in this particular arms race belong to China and Turkey… and unlike the US (maybe?), those two countries have very few restrictions, and even fewer qualms, about selling their tech to the highest bidder.
If you think this is bullsh*t, take a look for yourself:
Weaponized Drones: The Global Arms Market
The product is the Kargu Drone.
Technically, the “Kargu Rotary Wing Autonomous Loitering Munitions System”
The manufacturer is Turkey-based STM Defense.
I think we’ll just use photographs and video stills to illustrate the remainder of this point. They pretty much speak for themselves:
US Military Tests Weaponized Drones
courtesy of the Washington Post / ViceTV
Weaponized Drones / “Killer Robot” Videos
To bring it all together, we present 3 excellent videos:
1) A solid piece from the New York Times featuring Stuart Russell and mapping out the general features of the argument,
2) the actual (chilling) sales video from STM Defense selling their Kargu “swarm” tech,
and…
3) Slaughterbots, a dramatized short film which foresaw this problem 5 years before it emerged, created by (again) Stuart Russell, DUST Films, and the prescient Future of Life Institute.
What a real drone swarm will look like
Murmuration, 40,000 starlings (flight404, via Vimeo).
1. New York Times: AI Killer Robots
2. STM: Kargu AI Autonomous Drone Swarm
3. Slaughterbots [DUST]
Drone / AI CyberWarfare Books
- Kill Decision, by Daniel Suarez
One final note to contemplate:
Your shiny new Tesla is hackable.
You don’t think these drones are too?
[This post is dedicated to the notorious Zach Cowan,
U.S. Navy, callsign: DK, “Drone Killer”]
UPDATE: 2023.JAN.31
Directive 3000.09: Killer AI Robot Attack Systems
The Pentagon (U.S. DoD) has just released its revised guidance, “Directive 3000.09,” governing the development and deployment of fully autonomous, AI decision-making, live-fire weapons systems. And it ain’t pretty.
Skynet, here we come:
“The DoD has consistently opposed a policy standard of ‘meaningful human control’ when it comes to both autonomous systems and AI systems,” Gregory Allen, director of the Project on AI Governance at the Center for Strategic and International Studies, told me. “The preferred DoD term of art is ‘appropriate levels of human judgement,’ which reflects the fact that in some cases – autonomous surveillance aircraft and some kinds of autonomous cyber weapons, for example – the appropriate level of human control may be little to none.”
— Forbes,
The Pentagon Updates its Policy on ‘Killer Robots’
A major area of development is “non-human targeting drone defense systems.” It’s basically the AI version of the “stand your ground” gun laws: if a drone is attacked or judges itself substantially threatened, the Pentagon advocates for on-board, autonomously triggered weapons systems that launch counter-attacks, disabling or destroying the threats, which, for these purposes, are assumed to be unmanned vehicles, missile-launching platforms, or unmanned enemy drones. But the question is: how can the machine tell whether these things really are “unmanned,” or whether there are humans in nearby structures? This is the problem with autonomous weaponized drones: once the decision is out of human hands, the cascade effects of multiple lightning-speed incorrect decisions can be utterly catastrophic.
Active war zones are, for sure, already some of the most dangerous places on planet Earth. But with the deployment of fully autonomous, AI-driven lethal-fire decision mechanisms, the speed of escalation and scope of destruction could go absolutely haywire… and haywire at computing speeds, nanosecond-scale decision cycles, many orders of magnitude faster than human thinking and reaction speeds.
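To make that scale argument concrete, here is a toy, back-of-the-envelope sketch in Python. It models no real system: the decision-cycle time, misclassification rate, and human reaction time are all made-up assumptions, and the two “sides” are collapsed into a single feedback loop. The point is only to show how a single wrong classification, fed back at machine speed, compounds long before any human can intervene.

```python
import random

# Toy model (no real system): two autonomous counter-fire loops watching
# each other. Each decision cycle, the system classifies the latest event;
# a small fraction of classifications are wrong, and any "threat" verdict
# (right or wrong) fires an immediate counter-strike that the other side
# must classify on its next cycle. We count how many strikes pile up
# before a human could plausibly step in.

DECISION_CYCLE_S = 1e-6    # assumed machine decision loop: 1 microsecond
HUMAN_REACTION_S = 0.25    # rough human recognize-and-react time: 250 ms
MISCLASSIFY_RATE = 0.001   # assumed 0.1% chance of a wrong classification

def strikes_before_human_intervenes(seed: int = 0) -> int:
    rng = random.Random(seed)
    strikes = 0
    incoming = False                 # is there a real incoming action to classify?
    cycles = int(HUMAN_REACTION_S / DECISION_CYCLE_S)
    for _ in range(cycles):
        threat_seen = incoming or (rng.random() < MISCLASSIFY_RATE)
        if threat_seen:
            strikes += 1             # autonomous counter-strike fired
            incoming = True          # the other side now "sees" a genuine attack
        else:
            incoming = False         # loop stays quiet until the next error
    return strikes

if __name__ == "__main__":
    print(f"Counter-strikes fired before a human could react: "
          f"{strikes_before_human_intervenes():,}")
```

With these invented numbers, a single 0.1% classification error typically snowballs into hundreds of thousands of counter-strike decisions inside the quarter-second it takes a human to register that anything is wrong. Change the assumptions however you like; the asymmetry between machine-speed loops and human-speed oversight is the part that doesn’t go away.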
The Drones are Coming! The Drones are Coming!
You can hear that loud humming/buzzing quadcopter noise filling the air…
Is anybody listening?