Elon Musk, Man of Common Sense, Says Killer Robots Are Probably Bad


Killer robots seem like an inevitability, right? It’s like human nature is all about blood, guts, and sex, and this oncoming world of autonomy is being shaped by weapons and sexbots. Okay, sorry – that’s a bleak outlook on the new advances in technology we’re seeing every day. But no one can blame you for noticing the trend. And honestly, an entire military made up of autonomous killing machines? Ehhhhh, seems a little risky if you ask me.

Courtesy of The Guardian:

Tesla’s Elon Musk and Alphabet’s Mustafa Suleyman are leading a group of 116 specialists from across 26 countries who are calling for the ban on autonomous weapons.

The UN recently voted to begin formal discussions on such weapons which include drones, tanks and automated machine guns. Ahead of this, the group of founders of AI and robotics companies have sent an open letter to the UN calling for it to prevent the arms race that is currently under way for killer robots.

I guess you have to at least consider this option, and there’s always the argument that, “if we don’t have ’em and our enemies do, then we’re screwed.” It’s a post-apocalyptic dystopian “keeping up with the Joneses.” I mean, I’d certainly rather go into a war with autonomous weapons on our side than lead U.S. human soldiers into war to take on these robots. So, I can understand the desire to build a robotic army. But, I tend to align with Musk’s side. This is a potentially disastrous project.

The founders wrote: “Once developed, lethal autonomous weapons will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”

Yeah, so why not focus efforts on eradicating all autonomous weaponry? How do you make such a sensitive, impactful decision? On one hand, you’d no longer be sending humans into battle only to slaughter each other. On the other, if these autonomous weapons ended up in the wrong hands, how much damage could they do to innocent people? This is a huge risk to measure, and quite frankly, I’m not sure how we measure it. Thankfully, I’m just a writer and hold no position of importance.

Experts have previously warned that AI technology has reached a point where the deployment of autonomous weapons is feasible within years, rather than decades. While AI can be used to make the battlefield a safer place for military personnel, experts fear that offensive weapons that operate on their own would lower the threshold of going to battle and result in greater loss of human life.

Sounds bad either way – so, what do we do?

Stoney Keeley is the Editor in Chief of The SoBros Network. A strong advocate of GSD (get shit done) and #BeBetter, he’s down to talk Tennessee Titans and Alabama Crimson Tide football over a beer any day. Check him out covering the WWE for WrestlingNews.co. Follow on Twitter @StoneyKeeley and @WrestlingNewsCo.

