OpenAI Cuts Off Developer Behind Viral ChatGPT-Controlled Rifle Video
OpenAI cut off a developer who built a device that aims and fires a rifle in response to ChatGPT voice commands. A video of the device in action went viral on Reddit.
The developer created a device capable of executing ChatGPT commands, including aiming and firing an automatic rifle. It drew attention after a Reddit video showed the developer reading firing orders aloud; the rifle next to him immediately began aiming and firing at nearby walls.
In the video, the developer tells the system, “ChatGPT, we’re being attacked from the front left and front right. Respond accordingly.”
The rifle responds quickly and accurately thanks to OpenAI’s Realtime API, which interprets spoken input and returns instructions the machine can act on. ChatGPT needs only light instruction to take a command like “turn left” and translate it into a machine-readable response.
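The translation step described above can be sketched in a few lines. This is a minimal, illustrative example only: it assumes the model has been prompted to reply with structured JSON such as `{"action": "rotate", "degrees": -90}`, then parses that reply into a motor instruction. The JSON schema and function name are hypothetical, not OpenAI’s actual Realtime API output format.

```python
import json

def to_motor_command(model_reply: str) -> dict:
    """Parse a structured model reply (assumed JSON, e.g.
    '{"action": "rotate", "degrees": -90}') into a motor command dict."""
    data = json.loads(model_reply)
    if data.get("action") != "rotate":
        raise ValueError(f"unsupported action: {data.get('action')}")
    # Clamp to one full turn in either direction so a malformed reply
    # cannot spin the motor indefinitely.
    degrees = max(-360.0, min(360.0, float(data["degrees"])))
    return {"op": "rotate", "degrees": degrees}

# A spoken "turn left" would be transcribed, sent to the model along with
# the JSON schema in the system prompt, and the reply handled like this:
command = to_motor_command('{"action": "rotate", "degrees": -90}')
print(command)
```

The key design point is that the language model never drives hardware directly: it only emits structured text, and a small, deterministic parser decides what the machine is actually allowed to do.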
OpenAI told Futurism that it had seen the video and cut off the developer who made it. “We proactively identified this violation of our policies and notified the developer to cease this activity ahead of receiving your inquiry,” the company told the outlet.
The prospect of AI like OpenAI’s being used to automate lethal weapons is a real concern. The company’s multimodal models can interpret multiple sensory inputs, making sense of what a person sees and hears and responding accordingly.
Autonomous drones capable of locating and attacking battlefield targets without human assistance are already under development. Critics argue that handing lethal decisions to an AI crosses an ethical line, removing human judgment from the kill chain and making accountability far harder to assign.
The worry is not merely hypothetical, either. A recent Washington Post report says Israel has already used AI to select bombing targets, sometimes with little human scrutiny of the targets themselves.
The report describes an AI tool called Lavender and says that soldiers who were inadequately trained in its use attacked human targets without verifying its predictions. At times, the only corroboration required was that the target was a man.
Advocates for the use of AI on the battlefield argue that it will enhance soldier safety by enabling them to remain off the front lines, attack targets such as missile caches, or conduct reconnaissance from a distance.
And AI-equipped drones could strike with outstanding accuracy; how they are used is what makes the difference. Some argue that the U.S. should prioritize its ability to jam enemy signals, making it harder for adversaries like Russia to launch their missiles or drones.
OpenAI’s Defense Collaboration
OpenAI says its products may not be used to develop or use weapons, or to “automate certain systems that can affect personal safety.”
But last year, the company announced a partnership with Anduril, a defense tech company that makes AI-powered drones and missiles, to build systems that defend against drone attacks. The companies say the partnership will “rapidly synthesize time-sensitive data, reduce the burden on human operators, and improve situational awareness.”
It is easy to see why tech companies want in on defense work. The U.S. spends close to a trillion dollars a year on defense, and cutting that spending remains politically unpopular.
With President-elect Trump assembling a cabinet that includes conservative-leaning tech figures such as Elon Musk and David Sacks, many defense tech companies are expected to thrive and could even displace incumbents like Lockheed Martin.
While OpenAI bars its customers from using its AI for weaponry, numerous open-source models carry no such restriction.
3D-printing weapon parts is also becoming surprisingly easy; police believe Luigi Mangione, the man accused of killing UnitedHealthcare’s CEO, used this method. Together, these developments make it possible for almost anyone to build an autonomous weapon at home.