The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved
Curated by THEOUTPOST
On Tue, 13 May, 12:02 AM UTC
4 Sources
[1]
Nations meet at UN for 'killer robot' talks as regulation lags
GENEVA, May 12 (Reuters) - Countries are meeting at the United Nations on Monday to revive efforts to regulate the kinds of AI-controlled autonomous weapons increasingly used in modern warfare, as experts warn time is running out to put guardrails on new lethal technology.

Autonomous and artificial intelligence-assisted weapons systems are already playing a greater role in conflicts from Ukraine to Gaza. And rising defence spending worldwide promises to provide a further boost for burgeoning AI-assisted military technology. Progress towards establishing global rules governing their development and use, however, has not kept pace, and internationally binding standards remain virtually non-existent.

Since 2014, countries that are party to the Convention on Conventional Weapons (CCW) have been meeting in Geneva to discuss a potential ban on fully autonomous systems that operate without meaningful human control, and the regulation of others. U.N. Secretary-General Antonio Guterres has set a 2026 deadline for states to establish clear rules on AI weapon use. But human rights groups warn that consensus among governments is lacking.

Alexander Kmentt, head of arms control at Austria's foreign ministry, said that must quickly change. "Time is really running out to put in some guardrails so that the nightmare scenarios that some of the most noted experts are warning of don't come to pass," he told Reuters.

Monday's gathering of the U.N. General Assembly in New York will be the body's first meeting dedicated to autonomous weapons. Though not legally binding, diplomatic officials want the consultations to ramp up pressure on military powers that are resisting regulation due to concerns the rules could dull the technology's battlefield advantages.
Campaign groups hope the meeting, which will also address critical issues not covered by the CCW, including ethical and human rights concerns and the use of autonomous weapons by non-state actors, will push states to agree on a legal instrument. They view it as a crucial litmus test of whether countries can bridge divisions ahead of the next round of CCW talks in September.

"This issue needs clarification through a legally binding treaty. The technology is moving so fast," said Patrick Wilcken, Amnesty International's researcher on military, security and policing. "The idea that you wouldn't want to rule out the delegation of life or death decisions ... to a machine seems extraordinary."

ARMS RACE

The New York talks come after 164 states supported a 2023 U.N. General Assembly resolution calling for the international community to urgently address the risks posed by autonomous weapons. While many countries back a binding global framework, the United States, Russia, China and India prefer national guidelines or existing international laws, according to Amnesty.

"We have not been convinced that existing law is insufficient," a U.S. Pentagon spokesperson told Reuters, adding that autonomous weapons might actually pose less risk to civilians than conventional weapons. The governments of India, Russia and China did not respond to requests for comment.

In the absence of regulation, autonomous systems are proliferating. Weapons experts at the Future of Life Institute think tank have tracked the deployment of roughly 200 autonomous weapon systems across Ukraine, the Middle East and Africa. Russian forces, for example, have deployed some 3,000 Veter kamikaze drones - capable of autonomously detecting and engaging targets - to Ukraine, according to the institute's data. Ukraine has, meanwhile, used semi-autonomous drones in the conflict. The Ukrainian government declined to comment.

Israel has used AI systems to identify targets in Gaza.
Its mission in Geneva said it supported multilateral discussions and uses data technologies in full accordance with international law.

Human Rights Watch, however, said crucial questions of accountability under international law remain unresolved, and warned in a report last month that unregulated autonomous weapons present a range of threats to human rights and could provoke an arms race if unchecked. And campaigners like Laura Nolan of Stop Killer Robots worry there is currently little to ensure defence firms will develop AI-driven weapons responsibly. "We do not generally trust industries to self-regulate ... There is no reason why defence or technology companies should be more worthy of trust," she said.

Reporting by Olivia Le Poidevin; Editing by Joe Bavier
The UN General Assembly convenes its first-ever meeting on autonomous weapons systems, as experts urge swift regulation of AI-controlled arms to prevent potential humanitarian crises and an unchecked arms race.
The United Nations General Assembly is holding its first-ever meeting dedicated to autonomous weapons on Monday, May 12, 2025. The gathering aims to revive efforts to regulate AI-controlled autonomous weapons systems, which are increasingly being used in modern warfare. Experts warn that time is running out to establish guardrails on this new lethal technology.
Autonomous and AI-assisted weapons systems are playing an increasingly significant role in conflicts worldwide, from Ukraine to Gaza. Rising global defense spending is expected to further boost the development of AI-assisted military technology. Progress in establishing global rules governing their development and use, however, has not kept pace, and internationally binding standards remain virtually non-existent.
Since 2014, countries that are party to the Convention on Conventional Weapons (CCW) have been meeting in Geneva to discuss potential regulations for autonomous weapons systems. UN Secretary-General Antonio Guterres has set a 2026 deadline for states to establish clear rules on AI weapon use. However, human rights groups warn that consensus among governments is lacking.
While many countries support a binding global framework, major military powers such as the United States, Russia, China, and India prefer national guidelines or existing international laws. A U.S. Pentagon spokesperson stated, "We have not been convinced that existing law is insufficient," adding that autonomous weapons might actually pose less risk to civilians than conventional weapons.
In the absence of regulation, autonomous weapons systems are proliferating rapidly. The Future of Life Institute think tank has tracked the deployment of approximately 200 autonomous weapon systems across Ukraine, the Middle East, and Africa. For instance, Russian forces have reportedly deployed around 3,000 Veter kamikaze drones capable of autonomously detecting and engaging targets in Ukraine.
Human rights organizations and campaigners are raising alarm about the potential consequences of unregulated autonomous weapons. Patrick Wilcken, Amnesty International's Researcher on Military, Security and Policing, emphasized the need for a legally binding treaty, stating, "The idea that you wouldn't want to rule out the delegation of life or death decisions ... to a machine seems extraordinary."
The UN meeting is seen as a crucial litmus test for countries' ability to bridge divisions ahead of the next round of CCW talks in September. While not legally binding, diplomatic officials hope the consultations will increase pressure on military powers resisting regulation. The international community faces the challenge of balancing technological advancements with ethical considerations and human rights concerns in the rapidly evolving field of AI-controlled weapons systems.
United Nations experts urge the establishment of a global governance framework for artificial intelligence, emphasizing the need to address both risks and benefits of AI technology on an international scale.
11 Sources
China declines to join nearly 100 nations in signing a declaration prohibiting AI control of nuclear weapons, citing concerns over the agreement's potential impact on military AI development.
3 Sources
As world leaders gather in Paris for an AI summit, experts emphasize the need for greater regulation to prevent AI from escaping human control. The summit aims to address both risks and opportunities associated with AI development.
2 Sources
South Korea hosts a summit to discuss the implementation of artificial intelligence in military operations. The event brings together experts to address the potential benefits and challenges of AI in defense.
17 Sources
Google has quietly removed its commitment not to use AI for weapons or surveillance, signaling a shift towards potential military applications amidst growing competition and national security concerns.
40 Sources