5 Sources
[1]
Nations meet at UN for 'killer robot' talks as regulation lags
GENEVA, May 12 (Reuters) - Countries are meeting at the United Nations on Monday to revive efforts to regulate the kinds of AI-controlled autonomous weapons increasingly used in modern warfare, as experts warn time is running out to put guardrails on new lethal technology.

Autonomous and artificial intelligence-assisted weapons systems are already playing a greater role in conflicts from Ukraine to Gaza. And rising defence spending worldwide promises to provide a further boost for burgeoning AI-assisted military technology. Progress towards establishing global rules governing their development and use, however, has not kept pace, and internationally binding standards remain virtually non-existent.

Since 2014, countries that are party to the Convention on Conventional Weapons (CCW) have been meeting in Geneva to discuss a potential ban on fully autonomous systems that operate without meaningful human control, and to regulate others. U.N. Secretary-General Antonio Guterres has set a 2026 deadline for states to establish clear rules on AI weapon use. But human rights groups warn that consensus among governments is lacking.

Alexander Kmentt, head of arms control at Austria's foreign ministry, said that must quickly change. "Time is really running out to put in some guardrails so that the nightmare scenarios that some of the most noted experts are warning of don't come to pass," he told Reuters.

Monday's gathering of the U.N. General Assembly in New York will be the body's first meeting dedicated to autonomous weapons. Though not legally binding, diplomatic officials want the consultations to ramp up pressure on military powers that are resisting regulation due to concerns the rules could dull the technology's battlefield advantages.
Campaign groups hope the meeting, which will also address critical issues not covered by the CCW, including ethical and human rights concerns and the use of autonomous weapons by non-state actors, will push states to agree on a legal instrument. They view it as a crucial litmus test of whether countries can bridge divisions ahead of the next round of CCW talks in September.

"This issue needs clarification through a legally binding treaty. The technology is moving so fast," said Patrick Wilcken, Amnesty International's Researcher on Military, Security and Policing. "The idea that you wouldn't want to rule out the delegation of life or death decisions ... to a machine seems extraordinary."

ARMS RACE

The New York talks come after 164 states supported a 2023 U.N. General Assembly resolution calling for the international community to urgently address the risks posed by autonomous weapons. While many countries back a binding global framework, the United States, Russia, China and India prefer national guidelines or existing international laws, according to Amnesty.

"We have not been convinced that existing law is insufficient," a U.S. Pentagon spokesperson told Reuters, adding that autonomous weapons might actually pose less risk to civilians than conventional weapons. The governments of India, Russia, and China did not respond to requests for comment.

In the absence of regulation, autonomous systems are proliferating. Weapons experts at the Future of Life Institute think tank have tracked the deployment of roughly 200 autonomous weapon systems across Ukraine, the Middle East and Africa. Russian forces, for example, have deployed some 3,000 Veter kamikaze drones - capable of autonomously detecting and engaging targets - to Ukraine, according to its data. Ukraine has, meanwhile, used semi-autonomous drones in the conflict. The Ukrainian government declined to comment. Israel has used AI systems to identify targets in Gaza.
Its mission in Geneva said it supported multilateral discussions and uses data technologies in full accordance with international law.

Human Rights Watch, however, said crucial questions of accountability under international law remain unresolved, and warned in a report last month that unregulated autonomous weapons present a range of threats to human rights and could provoke an arms race if left unchecked.

And campaigners like Laura Nolan of Stop Killer Robots worry there is currently little to ensure defence firms will develop AI-driven weapons responsibly. "We do not generally trust industries to self-regulate ... There is no reason why defence or technology companies should be more worthy of trust," she said.

Reporting by Olivia Le Poidevin; Editing by Joe Bavier
[2]
Nations Meet at UN for 'Killer Robot' Talks as Regulation Lags
[3]
Nations meet at UN for 'killer robot' talks as regulation lags
The UN is holding its first General Assembly meeting on autonomous weapons, aiming to revive stalled efforts to regulate AI-controlled arms amid growing use in conflicts like Ukraine and Gaza. Experts warn time is running out, urging legally binding global rules before an unchecked arms race escalates.
[4]
'Politically Unacceptable, Morally Repugnant': UN Chief Calls For Global Ban On ...
"There is no place for lethal autonomous weapon systems in our world," Mr. Guterres said on Monday, during an informal UN meeting in New York focused on the use and impact of such weapons. "Machines that have the power and discretion to take human lives without human control should be prohibited by international law."

The two-day meeting in New York brought together Member States, academic experts and civil society representatives to examine the humanitarian and human rights risks posed by these systems. The goal: to lay the groundwork for a legally binding agreement to regulate and ban their use.

Human control is vital

While there is no internationally accepted definition of autonomous weapon systems, they broadly refer to weapons such as advanced drones which select targets and apply force without human instruction. The Secretary-General said in his message to the meeting that any regulations and prohibitions must make people accountable. "Human control over the use of force is essential," Mr. Guterres said. "We cannot delegate life-or-death decisions to machines."

There are substantial concerns that autonomous weapon systems violate international humanitarian and human rights laws by removing human judgement from warfare. The UN chief has called for Member States to set clear regulations and prohibitions on such systems by 2026.

Approaching a legally binding agreement

UN Member States have considered regulations for autonomous weapons systems since 2014 under the Convention on Certain Conventional Weapons (CCW), which deals with weapons that may violate humanitarian law. Most recently, the Pact for the Future, adopted in September last year, included a call to avoid the weaponization and misuse of constantly evolving weapons technologies. Stop Killer Robots - a coalition of approximately 270 civil society organizations - was one of the organizations speaking out during this week's meeting.
Executive Director Nicole van Rooijen told UN News that consensus was beginning to emerge around a few key issues, something which she said was a "huge improvement." Specifically, there is consensus on what is known as a "two-tiered" approach, meaning that there should be both prohibitions on certain types of autonomous weapon systems and regulations on others.

However, there are still other sticking points. For example, it remains unclear what precisely characterizes an autonomous weapon system and what it would look like to legislate "meaningful human control."

The Secretary-General has repeatedly called for a ban on autonomous weapon systems, saying that the fate of humanity cannot be left to a "black box." Recently, however, there has been increased urgency around this issue, in part due to the quickly evolving nature of artificial intelligence, algorithms and, therefore, autonomous systems overall. "The cost of our inaction will be greater the longer we wait," Ms. van Rooijen told us.

Ms. van Rooijen also noted that systems are becoming less expensive to develop, something which raises concerns about proliferation among both State and non-state actors. The Secretary-General, in his comments on Monday, also underlined the "need for urgency" in establishing regulations around autonomous weapon systems. "Time is running out to take preventative action," Mr. Guterres said.
[5]
Nations meet at UN for 'killer robot' talks as regulation lags
The UN General Assembly convenes its first meeting on autonomous weapons systems, aiming to establish global regulations as AI-controlled arms proliferate in modern conflicts.
The United Nations General Assembly is holding its first-ever meeting dedicated to autonomous weapons systems on Monday, May 12, 2025. This gathering aims to revive efforts to regulate AI-controlled autonomous weapons, which are increasingly being used in modern warfare [1][2][3]. The meeting comes as experts warn that time is running out to establish guardrails on new lethal technology.

Autonomous and AI-assisted weapons systems are playing an increasingly significant role in conflicts worldwide, from Ukraine to Gaza. With global defense spending on the rise, there is a growing push for the development of AI-assisted military technology [1][2][3]. However, progress in establishing global rules governing their development and use has not kept pace, and internationally binding standards remain virtually non-existent.

Since 2014, countries that are part of the Convention on Conventional Weapons (CCW) have been meeting in Geneva to discuss potential regulations on autonomous weapons systems [1][2][3]. UN Secretary-General Antonio Guterres has set a 2026 deadline for states to establish clear rules on AI weapon use [1][2]. However, human rights groups warn that consensus among governments is lacking.

While many countries support a binding global framework, major military powers such as the United States, Russia, China, and India prefer national guidelines or existing international laws [1][2][3]. A U.S. Pentagon spokesperson stated, "We have not been convinced that existing law is insufficient," adding that autonomous weapons might actually pose less risk to civilians than conventional weapons [1][2].

In the absence of regulation, autonomous systems are proliferating rapidly. The Future of Life Institute think tank has tracked the deployment of approximately 200 autonomous weapon systems across Ukraine, the Middle East, and Africa [1][2][3]. For instance, Russian forces have reportedly deployed around 3,000 Veter kamikaze drones capable of autonomously detecting and engaging targets in Ukraine [1][2].

Human Rights Watch has warned that unregulated autonomous weapons present a range of threats to human rights and could provoke an arms race if left unchecked [1][2]. UN Secretary-General Antonio Guterres has called for a global ban on lethal autonomous weapon systems, stating, "There is no place for lethal autonomous weapon systems in our world" [4].

The New York talks are seen as a crucial litmus test for countries to bridge divisions ahead of the next round of CCW talks in September [1][2][3]. Campaign groups hope the meeting will push states to agree on a legal instrument addressing critical issues not covered by the CCW, including ethical and human rights concerns and the use of autonomous weapons by non-state actors [1][2].

As the technology continues to advance rapidly, there is a growing sense of urgency among experts and campaigners. Alexander Kmentt, head of arms control at Austria's foreign ministry, emphasized, "Time is really running out to put in some guardrails so that the nightmare scenarios that some of the most noted experts are warning of don't come to pass" [1][2][3].