As artificial intelligence develops, pressure is mounting to control "killer robots"

Every day, we voluntarily hand over information about ourselves to machines. It happens whenever we accept an online cookie or use a search engine. We may not think about how our data will be sold and used before we press the "accept" button to reach the page we want, or we may be fully aware that it will be used to target us as consumers and persuade us to buy something we never knew we needed.
But what if these machines were using data to decide who should be targeted for killing as an enemy? The United Nations and a group of NGOs are deeply concerned that this scenario is close to becoming reality, and are therefore demanding international regulation of lethal autonomous weapons to avert a near future in which machines dictate choices of life and death.
Drone warfare unfolds on a wide scale in Ukraine
For several months, the Kherson region of Ukraine has been under continuous attack by Russian drones, which mainly target non-combatants. More than 150 civilians have been killed and hundreds wounded, according to official sources. An independent international human rights investigation concluded that these attacks constitute "crimes against humanity."
From the archive: civilians in Kharkiv have been subjected to hundreds of attacks by Russian drones.
The Ukrainian army also relies heavily on drones, and reports indicate that it is developing a drone "wall" to protect at-risk areas along the country's borders.
Once the exclusive preserve of the richest countries able to afford the most advanced and expensive systems, Ukraine has proven that, with a little creativity, low-cost drones can be modified to deadly effect. As this shift is mirrored in conflicts around the world, the nature of modern combat is being reshaped.
The growth of "digital dehumanization"
Yet however destructive this modern form of warfare may be, the rising spectre of drones and other autonomous weapons deepens ongoing concerns about "killer robots" that rain down from the sky and decide for themselves whom to attack.
Izumi Nakamitsu, the United Nations High Representative for Disarmament Affairs, said the organization's position on these weapons was clear, adding: "The Secretary-General has always said that the use of machines with fully delegated power to make decisions about taking human life is morally repugnant. It should not be allowed. In fact, it should be banned under international law."
Human Rights Watch, an international non-governmental human rights organization, has said that the use of autonomous weapons would be the latest and most serious example of "digital dehumanization," in which artificial intelligence makes a whole range of decisions affecting human lives, such as policing, law enforcement and border control.
Mary Wareham, director of the arms division at Human Rights Watch, warned: "Several countries with enormous resources are investing heavily in artificial intelligence and related technologies to develop land- and sea-based autonomous weapons systems. This is a fact. The United States is leading this trend, but other major countries such as Russia, China, Israel and South Korea are also investing heavily in autonomous weapons systems."
Proponents of using artificial intelligence in warfare often point to the limits of human capability to justify its expansion. Soldiers can misjudge, act on their emotions, need rest and, of course, demand wages, while machines, they claim, are getting better every day at identifying threats based on patterns of behavior and movement. Some proponents suggest the next step is to allow autonomous systems to decide for themselves when to open fire.

A drone flying in the skies over Afghanistan.
There are two main objections to allowing machines to control the battlefield: first, the technology is far from infallible; second, the United Nations and many other organizations consider the use of lethal autonomous weapons immoral.
Ms. Wareham of Human Rights Watch said: "It is very easy for machines to mistake human targets. People with disabilities are at particular risk because of the way they move; their wheelchairs can be mistaken for weapons. There are also fears that facial recognition and other biometric technologies cannot properly identify people with different skin tones. Artificial intelligence is still flawed, and it carries the biases of the people who programmed these systems."
As for the moral and ethical objections, Nicole van Rooijen, executive director of Stop Killer Robots, a coalition campaigning for a new international law on autonomy in weapons systems, said these weapons would make it very difficult to assign responsibility for war crimes and other atrocities.
She continued: "Who is accountable? Is it the manufacturer? Or the person who programmed the algorithm? It raises a whole host of issues and concerns, and their widespread use would be a moral failure."
A ban by 2026?
The pace of technological development, and evidence that AI-assisted targeting systems are already in use on the battlefield, are adding urgency to calls for international rules governing this technology.
In May, informal discussions were held at United Nations headquarters, at which Secretary-General António Guterres called on member states to adopt a legally binding agreement to regulate and prohibit the use of these weapons by 2026.
Attempts to regulate and prohibit lethal autonomous weapons systems are not new. In fact, the United Nations held the first diplomatic meeting on the subject in 2014, at the Palais des Nations in Geneva, where French ambassador Jean-Hugues Simon-Michel chaired four days of talks.
He described lethal autonomous weapons systems as "a challenging emerging issue on the disarmament agenda," even though none of these systems was yet in use in any conflict. The prevailing view at the time was that proactive action was needed to establish rules in case technology made lethal autonomous weapons systems a reality.
Eleven years on, the talks are still ongoing, yet there is no agreement even on a definition of autonomous weapons, let alone agreed rules governing their use. Nonetheless, NGOs and the United Nations are optimistic that the international community is slowly moving toward a shared understanding of the key issues.
Ms. van Rooijen of Stop Killer Robots said: "We are nowhere near negotiating a text."
However, she pointed to a recent text, which she called "quite promising, and it could form a basis for negotiations if the political will and courage are there," presented by the current chair of the Convention on Certain Conventional Weapons, a UN humanitarian-law instrument aimed at banning or restricting the use of specific types of weapons "considered to cause unnecessary or unjustifiable suffering to combatants, or to affect civilians indiscriminately."
Ms. Wareham of Human Rights Watch also sees the talks held at the United Nations in May as an important step forward. She said: "At least 120 countries fully support the call to negotiate a new international law on autonomous weapons systems. We are seeing great interest and support, including from Nobel Peace laureates, artificial intelligence experts, technology workers and religious leaders."
Ms. Nakamitsu, the Under-Secretary-General for Disarmament Affairs, said: "There is an emerging consensus on the need for a complete ban on fully autonomous weapons systems. When it comes to war, someone has to be held accountable."