International Humanitarian Law Archives - Legal Cheek

Warfare technology: can the law really referee?

Harriet Hunter, law student at the University of Central Lancashire, explores the implications of AI in the development of weaponry and its effect on armed conflict under international humanitarian law


Artificial Intelligence (AI) is arguably the most rapidly emerging form of technology in modern society. Almost every sector and societal process has been, or will be, influenced by artificially intelligent technologies, and the military is no exception. AI has firmly earned its place as one of the most sought-after technologies available to countries in armed conflict, with many pushing to test the limits of autonomous weapons. The mainstream media has circulated many news articles on ‘killer robots’ and the potential risks to humanity — however, the reality of AI’s impact on the use of military-grade weaponry is not so transparent.

International humanitarian law (IHL) has been watching from the sidelines since the use of anti-personnel autonomous mines back in the 1940s, closely monitoring each country’s advances in technology and responding to the after-effects of their use.

IHL exists to protect civilians not directly involved in conflict, and to restrict and control aspects of warfare. However, autonomous weapons systems are developing faster than the law — and many legal critics are concerned that humanity might suffer at the hands of a few. But in a politically bound marketplace, is there any place for such laws? And if they were to be implemented, what would they look like, and who would be held accountable?

Autonomous weapons and AI – a killer combination?

Autonomous weapons have been at the forefront of military technology since the early 1900s, playing a large part in major conflicts such as the Gulf War. Most notably, the first use of autonomous weapons took the form of anti-personnel autonomous mines. These mines are set off by sensors, with no operator involvement in determining who is killed, inevitably causing significant loss of civilian life. This led to anti-personnel mines being banned under the Ottawa Treaty 1997. However, the use of autonomous weapons had only just begun.

In the 1970s, autonomous submarines were developed and used by the US Navy, a technology which was subsequently sold to multiple other technologically advanced countries. Since the deployment of more advanced AI, the sophistication of the weapons that countries have been able to develop has led to a new term being coined: ‘LAWS’. Lethal autonomous weapons systems (LAWS) are weapons which use advanced AI technologies to identify targets and engage them with little to no human involvement.

In academic research, LAWS are split into three ‘levels of autonomy’, each characterised by the amount of operator involvement required in their deployment. The first level is ‘supervised autonomous weapons’, otherwise known as ‘human on the loop’ — weapons that allow a human operator to intervene and terminate engagement. The second level is ‘semi-autonomous weapons’ or ‘human in the loop’ — weapons that, once engaged, will attack only pre-set targets. The third level is ‘fully autonomous weapons’ or ‘human out of the loop’, where the weapons system operates with no human involvement whatsoever.

LAWS rely on advances in AI to become more accurate. Currently, there are multiple LAWS either in use or in development, including:

  • The Uran-9 tank, developed by Russia, which can identify targets and engage without any operator involvement.
  • The Taranis unmanned combat air vehicle, being developed in the UK by BAE Systems — an unmanned jet which uses AI programmes to attack and destroy large areas of land with minimal programming.

The deployment of AI within the military has been far-reaching. Like the autonomous weapons themselves, the artificial intelligence behind them is increasingly complex, and certain aspects of AI have been utilised more than others. For example, facial recognition can be used on a large scale to identify targets within a crowd. Alongside this, certain weapons carry technology that can calculate the chances of hitting a target — and of hitting it a second time — by tracking its movements, a capability used especially in drones to follow targets as they move from building to building.

International humanitarian law — the silent bystander?

IHL is the body of law which applies during an armed conflict. It has a wide extra-territorial reach and aims to protect those not directly involved in the conflict, as well as to restrict warfare and military tactics. IHL has four basic tenets: distinction between civilians and combatants; proportionality (ensuring that military gain is balanced against harm to civilian life); precautions in attack; and the principle of ‘humanity’. IHL closely monitors the progress of the weapons that countries are beginning to use and develop, and is (in theory) considering how the use of these weapons fits within its principles. However, the law surrounding LAWS is currently vague, and with the rise of LAWS, IHL is having to adapt and tighten restrictions on certain systems.


One of IHL’s main concerns surrounds the rule of distinction. It has been argued that weapons which are semi- or fully autonomous (human in the loop and human out of the loop systems) are unable to distinguish between civilian and military targets. This would mean that innocent lives could be taken through the mistake of an autonomous system. As mentioned previously, autonomous weapons are not a new concept: following the use of anti-personnel autonomous mines in the 1940s, they were restricted because the mines could not distinguish between civilians and military personnel stepping onto them. IHL used the rule of distinction to propose a ban, which was signed by 128 nations in the Ottawa Treaty 1997.

The Martens Clause, restated in Additional Protocol I to the Geneva Conventions, aims to counter the idea that ‘anything not explicitly regulated is permitted’. IHL is required to control the development of weapons which directly violate certain aspects of law, and to a certain extent to pre-empt that development. An example of this would be the ban on blinding laser weapons agreed in 1995 — blinding was seen as a form of torture, directly violating a protected human right: the right not to be tortured. At the time, blinding laser weapons were not in use in armed conflict; however, the ethical implications of such weapons, including for prisoners of war, were a concern to IHL.

But is there a fair, legal solution?

Unfortunately, the chances are slim. More economically developed countries can purchase lethal autonomous weapons systems and navigate the political waters of that market — whilst less economically developed countries are unable to access these technologies.

An international ban on all LAWS has been called for, with legal critics stating that IHL cannot fulfil its aims to the highest standard while allowing the existence, development and use of LAWS. It is argued that the main issue which intertwines AI, LAWS and IHL is the question: should machines be trusted to make life-or-death decisions?

Even with advanced facial recognition technology, critics are calling for a ban: no technology is without its flaws, so how can we assume that systems such as facial recognition are fully accurate? The use of fully autonomous (human out of the loop) weapons, where a human cannot at any point override the technology, puts civilians at risk. It is argued that this completely breaches the principles of IHL.

Some legal scholars have argued that the use of LAWS should be a matter of social policy — a ‘pre-emptive governing’ of countries that use LAWS. This proposed system would allow and assist IHL to regulate weapons at the development stage, which, it is argued, is ‘critical’ to avoiding a ‘fallout of LAWS’ and preventing humanitarian crises. Such a policy would hold developers to account before any warfare begins. However, it could be argued that this falls outside the jurisdiction of IHL, which applies only once conflict has begun — leading to the larger debate about what the jurisdiction of IHL is, compared with what it should be.

Perhaps the implementation of potentially life-saving laws is being delayed because powerful countries assert their influence in decision-making; these countries have the power to block changes to international law where the ‘best interests’ of humanity do not align with their own military ambitions.

Such countries, the UK among them, are taking a ‘pro-innovation’ approach to AI in weaponry, meaning they are generally opposed to restrictions which could halt progress. However, it has rightly been noted that these ‘advanced technologies’ in the hands of terrorist organisations (which would not consider themselves bound by IHL) would have disastrous consequences. Proponents of this approach argue that a complete ban on LAWS could therefore lead to more violence than no ban at all.

Ultimately…

AI is advancing, and with it, autonomous weapons systems too. Weapons are becoming more advantageous to militaries, with the technology growing more accurate and more precise. International humanitarian law, continually influenced by political stances and the economic interests of countries, is slowly attempting to build and structure horizontal legislation. However, law and technology are not developing at a comparable pace, which concerns many legal critics. The question remains: is the law attempting to slow an inevitable victory?

Harriet Hunter is a first year LLB (Hons) student at the University of Central Lancashire, who has a keen interest in criminal law and the laws surrounding technology, particularly AI.

The rules of war

Law student Michal Smigla considers international humanitarian law and the consequences of disregarding it

Even during war, there are rules and laws that must be followed. These are known as jus in bello, the rules of war. By contrast, jus ad bellum deals with the question of whether a conflict is lawfully initiated. The focus of this article is the international humanitarian law (IHL) and customary IHL that may apply during an international armed conflict.

IHL and customary IHL establish the rules of how a state may act during armed conflicts, howsoever initiated. Whether or not the conflict was initiated in breach of the UN Charter or any other international rule is therefore irrelevant: the state must still comply with IHL whilst the conflict continues.

What is the point of IHL? The short answer is that, as humans, we have a duty to prevent suffering. Humanity has endured many disastrous conflicts which have led to major suffering and major loss of life; further suffering and loss of life must be prevented and have no place in the modern world. IHL ensures that even during conflict there are rules that protect life and prevent the needless suffering of civilians. It is civilians who are pushed into conflicts they did not choose to initiate, and so they must be protected. Consequently, commanding military personnel and heads of state exercising their prerogative must consider IHL and its rules, otherwise they risk a reputational stain and possible criminal liability.

IHL attempts to preserve life during conflict and protect civilians and some military combatants through three main principles:

(1) the distinction principle:

“the Parties to the conflict shall at all times distinguish between the civilian population and combatants and between civilian objects and military objectives and accordingly shall direct their operations only against military objectives.”

Therefore, it would be a breach of the distinction principle if foreign combatants were to bombard civilian areas (such as apartment blocks) indiscriminately, without distinguishing whether the objective is civilian or military. Indiscriminate bombings are, additionally, a grave breach of the Geneva Conventions. The force initiating an attack must engage in substantive target verification and do everything it can to ensure that the attack targets military objectives, rather than civilian or cultural objects. For example, it can hardly be argued that apartment buildings and other residential areas or cultural sites are genuine military objectives: they do not “make an effective contribution to military action…” and should therefore be spared destruction during armed conflict.

(2) the proportionality principle: This prohibits attacks against military objectives which are not proportionate to the aim sought, in that a strike against a military objective is only permitted where the incidental loss of civilian life is not excessive compared to the military advantage anticipated from the attack. Essentially, the attack cannot cause excessive loss of civilian life. For example, where three (or any other negligible number of) enemy combatants are hiding amongst a civilian population, a rocket strike against such a military target may breach the principle of proportionality, as the military advantage would be minor compared to the potential loss of civilian life.

(3) the precaution principle: This principle aims to protect civilians from being killed during an attack; it seeks to prevent needless loss of life which may be avoided by simple evacuation. Essentially, effective advance warning should be given to the civilian population where a potential attack may harm them, if circumstances permit. Practically, the principle may be illustrated by the American warnings given to Japanese civilians in Hiroshima, during the Second World War, prior to the dropping of the atomic bomb: the civilians had advance warning to evacuate.

Evidently, the principles of IHL ensure that civilians are offered as much protection as possible during times of combat; after all, they did not choose to fight. A balance is struck between the freedom to conduct military operations and the protection of civilians, and the scales weigh in favour of protecting civilian life, especially as military conflict is generally inexcusable in the 21st century. The parties to the 1977 Protocol I to the Geneva Conventions have all agreed to respect the Protocol and abide by it. It is also expected that states follow IHL, so why do some state actors choose to ignore it? And what are the consequences of disregarding IHL?

Violations of some aspects of IHL, and grave breaches of the Geneva Conventions, may constitute war crimes or crimes against humanity under the Rome Statute of the International Criminal Court 1998 (Rome Statute). The International Criminal Court (ICC) has jurisdiction to hear cases concerning war crimes and crimes against humanity where the state is party to the Rome Statute. Additionally, the principle of universal jurisdiction allows states to vest jurisdiction in their national courts to hear war crimes cases regardless of where an act was committed. In the United Kingdom, for example, such jurisdiction was conferred by the War Crimes Act 1991 to try war criminals for crimes committed in Nazi Germany, or places under Nazi German occupation, during the Second World War. In R v Sawoniuk, the defendant became the first person to be found guilty under the War Crimes Act 1991. Consequently, it is possible to be arrested and tried in another country for war crimes, and often the state of the accused person’s nationality does not object.

What are war crimes? They include wilful killing, unlawful confinement, and intentionally directing attacks against civilian populations or against civilians who are not taking part in hostilities, as well as intentionally attacking cultural sites and hospitals. Further, wilfully impeding relief supplies or using starvation as a method of warfare is also a war crime. Crimes against humanity, by contrast, include systematic attacks against civilians involving murder, extermination, torture, rape, kidnapping, and other barbaric and inhumane acts.

Ideally, if all the principles of IHL were respected by all the parties to a conflict, it would be difficult to commit war crimes whilst participating in it, and the impact on innocent civilians would be reduced significantly. This is beneficial for everybody. The consequences of not abiding by IHL may be criminal liability and a permanently stained reputation. Unfortunately, a prison sentence and a ruined reputation are often inadequate punishment for a war criminal and violator of IHL, and nothing can be done to bring back the innocent civilians and the many other victims of an often unnecessary conflict.

Michal Smigla is a second year law student at the Institute of Law, Jersey. He’s interested in contract and criminal law, and aspires to qualify as a barrister or Jersey advocate.
