<?xml version="1.0"?><rdf:RDF xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:edm="http://www.europeana.eu/schemas/edm/" xmlns:wgs84_pos="http://www.w3.org/2003/01/geo/wgs84_pos#" xmlns:foaf="http://xmlns.com/foaf/0.1/" xmlns:rdaGr2="http://rdvocab.info/ElementsGr2" xmlns:oai="http://www.openarchives.org/OAI/2.0/" xmlns:owl="http://www.w3.org/2002/07/owl#" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:ore="http://www.openarchives.org/ore/terms/" xmlns:skos="http://www.w3.org/2004/02/skos/core#" xmlns:dcterms="http://purl.org/dc/terms/"><edm:WebResource rdf:about="http://www.dlib.si/stream/URN:NBN:SI:doc-XPIPEUB2/b9e4f3c2-7020-466f-9ffe-d992d0b6f204/PDF"><dcterms:extent>318 KB</dcterms:extent></edm:WebResource><edm:WebResource rdf:about="http://www.dlib.si/stream/URN:NBN:SI:doc-XPIPEUB2/787fd8c0-33fa-4976-888e-2cb4d63f4ca4/TEXT"><dcterms:extent>64 KB</dcterms:extent></edm:WebResource><edm:TimeSpan rdf:about="2011-2025"><edm:begin xml:lang="en">2011</edm:begin><edm:end xml:lang="en">2025</edm:end></edm:TimeSpan><edm:ProvidedCHO rdf:about="URN:NBN:SI:doc-XPIPEUB2"><dcterms:isPartOf rdf:resource="https://www.dlib.si/details/URN:NBN:SI:SPR-OSUSX1U0" /><dcterms:issued>2024</dcterms:issued><dc:creator>Shany, Yuval</dc:creator><dc:format xml:lang="sl">številka: letn. 84</dc:format><dc:format xml:lang="sl">str. 167-188, 349-350</dc:format><dc:identifier>DOI:10.51940/2024.1.167-188</dc:identifier><dc:identifier>ISSN:1854-3839</dc:identifier><dc:identifier>COBISSID_HOST:225958403</dc:identifier><dc:identifier>URN:URN:NBN:SI:doc-XPIPEUB2</dc:identifier><dc:language>en</dc:language><dc:publisher xml:lang="sl">Pravna fakulteta</dc:publisher><dcterms:isPartOf xml:lang="sl">Zbornik znanstvenih razprav (Pravna fakulteta. 1991)</dcterms:isPartOf><dc:subject xml:lang="en">accountability</dc:subject><dc:subject xml:lang="en">autonomous weapon systems</dc:subject><dc:subject xml:lang="sl">avtonomni orožni sistemi</dc:subject><dc:subject xml:lang="sl">človekovo dostojanstvo</dc:subject><dc:subject xml:lang="en">human dignity</dc:subject><dc:subject xml:lang="en">ICRC</dc:subject><dc:subject xml:lang="en">international humanitarian law</dc:subject><dc:subject xml:lang="en">meaningful human control</dc:subject><dc:subject xml:lang="sl">Mednarodni odbor Rdečega križa</dc:subject><dc:subject xml:lang="sl">mednarodno humanitarno pravo</dc:subject><dc:subject xml:lang="sl">odgovornost</dc:subject><dc:subject xml:lang="sl">pravica do življenja</dc:subject><dc:subject xml:lang="sl">preglednost</dc:subject><dc:subject xml:lang="en">right to life</dc:subject><dc:subject xml:lang="sl">smiselni človeški nadzor</dc:subject><dc:subject xml:lang="en">transparency</dc:subject><dc:subject xml:lang="sl">vojaška umetna inteligenca</dc:subject><dcterms:temporal rdf:resource="2011-2025" /><dc:title xml:lang="en">To use AI or not to use AI? Autonomous weapon systems and their complicated relationship with the right to life</dc:title><dc:description xml:lang="en">The increased prevalence of AI technology developed or adapted for military use raises difficult questions about the compatibility of this new technology with international law in general, and international human rights law (IHRL) in particular. The Human Rights Committee, the expert body entrusted with monitoring the application of the International Covenant on Civil and Political Rights, expressed its view in 2018 on the relationship between the emergence of new military AI and respect for the right to life. The article reviews the terms of the IHRL debate surrounding the introduction of AI technology into military contexts and its relationship to the right to life. 
Section one briefly reviews some actual and potential applications of AI in military contexts. Section two deals with three principal objections to introducing military AI to battlefield environments: the capacity of autonomous or semi-autonomous AI systems to properly apply international humanitarian law (IHL), concerns about de facto lowering of standards of humanitarian protection, and the ethical and legal implications of transferring certain life-and-death decisions from humans to machines. Section three reviews, in light of these three principled objections, specific proposals by the ICRC to limit the use of AI in military contexts (limiting the scope and manner of use of autonomous weapon systems, and excluding unpredictable and lethal systems). Section four reviews the main issues discussed in this article from the vantage point of the right to life under IHRL, as elaborated in General Comment No. 36.</dc:description><edm:type>TEXT</edm:type><dc:type xml:lang="sl">znanstveno časopisje</dc:type><dc:type xml:lang="en">journals</dc:type><dc:type rdf:resource="http://www.wikidata.org/entity/Q361785" /></edm:ProvidedCHO><ore:Aggregation rdf:about="http://www.dlib.si/?URN=URN:NBN:SI:doc-XPIPEUB2"><edm:aggregatedCHO rdf:resource="URN:NBN:SI:doc-XPIPEUB2" /><edm:isShownBy rdf:resource="http://www.dlib.si/stream/URN:NBN:SI:doc-XPIPEUB2/b9e4f3c2-7020-466f-9ffe-d992d0b6f204/PDF" /><edm:rights rdf:resource="http://creativecommons.org/licenses/by-nd/4.0/" /><edm:provider>Slovenian National E-content Aggregator</edm:provider><edm:intermediateProvider xml:lang="en">National and University Library of Slovenia</edm:intermediateProvider><edm:dataProvider xml:lang="sl">Univerza v Ljubljani, Pravna fakulteta</edm:dataProvider><edm:object rdf:resource="http://www.dlib.si/streamdb/URN:NBN:SI:doc-XPIPEUB2/maxi/edm" /><edm:isShownAt rdf:resource="http://www.dlib.si/details/URN:NBN:SI:doc-XPIPEUB2" /></ore:Aggregation></rdf:RDF>