The EU AI Act: A Commentary

  • Binding: Hardcover / Pages: 500 p.
  • Language: ENG
  • Product code: 9781837231065
  • DDC classification: 343.2409998

Full Description

There is no doubt that artificial intelligence is transforming the way we work, the way we live, and how we perceive the world. It is less clear, however, whether - and to what extent - the law can and should respond to these changes and has the potential to shape them.

This invaluable commentary on the EU Artificial Intelligence Act (EU AI Act) offers a thorough analysis of this groundbreaking legislation. As AI technologies become increasingly integrated into society, it is imperative to address the potential risks and ethical concerns they bring.
Readers will quickly gain a solid foundational understanding of the EU AI Act in the introductory chapter, which provides a comprehensive overview of the Act as a whole. The following chapters deliver insightful examinations of each of the Act's articles by renowned experts in the field. Lukas Feiler, Nikolaus Forgó and Michaela Nebel bring diverse perspectives and deep knowledge to the discussion, making this an essential reference for anyone involved in AI regulation and compliance.

Businesses seeking initial guidance and pragmatic solutions on how to navigate the EU AI Act will find this book particularly useful. It is also an indispensable tool for lawyers, judges and other legal professionals who need to navigate the complexities of AI-related regulations.

Contents

Preface 11

List of abbreviations 13

List of recitals of the AI Act 21

An introduction to the AI Act 23

1. The scope of application of the AI Act . . . . . . . . . . . . . . . . . 23
1.1 The material scope of application: What types of AI are covered? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
1.2 The personal scope of application: To whom does the AI Act apply? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
1.3 The territorial scope of application: Where does the AI Act apply? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
1.4 The temporal scope of application: When does the AI Act apply? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26

2. The AI Act as an instrument of product regulation. . . . . . . . . . . . . . . . . . . . . . . . . 28
2.1 An overview of European Union product regulation . . . . . . . . . . . . . . . 28
2.2 The role of harmonised standards and common specifications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
2.3 External conformity assessment bodies and their accreditation and notification. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
2.4 The relationship with other harmonisation legislation . . . . . . . . . . . . . . . 30

3. Risk-based regulation of AI systems and AI models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
3.1 Prohibited AI systems. . . . . . . . . . . . . . . . 31
3.2 High-risk AI systems. . . . . . . . . . . . . . . . . . 32
3.3 GenAI and certain biometric AI systems that are subject to special transparency regulations . . . . . 34
3.4 Other AI systems . . . . . . . . . . . . . . . . . . . . . . . 35
3.5 General-purpose AI models. . . . . . . 35

4. An overview of the obligations of the AI Act . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
4.1 Obligations of the providers . . . . . 36
4.1.1 Obligations regarding high-risk AI systems . . . . . . . . . . . . . . . . . . 36
4.1.2 Obligations regarding GenAI systems pursuant to Article 50 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
4.1.3 Obligations regarding other AI systems . . . . . . . . . . . . . . . . . . . . . . . . 40
4.1.4 Obligations regarding general-purpose AI models. . . . . . . . 41
4.1.5 Obligations regarding general-purpose AI systems . . . . . . . 42
4.2 Obligations of importers . . . . . . . . . . . 42
4.2.1 Obligations regarding high-risk AI systems . . . . . . . . . . . . . . . . . . 42
4.2.2 Obligations regarding other AI systems . . . . . . . . . . . . . . . . . . . . . . . . 43
4.3 Obligations of distributors . . . . . . . . 43
4.3.1 Obligations regarding high-risk AI systems . . . . . . . . . . . . . . . . . . 43
4.3.2 Obligations regarding other AI systems . . . . . . . . . . . . . . . . . . . . . . . . 44
4.4 Obligations of the deployers. . . . . 44
4.4.1 Obligations regarding high-risk AI systems . . . . . . . . . . . . . . . . . . 44
4.4.2 Obligations regarding GenAI and certain biometric AI systems pursuant to Article 50. . 46
4.4.3 Obligations regarding other AI systems . . . . . . . . . . . . . . . . . . . . . . . . 47
4.5 Obligations for authorised representatives . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
4.5.1 Obligations regarding high-risk AI systems . . . . . . . . . . . . . . . . . . 47
4.5.2 Obligations regarding general-purpose AI models. . . . . . . . 48

5. Measures to promote innovation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
5.1 AI regulatory sandboxes. . . . . . . . . . . . 48
5.2 Testing in real-world conditions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49

6. Enforcement by the authorities. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
6.1 Market surveillance of AI systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
6.1.1 Regulatory responsibility for market surveillance. . . . . . . . . . . . . . 51
6.1.2 Powers of the market surveillance authorities . . . . . . . . . . . . . 54
6.1.3 The market surveillance procedure. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
6.2 The AI Office as a supervisory authority for providers of general-purpose AI models . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
6.3 Fines. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57

7. Liability law and enforcement by private individuals . . . . . . . . . . . . . . . . . . . 58

Text of the EU AI Act and commentary 61

Chapter I - General provisions 63
Article 1 Subject matter. . . . . . . . . . . . . . . . . . . 63
Article 2 Scope . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Article 3 Definitions . . . . . . . . . . . . . . . . . . . . . . . . 85
Article 4 AI literacy. . . . . . . . . . . . . . . . . . . . . . . . 125

Chapter II - Prohibited AI practices 127
Article 5 Prohibited AI practices. . . . 127

Chapter III - High-risk AI systems 149

Section 1 - Classification of AI systems as high-risk . . . . . . . . . . . . . . . . . . . . . . 149
Article 6 Classification rules for high-risk AI systems . . . . . . . . . . . . . . . . . . . . . . 149
Article 7 Amendments to Annex III . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158

Section 2 - Requirements for high-risk AI systems. . . . . . . . . . . . . . . . . . . . . . . . . . . 160
Article 8 Compliance with the requirements. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160
Article 9 Risk management system . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 162
Article 10 Data and data governance. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 167
Article 11 Technical documentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
Article 12 Record-keeping . . . . . . . . . . . . . 175
Article 13 Transparency and provision of information to deployers . . . . . . . . . . 177
Article 14 Human oversight. . . . . . . . . . 181
Article 15 Accuracy, robustness and cybersecurity. . . . . . . . . . . . . . . . . . . . . . . . . . . 185

Section 3 - Obligations of providers and deployers of high-risk AI systems and other parties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
Article 16 Obligations of providers of high-risk AI systems . . . . . . . . . . . . . . . . . . 191
Article 17 Quality management system . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 194
Article 18 Documentation keeping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
Article 19 Automatically generated logs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
Article 20 Corrective actions and duty of information . . . . . . . . . . . . . . . . . . . . . . 201
Article 21 Cooperation with competent authorities . . . . . . . . . . . . . . . . . . . 203
Article 22 Authorised representatives of providers of high-risk AI systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
Article 23 Obligations of importers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
Article 24 Obligations of distributors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212
Article 25 Responsibilities along the AI value chain . . . . . . . . . . . . . . . . . . . . . . . . . 215
Article 26 Obligations of deployers of high-risk AI systems . . . . . . . . . . . . . . . . . . 221
Article 27 Fundamental rights impact assessment for high-risk AI systems . . . . . . . . . . . . . . . . . . . . . . 228

Section 4 - Notifying authorities and notified bodies. . . . . . . . . . . . . . . . . . . . . . . . . . . . 233
Article 28 Notifying authorities . . . . 233
Article 29 Application of a conformity assessment body for notification. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
Article 30 Notification procedure. . . 236
Article 31 Requirements relating to notified bodies . . . . . . . . . . . . . . . . . . . . . . . . . . 238
Article 32 Presumption of conformity with requirements relating to notified bodies . . . . . . . . . . . . . 240
Article 33 Subsidiaries of notified bodies and subcontracting . . . . . . . . . . . . 241
Article 34 Operational obligations of notified bodies. . . . . . . . . . . . . . . . . . . . . . . . . . . 242
Article 35 Identification numbers and lists of notified bodies. . . . . . . . . . . . 243
Article 36 Changes to notifications. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 244
Article 37 Challenge to the competence of notified bodies. . . . . . 247
Article 38 Coordination of notified bodies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 248
Article 39 Conformity assessment bodies of third countries . . . . . . . . . . . . . . . 249

Section 5 - Standards, conformity assessment, certificates, registration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250
Article 40 Harmonised standards and standardisation deliverables. . . 250
Article 41 Common specifications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253
Article 42 Presumption of conformity with certain requirements . . . . . . . . . . . . . 256
Article 43 Conformity assessment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
Article 44 Certificates . . . . . . . . . . . . . . . . . . . . 262
Article 45 Information obligations of notified bodies . . . . . . . . . . . . . . . . . . . . . . . . . . 263
Article 46 Derogation from conformity assessment procedure . . 264
Article 47 EU declaration of conformity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 266
Article 48 CE marking . . . . . . . . . . . . . . . . . . . 267
Article 49 Registration. . . . . . . . . . . . . . . . . . . 269

Chapter IV - Transparency obligations for providers and deployers of certain AI systems 273
Article 50 Transparency obligations for providers and deployers of certain AI systems. . . . . . . . . . . . . . . . . . . . . . . . . . 273

Chapter V - General-purpose AI models 281

Section 1 - Classification rules . . . . . . . . . 281
Article 51 Classification of general-purpose AI models as general-purpose AI models with systemic risk. . . . . . . . . 281
Article 52 Procedure . . . . . . . . . . . . . . . . . . . . . . 285

Section 2 - Obligations for providers of general-purpose AI models . . . . . . . . . . . 288
Article 53 Obligations for providers of general-purpose AI models. . . . . . . . 288
Article 54 Authorised representatives of providers of general-purpose AI models. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 294

Section 3 - Obligations of providers of general-purpose AI models with systemic risk . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 296
Article 55 Obligations of providers of general-purpose AI models with systemic risk . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 296
Article 56 Codes of practice . . . . . . . . . . 299

Chapter VI - Measures in support of innovation 303
Article 57 AI regulatory sandboxes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 303
Article 58 Detailed arrangements for, and functioning of, AI regulatory sandboxes. . . . . . . . . . . . . . . . . . . . . 311
Article 59 Further processing of personal data for developing certain AI systems in the public interest in the AI regulatory sandbox. . . . . . . . . . .316
Article 60 Testing of high-risk AI systems in real world conditions outside AI regulatory sandboxes . . . 320
Article 61 Informed consent to participate in testing in real world conditions outside AI regulatory sandboxes . . . . . . . . . . . . . . . . . . . . . . . . . . . . 324
Article 62 Measures for providers and deployers, in particular SMEs, including start-ups. . . . . . . . . . . . . . . . . . . . . . . . . 325
Article 63 Derogations for specific operators . . . . . . . . . . . . . . . . . . . . . . . . . . . 326

Chapter VII - Governance 327

Section 1 - Governance at Union level . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 327
Article 64 AI Office . . . . . . . . . . . . . . . . . . . . . . . . 327
Article 65 Establishment and structure of the European Artificial Intelligence Board. . . . . . . . . . . . . . . . . . . . . . . . . . 328
Article 66 Tasks of the Board. . . . . . . . . 330
Article 67 Advisory forum. . . . . . . . . . . . . 332
Article 68 Scientific panel of independent experts. . . . . . . . . . . . . . . . . . . . . . 334
Article 69 Access to the pool of experts by the Member States . . . . . . . 336

Section 2 - National competent authorities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 337
Article 70 Designation of national competent authorities and single points of contact . . . . . . . . . . . . . . . . . . . . . . . . . . . 337

Chapter VIII - EU database for high-risk AI systems 341
Article 71 EU database for high-risk AI systems listed in Annex III . . . . . . . 341

Chapter IX - Post-market monitoring, information sharing and market surveillance 343

Section 1 - Post-market monitoring . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 343
Article 72 Post-market monitoring by providers and post-market monitoring plan for high-risk AI systems . . . . . . . . . . . . . . . . . . . . . . . . 343

Section 2 - Sharing of information on serious incidents . . . . . . . . . . . . . . . . . . . . . . . . . . . 346
Article 73 Reporting of serious incidents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 346

Section 3 - Enforcement . . . . . . . . . . . . . . . . . . . 350
Article 74 Market surveillance and control of AI systems in the Union market . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 350
Article 75 Mutual assistance, market surveillance and control of general-purpose AI systems. . . . . . . 361
Article 76 Supervision of testing in real world conditions by market surveillance authorities . . . . . . . . . . . . . . . . 363
Article 77 Powers of authorities protecting fundamental rights . . . . . . 364
Article 78 Confidentiality. . . . . . . . . . . . . . 366
Article 79 Procedure at national level for dealing with AI systems presenting a risk . . . . . . . . . . . . . . . . . . . . . . . . . . . . 369
Article 80 Procedure for dealing with AI systems classified by the provider as non-high-risk in application of Annex III . . . . 373
Article 81 Union safeguard procedure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 375
Article 82 Compliant AI systems which present a risk . . . . . . . . . . . . . . . . . . . . . . 376
Article 83 Formal non-compliance . . . . . . . . . . . . . . . . . . . . . . . . . . . . 378
Article 84 Union AI testing support structures . . . . . . . . . . . . . . . . . . . . . . . . . 379

Section 4 - Remedies . . . . . . . . . . . . . . . . . . . . . . . . . 380
Article 85 Right to lodge a complaint with a market surveillance authority . . . . . . . . . . . . . . . . . . . 380
Article 86 Right to explanation of individual decision-making . . . . . . 381
Article 87 Reporting of infringements and protection of reporting persons . . . . . . . . . . . . . . . . . . . . . 384

Section 5 - Supervision, investigation, enforcement and monitoring in respect of providers of general-purpose AI models . . . . . . . . . . . . . . . . 385
Article 88 Enforcement of the obligations of providers of general-purpose AI models . . . . . . . . . . . . . . . . . . . . . . . . . 385
Article 89 Monitoring actions . . . . . . . 387
Article 90 Alerts of systemic risks by the scientific panel . . . . . . . . . . . . . . . . . . . 388
Article 91 Power to request documentation and information. . 389
Article 92 Power to conduct evaluations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 391
Article 93 Power to request measures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 393
Article 94 Procedural rights of economic operators of the general-purpose AI model . . . . . . . . . . . . . 394

Chapter X - Codes of conduct and guidelines 395
Article 95 Codes of conduct for voluntary application of specific requirements. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395
Article 96 Guidelines from the Commission on the implementation of this Regulation . . . . . . . . . . . . . . . . . . . . . . . . . . 398

Chapter XI - Delegation of power and committee procedure 401
Article 97 Exercise of the delegation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 401
Article 98 Committee procedure. . . 403
Chapter XII - Penalties 405
Article 99 Penalties . . . . . . . . . . . . . . . . . . . . . . . . 405
Article 100 Administrative fines on Union institutions, bodies, offices and agencies . . . . . . . . . . . . . . . . . . . . . . . 410
Article 101 Fines for providers of general-purpose AI models . . . . . . . . . . . . 412

Chapter XIII - Final provisions 415
Article 102 Amendment to Regulation (EC) No. 300/2008 . . . . . . 415
Article 103 Amendment to Regulation (EU) No. 167/2013 . . . . . . 417
Article 104 Amendment to Regulation (EU) No. 168/2013 . . . . . . 418
Article 105 Amendment to Directive 2014/90/EU . . . . . . . . . . . . . . . . . . . . 419
Article 106 Amendment to Directive (EU) 2016/797. . . . . . . . . . . . . . . . 420
Article 107 Amendment to Regulation (EU) 2018/858 . . . . . . . . . . . . . 421
Article 108 Amendments to Regulation (EU) 2018/1139 . . . . . . . . . . . 422
Article 109 Amendment to Regulation (EU) 2019/2144 . . . . . . . . . . . 424
Article 110 Amendment to Directive (EU) 2020/1828. . . . . . . . . . . . . . 425
Article 111 AI systems already placed on the market or put into service and general-purpose AI models already placed on the market . . . . . . . . . 426
Article 112 Evaluation and review. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 428
Article 113 Entry into force and application. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 432

Annex I - List of Union harmonisation legislation 435

Annex II - List of criminal offences referred to in Article 5(1), first subparagraph, point (h)(iii) 439

Annex III - High-risk AI systems referred to in Article 6(2) 441

Annex IV - Technical documentation referred to in Article 11(1) 461

Annex V - EU declaration of conformity 465

Annex VI - Conformity assessment procedure based on internal control 467

Annex VII - Conformity based on an assessment of the quality management system and an assessment of the technical documentation 469

Annex VIII - Information to be submitted upon the registration of high-risk AI systems in accordance with Article 49 473

Section A - Information to be submitted by providers of high-risk AI systems in accordance with Article 49(1) . . . . . . . . . . . . . . . . . . . . . . 473

Section B - Information to be submitted by providers of high-risk AI systems in accordance with Article 49(2) . . . . . . . . . . . . . . . 474

Section C - Information to be submitted by deployers of high-risk AI systems in accordance with Article 49(3) . . . . . . . . . . . . . . . 474

Annex IX - Information to be submitted upon the registration of high-risk AI systems listed in Annex III in relation to testing in real world conditions in accordance with Article 60 477

Annex X - Union legislative acts on large-scale IT systems in the area of Freedom, Security and Justice 479

Annex XI - Technical documentation referred to in Article 53(1), point (a) — technical documentation for providers of general-purpose AI models 483

Section 1 - Information to be provided by all providers of general-purpose AI models . . . . . . . . . . . . . . .483

Section 2 - Additional information to be provided by providers of general-purpose AI models with systemic risk . . . . . . . . . . .484

Annex XII - Transparency information referred to in Article 53(1), point (b) - technical documentation for providers of general-purpose AI models to downstream providers that integrate the model into their AI system 485

Annex XIII - Criteria for the designation of general-purpose AI models with systemic risk referred to in Article 51 487

Index 489

About the authors 499

About Globe Law and Business 501
