Lecture: Duty Ethics

Administrative information


Title: Duty Ethics
Duration: 60 min
Module: A
Lesson Type: Lecture
Focus: Ethical - Ethics Fundamentals
Topic: Duty Ethics

 

Keywords


Duty ethics, Kant

 

Learning Goals


  • Learner can assess choices in terms of what subjects ought to do.
  • Learner understands the concept of morality.
  • Learner has a basic understanding of how these concepts relate to AI.

 

Expected Preparation


Learning Events to be Completed Before

None.

Obligatory for Students

None.

Optional for Students

  • Duty ethics - Kant (The School of Life, video); Kant & Categorical Imperatives (Crash Course Philosophy #35, video).
  • Kranak, J. Introduction to Philosophy: Ethics, Chapter 6: Kantian Deontology. [1]
  • Gensler, H. J. (2017). Ethics: A Contemporary Introduction. Routledge. Ch. 7: Kant and GR; Ch. 11: Nonconsequentialism (deontology).
  • Höffe, O. (2013). Kantian Ethics. In R. Crisp (Ed.), The Oxford Handbook of the History of Ethics. Oxford University Press.

References and background for students:

None.

Recommended for Teachers

  • Ulgen, O. (2017). Kantian ethics in the age of artificial intelligence and robotics. QIL, 43, 59-83. [2]
  • Alexander, L., & Moore, M. (2007). Deontological Ethics. Stanford Encyclopedia of Philosophy, Stanford University, 21 Nov. 2007. [3]
  • Kant, I. Critique of Practical Reason (1788) and Groundwork of the Metaphysics of Morals (1785).
  • Singer, P. W. (2009). Isaac Asimov's Laws of Robotics Are Wrong. [4]

 

Lesson Materials


 

The materials of this learning event are available under CC BY-NC-SA 4.0.

 

Instructions for Teachers


Topics to Cover

  • Introduction: differences between normative ethical theories (5 min)
  • Duty ethics (35 min)
    • What is a "duty"? (5 min)
    • Kant and the role of human reason (10 min)
    • Maxims and imperatives - example: the ACM/IEEE-CS Code of Ethics and Professional Practice (15 min)
    • Criticisms of duty ethics: pros and cons (5 min)
  • Duty Ethics and AI (30 min)
    • Examples of dilemmas: digital discrimination (social network bias) and autonomous vehicles (10 min)
    • The trolley problem and its variants, with a short discussion (10 min)
    • Isaac Asimov's Laws of Robotics (5 min)
    • Evaluating an action based on duty ethics (5 min); an optional illustrative sketch follows this list

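For the final topic ("Evaluating an action based on duty ethics"), a small code sketch can help make the contrast with consequentialist evaluation concrete. The following Python snippet is only an illustrative sketch and is not part of the official lesson materials; the duties, actions, and harm scores are invented for the example.

    # Hypothetical illustration: duty ethics treats duties as hard constraints on
    # actions, independent of outcomes; consequentialism compares expected outcomes.
    from dataclasses import dataclass, field

    @dataclass
    class Action:
        name: str
        expected_harm: int                  # outcome measure used by the consequentialist check
        properties: set = field(default_factory=set)

    # Each duty forbids actions that have a particular property, whatever their outcome.
    DUTIES = {
        "do not use a person merely as a means": "uses_person_as_means",
        "do not deceive": "deceives",
    }

    def permitted_by_duty(action: Action) -> bool:
        """An action is permissible iff it violates no duty."""
        return not any(flag in action.properties for flag in DUTIES.values())

    def preferred_by_consequences(a: Action, b: Action) -> Action:
        """A purely consequentialist choice: pick the action with less expected harm."""
        return a if a.expected_harm <= b.expected_harm else b

    # Toy autonomous-vehicle dilemma (values are invented for illustration).
    stay = Action("stay on course", expected_harm=5)
    swerve = Action("swerve into a bystander", expected_harm=1,
                    properties={"uses_person_as_means"})

    print(preferred_by_consequences(stay, swerve).name)              # swerve into a bystander
    print([a.name for a in (stay, swerve) if permitted_by_duty(a)])  # ['stay on course']

Teachers may adapt such a sketch when discussing whether hard, rule-like constraints (for example, Asimov-style laws) can stand in for Kantian duties in AI systems.
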
Interactive moments

There are several interactive moments (e.g., short discussions) with students:

  • Values: Questions that might be asked are:
    • Which values are particularly important to you?
    • Which contradictions do you see between values? Do you have examples of situations in which these contradictions occur?
  • Duty ethics/obligations: Questions that might be asked are:
    • What are possible problems with duty ethics in practice?
    • Would you always do the right thing, even if it produces a bad result?
    • Can you provide examples of the uses of the categorical imperative in everyday life?
  • Duty ethics in relation to AI: Questions that might be asked are:
    • Do you think we could use a deontological moral model to provide a moral framework for AI?
    • Do you believe the categorical imperative is applicable to AI?
    • Would you ride in a driverless car that swerves to save pedestrians, even at the expense of its passengers?

More information

Click here for an overview of all lesson plans of the Human-Centered AI Master's programme.

Please visit the home page of the HCAIM consortium.

Acknowledgements

The Human-Centered AI Masters programme was co-financed by the Connecting Europe Facility of the European Union under Grant №CEF-TC-2020-1 Digital Skills 2020-EU-IA-0068.

The materials of this learning event are available under CC BY-NC-SA 4.0.

 

The HCAIM consortium consists of three excellence centres, three SMEs, and four universities.

HCAIM Consortium