
Legal framework for making available autonomous and AI systems

Project number: F 2432
Institution: Federal Institute for Occupational Safety and Health (BAuA) / HFC Human-Factors-Consult GmbH, Berlin / IKEM - Institut für Klimaschutz, Energie und Mobilität, Berlin
Status: Completed project


This research project dealt with AI algorithms and other types of software with autonomy functions that need to be subjected to a safety assessment. It investigated whether and to what extent their use in industrial physical systems requires changes to the legal framework, in particular in the fields of product safety and occupational health and safety law.

Based on extensive literature analyses and expert interviews, the project examined relevant factors and the characteristics of these physical systems and their environments and summarised them in a taxonomy. A legal opinion building on this work assessed relevant criteria for the further development of the necessary product-specific and operational requirements and analysed related areas of law, such as immission control and liability law.

The research carried out made it clear that the definitions of the terms "product" and "assemblies of machinery" should be extended to include highly interconnected systems. Furthermore, a new definition of "mutable products" and the creation of an obligation for manufacturers to introduce "product support concepts" were considered essential.

Apart from this, safety-relevant data may only be used for the application of AI algorithms in such systems if they meet high quality standards, thereby making it possible to guarantee the requisite levels of safety.


Unit 2.1 "Basics of Product Safety"

