Any organisation that develops, provides, or uses/deploys software which meets the AI definition criteria, and whose outputs are used within the EU, will be subject to the EU AI Act*.
The definition of AI
The EU AI Act provides a clear definition of AI, based on the OECD's definition, to avoid ordinary software falling within the scope of the Act.
“‘Artificial intelligence system’ means a machine-based system that is designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate output such as predictions, recommendations, or decisions influencing physical or virtual environments.”
What does this actually mean in the context of industrial and organisational use cases?
We have created a unique system, developed in-house, which automatically classifies AI against the definition and requirements of the EU AI Act in industrial 'use case' terms.
Prohibited, high-risk, limited risk and minimal risk AI use cases are defined within the Act, across a range of vertical markets as well as cross-sectoral/horizontal scenarios. Thanks to our team's blend of industrial and technological experience, we can identify a significant proportion of them.
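To make the four-tier structure concrete, here is a minimal sketch of how use cases might be mapped to the Act's risk tiers. The use-case names and tier assignments below are simplified assumptions for illustration only; they are not legal classifications and do not represent Anekanta's actual system.

```python
# Toy lookup mapping hypothetical industrial use cases to the EU AI Act's
# four risk tiers. Assignments are illustrative assumptions, not legal advice.

RISK_TIERS = ("prohibited", "high-risk", "limited risk", "minimal risk")

# Hypothetical example mappings (assumed for illustration)
USE_CASE_TIERS = {
    "real-time remote biometric identification in public spaces": "prohibited",
    "CV screening for recruitment": "high-risk",
    "customer-service chatbot": "limited risk",
    "spam filtering": "minimal risk",
}

def classify_use_case(description: str) -> str:
    """Return the assumed risk tier for a known use case, else flag for review."""
    return USE_CASE_TIERS.get(description, "unclassified - requires expert review")

print(classify_use_case("CV screening for recruitment"))
print(classify_use_case("novel predictive maintenance tool"))
```

In practice, classification requires expert assessment of each use case against the Act's criteria; a simple lookup like this only illustrates the tier structure.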
The stakes are high: prohibited AI systems may attract fines of up to €70 million.
If an organisation misclassifies software as non-AI and it is later found to meet the AI criteria, this mistake may cost up to €5 million.
Who should act?
If you are an executive leader or policy maker and would like to take advantage of AI for the benefit of your organisation, you will need to understand how to define it.
Having defined it you need to know how to govern it.
Ignorance is not an excuse.
We can help
Anekanta® was formed in 2016 and began operations in late 2019. We have extensive experience in the field of AI research and risk, and a proven track record with global clients.
Visit our range of services here.
*Compromise text agreed by the Council of the EU on 6th December 2022. Further amendments were made by the European Parliament and agreed 14th June 2023.