Anekanta® is officially registered as a CPD provider, supporting professional development in AI literacy and governance
Category: AI Safety
Lessons for Business Leaders from the Dutch AI Scandal – AI, the Board and the EU AI Act
Equip your organisation with the skills and tools needed to navigate AI responsibly for maximum benefit
AI Assurance for Facial Recognition developed by Anekanta® featured in UK Government Portfolio
Our pioneering work in AI assurance has been recognised by the UK Government Responsible Technology Adoption Unit. Our AI assurance for facial recognition case study has been featured in the prestigious Portfolio of AI Assurance Techniques.
Preparing for the EU AI Act – guidance for AI biometrics developers and users
The EU's groundbreaking AI Act is poised to significantly impact companies developing or using biometric AI products within the European market. Here's why biometric AI companies should take notice.
What does the new legal definition of AI systems mean in practice for commercial use cases?
Organisations may not be certain whether their systems qualify as AI, given the contrasting views of vendors and buyers. It is important that they identify and categorise these systems as soon as possible. Developers, providers and users/deployers of AI systems which meet the EU AI Act's definition, and whose outputs are used within the EU, must comply with the law.
EU AI Act: Cleared to move to the next stage. Your action list to get started.
Organisations are urged to review their development roadmaps and current use cases to determine whether their AI systems fall into the prohibited or high-risk categories.
Be responsible and de-risk your AI facial recognition software deployments
Anekanta® AI has developed a number of independent, vendor-agnostic AI governance frameworks, expertly designed to assess the safety of high-risk AI and its impact on stakeholders. Our Facial Recognition Privacy Impact Risk Assessment System™ represents a small subset of our wider capability. Find out more.
