Authors: Joana Oliveira (LOBA)

NEWS 13-03-2024

The AI Fairness Cluster: A joint force pioneering ethical standards in AI

In the rapidly evolving landscape of artificial intelligence (AI), ensuring fairness and ethical standards is paramount. Recognising this necessity, a consortium of leading organisations has come together to form the AI Fairness Cluster, dedicated to promoting fairness, transparency, and accountability in AI systems. This collaborative initiative aims to address the ethical challenges associated with AI deployment and pave the way for a more equitable future.

The AI Fairness Cluster comprises four European projects funded through the Horizon Europe programme: AEQUITAS, BIAS, FINDHR and MAMMOth. Through interdisciplinary collaboration and knowledge-sharing, the cluster's projects are working to develop best practices, guidelines, and tools to mitigate bias and discrimination in AI algorithms and applications.

Taking the lead on the cluster's creation and activities is the BIAS project. "BIAS is proud to champion the AI Fairness Cluster's goals, as they align closely with our mission to mitigate bias in technology. Embracing fairness in AI isn't just an objective; it's the cornerstone of a future where technology serves all with equity and inclusiveness, unlocking opportunities and fostering trust in the digital age," announced the BIAS project team.

At the heart of the AI Fairness Cluster’s mission is the belief that AI technologies should not perpetuate or exacerbate existing inequalities. By proactively identifying and addressing biases in data, algorithms, and decision-making processes, the cluster seeks to foster trust and confidence in AI systems across various domains, including healthcare, finance, human resources, and education.

One of the key initiatives undertaken by the projects in the AI Fairness Cluster is the development of fairness metrics and evaluation frameworks to assess the fairness of AI models comprehensively. These metrics cover various dimensions of fairness, such as demographic parity, equal opportunity, and disparate impact, giving developers and practitioners robust tools to measure and mitigate bias effectively.
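
To make two of these notions concrete, the sketch below shows how a demographic parity difference and a disparate impact ratio could be computed for a binary classifier and a binary protected attribute. This is a minimal, hypothetical illustration in plain NumPy, not code from any of the cluster's projects; the function names, the toy data, and the 0.8 "four-fifths" threshold mentioned in the comments are assumptions made for the example. Equal opportunity would additionally compare true positive rates, which requires ground-truth labels.

```python
import numpy as np

def demographic_parity_difference(y_pred, group):
    """Absolute difference in positive-prediction (selection) rates between two groups.

    y_pred : array of 0/1 model predictions
    group  : array of 0/1 protected-attribute values (e.g. two demographic groups)
    """
    rate_0 = y_pred[group == 0].mean()  # selection rate for group 0
    rate_1 = y_pred[group == 1].mean()  # selection rate for group 1
    return abs(rate_0 - rate_1)

def disparate_impact_ratio(y_pred, group):
    """Ratio of the lower selection rate to the higher one (1.0 means parity)."""
    rate_0 = y_pred[group == 0].mean()
    rate_1 = y_pred[group == 1].mean()
    return min(rate_0, rate_1) / max(rate_0, rate_1)

# Hypothetical example: 10 predictions, two groups of 5 individuals each
y_pred = np.array([1, 0, 1, 1, 0, 1, 0, 0, 0, 1])
group  = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])

print(demographic_parity_difference(y_pred, group))  # 0.2 (selection rates 0.6 vs 0.4)
print(disparate_impact_ratio(y_pred, group))         # ~0.67, below the common 0.8 "four-fifths" rule of thumb
```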

Furthermore, the AI Fairness Cluster is committed to promoting diversity and inclusivity within the AI community. By amplifying underrepresented voices and supporting initiatives aimed at increasing diversity in AI research and development, the cluster aims to foster a more inclusive and equitable AI ecosystem.

In addition to its research and advocacy efforts, the AI Fairness Cluster actively engages with policymakers and industry stakeholders to promote the adoption of ethical AI principles and standards. By advocating for regulatory frameworks that prioritise fairness, transparency, and accountability, the cluster seeks to shape the responsible deployment of AI technologies worldwide.

The AEQUITAS project proudly stands with this initiative, as described in the words of our coordinator, Roberta Calegari: "By collaborating within the cluster, we can share networks, avoid duplication of efforts, exchange ideas, discuss common research challenges and solutions, and maximise impact by fostering a larger community. One of the key added values of AEQUITAS in the cluster is to bring a diverse set of use cases to the table, enriching discussions and providing diverse perspectives on the topic of fairness in AI."

As AI continues to reshape society and profoundly impact human lives, ensuring fairness and ethical integrity must remain paramount. The AI Fairness Cluster stands at the forefront of this critical endeavour, driving forward the agenda for fair, transparent, and accountable AI systems. Through collaborative efforts and a commitment to ethical principles, the cluster strives to build a future where AI serves the collective good, leaving no one behind.

AI Fairness Cluster Inaugural Conference

19th of March, Science Park Congress Center, Amsterdam (The Netherlands)

Happening on the 19th of March 2024 at the Science Park Congress Center, Amsterdam (The Netherlands), the AI Fairness Cluster Inaugural Conference promises to be a pivotal moment for discussing the technical, legal, and social aspects of discrimination risks, featuring keynote speakers, panellists, and presentations from cluster members and diverse European researchers.

Check out the draft agenda here.

Workshop on AI Bias

20th of March, Science Park Congress Center, Amsterdam (The Netherlands)

The AI Fairness Cluster is hosting a technical workshop, AIMMES’24, on AI Biases: Measurements, Mitigation, Explanation Strategies. This workshop aims to foster a technical dialogue and enhance understanding of challenges and opportunities at the intersection of AI, fairness, and bias in our evolving technological landscape.

Co-located with the AI Fairness Cluster Inaugural Conference, this workshop encourages interaction through lightning talks and poster presentations, fostering new connections between researchers on these topics.

Through a Call for Papers, the workshop invites submissions from established scientists and graduate students in three main areas:

  • Explaining Bias: new approaches for comprehending, visualising, and communicating bias in datasets and AI systems, domain-specific strategies and use cases for explaining bias, and exploration of Explainable Artificial Intelligence (XAI) tools and software.
  • Detecting Bias: novel definitions of AI bias and fairness, the impact of human factors, the role of social sciences, law, and humanities, inclusion of vulnerable communities in bias definition, and methods for studying bias in multi-attribute settings.
  • Mitigating Bias: methods for assessing trade-offs between fairness and accuracy, employing synthetic data for bias mitigation, addressing fairness in ranking and recommendation systems, and implementing application-specific mitigation strategies for areas like hiring, credit scoring, and face recognition.

Selected works will be chosen by a scientific programme committee. Find out everything about the AIMMES 2024 workshop and the call for papers here.

Visit the fairness cluster website for more information: https://aifairnesscluster.eu/.
