What's your
6th finger?

Let this serve as six times the reminder that with AI's great potential come major risks.

Read our press release

Making AI's risks more visible.

Like that extra finger in an AI-generated image, some of the risks often slip by unnoticed. With these six-fingered gloves, we want to highlight AI's risks and the importance of identifying and mitigating them.

We've sent gloves to influential figures like Sam Altman, Scarlett Johansson and others to thank them for their efforts towards responsible AI and encourage them to keep improving AI safety.

So, what are the risks of AI?

As the AI Governance Company, we are enthusiastic about the opportunities for progress presented by AI and machine learning technology, if managed properly. That's why Saidot is committed to helping businesses safely and transparently integrate AI into their operations.

Check out a few examples of the 120+ risks and their controls documented on our AI governance platform.

Biased outcomes

AI bias refers to distorted AI outputs that reflect human bias, often originating in skewed training data. AI algorithms themselves can also introduce bias in how they process information, which can lead to ethical problems, for example in recruitment tools.

False information

AI still tends to randomly fabricate information, which can get you into serious trouble without human oversight. Also known as hallucination, this can cause misinformation or unreliable content to spread.

Copyright infringements

AI often blends copyrighted content into its creations, potentially exposing you to copyright violations without your knowledge. The open question is whether, and under which conditions, third-party IP material may be used to train generative AI systems.

Privacy infringements

AI systems can pose privacy risks by processing and analysing personal data without consent and sufficient safeguards, potentially violating individuals' privacy. Personal information can be inadvertently included in AI-generated output, and generative AI models can identify individuals by combining multiple data sources.

Harmful and toxic content

AI systems may generate harmful or inappropriate content and spread hate speech and misinformation. For example, individuals' voices and appearances can be recreated through deepfake technology to manipulate others to believe something that is not true.

Cyber security risks

AI can be exploited by attackers to gain control over computer systems, reveal sensitive information, or cause malfunctions. For example, hackers can do this by manipulating training datasets, through social engineering, or by exploiting software vulnerabilities.

Who we've sent the six-fingered gloves to thus far...

Sam Altman
CEO of OpenAI

Six big thank yous, Sam Altman, for holding off on releasing Sora to the public until it's safe to use. We hope the gloves remind you to align AI with human values and encourage ecosystem collaboration.

Scarlett Johansson
Actress

Thank you, thank you, thank you, thank you, thank you, thank you, Scarlett Johansson, for taking a stand in the deepfake voice debacle. We hope the gloves remind you to fight for all unheard (real) voices to make AI fairer for everyone.

Ursula von der Leyen
President of the European Commission

A sixfold thank you to you, President von der Leyen, for leading the EU to pass the AI Act, a global benchmark for AI regulation. We hope the gloves remind you of the looming 'sixth fingers' in ensuring an innovation-friendly implementation of the AI Act.

Mark Zuckerberg
CEO of Meta

Thank you multiplied by six, Mark Zuckerberg, for choosing to make big AI investments in open-source models available for everyone. We hope the gloves serve as a reminder of sustainable training data sourcing.

Sebastian Siemiatkowski
CEO of Klarna

Consider this a thank you times six, Sebastian Siemiatkowski, for sharing Klarna's AI journey and showing AI's limitless possibilities. We hope the gloves remind you to share mistakes along the way and promote more responsible AI.

Sir Keir Starmer
Prime Minister of the United Kingdom

A sixfold thank you to you too, Prime Minister Starmer, for making binding AI regulation in the UK a key part of your agenda. We hope the gloves remind you of all the 'sixth finger'-like risks businesses may face with their AI investments before proper regulation is in place.

Clément Delangue
CEO of Hugging Face

Thank you six times over, Clément Delangue, for setting a beautiful global benchmark for distributing open-source AI models. We hope the gloves remind you to help users of open-source models meet the AI Act's requirements for data transparency and copyright.

Rishi Bommasani
Society Lead at Stanford Center for Research on Foundation Models

Six rounds of applause to you, Rishi Bommasani, for your efforts on public evaluations driving transparency for LLMs. We hope the gloves remind you that we need new ways to measure machine intelligence beyond those made for measuring human intelligence.

Henna Virkkunen
EU Commissioner-Designate

We have our fingers crossed (x6) that you, Henna Virkkunen, will be confirmed as EU Commissioner and passionately drive responsible AI innovation forward in Europe. We hope the gloves remind you of AI's risks to health, safety, fundamental rights, and democracy.

Who would you send these gloves to?
We'd be curious to know!

We still have a few pairs to give to people who deserve six thank yous for their work towards responsible AI, and who could also use a reminder of AI's risks and the importance of continuing the work towards safe, ethical, and transparent AI. Let us know who you'd suggest!

Read our press release

Credits and sources for images:

Sam Altman: ©Steve Jennings/Getty Images for TechCrunch (licence)
Scarlett Johansson: ©Elen Nivrae (licence)
Ursula von der Leyen: ©European Union, 2024 - Source: EP
Mark Zuckerberg: ©Meta
Sebastian Siemiatkowski: ©Noam Galai/Getty Images for TechCrunch (licence)
Prime Minister Keir Starmer: ©UK Parliament (licence)
Clément Delangue: X.com
Rishi Bommasani: Stanford University Human-Centered Artificial Intelligence
Henna Virkkunen: ©European Union, 2024 - Source: EP