Let this serve as a sixfold reminder that AI's great potential comes with major risks.
AI bias refers to distorted AI outputs that reflect human bias, often originating in skewed training data. AI algorithms can also introduce biases of their own in how they process information, leading to ethical problems, for example in recruitment tools.
AI still tends to fabricate information at random, a phenomenon known as hallucination. This can spread misinformation or unreliable content and, without human oversight, get you into serious trouble.
AI often blends copyrighted content into its creations, potentially exposing you to copyright violations without you even knowing. The open question is whether, and under which conditions, third-party IP material may be used to train generative AI systems.
AI systems can pose privacy risks by processing and analysing personal data without consent or sufficient safeguards, potentially violating individuals' privacy. Personal information can be inadvertently included in AI-generated output, and generative AI models can identify individuals by combining multiple data sources.
AI systems may generate harmful or inappropriate content and spread hate speech and misinformation. For example, deepfake technology can recreate individuals' voices and appearances to manipulate others into believing something that is not true.
AI can be exploited by attackers to gain control over computer systems, reveal sensitive information, or cause malfunctions. Hackers can do this, for example, by manipulating training datasets, using social engineering, or exploiting system vulnerabilities.
Six big thank yous, Sam Altman, for holding off on Sora's public release until it is safe to use. We hope the gloves remind you to align AI with human values and encourage ecosystem collaboration.
Thank you, thank you, thank you, thank you, thank you, thank you, Scarlett Johansson, for taking a stand in the deepfake voice debacle. We hope the gloves remind you to fight for all unheard (real) voices to make AI fairer for everyone.
A sixfold thank you to You, President von der Leyen, for leading the EU to pass the AI Act, a global benchmark for AI regulation. We hope the gloves remind you of the looming 'sixth fingers' in ensuring an innovation-friendly implementation of the AI Act.
Thank you multiplied by six, Mark Zuckerberg, for making big AI investments in open-source models that are available to everyone. We hope the gloves serve as a reminder of sustainable training data sourcing.
Consider this a thank you times six, Sebastian Siemiatkowski, for sharing Klarna's AI journey and showing AI's limitless possibilities. We hope the gloves remind you to share mistakes along the way and promote more responsible AI.
A sixfold thank you to You too, Prime Minister Starmer, for making binding AI regulation in the UK a key part of your agenda. We hope the gloves remind you of all the 'sixth finger'-like risks businesses may face with their AI investments before proper regulation is in place.
Thank you six times over, Clément Delangue, for setting a beautiful global benchmark for distributing open-source AI models. We hope the gloves remind you to help users of open-source models meet the AI Act's requirements for data transparency and copyright.
Six rounds of applause to you, Rishi Bommasani, for your efforts on public evaluations driving transparency for LLMs. We hope the gloves remind you that we need new ways to measure machine intelligence beyond those made for measuring human intelligence.
We have our fingers crossed (x6) that you, Henna Virkkunen, will become the EU's Commissioner and passionately drive responsible AI innovation forward in Europe. We hope the gloves remind you of AI's risks to health, safety, fundamental rights, and democracy.
We still have a few pairs to give to people who deserve six thank yous for their work towards responsible AI, and who could also use a reminder of AI's risks and the importance of continuing the work towards safe, ethical, and transparent AI. Let us know who you'd suggest!