Tag: AI

What you need to know about generative AI and human rights

24 May 2023

Generative AI has been all over the headlines. But what are the human rights implications? Get the facts in our generative AI FAQ.

Where to find Access Now at IGF 2022

17 Nov 2022

Access Now is participating in IGF 2022 from 28 November to 2 December. Here’s where to find us.

Access Now Grants: How we support the activists who defend our rights during conflict and crisis

23 May 2022

In 2021, Access Now Grants supported grassroots, frontline, and feminist digital rights organizations and human rights defenders.

The EU AI Act: How to (truly) protect people on the move

12 May 2022

The EU AI Act is supposed to protect the rights of everyone impacted by AI systems. But it ignores the systems impacting people on the move. Here are three steps policymakers can take to fix that problem.

Here’s how to fix the EU’s Artificial Intelligence Act

7 Sep 2021

The European Union’s Artificial Intelligence Act needs improvements to ensure it protects human rights.

Your voice is being heard: 6 ways technology listens to you

12 Jul 2021

Voice recognition technology often violates human rights, yet it keeps spreading. Most recently, we called out Spotify for developing voice recognition technology it claims can detect traits such as gender and emotional state.

They can hear you: 6 ways tech is listening to you

1 Jul 2021

Voice recognition technology often violates human rights, and it’s popping up more and more. Recently we’ve called out Spotify for their dangerous voice recognition tech, and a lot more companies are up to the same shady tactics.

Sonic surveillance: why you don’t want AI snooping on you

30 Jun 2021

The threat voice recognition technology poses to our rights needs to be addressed now — before our voices become yet another piece of biometric data to be used against us.

RightsCon spotlight: “The Privatised Panopticon: Workers’ Surveillance in the Digital Age”

4 Jun 2021

In a session with European Digital Rights at RightsCon, we will explore how surveillance technology can make your workplace function like a prison: a privatised panopticon that threatens labour movements and undermines human rights.

Computers are binary, people are not: how AI systems undermine LGBTQ identity

6 Apr 2021