AI and automation have emerged as core tools that modern decision makers rely on to work more efficiently. In fact, at the start of the pandemic, 79% of organizations reported using AI to make decisions.
While automated decision making has enabled organizations to optimize their operations, it’s also opened the door to some serious compliance violations.
Less than a week ago, the Future of Privacy Forum (FPF), a DC-based global non-profit focused on data privacy, released a report analyzing the General Data Protection Regulation (GDPR) and how it applies to automated decision making.
The report elaborated on a number of key cases where automated decision making caused non-compliance with the GDPR. One of the most alarming findings was that consent to partake in an automated decision making system wasn’t sufficient if the data subject wasn’t “adequately informed about the logic behind it.”
Case studies examined in the report included facial recognition technologies, algorithmic management of platform workers, automated screening of job applications, AI solutions with customer emotion recognition, and automated credit scoring.
Making automated decision making GDPR compliant
One of the report’s co-authors, Dr Gabriela Zanfir-Fortuna, Vice President for Global Privacy at FPF, highlights that the GDPR applies not only to manual data collection but also to data collected for the purpose of automated decision making.
“All automated decision making relying on or resulting in personal data must comply with the whole set of rules in the GDPR, including data minimization, purpose limitation, transparency obligations, fairness requirements and so on,” Dr Zanfir-Fortuna said.
However, lack of transparency over the decision making process is what causes many organizations to fall foul of the GDPR’s requirements.
“Our Report shows that the breaches often identified in cases involving automated decision making include breaches of lawful grounds for processing, such as obtaining consent which is invalid because there is not enough transparency about the automated decision making, or not having any lawful ground in place, breaches related to lack of transparency, or breaches of Article 22 GDPR,” Zanfir-Fortuna said.
Examining Article 22 of the GDPR
Under Article 22 of the GDPR, data subjects have the right not to be subject to a decision based solely on automated processing, including profiling, which “produces legal effects concerning him or her or similarly significantly affects him or her.”
In other words, any organization that uses an EU data subject’s information as part of an automated decision making process needs to gather explicit consent and clearly explain the purpose and process of the analysis.
It’s also important to note that these restrictions don’t apply if automated decision making is necessary for entering into or performing a contract between the subject and the data controller.
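For teams trying to map this requirement onto their own pipelines, the Article 22 test can be thought of as a simple pre-deployment check: is the decision solely automated, does it have a legal or similarly significant effect, and if so, does an exception such as explicit consent or contractual necessity apply? The sketch below is a minimal, hypothetical illustration in Python; the class, field, and function names are assumptions for this article, not part of the regulation or the FPF report.

```python
from dataclasses import dataclass


@dataclass
class DecisionSystem:
    """Hypothetical description of an automated decision-making system."""
    solely_automated: bool              # no meaningful human review of the outcome
    legal_or_significant_effect: bool   # e.g., credit denial, job screening
    explicit_consent: bool              # informed consent from the data subject
    necessary_for_contract: bool        # needed to enter into or perform a contract


def article_22_applies(system: DecisionSystem) -> bool:
    """Article 22 restricts decisions that are solely automated AND
    produce legal or similarly significant effects."""
    return system.solely_automated and system.legal_or_significant_effect


def is_permitted(system: DecisionSystem) -> bool:
    """A restricted decision is only permitted if an exception applies,
    such as explicit consent or contractual necessity."""
    if not article_22_applies(system):
        return True
    return system.explicit_consent or system.necessary_for_contract


# Example: automated credit scoring with no consent and no contractual basis
credit_scoring = DecisionSystem(
    solely_automated=True,
    legal_or_significant_effect=True,
    explicit_consent=False,
    necessary_for_contract=False,
)
print(is_permitted(credit_scoring))  # False: needs consent, a contract basis, or human review
```

As the report stresses, consent only counts as a valid basis if the data subject is adequately informed about the logic behind the system.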
Actions for organizations
For organizations that want to ensure their decision making complies with the GDPR, Dr Zanfir-Fortuna recommends first verifying whether the decision making process relies on or results in the creation of personal data.
If personal data is used or created during the process, the organization then needs to determine whether it must collect consent from the data subject, for instance when the data falls into a sensitive category, such as biometric data, which requires special controls.
She also recommends that organizations increase transparency around how their decision making process works so they can explain to data subjects how their data is used.
At the same time, organizations should conduct Data Protection Impact Assessments (DPIAs) to avoid running into problems with European data protection authorities, which consider automated decision making a form of personal data processing that requires additional protection.
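To make those steps concrete, here is a minimal checklist sketch in Python that mirrors the sequence described above: check for personal data, check whether special-category data triggers a consent requirement, confirm the decision logic can be explained to data subjects, and confirm a DPIA has been completed. All names are illustrative assumptions, not an official compliance tool.

```python
def gdpr_readiness_gaps(
    uses_or_creates_personal_data: bool,
    processes_special_category_data: bool,   # e.g., biometric data
    has_valid_consent: bool,
    logic_explained_to_data_subjects: bool,  # transparency obligation
    dpia_completed: bool,
) -> list[str]:
    """Return the open compliance gaps for an automated decision-making
    process, following the steps recommended in the report."""
    gaps: list[str] = []
    if not uses_or_creates_personal_data:
        return gaps  # GDPR obligations around personal data do not attach
    if processes_special_category_data and not has_valid_consent:
        gaps.append("collect explicit consent for special-category data")
    if not logic_explained_to_data_subjects:
        gaps.append("document and explain the decision-making logic to data subjects")
    if not dpia_completed:
        gaps.append("conduct a Data Protection Impact Assessment (DPIA)")
    return gaps


# Example: a hiring screener that uses biometric data without consent or a DPIA
print(gdpr_readiness_gaps(True, True, False, False, False))
```

In practice, each of these checks maps onto documentation, consent records, and assessments rather than a single flag; the sketch is only meant to show the order in which the questions are asked.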