Gilad Yaron

The Bank That Listened Too Much

Updated: Apr 13, 2023

Have you ever called a customer service line and wondered if they were secretly analyzing your emotions and scoring you based on the keywords you used? Well, that's exactly what one Hungarian bank was doing, and they just got hit with a hefty fine under the GDPR.


The bank was using software that analyzed customer service calls against a list of keywords and the caller's emotional state. The software then assigned each caller a score indicating who should be called back and with what priority.
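To make the mechanism concrete, here is a minimal, purely illustrative sketch of how such keyword-and-emotion scoring could work. The bank's actual software is not public, so every keyword, weight, and score below is a hypothetical placeholder.

```python
# Illustrative sketch only: all keywords, weights, and scores are hypothetical,
# not the bank's actual system.
from dataclasses import dataclass

# Hypothetical keywords that raise a call's priority score.
KEYWORD_WEIGHTS = {
    "cancel": 3.0,
    "complaint": 2.5,
    "lawyer": 4.0,
    "competitor": 2.0,
}

@dataclass
class Call:
    caller_id: str
    transcript: str       # words recognized in the recorded call
    emotion_score: float  # e.g. 0.0 (calm) to 1.0 (very upset), from a voice model
    priority: float = 0.0

def score_call(call: Call) -> float:
    """Combine keyword hits and emotional state into a single priority score."""
    words = call.transcript.lower().split()
    keyword_score = sum(KEYWORD_WEIGHTS.get(w, 0.0) for w in words)
    # Weight emotion heavily: an upset caller is ranked for an earlier callback.
    return keyword_score + 5.0 * call.emotion_score

def callback_queue(calls: list[Call]) -> list[Call]:
    """Return callers ordered by descending priority, i.e. who to call back first."""
    for call in calls:
        call.priority = score_call(call)
    return sorted(calls, key=lambda c: c.priority, reverse=True)
```

Note how every caller in a pipeline like this is profiled and ranked automatically, without ever being told. That lack of transparency is precisely what the regulator took issue with.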


While the bank claimed they were using this software to improve efficiency and prevent complaints, they failed to inform customers about the voice analysis and lacked a valid legal basis for processing the data.


The GDPR, the European Union's data protection law, requires transparency toward users: organizations must explain what is done with our information, to what extent, how, and why. The bank described this use as "quality control based on changing parameters, prevention of complaints and customer migration to improve the efficiency of its customer support".


However, this description was only general and omitted essential information about the voice analysis itself.


Another requirement of the GDPR is a valid legal basis for each processing operation. On what basis did you decide that you are allowed to process our information? That's the question many of us have been asking in recent years. The GDPR provides six possible legal bases, one of which is "legitimate interest". That basis is something of a gray area, and the bank relied on its "legitimate interests" in retaining customers and making its internal operations more efficient.


But the Hungarian Data Protection Authority (Nemzeti Adatvédelmi és Információszabadság Hatóság, NAIH) disagreed. It imposed a fine of 670,000 euros on the bank, stating that the only valid legal basis for emotion-based voice analysis is the data subjects' freely given, informed consent.


Furthermore, the authority emphasized that although the bank performed a data protection impact assessment (DPIA) and identified that the processing posed a high risk to data subjects, the DPIA failed to present substantial measures to address those risks. The authority stressed that the use of artificial intelligence in data processing demands special consideration, not just on paper, but through effective safeguards in practice.


The bank's analysis of customer service calls is a prime example of how the GDPR is being enforced to protect consumers' privacy rights. With the unprecedented amount of personal data being collected and processed online, regulations like the GDPR are essential for ensuring that organizations are transparent about their data collection practices, have a valid legal basis for processing, and put effective safeguards in place against potential risks.


Moreover, AI-powered data processing requires much more than words on paper: it demands effective, practical tools that address potential risks and protect privacy.


The Hungarian bank learned this lesson the hard way, but it's an important reminder for all organizations that process personal data to take data privacy seriously and comply with the GDPR. After all, privacy is a fundamental right, and protecting it is essential in today's digital age.




