GDPR and its implications for banks using chatbots: Part 2

interface.ai

In part 2 of the series on GDPR implications for banks using chatbots, we discuss the most recent amendments to the GDPR, common gaps in GDPR compliance and requirements specific to banks implementing chatbots.

The latest additions to the GDPR are the ‘right to data portability’ and the ‘right to erasure’, which were discussed in part 1 of this series. Data breaches are another key area to consider. When a data breach happens, the organisation has to report it to the ICO within 72 hours of becoming aware of the breach (a small timing sketch follows the list below); by comparison, the average time to identify a breach today is around 14 months. Banks report a breach by completing a data breach form, which includes a series of questions such as:

  1. What data has been specifically compromised?
  2. How do you know?
  3. What training were the individuals involved in the breach given in the two years prior to the breach?
  4. What could you have done before the breach to prevent it?
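As a rough illustration of the 72-hour window, the snippet below is a minimal sketch (the function names and timestamps are hypothetical) that computes the ICO reporting deadline from the moment the organisation becomes aware of a breach.

```python
from datetime import datetime, timedelta, timezone

# GDPR Article 33: report a breach to the supervisory authority (the ICO
# in the UK) without undue delay and, where feasible, within 72 hours of
# becoming aware of it.
REPORTING_WINDOW = timedelta(hours=72)

def ico_reporting_deadline(awareness_time: datetime) -> datetime:
    """Latest time by which the breach must be reported."""
    return awareness_time + REPORTING_WINDOW

def hours_remaining(awareness_time: datetime, now: datetime) -> float:
    """Hours left before the 72-hour window closes (negative if overdue)."""
    return (ico_reporting_deadline(awareness_time) - now).total_seconds() / 3600

# Example: breach discovered at 09:00 UTC on 1 March (hypothetical timestamps)
aware = datetime(2024, 3, 1, 9, 0, tzinfo=timezone.utc)
print(ico_reporting_deadline(aware))                        # 2024-03-04 09:00:00+00:00
print(hours_remaining(aware, aware + timedelta(hours=10)))  # 62.0
```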

All breaches must be treated as critical, and any negligence on the part of the organisation can lead to serious fines. This also includes accidental deletion or compromise of data.

“How you report a breach is a critical area and based on my experience, having been in this business for over 15 years, there is a very high possibility that your business might be breached!” says Alan Calder, renowned IT security expert and author of IT Governance: An International Guide to Data Security and ISO27001/ISO27002 (an Open University textbook).

Let us have a look at the penalties companies face when they breach the regulations or codes of conduct:

Non-compliance – A maximum fine of €20 million or 4% of the company’s global annual turnover, whichever is higher. Violations such as processing customers’ data without the required consent or breaching core privacy principles fall into this highest tier of non-compliance.

Notifications / not conducting an impact assessment – Data processors and controllers can be fined up to €10 million or 2% of global annual turnover, whichever is higher, for failing to notify data subjects or the supervisory authority about a data breach, or for failing to conduct an impact assessment.
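To make the two fine tiers concrete, here is a minimal sketch of the “whichever is higher” rule; the turnover figure used in the example is hypothetical.

```python
def max_gdpr_fine(global_annual_turnover_eur: float, tier: str) -> float:
    """Upper limit of a GDPR fine for a given violation tier.

    'upper' tier (Art. 83(5), e.g. processing without consent):
        up to EUR 20M or 4% of global annual turnover, whichever is higher.
    'lower' tier (Art. 83(4), e.g. failing to notify a breach or to
        conduct an impact assessment):
        up to EUR 10M or 2% of global annual turnover, whichever is higher.
    """
    if tier == "upper":
        return max(20_000_000, 0.04 * global_annual_turnover_eur)
    if tier == "lower":
        return max(10_000_000, 0.02 * global_annual_turnover_eur)
    raise ValueError(f"unknown tier: {tier}")

# A bank with EUR 2 billion global annual turnover (hypothetical figure):
print(max_gdpr_fine(2_000_000_000, "upper"))  # 80,000,000.0 -> the 4% cap dominates
print(max_gdpr_fine(2_000_000_000, "lower"))  # 40,000,000.0
```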

Responsibilities & Obligations

GDPR controller activities – The data controller plays a vital role in complying with the regulations of the GDPR. Key activities include:

Implement data protection – Ensure that data protection policies and procedures are properly implemented across the data processing organisation.

Implement technical and organisational measures – Data controllers must implement dedicated safeguards for the data they collect. These safeguards must involve assessing potential risks and accounting for the variety of possible threats. Organisational measures include staff awareness training and guidance on how to configure and deploy devices.

Adhere to codes of conduct – The GDPR allows the ICO and other authorities to formulate and publish their own codes of conduct, and organisations can demonstrate compliance by adopting them. Similarly, many organisations are implementing ISO/IEC 27001, the international information security standard, and BS 10012, the British standard for personal information management systems, which has been updated to align with the GDPR.

Common gaps organisations should identify in achieving compliance:

1. Governance – awareness of the leadership team, management, and functional management.

2. Risk management – risk to the organization and risk to data subjects as a result of a data breach.

3. GDPR project – how is your organization addressing the specific requirements to become compliant?

4. Data protection officer (DPO) – are you required to appoint a DPO and have these requirements been met?

5. Roles and responsibilities – identify roles that are likely to have responsibilities under the GDPR and establish appropriate skills, knowledge, and training.

6. The scope of compliance – identify how much of your organization is within the scope of the privacy compliance framework.

7. Process analysis – identify all of the controller–processor relationships that involve data processing.

8. Personal information management system (PIMS) – documentation that enables you to demonstrate GDPR compliance.

9. Information security management system (ISMS), principle 6 and Article 32 – protecting the security of data subjects’ personal data.

10. Rights of data subjects – you need to recognize data subjects’ rights and have procedures and technologies in place.

What are the potential measures that can be deployed by chatbot companies?

1. Transparency – The data collector must be completely aware of what personal data is collected and processed by the chatbot. Since banking chatbots require users to provide a minimum of personal data, such as login details and account numbers, it is important to inform customers about the data privacy policy. This can be displayed in the chatbot at the beginning of the conversation or sent separately.

2. Separate storage of encrypted personal data – All personal data should be stored separately from other data and encrypted for maximum protection.

3. Data deletion and retrieval for users – Another capability a bot can offer is letting users delete or retrieve their personal data whenever they want (a minimal sketch of measures 2 and 3 follows this list).
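Here is a minimal sketch of measures 2 and 3, assuming the Python cryptography package and an in-memory store; the class and field names are hypothetical. Personal data is kept in its own encrypted store, separate from the conversation log, and can be retrieved or erased on request.

```python
from cryptography.fernet import Fernet
import json

class PersonalDataVault:
    """Hypothetical store that keeps encrypted personal data separate
    from ordinary conversation logs."""

    def __init__(self, key: bytes):
        self._fernet = Fernet(key)
        self._records: dict[str, bytes] = {}   # user_id -> ciphertext

    def store(self, user_id: str, personal_data: dict) -> None:
        plaintext = json.dumps(personal_data).encode("utf-8")
        self._records[user_id] = self._fernet.encrypt(plaintext)

    def retrieve(self, user_id: str) -> dict:
        """Right of access: return the user's data in readable form."""
        ciphertext = self._records[user_id]
        return json.loads(self._fernet.decrypt(ciphertext))

    def erase(self, user_id: str) -> bool:
        """Right to erasure: delete the user's personal data."""
        return self._records.pop(user_id, None) is not None

vault = PersonalDataVault(Fernet.generate_key())
vault.store("user-42", {"account_number": "GB29NWBK60161331926819"})
print(vault.retrieve("user-42"))   # data subject access request
print(vault.erase("user-42"))      # True -> data deleted on request
```

Keeping personal data in a dedicated encrypted store also makes erasure simpler: deleting the record (or the key) removes the data without touching the rest of the conversation history.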

What should banks implementing chatbots be aware of?

You should make sure that you or your AI vendor complies with the following (a minimal configuration-check sketch follows the list):

1. All data collected must be hosted in Europe.

2. Consent must be obtained from individuals before their personal information can be retained.

3. The right of users to access, correct and delete their personal information must be respected.

4. The period during which data may be used must be mapped and controlled.

5. Access to data must be protected, and a Data Protection Officer must be designated to issue alerts.
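To make these obligations concrete, here is a minimal sketch of how a bank might encode them as checks against a vendor configuration. All keys, values and region names are hypothetical and not interface’s actual configuration.

```python
# Hypothetical vendor/deployment configuration
config = {
    "hosting_region": "eu-west-1",            # 1. data hosted in Europe
    "consent_required_before_storage": True,  # 2. consent before retention
    "subject_rights_api_enabled": True,       # 3. access/correct/delete
    "retention_days": 365,                    # 4. retention period mapped
    "dpo_contact": "dpo@examplebank.eu",      # 5. DPO designated for alerts
}

EU_REGIONS = {"eu-west-1", "eu-west-2", "eu-central-1"}

def gdpr_readiness_issues(cfg: dict) -> list[str]:
    """Return a list of unmet requirements (empty means all checks pass)."""
    issues = []
    if cfg.get("hosting_region") not in EU_REGIONS:
        issues.append("data must be hosted in Europe")
    if not cfg.get("consent_required_before_storage"):
        issues.append("consent must be captured before personal data is retained")
    if not cfg.get("subject_rights_api_enabled"):
        issues.append("users must be able to access, correct and delete their data")
    if not cfg.get("retention_days"):
        issues.append("a data retention period must be defined and enforced")
    if not cfg.get("dpo_contact"):
        issues.append("a Data Protection Officer must be designated for alerts")
    return issues

print(gdpr_readiness_issues(config))  # [] -> all five requirements are covered
```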

How does an AI vendor like interface help a bank comply with these requirements?

The interface bot, before it even starts a conversation with a customer, gets their consent through a checkbox, in accordance with the bank’s privacy policies. The user can simply click the checkbox to agree to the terms and conditions.
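As an illustration of this consent step, here is a minimal sketch; the function and field names are hypothetical and not interface’s actual API. The bot declines to proceed until a consent record, tied to the version of the bank’s privacy policy, has been stored.

```python
from datetime import datetime, timezone

consent_log: dict[str, dict] = {}   # user_id -> consent record

def record_consent(user_id: str, policy_version: str, checkbox_ticked: bool) -> bool:
    """Store an auditable consent record before any conversation starts."""
    if not checkbox_ticked:
        return False
    consent_log[user_id] = {
        "policy_version": policy_version,
        "granted_at": datetime.now(timezone.utc).isoformat(),
    }
    return True

def start_conversation(user_id: str) -> str:
    if user_id not in consent_log:
        return "Please accept the privacy policy before we continue."
    return "Hi! How can I help you with your banking today?"

record_consent("user-42", policy_version="2024-01", checkbox_ticked=True)
print(start_conversation("user-42"))
```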

Our platform and data processing flows are designed to give the bank storage of, and control over, all raw/original conversational data. At interface, the data received from the bot is automatically converted into anonymised data. This anonymised data is stored for future audit trails and used to train the machine learning models.
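A minimal sketch of this kind of anonymisation step is shown below, assuming simple regex-based redaction; the patterns are illustrative and not interface’s actual pipeline.

```python
import re

# Illustrative patterns for common personal identifiers in banking chats
PATTERNS = {
    "ACCOUNT_NUMBER": re.compile(r"\b\d{8,12}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{8,}\d"),
}

def anonymise(utterance: str) -> str:
    """Replace personal identifiers with placeholder tokens before the
    transcript is stored for audit trails or model training."""
    for label, pattern in PATTERNS.items():
        utterance = pattern.sub(f"<{label}>", utterance)
    return utterance

print(anonymise("Transfer 200 from 12345678, confirmation to jo@example.com"))
# -> "Transfer 200 from <ACCOUNT_NUMBER>, confirmation to <EMAIL>"
```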

interface also provides the option for banks to comply with GDPR requirements by allowing users to access, modify or delete data stored by the bank. Users can raise a request with the bot, and the bot will in turn pass the request on to the bank. For more information on how the chatbot can work at your bank, or to see a live demo of our product, please email sales@interface.verifinow.in.
