Data violation dangers with AI such as ChatGPT

A considerable proportion of the population lacks a proper understanding of how generative AI, such as ChatGPT, operates.

This could result in violations of data compliance regulations, a data breach law specialist has warned.

According to Richard Forrest, legal director at data breach law firm Hayes Connor, this lack of understanding could lead to the inadvertent disclosure of private information, and therefore a breach of the General Data Protection Regulation (GDPR).

This comes after a recent investigation by Cyberhaven revealed that sensitive data makes up 11% of what employees copy and paste into ChatGPT. In one instance, the investigation cited a medical practitioner who had entered private patient details into the chatbot, the repercussions of which are still unknown.

ChatGPT stands apart from other chatbots because of its use of reinforcement learning from human feedback (RLHF), which allows it to produce natural language, recognise when it has made mistakes, and more.

Because the chatbot has recently been credited with helping businesses improve growth and efficiency, there has been an increase in users across many sectors, Forrest said. However, concerns have arisen after a number of employees were found to have negligently submitted sensitive corporate data to the chatbot, as well as sensitive patient and client information.

As a result of these ongoing privacy fears, several large-scale companies, including JP Morgan, Amazon and Accenture, have since restricted the use of ChatGPT by employees.

Forrest said: “ChatGPT, and other similar Large Language Models (LLMs), are still very much in their infancy. This means we are in uncharted territory in terms of business compliance and regulations surrounding their usage.

“The nature of LLMs, like ChatGPT, has sparked ongoing discussions about the integration and retrieval of data within these systems. If these services do not have appropriate data protection and security measures in place, then sensitive data could become unintentionally compromised.

“The issue at hand is that a significant proportion of the population lacks a clear understanding of how LLMs function, which can result in the inadvertent submission of private information. What’s more, the interfaces themselves may not necessarily be GDPR compliant. If company or client data becomes compromised due to its usage, current laws are blurred in terms of which party may be liable.

“Businesses that use chatbots like ChatGPT without proper training and caution may unknowingly expose themselves to GDPR data breaches, resulting in significant fines, reputational damage and legal action. As such, usage as a workplace tool without proper training and regulatory measures is ill-advised.”

Forrest went on to say: “The onus is on businesses to take action to ensure [compliant measures] are drawn up within their business, and to educate employees on how AI chatbots integrate and retrieve data. It is also imperative that the UK engages in discussions for the development of a pro-innovation approach to AI regulation.”

Forrest provides these tips:

  1. Assume that anything you enter could later be accessible in the public domain
  2. Don’t input software code or internal data (a rough screening sketch follows this list)
  3. Revise confidentiality agreements to include the use of AI 
  4. Create an explicit clause in employee contracts
  5. Hold sufficient company training on the use of AI
  6. Create a company policy and an employee user guide
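For the first two tips, some organisations screen what employees paste into a chatbot before it leaves the business. The sketch below is purely illustrative and not part of Forrest's guidance: it assumes a Python environment and uses two crude, made-up patterns to flag obvious personal identifiers in a draft prompt. A real deployment would rely on a proper data-loss-prevention tool rather than a handful of regular expressions.

```python
import re

# A minimal sketch, assuming a Python environment. The pattern names and
# regular expressions below are illustrative assumptions, not a complete
# or reliable detector of personal data.
SENSITIVE_PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "UK National Insurance number": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),
}

def flag_sensitive(prompt: str) -> list:
    """Return the names of any patterns found in a draft chatbot prompt."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(prompt)]

if __name__ == "__main__":
    draft = "Summarise the notes for patient John Smith, NI number QQ123456C."
    findings = flag_sensitive(draft)
    if findings:
        print("Hold the prompt - possible sensitive data:", ", ".join(findings))
    else:
        print("No obvious sensitive data found (absence of a match is not a guarantee).")
```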

Currently, one of the biggest causes of data breaches in the UK across most sectors is human error. “As AI is being utilised more frequently in the corporate sphere, it is important to make training a priority,” Forrest said.
