ChatGPT & Legal Issues

// Legal x-rays

By Evangelia Manika, Associate, LL.B., MSc.

Over the last few months, ChatGPT has been the main focus of interest in the technology sector, but it has also raised several legal concerns about its use. The Italian Data Protection Authority (Garante), having identified privacy issues, has already temporarily banned ChatGPT until it complies with certain key privacy requirements set by the Garante (deadline: 30 April). In particular, the Garante ordered OpenAI, the company that developed the platform, among other things, to verify users' age before they use ChatGPT. Moreover, the Garante found that personal data had been collected unlawfully.

In addition to privacy concerns, the use of ChatGPT raises many other legal issues, such as misinformation, cybercrime, intellectual property infringement and fraud. Of particular concern was the case of a young Belgian man who committed suicide in late March following weeks of chatting with an AI chatbot called Eliza. Given that there is still a lack of legislation on AI, since the AI Act has not yet been adopted at European level, the risks posed by the use of ChatGPT are even more pronounced.

In this context, some actions have already been taken at national level. For example, the Spanish Data Protection Authority called on the European Data Protection Board (EDPB) to formally review ChatGPT's compliance with the GDPR. To this end, the EDPB, taking into account the abovementioned decision of the Garante, decided to launch a dedicated task force to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities.

Finally, it is very important that companies like OpenAI have a proper legal basis for collecting and using personal data, are transparent about how they use personal data, and keep personal data accurate, including by giving data subjects the right to rectification. It remains to be seen how, and whether, the legal issues already raised by the use of ChatGPT will be resolved in the near future.