The Italian Data Protection Authority (GPDP) has issued a nationwide ban on ChatGPT, one of the most well-known artificial intelligence tools of recent years. If OpenAI fails to comply with Italy's directives, Italian internet users will lose access to the AI program. So why did the Italian government confront OpenAI?
In its statement, the GPDP said that OpenAI collects user data in violation of the law and has no legal basis for doing so. Accordingly, the GPDP says OpenAI must either stop collecting data from users in Italy or leave the country.
Concern for minors
User data isn't the only problem Italian authorities have with OpenAI and ChatGPT. In its statement, the GPDP notes that ChatGPT lacks safeguards for users under the age of 18. So what happens to OpenAI now?
Under Italian law, the ban starts a 20-day countdown for OpenAI. The company must comply with the GPDP's instructions within that period and inform the authority of the measures it has taken. If it does not, OpenAI faces fines of up to 20 million euros (or 4% of its annual global turnover) payable to the Italian government, and the service will remain blocked. It is not yet clear what OpenAI will do about the matter.
Italy is wary of chatbots in general
Italy's stance on chatbots is not limited to ChatGPT. Just a few months ago, the government banned a chatbot called Replika. In that case, the Italian government arguably had a point: the characters created in Replika could be used for obscene conversations. The Replika development team subsequently removed this feature, a decision that drew backlash from users. It seems we will be hearing about such issues frequently in the times ahead.