Two Companies Roll Out Privacy Screens for ChatGPT

OpenAI’s large language model system, ChatGPT, has been adopted at a blinding rate. Commercial real estate, an industry better known for living on the edge of technology, has already been employing it. For example, some apartment owners have been using it to generate marketing emails or pitches to prospects and residents.

Blinding applies in two ways. One is speed; the other is frequent blindness to what the software actually does. Issues include the tools’ inclination to make things up, including data, facts, and sources. There is also the potential for copyright infringement, since the tools were largely trained on material without permission from its owners to use the results in commercial applications.

Another major issue is that people have been feeding materials into ChatGPT or other products that connect to its AI processing. While OpenAI has clarified that it no longer uses this data to train its system, it still retains it for roughly 30 days. The concern is that an error or bug could leak information, as happened in March when ChatGPT exposed users’ conversation histories.

Recently, two companies have created filters that strip personally identifiable information (PII) about customers or employees before it reaches the AI system. The release of PII can breach data privacy laws and put a company at regulatory risk.

One of the two companies is Private AI, which uses a machine learning system to detect PII such as names, birth dates, phone numbers, addresses, and card numbers. Over time, that learning system has improved the company’s ability to recognize such data not only in databases but also in documents, chats, and other freeform expressions of information.
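To make the detection idea concrete, here is a minimal sketch of spotting a few PII categories with regular expressions. This is purely illustrative: Private AI’s product relies on trained models, not regexes, and the pattern names and function below are hypothetical.

```python
import re

# Hypothetical patterns for a few common PII categories.
# Real systems use ML models that generalize far beyond regexes.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "CARD": re.compile(r"\b(?:\d{4}[-\s]?){3}\d{4}\b"),
}

def find_pii(text):
    """Return (label, match) pairs for every PII pattern found in text."""
    hits = []
    for label, pattern in PII_PATTERNS.items():
        for m in pattern.finditer(text):
            hits.append((label, m.group()))
    return hits
```

A call like `find_pii("Call 555-123-4567 or email a@b.com")` would flag both the phone number and the email address, which is the kind of signal a filter needs before deciding what to withhold from an AI system.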

LLMs are not excluded from data protection laws such as the GDPR, HIPAA, PCI DSS, or the CPPA, said Patricia Thaine, co-founder and CEO of Private AI. She added that the GDPR requires companies to get consent for all uses of their users’ personal data and to comply with requests to be forgotten.

Cado Security announced Masked-AI, an open-source library that developers can use to redact PII, store the original values internally, substitute placeholders, and then reverse the process when ChatGPT returns a result.
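The redact-substitute-restore round trip described above can be sketched as follows. This is not Masked-AI’s actual API, just an assumed illustration of the pattern: sensitive values are swapped for placeholder tokens before text is sent to the model, kept in a local map, and swapped back into the model’s response.

```python
import re

def redact(text, patterns):
    """Replace each PII match with a placeholder token, keeping the
    originals in a local mapping that never leaves the machine."""
    mapping = {}

    def substitute(match):
        token = f"<PII_{len(mapping)}>"
        mapping[token] = match.group()
        return token

    for pattern in patterns:
        text = pattern.sub(substitute, text)
    return text, mapping

def restore(text, mapping):
    """Reverse the redaction once the model's response comes back."""
    for token, original in mapping.items():
        text = text.replace(token, original)
    return text
```

For example, redacting `"Email alice@example.com with the report"` against an email pattern sends `"Email <PII_0> with the report"` to the model, while `restore` puts the real address back into whatever the model returns.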

For well and good, both companies keep that data private. But PII does not include strategic analyses, company secrets, sales projections, or anything else that is sensitive and proprietary.
