The European Union's new GDPR privacy regulations allow users to ask any company to delete all of their personal data and to provide a copy of it on demand. Non-compliance carries harsh penalties.
These requirements are effectively impossible to comply with for companies developing any kind of machine learning, neural network, or artificial intelligence system that learns a global model from attributes gathered across multiple users. Here is why:
Lawyers expect personal data to be localized and understandable. But increasingly we aggregate personal data into all kinds of computer models of users, where that data becomes diffuse and incomprehensible.
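A minimal sketch of why this happens, using a hypothetical toy model (the user names and data here are invented for illustration): several users' records are pooled to fit a tiny linear model, and afterwards the only thing the company stores is a handful of blended weights. Every user influenced those weights, yet no weight can be attributed to, or deleted for, any single user; "forgetting" someone means retraining from scratch without their rows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three hypothetical users, each contributing a few (features, label) records.
users = {
    "alice": (rng.normal(size=(5, 2)), rng.normal(size=5)),
    "bob":   (rng.normal(size=(5, 2)), rng.normal(size=5)),
    "carol": (rng.normal(size=(5, 2)), rng.normal(size=5)),
}

# Pool everyone's data and fit an ordinary least-squares model.
X = np.vstack([x for x, _ in users.values()])
y = np.concatenate([t for _, t in users.values()])
weights, *_ = np.linalg.lstsq(X, y, rcond=None)

# The "personal data" the model retains is just two numbers -- diffuse
# and incomprehensible, with no per-user piece to hand over or erase.
print(weights.shape)  # (2,)

# Honoring a deletion request for alice means retraining without her rows:
X2 = np.vstack([x for name, (x, _) in users.items() if name != "alice"])
y2 = np.concatenate([t for name, (_, t) in users.items() if name != "alice"])
weights_without_alice, *_ = np.linalg.lstsq(X2, y2, rcond=None)

# The weights shift, so her data did leave a trace in the model -- but
# that trace was never stored anywhere as "alice's data".
print(np.allclose(weights, weights_without_alice))  # False
```

Real systems train far larger models on far more users, which makes this retrain-from-scratch remedy correspondingly more expensive, but the structural problem is the same.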
Think of it as someone asking you to forget they ever existed and to roll yourself back to whoever you would have been had you never met them, while also demanding an exhaustive, understandable list of the neural "mental data" you currently hold about them.
It is important for users to understand that, as technology progresses, their data is being used in ways that cannot be undone, and that requests for the stored data are becoming impossible to fulfill. However, lawyers and regulators should also recognize that aggregating personal data in machine learning algorithms can itself be an effective form of anonymization.