In recent years, the European Union has led the way on regulatory reform intended to protect individuals’ privacy, safety, and health, and regulators in the United States often follow suit at the federal or state level. The EU’s passage of the General Data Protection Regulation (GDPR), intended to protect personal data, is a prime example. Two years after the GDPR’s adoption, California passed a privacy statute of its own, the California Consumer Privacy Act (CCPA). Other states have since enacted comprehensive consumer privacy laws, with similar proposals under consideration in many state legislatures. For those in the United States, this progression is a reminder to keep a watchful eye on European regulatory activity as a harbinger of things to come.
Synthetic Data Gets Real
As we noted in the early days of the pandemic, COVID-19 has been accompanied by a rise in cyberattacks worldwide. At the same time, the global response to the pandemic has accelerated interest in the collection, analysis, and sharing of data – specifically, patient data – to address urgent problems such as managing hospital populations, diagnosing and detecting medical conditions, and developing vaccines, all through the use of artificial intelligence (AI) and machine learning (ML). Typically, AI/ML systems churn through huge amounts of real-world data to deliver useful results.

The collection and use of that data, however, give rise to legal and practical challenges. Numerous and increasingly strict regulations protect the personal information needed to feed AI solutions. The usual response has been to anonymize patient health data through time-consuming and expensive processes (the HIPAA safe harbor standard alone requires the removal of 18 types of identifying information). But anonymization is not foolproof, and once data has been stripped of personally identifiable information, what remains may be of limited utility. This is where synthetic data comes in.
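To make the de-identification step concrete, the short Python sketch below shows, in highly simplified form, what removing direct identifiers from a patient record can look like. The record, the field names, and the deidentify helper are hypothetical illustrations, and the sketch covers only a handful of the 18 safe harbor identifier categories.

```python
# Illustrative sketch only: the record and field names below are hypothetical,
# and only a few of the 18 HIPAA safe harbor identifier categories are shown.

# A hypothetical patient record as it might arrive from a hospital system.
record = {
    "name": "Jane Doe",
    "date_of_birth": "1984-07-19",
    "zip_code": "02139",
    "email": "jane.doe@example.com",
    "diagnosis_code": "E11.9",      # clinical content we want to keep
    "lab_glucose_mg_dl": 162,       # clinical content we want to keep
}

# Identifier fields to drop under a safe-harbor-style approach (subset only).
IDENTIFIER_FIELDS = {"name", "date_of_birth", "zip_code", "email"}

def deidentify(rec):
    """Return a copy of the record with the listed direct identifiers removed."""
    return {k: v for k, v in rec.items() if k not in IDENTIFIER_FIELDS}

print(deidentify(record))
# {'diagnosis_code': 'E11.9', 'lab_glucose_mg_dl': 162}
```

Even this toy example hints at the tradeoff described above: once names, dates, and locations are stripped out, the remaining record carries less analytic value, and unusual combinations of the surviving fields may still allow re-identification.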