The European Union has been a leader in recent years when it comes to regulatory reform intended to protect individuals’ privacy, safety, and health. As Europe leads the way, regulators in the United States often follow suit at the federal or state level. The EU’s passage of the General Data Protection Regulation (GDPR), intended to protect personal data, is a prime example. Two years after the GDPR’s enactment, California adopted a privacy rights statute of its own, the California Consumer Privacy Act (CCPA). Other states have since passed comprehensive consumer privacy laws, with similar proposals under consideration in many state legislatures. This progression should serve as a reminder for those in the United States to keep a watchful eye on European regulatory activity as a potential harbinger of things to come in the U.S.
Medical Devices
Prior Art Showing An Invention To Be “At Least Possible” Found Sufficient To Invalidate Surgical Device Patent
On August 23rd, the Federal Circuit upheld in part and reversed in part a decision of the Patent Trial and Appeal Board (PTAB or Board) concerning Ethicon’s patent on a robotic surgical tool, holding that the Board’s finding of no motivation to combine was not supported by substantial evidence. In doing so, the court determined the PTAB “went too far” in its holding of non-obviousness by requiring Intuitive to specifically identify a preexisting surgical device performing as many functions as required by the Ethicon patent; it was enough that the prior art established such a device was “at least possible.”
Federal Circuit Invalidates Device Patent As Directed to an Abstract Idea
Nearly seven years after the landmark Supreme Court decision in Alice Corp. v. CLS Bank Int’l, subject matter eligibility for patent claims under 35 U.S.C. § 101 remains a moving target. In Alice, the Court found claims for a computerized escrow arrangement ineligible for patenting because they were directed to the abstract idea of “intermediated settlement” and did not recite an inventive concept that could impart eligibility under Section 101. While the Alice case focused on a software invention, a few recent lower court decisions suggest that, in certain circumstances, medical device patents may not be immune from similar patent eligibility challenges.
A Guiding Light for the Research Safe Harbor and “Research Tools”?
Allele v. Pfizer – The Basics. On April 23, 2021, Pfizer, Inc., BioNTech SE, and BioNTech US, Inc. (“Pfizer and BioNTech”) filed a joint reply in support of their previously filed motion to dismiss a patent infringement complaint brought by Allele Biotechnology and Pharmaceuticals, Inc. (“Allele”) in the Southern District of California. The patent at the center of the case is U.S. Pat. No. 10,221,221 (“the ’221 Patent”), which covers Allele’s mNeonGreen, a monomeric yellow-green fluorescent protein notable for its intense brightness. On May 4, 2021, the court denied the motion to dismiss, leaning heavily on the Federal Circuit’s 2008 decision in Proveris Scientific Corp. v. Innovasystems, Inc. As this case continues to develop, it could help shed light on an unsettled issue – are “research tools” categorically excluded from the 35 U.S.C. § 271(e)(1) Safe Harbor?
Synthetic Data Gets Real
As we mentioned in the early days of the pandemic, COVID-19 has been accompanied by a rise in cyberattacks worldwide. At the same time, the global response to the pandemic has accelerated interest in the collection, analysis, and sharing of data – specifically, patient data – to address urgent issues, such as population management in hospitals, diagnosis and detection of medical conditions, and vaccine development, all through the use of artificial intelligence (AI) and machine learning. Typically, AI/ML churns through huge amounts of real-world data to deliver useful results. The collection and use of that data, however, give rise to legal and practical challenges. Numerous and increasingly strict regulations protect the personal information needed to feed AI solutions. The response has been to anonymize patient health data in time-consuming and expensive processes (HIPAA alone requires the removal of 18 types of identifying information). But anonymization is not foolproof, and, after stripping data of personally identifiable information, the remaining data may be of limited utility. This is where synthetic data comes in: artificial data generated to mirror the statistical properties of real patient data without reproducing any actual patient’s record.
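For readers curious what “synthetic data” looks like in practice, the sketch below is a deliberately simplified illustration of the general idea, not any vendor’s actual method: it fits basic per-column statistics from a small, entirely hypothetical patient table and then samples artificial records that preserve rough aggregate patterns without copying any source row. All field names and values are made up for illustration; real synthetic-data tools rely on far more sophisticated generative models and privacy safeguards.

```python
# Illustrative sketch only: a toy synthetic-data generator.
# All records, column names, and values below are hypothetical.
import random
import statistics

# Hypothetical de-identified source records (no real patient data).
real_records = [
    {"age": 54, "systolic_bp": 132, "diagnosis": "hypertension"},
    {"age": 61, "systolic_bp": 145, "diagnosis": "hypertension"},
    {"age": 47, "systolic_bp": 118, "diagnosis": "none"},
    {"age": 68, "systolic_bp": 151, "diagnosis": "diabetes"},
]

def fit_column_models(records):
    """Capture simple marginal statistics: mean/stdev for numeric columns,
    observed category values for text columns."""
    models = {}
    for col in records[0]:
        values = [r[col] for r in records]
        if isinstance(values[0], (int, float)):
            models[col] = ("numeric", statistics.mean(values), statistics.stdev(values))
        else:
            models[col] = ("categorical", values)
    return models

def sample_synthetic(models, n):
    """Draw artificial records from the fitted marginals. No synthetic row is
    copied from a real one, but aggregate patterns are roughly preserved."""
    rows = []
    for _ in range(n):
        row = {}
        for col, model in models.items():
            if model[0] == "numeric":
                _, mu, sigma = model
                row[col] = round(random.gauss(mu, sigma), 1)
            else:
                row[col] = random.choice(model[1])
        rows.append(row)
    return rows

if __name__ == "__main__":
    models = fit_column_models(real_records)
    for synthetic_row in sample_synthetic(models, 3):
        print(synthetic_row)
```

The appeal for regulated industries is visible even in this toy version: the synthetic rows can be shared or used to train models while the underlying patient records stay behind the original safeguards, though whether a given synthetic dataset truly eliminates re-identification risk remains a fact-specific question.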