Do Physicians Have a 'Right to Be Forgotten'?

When we all lived in villages, knowing individual merchants allowed you to identify and avoid the cheats. As we urbanized, brand names became the reputational marker. But when you are dealing on the internet, where no one “knows you’re a dog,” what guarantee do you have? A case involving a Dutch physician who argued that her licensure suspension should be forgotten highlights healthcare’s growing dilemma of privacy and reputation.

The current case from the Netherlands

Here is the broad outline. A Dutch surgeon’s license was suspended for poor post-operative care; on appeal, that suspension was made conditional, and she was allowed to continue practicing under supervision. The Dutch government’s healthcare regulator noted the events in the official public record, as did a “blacklisting” website that listed her, along with other physicians, as someone to avoid. She sued Google, asking them to remove the links to the “blacklisting” website. Google refused. The Dutch Data Protection Agency (DDPA) agreed, finding that “it was important for future patients to be able to find the information.” The DDPA noted that the information was not manifestly incorrect and that, at the time, her probation was ongoing, making the information relevant.

The plaintiff appealed to the courts, arguing that the “blacklisting” website was not representative and therefore not a reliable source, and that publication of these findings, even by the government’s regulators, represented “blaming and shaming,” a form of “digital pillory.”

The Right to be Forgotten

Under the EU’s strict privacy rules, an individual has a right to have their digital traces forgotten, with some broad exceptions. Specifically, the exceptions that permit data to be retained include information

"…necessary for historical, statistical and scientific research purposes, for reasons of public interest in the area of public health, for exercising the right of freedom of expression, when required by law or where there is a reason to restrict the processing of the data instead of erasing them." (emphasis added)

Several years ago, the Court of Justice of the European Union (CJEU) ruled that for data playing an essential role in public life, and arguably that includes malpractice litigation, the "preponderant interest of the general public in having ... access to the information in question" may overcome the right to be forgotten.

Spanish authorities, in a 2014 case involving Google and a real estate transaction, refined and extended the right to be forgotten. They found that search engines, specifically Google, by the mere linkage and display of information, were “controllers” of private information and were required to remove links, when challenged by individuals, under the EU privacy rules. The website that contained the data to be forgotten had to be addressed separately.

Google processes about 63,000 search requests a second, so its role as a “controller” of private information seems a natural conclusion. Ninety percent of searches begin and end on the front page; in fact, most end after the first five non-ad listings on that page. How we search has spawned an industry of “gaming the algorithm”: marketers work to move you up the page rankings, and “reputation” managers seek to bury your less favorable links on page two or three.

The Court’s decision for the Dutch physician

The court subsequently found the DDPA’s regulatory decision to be incorrect, held that the links were “irrelevant and excessive,” and ordered Google to remove them. Among the reasons cited:

  • Both Google and the court agreed that professional disciplinary proceedings are not “special criminal data” that can never be forgotten.
  • The government kept the same information publicly available, so removing the links would not impair the public interest.
  • The board had never stopped her from working during her probationary suspension, implying she was not a threat to patients.

The obligation of search engines and devilish details

Europeans feel that burying the link, as we do in the US, is insufficient. The 2014 Spanish decision obligated controllers like Google to resolve three issues:

  • whether truthful information should be treated differently from false information and, if so, how to determine which information falls into which category;
  • how to classify information as "old" versus "new," and at what point the "staleness" of information requires its removal on request;
  • the relevance of the original source of the publication to a removal request.

Intuitively, these are all good arguments. But consider an early case, brought and lost by a plastic surgeon who wanted an article about an accusation of malpractice to be forgotten. When the article was written it was true; only when he was later cleared did the information become false. A similar argument could be made about our youthful indiscretions: is 10 or 15 years sufficient for them to be forgotten? That debate played out in the confirmation hearings for Justice Kavanaugh, and the same issues led the Cleveland Clinic to fire a medical resident for anti-Semitic statements made five years earlier. And as for the source, does a scathing Yelp review by an angry patient deserve to be removed, or retained, any more than the same complaint sent to the state’s medical board, prompting an investigation?

Even the best algorithms with human oversight are no guarantee. After all, at some point Google will be sued for not displaying information, by a patient who feels their injury could have been avoided if “only they knew about those prior cases.”

Who decides healthcare’s reputational markers?

A Google search for position statements on physicians’ online reputations from the American Medical Association, the American College of Surgeons, the American College of Physicians, and the American College of Cardiology found no statements. Even on pages two and three, there were none.

The right to be forgotten can be couched as the right to have only correct information, but changing times and conditions make “relevant” and “correct” ambiguous. US law balances those issues through the First Amendment and the laws governing libel and invasion of privacy. Search engines are not the judge; the government, through the courts, makes those decisions. But that is an expensive undertaking, and the laws do not always align with the best interests of patients or physicians. In the EU, the responsibility has been shifted onto an uneasy and, I would argue, unstable alliance of government regulators and large corporations.

How technology companies, and that would include Facebook and all the rest, have managed privacy concerns and the separation of true from false should give us all pause. Rather than writing another set of clinical guidelines, shouldn’t we, along with our patients, the real stakeholders, develop a way of expressing reputation that is both truthful and permits redemption? Charles Bosk’s classic description of managing “medical failure” provides the direction we should take: Forgive and Remember. It is time we figure out how to do this on the Internet before the regulators and the corporate interests decide for us.

 

Sources: The original story of the Dutch physician can be found on many sites; I made use of the British Medical Journal’s article “Dutch surgeon wins right to tell Google to remove links to her name in online ‘blacklist’,” DOI: 10.1136/bmj.l414. The legal analysis was based, in part, on “The Right to be Forgotten: Who Decides What the World Forgets?” in the Kentucky Law Journal. An article in Forbes, “Europe’s ‘Right To Be Forgotten’ Clashes With U.S. Right To Know,” provided the legal counterpoint.