That has drawn criticism from various directions. "The EU advocate general who advised the court against this decision foresaw that it would be impossible for Google to judge millions of individual cases, each subject to nuance and particular facts," said Jules Polonetsky, executive director of the Future of Privacy Forum.
"Who speaks for the researchers who are looking for that information? Who speaks for the publisher of the information?" he asked.
Herold agrees. "This subjectivity leads to inconsistent application of a poorly written -- even with good intent -- law," she said. "It needs to be rewritten to clearly indicate the types of information that are candidates for removal, and the specific steps and standards that all entities -- Google, Microsoft, Facebook, etc. -- must follow to make their determination."
Herold offered a few examples of information that would qualify for removal -- "not an exhaustive list," she said -- such as sensitive personal information like Social Security or credit card numbers, false information posted by cyberbullies and information posted by someone impersonating someone else.
Examples of things that would not qualify, she said, could include criminal convictions and historical or government records.
Creating a comprehensive standard for what gets de-linked "will likely take some work and time," she said, "but it is much better than supporting a vague and inconsistently applied law for an undetermined time."
Polonetsky also noted that while a good case could be made for taking many stories down, "we won't know until years later if a story about a certain individual is important, and by then it may be too late -- the person may be the president or prime minister."
Google cofounder Larry Page has argued that it raises another potential problem -- that oppressive governments will use it to erase things they don't like. Wikipedia founder Jimmy Wales contends that it amounts to "censoring history."
Those arguments are a bit less compelling to privacy advocates. Herold said Wales "should be more concerned about keeping the information on his site more accurate without the ongoing barrage of completely false information being made within it. Making sure history is accurate, as opposed to propagating bogus information, is not censoring, it is correcting and improving upon the quality."
And Polonetsky pointed out that "oppressive governments around the world have been seeking to block information they don't like" long before the EU court ruling.
Privacy advocates do agree with the general concept that while it is impossible to erase things completely from the Internet, government should make an effort to make it more difficult to find bogus information.
Whether there is a way to do that is another matter entirely. David Meyer, writing in Yahoo Finance last week, noted that a British House of Lords committee had declared the EU court's ruling on the right to be forgotten unworkable.