Facebook is reportedly cracking down harder on COVID-19 misinformation. The social network is notifying users who have liked, commented on, or shared a post that's been removed for violating Facebook's terms of service related to the coronavirus pandemic. Then, in an effort to course-correct, it will connect those folks with what the company deems trustworthy sources.

This is a more aggressive take on the practice Facebook launched in April, when it began showing messages in News Feed to people who had spread "harmful misinformation," such as falsehoods that drinking bleach cures the virus and theories suggesting physical distancing is ineffective.

But it turns out a prompt to visit a World Health Organization page debunking coronavirus myths wasn't clear enough for some users. "People didn't really understand why they were seeing this message," Facebook product manager Valerio Magliulo told Fast Company. "There wasn't a clear link between what they were reading on Facebook from that message and the content they interacted with."

Now the firm is reaching out directly, sending notifications that explain, "We removed a post you liked that had false, potentially harmful information about COVID-19." Click the notification and you'll be taken to a page highlighting the offending content, your interactions with it, and the reason the post was removed from Facebook.

"The challenge we were facing and the fine balance we're trying to strike is how do we provide enough information to give the user context about [the interaction] we're talking about without re-exposing them to misinformation," Magliulo, who worked on the new notification system, said.
