Facebook Algorithm Keeps Recommending Anti-Vaxxer Groups to People Who Have Already Died

MENLO PARK, Calif. — Facebook officials discovered a glitch in the platform’s algorithm last week, in which anti-vaccination propaganda pages are being recommended to the accounts of users who have already died horribly and pointlessly from entirely preventable diseases.

“We designed the algorithm to connect users with like-minded people, and to help foster a sense of community. It’s just kind of an unintended side effect that one of them is a community of corpses,” explained coder Paul Stockton. “We do delete user profiles after a certain period of inactivity, but by that point they’ve usually been added to, on average, 30 to 40 groups that just reshare the same memes about mercury and that one discredited study about autism. I admit it’s unfortunate, but at least it shows the algorithm is working.”

Facebook user and anti-vax “truther” Joan Goodspeed praised several recommended groups which she claims “really opened [her] eyes.”

“I can’t believe what a mindless sheep I was before, blindly listening to what all those corrupt doctors and smug scientists advised,” said a pallid and emaciated Goodspeed in between bouts of coughing up blood onto a soiled pillowcase. “Without open-minded groups like ‘The Big Pharma Fighters’ and ‘No Needle, No Problem,’ I would never have known that you can prevent virtually any illness with a daily routine of two tabs of consolidated squid ink and a bone meal enema.”

Facebook founder Mark Zuckerberg gave his perspective on the post-mortem activity on the platform.

“People use Facebook so that they can feel close to others around the world — whether that be close family, old friends, or just a bunch of nutballs who believe that protecting their children against easily preventable diseases is somehow a deep state conspiracy,” he said. “I personally believe it would be irresponsible to censor these groups or alter our algorithm in any way.”

“Plus, we still get the ad revenue for as long as those accounts stay active,” added Zuckerberg.

Facebook’s algorithm was also found to disproportionately recommend Holocaust denial pages to users incarcerated for assault and battery.