The 28-year-old shooter, who has been arrested, published a racist manifesto on Twitter before livestreaming his rampage, which showed him repeatedly shooting worshipers at close range.
Announcing the suspension of the shooter's accounts, Facebook said, "Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter's Facebook and Instagram accounts and the video".
While Facebook has hired about 20,000 moderators, several media reports have highlighted the stress those workers face in watching violent content, and the difficulty of policing live videos.
But the 17-minute livestream was shared repeatedly on YouTube and Twitter, and internet platforms were scrambling to remove reposted videos of the gruesome scene.
"Shocking, violent and graphic content has no place on our platforms, and is removed as soon as we become aware of it", read the statement from the Google-owned video sharing company.
"Please know we are working vigilantly to remove any violent footage", YouTube tweeted.
People who wanted to spread the material raced into action, rapidly repackaging and distributing the video across many apps and websites within minutes.
"While Google, YouTube, Facebook and Twitter all say that they're cooperating and acting in the best interest of citizens to remove this content, they're actually not, because they're allowing these videos to reappear all the time", Lucinda Creighton, a senior adviser at the Counter Extremism Project, a global policy organization, told CNN.
Just before the alleged gunman opened fire, he urged viewers to subscribe to the popular YouTube channel PewDiePie, which has itself been criticized for posting offensive footage in the past. The seemingly incongruous reference to the Swedish vlogger, known for his video game commentaries as well as his racist references, was instantly recognizable to many of his 86 million followers. "My heart and thoughts go out to the victims, families and everyone affected", the vlogger said.
"Social media has certainly shifted global security risks", said Anwita Basu, an analyst at the Economist Intelligence Unit.
"This is a case where you're giving a platform for hate", Basu said.
Hours after the shooting, Reddit took down two subreddits known for sharing video and pictures of people being killed or injured - r/WatchPeopleDie and r/Gore - apparently because users were sharing the mosque attack video. "Subreddits that fail to adhere to those site-wide rules will be banned", the company said.
In 2017, a father in Thailand broadcast himself killing his daughter on Facebook Live. News reports and posts that condemn violence, however, are allowed, which makes for a tricky balancing act for the company.
"We are adding each video we find to an internal database which enables us to detect and automatically remove copies of the videos when uploaded again", she said in a statement.