Facebook has released new figures on the views and shares of live-streamed footage captured during the recent attacks on New Zealand mosques.
Last Friday, a shooter killed 50 people in two mosques in Christchurch, New Zealand, in one of the worst terrorist attacks in the country's history.
The shooter live-streamed the attack, and the footage quickly made the rounds across social networks including Facebook and Twitter.
Facebook has grappled with the challenge of preventing the spread of this footage, which, despite pleas from law enforcement, has continued.
According to Facebook VP and Deputy General Counsel Chris Sonderby, the tech giant has been working "around the clock" to detect and remove the content from the platform.
See also: Facebook debuts AI tool to tackle revenge porn
In a blog post published late Monday, Sonderby said that the original video uploaded by the shooter was viewed less than 200 times during the live broadcast.
However, before being removed, the original video was viewed roughly 4,000 times in total.
The first user report was received 29 minutes after the original live stream began, and 12 minutes after it ended. However, before Facebook was made aware of the footage, an individual had already posted a link to a copy of the video on 8chan.
Facebook says that after deleting the footage, the company "hashed it so that other shares that are visually similar to that video are then detected and automatically removed from Facebook and Instagram."
Hashing alone, however, was not enough to catch every variant of the video, and so the company also used audio detection technology to automatically find copies of the viral footage.
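Facebook has not published the details of its matching system, but the "visually similar" detection it describes is generally done with perceptual hashing. The toy "average hash" sketch below (all names and the 8x8 frame size are illustrative, not Facebook's actual method) shows the core idea: similar frames produce hashes that differ in only a few bits, so near-duplicates can be flagged by Hamming distance rather than exact byte comparison.

```python
# Toy perceptual ("average") hash: hashes an 8x8 grayscale frame into 64 bits.
# Real systems downscale full video frames, hash many frames per clip, and
# compare hashes by Hamming distance to spot re-encoded or edited copies.

def average_hash(frame):
    """frame: 8x8 list of grayscale values (0-255). Returns a 64-bit int,
    one bit per pixel: 1 if the pixel is brighter than the frame average."""
    pixels = [p for row in frame for p in row]
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits between two hashes; small means similar."""
    return bin(h1 ^ h2).count("1")

# A synthetic gradient frame, and a slightly brightened copy of it.
frame_a = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
frame_b = [[min(255, p + 10) for p in row] for row in frame_a]

d = hamming_distance(average_hash(frame_a), average_hash(frame_b))
print(d)  # 0 -- the brightened copy still matches as "visually similar"
```

Because brightening shifts both the pixels and the average together, the hash is unchanged, which is exactly why this family of techniques survives the minor edits (recropping, re-encoding, watermarking) that defeat exact-match filters.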
TechRepublic: How to prevent spear phishing attacks: 8 tips for your business
Despite these efforts, Facebook says it removed roughly 1.5 million copies of the video in the first 24 hours after the attack. Over 1.2 million of those were blocked at upload, meaning approximately 300,000 copies made it onto the platform before being taken down.
The named suspect's accounts were also removed from Facebook and Instagram, and the firm is "actively identifying and removing" imposter accounts as they appear, Sonderby said.
"Our hearts go out to the victims, their families and the community affected by the horrific terrorist attacks in Christchurch," the Facebook executive added. "We will continue to work around the clock on this and will provide further updates as relevant."
The Australian Prime Minister is urging a crackdown on the live-streaming and spread of such content, while New Zealand Prime Minister Jacinda Ardern has criticized the social media networks for their part, saying that "they are the publisher, not just the postman," as reported by Bloomberg.
Stamping out videos that have gone viral on social networks is a tall order, requiring both automated tooling and a legion of human moderators to manage.
CNET: Facial recognition: Apple, Amazon, Google and the race for your face
To help stem the flow of such content, Facebook revealed new tools earlier this month which are being used, in particular, to clamp down on revenge porn and the non-consensual spread of intimate images.