'Entire attack livestreamed on Facebook': Oct 7. victim relatives file lawsuit against Meta

  • Writer: WGON
  • Aug 6
  • 3 min read

A motion for certification of a class action lawsuit seeking more than NIS 4 billion ($1.15 billion) was filed on Monday against parent company Meta by October 7 victims, their families, and users of Facebook and Instagram.


The motion charged that the social media giants played a part in the atrocities that took place on that Saturday and violated users' privacy rights by bombarding them with sensitive content, harm the motion argues is inseparable from the broader attack.


“The horrific images of terror and pain are forever ingrained in the minds of these families as the last moments their loved ones had on earth,” reads the motion, adding that the atrocity compounds every day that the footage remains publicly accessible.


Filed in the Tel Aviv District Court by representatives of the victims and families, the motion for certification must first be reviewed by the court to determine its suitability for class action.


The court will assess the strength of the case's arguments, including whether a clearly defined group was affected and whether damages can be proven, among other conditions. If the court grants the motion, the case will proceed as a class action, and the court will eventually rule.


The motion was signed by the Idan family from Kibbutz Nahal Oz, whose daughter Maayan, 18, was killed by terrorists on October 7 after being held hostage in her own home for hours.


Relatives viewed the horror of the October 7 massacre through happenstance social media scrolling


“Her horrific murder and the entire attack was livestreamed on Facebook for the whole world to see,” reads the motion, adding that the family learned of what had happened to Maayan through a happenstance scroll on Facebook.


Also signed onto the motion is Stav Arava, who found out, through the Facebook app, that his family living in Nahal Oz was being held hostage, and that his brother Tomer was forced at gunpoint – and while being broadcast live – to convince his neighbors to open their doors to the terrorists waiting outside. Tomer was killed by terrorists on October 7.


Another signatory is Mor Bayder, who learned of the murder of her grandmother, Bracha Levinson, after terrorists filmed the act and then posted it to Levinson's Facebook page.


“My grandmother, a resident of Kibbutz Nir Oz all her life, was murdered yesterday in a brutal murder by a terrorist in her home... A terrorist came home to her, killed her, took her phone, filmed the horror, and published it on Facebook. This is how we found out,” Bayder wrote at the time.


Also signed on is the mother of a female hostage, who learned of her daughter’s fate on Instagram – while the entire event was being broadcast live – as well as a mother and her 14-year-old daughter, who were shocked and traumatized by the footage that greeted them on Facebook on that Saturday morning.


Many of the livestreams of the horrific acts of torture and murder remained live and available for viewing for several hours, with no outside intervention. The motion argues that these public digital spaces purport to be safe for widespread public consumption, but that proved false on October 7.


Meta issues response to lawsuit


According to Calcalist, Meta stated in response, “Our hearts go out to the families affected by Hamas terrorism. Our policy designates Hamas as a proscribed organization, and we remove content that supports or glorifies Hamas or the October 7 terrorist attack.”


CyberWell, an independent nonprofit focused on combating online antisemitism and Holocaust denial on social media, released a statement on the lawsuit.


"The case represents a pivotal moment in the ongoing conversation about the responsibilities of social media platforms in moderating content that incites terror and violence," the NGO stated.


"What should have been a turning point, a moment to prioritize user safety over platform engagement, was tragically missed. As was the opportunity to re-examine laws and for platforms to invest in robust security measures to prevent similar events in the future," they added.


"In the nearly two years since, social media has hosted echo chambers normalizing pro-terror and violent content, left unchecked by the very generative AI tools capable of effective, large-scale content moderation," they continued.


"This landmark case raises urgent questions, not only about platform liability when social media and digital services become weapons in the hands of terrorists, but also about the new reach of terrorism in the digital universe," they commented.
