Where To Find Gore Videos

29 June 2023

By James Hardy and Christopher Stewart, Human Digital

In this guest post for Digital Dispatches, analysts from Human Digital explore the use of ‘gore sites’ – repositories of extremely violent content – to host terrorist materials. Key to these sites’ ability to host this content without moderation is the emergence of decentralised platforms like PeerTube, the subject of a recent ISD report. The authors also provide their own recommendations for researchers and regulators.

__________________________________________________________________________

Background

Gore sites serve as digital hubs for the sharing of real-life killings, torture and other forms of violence, catering primarily to ‘gore seekers’: a niche audience searching for graphic and disturbing material. However, analysis indicates that these sites also serve a secondary audience in the form of violent extremist groups.

Human Digital analyses the online communication strategies of malign state and non-state actors and monitors their adoption and exploitation of new technologies. In this guest Digital Dispatch, Human Digital analysts outline how violent extremist groups have integrated gore sites into their tradecraft, as well as how gore sites are used as both video-sharing platforms and proxy file sharing services for extremist and terrorist video content.

Introduction

Gore sites are being used by violent extremist groups to view, download and share thousands of terrorist and violent extremist videos online. This includes official propaganda outputs from the Islamic State (IS) and the far-right Atomwaffen Division, as well as livestreams capturing mass shootings perpetrated by the Buffalo, Christchurch and Halle shooters. Many videos remain accessible months after being uploaded, reflecting a lack of moderation on gore video-sharing platforms, including for terrorist content.

Human Digital analysed 10 gore sites to better understand the scale and nature of this issue, as well as how platforms, researchers and regulators might respond to combat this type of content. Sites were identified through ongoing monitoring of extreme right-wing communities on Telegram, which had either shared links to such sites or downloaded and shared videos from them. In 2022, the 10 sites received a total of 241 million visits from around the world. [1]

Summary of Key Insights
  1. All 10 sites hosted Salafi-Jihadi and/or extreme right-wing terrorist video content
  2. Each site returned terrorist content via its on-site video search bar
  3. Each site enabled visitors to download video content directly to their devices
  4. Nine of the 10 sites hosted terrorist videos uploaded before December 2022
  5. Only one of the 10 sites mentioned terrorism within its platform Terms of Service
  6. None of the sites provided a user reporting mechanism for flagging illegal content
  7. None of the sites required a user account or age verification to view content
  8. Three of the 10 sites were built using Web3 technology

Methodology

Analysis focused on direct video links shared by publicly available violent extremist Telegram channels, as well as content returned using the sites’ video search bars. Analysts did not create user accounts to access private channels, nor did they pay for additional content access. Analysts focused only on a sample of 10 sites that had been identified via violent extremist Telegram channels and so the sample does not represent an exhaustive list of all gore sites. Further analysis would be required to establish the full breadth of gore sites sharing terrorist content accessible on the surface web.

Key Findings

The analysis focused on the accessibility of the sites, the availability of terrorist material, and how and why the sites were being used by violent extremist groups. This produced the following findings:

Platform Accessibility

There were no preventative mechanisms protecting people from viewing gory and violent content when visiting any of the 10 sites (see figure 1). In the UK, Ofcom’s Video Sharing Platform (VSP) guidance requests that VSPs prevent under-18s from viewing “harmful material,” including material that is “deemed sadistic violence or torture.” An Ofcom study into risk factors that may lead to harm online details how children can experience trauma-like symptoms after accidentally or inadvertently viewing gore video content.

None of the sites required analysts to create an account to view content, nor did they request any form of age verification. Both mechanisms would help to prevent accidental or unknowing visits and limit exposure to under-18s.

None of the sites required payment, and therefore the sharing of personally identifiable information, to view content. One site offered a paid subscription model for access to certain non-public content, but this did not restrict access to the rest of the videos on the site, including terrorist material. Fully paywalled gore sites do exist, but notably these were not among those shared by the extreme right-wing channels on Telegram.

Availability of Terrorist Material

Analysts were able to locate terrorist video content within 30 seconds of first visiting the sites, using basic terrorism-related keyword searches in the sites’ video search bars. This indicated an absence, or insufficient application, of content blocklists or banned tags.
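For context, the kind of safeguard whose absence the search results suggest requires relatively little engineering effort. The sketch below is a minimal illustration of a banned-tag and keyword blocklist check applied at upload time and at search time; the function names, pipeline and blocklist entries are hypothetical and are not drawn from any of the sites analysed.

```python
# Minimal sketch of a keyword/tag blocklist check that a video-sharing
# platform could apply at upload time and at search time.
# The blocklist terms and function names are illustrative placeholders.

BANNED_TERMS = {
    # Illustrative entries only; a real deployment would draw on curated
    # lists maintained by counterterrorism specialists.
    "terrorist_propaganda",
    "attack_livestream",
}


def normalise(text: str) -> set[str]:
    """Lower-case a string and split it into comparable tokens."""
    return {token.strip().lower() for token in text.replace(",", " ").split()}


def violates_blocklist(title: str, tags: list[str]) -> bool:
    """Return True if a video's title or tags contain a banned term."""
    tokens = normalise(title) | {tag.lower() for tag in tags}
    return not BANNED_TERMS.isdisjoint(tokens)


def filter_search_query(query: str) -> str | None:
    """Reject search queries that consist of banned terms."""
    return None if not BANNED_TERMS.isdisjoint(normalise(query)) else query
```

Even a simple exact-match filter of this kind would have blocked the basic keyword searches used during the analysis, although determined uploaders can of course evade keyword lists, which is why hash-based content matching (discussed under recommendations below) is usually layered on top.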

The most common violent extremist group represented across the 10 sites was IS, with both violent and non-violent official IS propaganda evidenced on nine sites. One of the sites hosted over 400 IS videos.

Edited and unedited livestream videos of extreme right-wing terrorist attacks were available on seven of the 10 sites. A pervasive challenge for social networking, video sharing and messaging platforms is the sharing of livestream footage from terrorist attacks both in the immediate and longer-term aftermath. The livestream of the 2019 Christchurch terrorist attack (see figure 2) was evidenced on seven sites, and those of the Buffalo (2022) and Halle (2019) attacks were present on six. These videos had received hundreds of thousands of views at the time of analysis.


The considerable lifespan of terrorist content was consistent across all sites, with many videos still viewable months after first upload. Analysts could not identify any functioning user-reporting mechanism that could assist the platforms in identifying and removing illegal content (see figure 3), and only one site specifically mentioned terrorist material within a list of prohibited content in its Terms of Service. This reflects the permissiveness of the sites’ approach to what content they will host. The lack of moderation and the open Terms of Service (see figure 4) permit the sites analysed to host the gore and violence their users are searching for, whilst enabling their exploitation by violent extremist groups.

Utility for Violent Extremist Groups

All 10 sites allowed visitors to download videos for off-site storage. Human Digital analysts observed evidence of these videos being redistributed on messaging platforms used by violent extremist groups, with people posting embedded terrorist video content bearing the watermarks of the identified gore sites. Initial analysis estimated thousands of redistributed videos on platforms used by violent extremist groups.

Since January 2022, unique links leading to the 10 gore sites were shared 75 times by 30 extreme right-wing channels on Telegram. Each of these links led directly to a piece of terrorist material, with the majority of the videos still accessible as of May 2023. During the analysed time period, those same 30 channels shared 22,000 unique links to 21 non-gore VSPs such as YouTube, Odysee and BitChute.
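Link-sharing counts of this kind can, in principle, be reproduced from exported channel data. The sketch below assumes a JSON export of channel messages; the file name, field names and target-domain list are hypothetical placeholders rather than the actual data or tooling used in the analysis. It simply collects the unique URLs that point at a given set of domains.

```python
# Sketch of counting unique links to a set of target domains in exported
# channel messages. The export format, file name and domain list are
# assumptions for illustration, not the format used in the analysis.
import json
import re
from urllib.parse import urlparse

TARGET_DOMAINS = {"example-gore-site.com"}  # hypothetical placeholder

URL_PATTERN = re.compile(r"https?://\S+")


def unique_target_links(export_path: str) -> set[str]:
    """Collect unique URLs whose host matches one of the target domains."""
    with open(export_path, encoding="utf-8") as handle:
        messages = json.load(handle)  # assumed: a list of {"text": ...} dicts
    links = set()
    for message in messages:
        for url in URL_PATTERN.findall(message.get("text", "")):
            host = urlparse(url).netloc.lower()
            if any(host == d or host.endswith("." + d) for d in TARGET_DOMAINS):
                links.add(url.rstrip(").,"))
    return links


if __name__ == "__main__":
    found = unique_target_links("channel_export.json")  # hypothetical file
    print(f"{len(found)} unique links to target domains")
```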

Three of the 10 gore sites were built using Web3 technology, specifically Framasoft’s PeerTube software, which ISD has previously examined as part of the emerging threat from federated, decentralised platforms. The software allows users to create a “homemade YouTube” and acts as a self-hosted, federated and decentralised VSP. The result is that the financial and knowledge barriers to creating a VSP are lowered, making it easier to create or recreate VSPs designed for niche audiences. The three gore sites in question had similar user interfaces and functions. [2]
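One practical consequence of a site being built on PeerTube is that it exposes PeerTube’s standard public REST API, which researchers can use to enumerate publicly listed content without creating an account. The sketch below queries a hypothetical instance’s search endpoint; the instance URL and search term are placeholders, and the endpoint and field names follow PeerTube’s documented API as we understand it.

```python
# Sketch of querying a PeerTube instance's public video search API.
# The instance URL and search term are hypothetical placeholders; the
# endpoint and response fields follow PeerTube's documented REST API.
import requests

INSTANCE = "https://peertube.example.org"  # hypothetical instance


def search_videos(term: str, count: int = 25) -> list[dict]:
    """Return metadata for publicly listed videos matching a search term."""
    response = requests.get(
        f"{INSTANCE}/api/v1/search/videos",
        params={"search": term, "count": count},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("data", [])


if __name__ == "__main__":
    for video in search_videos("example term"):
        print(video.get("name"), video.get("views"))
```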

Conclusion & Future Steps

The findings provide an overview of the threat from publicly accessible gore sites and how the presence of terrorist videos on these sites reflects a gap in current content moderation efforts. Until the scale of the issue is fully understood by counterterrorism researchers, and regulators are equipped with the information needed to respond, gore sites will continue to be used by violent extremist groups as trusted sources for watching, downloading and sharing terrorist material.

Based on the analysis and findings, Human Digital outlines the following recommendations to mitigate the exploitation of gore sites for sharing terrorist content.

Technical Improvements & Governance

Gore site owners that do not want violent extremist groups to exploit their platforms, or that are compelled by regulators to better protect the public, have technical options available to them. For example, Terms of Service could include terrorism as prohibited content. If applied alongside user reporting mechanisms, content blocklists, banned tags and private access to content, this could diminish the utility of the sites to violent extremist groups. The integration of regulatory technology such as content-matching services would also add robustness to the identification and removal of terrorist content, though financial constraints may make this less likely. Additionally, analysts observed targeted advertisements on four of the sites, the majority of them for pornography or gambling; none of these sites required user payments to access content.
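As an illustration of what content matching can mean in practice at its simplest, the sketch below compares the cryptographic hash of an uploaded file against a list of known hashes, for example hashes shared through industry hash-sharing databases. The file paths and hash set are placeholders; production systems typically rely on perceptual rather than exact hashing so that re-encoded or trimmed copies are also caught.

```python
# Minimal sketch of exact hash matching against a known-content hash list.
# File paths and the hash set are placeholders; real deployments generally
# use perceptual hashing so that re-encoded or edited copies still match.
import hashlib
from pathlib import Path

KNOWN_HASHES: set[str] = {
    # Hypothetical SHA-256 digests of previously identified terrorist videos.
}


def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_content(path: Path) -> bool:
    """Return True if the file's digest matches a known hash."""
    return sha256_of(path) in KNOWN_HASHES
```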

Further Research

Researchers can apply the ethical and analytical methodologies and principles used to study extremist content on non-gore video-sharing platforms, though an additional duty of care may be required given the types of content researchers will encounter on gore sites. Such work can help to better establish the scale and nature of the threat and add to the limited available literature on the relationship between gore sites and terrorist content.

New Category for UK and EU Regulators

Regulators in the UK and EU could address gore sites as a specific, distinct challenge within or adjacent to video sharing platforms. There is currently no specific mention of ‘gore’ or ‘shock’ sites in Ofcom’s VSP guidelines, nor in the current iterations of the UK Online Safety Bill or the EU Digital Markets Act. Ongoing research into the scale of the issue and into effective countermeasures is needed, both to inform regulators of the threat from terrorist exploitation of these sites and to help develop a proportionate regulatory response. Proactive engagement may be necessary with the Web3 services being used to create gore sites; this could become more of an issue should such services become more prevalent. Regulators will need to understand how content is stored and accessed by those sites and how, or whether, the content can be removed.

US Regulatory Challenge

There should be broad bipartisan support to adopt a statute regulating gore content if agreement can be reached on how to do so from both a legal and practical standpoint. While some believe the First Amendment prevents regulation, there are exceptions to First Amendment protections when the speaker “means to communicate a serious expression of an intent to commit an act of unlawful violence to a particular individual or group of individuals.” [3] Identifying instances in the tech platform environment where content would not be protected is also challenging. US regulators should at the very least focus on limiting platform accessibility for underage users while gaining a better understanding of terrorist exploitation of these sites and how to prevent it. Moreover, it is possible that terrorist propaganda, which by its nature advocates violence against out-groups, may fall within this carve-out of the First Amendment and provide an avenue for regulation.

End Notes

[1] SimilarWeb data for non-unique user visits from 1 January to 31 December 2022.

[2] For further insights, ISD has written on the exploitation of PeerTube by violent extremist groups: Dominic Hammer, Lea Gerster and Christian Schwieter (2023), Inside the Digital Labyrinth: Right-Wing Extremist Strategies of Decentralisation on the Internet & Possible Countermeasures, published 10 February 2023.

[3] According to the Supreme Court in Giboney v. Empire Storage & Ice Co.
