
Investigations are ongoing after a French streamer died during a mammoth 10-day livestreaming challenge on Kick.

It has been reported that Raphaël Graven, who went by Jean Pormanove online, often endured humiliation and mistreatment as part of challenges on the platform.

Fellow streamers tried to wake Graven during the stream; when he remained unresponsive on a mattress, they abruptly ended the broadcast.

Highlights
  • French streamer Raphaël Graven died during a 10-day livestream challenge amid mistreatment on the Kick platform.
  • Graven’s death prompted judicial investigations and criticism over lax content moderation on streaming platforms.
  • Kick banned the co-streamers involved and pledged to review French content after Graven’s fatal livestream.

    Raphaël Graven was found dead on Monday

    Image credits: jeanpormanove

    Prosecutors confirmed Graven, 46, was found dead on Monday in the village of Contes, near Nice.

    “At this stage, there is nothing suspicious, interviews are underway, and an autopsy will be performed,” the prosecutor’s office told Le Parisien.

    A judicial investigation into his death is underway, and the incident has been referred to online regulators.


    Clara Chappaz, France’s Deputy Minister in charge of Artificial Intelligence and Digital Affairs, described Graven’s death and the violence he endured as “an absolute horror.”

    “Jean Pormanove was humiliated and mistreated for months live on the Kick platform. A judicial investigation is underway,” she said in a statement on X.

    Image credits: jeanpormanove

    “The responsibility of online platforms regarding the dissemination of illicit content is not optional: it is the law. This type of failure can lead to the worst and has no place in France, Europe, or anywhere else,” Chappaz added.

    A second investigation has also been launched by Nice judicial police into the alleged actions of the streamers involved in challenges with Graven, Chappaz said.

    Kick is similar to other platforms such as Twitch and YouTube Live.

    It allows people to livestream content and interact with other users, and lets creators keep up to 95% of their subscription earnings.


    But it has come under fire for content shown on its platform, with critics alleging its looser moderation policies make it easier for harmful content—such as hate speech, sexual material, harassment, and violent or abusive behavior—to be broadcast.

    Image credits: Jakub Porzycki/NurPhoto via Getty Images

    Clips of previous streams involving Graven have resurfaced online following his death, and they show him being hit by other men, covered in oil, and shot at with paintballs.

    It is not clear whether those scenes were staged or if Graven was forced to participate.

    Graven was a popular influencer on the platform, with around 1 million followers across his social media accounts. He was known for taking part in extreme challenges as well as creating gaming content.

    Social media users have claimed that he lay dead on the stream for around 50 minutes before Kick staff were alerted. Those claims have not been independently verified.


    It is alleged that during the 10-day challenge, Graven was sleep-deprived and physically tormented.

    Owen Cenazandotti, one of Graven’s fellow Kick streamers, asked social media users not to share the final clip of Graven.

    “I ask you all to respect his memory and not share the video of his last breath in his sleep. My brother, my sidekick, my partner, six years side by side, without ever letting go, I love you, my brother, and we will miss you terribly,” a translated message posted on his Instagram story read.

    Kick Français released a statement on X sending its “heartfelt condolences” following Graven’s death.

    “All co-streamers who participated in this live broadcast have been banned pending the ongoing investigation,” the statement read.

    “We are committed to fully cooperating with the authorities in this process. Additionally, we have terminated our collaboration with the former French social media agency and are undertaking a comprehensive review of our French content.


    “Our priority is to protect creators and ensure a safer environment on Kick.”

    Social media giants are facing growing scrutiny over how they handle harmful content

    The incident comes amid growing global scrutiny of how social media companies handle harmful content.

    In Europe, the EU’s Digital Services Act now mandates greater accountability for online platforms, but critics argue harmful content continues to slip through.

    In the U.S., TikTok is being taken to court by the Social Media Victims Law Center, which is acting on behalf of the parents of four British children who died in 2022.

    Recent studies have found harmful content being pushed to young people

    Image credits: Photo by Yan Krukau

    The families of Isaac Kenevan, 13, Archie Battersbee, 12, Julian Sweeney, 14, and Maia Walsh, 13, say their children died while participating in a viral blackout challenge.

    The challenge encourages participants to hold their breath, or use other means to choke themselves, until they pass out from lack of oxygen.


    The wrongful death lawsuit alleges that TikTok and its parent company ByteDance “pushes dangerous prank and challenge videos to children based on their age and location in order to increase engagement time on the platform to generate higher revenues.”

    It further alleges that TikTok “works to discount credible reports of children being exposed to and dying because of blackout and similar challenge videos on the platform.”

    Image credits: Photo illustration by Spencer Platt/Getty Images

    TikTok says that hashtags or searches related to the blackout challenge have been banned on its platform since 2020, and it directs anyone searching for those to its safety center.

    It also stated that videos involving dangerous content or challenges are banned on its platform, with moderators working to remove such content even before it is reported.

    The lawsuit contends that, under U.S. law, TikTok can be held legally responsible for harm caused to users by product defects or by a failure to provide appropriate warnings about the platform’s use.


    Deaths linked to the blackout challenge are still being reported in 2025, with a 12-year-old boy named Sebastian losing his life in the UK in June.

    The family of Sebastian, from West Yorkshire, says he died on June 27 after participating in the challenge.

    “Talk to your children about what they do online. Ask what they watch, who they talk to, what inspires them,” a statement on a fundraising page for Sebastian read.

    “Be present. Don’t assume: ‘My child would never do that.’ The online world can be as dangerous as the real one — sometimes even more so.”

    Social media giants are being criticized for showing harmful content

    Other social media giants such as YouTube and Meta have faced similar criticism over online content that can cause harm.

    In the UK, a recent study conducted by the Molly Rose Foundation found that despite new online safety measures implemented to protect children, social media companies were still pushing harmful videos.


    It used fake accounts, set up so the platforms’ algorithms would treat the user as a 15-year-old girl, to engage with posts relating to suicide, self-harm, and depression.

    On Instagram Reels, 97% of the recommended videos were harmful, while on TikTok 96% of the videos fell into this category.

    The charity further noted that 55% of the harmful videos recommended on TikTok’s For You page referenced suicide and self-harm ideation.

    A study conducted by the Center for Countering Digital Hate (CCDH) earlier this year also found concerning content being pushed on YouTube.

    It found that one in four videos recommended to a simulated 13-year-old girl’s account contained harmful eating disorder content.

    The CCDH said that YouTube failed to act on 74% of harmful eating disorder videos within two weeks of the videos being reported.

    “Every day this content is allowed to circulate, accelerated by powerful recommendation engines, more and more children are put at risk,” CCDH CEO Imran Ahmed said.


    “It is time to put an end to a perverse system that encourages profit at the expense of our children.”

    Recent tragedies, from Graven’s death to fatal blackout challenges, underscore the devastating consequences of online harm.

    Experts warn that social media giants often fuel engagement through dangerous content and too often act only after tragedy has already struck.

    Critics and online safety advocates agree on one point: platforms must be held accountable, and users, especially the most vulnerable, must be protected.