Regulating social media content as more deaths are livestreamed


More than 145,000 people have watched the final moments of a Minneapolis man’s life – it unfolded on Facebook and they even heard the gunshots that killed him.

Thursday, just after noon, Chue Yang started a Facebook live stream of himself and a woman in the midst of an hours-long standoff with federal agents. Throughout the video, while holding a shotgun, Yang is heard speaking with an apparent negotiator. Towards the end of the 15-minute video, Yang is seen tied up with a woman as they walk outside – gunshots are then heard and the video ends shortly after that.

This violent and sensitive livestream adds to a growing number of similar videos shared on social media platforms worldwide. Earlier this month, a gunman livestreamed himself shooting and killing five people at a Louisville bank.

RELATED: Louisville bank employee livestreamed attack that killed 5

Jane Kirtley, a media ethics and law professor at the University of Minnesota, says that ever since people have been able to broadcast themselves in real time, it has been a challenge for social media platforms.

“[Social media platforms] want to be an immediate source of instant information because that’s valuable as a general rule for us,” Kirtley said, adding: “But, they have to have a way to police offensive conduct.”

While challenging, she says there are also benefits to the powerful tool.

“It can make people more accountable, it can tell us what’s happening in conflicts around the world, it can deliver things in real-time, and that’s all positive,” Kirtley said. “But, the dark side is that when people, for whatever reason, decide to use these tools to depict really disturbing images, it’s very difficult for the social media platforms to take that off in real-time.”

As of Friday night, Meta, the parent company of Facebook and Instagram, had not responded to our request for an interview to discuss its livestream and content policies. But according to its website, Meta has added language that addresses videos like the one in question.

Meta states it will remove:

Videos of violent death of humans where the violent death is not visible in the video, but the audio is fully or partially captured and the death is confirmed by either a law enforcement record, death certificate, Trusted Partner report, or media report.

As for when, or if, Yang’s video will be taken down, Kirtley says it all falls on Meta.

“We really do depend upon the social media companies to make their own judgments about what content stays up and what content goes down because there is no government regulation of the internet in this context,” Kirtley said.