Facebook has refused to remove graphic images of dead Thai children, including a baby, because it says the posts do not violate its Community Standards.
The distressing images were included in two separate posts shared in a Facebook group on May 11.
One of the images shows the naked body of a baby that had been dumped in a rubbish bin in central Pattaya.
The other images show the lifeless bodies of two children, aged nine and ten, who drowned after getting into difficulty while playing in water in Udon Thani.
Both images were posted in the “ข่าวจาก อาสากู้ภัย THAILAND” (“News from Rescue Volunteers THAILAND”) group, which shares graphic and gruesome images from road traffic accidents and crime scenes throughout Thailand.
Both images of the dead children have been reported to Facebook, but the social media platform refused to remove them, saying they do not violate its “Community Standards”, even though those standards say that graphic images should carry a warning.
“We’ve looked over the post, and although it doesn’t go against any of our specific Community Standards, you did the right thing by letting us know about it,” Facebook said.
Instead of removing the images, Facebook advises users to block the person who posted them.
But if photos of dead children do not violate Facebook’s Community Standards, what does?
We’ve reached out to Facebook for comment.
Last month, Facebook was criticised for its failure to remove two videos posted to Facebook Live which showed a Thai man murdering his 11-month-old daughter in Phuket.
The videos showed Wuttisan Wongtalay tying a rope to his daughter’s neck before dropping her from the roof of a deserted hotel in Phuket.
The videos remained on the father’s Facebook page for almost 24 hours before they were taken down.
One of the videos had been viewed more than 100,000 times, while the other was viewed over 258,000 times.
It took Facebook a further five hours to remove the videos, even after a request from Thailand’s Ministry of Digital Economy and the intervention of government officials.
In response to another incident, in which a man murdered a senior citizen in Cleveland, Ohio, and then uploaded the footage to his Facebook page, CEO Mark Zuckerberg admitted the firm had “a lot more to do” on how it monitors and removes graphic content.
Following the incident in Phuket, Facebook announced it would hire 3,000 moderators to review and remove questionable content more quickly.
Source: The Nation