Several global brands pulled advertising from YouTube after reports that ads were being shown alongside inappropriate videos of children.
Mars Inc., Deutsche Bank AG, Adidas AG and others pulled their ads after The Times in the U.K. reported Friday that the brands’ advertising was appearing alongside videos of scantily dressed children that carried inappropriate sexual comments from viewers. Money from the advertising is split between YouTube owner Alphabet Inc. and those who publish the videos.
Earlier this week, BuzzFeed also reported on YouTube videos, some with millions of views, showing children in disturbing situations, including being restrained with ropes or tape and crying. The children are often in revealing clothing, BuzzFeed reported.
“We take this matter very seriously and suspended the advertising campaign as soon as we became aware of it," Deutsche Bank said in a statement. "As always, our digital marketing agency applied filters to prevent our advertising appearing alongside inappropriate content and we are investigating how the situation arose.”
Mars said it won’t advertise with Google "until we have confidence that appropriate safeguards are in place."
“We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content," Mars said in a statement. "We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally."
The reports show the problems YouTube, Facebook Inc., Twitter Inc. and other online platforms have policing user-generated content published to their sites. They illustrate how features that made the companies immensely popular globally -- as open systems where anybody can share -- are being subverted and causing daunting new challenges such as Russia’s use of the sites to influence elections.
YouTube faced another advertiser revolt earlier this year when ads were being shown next to jihadi extremist videos. YouTube’s advertising system is largely automated, limiting the control brands have over where their ads appear.
YouTube said in a statement that it’s working to improve safeguards to block this kind of content involving children, including employing thousands of people who review content flagged by users or by an automated system. The company added that the material highlighted in The Times, including a video of a young girl in a nightie with 6.5 million views, is different from child sexual abuse imagery.
"There shouldn’t be any ads running on this content and we are working urgently to fix this," YouTube said. "Over the past year, we have been working to ensure that YouTube is a safe place for brands. While we have made significant changes in product, policy, enforcement and controls, we will continue to improve."