
Microsoft buying TikTok could lead to problems monitoring social media content



Microsoft CEO Satya Nadella leaves the Elysee Palace after a meeting with French President Emmanuel Macron in Paris on May 23, 2018.

Aurelien Morissard | IP3 | Getty Images

If Microsoft were to complete a TikTok acquisition, it would gain a company with a lot of potential for increased advertising revenue.

But with such a purchase, Microsoft would also get a whole new problem.

Microsoft announced on August 2 that it was in talks to buy TikTok's business in the US, Australia and New Zealand, with a deadline to complete the deal by September 15. The company is currently owned by Chinese technology company ByteDance and has become a target of the Trump administration and other governments over privacy and security concerns. Trump also signed an executive order last week that would ban U.S. companies from doing business with TikTok, but it is unclear how that order could affect a potential acquisition by Microsoft.

In the US, TikTok has grown to more than 100 million monthly users, many of whom are teenagers and young adults. Those users tune in to TikTok to watch full-screen videos uploaded to the app by others. These videos often feature lip-syncing to songs, flashy editing and eye-catching visual effects.

To say that TikTok represents a business that is radically different from the enterprise software that Microsoft specializes in would be an understatement.

For Microsoft, TikTok could become a powerhouse for advertising revenue, but that potential is not without risk. Like other social apps, TikTok is a target for all kinds of problematic content that needs to be addressed. This includes baseline issues such as spam and scams, but more complex types of content could also become a headache for Microsoft.

This could include content such as misinformation, clutter, conspiracy theories, violence, prejudice and pornography, said Yuval Ben-Itzhak, CEO of Socialbakers, a social media marketing company.

“Microsoft will have to deal with all of this and will be blamed and criticized when they fail to do so,” Ben-Itzhak said.

Microsoft declined to comment, and TikTok did not respond to a request for comment on this story.

These challenges can be overcome, but they require large investments of capital and technical skill, two things Microsoft is able to provide. And Microsoft already has some experience when it comes to moderating online communities.

In 2016, Microsoft bought LinkedIn for $26.2 billion, and although the career- and professional-focused service does not have the scale of content problems that its peers deal with, it is still a social network. Microsoft has also run Xbox Live, the online gaming service, since its launch in 2002. Online games and social media are different animals, but they share similarities.

“Fighting misinformation will have to be a critical mission priority. Microsoft will be new to this as it has no experience managing a high-profile social network at this scale,” said Daniel Elman, an analyst at Nucleus Research. “That said, if any company can acquire or quickly develop the skills and capabilities required, it is Microsoft.”

But these are no small challenges, and these kinds of problems have become major issues for TikTok rivals.

For example, Facebook was accused of not doing enough to stop Russian fake news and misinformation ahead of the 2016 US election, and four years later, the company still faces constant criticism over whether it is doing enough to keep that kind of content off its services. In July, hundreds of advertisers boycotted Facebook for failing to contain the spread of hate speech and misinformation.

Twitter, meanwhile, began to lose key users, such as comedian Leslie Jones, after the company allowed harassment to run rampant on its social network. The company has spent the past few years building features to reduce the amount of hateful content that users have to deal with in their mentions.

These types of issues have already surfaced on TikTok. Far-right activists, white nationalists and neo-Nazis have previously been found on the app, according to reports from Motherboard and the Huffington Post, which identified users who had already been banned by Facebook and Twitter.

TikTok’s potential content issues, however, may be more similar to those of Google-owned YouTube. Both services depend on user-created videos for content, and both rely heavily on algorithms that learn from one user’s behavior to determine what kind of content to suggest to other users.

“The issue with algorithm-based content feeds is that they generally degenerate toward the most volatile content, which drives the highest engagement,” said Mike Jones, managing partner of a Los Angeles-based venture capital firm. “There is no doubt that as creators further understand how to drive extra views and attention on the platform through algorithm manipulation, the content will escalate, and it will be a constant battle that any owner will have to take on.”
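To make the dynamic Jones describes concrete, here is a minimal, purely illustrative sketch. It is not TikTok's or YouTube's actual ranking code; the videos, scores and penalty weight are invented. It shows how a feed ranked only on predicted engagement tends to surface the most volatile items, and how down-weighting that content changes the mix.

```python
# Toy illustration (not any real platform's system) of why ranking purely by
# predicted engagement tends to amplify the most provocative content.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    engagement_score: float  # hypothetical predicted engagement (watch time, likes)
    volatility: float        # hypothetical "provocativeness" measure


def rank_feed(videos: list[Video], top_k: int = 3) -> list[Video]:
    """Rank purely by predicted engagement, with no content-quality penalty."""
    return sorted(videos, key=lambda v: v.engagement_score, reverse=True)[:top_k]


def rank_feed_moderated(videos: list[Video], top_k: int = 3, penalty: float = 0.5) -> list[Video]:
    """Same ranking, but volatile content is down-weighted before sorting."""
    return sorted(
        videos,
        key=lambda v: v.engagement_score - penalty * v.volatility,
        reverse=True,
    )[:top_k]


if __name__ == "__main__":
    catalog = [
        Video("dance challenge", 0.62, 0.10),
        Video("cooking hack", 0.55, 0.10),
        Video("outrage bait", 0.90, 0.90),      # provocative clips often score highest on raw engagement
        Video("conspiracy clip", 0.85, 0.95),
    ]
    print([v.title for v in rank_feed(catalog)])            # volatile items dominate the feed
    print([v.title for v in rank_feed_moderated(catalog)])  # the penalty changes the mix
```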

Another similarity with YouTube is the amount of content available on TikTok that is focused on minors. Although TikTok does not allow users younger than 13 to post on the app, many of its users are between the ages of 13 and 18, and their content can be easily viewed by others.

For YouTube, the challenge of hosting content involving minors became a major issue in February 2019, when Wired reported on a network of pedophiles who were using the video service's recommendation features to find videos of minors in their underwear or otherwise exposed.

Given the number of young users on TikTok, it is not hard to imagine that Microsoft could end up with a problem similar to Google's.

YouTube has also become a cesspool for conspiracy theories, such as the idea that the Earth is flat. This, too, could become a problem on TikTok, and there is already evidence of it: the conspiracy theory that Wayfair uses its furniture listings for child trafficking gained momentum on TikTok this year.

To address these issues, Microsoft will need to invest a tremendous amount of time and money in content moderation.

For Facebook, this problem has been addressed through a two-pronged strategy. The company continuously invests in artificial intelligence technology that can detect malicious content, such as pornography, violence or hate speech, and remove it from its services before other users ever see it.

For more complex content, Facebook also relies on thousands of human moderators. These moderators often work for Facebook as contractors through third-party vendors, and they are tasked with reviewing thousands of pieces of content a day in harsh working conditions, at the risk of developing PTSD. Those working conditions have been criticized on many occasions, creating public relations headaches for Facebook.
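The two-pronged approach can be pictured as a triage pipeline: automated removal for high-confidence detections, human review for borderline cases. The sketch below is a rough, hypothetical illustration, not Facebook's or Microsoft's actual system; the classifier stub, thresholds and labels are invented for the example.

```python
# Rough sketch of a two-tier moderation triage: automated removal for
# high-confidence detections, human review for borderline cases.
# The classifier, thresholds and labels are hypothetical.
from typing import NamedTuple


class Post(NamedTuple):
    post_id: str
    text: str


def classify(post: Post) -> float:
    """Placeholder for an ML model returning the probability a post violates policy."""
    # A real system would call a trained model; this stub flags one obvious keyword.
    return 0.99 if "scam" in post.text.lower() else 0.40


def triage(posts: list[Post], auto_remove_at: float = 0.95, review_at: float = 0.30):
    removed, review_queue, published = [], [], []
    for post in posts:
        score = classify(post)
        if score >= auto_remove_at:
            removed.append(post)        # removed before other users ever see it
        elif score >= review_at:
            review_queue.append(post)   # routed to human moderators
        else:
            published.append(post)
    return removed, review_queue, published


if __name__ == "__main__":
    sample = [Post("1", "Totally legit scam, send money now"), Post("2", "My cat video")]
    removed, review_queue, published = triage(sample)
    print(len(removed), len(review_queue), len(published))
```

The point of the split is cost and scale: automated filtering handles the clear-cut bulk, while the ambiguous middle band is what generates the large human-moderation workloads described above.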

If Microsoft ends up with TikTok, it will have to build similar AI technology and assemble a network of human moderators, all while avoiding negative headlines about poor working conditions.

TikTok offers Microsoft a huge amount of potential in digital advertising, but along with all that upside come many new challenges and responsibilities that the company will have to take on.

