TikTok earthquake: the app puts its users at risk with harmful videos


TikTok may not be such a healthy app after all, as a major study on it confirms. But what specifically are we talking about?

Every social network should guarantee full protection of users' privacy and a harmless viewing experience, meaning content that is safe for anyone to watch. Restrictions can only go so far, but when the service fails to distinguish dangerous videos, it automatically enters a vicious circle.

It seems that ByteDance has failed to adequately protect the platform – Cellulari.it

It is quite remarkable that the algorithm failed in its purpose, exposing unsuspecting users to inappropriate content that, as once happened on YouTube, began appearing on the platform when it should have been banned immediately. Fortunately, these are isolated cases that are quickly recognized: the AI systems designed for this task have evolved enough to identify such videos for prompt removal.

Doubting your health while watching TikTok: this is a serious problem

The same does not yet seem to be true of TikTok, because a recent study shows that the app's users are not so safe. We may all have encountered videos we should never have seen, but as mentioned, the research was conducted on verified data that reflects reality. What did the experts discover?

Will ByteDance be able to protect the platform from illegal content? – Cellulari.it

The Center for Countering Digital Hate (CCDH) conducted a study that found TikTok can show teenagers potentially harmful content related to suicide and eating disorders within minutes of creating an account. As little as three minutes of use was enough for this type of footage to appear.

To run the test, researchers created eight new accounts in the US, UK, Canada and Australia, all set to the minimum age of 13. These profiles paused briefly on body image and mental health videos, only to find that the app then suggested similar videos every 39 seconds over a 30-minute period.

Experts say this is a problem that should not be underestimated for one simple reason: serving this content in the first place can harm teenagers' mental health and body image. With edited and promotional videos circulating, it is very easy to mislead them, which is why it is hoped that TikTok, after proper investigation, will soon revise its algorithm in light of the obvious harm.
