In what could be an important step toward protecting children from potential harm online, the California Legislature is currently debating an amended bill that would enable parents, as well as the state's attorney general, to sue social platforms over algorithms and systems that get children addicted to their apps.
As reported by The Wall Street Journal:
“Social-media companies such as Facebook parent Meta Platforms could be sued by government attorneys in California over features that allegedly harm children through addiction, under a first-in-the-nation bill facing a key vote in the state Senate on Tuesday. The measure would allow the state attorney general, local district attorneys, and city attorneys in four of California’s largest cities to sue social media companies, including Meta – which also owns Instagram – as well as TikTok and Snapchat, under the state’s law governing unfair business practices.”
If passed, the bill could add a range of new complexities for social media platforms operating within the state, and limit the ways in which algorithmic amplification can be applied to users under a certain age.
The ‘Social Media Platform Duty to Children Act’ was initially proposed early last month, but it has since been amended to improve its chances of passing through the legislative process. The bill now includes a ‘safe harbor’ clause, which would exempt a social media company from liability if it makes changes to remove the addictive features of its platform within a certain period of time.
While those ‘addictive’ features aren’t specified, the bill is primarily aimed at social platform algorithms, which are designed to keep users as active as possible in each app by responding to each person’s individual usage behavior, and which hook them in through ever more refined presentations based on how they respond to their content feeds.
Which can certainly have a negative impact. As we’ve repeatedly seen play out across social media, the problem with algorithmic amplification is that it’s based on a binary process that makes no judgment about the actual content of the material it amplifies. The system simply responds to what people click and comment on – and what do people click and comment on more than anything else? Emotionally charged content: posts that take a divisive, partisan angle, and updates that provoke anger and outrage are likely to trigger the strongest response.
This is part of the reason for increased societal division overall, as online systems are designed to maximize engagement, which in turn incentivizes more divisive takes and partisan positioning as a means to boost shares and reach.
That’s one major concern with algorithmic amplification. On the other hand, as noted in this bill, social platforms are also getting better and better at keeping you scrolling, with TikTok’s ‘For You’ feed in particular having almost perfected the art of drawing users in and keeping them in the app for hours at a time.
In fact, TikTok’s own data shows that users spend around 90 minutes per day in the app, on average, with young users especially compelled by its endless flow of short clips. That’s great for TikTok, and it underlines the platform’s sophistication in building systems that align with users’ interests. But the bill essentially asks the question: ‘Is this actually good for young people online?’
Already, some nations have moved to restrict young people’s internet use, with China enforcing restrictions on gaming and live-streaming, including the recent introduction of a ban on people under 16 watching live streams after 10pm.
Italy’s parliament has enacted stronger laws to protect minors from cyberbullying, while the evolution of EU privacy regulations has seen a new range of protections implemented for young people and their data online, changing the way digital platforms operate.
Even in the United States, a bill proposed in Minnesota earlier this year would completely ban the use of algorithms to recommend content to anyone under the age of 18.
And given the range of investigations showing how social platforms can be harmful to young users, it makes sense for more legislators to pursue increased regulatory action in this area – though such cases may prove difficult to litigate, given the actual, technical complexity of establishing a direct connection between algorithmic amplification and addiction.
But it’s an important step, which will no doubt force the platforms to re-think their systems, and could lead to better outcomes for all users.