Algorithms Should Be Optional
Children and
teens are protected by federal regulation from many of the things that might
harm them: alcohol, nicotine, guns, and more. In the domain of harmful
social media algorithms, however, we have not yet applied this logic, and it
seems backward that we haven't.
Kids can’t
go out and buy alcohol because we understand that their potential for harm is
much greater than an adult’s, and their decisions about vodka are likely to be
far less informed than an adult’s.
Another
major reason is that teens are more susceptible to forming bad habits.
They are in a phase where exploring the limits of parental and adult control in
their lives is the norm, and their ability to weigh long-term consequences is
not fully developed. When combined with substances or activities that have long-term
negative impacts on their health, this is a recipe for chaos—which is why we
don’t want kids and teens to buy alcohol, or assault rifles, or cigarettes.
It’s even illegal for them to use some of these items at all.
Social media—specifically social
media feeds that use AI algorithms designed to increase engagement—comes with
none of these protections for teenagers and wholly inadequate ones for kids.
These feeds have many of the dangerous features of harmful things that we keep
away from younger people: they’re addictive, are increasingly shown to lead to
worse life outcomes, and are especially enticing to vulnerable youth. Social
media may be an important part of life for many teens, but the purposely
engaging AI algorithms that come with it are too harmful to remain
unregulated.
As a young adult, I’ve seen the problems that these algorithms cause with my own eyes. I got Instagram to stay connected with high school friends and for entertainment. At first, that’s pretty much all it was: you could see posts that you had chosen to see, and that was basically it. Nowadays, however, you can scroll endlessly through content curated specifically for you that keeps drawing you in. This pattern is wreaking havoc on many more lives than just mine; I ended up deleting Instagram (giving up the social connection it brings at its best) because the algorithm was just too destructive. Many are unable or unwilling to make this change, and most of the time it is not their fault: it is the fault of powerful, expensive machinery exploiting their juvenile psyche.
I think it would be much better if companies made these algorithmic feeds optional. Social media companies
gave us a way to connect, but now the cost of using it to stay connected—especially
for younger populations—is far too high. Let’s protect kids and teens from this
harm as we do from other, similar harms (Linton, 2023, unpublished).
I saw additional evidence for this in a recent Freakonomics episode discussing a new finding in economics: people will consume a product (in this case, social media) that they would prefer not to consume, yet would gladly give it up if everyone else had to give it up too. This suggests that people might not like social media for its own sake, but rather face social costs for not participating, even against their better judgment. While this is a problem for more than just young people, their developing brains make them especially susceptible to it.
When we start handing these "social media traps" to kids, they can be drawn into behaviors they themselves admit are harmful and that are difficult to escape, even in adulthood. These algorithms, while not necessarily the entirety of the problem, contribute negatively to the health and wellbeing of children all the way into adulthood.