Kids are even more in the bag of social media companies than we think. So many of them have ceded their online autonomy so fully to their phones that they balk even at the idea of searching the internet — for them, the only acceptable online environment is one curated by big tech algorithms, which feed them customised content.
As our children’s free time and imaginations become more and more tightly fused to the social media they consume, we need to understand that unregulated access to the internet comes at a cost. Something similar is happening for adults, too. With the advent of AI, a spiritual loss awaits us as we outsource countless human rituals — exploration and trial and error — to machines. But it isn’t too late to change this story.
This spring, I visited with a group of high school students in suburban Connecticut to have a conversation about the role that social media plays in their daily lives and in their mental health. More children today report feeling depressed, lonely and disconnected than ever before. More teens, especially teen girls and L.G.B.T.Q. teens, are seriously considering suicide. I wanted to speak candidly about how social media helps and hurts mental health. By the end of the 90-minute dialogue, I was more worried than ever about the well-being of our kids — and of the society they will inherit.
There are numerous problems with children and adolescents using social media, from mental health deterioration to dangerous and age-inappropriate content and the lacklustre efforts tech companies employ to enforce their own age verification rules. But the high schoolers with whom I met alerted me to an even more insidious result of minors’ growing addiction to social media: the death of exploration, trial and error and discovery. Algorithmic recommendations now do the work of discovering and pursuing interests, finding community and learning about the world. Kids today are, simply put, not learning how to be curious, critical adults — and they don’t seem to know what they’ve lost.
A week before meeting the students, I introduced the Protecting Kids on Social Media Act with three of my colleagues in the Senate: Brian Schatz, Democrat of Hawaii, and the Republicans Katie Britt of Alabama and Tom Cotton of Arkansas. The bill is a comprehensive attempt to protect young people on social media, prioritising stronger age verification practices and banning children under 13 from using social media altogether. But one provision of the bill was particularly alarming to this group of students: a prohibition on social media companies using the data they collect on kids (what they watch and swipe on) to build and fuel algorithms that spoon-feed individualised content back to users. These high school students had become reliant, maybe even dependent, on social media companies’ algorithms.
Their dependence on technology sounds familiar to most of us. So many of us can barely remember when we didn’t have Amazon to fall back on when we needed a last-minute gift or when we waited by the radio for our favourite songs to play. Today, information, entertainment and connection are delivered to us on a conveyor belt, with less effort and exploration required of us than ever before.
A retreat from the rituals of discovery comes with a cost. We all know instinctively that the journeys in life matter just as much as the destinations. It’s in the wandering that we learn what we like and what we don’t like. The sweat it takes to reach an outcome is what makes that outcome satisfying.
Why should students put in the effort to find a song or a poem they like when an algorithm will do it for them? Why take the risk to explore something new when their phones will just send them never-ending content related to the things that already interest them?
What the kids I spoke to did not know is that these algorithms have been designed in a way that inevitably makes — and keeps — users unhappy. According to an advisory issued by the surgeon general this year, “there are ample indicators that social media can also have a profound risk of harm to the mental health and well-being of children and adolescents.” A report by the nonprofit Center for Countering Digital Hate found that users could be served content related to suicide less than three minutes after downloading TikTok. Five minutes after that, they could come across a community promoting eating disorder content. Instagram is awash with soft-core pornography, offering a gateway to hard-core material on other sites (which are often equally lax about age verification). And all over social media are highly curated and filtered fake lives, breeding a sense of envy and inadequacy inside the developing brains of teenagers.
Social media companies know that content that generates negative feelings holds our attention longer than that which makes us feel good. It’s the same reason local news leads with the shooting or the house fire, not the local food drive. If you are a teenager feeling bad about yourself, your social media feed will typically keep delivering you videos and pictures that are likely to exacerbate negative feelings.
These kids may think they need the algorithm, but the algorithm is actually making many of them feel worse. It is not a coincidence that teenage rates of sadness and suicide increased just as algorithmically driven social media content took over children’s and adolescents’ lives.
The feedback from the students in Connecticut left me more convinced than ever that this law is vital. By taking steps to separate young people from their social media dependency and forcing them to engage in real exploration to find connection and fulfilment, we can recreate the lost rituals of adolescence that, for centuries, have made us who we are.
The role that social media has played in the declining mental health of teens also gives us a preview of what is coming for adults, with the quickening deployment of artificial intelligence and machine learning in our own lives. The psychological impact of the coming transition of thousands of everyday basic human tasks to machines will make the effect of social media look like child’s play. Today, machines help us find a song we like. Tomorrow, the machines won’t just find the song — they will create it, too. Just as we weren’t ready for the impact the social media algorithms would have on our kids, we likely aren’t prepared for the spiritual loss that will come as we outsource countless human functions to computers.
Regardless of whether the Protecting Kids on Social Media Act becomes law, we should get to work on a broader dialogue, with adults and kids from all walks of life, to determine whether we will really be happier as a species when machines and algorithms do all the work for us, or whether fulfilment comes only when humans actually do the work of being human, such as searching and discovering for ourselves.
This article originally appeared in The New York Times.