As Apple basks in the afterglow of new products, parents demand change – channelnews

A study has found that one in three children has been contacted by a stranger with harmful intent, or been exposed to nudity or pornography, through a downloaded app, prompting parents and caregivers to demand that Apple CEO Tim Cook make a change.

As Apple continues to bask in the afterglow of today’s highly anticipated launch of new products including the iPhone 16 and Apple Watch Series 10 in Cupertino, California, the survey has spawned a petition from thousands of people who are fed up with children inadvertently accessing inappropriate content or being approached by strangers.

In a petition launched by US advocacy groups Parents Together and Heat Initiative, more than 7,800 parents, guardians and others have called on Cook to “implement independent, third-party review and verification of the age ratings of apps in the Apple App Store,” the New York Post reported.

And while the story may have originated in the US, it is one being repeated around the world as those responsible for children struggle to protect them from corporate greed and malicious actors. App stores know few borders, so it doesn’t matter where you are.

Survey by Parents Together and Heat Initiative.

The survey, which you can read here, “found that parents have significant concerns that devices like tablets and smartphones pose risks to children due to potential exposure to inappropriate content and unsafe interactions with peers, adults and strangers online – and they shared the negative experiences their own children have had.”

“Parents especially want smartphone and tablet manufacturers and social media platforms to invest more in protecting children from harmful content, applications and interactions on their devices.”

The survey was conducted among 1,007 parents of children in kindergarten through 12th grade with access to a smartphone or tablet. The survey was conducted by Bellwether Research from August 17 to 23, 2024, via text and telephone questionnaires.

The research found that one in three parents or caregivers said their child had “at least one negative experience through the app store on their device: one in three had been contacted by a stranger with harmful intent or was exposed to nude images or pornography through a downloaded app.”

“Almost 1 in 3 was shown an advert for an adult game or app. A significant number (28%) downloaded apps that were not approved for their age. Almost 4 in 10 spent excessive amounts of money on in-app purchases or fell victim to a scam.”

Heat Initiative.

One respondent, the parent of a child with an Apple iPhone, said: “Some apps are just not what they claim to be. Both me and my kids have downloaded apps that we thought were one thing, but they were just disguised, and they turned out to be pornographic, adult-oriented, or spam.”

The National Center on Sexual Exploitation, a US nonprofit, has been pressuring Apple to step up its efforts to remove dangerous or exploitative material. It recently said it had won a victory when Apple removed four “nudify” apps from its store.

The apps – one of which had an age rating of 4+ – were used to bully children, using AI to turn real photos or videos into pornographic content.

“The undress websites operate like businesses, often in the shadows — proactively providing very little detail about who owns them or how they operate,” Wired reported in August. “Websites run by the same people often look the same and use nearly identical terms and conditions. Some offer more than a dozen different languages, demonstrating the global nature of the problem. Some of the Telegram channels linked to the websites have tens of thousands of members each.”

In April, Apple released a statement: “In Australia, you can report unlawful or harmful content you find on Apple products or services, or any violations of online security codes.

“Unlawful or harmful material includes illegal and restricted online content and other content that violates Apple’s terms of use.

Tim Cook.

“Illegal and restricted online content refers to online content that contains the most seriously harmful content, such as images and videos depicting child sexual abuse or terrorist acts, as well as content that is inaccessible to children, such as simulated sexual activities, detailed nudity or high-impact violence.

“Other types of content you can report include:

  • Cyberbullying targeting a child or cyber abuse targeting an adult

  • Non-consensual intimate images

  • Content that promotes, incites, instructs or depicts abhorrent violent conduct

“Under Australia’s Online Safety Act, you can report unlawful or harmful content, including illegal and restricted online content that you find on any of Apple’s products or services, or any violations of online safety codes.

“If you encounter unlawful or harmful online content on a third-party app, content service, or unsolicited marketing communication, please report it directly to the third-party provider.

“If you find unlawful or harmful content on any of Apple’s products or services, including illegal or restricted online content or a violation of online safety industry codes, you can report it to Apple: Call 133 622 / Email (email address)

“When reporting abusive or harmful content to Apple, do not submit the actual content (including images). Your report should only contain a description of the content.

“You can report illegal or harmful content, including an unresolved complaint you have made to a third-party provider, online to the eSafety Commissioner: www.esafety.gov.au/report.”
