Yilu Zhou, PhD, Associate Professor of Information, Technology, and Operations

With technology at our children’s fingertips, parents share the universal concern that their children may be exposed to violence and other inappropriate content online. Considering that more than 3,000 apps are added to Apple’s App Store and Google Play daily, more than a million every year, it is nearly impossible for parents to monitor everything their children download, even when they check maturity ratings.

While movies and video games are rated by professional organizations, mobile app ratings are based on developers’ self-assessment, according to Yilu Zhou, PhD, associate professor of information, technology, and operations at the Gabelli School. Developers complete a survey, she explained, but “the guidelines are vague and the developers’ interpretations vary.”

Without a standard system to ensure accuracy, can these ratings be trusted?

In the working paper “Protecting Children from Inappropriate Mobile Apps: A Deep Text-Image Multi-Modal Learning Approach,” Zhou examined inconsistencies in the mobile app self-rating system by using machine learning to build an automated tool that analyzes apps and flags those containing mature content, such as violence and explicit imagery.
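
The paper’s architecture is not described here, but as a rough illustration of what a text-image multi-modal classifier can look like, the sketch below fuses an app description with a screenshot to predict a maturity label. The model structure, layer sizes, labels, and preprocessing are illustrative assumptions, not the method from Zhou’s paper.

```python
# Illustrative sketch only: a simple text-image fusion classifier.
# Sizes, labels, and preprocessing are assumptions for demonstration.
import torch
import torch.nn as nn

class TextImageMaturityClassifier(nn.Module):
    """Fuses an app-description encoding with a screenshot encoding and
    predicts a maturity label (e.g., 'everyone' vs. 'mature')."""

    def __init__(self, vocab_size=30_000, embed_dim=128, num_classes=2):
        super().__init__()
        # Text branch: token embeddings averaged into one description vector.
        self.embed = nn.EmbeddingBag(vocab_size, embed_dim, mode="mean")
        # Image branch: a small CNN over 64x64 RGB screenshots.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),  # -> (batch, 32)
        )
        # Fusion: concatenate both modalities, then classify.
        self.classifier = nn.Sequential(
            nn.Linear(embed_dim + 32, 64), nn.ReLU(), nn.Linear(64, num_classes),
        )

    def forward(self, token_ids, screenshots):
        text_vec = self.embed(token_ids)       # (batch, embed_dim)
        image_vec = self.cnn(screenshots)      # (batch, 32)
        fused = torch.cat([text_vec, image_vec], dim=1)
        return self.classifier(fused)          # logits over maturity labels

# Toy forward pass with random data, just to show the expected shapes.
model = TextImageMaturityClassifier()
tokens = torch.randint(0, 30_000, (4, 50))    # 4 descriptions, 50 tokens each
images = torch.rand(4, 3, 64, 64)             # 4 screenshots
print(model(tokens, images).shape)            # torch.Size([4, 2])
```

In a setup like this, a misrating flag could come from comparing the predicted maturity label against the developer’s self-reported rating.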

Out of 100,000 samples, the program correctly identified 70 percent of the misrated apps. Based on that finding, Zhou estimates that out of the millions of apps available for download, roughly 10 percent are misrated.

Rating an app for a younger audience than its content warrants will likely draw more attention to the app because it will be suggested to a wider audience, Zhou explained. If a game is rated for ages four and up, for example, it is more likely to be downloaded by young children than if it were rated for ages 12 and up.

“Sometimes you will see an app that is rated for ages four and up on Apple or rated for everyone on Google Play, but it contains numerous moments of violence, shooting, or even inappropriate wording,” she said. “That’s what parents worry about, and I’m a parent myself, so I worry about this as well.”

Zhou’s algorithm can provide insight into what children might be exposed to, but she also hopes the research will encourage policymakers to establish new protocols that ensure greater accuracy and accountability.

“We are asking policymakers to make sure that the information coming directly from developers is trustworthy. For example, maybe they should do a test run of developers’ other apps to make sure that they pass a test before they can provide ratings for their own apps,” she said.

Zhou’s message to parents is to proceed with caution when it comes to what their children download. While personal information and privacy need to be protected from online threats, there should be equal, if not greater, concern for protecting children.

“You cannot just trust the ratings—they’re not like the Motion Picture Association—so you have to be more cautious,” she said. “As a society, we have to provide more support and guidance for parents because this is such a new area with no standard that we can use.”

—Gabrielle Simonson
