With a commitment to “developing technologies that enrich people’s lives,” the Cupertino company has introduced updates designed to help protect children and teens whenever they’re in front of an Apple device. These updates will arrive on all of Apple’s operating systems starting with iOS 26. Parents will be able to “ensure their children enjoy age-appropriate experiences from the moment they set up their devices.” It’s further proof that Apple stays ahead of the curve and adapts to all of its users, including the youngest ones. Here’s Apple’s guide for families.
Children’s accounts and a visible age range in the apps they use
Remember that it is possible to set up an account for a child or teenager; these accounts apply to users under 14 and can be deactivated once they turn 18. Child accounts are designed to create a safe environment for whoever uses them, giving parents the assurance that each experience is tailored to the child’s or teen’s age. That is why iOS 26 simplifies the process of setting up this type of account. If the setup is not fully completed, age-appropriate settings are enabled by default, so children are protected from the very first moment, even if their eagerness to start using their first device cuts the configuration short. Apple notes that these features are already available as of iOS 18.4, iPadOS 18.4, and macOS Sequoia 15.4.
If parents have any concerns about age settings, they can link their children’s accounts to a Family Sharing group. The parent then gets access to Apple’s parental controls, with settings tied to the child’s age. As an extra layer of protection, and as a privacy feature, it will be possible to share only the child’s “age range” with the apps they use. This prevents children from revealing their actual birth date when an app asks for it and keeps that information hidden. A new API handles this exchange, and developers will need to adopt it.
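Apple has not spelled out the developer-facing interface in this announcement, so the following Swift sketch is purely illustrative: it shows the general idea of an app asking the system for a coarse age bracket instead of a birth date, and gating content on the answer. Every type and function name below is a hypothetical placeholder, not Apple’s actual API.

```swift
import Foundation

// Hypothetical sketch of "age range instead of birth date".
// AgeRange, AgeRangeProvider, and ContentGate are illustrative names only.
enum AgeRange {
    case under14
    case from14To15
    case from16To17
    case adult          // 18 or older
    case notShared      // the family chose not to share an age range
}

protocol AgeRangeProvider {
    /// Returns only a coarse age bracket; the birth date never leaves the system.
    func requestAgeRange() async -> AgeRange
}

struct ContentGate {
    let provider: AgeRangeProvider

    /// Decides whether mature content may be shown, based solely on the range.
    func mayShowMatureContent() async -> Bool {
        switch await provider.requestAgeRange() {
        case .adult:
            return true
        case .under14, .from14To15, .from16To17, .notShared:
            return false
        }
    }
}
```

The point of the design is that the app only ever learns “which bracket,” never the exact date, which is what keeps the information from becoming sensitive data the app could store.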
In this way, Apple gives families the assurance that their children enjoy age-appropriate experiences on their devices and, above all, that apps are not collecting information that could be sensitive or confidential. The App Store will reflect this as well, expanding its age classifications to five categories, three of them aimed at older users: 14+, 16+, and 18+. An app rated 16+ or higher cannot be downloaded by a child under 14.
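The download rule described above is simple enough to capture in a few lines. The Swift snippet below is only an illustration of that rule, not App Store logic; the article names the three older tiers (14+, 16+, 18+), and the two younger tiers are assumed here to be the existing 4+ and 9+ categories.

```swift
// Illustrative sketch of the rating rule described above, not Apple's
// actual App Store implementation. The 4+ and 9+ tiers are assumptions.
enum AppAgeRating: Int, CaseIterable {
    case age4 = 4
    case age9 = 9
    case age14 = 14   // new teen tiers named in the article
    case age16 = 16
    case age18 = 18
}

/// An app is available only if the user's age meets the rating's minimum.
func isDownloadAllowed(userAge: Int, rating: AppAgeRating) -> Bool {
    userAge >= rating.rawValue
}

// Example: a 13-year-old cannot download an app rated 16+.
assert(isDownloadAllowed(userAge: 13, rating: .age16) == false)
assert(isDownloadAllowed(userAge: 15, rating: .age14) == true)
```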
Teen protection and other additional features
To provide a safe environment, the Cupertino-based company requires every child under 14 who uses an Apple device to have a child account. This protects what they view online and restricts which apps they can use. With the new updates, similar protections will also apply to accounts for teenagers ages 14 to 17. It’s worth noting that the protections for this age group are based on the new App Store ratings.
Communication limits, for example, let parents manage who their children can talk to in the Phone, FaceTime, Messages, and Contacts apps. The new systems expand this: children must now request their parents’ authorization before communicating with a new phone number, which they do with a small button in the Messages app. Third-party apps will be able to adopt the same behavior through a new API, so that any action that involves connecting with someone for the first time, whether chatting, following, or adding a new friend, is approved directly by the parents.
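Apple has not published the details of that third-party API in this announcement, so the Swift sketch below only models the flow it describes: the app holds a “new contact” action until a parent approves it from their own device. All of these names are hypothetical, not Apple’s framework.

```swift
import Foundation

// Hypothetical model of the parent-approval flow described above.
// None of these types are Apple's actual API; they only illustrate holding
// a first-time connection until a parent approves it.
enum ApprovalState {
    case pending
    case approved
    case declined
}

struct NewContactRequest {
    let childID: String
    let contactHandle: String   // e.g. a username or phone number
    let action: String          // "chat", "follow", "add friend"
    var state: ApprovalState = .pending
}

final class ParentApprovalQueue {
    private var requests: [UUID: NewContactRequest] = [:]

    /// Called by the app when the child tries to connect with someone new.
    func submit(_ request: NewContactRequest) -> UUID {
        let id = UUID()
        requests[id] = request
        return id
    }

    /// Called when the parent responds from their own device.
    func resolve(_ id: UUID, approved: Bool) {
        requests[id]?.state = approved ? .approved : .declined
    }

    /// The app checks this before enabling the chat/follow/add-friend action.
    func isApproved(_ id: UUID) -> Bool {
        requests[id]?.state == .approved
    }
}
```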
For added security when browsing the App Store, once the new age restrictions are in place, apps rated above the allowed age will not be shown anywhere in the store. If a minor wants to download an app with a higher rating, parents can grant a one-time exception through Screen Time, and that permission can be withdrawn at any time.
Apple’s communication safety technology, already implemented in earlier releases, will now also apply to FaceTime and to shared albums when nudity is detected. Remember that anyone who receives such content is offered ways to ask for help, and the content is kept hidden for their protection.