
Age assurance online: recent overseas developments

Under the Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth), service providers will need to take "reasonable steps" to prevent under 16 year olds from having accounts on certain social media services. While we await further details on how these requirements will operate and be enforced, there is much to learn from developments overseas.
UK introduces new industry guidance on effective age checks
On 16 January 2025, Ofcom, the UK's communication services regulator, published new industry guidance on how Ofcom expects age assurance to be implemented in practice for it to be considered "highly effective". Among other things, the guidance:
confirms that any age-checking methods deployed by services must be technically accurate, robust, reliable and fair in order to be considered highly effective;
sets out a non-exhaustive list of methods that Ofcom considers are capable of being highly effective, including open banking, photo ID matching, facial age estimation, mobile network operator age checks, credit card checks, digital identity services and email-based age estimation;
confirms that methods including self-declaration of age and online payments which do not require a person to be 18 are not highly effective.
The guidance does not introduce numerical thresholds at this stage (e.g. 99% accuracy), but acknowledges that numerical thresholds may complement the "highly effective" age-checking approach in the future, pending further developments in testing methodologies, industry standards and independent research.
European Data Protection Board (EDPB) adopts new statement on age assurance
On 11 February 2025, the EDPB adopted Statement 1/2025 on Age Assurance, which sets out principles for designing age assurance that is compliant with the General Data Protection Regulation (GDPR). These are:
Full and effective enjoyment of rights and freedoms: Age assurance must respect the full complement of natural persons’ fundamental rights and freedoms, and the best interests of the child should be a primary consideration for all parties involved in the process.
Risk-based assessment of the proportionality of age assurance: Age assurance should always be implemented in a risk-based and proportionate manner that is compatible with natural persons’ rights and freedoms.
Prevention of data protection risks: Age assurance should not lead to any unnecessary data protection risks for natural persons. In particular, age assurance should not provide additional means for service providers to identify, locate, profile or track natural persons.
Purpose limitation and data minimisation: Service providers and any third party involved in age assurance should only process the age-related attributes that are strictly necessary for their specified, explicit and legitimate purpose.
Effectiveness of age assurance: Age assurance should demonstrably achieve a level of effectiveness adequate to the purpose for which it is carried out.
Lawfulness, fairness and transparency: Service providers and any third party involved in age assurance should ensure that the processing of any personal data for the purposes of age assurance is lawful, fair and transparent to users.
Automated decision-making: Any occurrence of automated decision-making in the context of age assurance should comply with the GDPR. If applicable, service providers and any third party involved should provide suitable measures to safeguard natural persons’ rights and freedoms and legitimate interests.
Data protection by design and by default: Age assurance should be designed, implemented and evaluated taking into account the most privacy-preserving available methods and technologies in order to meet the requirements of the GDPR and effectively protect the rights of data subjects.
Security of age assurance: Service providers and any third party involved in age assurance should implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk.
Accountability: Service providers and any third party involved should implement governance methods that allow them to be accountable for their approach to age assurance and for demonstrating their compliance with data protection regulation and other legal requirements.
Businesses should ready themselves
It is not just Australia, the UK and the EU that are tackling this challenging issue. Numerous US states have introduced some form of age assurance requirement, while China is seeking to expand its long-enforced real-name regime for online services through the Network ID Proposal.
Businesses will need to ensure that they are, and remain, fully across all regulatory requirements that apply to their activities, whether in Australia or abroad. To adapt and thrive in this space, businesses will also need to explore and advance implementation options, ideally ensuring compliance without disrupting the customer experience.
With this in mind, we look forward to seeing the outcomes of the Australian Government's age assurance technology trial which is currently underway, the eSafety Commissioner's Phase 2 Industry Codes, as well as other technology sector developments such as Apple's plans to offer a Declared Age Range API.
Get in touch


