Excluded and High‑Risk Products and Services
Categories of services not assessed by ST4S due to their specialised nature
Why some products and services are not assessed
The Safer Technologies for Schools (ST4S) assessment is focused on evaluating digital products and services that are essential for the educational environment, ensuring they meet high standards for security, privacy, and interoperability.
Digital tools often include a mix of functions, and some of these fall outside the current scope of the Safer Technologies for Schools (ST4S) assessment. In some cases, functionality may be specialised or may involve risks that fall outside what ST4S currently reviews.
What this means for schools
Where specific functionality is identified as “excluded or higher risk”, this does not necessarily indicate that tools or products with these features are unsuitable. It does mean that ST4S has not assessed that aspect of the technology. Where this is the case, we strongly recommend that schools complete their own internal risk assessment to determine appropriate and safe use.
Excluded and Higher Risk categories
The exclusions and exemptions below highlight the types of services that are not assessed under the current ST4S framework due to their specialised nature, the framework not yet having the controls to cover certain features or functions, or the service (and any of its functions) being deemed high‑risk or non‑compliant. Each category includes details to help readers understand the specific considerations involved.
Important Note
Use cases in each category are subject to change. As new use cases are evaluated and the ST4S framework evolves and implements new modules to accommodate them, these lists will change. However, not all use cases will necessarily be removed or adjusted.
General Exclusions
The following service categories are not assessed by ST4S because they fall outside the current assessment scope or are highly specialised.
- Software hosted and managed by the school on their own cloud service provider or local infrastructure. However, if a service is self-hosted but fully managed by the service provider (including security scanning, support, and backups), it may be considered for assessment.
- Firewalls.
- Internet filters, including internet and web browsing monitoring tools.
- VPNs, proxies, and other networking services.
- Services, applications, or devices that monitor or conduct surveillance of students, including tracking online activity, device behaviour, or physical locations. Examples include browser extensions, installable applications, and other tools that record, monitor, or log students’ interactions across various services, websites, and device activities.
- Any software that allows for remote access and control of student or teacher devices.
- Services specifically for managing and storing passwords and other credentials.
- Social media platforms not specifically designed for school use (e.g., Facebook, Twitter, Discord). Forums and social media platforms offering schools their own tenancy and isolated environment may be considered.
- Services such as reference checks or background checking services.
- Software related to purchasing and vendor management not directly related to educational services.
- Services such as physical security systems, smoke alarms, surveillance cameras, and CCTV hardware.
- Consultants and agencies delivering professional services without a specific application that students, teachers, or parents register and log into.
- Services with communication features accessible to the general public, which a school cannot restrict or limit to their school or classroom. Examples include multiplayer games or social media services that do not provide an administrator account for schools to control student communication and account discovery or privacy features.
- Services which may pose a risk to student privacy, online safety, and human rights/ethics as determined by the ST4S Team and/or ST4S Working Group.
Artificial Intelligence (AI) Exclusions
The following exclusions apply specifically to AI systems, models, features, and functionality.
- Services designed to process personal information (e.g. the service prompts for or requests student names, gender, racial/ethnic origin, or other personal information, including sensitive information). Information must be de-identified before being exposed to an AI model.
- Speech to text (STT) or text to speech (TTS) features that use a machine learning (ML) or AI model may be exempt on accessibility grounds. Further information is provided under the exemptions section on this page.
- Processes data that may be used to create profiles of individuals or groups that are used to further develop the model or other purposes not within the primary interest of the school.
- This does not include services which may form a profile of a student to enhance learning experiences or recommend learning pathways (e.g. adaptive learning), provided the profiles are exclusively for that purpose and the benefit of the student or school.
- Processing biometrics or human attributes, movements, or metrics, whether physical or mental (e.g. facial recognition and scanning, eye tracking, detecting movement, determining or predicting emotions, student disability, learning difficulties, etc.).
- Services which may produce a digital recreation of a person (e.g. voice cloning).
- Exemptions may be considered for speech to text (STT), text to speech (TTS) and on device biometric authentication.
- Student monitoring, behaviour management and observation services. This includes monitoring activity across the internet and web browsing behaviour.
- Services which monitor physical or mental attributes or behaviour such as analysing emotions, level of concentration in the classroom etc.
- Administrative support and decision making (e.g. automatically vetting enrolment or scholarship applications, complaint handling, disciplinary action, reviewing a benefit or award etc).
- General-purpose chatbots used by students that have not been designed for school or educational use and/or do not offer controls or features for schools to restrict and limit access to students. Chatbots are assessable provided they are designed with student or educational use in mind.
- Other use cases of AI that may not be suitable for school or educational use.
- Utilising NSFW AI models, or producing or outputting NSFW content or any other content that may be objectionable or deemed offensive by a reasonable member of the school community (this includes text-based models).
- Applications or services which process health and wellbeing data (e.g. physical or mental health, fitness, meditation, food consumption, body metrics etc).
- Services which may provide or output medical, health (including mental health), or wellbeing guidance or advice. This includes information that may reasonably be interpreted by an end user, including a young person, as health or wellbeing guidance or advice.
- Prohibited uses as defined under the European Union’s Artificial Intelligence Act (the EU’s AI Act).
- Monitoring or recording screen or browsing activity that cannot be controlled by the student or end user of the device.
- Examples include connecting to Microsoft SharePoint, Department systems, etc., in order to ingest information, documents, and other content.
Any application of AI that may be considered by the ST4S Team, an ST4S Working Group member, or a reasonable member of the school community to be:
- Invasive
- Unethical
- A risk to safety, human rights, or privacy
- Not within the best interests of the student and/or school
Artificial Intelligence (AI) Exemptions
As we assess more services, and as guidance is released by industry bodies, including the Privacy Commissioner, we may update the exclusions list or list exemptions here.
The use cases below are not guaranteed an exemption and require an initial inspection by the assessment team to confirm that we are able to assess them. Please contact the team using the contact form on our website.
Exemptions are only considered for limited use cases, which currently relate to accessibility.
- Services using machine learning (ML) or AI models to provide captioning, voice assisted typing and other similar accessibility use cases.
- Services must continue to follow other ST4S requirements, including ensuring transparency in the privacy policy and sub-processor lists where required.
- Services which retain information or process information to improve STT or TTS for end users (e.g. personalising and developing a unique profile for the user that recognises their voice) must provide clear mechanisms to remove or delete such profiles or templates.
References and Definitions
NSFW is a general term used to describe any content that may be deemed inappropriate to create, view, or access in the workplace or at school. NSFW content includes content that may be deemed inappropriate for younger audiences (e.g. persons under 18).
High risk and prohibited use cases as defined under the EU AI Act are summarised here.
