What is PII? #
Personally Identifiable Information (PII) is any information that can be used to distinguish or trace an individual’s identity, either alone or when combined with other data.
Because minors cannot legally consent to many data-sharing practices, specific and stringent laws exist to protect them. Failing to handle their PII correctly can lead to serious legal liability.
In education settings, PII includes:
Direct Identifiers
- Full name
- Address
- Phone number
- Student number
- Photo
Indirect Identifiers
- “The only Year 6 student with a cochlear implant”
- “The child who was permanently excluded last year”
- Unique combinations of data
- Detailed behaviour or safeguarding cases
Special Category Data (Higher Risk)
Under data protection law such as the UK and EU GDPR, special category data includes:
- Health conditions
- SEND status
- Safeguarding records
- Ethnicity
- Religious beliefs
- Biometric data
These require extra caution and usually explicit policy approval.
PII Guidance Flowchart #
The simple flowchart below can be used to check whether you should upload specific information into Glow, including AI tools such as Copilot, Gemini and NotebookLM.
Start
Step 1: What am I about to upload?
- Student work?
- Assessment data?
- Behaviour notes?
- Emails or reports?
- Something else?
Step 2: Does it contain PII?
- Student name?
- Parent name?
- Address?
- Date of birth?
- Student ID number?
- Email address?
- Photo?
- Health/SEND information?
- Safeguarding notes?
- Any combination of data that could identify a student?
- YES → STOP – do not upload in its current form. Go to Step 3.
- NO → Go to Step 4.
Step 3: Can I fully anonymise it?
- Remove all names?
- Replace names with Student A/B?
- Remove identifying context?
- Remove metadata?
- Generalise rare situations?
- Remove safeguarding/health details?
If NOT fully anonymised → Do NOT upload.
If fully anonymised → Continue to Step 4.
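The name-replacement step above can be partially automated. The sketch below is a minimal Python example (the `pseudonymise` function and the alias lists are illustrative, not part of Glow or any school tooling): it swaps each student's known names for a Student A/B placeholder. Automated replacement only removes the names you list, so it cannot catch indirect identifiers; always re-read the output before uploading.

```python
import re

def pseudonymise(text: str, students: list[list[str]]) -> str:
    """Replace each student's known names with a placeholder (Student A, B, ...).

    `students` is a list of alias lists, e.g. [["Jake Thompson", "Jake"]].
    Longer aliases are replaced first so that "Jake Thompson" is not left
    half-replaced as "Student A Thompson" once "Jake" has been swapped out.
    """
    for i, aliases in enumerate(students):
        placeholder = f"Student {chr(ord('A') + i)}"
        for alias in sorted(aliases, key=len, reverse=True):
            text = re.sub(re.escape(alias), placeholder, text, flags=re.IGNORECASE)
    return text

before = "Jake Thompson wrote a strong opening, but Jake rushed the conclusion."
print(pseudonymise(before, [["Jake Thompson", "Jake"]]))
# → Student A wrote a strong opening, but Student A rushed the conclusion.
```

Note that this is pseudonymisation, not full anonymisation: removing names does not remove identifying context (rare conditions, unique events), which still needs the manual checks above.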
Step 4: What do my local policies say?
- School AI policy?
- Local authority guidance?
- Data protection policy?
- Acceptable use policy?
- GDPR/data privacy training?
- Has leadership approved this tool?
If unsure → Ask:
- Data Protection Officer (DPO)
- SLT
- IT Lead
If approved → Continue to Step 5.
Step 5: What kind of data is this?
- Public?
- Internal school-only?
- Confidential?
- Special category data (health, SEND, Safeguarding)?
If confidential or special category → Do NOT upload unless explicitly authorised.
Step 6: What are the AI tool’s data practices?
- Does it store conversations?
- Is data used for training?
- Can I turn off data retention?
- Is it approved for educational use?
- Is there a data processing agreement?
If NOT comfortable with any of the above → Do NOT upload.
If comfortable → Continue to the final check.
FINAL CHECK
Would I be comfortable if:
- A parent saw this upload?
- My headteacher was made aware of it?
- It appeared in a data breach?
If NO → Do NOT upload.
If YES → Likely safe (if anonymised and policy-compliant).
Golden PII Rules for Teachers Using Glow #
- Default to anonymising everything
- Never upload safeguarding records
- Never upload EHCPs or health reports
- Remove metadata from documents
- Check school or local authority AI policy first
- If in doubt — don’t upload
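The "remove metadata" rule matters because Word documents quietly record author names. A .docx file is a ZIP archive, and the author fields live in docProps/core.xml. The sketch below is one possible stdlib-only approach (the `strip_docx_author` function is illustrative and only blanks two common fields); Word's own "Inspect Document" feature is the more thorough option.

```python
import io
import re
import zipfile

def strip_docx_author(docx_bytes: bytes) -> bytes:
    """Return a copy of a .docx file with its author metadata blanked.

    A .docx is a ZIP archive; the creator and last-modified-by fields
    live in docProps/core.xml. Other metadata (comments, tracked
    changes, custom properties) is NOT touched -- check those too.
    """
    out = io.BytesIO()
    with zipfile.ZipFile(io.BytesIO(docx_bytes)) as zin, \
         zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zout:
        for item in zin.infolist():
            data = zin.read(item.filename)
            if item.filename == "docProps/core.xml":
                # Blank the text inside the two author-name elements.
                data = re.sub(rb"(<dc:creator>)[^<]*", rb"\1", data)
                data = re.sub(rb"(<cp:lastModifiedBy>)[^<]*", rb"\1", data)
            zout.writestr(item.filename, data)
    return out.getvalue()
```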

Why Local Policy Matters #
Your school or local authority may have:
- An approved tools list
- A banned tools list
- A required Data Protection Impact Assessment (DPIA)
- A Data Processing Agreement (DPA) requirement
- Specific rules about cloud storage location
- Mandatory parental consent policies
Even if something feels safe, policy compliance is separate from personal judgement.
How could you identify someone indirectly? #
In an education setting, even when a learner’s name or direct identifiers are removed, the remaining information may still identify them. If this happens, it is still considered personal data under the UK GDPR.
A learner may be indirectly identified when:
- Other information already held by the school could be combined with the anonymised data to reveal who the learner is.
- Information from another source (e.g. class lists, timetables, small group sizes, behaviour records) could be used to work out the learner’s identity.
- A third party could reasonably match the information you disclose with other data available to them (e.g. parents, staff, other learners).
Because of this, staff must think about all reasonable ways someone could identify a learner, even unintentionally. For example, releasing “a Year 11 learner with a rare medical condition” could identify the individual if only one student fits that description.
Overall, schools must ensure that they do not accidentally share information that could be linked with other data to identify a learner, even when the information appears anonymous.
For more information, see the ICO guidance: Can we identify an individual indirectly from the information we have?
Practical Safe Examples #
Approved Examples
- “Summarise this anonymised Year 8 persuasive essay.”
- “Provide feedback on this paragraph written by Student A.”
NOT Approved Examples
- “Summarise Jake Thompson’s Year 8 persuasive essay.”
- “Provide feedback on this paragraph written by a 13-year-old with ADHD and trauma history.”
Top Tip #

Always assume that if a piece of information can be used to find a child in the “real world,” it is high-risk PII and should be handled with maximum security.
One Page Guide #
We have used NotebookLM to generate a one-page infographic guide that you can download and keep in your classroom for ease of access.

