    What Is (and Isn't) PII in a University — A 2026 Guide

    Universities sit on some of the richest personal data in the economy: identity documents, health records, financial aid files, research participant data and, increasingly, AI prompt logs. Knowing what counts as PII is the first line of defence.

    Originally published March 2024 · Updated April 2026 · 6 min read
    University campus at dusk with overlaid data nodes

    Personally identifiable information (PII) is data that can be used — on its own or combined with other data — to identify, contact, or locate an individual. In higher education that definition gets blurry fast: a student ID without context looks harmless, but combine it with a class schedule and a Wi-Fi access log and you've narrowed identification to one person.

    PII vs non-PII on a modern campus

    Clearly PII:

    • Full name, date of birth, home address, personal email, phone number
    • Student ID, staff ID, government IDs (SSN, Medicare, TFN, NHS number, passport)
    • Financial aid records, loan information, payment details
    • Health and disability records, counselling notes
    • Biometric data — facial recognition templates, fingerprint logins, exam-proctoring video
    • Visa and immigration status for international students

    Usually non-PII (but watch the context):

    • Aggregated enrolment numbers, course titles, faculty names
    • Anonymised research datasets (when re-identification risk is genuinely low)
    • Public event listings, general campus information
    • Job titles, departmental structures

    The 2026 reality is that "non-PII" is a smaller category than it used to be. Modern re-identification attacks combine three or four innocuous attributes — postcode, date of birth, gender, course — to single out an individual with frightening accuracy. Treat context as part of the classification.
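    To make the re-identification point concrete, here is a toy sketch (the records and function are invented for illustration) that measures what fraction of a dataset is uniquely pinned down by a given combination of quasi-identifiers — the mechanism behind "three or four innocuous attributes single out an individual":

    ```python
    from collections import Counter

    # Hypothetical student records — four "innocuous" attributes each.
    records = [
        {"postcode": "2000", "dob": "2004-03-12", "gender": "F", "course": "LAWS1001"},
        {"postcode": "2000", "dob": "2004-03-12", "gender": "F", "course": "ECON1101"},
        {"postcode": "2010", "dob": "2003-11-02", "gender": "M", "course": "LAWS1001"},
        {"postcode": "2000", "dob": "2004-07-30", "gender": "F", "course": "LAWS1001"},
    ]

    def unique_fraction(rows, keys):
        """Fraction of rows whose quasi-identifier combination is unique —
        i.e. effectively identifying within this dataset."""
        counts = Counter(tuple(r[k] for k in keys) for r in rows)
        return sum(1 for r in rows
                   if counts[tuple(r[k] for k in keys)] == 1) / len(rows)

    # Two attributes identify few people; four identify everyone here.
    print(unique_fraction(records, ["postcode", "gender"]))                    # 0.25
    print(unique_fraction(records, ["postcode", "dob", "gender", "course"]))   # 1.0
    ```

    On real population-scale data the effect is the same, only starker: each added attribute multiplies the number of distinguishable combinations, so uniqueness climbs quickly toward 100%.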

    The 2026 regulatory landscape

    🇺🇸 United States

    • FERPA still governs student education records, but the U.S. Department of Education's 2024–2025 guidance explicitly extends it to AI vendors processing student data on behalf of institutions.
    • GLBA Safeguards Rule (FTC) applies to any university handling federal student aid — written information security program, designated qualified individual, MFA, and encryption are now mandatory.
    • State privacy laws — California (CCPA/CPRA), Virginia, Colorado, Texas, Oregon and a growing list of others now grant students and employees individual access and deletion rights.
    • Colorado AI Act (Feb 2026) brings high-risk AI systems — including admissions and academic-progression algorithms — under explicit governance.

    🇦🇺 Australia

    • Privacy and Other Legislation Amendment Act 2024 introduced a statutory tort for serious invasions of privacy and lifted maximum civil penalties to AU$50M (or 30% of adjusted turnover).
    • Tranche 2 Privacy Act reforms (2025–2026) remove the small business exemption, introduce a "fair and reasonable" handling test, and give individuals a direct right of action.
    • Australian Privacy Principles (APPs) — APP 11 (security) and APP 6 (use and disclosure) remain the day-to-day operating standard for universities.
    • Notifiable Data Breaches scheme still requires reporting eligible breaches to the OAIC and affected individuals "as soon as practicable."
    • TEQSA guidance now expects governance over generative AI used in teaching, assessment and research administration.

    Recent breaches — and the pattern

    Higher education remained one of the most targeted sectors through 2024 and 2025. The MOVEit file-transfer compromise pulled in dozens of US and UK universities. In Australia, multiple Group of Eight institutions have disclosed third-party processor breaches affecting student records since 2023. Western Sydney University publicly confirmed multiple intrusions through 2024 affecting thousands of staff and students. The pattern is consistent: the breach almost never starts with the university's own perimeter. It starts with a vendor, an exposed cloud bucket, an unmanaged SaaS tool, or — increasingly — data pasted into a consumer AI assistant.

    AI changes both sides of the problem

    Generative AI is now embedded in admissions triage, student support chatbots, research assistance, and every staff laptop. That creates two new exposure vectors:

    • Prompt leakage — staff pasting student records, scholarship applications or counselling notes into ChatGPT, Claude or Gemini.
    • Shadow AI agents — browser extensions and Copilot-style tools indexing SharePoint, OneDrive and Google Drive without classification.

    The same technology is also the best defence. AI and ML can scan terabytes of unstructured campus data, classify it by sensitivity, flag PII the institution didn't know it held, and continuously monitor for anomalous access — at a scale no human team can match.
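    As a toy illustration of what automated classification looks like at its simplest — this is not Priivacy's engine; the patterns and names here are hypothetical — a scanner can start as a set of labelled regular expressions run over unstructured text:

    ```python
    import re

    # Hypothetical minimal PII flagger: two illustrative patterns.
    # Real classifiers add context, validation and ML scoring on top.
    PATTERNS = {
        "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
        "au_tfn": re.compile(r"\b\d{3}[ -]?\d{3}[ -]?\d{3}\b"),  # TFN-like 9 digits
    }

    def flag_pii(text):
        """Return (label, match) pairs for every pattern hit in the text."""
        hits = []
        for label, pattern in PATTERNS.items():
            hits.extend((label, m) for m in pattern.findall(text))
        return hits

    sample = "Contact jane.doe@uni.edu.au, TFN 123 456 789, re scholarship."
    print(flag_pii(sample))
    ```

    Pattern matching alone produces false positives (any nine-digit number looks like a TFN), which is exactly why production tooling layers contextual and ML-based classification over it.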

    How USC Data and Priivacy help

    Priivacy is the managed PII discovery toolset USC Data deploys for higher education clients. It scans network shares, SharePoint, OneDrive, Google Workspace, Box, email archives and research repositories — without extracting any data outside your firewall. The output is a defensible inventory of what PII you hold, where it lives, who can access it, and which records are redundant, obsolete or trivial (ROT) and safe to remediate.

    Local-first, by design

    USC Data's discovery tooling never sends university data to an external cloud for analysis. Scanning runs inside your environment; only metadata and aggregate findings leave the perimeter.

    Where to start this term

    1. Map every system that holds student or staff PII — including SaaS, research tools, and shadow AI.
    2. Run a scoped discovery scan to find PII you didn't know you held.
    3. Apply a one-page "do-not-paste" rule to every device with AI access.
    4. Remediate ROT — the safest record is the one you no longer hold.
    5. Brief the executive on residual risk, not just controls.
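    For step 4, a first pass at surfacing ROT candidates can be as simple as listing files untouched for years — a sketch under the assumption that filesystem modification time is a reasonable first proxy for redundancy; every candidate still needs human review before remediation:

    ```python
    import time
    from pathlib import Path

    def rot_candidates(root, years=7):
        """Hypothetical ROT triage: files not modified in `years` years.
        A shortlist for review, not an automatic delete list."""
        cutoff = time.time() - years * 365.25 * 24 * 3600
        return [p for p in Path(root).rglob("*")
                if p.is_file() and p.stat().st_mtime < cutoff]

    # Example: rot_candidates("/srv/shared-drive", years=7)
    ```

    Retention obligations vary by record type (student records, research data, financial aid files all have different schedules), so the age threshold should come from your retention policy, not a default.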

    Is your university AI-ready and audit-ready?

    Book a 20-minute call. We'll map your highest-risk PII surfaces and show you the fastest path to a defensible inventory.

    Book a discovery call