FERPA Compliance: How to Blur Student Faces in School Media

PiiBlur Team · 5 min read

Schools produce more visual content than ever. Social media accounts showcase events and achievements. Websites feature classroom activities. Yearbook photographers shoot thousands of portraits and candids. Security cameras record hallways, cafeterias, and parking lots around the clock.

Every one of these images and videos contains student faces. Under FERPA, those faces qualify as personally identifiable information — and sharing them without consent creates legal exposure for your district.

How FERPA applies to student images and video

The Family Educational Rights and Privacy Act protects education records — any records directly related to a student and maintained by the school. Photographs and videos fall under this definition when they are part of a student's education record or can identify a student in context.

The practical consequence: if a parent has not consented to the release of their child's image, you cannot publish it. If a student has opted out of directory information disclosures, their face must not appear in publicly shared school media.

This applies broadly. A photo on the school's Facebook page, a video highlight on the district website, a yearbook proof sent to a printing vendor — each can violate FERPA if it shows the faces of students whose parents have not consented.

The consent gap in school media

Most districts collect media consent forms at the start of each school year. Parents indicate whether the school may use their child's image in publications and online media.

The problem is operational. A photographer at a school play captures 200 photos. Fifteen students in those photos have opted out. Someone must cross-reference every face against the opt-out list, identify the non-consenting students, and either remove those photos or blur those faces.

At one school, this is tedious. At the district level — dozens of schools, hundreds of events, thousands of opt-out students — it is unmanageable without automation.

Where student face redaction matters most

Social media and website content

School social media accounts are the most visible source of student imagery. Posts reach parents, community members, and the public. A single unredacted photo of an opted-out student on the district's Instagram account violates that family's FERPA rights.

Yearbook and publication proofs

Yearbook production sends thousands of images to external vendors for layout and printing. Those vendors are third parties. Sending identifiable images of opted-out students to a vendor shares protected information outside the school.

Event coverage and promotional video

Assemblies, sports events, performances, and open houses all generate video. Districts use this footage for promotional materials, board presentations, and community updates. Every frame showing an opted-out student needs redaction before distribution.

Security camera footage

When security footage is requested through public records processes or shared with law enforcement, student faces may need redaction — especially footage from common areas where many students are visible.

How to blur student faces automatically

Manual redaction — opening each photo in an editor, identifying opted-out students, drawing blur boxes over faces — does not scale. A district media coordinator handling hundreds of photos per event cannot cross-reference every face and redact by hand.

PiiBlur automates face detection and blurring in both photos and videos:

  1. Upload photos or video through the PiiBlur dashboard, or send files via the REST API.
  2. Select face redaction as the PII category to apply.
  3. Choose blur or pixelation as the redaction style.
  4. Download the processed files with all detected faces redacted.
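The same steps map onto the REST API. The sketch below builds (but does not send) a face-redaction request; the endpoint URL, header names, and payload fields are illustrative assumptions, not PiiBlur's documented API — check the API reference for the real values.

```python
import json
import urllib.request

# Hypothetical endpoint and field names -- consult PiiBlur's API
# reference for the real ones; this sketch only shows the request shape.
API_URL = "https://api.piiblur.example/v1/redact"
API_KEY = "YOUR_API_KEY"  # placeholder credential


def build_redaction_request(image_bytes: bytes, style: str = "blur") -> urllib.request.Request:
    """Build (but do not send) a face-redaction request.

    Mirrors the dashboard steps: select the face category, then
    choose blur or pixelation as the redaction style.
    """
    if style not in ("blur", "pixelate"):
        raise ValueError("style must be 'blur' or 'pixelate'")
    payload = json.dumps({
        "categories": ["face"],             # step 2: redact faces only
        "style": style,                     # step 3: redaction style
        "image_hex": image_bytes.hex(),     # step 1: file contents (encoding is illustrative)
    }).encode()
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending the request with `urllib.request.urlopen` (or any HTTP client) and saving the response body would complete step 4.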

For districts that need to redact only specific students, PiiBlur's blanket redaction provides a conservative starting point: redact all faces first, then restore consenting students in your own editing workflow. This ensures no opted-out student slips through.

Building a district-wide redaction workflow

Small schools can handle redaction through the PiiBlur dashboard — upload, process, download. Districts with higher volumes benefit from an API-based workflow.

A practical district setup:

  • Media staff upload event photos to a shared drive or cloud bucket.
  • An automated script sends new uploads to PiiBlur's API for face redaction.
  • Redacted versions land in a separate folder, ready for social media, website, or vendor distribution.
  • Original unredacted files stay in a restricted access folder for internal records.

This separation ensures anyone grabbing photos for public use pulls from the redacted set by default. Originals remain available internally but never leave the district's systems unprocessed.

For the full range of PII that can appear in images — documents, name badges, screens, and more — our overview covers all 13 categories PiiBlur detects.

What FERPA redaction does and does not solve

Blurring student faces in school media prevents non-consensual image disclosure. It solves the specific problem of identifiable student imagery reaching audiences without parental consent.

PiiBlur is a redaction tool, not a compliance solution. FERPA compliance involves policies, training, consent management, record-keeping, and technical controls well beyond image redaction. Blurring faces is one important step in a broader privacy program, but not a substitute for the full framework your district needs.

Schools and districts can find integration patterns and guidance for education-specific workflows on the dedicated page.

Pricing for schools and districts

PiiBlur's free tier includes 100 images and 5 minutes of video per month — enough for a single school to evaluate against a batch of event photos. Districts processing photos and video regularly need a paid plan, starting at $49/month and scaling to $499/month for high-volume needs. See pricing for the full breakdown.

Automated redaction costs a fraction of the staff hours required to redact manually — and a fraction of the legal exposure from publishing an opted-out student's face.