Three years after graduation, Isaac Kohn receives an email from Columbia University notifying him that his name and Social Security number have been compromised. Kohn is one of 300 to 400 students affected by the Columbia data breach, in which student names, paired with Social Security numbers, were accidentally released on an insecure Columbia Housing and Dining Web server. Approximately 2,600 Social Security numbers are compromised; a student preparing for room selection had stumbled upon the directory containing the sensitive files and reported the lapse to Housing and Dining.
Kohn doesn’t give the notification much thought—in the email, Columbia informs affected students that free one-year subscriptions to a credit monitoring system are available. The service provides students with a copy of their credit reports and credit file monitoring at the three major credit bureaus (Equifax, Experian, and TransUnion), and promises to notify students of certain suspicious activities that could indicate identity theft.
That was in 2007. Looking back now, Kohn laughs and tells me, “I doubt I registered. ... I doubt I took advantage of it.”
Kohn, who graduated from Columbia College in 2004, works in cybersecurity now; the Columbia breach was the first time it was clear to him that his personal data could be directly compromised.
Information security at Columbia is governed by legal practices that aim to protect student data, but these opaque regulations pose tangible security risks and leave students in the dark about the state of their data privacy at Columbia.
While Columbia declined to share specifics about its data security practices with The Eye, the University says it strictly complies with the primary U.S. law governing student data protection: the Family Educational Rights and Privacy Act (FERPA). Signed into law in 1974, FERPA controls access and disclosure procedures for education records. FERPA, however, was drafted years before the emergence of modern electronic data storage, meaning the law makes no distinction between electronic and paper records and ignores the unique security concerns associated with virtual data collection and storage.
FERPA's main flaw is that it doesn't regulate the entire life cycle of student data: collection, access, processing and storage, publishing, and deletion.
Due to its age, FERPA includes no guidelines on the collection of virtual data. The law only addresses post-collection information storage and disclosure practices, which makes sense: It was designed for paper records, physically collected and filed. As a result, institutions have been free to create their own systems of data collection, systems that students may not be aware of, since FERPA does not require explicit student consent during data collection. Universities collect a breadth of sensitive personal data, including Social Security numbers, minors' information, and student financials. This highly personal information often belongs to students with little tax or credit history; if compromised, these records could expose students to financial fraud or identity theft.
FERPA doesn’t dictate requirements for safeguarding education records; the Department of Education merely recommends that institutions “consider actions that mitigate the risk and are reasonably calculated to protect such information.” When it comes to data breaches, FERPA doesn’t provide reporting guidelines, either.
Until recently, universities that experienced a data breach had no unique reporting obligations to the Department of Education—reporting was optional. Around November 2017, the Department of Education changed its stance on data breach reporting; institutions must now report any suspected data breach on the day it is detected. The Department of Education derives legal authority for new reporting guidelines from a scattered conglomerate of laws and agreements, including the Federal Student Aid Program Participation Agreement (PPA), the Student Aid Internet Gateway (SAIG) Agreement, and the Gramm-Leach-Bliley Act (GLBA). Under these laws, institutions have limited reporting obligations—universities are only required to report data breaches to the Department of Education, rather than directly to individuals whose data is compromised in a breach.
But delayed breach notifications can put impacted individuals at risk. Between 2008 and 2009, Yale experienced a data breach involving compromised names, Social Security numbers, dates of birth, email addresses, and physical addresses. Yale didn’t become aware of the breach until June 2018, roughly 10 years later, and only notified affected individuals in July 2018. One victim, Julie Mason, claims that $60,000 was taken from one of her bank accounts as a result of the stolen data; her personal information was used to change the password on her online banking account.
While there is no reported evidence of wrongdoing or identity theft resulting from the 2007 Columbia data breach, Columbia was still required to abide by the New York State Information Security Breach and Notification Act, passed in 2005. The 2007 breach compromised data elements covered by the New York state law, warranting notification.
This law defines personally identifiable information that warrants protection the same way FERPA does—an identifying element, such as name or number, associated with a Social Security number, identification card number, or any financial account information.
FERPA doesn’t just apply to data breach protocol; it governs university data storage and disclosure. David Etherton, who served as executive director for academic and student technologies for Columbia University Information Technology (CUIT) from 2005 to 2011, confirms that FERPA was the guideline for information security at Columbia. “Everyone was very mindful about [FERPA]; systems were built to support that.” Compliance was taken seriously, he says, and information systems were built securely around FERPA standards: “almost Fort Knox-like.”
Kenneth Durell, a Columbia University senator from 2010 to 2012, expresses frustration with Columbia’s approach to data privacy and a general lack of institutional transparency around data handling. Durell served on the senate’s Information and Communications Technology Committee, which acts as the liaison between students and the administration on Columbia’s IT practices.
Looking back on his time on the senate, Durell tells me data privacy wasn’t something the committee spent a lot of time on. He comments on the student body’s general lack of understanding of University data practices: “CUIT had these guidelines that were kind of internal guidelines that supposedly everyone should know ... but there wasn't any real publicity about it.”
In April 2011, the committee reported that although CUIT “has a large number of well-developed and well-defined policies that govern data management and security ... many members of the Columbia University community may not realize how important these policies are.”
Durell explains that expecting a student to know what CUIT privacy regulations are “without having it spelled out in a really easy-access spot” and not emphasizing University data practices “over and over” leads to inevitable student confusion about the privacy of their personal data at Columbia.
Not much has changed since 2011. Andrew Hsu, a GSAS University senator on the IT committee, tells me that besides a brief mention of compliance with the General Data Protection Regulation (GDPR) in October 2017, the IT committee hasn’t discussed data privacy during its monthly meetings in the 2018 to 2019 term.
Hsu agrees that securing data privacy is important. With regards to CUIT policies, he has “no idea what Columbia does and doesn’t [do].”
Within the committee, there’s a vagueness that permeates the conversation about data privacy. “I don’t wanna say I’m stumped,” Hsu laughs. “I mean, I want data privacy as well, but what exactly that looks like, I'm not sure.” He says that if the committee develops a clearer picture of what data privacy should look like at Columbia, it could advocate for a higher standard of data protection.
However, for a sliver of student data at Columbia, dynamic change in privacy practices is happening.
Recently, Columbia was tasked with updating certain privacy practices in accordance with the General Data Protection Regulation (GDPR), a European Union regulation that came into effect in late May 2018. The GDPR signifies a global overhaul of outdated, scattered, sector-specific privacy regulation. It strictly regulates the entire life cycle of personal data, prioritizing the security of the data subject’s information.
All EU resident data that Columbia collects and stores, including student data acquired through international application systems and study abroad programs, is protected under the GDPR.
According to a statement to The Eye from a University spokesperson, Columbia had a team of people working to comply with the GDPR. As outlined in a report released by the IT Committee, the vice president of CUIT, Gaspare LoDuca, has been tasked with Columbia’s compliance with the GDPR. In a statement on their website, Columbia announced EU individuals’ “Right to be Informed, the Right to be Forgotten/Erasure, and the Right to Rectification.”
GDPR data privacy practices only apply to a small subset of student data at Columbia: data collected from individuals residing in EU territory. This data includes information collected from students residing in the EU at the time of application and from students studying abroad in the EU. The GDPR doesn’t protect data collected from non-EU individuals or data collected from students from the EU after they arrive at Columbia University.
For this data subset, protection practices are upgraded and improved. The GDPR fills most of FERPA’s gaping holes. Most significantly, it applies protections to the entire life cycle of data, including data collection. Article 6 of the GDPR establishes consent as a lawful basis for data collection and processing; to be valid, consent must be freely given, unambiguous, and specific to the transaction. Other widely used forms of user consent, such as automatic opt-ins and general consent waivers, do not satisfy the GDPR consent requirement.
Since a lot of student data that Columbia collects—including Social Security numbers, financial information, housing needs, special accommodations, and physical and mental health history (physical and mental health data are protected by HIPAA)—could damage students in the wrong hands, vigilant information protection is imperative.
The GDPR includes updated information security standards, requiring notification for breaches of security concerning any personal data, including cultural, physiological, medical, and financial information.
Article 33 of the GDPR mandates the details data breach notifications must include: a description of the nature of the data breach, the categories and approximate number of people and data records affected, and a description of the likely consequences of the breach. During the 2007 data breach, Columbia’s notification omitted certain details, including the number of students whose information was compromised—a number Columbia would have been required to report in the email notification under the GDPR.
When I ask Kohn about the ideological shift marked by the GDPR, he says, “There is starting to be a bit of backlash in the U.S., ... over handling of PII [personally identifiable information] and sensitive data, financial data.”
People are “starting to wake up” to the issue of data privacy, Kohn remarks, “but there isn’t as strong of a sense, in my experience, of a right to privacy in the U.S.” Kohn thinks that Europe, unlike the United States, has a strong cultural bias towards data privacy. The U.S., and its universities, are just starting to catch up.
The EU’s GDPR dares to secure something that FERPA doesn’t: an individual right to data privacy. The GDPR institutionalizes the revolutionary idea that, as Kohn puts it, “your data is yours, and you should maintain some kind of control over it, even when it leaves your custody.”