Is Data Privacy Possible in the Digital Age?

By Dominic DiIorio

Staff Writer

MARPLE, Pa.—In an era where daily life is increasingly consumed by screens, algorithms, social media tracking, and invisible data trails, Delaware County Community College’s Library Services is not shying away from these often-divisive topics. On Feb. 25, 2026, the college hosted another installment of its Next Gen Digital Literacy series, focused on a question many students rarely stop to consider: Is data privacy still attainable in the digital age? The event featured Penn State librarian Sarah Hartman-Caverly, who was interviewed by DCCC librarian Michael LaMagna.

The discussion was designed to help students better understand the realities of modern digital life. Students, faculty, and online attendees gathered to explore surveillance, technology, consent, and personal autonomy. Building on a previous fall event focused on open government data, this new conversation expanded beyond policy to examine how privacy impacts creativity, freedom, relationships, and mental health.

At the heart of the discussion was the idea that privacy is not just a technical issue or a matter of convenience but a foundational condition of being human. Hartman-Caverly explained that privacy should be understood as “a social axis that gives depth and dimension to the human condition,” allowing individuals to navigate the spectrum between what is public and what remains personal.

The conversation traced modern privacy concerns back to the early 2010s, when revelations about government surveillance began to reshape public perception of data collection. A primary example cited was the release of classified documents by Edward Snowden, which exposed extensive collaborations between private companies and governments. While officials at the time attempted to downplay the intrusion by stating only “metadata” was being collected, the event challenged that narrative.

“Metadata is extraordinarily powerful,” Hartman-Caverly noted, explaining that communication patterns can reveal intimate life details even without direct access to message content. Highlighting the severity of the issue, she referenced a chilling statement from former NSA Director Michael Hayden: “We kill people based on metadata.”

The discussion also addressed how major technology companies, such as Facebook, have conducted large-scale experiments on users without their knowledge. One study showed how manipulating a user’s news feed could influence their emotional state for several days. “They could demonstrate that emotions are socially contagious through just text communication,” Hartman-Caverly said.

This raised urgent ethical questions regarding manipulation, consent, and power within the framework of lawful data collection. Rather than framing privacy as something created or destroyed solely by technology, the event encouraged students to think more broadly. Speakers argued that privacy protects essential aspects of identity, intellectual freedom, bodily autonomy, and the ability to associate with or withdraw from others. Without it, Hartman-Caverly warned, “we’re not as creative, we’re not as innovative, our relationships suffer, and social trust declines.”

A recurring theme was that modern technologies are rarely neutral; many originated as military tools before being adapted for mainstream consumer use. “These are military-grade information weapons,” Hartman-Caverly said, urging students to approach digital platforms with greater caution.

The event also questioned whether participation in digital systems is truly voluntary. Using the “FRIES” framework for consent—which stands for freely given, reversible, informed, enthusiastic, and specific—the discussion concluded that most digital technologies fail on nearly every measure.

Students were encouraged to reflect on the platforms they are required to use, such as learning management systems. “If you want to be educated here, you basically have to opt in… it’s not an option,” Hartman-Caverly noted. Once data is collected, it is virtually impossible to retrieve or delete, making consent effectively irreversible.

The talk also addressed how privacy laws like FERPA have not kept pace with modern technology. Loopholes often allow third-party companies to access student data under broad definitions like “educational purpose,” a situation Hartman-Caverly described as the “FERPA magic wand.”

Beyond collection, the event emphasized how data is used to predict and influence behavior. Digital profiles can estimate health risks, financial stability, and even mental health status. “We are not just predicting behavior…we’re predicting behavior in order to control it, or to make money off of it,” Hartman-Caverly said. These predictions shape what people see online and which opportunities are quietly withheld.

While the challenges seemed overwhelming, the event ended with cautious optimism. Students were encouraged to reclaim agency through small daily actions. One of the simplest recommendations was also the most human: Hartman-Caverly encouraged students to “introduce yourself to someone in person.”

Other suggestions included auditing digital accounts, deleting unused apps, paying with cash when possible, and allowing space for silence and reflection. As she reminded attendees, “The data that you don’t disclose is data that they don’t have to profile against you.”

The event stressed that privacy must become a shared cultural value rather than just a legal or technical issue. As the Next Gen Digital Literacy series prepares for its next installment on April 16, 2026—featuring local journalists—this conversation made one thing clear: Privacy is not an obsolete concept, but protecting it requires intention and collective action. In a world built on data, understanding the inner workings of privacy may be one of the most important forms of literacy a student can develop.
