In an effort to ensure greater privacy for citizens, California has enacted a law that allows minors to delete content from their social media accounts. The intentions are good, and on the surface I support the law. In practice, however, I have a lot of questions. From ABC News:
“An addendum to SB 568, sponsored by California State Sen. Darrell Steinberg, compels online companies and app developers to give minors (defined as anyone younger than 18 years old) the ability to remove any of their online content. The law also says that it’s acceptable for a website to allow minor users the ability to file a request for content deletion.”
However, the law does not require social media platforms to delete the content from their servers; it only grants minors (defined as under 18 years old) the right to delete their content or to request that the site delete content on their behalf. For platforms such as Twitter, Facebook, Tumblr, Instagram, YouTube, and many other social media sites, this is nothing new – users of all ages already have the ability to delete content they have shared. The new law only applies to sites that don’t allow users to delete their content (I’m not familiar with any of these, but I’m sure they are out there).
OK, so the content isn’t deleted from the server, and of course the site cannot delete the content if it has been re-shared, re-tweeted, re-posted, or turned into a .jpg (via a screenshot) and shared elsewhere. The law only requires the site to remove the content from the user’s online profile and account. So what is the law really accomplishing?
Will the “eraser button,” as it’s been dubbed, give kids an even greater sense of ephemerality? That is, I can say what I want because now I can have it removed? (No, you still can’t.) What about adults who request to have content removed that they posted when they were minors? How do sites comply with laws specific to an age group and a geographical boundary? (If I’m a California resident but I posted a picture while on vacation in Tennessee, does the law apply?) For more legal dilemmas related to the law, I recommend reading this article posted to a privacy law blog.
While the legality and enforceability of the law are interesting for me to think about, I’m not a lawyer. As a media studies scholar, I think we also need to delve into questions from a cultural and social perspective. For example, why is the law only applicable to minors? Aren’t college kids just as apt to post something they will later regret (say, when they enter the job market for the first time)? Shouldn’t adults have the same rights to delete content as minors? It’s not as though a kid turns 18 and is suddenly capable of making mature and wise decisions. I’m sure some would argue that although adults also make decisions they later regret, they are “old enough” to understand and suffer the consequences. Yet even as adults, don’t we sometimes share or post things that lead to unintended consequences that even the most intelligent and mature-minded person couldn’t foresee? If we are creating an “eraser button” (which I’m not convinced we are), then why should legal protection be granted only to minors and not to all citizens?
Additionally, laws cannot regulate social norms. Social media are networked interactions. While a teen (or adult) might make smart and discretionary decisions about what to share, they cannot control what their friends post on their behalf. The law only allows minors to remove their own content, not content about them. It would be a slippery slope to allow users to remove other users’ content; however, this further illustrates the limitations and weaknesses of legal protection on its own. We need to develop new social norms that regulate what is appropriate and respectful to share about others, not just ourselves. Just as we teach young people not to spread gossip and rumors, and friends teach each other the importance of trust and secrets (lessons often learned when trust is broken), parents and peers need to work together to develop practices that foster goodwill and respect for each other. Networked and mediated interactions have to be built on trust, just as offline relationships are. Sometimes saying “I’m sorry” includes taking down content we should not have shared or that unintentionally caused harm to someone else. Laws can’t regulate this; only social norms can.
While I appreciate the intent and effort of the law, I worry that it is unenforceable, creates a false sense of ephemerality, and does not actually accomplish what it is intended to accomplish. I’m all in favor of granting greater privacy protection to minors, but to adults as well. Social media platforms are still in their infancy, and we must continually think about how to manage our identities and relationships in an increasingly networked public. Our social lives are increasingly commercialized in ways that complicate identity and privacy – ways we are still figuring out. The boundaries between social, consumer, and citizen identities are so entangled that it is difficult for any of us to fully understand the potential uses of our data and information, or the unintended consequences of sharing our lives in “public” via privately owned spaces. When we share our lives and foster relationships on social media platforms, we are simultaneously enacting our social selves in conjunction with our lives as consumers and as citizens. Mobile and social media render our everyday interactions visible, searchable, replicable, and trackable; we are only beginning to understand the implications of such mediated interactions and identities.