
You Are Here: How The US Is Missing The Point In Regulating And Protecting Data Privacy

I like conspiracy theorists like Alex Jones and flat-earthers. I like them not because I believe in the nonsense they spout, especially when it’s particularly vulgar rhetoric like claiming the Sandy Hook shooting was staged and harassing parents about it, but because their conspiracies are sometimes pretty funny, such as the idea that the world is flat and that the government is putting chemicals in the water to turn the frogs gay. The common thread? The government is up to some shady business aimed at disenfranchising the American people. These theories are all the rage with a niche of folks, but there are times, very rare times, when we should be skeptical of the government in this fashion, especially when it comes to your personal data.

Note that I said personal data, not private data. And this distinction hits right at the heart of why data privacy in the United States is perhaps the most troubled space for the government when it comes to regulation and setting policy.

The U.S. government has responded to new questions surrounding data privacy with reactive policies that have only addressed unique instances, resulting in what legal experts call sectoral laws. This is akin to a doctor looking at a patient with fever, inflammation, and cough, all pointing to a systemic infection, and recommending three different over-the-counter medications for each of the symptoms and not the root disease. This approach, when contrasted to the newly passed and codified General Data Protection Regulation (GDPR) in the European Union (EU), shows that the United States is failing in a sector that it has pioneered for the last several decades.

The intersection between policy, law, and technology is becoming ever more relevant, and our reactive approach is putting us further and further behind. Two major cases concerning data privacy in the last year have pushed us to finally start asking the right questions about how the government should regulate data privacy in an increasingly connected and public world. Exploring the government’s approach to each of these cases reveals a paradigmatic flaw in how the United States handles data privacy, especially when contrasted with the holistic legal doctrine established by the EU’s recently passed GDPR.

Carpenter v. US: New Precedent for an Aging Phenomenon

Your personal data is not private, a fact that would probably have been useful for Timothy Carpenter and his accomplices to know in 2011. That year, Carpenter and three other men were arrested on suspicion of involvement in a string of armed robberies. When one of them confessed, he also gave up Carpenter’s phone number, which the FBI used to order records from his cell phone carrier. Those records contained location data that placed Carpenter near the scenes of the robberies, for which he was subsequently charged. Carpenter moved to suppress the evidence; the motion was denied in both the lower and appellate courts, and the case was eventually taken up by the Supreme Court. On June 22, 2018, SCOTUS ruled 5-4 in favor of Carpenter, setting a significant precedent for what constitutes data privacy under the Fourth Amendment. While not nearly enough, it is the opening salvo in a debate that will determine how each individual’s data is treated by companies and government alike. The court’s complex analysis here is a good example of how technical legal knowledge and intricacies become so important in these keystone decisions. With Roberts, Ginsburg, Breyer, Kagan, and Sotomayor composing the majority, the final ruling determined several key points.

First, the government argued that the data obtained by the FBI was not private under the third-party doctrine. That doctrine holds that when a person voluntarily gives information to another entity, there is no reasonable expectation of privacy in that information. SCOTUS decided that because Carpenter (and, in effect, any U.S. consumer whose data is held by a private entity) gave no affirmative consent, the voluntary exposure at the heart of the doctrine was missing, and the doctrine could not be invoked. Second, the affirming justices agreed that the sheer depth of the information handed over to such third parties means it is indeed in need of protection, so a warrant would be necessary in the future. Both points set a major precedent for how governments treat data going forward, as this interpretation folds data access into the Fourth Amendment. As written, the Fourth Amendment secures individuals against “unreasonable searches and seizures.” While it is not worded explicitly to include privacy, an extensive case history has read privacy protections into it. Bringing data privacy under the Fourth Amendment subjects some data to the warrant requirement and, when that requirement is violated, to the exclusionary rule, a high threshold of protection. While this is a great ruling that will have significant influence over criminal proceedings, its applications are limited to that sphere.

This is the most positive outcome of the case, and we can already see major problems in the methodology, namely that there isn’t one. For years, data privacy has been a major area of concern, and it wasn’t until this summer that we got an answer. And that answer isn’t even very satisfactory, given that it only applies to law enforcement and does not regulate or address the concerns around major corporations and their possession of all this data, which is arguably more important. Rulings like this also show how reactive our laws are, with very little case law for either side to call on. It’s not just court rulings, either. The existing sectoral laws concerning data deal with narrow areas like financial or health services, with no real codification or unification of data regulation as a whole. This becomes a major problem because individuals caught in this black-hole paradigm have no actual recourse when it comes to securing their privacy; sectoral laws exist only to respond to specific incidents. Such laws have no real way to prevent companies from tracking you, from influencing your vote, or from exploiting your data for advertisers.

Cambridge Analytica and Facebook: Power and Vacuums

Horror vacui, or “nature abhors a vacuum,” an idea attributed to Aristotle, rings true when describing power in modern politics. If the government isn’t going to regulate something, then the private sector gains power and discretion within that lawless vacuum. This became evident in the data privacy field when the Cambridge Analytica scandal broke earlier in the year and revealed a major problem with U.S. elections in the information age. In the larger scheme of things, though, the underlying problem is not how we manage data that can influence elections, but how we protect data in general. While Cambridge Analytica may be a British firm, it collected information on Americans from an American company. Such infiltration is only possible because of the lack of protective safeguards. While the details of how Cambridge Analytica permeated the U.S. elections are interesting, the focus should be on how the government reacted. Instead of responding with a push for comprehensive policies or a substantial debate on the topic, the U.S. Senate called on Mark Zuckerberg to testify about how the scandal unfolded.

The exchanges revealed major flaws. First, Zuckerberg’s testimony showed how technologically illiterate the U.S. Senate is, hardly a new criticism; we need lawmakers who understand the fast-changing information age in order to write these laws. Second, the testimony revealed how tech companies feel about regulation, with Zuckerberg offering vague support for undefined rules. Third, there is a definitional problem: Facebook’s censorship team has made spotty choices in the last couple of years that call into question whether Facebook is a technology firm or a publisher of news and media content. Fourth, the testimony revealed just how vague and opaque Facebook’s user agreement is, underscoring the need for central regulations on how user agreements may be written and allowed to operate. Fifth, the testimony revealed problems with Facebook’s compliance with FTC regulations about reporting leaks and potential problems; the lack of pressure and enforcement is appalling. Sixth, no penalties were put in place. At all.

The attitudes displayed by both parties at the testimony are problematic. The fact that such a major corporation is not accountable to anyone unless it voluntarily shows up to a slap-on-the-wrist hearing reveals just how flagrant Facebook’s data abuses are. Take the recent Google scandal, for instance. Google’s CEO, Sundar Pichai, was called on to testify regarding the censorship Google would have to submit to in order to enter the Chinese market. Pichai declined to appear, and with nothing done to compel his attendance or to respond otherwise, the government simply gave up. This is not the way to go about keeping these companies accountable. Regulatory standards are also non-existent when it comes to the entities that handle all of this data. The doctrine covering established human rights is generally based on what is intrinsic to a human being. The issue is that individuals are not born with private data on the internet. This requires a pivot in our understanding of what privacy is and what rights are. More importantly, it warrants a legal debate over how laws are supposed to handle non-intrinsic attributes of individuals as they pertain to specific temporal contexts.

So What Do We Do Now?

While checks on an ever more powerful law enforcement apparatus are necessary and good, they fall short when malicious private actors are given control over such a wide field of information. Even though we have started to take some baby steps by having our representatives look at the most severe cases of the last year, it is not enough, nor is it fast enough to take effect before more damage is done to our democracy and personal identities. So what are some of the different approaches that can be taken? Given that U.S. law is so fragmented and specific, and that the most meaningful legislation is at the state level, the first step would be to unify it. For many, this seems like a Goliath-sized task, and many believe that removing yourself from such platforms is the only way to protect yourself, as evidenced by the #DeleteFacebook movement. However, when the U.S. is behind in legislation and other countries are ahead, it is not uncommon for us to model our subsequent laws after theirs. For us, that would mean taking a look at the GDPR and incorporating some of its essential legal doctrines.

As mentioned earlier, the challenge with centralized legislation of this kind is that it has to be totalizing in nature. For data privacy to be regulated, individuals need to be able to assert that they have a right to their data. If an individual has no recourse to access or delete their data, then there is not much any enforcing agent can do to regulate it. The GDPR is ahead in that it incorporates the notion that individuals have rights concerning their data. These rights are not the product of a tacit agreement with private companies; rather, they are an extension of your identity and privacy. What this also means is a clear and unambiguous understanding that U.S. citizens have a verifiable right to privacy. Not just from law enforcement. Not just from the government. U.S. citizens should be able to define their level of comfort and access when it comes to their personal identity and their actions. Private actors and law enforcement need to treat these rights as inalienable and on par with the rights granted to citizens under the Constitution. Anything short of this would prove disastrous: as our tax code shows (where many companies manage to pay little or no tax in a given year), entities will always find ways to skirt half-hearted laws.

But this is where a classic public policy problem lies in wait: people do not care about the fine print. Without the right attention and focus, individuals will be unaware of their rights concerning their data, even if we pass our own version of the GDPR. This has to be a systemic change, where we educate our kids and fellow citizens on what it means to have a digital profile. Teaching people how to use these new legislative and judicial mechanisms has to become part of our culture. Citizens need a salient and engaging way to understand how important their personal privacy is and what it means for it to be accessible to someone else. No more law enforcement getting your location data without a warrant. No more warrantless wiretaps on millions of Americans by the NSA. No more Cambridge Analytica. No more Russian bots influencing elections. What we as a populace need, more than ever, is a first real effort at regulating and understanding the quagmire that is data privacy.

Featured Image Source: Financial Times
