Some Problems with Privacy
Our current privacy rules are simply not up to the task of anticipating and accommodating every demand for interpretation and application presented to them in the privacy debate. They are, to a very real degree, relics of the last century. Many of the relevant federal statutes and Supreme Court precedents date from the 1970s. So, rather than focus on law, this lecture will discuss privacy as a defined set of practices and processes.
Concepts of Privacy
● Our concepts of privacy today are largely embedded in a set of principles known as the Fair Information Practice Principles (FIPPs), which the United States developed in the early 1970s and which are the keystone of the Privacy Act of 1974.
● The principles state that a government should limit the collection of personal information to what is necessary, use it only for specific and limited purposes, be transparent and open with the public about how the information is collected and used, and allow the individual about whom the data is collected to see the data and correct it if necessary.
● New technologies, such as drones, biometrics, and big data collection and analysis, render these rules unworkable. A conscientious and fair application of such principles is, in many ways, fundamentally inconsistent with the way in which personal information is used in the context of counterterrorism or, for that matter, in commercial data analysis.
● Consider, for example, the purpose and use-specification principle, which states that data collected for one purpose should not be used for another. If universally applied, this principle would make it impossible for many sophisticated knowledge-discovery systems based on big data analysis to work well. Often, the data that provides us with the necessary missing link—from law enforcement to scientific research—is information that was collected for a different purpose and intended for a different use (see the first sketch at the end of this list).
● In a world of widely distributed networks—with massive data storage capacity and computational capacity—so much analysis becomes possible that the old principles no longer fit. What is needed, then, is a modern conception of privacy: one with enough flexibility to allow effective government action but with the surety necessary to protect against governmental abuse.
● Some people think that the old privacy rules should be reinforced. But technology is forcing change. The old ideas of collection and purpose limitations are obsolete and need to be replaced because they can’t withstand the onslaught of new privacy-invading technology that all, or most, people embrace.
● Instead, we should focus on how data is used. And, more importantly, we should recalibrate our laws so that our concern is not with uses that are mere “analyses,” but rather with uses that constitute the “imposition of adverse consequences.” The focus should turn to actual harm to an individual: If the analysis, for example, wrongly puts a person on a no-fly list or denies the person a job, that would be the sort of adverse consequence we would strive to avoid.
● There are a few building blocks for this idea. First, we really need to dig deep into what we mean by privacy. Privacy is a misnomer in some respects. What it reflects is a desire for the independence of personal activity—a form of autonomy.
● We protect that privacy in many ways. Sometimes, we do so through secrecy, which effectively obscures the observation of conduct and the identity of those engaging in the conduct. In other instances, we protect your autonomy directly by allowing you to exercise your own individual choice of conduct. Indeed, the whole point of that kind of privacy is to allow people to act as they wish in public, which is a different way of looking at privacy.
● Anonymity is a third concept of privacy, the one that is most relevant to our consideration of the changes brought about by new technology. It’s a kind of middle ground where observation is permitted—that is, we expose our actions in public—but where our identities and intentions are not ordinarily subject to close scrutiny.
● In everyday life, we leave behind an electronic data trail that is suffused with information of this middle ground sort, including bank account transactions, phone records, airplane reservations, and smart card travel logs.
● Likewise, the physical realm of biometrics and drones often involves the collection of publicly exposed information in which we have an anonymity-based privacy interest. These forms of information—partially public, but with an overlay of privacy—constitute the core of the transactions and information available to governments.
● The type of anonymity we have with respect to our activities in the electronic realm or under the gaze of unseen drones is not terribly different from the type of anonymity we have every day in our physical existence.
● Protecting the anonymity we value requires, in the first instance, defining it accurately. One might posit that anonymity is, in effect, the ability to walk through the world unexamined. That is, however, not strictly accurate, because our conduct is examined numerous times each day.
● Sometimes, the examination is by a private individual, such as the person sitting next to you on a train. Other routine examinations are by governmental authorities and commercial entities, such as the policeman who watches the street and the security camera that records people at the bank.
● So, what we really must mean by anonymity is not a pure form of privacy akin to secrecy. Rather, what we mean is that even though one’s conduct is examined routinely and regularly—both with and without one’s knowledge—nothing adverse should happen to you without good cause. That protection matters precisely because the veil of anonymity is now readily pierced by technology.
● So, to protect the privacy that new technology would otherwise compromise, we must formulate rules that prescribe limits on such unwanted and undesirable intrusions and guard against governmental abuse.
● The key to this conception is that privacy’s principal virtue is a limitation on consequence. In the context of governmental oversight, the questions to be asked of any new surveillance program are as follows: What is the consequence of identification? What is the trigger for that consequence? Who decides when the trigger is met?
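As a concrete illustration of the purpose- and use-specification tension discussed above, here is a minimal sketch in Python. The record fields, the purpose labels, and the single “collected_for” tag are assumptions made up for this example; they are not drawn from the Privacy Act or any real system.

```python
# Minimal, hypothetical sketch of strict use-specification: each record carries
# the purposes declared when it was collected, and any other use is refused.
from dataclasses import dataclass, field

@dataclass
class Record:
    subject: str
    description: str
    collected_for: set = field(default_factory=set)  # purposes declared at collection time

def may_use(record: Record, proposed_purpose: str) -> bool:
    """Under strict use-specification, data may be used only for a purpose
    it was originally collected for."""
    return proposed_purpose in record.collected_for

# A smart card travel log collected purely for fare billing...
travel = Record("alice", "smart card travel log, March", collected_for={"billing"})

print(may_use(travel, "billing"))           # True: the original purpose
print(may_use(travel, "pattern-analysis"))  # False: the secondary use is refused
```

The second check is exactly the kind of secondary, knowledge-discovery use that big data systems depend on; under a literal reading of the principle, it is simply blocked, even when that is where the missing link would come from.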
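The consequence-based alternative can be sketched the same way. The code below only dramatizes the three questions above (consequence, trigger, decider); the action names, the 0.95 threshold, and the “review board” role are invented for illustration.

```python
# Hypothetical sketch of consequence-based privacy: analysis alone is permitted
# and recorded, but imposing an adverse consequence requires a defined trigger
# and an accountable decider. All names and thresholds here are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Finding:
    person: str
    match_score: float  # output of some analytic system

ADVERSE_ACTIONS = {"no-fly listing", "job denial", "arrest referral"}

def apply_consequence(finding: Finding, action: str, approved_by: Optional[str]) -> str:
    if action not in ADVERSE_ACTIONS:
        # Mere analysis: record it, impose nothing.
        return f"analysis of {finding.person} recorded; no consequence imposed"
    if finding.match_score < 0.95:
        # Trigger: is the evidentiary threshold actually met?
        return f"{action} refused: trigger not met for {finding.person}"
    if approved_by is None:
        # Decider: someone identifiable must own the decision.
        return f"{action} refused: no accountable decider for {finding.person}"
    return f"{action} imposed on {finding.person}, approved by {approved_by}"

print(apply_consequence(Finding("bob", 0.80), "no-fly listing", None))
print(apply_consequence(Finding("bob", 0.97), "no-fly listing", "review board"))
```

Under this framing, the analysis itself is not the violation; the unjustified or unaccountable consequence is.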
Protecting Privacy
● The traditional way to protect privacy, as well as its essential component of anonymity, is with a system of rules and a system of oversight for compliance with those rules. Here, too, modifications need to be made in light of technological change.
● We have begun to develop new systems and structures to replace the old privacy systems. First, we are changing the way we protect privacy from a top-down process of rules to one in which the principal means of privacy protection is through institutional oversight.
● Institutions such as the privacy office of the Department of Homeland Security, the oversight offices created by the 2004 Intelligence Reform and Terrorism Prevention Act, and the independent Privacy and Civil Liberties Oversight Board are, in effect, internal watchdogs for privacy concerns. In addition, they naturally serve as a focus for external complaints, requiring them to exercise some of the functions of ombudsmen. In either capacity, they are in a position to influence and change how the government approaches the privacy of its citizens.
● Perhaps most significantly, the same surveillance systems our government uses to advance its interests are equally well suited to ensure that government officials comply with the limitations imposed on them in respect of individual privacy. Some surveillance systems can be uniquely well equipped to watch the watchers, and there are already indications that strong audit mechanisms, when in place, can be effective (see the sketch after this list).
● If we did reconfigure our conception of privacy, put the right control systems in place, and use a strong audit system for the government, we could be reasonably confident that a consequence-based system of privacy protection would move us toward a place where real legal protections could be maintained.
● It wouldn’t be perfect; there would always be mistakes and abuses. And it would be much more difficult to manage in the real world than the pure privacy protections we have in place now. But we need a solution that is more in sync with today’s technological realities, and these ideas should at least get us a little closer.
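One way to picture the “strong audit mechanisms” mentioned above is a tamper-evident log of every query an official runs, so that the watchers themselves leave a reviewable trail. The sketch below is a bare-bones assumed design using a SHA-256 hash chain; real oversight systems are considerably more elaborate.

```python
# Hypothetical sketch of a tamper-evident audit log: every query is appended
# with a hash that covers the previous entry, so edits or deletions are detectable.
import hashlib
import json
from datetime import datetime, timezone

class AuditLog:
    def __init__(self) -> None:
        self.entries: list = []
        self._last_hash = "0" * 64  # genesis value

    def record_query(self, official: str, subject: str, justification: str) -> None:
        entry = {
            "time": datetime.now(timezone.utc).isoformat(),
            "official": official,
            "subject": subject,
            "justification": justification,
            "prev": self._last_hash,
        }
        digest = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = digest
        self.entries.append(entry)
        self._last_hash = digest

    def verify(self) -> bool:
        """Recompute the chain; any edited or deleted entry breaks it."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["prev"] != prev or recomputed != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

log = AuditLog()
log.record_query("agent-017", "alice", "open counterterrorism case")
print(log.verify())  # True; tampering with any past entry would make this False
```

Because each entry's hash covers the previous one, quietly altering an old query invalidates everything recorded after it, which is what makes auditing the auditors credible.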
Privacy in the Commercial Sector
● In the commercial sector, we see a whole different set of challenges. To begin, the Constitution doesn’t apply to private commercial actors, so that’s not a potential avenue for protecting privacy.
● On the other hand, the field is wide open for Congress to regulate. Unlike government surveillance—where the purpose is, at least theoretically, to protect national security—when Congress steps in to limit commercial surveillance, the only negative consequence might be to interfere in the development of new technologies and markets.
● At this point, the value of commercial use of new technology has become so deeply embedded in the business model of corporate America that it will be difficult to modify. Commercial companies value the information they gather about you. It lets them know what to sell you or how to try to influence you.
● Some web services are free precisely because the accumulation of your data is their product. If we change that business model—and we can—then, in the end, you will have to pay for some web-based services that currently have no direct cost to you.
● However, in the commercial sphere, we are already moving toward a system that looks more like the “consequence” idea of privacy. In order to protect your privacy and prevent the misuse of your data, you need to know what will happen to it, and you need to be able to control the use of it. Slowly, the laws are moving that way.
● Increasingly, companies are being criticized for overly invasive uses of your data, and they are changing what they do. Throughout the world, but especially in Europe, free web services are being called to account and told to publicize what they do (and to build in options that allow you to manage how your data is collected and used). The commercial sector is resisting, but the trend is pretty clear; a sketch of what such user-managed controls might look like follows below.
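To make the idea of user-managed controls concrete, here is a hypothetical sketch of a per-purpose consent record that a service must consult before using data. The purpose names and the default-deny rule are assumptions for illustration, not a description of any actual regulation or product.

```python
# Hypothetical sketch of user-managed consent: a service checks a per-purpose
# consent record before using data, and the user can revoke a purpose at any time.
class ConsentRecord:
    def __init__(self, user: str) -> None:
        self.user = user
        self.granted: set = set()

    def grant(self, purpose: str) -> None:
        self.granted.add(purpose)

    def revoke(self, purpose: str) -> None:
        self.granted.discard(purpose)

    def allows(self, purpose: str) -> bool:
        # Default-deny: anything not explicitly granted is refused.
        return purpose in self.granted

consent = ConsentRecord("alice")
consent.grant("ad-personalization")
print(consent.allows("ad-personalization"))   # True: explicitly granted
consent.revoke("ad-personalization")
print(consent.allows("ad-personalization"))   # False: revoked
print(consent.allows("sale-to-third-party"))  # False: never granted
```

Whether the default should be deny or allow is, of course, exactly the policy question that companies and regulators are contesting.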
Transparency
● A flip side to the loss of privacy is a gain in transparency. This can be valuable to the extent that it gives citizens insight into the actions of their governments. And just as citizens find that they are losing their privacy, we are quickly coming to the point where governments won’t be able to keep secrets very well, either.
● The development of technology has made it very difficult, for example, for an undercover spy to move around with a false identity. While some governments might think that’s a problem, some people might believe it’s a good thing.
● Privacy and transparency are two sides of one coin, and we associate very different values with each of them, depending on how they are applied and to what end. We neither want to live in a world of one-way surveillance, nor do we wish to live in a world where some can live invisibly. Privacy requires a balance.
Questions to Consider
If you had to choose, which would you prefer for your life: the Ring of Gyges or the Panopticon?
You don’t have to choose, obviously, so how do you decide when privacy should prevail and when transparency should? And who gets to make that choice?
Source: https://www.scribd.com/document/345684358/9363-SurvillanceState