Cutting-edge surveillance tech took center stage at the Consumer Electronics Show in Las Vegas, Nevada this past week. But even as many tech enthusiasts praised the technology as something to marvel at, privacy advocates voiced concern.
While the use of security cameras is booming, critics point to privacy issues with various aspects of surveillance, such as facial recognition and the question of consent. Products on display on the CES show floor have had security and privacy issues of their own.
CES Surveillance Hype Worries Privacy Advocates
A growing convergence effect that should give each of us significant pause is the fading of privacy. There are three good reasons to pay attention to privacy issues. First: we, personally, are not immune to privacy violations. Second: we, as security practitioners and members of the security industry, are designing, manufacturing, installing, and operating systems that lessen privacy. I am sure that among the more than 30,000 readers of this magazine there are some knowledgeable privacy advocates. The rest of us, however, have a third reason to pay attention: we, personally and professionally, are less informed about privacy issues than we realize. That makes us, our systems, and ultimately our customers more vulnerable than they should be. And that directly contradicts our purpose as security professionals.
Since 2001, the State of California has enacted 49 privacy laws. One of them (Assembly Bill 2840) prohibits the improper use of electronic surveillance equipment by rental car companies. The bill defines Electronic Surveillance Technology (EST) as a technological method or system used to observe, monitor, or collect information, such as telematics, global positioning systems, wireless technology, and location-based technologies. A separate law prohibits the use of "black box" event data recorders in vehicles without explicit disclosure, forbids the release of data outside the original scope and purpose, and forbids the release of identifying information when data are shared with vehicle safety organizations.
The common denominator in these laws is that they forbid using any means of electronic surveillance for any purpose other than the originally intended, customer-accepted one. Manufacturers and service companies dislike such legislation because of the high cost of retrofitting privacy controls into their physical and electronic systems, as well as into their administrative systems.
Privacy Pendulum
Robert Ellis Smith is the publisher of the Privacy Journal (www.privacyjournal.net), the oldest and most authoritative publication on privacy in the world. He is also the author of several books about privacy, including the 387-page Ben Franklin's Web Site: Privacy and Curiosity from Plymouth Rock to the Internet. The book chronicles the state of privacy and surveillance and how they relate to living conditions and community values, beginning with the Puritan settlements in New England and continuing up to the present day.
Some degree of watchfulness (i.e., surveillance) is needed in a free society in order to guard against criminal acts and criminal individuals and groups. Since the times of the early settlers in North America, there have been neighborhood watches, constables, and private security under one name or another. There has also been an ongoing tug-of-war between the conflicting objectives of privacy and security, with people usually willing to relinquish some degree of privacy to obtain some additional measure of security and safety.
Lawyers are already paying close attention. For example, Senator John Edwards of North Carolina, a prominent attorney, called for a bipartisan commission to examine how surveillance technologies affect privacy. In a related press release, Edwards said that since September 11 the F.B.I. and local police departments "have increased experimentation with video and Internet surveillance, X-ray screening, facial identification and other investigative tools." One example he cited was a telephone-booth-sized X-ray scanner at Orlando International Airport in Florida that was "the equivalent of an electronic strip search, revealing the naked body along with any concealed weapons." Edwards pointed out that a simple programming change could scramble images of body parts while still revealing concealed weapons. The Tech Law Journal noted the press release that same day, both in a daily email alert and in a permanent posting on its web site.
Final Comment
Security is very much about trade-offs. Sometimes we accept a little less convenience for more security, and sometimes we accept a little less security for more convenience. What we do depends on the security problems or threats that exist at the time of the decision, or on our current perception or estimation of the risks involved. Some privacy advocates reprimand people who would trade less privacy for more security for "selling away their rights." This author believes such statements to be a take-off on Benjamin Franklin's words: "They that can give up essential liberty to obtain a little temporary safety, deserve neither liberty nor safety." Giving up "essential liberty" and giving up "a little bit of privacy" are not the same thing. First of all, the right to privacy (as discussed here) means the right to determine under what conditions, and to whom, personal information is released. Releasing information is not giving up that right; it is exercising it.
So let us imagine that, five years from now, GDPR enforcement concerning Big Tech has been centralized in the hands of a powerful, well-financed, and well-staffed regulatory body. Will we then, finally, move from a world of commercial surveillance into a world of perfect privacy and data autonomy?
Now, there are two narratives about online ads that seldom meet. On the one hand, academics and privacy and digital rights advocates tell the story of how personalized ads influence our minds and behavior, stripping us of autonomy. Because ads are based on data about us and millions of others, their timing, content, and context can be tuned so well as to influence purchasing behavior to a degree that threatens human freedom. This also provides an incentive to keep collecting all that data.
While technology pioneers advocate for greater acceptance of microchipping, some serious worries must first be reckoned with. Harm to the human body, privacy concerns, and theft are very real threats that come part and parcel with human microchips.
Digital technologies are essential to establishing new forms of dominance through drones and surveillance systems; these forms have significant effects on individuality, privacy, democracy, and American foreign policy; and popular culture registers how the uses of drone technologies for aesthetic, educational, and governmental purposes raise questions about the exercise of individual, governmental, and social power. By extending computational methodologies in the digital humanities like macroanalysis and distant reading in the context of drones and surveillance, this article demonstrates how drone technologies alter established notions of war and peace, guilt and innocence, privacy and the common good; in doing so, the paper connects postcolonial studies to the digital humanities.
I make three central arguments in this paper: the use of digital technologies is essential to establishing new forms of dominance through drones (unmanned aerial vehicles, UAVs) and surveillance systems; these forms have significant effects on individuality, privacy, democracy, and American foreign policy; and popular culture registers how the use of drone technologies for aesthetic, educational, and governmental purposes raises complex questions about the exercise of individual, governmental, and social power. In what follows, I first highlight the cultural turn in the digital humanities in order to open up a critical terrain to study the militarized and civilian uses of drones and the surveillance cultures they engender; second, I focus on drones as disruptive technologies that thrive on surveillance regimes; and third, I study the creative appropriations of drone technologies by artists and singers seeking to counter the global reach of digital networks that enable some nation-states to wield power over largely post-colonial societies, and to control the social, legal, and political meanings of innocence and guilt, privacy and freedom. Taken together, these approaches help us infuse cultural criticism into the digital humanities and connect postcolonial studies with the digital humanities.