New techniques:

New surveillance techniques raise privacy concerns
http://www.pbs.org/newshour/bb/new-police-surveillance-techniques-raise-privacy-concerns/

With the power of facial recognition and high-tech surveillance, where to draw the line between safety and spying?
http://www.pbs.org/newshour/bb/power-facial-recognition-high-tech-surveillance-draw-line-safety-spying/

StingRay mobile phone tracking device (IMSI catcher)
https://www.youtube.com/watch?v=XTl_Clu-HKQ&nohtml5=False

How Verizon, AT&T are using "supercookies" to track you online https://www.youtube.com/watch?v=4XiUCRziSos&nohtml5=False

Biometrics: fingerprints, palm prints, iris scans, and facial recognition
https://www.fbibiospecs.cjis.gov/

Using personal data in the information marketplace:

Issues

The informational filter bubble: (from Wikipedia)

A filter bubble is a result of a personalized search in which a website algorithm selectively guesses what information a user would like to see based on information about the user (such as location, past click behavior, and search history); as a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. Prime examples are Google Personalized Search results and Facebook's personalized news stream. The term was coined by internet activist Eli Pariser in his book of the same name; according to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubbles. Pariser related an example in which one user searched Google for "BP" and got investment news about British Petroleum while another searcher got information about the Deepwater Horizon oil spill, and the two search results pages were "strikingly different." The bubble effect may have negative implications for civic discourse, according to Pariser, but there are contrasting views suggesting the effect is minimal and addressable.
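The filtering mechanism Pariser describes can be illustrated with a minimal ranking sketch. Everything here is invented for illustration (the scoring rule, field names, and sample data are not from any real search engine); the point is only to show how ranking by past clicks makes two users see "strikingly different" results for the same query.

```python
# Toy illustration of filter-bubble personalization: articles whose topic
# matches the user's past click history are ranked higher, so content the
# user disagrees with gradually drops out of view. All names, fields, and
# weights below are hypothetical.

def personalized_rank(articles, click_history):
    """Sort articles so topics the user already clicks on come first."""
    def score(article):
        # Count how often the user previously clicked this article's topic.
        return sum(1 for past_topic in click_history
                   if past_topic == article["topic"])
    return sorted(articles, key=score, reverse=True)

articles = [
    {"title": "BP investment outlook", "topic": "finance"},
    {"title": "Deepwater Horizon spill report", "topic": "environment"},
]

# Two users issue the same query ("BP") but have different histories.
investor = ["finance", "finance", "finance"]
activist = ["environment", "environment"]

print(personalized_rank(articles, investor)[0]["title"])  # finance story first
print(personalized_rank(articles, activist)[0]["title"])  # spill story first
```

Note that neither user ever asked to be filtered; the divergence is a side effect of the ranking rule alone.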

From chapter five, my book:

Damage to already disadvantaged populations.

More than just “catering to the rich”: rejecting the not-rich

Civil rights organizations are challenging the extensive uses of big data analysis by government and commercial entities alike, on the premise that the practices can inflict special harms on disadvantaged populations. From one point of view, data mining produces a sort of high-tech profiling: there is a risk that the marketplace will tailor itself only to those who can afford products and services and will specifically avoid those who cannot, including minorities and low-income citizens. Turow notes that

“marketers are increasingly using databases to determine whether to consider particular Americans to be targets or waste. Those considered waste are ignored or shunted to other products the marketers deem more relevant to their tastes or income. . . . The quiet use of databases that combine demographic information about individuals with widely circulated conclusions about what their clicks say about them present to advertisers personalized views of hundreds of millions of Americans every day without their knowledge. . . . The unrequested nature of the new media buying routines in the directions these activities are taking suggest that narrowed options and social discrimination might be better terms to describe what media buyers are actually casting” (The Daily You 88-9)

Wade Henderson, chief executive of the Leadership Conference on Civil and Human Rights, claims that “Big data has supercharged the potential for discrimination by corporations and the government in ways that victims don’t even see . . . This threatens to undermine the core civil rights protections guaranteed by the law in ways that were unimaginable even in the most recent past” (Fung, “Why Civil Rights Groups”). Data collection and the data marketplace potentially reach all Americans, but minorities and low-income citizens face special risks in this regard; at its core, the intersection of big data and civil rights is about targeting and separating one type of individual from another. Matthew Crain reminds us that “sorting is as much a practice of exclusion as inclusion, and the broader and more integrated the sorting becomes, the greater the potential for discriminatory practices” and that “A 2012 investigation by the Wall Street Journal showed that internet retailers routinely engage in price discrimination based on information obtained from consumer surveillance” (187-8).
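Turow's sorting of consumers into “targets or waste” can be sketched as a toy scoring rule. The profile fields, weights, and threshold below are all hypothetical, but they show how Crain's point works in practice: the same sorting that includes one consumer necessarily excludes another.

```python
# Toy sketch of marketer sorting: consumers whose predicted value falls below
# a threshold are labeled "waste" and excluded from offers. Every field,
# weight, and threshold here is invented for illustration.

def classify_consumer(profile, threshold=50):
    """Return 'target' or 'waste' from a crude predicted-value score."""
    score = (
        profile["income_estimate"] / 1000   # demographic data
        + 10 * profile["luxury_clicks"]     # inferred from click behavior
    )
    return "target" if score >= threshold else "waste"

affluent = {"income_estimate": 90_000, "luxury_clicks": 4}
low_income = {"income_estimate": 25_000, "luxury_clicks": 0}

print(classify_consumer(affluent))    # target: receives the best offers
print(classify_consumer(low_income))  # waste: ignored or shunted aside
```

The classification happens without the consumer's knowledge, which is what makes the resulting “narrowed options” invisible to those affected.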

Digital market manipulation

Ryan Calo has identified one of the most insidious and widespread negative consequences of putting data to purposes other than those for which it was collected: digital market manipulation. While not every market within a capitalist system must be fair (there is an expectation that some markets will be fairer than others and that some markets will treat some consumers in special ways), there is an expectation that the market in general will not be unfair to virtually everyone. Calo describes both subjective and objective harms to consumers from digital market manipulation.

First, subjectively,

the consumer has a vague sense that information is being collected and used to her disadvantage, but never truly knows how or when. In the digital market manipulation context, the consumer does not know whether the price she is being charged is the same as the one charged to someone else, or whether she would have saved money by using a different browser or purchasing the item on a different day. The consumer does not know whether updating his social network profile to reflect the death of a parent will later result in advertisements with a heart-wrenching father and son theme. She does not know whether the subtle difference in website layout represents a “morph” to her cognitive style aimed at upping her instinct to purchase or is just a figment of her imagination. (Calo 209)

This experience presents the consumer with a Kafkaesque dilemma brought about when data they have provided to specific entities is circulated among and misused by a broad range of unknown entities.

Objectively, digital market manipulation finds consumers at risk of paying more for products and services rather than less. This practice seems contrary to everyday experience as many online shoppers find ‘deals’ galore and would credit the online environment with enabling them to comparison shop for the lowest prices. In the limited context of their online shopping experience, these consumers may save money in the short term, thereby validating their perception. However, in the broader marketplace, consumers may well give back those gains because digital market manipulation “uses personal information to extract as much rent as possible from the consumer.” In the long run consumers lose because “the consumer is shedding information that, without her knowledge or against her wishes, will be used to charge her as much as possible, to sell her a product or service she does not need or needs less of, or to convince her in a way that she would find objectionable were she aware of the practice” (210).
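Calo's objective harm, charging each consumer as much as her data suggests she will bear, can be sketched as a toy personalized-pricing function. The signals and multipliers below are all invented for illustration; real systems infer such signals from exactly the kind of browser, location, and search data the passage describes.

```python
# Toy sketch of digital market manipulation: a personalized price inflated by
# surveillance-derived "willingness to pay" signals. Every signal name and
# multiplier here is hypothetical.

def personalized_price(base_price, signals):
    """Inflate the base price using inferred willingness-to-pay signals."""
    multiplier = 1.0
    if signals.get("affluent_zip"):
        multiplier += 0.10   # location suggests higher disposable income
    if signals.get("premium_browser"):
        multiplier += 0.05   # device/browser correlated with spending
    if signals.get("urgent_searches"):
        multiplier += 0.15   # repeat searches signal urgency, less comparison
    return round(base_price * multiplier, 2)

print(personalized_price(100.00, {}))                      # 100.0
print(personalized_price(100.00, {"affluent_zip": True,
                                  "urgent_searches": True}))  # 125.0
```

The consumer never sees the baseline price, which is why, as Calo notes, she cannot know whether a different browser or a different day would have saved her money.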

In short, the extensive exchange of private data within the data marketplace, exchanges that almost always include data sourced from government and commercial entities, has the strong potential to damage the very providers of that information, in violation of the third Fair Information Practice (FIP).