Jessica Barney
Professor H. Culik
ENGL 1190-C1622
30 March 2017
How Much Does Facebook Actually Know About Us?
As technology advances from day to day, so do the methods and techniques by which most decisions about our lives are made. Although technology may seem to bring ease to our lives, is it actually helping us or harming us? Much evidence suggests that many people prefer to use their social media accounts to catch up with friends, follow recent news, and so on. What these people may not realize is that all of this activity on their favorite social media networks is being data mined by the networks themselves, a practice that feeds what we now call digital redlining: "Digital redlining is the enforcement of class boundaries and/or discriminating against specific groups through technology, policy, practice, pedagogy, or investment decisions."
When an individual's data is collected unknowingly through a complex algorithm, we call that algorithm a black box. The opaque "black boxes" that most of our society participates in can come across as optimization tools within these social networks. What people may not be aware of is that when they adjust their personal preferences on these networks, the networks record that data and use it to categorize them, a practice that produces what researchers call "machine bias." "When the nature of discrimination comes about through personalization, it's hard to know you're being discriminated against," states Michael Tschantz, a Carnegie Mellon researcher (Angwin, Parris, and Mattu). Unfortunately, the process that is presented as personalizing and optimizing your network is now turning into the monetization of your data.
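To make the pattern concrete, here is a minimal sketch, in Python, of how recorded preference signals could be folded into a marketing category. Everything in it is hypothetical: the page names, the interest labels, and the mapping are invented to illustrate the "personalization as categorization" idea above, not to describe Facebook's actual pipeline.

```python
# Minimal, hypothetical sketch: recorded "likes" become a marketing category.
# The pages, labels, and mapping are invented for illustration only.
from collections import Counter

def categorize(liked_pages, page_to_interest):
    """Map a user's liked pages to the interest label that occurs most often."""
    interests = Counter(page_to_interest[page]
                        for page in liked_pages
                        if page in page_to_interest)
    return interests.most_common(1)[0][0] if interests else "uncategorized"

# A toy mapping from pages to advertiser-facing interest labels.
page_to_interest = {
    "LuxuryCars": "auto_intender",
    "GolfWeekly": "auto_intender",
    "CouponClips": "bargain_shopper",
}

# Adjusting "preferences" (liking pages) quietly updates the category.
print(categorize(["LuxuryCars", "GolfWeekly"], page_to_interest))  # auto_intender
```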
New evidence suggests that Facebook has turned to buying information from data brokers and selling it to major advertisers. For example, the company collects and sells individuals' personal information about what car they drive, their mortgage payments, and even their shopping habits. This tracked data can be used to target advertisements and to change how certain websites appear to different people as they navigate the web; different advertisements pop up because of different things an individual has searched for earlier. A recent patent filed by Facebook is what allows it to collect and use this information about people. The patent has also allowed Facebook to sell individuals' remaining personal information to banks, going as far as selling information about individuals' social media friends, likes, posts, and even their conversations on these networks.
Many users, however, are unaware of the access social media networks have to their information and of how it may resurface in everyday decisions, such as applying for a loan at the bank. Meyer explains, "When an individual applies for a loan, the lender examines the credit rating of members of the individual's social network who are connected to the individual...If the average credit rating of these members is at least a minimum credit score, the lender continues to process the loan application. Otherwise the loan application is rejected" (Meyer). That is, if too many of an individual's social media friends have poor credit scores, the individual can be rejected for a loan even if their own credit is fine; the sketch at the end of this discussion walks through exactly this test. This indirect discrimination plays a large part in digital redlining today. Where the old redlining denied people mortgages because of where they lived, the new redlining denies people loans because of where they live and even whom they associate with. Rieke notes, "a neutral policy or practice that disproportionately burdens a group of people on a prohibited basis is still illegal" (Rieke).
Many bank lenders may nonetheless defend this process, because around twenty percent of the U.S. population either has no credit file or has one so thin that the major U.S. bureaus cannot calculate a credit score from it. Deriving a score from other aspects of a person's life may be the only choice when no actual credit score exists. Should new forms of payment information, including a person's cable or utility bills, be sent to credit bureaus as factors when manufacturing a credit score for those who do not have one? Or should America stick to the mainstream credit scoring methods, such as the FICO approach, which presents itself as unbiased? The fact that average credit scores differ between racial groups reveals real, underlying inequalities within the credit scoring system. These scoring systems pretend to be "objective" or "neutral" predictors of responsible credit use; yet the various systems produce different results, and this makes me ask whether the differing results are really a window on the fact that each system is built on assumptions, values, and beliefs that are invisible. Pasquale emphasizes that the "black boxes" used to generate such scores are built upon assumptions, assumptions that need to be made public. The scores built on these assumptions tend to be called "consumer" or "marketing" scores.
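The patent's loan-screening logic, as Meyer summarizes it, is simple enough to sketch directly. The Python below is only an illustration of that described test; the function name, the 620 threshold, and the handling of applicants with no scored friends are my assumptions, not anything specified by Facebook.

```python
# Hypothetical sketch of the loan-screening step described in the patent
# (per Meyer's summary). The threshold and names are illustrative only.

def screen_loan_application(friend_credit_scores, minimum_credit_score=620):
    """Return True to continue processing the application, False to reject.

    friend_credit_scores: credit scores of the applicant's connected
    social-network members, as the patent describes.
    """
    if not friend_credit_scores:
        # The patent text quoted above does not cover this case; assume reject.
        return False
    average = sum(friend_credit_scores) / len(friend_credit_scores)
    # The patent's test: proceed only if the average meets the minimum score.
    return average >= minimum_credit_score

# The applicant's own credit never enters the test -- friends with poor
# scores sink the application even if the applicant's credit is fine.
print(screen_loan_application([580, 600, 610]))  # False: rejected
print(screen_loan_application([700, 720, 680]))  # True: continues
```

Notably, nothing in the test looks at the applicant themselves, which is exactly why the practice functions as indirect discrimination.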
Consumer and marketing scores like these commonly include measurements of online social media influence, and industry regulators have so far found the law ambiguous on how such financial assessments should be treated. Is it legal for banks and other companies to use personal information to determine important aspects of an individual's life? According to the Equal Credit Opportunity Act, it is illegal for any creditor to discriminate against any applicant on the basis of race, color, religion, national origin, sex, marital status, or age. This leaves us with the question of how Facebook and the banks are getting away with this use of personal information.
Facebook now offers a feature for advertisers, when producing an advertisement, called "Ethnic Affinity." The option allows advertisers to target users by their interests or background and to exclude those people from seeing certain ads. Although Facebook has claimed the tool is part of its "multicultural advertising" approach, in which only certain ethnic or racial groups can view certain advertisements, the company has been found to misuse the tool by assigning members an "Ethnic Affinity" based on the posts and pages they have liked or engaged with on Facebook. This "machine bias" tool collects multiple pieces of evidence when categorizing individuals into racial groups, including what they like and post, their location, their relationship status, and even their conversations with friends. By sorting through individuals' conversations, the network can pick up on slang usage and even the language being spoken, making it easier to categorize people by ethnicity, race, gender, and so on. Under the Fair Housing Act and the Civil Rights Act, advertisements that exclude people based on race, gender, and other sensitive factors are prohibited in housing and employment contexts. Facebook has since stated, "we take a strong stand against advertisers misusing our platform: our policies prohibit using our targeting options to discriminate and they require compliance with the law" (Angwin and Parris). Yet the company still enables the option to "narrow the audience," through which one could exclude African Americans, Asian Americans, and Hispanics when creating an advertisement.
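In principle, the "narrow the audience" option amounts to a filter over inferred affinity labels. The short sketch below, assuming a hypothetical user record with an inferred_affinity field, shows how such an exclusion would operate; it is modeled on ProPublica's description of the advertiser interface, not on Facebook's actual code.

```python
# Illustrative sketch of audience "narrowing" by inferred ethnic affinity.
# The record layout and labels are hypothetical assumptions for this example.

users = [
    {"id": 1, "inferred_affinity": "African American"},
    {"id": 2, "inferred_affinity": None},          # no affinity assigned
    {"id": 3, "inferred_affinity": "Hispanic"},
]

def narrow_audience(users, excluded_affinities):
    """Drop every user whose inferred affinity matches an excluded label."""
    return [u for u in users if u["inferred_affinity"] not in excluded_affinities]

# An advertiser excluding two affinity groups from a housing ad --
# precisely the kind of exclusion the Fair Housing Act prohibits.
audience = narrow_audience(users, {"African American", "Hispanic"})
print([u["id"] for u in audience])  # [2]
```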
If Facebook claims these discriminatory tools serve a useful purpose within its website, then why do they seem to do more harm than good? Any exploration of digital redlining leads to the discovery that surveillance and privacy form a unified web of control. Facebook's relentless recording of online behaviors, interests, and connections is important in and of itself, but it gains power because it operates in what Zuboff calls a system of "surveillance capitalism." Overall, Facebook's misuse of personal data and online activity is one of the clearest examples of the growth of digital redlining today.
Works Cited
Angwin, Julia, Terry Parris Jr., and Surya Mattu. "Breaking the Black Box: What Facebook Knows About You." ProPublica, 19 Oct. 2016. Web. 23 Mar. 2017.
Angwin, Julia, and Terry Parris Jr. "Facebook Lets Advertisers Exclude Users by Race." ProPublica, 08 Feb. 2017. Web. 23 Mar. 2017.
Chang, Alvin. "How the Internet Keeps Poor People in Poor Neighborhoods." Vox, Vox Media, 12 Dec. 2016. Web. 23 Mar. 2017.
Meyer, Robinson. "Could a Bank Deny Your Loan Based on Your Facebook Friends?" The Atlantic, Atlantic Media Company, 25 Sept. 2015. Web. 23 Mar. 2017.
Rieke, Aaron. "Knowing the Score: New Report Offers Tour of Financial Data, Underwriting, and Marketing." Equal Future, Upturn, 11 Apr. 2016. Web. 23 Mar. 2017.