Blog Post 9: Down in the Valley

Hi all,

First, since a few of you have written to ask me about the “provisional intro/argument” for paper 3 for Monday, here’s a little more information on what to be thinking about: this is a chance for you to do some provisional thinking about what you want to say about the topic, and we’ll workshop these pieces in class Monday, so make sure to bring a hard copy. It doesn’t have to be a fully polished intro paragraph — the most important thing is to lay out an initial sense of what you might argue and focus on. And now, on to the regular blog…

This blog post both builds on some of our thinking about Netflix’s data profiling from our last class and looks back at some of our first material in the course through new eyes. Watson’s article challenges us to think about the complex ways in which our selves as data both resemble and distort our “real” offline selves. (As you read, take a look at the article itself online so you can access the links and images in it, some of which don’t come through in the print version.)

So this blog post is a chance to explore those issues first-hand. After you read Watson’s article, you should examine your own data profile on Facebook — to do this, look in this FAQ. You can look in the instructions under “How can I adjust how ads are targeted to me based on my activity on or off of Facebook?” to see how to find information about your ad settings overall, and “What are my ad preferences and how can I adjust them?” for instructions on how to find information on a particular ad.

Once you’ve explored your profile, do some writing to analyze it in connection with Watson’s article, quoting and discussing her writing to frame your analysis. How does your ad profile resemble and differ from how you imagine your own identity, online and offline? How does what you find about that profile confirm or complicate what Watson claims? What’s important about those relations — how do we have to think about ourselves online differently in light of what you see here?

Reminder: your response should go in the comments section for this post — click the “Leave a Comment” link at the top of the post. It should be at least 250 words, and is due by 11:59pm on Sunday, November 4th. If you have any questions, let me know via email.

12 thoughts on “Blog Post 9: Down in the Valley”

  1. Have you ever heard “you are what you eat”? Well, the science behind this saying is far more complicated than the phrase may seem, and the same goes for the saying “you are what you search.” While it is close to common knowledge that what you search online can trigger ads on social media and different sites, it is actually more than that. After the reading by Sara Watson, it has become clear that my digital identity is built from my real-life locations, my posts, and the ads I have reacted to positively. When all these aspects come together, my data doppelgänger (as Watson calls it) forms, but not so accurately. This identity may take our curious moments and turn them into distinctive parts of ourselves, which is really uncomfortable. Watson explains that “Personalized ads and experiences are supposed to reflect individuals, so when these systems miss their mark, they can interfere with a person’s sense of self. It’s hard to tell whether the algorithm doesn’t know us at all, or if it actually knows us better than we know ourselves.” This is the anxiety we get when an ad appears and we do not understand why we are targeted. For instance, when I opened my Facebook ad preferences, I saw that urban areas were listed as an interest and was genuinely surprised. I do receive a lot of listings for apartments and rentals in places like Chicago and Los Angeles, and it always confused me, but I paid it no mind until now. Thinking back to my search history, I remember an extensive project on urbanization for AP Environmental Science, but now my data doppelgänger is a city girl with multiple travel destinations in mind. While this may not be a danger, it is not how I would have initially described myself in reality. This development throws me a curveball about who I think I am, and technology should not make me question myself if the algorithm itself is inaccurate.
On the other hand, someone could feel that these ads help us discover parts of our identity we cannot recognize on our own. This gives the impression that our curious moments are actual aspects of who we are and should be accounted for. For me, this is all a little too uncomfortable. The important thing about the relationship between our doppelgängers and ourselves is to remember that we create our doppelgängers and we are the dominant one in this relationship. In order to stay on top of ourselves online, we need to remember that algorithms are not accurate, and therefore the ads do not predict or describe exactly who we are. Any discovery about ourselves comes from ourselves, and just because an ad helps shed light on something, it does not mean the data knows more about you than you do.


  2. Personally, targeted ads really annoy me. Why am I unable to search for lampshades on Google without subsequently being bombarded with advertisements for lamps? I previously didn’t think much of targeted ads and dismissed them as a means for companies to sell me items they believe I want. After reading Watson’s discourse on data doppelgängers, however, my anger toward Big Data and targeted advertising only increased. Speaking in terms of Facebook specifically, advertisers’ attempts to deduce what a consumer will want to purchase are often misguided. This is because the information we put on or provide to the internet only reflects our identities to a certain extent. In other words, who we are in real life can be drastically different from who we are online. I, for instance, haven’t cared about Facebook since I was in high school. As a result, all of my “likes” and status updates are grossly outdated. And while a number of my online searches (through Google mostly) do reflect my interests, a large number don’t. I search many things based purely on curiosity that have no correlation with my personality or fundamental wants and interests. In this vein, as Watson highlights, I feel a significant disconnect between who I am in real life and who Big Data thinks I am. And Facebook feeds me this online version of myself in order to fuel its own capitalist agenda, not caring whether this online identity is a realistic representation of me as long as it continues to generate revenue. Facebook isn’t advertising lampshades on my newsfeed to make my shopping experience more convenient; it’s doing so because, based on its flawed data collection, it appears that advertising this specific commodity will make the most money. In some instances, Facebook advertisements even unnecessarily cause people to question their own identities just for the sake of ad revenue (as Watson points out with the “Flirty Plus-Sized Outfits” example).


  3. I have never been creeped out by the data gathering that occurs on sites like Facebook. I see it as just one condition that comes with being able to use these sites. And because these sites make life so much easier and more comfortable, I believe it is a fair price to pay. When I scroll through my Facebook feed, every ad is non-specific to my likes. To big companies trying to advertise, we are all just a number and a possible way to make money; there is no individuality in their eyes. Watson’s ideas have me thinking, however. She brings up an instance of a father who had lost his daughter receiving a piece of mail with his name and specific information about his daughter’s death on it. Although I have not heard of many cases like this, it makes me wonder how personal these companies can get. Do they actually stop and focus on Will Snyder, or am I just one of millions in a database? Watson also brings up the idea of the Uncanny Wall. She says we hit the wall when “we can’t distinguish whether something is broadly or very personally targeted to us.” This makes a lot of sense; people obviously get creeped out when they feel these companies know too much about them. I believe, however, that every ad is broadly targeted. These companies just don’t have the time or sufficiently advanced algorithms to personally profile every single data set. So while I understand the fears behind the advanced knowledge of these ad generators, I am never very creeped out by them.


  4. After reading Watson’s article, I found that my ad profile both resembles and differs drastically from my own identity, online and offline. Watson mentions the idea of the “uncanny valley” to describe the unsettling feeling some technology gives us (Watson 4). The term fits very closely with how I react sometimes when ads pop up on my Facebook page. On the “examining my own data profile” Facebook page, it was very clear how many third parties, like Acxiom, Oracle Data Cloud (formerly DLX), Epsilon, Experian, and Quantium, are involved in creating and customizing the ad profiles for our Facebook accounts. This also connects to the idea that through “digital traces assembled by personalization engines, our most intimate behaviors are uncovered and reflected back to us” (Watson 6). It is also interesting that the “examining my own data profile” Facebook page states, “so if you don’t want us to use the websites and apps you use to show you more relevant ads, we won’t.” I thought this was worth noting because it is ironic that these social media platforms explicitly write out this specific policy, yet we are still stuck with various ads and commercials throughout our Facebook pages. I also find it interesting that Watson states that “data isn’t the first tool for self-reflection to produce a sense of the uncanny” (Watson 6), because this helps us understand that there is much more than just third parties involved in the uncanny valley we experience when ads matching our interests pop up on our Facebook pages.


  5. To begin with, I think my identity, online and offline, resembles my ad profile on Facebook and other social media platforms. I always go through beauty and fashion profiles and shops, and it is my activity log that determines how interested I appear to be in them. One thing I noticed is that the ads I get on Facebook are more accurate and represent me better than the ads I get on Instagram. I always feel annoyed about the ads I get on Instagram because sometimes they feel irrelevant and do not connect to me personally. As the author mentions, “It’s hard to tell whether the algorithm doesn’t know us at all, or if it actually knows us better than we know ourselves” (3/11). I do not think that the data and algorithms used know us better than we know ourselves, yet they know us and our interests very well. I think that is where the uncanny valley comes in. As she describes it, “Uncanny personalization occurs when the data is both too close and not quite close enough to what we know about ourselves” (5/11). Sometimes I feel at ease when I see ads on Facebook that resemble my interests and activity, yet when it’s a mix of ads that I can and cannot connect to, I feel uncomfortable and ask myself, “Where did that come from?” or “Why did I get this ad?” This shows that ads can give us an unsettling feeling when they reveal how much data these platforms have about us. Again, as the author mentions, “The only way to assuage our anxiety about the uncanniness of personal data is to develop casual explanations that link our digital experiences with the data they are based upon” (9/11). I think if we become aware of where the ads we see come from, we wouldn’t be as creeped out. It is important that we know and recognize our digital activities so that we have more control over ourselves and our profiles than our data has over us.


  6. As we dive deeper into the digital era, data trackers, recording the history of our every click, have complicated what it means to browse the internet. These data trackers use complex algorithms to sort and analyze this history in order to craft a personal profile for every internet user. The consequences of the use of these personal profiles by websites like Facebook are discussed in Sara M. Watson’s article, “Data Doppelgängers and the Uncanny Valley of Personalization.” Because personalized ads are intended to mirror individuals, Watson argues that they can “interfere with a person’s sense of self” (3). In specifying this statement, Watson reasons that when personalized ads misrepresent us, “it’s hard to tell whether the algorithm doesn’t know us at all, or if it actually knows us better than we know ourselves” (3). This reasoning exposes the point that it is sometimes difficult for us to believe that a complex algorithm with the sole purpose of getting to know us might actually not know us. Rather than questioning the reliability of these algorithms, we often question how well we know ourselves, thus interfering with our sense of self. Keeping Watson’s arguments in mind when analyzing my own Facebook advertisement profile, I began to question the accuracy of her comments on the ability of personalized ads to “interfere with a person’s sense of self.” Hoping to stumble upon an ad that largely misrepresents my sense of self, I scrolled through my Facebook feed and clicked on the “why am I seeing this?” option for every ad I came across. I saw a pattern of two major ad categories: politics and sports/outdoors.
Political ads were shown on the basis of my age, location, and the placement of my profile in a target group called “US politics (very liberal).” Unlike the political ads that targeted me based on my demographics and viewpoints, ads showing sports gear targeted me based on my interests in athletics and the outdoors. These two trends were no surprise to me, as I consider myself fairly liberal and outdoorsy. While the majority of the ads further confirmed my sense of self, a few obscure advertisements made me question the accuracy of the algorithms. For example, an advertisement came up for a website that sells glasses, since I was labeled as someone who “enjoys online shopping.” I rarely shop online, my eyesight is perfect, and I’ve never bought or browsed glasses online, so this label felt inaccurate to me. The way I reacted to this outlandish personalized ad made Watson’s claims seem more like a generalization based on several instances than a general truth. Rather than interfering with my sense of self and subsequently making me question my online shopping habits, as Watson proposed, the advertisement for glasses made me question the reliability of Facebook’s labeling process. Although it is difficult to predict how I would react to other personalized advertisements that misrepresent my sense of self, I doubt something as trivial as an ad would cause me to jump to the conclusion that a piece of computer software knows me better than I know myself. Individuals should use instances like this as motivation to speak out against ad tracking, instead of as an additional reason to question who they really are. There are plenty of valid reasons to question your sense of self; an algorithm programmed to make snap judgments should not be one of them.


  7. Something that had not really occurred to me before reading this article is Watson’s claim that “Personalization appeals to a Western, egocentric belief in individualism.” I found this interesting because I always understood the sharing of information to create ads as a means of making money. However, I never really considered Western individualist beliefs to be so integral to it. She goes on to write, “Personalized ads and experiences are supposed to reflect individuals, so when these systems miss their mark, they can interfere with a person’s sense of self.” I find it really interesting that a person’s sense of self is connected with advertisements. When I scrolled through my own ad preferences, I was surprised by how correct and how incorrect some of my “interests” were. Under the News and Entertainment section, the first interest to show up is “halloween,” followed by Vice (makes a lot of sense), followed by Weezer. That last one was definitely a surprise. It kind of makes me wonder: do I actually like Weezer more than I thought? Does the internet know more about me than I know about me?
    Then I realized that this is a result of my data doppelgänger. It is identical to select parts of my non-virtual self. I don’t feel uncomfortable with this. I think it invites self-reflection in a strange way.


  8. It is a fact known by many Facebook users that ads are shown based on their internet search history. In order to turn this off or edit it, the user must know to go into Ad Preferences within Facebook’s settings. I find that the ads on Facebook are usually broadly targeted to me and related to me in some way, but usually they are nothing I would be very interested in. The ads shown to me confirm Watson’s claims, although this doesn’t mean that all the ads shown to me appeal to me broadly. My thought is that people will be shown a variety of ads, some of which relate directly to them and others that have only a slight relation to them and their interests. Watson says, “The doppelgänger invites the strange possibility for self-observation and self-criticism” (Watson 12). This quote shows how broad ads have their own marketing strategies: the audience can question whether an ad actually applies to them, and this causes the ad to stick in some people’s minds depending on their mindsets. Knowing how ads are marketed makes people more aware, and many people analyze these ads and wonder how an ad made it to their screen. This also presents the idea that Facebook and the internet have a better idea of who we are as people than we do. The concept of ads has shifted tremendously as the internet has risen to become the most powerful medium for spreading information.


  9. Although Facebook uses personal online activity to promote advertisements, and some of the information is very far off and/or does not fully grasp someone’s personality, most of it is very accurate and revealing. Sara Watson argues that online advertisements are going too far and that as a society we need to do more to protect ourselves. When Watson states, “It’s hard to tell whether the algorithm doesn’t know us at all, or if it actually knows us better than we know ourselves” (1), she illustrates that Facebook ads can reveal very personal or obscure interests based on ambiguous online activity. People could argue that these ads rest on incorrect inferences and don’t illustrate who you really are, but it could be more strongly argued that the information Facebook uses finds the ad best suited to the person. These ads can be very revealing. On my own Facebook, there are some ads based on things I have looked up for school projects that don’t truly represent me, but there are also ads for law enforcement and educational work, both of which I have thought about doing, or did when I became a substitute teacher, but neither of which I have really talked about or researched extensively. Stating that “our digital data echoes our actual tastes and activities, often with higher fidelity than our own memories can” (1), Watson emphasizes the idea that our online identity does not forget the way humans can. What we search and who we want to become at various points in our lives shape our digital identity. Our online identities show us our whole selves, not just our evolved selves. Who we were five years ago and who we are now are drastically different, but online we can find the evolution of our interests. It is beyond accurate because it doesn’t forget.


  10. In the article “Data Doppelgängers and the Uncanny Valley of Personalization,” Sara M. Watson takes a close look at the social dynamics involved in our increasing exposure to online ads. As digital media becomes more integrated with the world, there is an apparent movement of firms and advertisers toward capitalizing on and maximizing commercial reach through digitally focused advertising tools. With this, the limits defining the extent of consumer information that can be legally and ethically used by advertisers become more unclear. More of the content we create, use, and share online is being used to curate and personalize targeted ads that, more often than not, create controversies around the misinterpretation of personal data, as Watson discusses in the article. Watson asserts: “Personalization holds up a data mirror to the self, collapsing the distance between subject and object, and yet it’s impossible for us to face our data doppelgänger with complete knowledge.” Contact with personalized online ads is shown to accommodate elements of self-reflection. This is a valuable angle that is often ignored in the polarized conversation about the outcomes of advertising. Conceptually, whether ads generate inaccurate depictions of consumers or realistic representations of online users, they allow one to engage with online-inflected projections of the self. While looking at the basis on which Facebook personalizes ads for my viewing, I was both unsettled and fascinated. Some of the references shown did not depict my real interests, like comics and Disney Channel. These were most likely categorized and sorted from occasional interactions with content that could be associated with those categories. In this case, I didn’t find more controversial associations like the ones discussed in the article about anorexia. On the other hand, I was shown to be interested in LGBT+ related content such as RuPaul’s Drag Race and OUT Magazine.
Though perhaps I wasn’t surprised by this association, it still puzzled me to know that someone else is allowed to make what is essentially an assumption about how and where my character and sense of self are expressed. Despite any sentiments, I was faced with perceptions of my character that sometimes closely resembled a mirror, and often did not. Such dynamics can potentially introduce new implications for how consumers choose to express themselves online. As more of our online activity is used to create these fragmented perceptions of our sense of self, we are obliged to face reflections that, whether accurate or not, can allow for self-improvement and exploration.


  11. After reading Watson’s article and analyzing my own profile, I quickly found myself in a situation very similar to what Watson describes in the text: “Data tracking and personalized advertising is often described as ‘creepy.’ Personalized ads and experiences are supposed to reflect individuals, so when these systems miss their mark, they can interfere with a person’s sense of self. It’s hard to tell whether the algorithm doesn’t know us at all, or if it actually knows us better than we know ourselves.” This quote resonated with me because, after looking over the ad configuration based on my clicks on and off Facebook, I realized that my listed interests were one hundred percent spot on with the things I was interested in in real life. Everything from the movies I was interested in to the field of work I was looking to pursue after graduation was intermingled within the interests, likes, and other categories that decided which ads Facebook showed me. This was nerve-wracking at first, not just because this social media platform may know me better than some of my friends here on campus, but more so because many of the interests listed for my profile were things that I never directly posted or referenced on Facebook. This caused me to revisit the “Do Not Track” video series, which discussed internet cookies and the big data corporations that use them to create an online identity for their users for marketing purposes. I also began to contemplate how much of our information is taken from other places we visit on the web. After re-watching those videos and reading the article, it was very clear that I had no idea what information is being gathered from me and my internet identity, and, even scarier, until this point I had no idea this was happening every time I went on the web.
In many ways this confirmed Watson’s argument. The algorithms and programs used to take all the data about our searches and overall use of the internet and create a digital clone of our lives, on and off technology, gave me a feeling of uneasiness, or as Freud called it, the “unheimlich” feeling. This was not just because the internet had a copy of my identity, but because it was being sold to other corporations, or vice versa, in order to influence me. The exact influence of the ads made me feel even more uneasy: just as many Facebook users know that Facebook uses their interests to make the platform more profitable, many users do not know that the information Facebook uses to do so comes from more than just Facebook. I don’t know if the ads are just there to sell me things or to push me into a subject position. This is worrisome to me because, just as Facebook makes ads based on what I like, these algorithms present me with ads and news based on what they think I should or would like as well. Meaning, if my likes are similar to those of the mass of eighteen- to twenty-one-year-old black men from Massachusetts, my Facebook feed will mirror this. This is a complex situation because the Facebook feed has the power to decide what people of a certain background should or should not do, like, or feel about many of the issues in our everyday lives. It is also complex because, due to modern technology and filters, the Facebook feed and ads could be extremely biased, giving you only news that is relevant to the background your Facebook “make-up” suggests you fall into, which could influence the lives of certain demographics to oppose or support each other. This ultimately makes me wonder: where and when does the gathering of information end for a user?
Do these companies have access to our microphones, cameras, and other parts of our devices, and if so, does this mean they have the freedom to know almost everything about us, so long as we get more personalized ads in exchange for our privacy?


  12. Upon reading this article by Sara Watson, I began to realize how much information Facebook can gather about us. When I looked at my profile and the ads that were shown, I was surprised. I was on the brink of that uncanny valley idea. I felt this way because the ads were about things I wouldn’t normally find appealing but strangely did. Sara states, “In contrast, personally targeted digital experiences present a likeness of our needs and wants, but the contours of our data are obscured by a black box of algorithms. Based on an unknown set of prior behaviors, these systems anticipate intentions we might not even know we have.” Facebook began to anticipate, almost accurately I might add, my desires and passions. I had recently started talking about realtors with my friends, and I saw about four realtor-related ad preferences pop up in my Facebook ad section. This confirms what Watson was saying about how it anticipates intentions that we didn’t know we had, like my interest in real estate. It was merely a thought at the time until I saw ads about it, which sparked interest. It also made connections to family members with whom my digital self has no relationship. This makes me wonder if the use of these ads and the info we give Facebook makes it so that they can use the digital to figure out the actual.

