DTC475 BLOG by Joshua Jackson


Annotation blog 1.1 - June 28th 2020

As I worked my way through the reading for this week, the biggest thing that stood out to me was in the book Digital Media Ethics by Charles Ess. The question I found in the text is worth elaborating on: Ess asks whether cyberbullying is a “serious problem for those of us living in a digital age” (Ess, 2014). With so many social media platforms available in the palm of one’s hand, it is easy to see how privacy becomes an issue. Cyberbullying and cyberstalking are new forms of picking on someone, but with an audience that can be infinite, and one can see how this kind of bullying presents serious problems.

As I moved further into the reading, I couldn’t help but think about how Ess uses Amanda Todd as the basis for this part of the book. I remember her case because my oldest son was born in June of 2012. I used Facebook, Twitter, and Instagram to showcase my newborn and show my friends and family how happy I was to have this little guy in my life. October came by fast, and as I lay on the couch with my newborn, my wife and I started to hear about this girl who had been stalked and terrorized online through social media and ultimately committed suicide.

I think Ess shows us that no matter what this girl did, people could find her on social media, even after she changed her accounts. Since the majority of people nowadays communicate through social media, was she supposed to simply cut herself off from the world? After Amanda changed schools and moved to a new home, she was still targeted once she became active online again. How can we combat cyberbullies and stalkers if our digital footprint can be tracked anywhere we go? I fear this issue will never see progress unless these individuals are punished by law.

Combatting Cyberbullies

Work Cited

Ess, Charles. Digital Media Ethics. Polity Press, 2014.

Annotation blog 1.2 - July 5th 2020

For the blog this week, I have chosen to continue the discussion of cyberbullying, but first I want to touch on the topic of journalism. In chapter four of Digital Media Ethics, Charles Ess focuses on friendship and argues that the virtue ethics framework is the most helpful one for thinking about how to treat people more respectfully. Ess also touches on the rise of “citizen journalism,” which has become more widespread in today’s culture thanks to smart devices and the speed of information sharing. Anyone with a smartphone can now share the news of what’s happening around them at the touch of a button (Ess, 142).

Another topic that comes with being able to get close without actual physical contact is online “friendship,” which is sometimes used as a medium for numerous forms of cyberbullying (Ess, 122). I find it disturbing that individuals such as “Josh Evans” in the case of Megan Meier, or the tormentors in the Amanda Todd case, would create these caring, avatar-like personas and intentionally flirt to hook someone in, only to stalk the victim and push them toward ultimately ending their life.

As I have said before, I feel strongly that more monitoring and harsher punishments are needed to bring justice to the victims of this disgusting behavior. These types of cases make me wonder whether people have good intentions when socializing on the internet. I would think most do, but just as with a news report, it would be smart to check all sources for credibility.

Are online relationships healthy

Work Cited

Ess, Charles. Digital Media Ethics. Polity Press, 2014.

Annotation blog 2.1 - July 12th 2020

A few semesters ago, I took DTC 356, Information Structures, at WSUV. We talked about racism and negative SEO when it comes to people of color, especially black women, and the results were disturbing. We were instructed to open the Google search engine and look up the term “black hair.” The results were shocking: the search brought up mostly porn websites featuring black women, along with racially disturbing memes and blogs.

The reading this week reminded me of this when Safiya Umoja Noble talks about how algorithms across the web continue to promote oppression of certain ethnic groups, specifically African Americans. Noble references, for example, how Google’s systems “had automatically tagged African Americans as ‘apes’ and ‘animals’” (Noble, 6). When this was brought to Google’s attention, the company responded that the search was experiencing “glitches” and that nothing was wrong with its algorithm (Noble, 6).

I find it pretty disturbing that Algorithms of Oppression: How Search Engines Reinforce Racism was published in 2018, and Google still has racially targeted search results in 2020. A tech firm with some of the best coders in the world seriously needs to step up its game, or it may need to think about rehiring some SEO specialists to clean this filth up.

Here is a link to Google explaining its racist image search results: Are Google image search results racist

Work Cited

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press, 2018.

Annotation blog 2.2 - July 19th 2020

The most notable part of this week’s reading, chapters 4-6 of Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Noble, was how many search engines and databases, such as the image database ArtStor and Google, exhibit some type of racial bias and distort the information they provide. As I stated in my last post, I had no idea until we were prompted to do the “black hair” search that trusted search engines returned racist or otherwise unethical results. I have to think that this kind of racial bias is not just a glitch in the system, especially when it appears across multiple databases.

One of the biases I noticed while reading these chapters centers on the word “classification.” Noble describes how classification became a way to exclude Native Americans and African Americans, a practice that started in the eighteenth and nineteenth centuries. The section explains why certain ethnic groups are misrepresented when they are classified: the majority of people are classified according to racial stereotypes instead of factual data, and biased websites keep surfacing in searches because the SEO is written by a small number of people. What we end up with is racial and gender misrepresentation in search results, because the developers who design these systems can “prioritize hierarchical schemes that privilege certain types of information over others” (Noble, 139).

Microsoft tackles ‘horrifying’ Bing search results
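To make Noble’s point about developer-chosen hierarchies a little more concrete, here is a minimal, entirely hypothetical ranking sketch in Python. The pages, categories, and weights are invented for illustration and do not describe how Google, Bing, or ArtStor actually work; the sketch only shows how a few hand-written rules can decide which results rise to the top.

```python
# Toy ranking sketch (hypothetical; not any real search engine's algorithm).
# It only illustrates that whoever writes the ranking logic decides which
# signals count, and those choices skew what surfaces for a query.

from dataclasses import dataclass

@dataclass
class Page:
    title: str
    category: str      # label assigned by whoever built the classification scheme
    ad_revenue: float  # commercial signal, invented for this example
    relevance: float   # 0..1 topical match to the query

# Invented weights: this developer has chosen to privilege monetized categories.
CATEGORY_BOOST = {"commercial": 2.0, "tabloid": 1.5, "educational": 0.5}

def score(page: Page) -> float:
    boost = CATEGORY_BOOST.get(page.category, 1.0)
    return page.relevance * boost + 0.1 * page.ad_revenue

results = [
    Page("Natural hair care guide", "educational", 0.0, 0.9),
    Page("Click-bait image gallery", "tabloid", 5.0, 0.4),
    Page("Hair product storefront", "commercial", 8.0, 0.5),
]

# The most relevant page is the educational one, but the hand-picked weights
# push the monetized pages to the top of the list.
for page in sorted(results, key=score, reverse=True):
    print(f"{score(page):5.2f}  {page.title}")
```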

Work Cited

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press, 2018.

Annotation blog 3.1 - July 23rd 2020

After a hard read this week in Digital Labour and Karl Marx by Christian Fuchs, some interesting things come out of the text about the characteristics of social development: how our day-to-day technology has started to shape the way we act in society, and how the media, mainly the internet, has impacted everyone’s lives in this day and age. Throughout chapter two, Fuchs details how every new development and change in our lives is the outcome of class contradictions between capital and labour, not of uncontrollable, unbiased forces, as when he states that “the use of labour power is labour itself” (Fuchs, 29).

Fuchs stresses that technology such as social media is not the cause on its own, nor are the changes we have witnessed in technology over the last 20 years; rather, it is the decisions made by the billionaire elite who own and operate the tech firms, which serve as little more than a means of strengthening their wealth and power. I found this a tough read, and I am not sure I understand it correctly, but I think the author is trying to emphasize that, with the age of digital capitalism upon us, class struggle matters and requires new approaches and methods of struggle.

Labour concerns in a digital age

Work Cited

Fuchs, Christian. Digital Labour and Karl Marx. Routledge, 2015.

Annotation blog 3.2 - July 27th 2020

This week I took on The Black Box Society by Frank Pasquale for my blog choice. I found this reading very interesting, as the author sounds like a bit of a conspiracy theorist, and I am totally into that. Pasquale gives us a look into how major corporations use secret algorithmic mechanisms to aid in their daily decisions. Personal data collected via the internet and social media lets tech companies see the decisions we make so that they can write algorithms that influence our daily lives. Moreover, information gathered by major companies such as Netflix, Facebook, and Google is said to end up in government hands, as Pasquale states in chapter 2: “Laws prevent government itself from collecting certain types of information, but data brokers are not so constrained. And little stops the government from buying that information once it’s been collected” (Pasquale, 2016).

These multibillion-dollar companies build profiles of their users and sell their computer-spawned opinions to people worldwide. Algorithms assign us scores that are used to determine whether we are worthy of receiving credit, renting or buying a home, and even getting a job. As Pasquale notes, these scores are everywhere, yet how they are calculated is kept secret, which suggests that entire societies are playing by “black box” rules. This Big Data only benefits financiers and a handful of tech entrepreneurs, while everyday people remain oblivious to the rules by which their lives are played.

The Data Big Tech Companies Have On You
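As a thought experiment on what playing by “black box” rules might look like in code, here is a minimal, purely hypothetical Python sketch. None of the features, weights, or ranges come from Pasquale or from any real scoring system; the point is only that a few hidden lines of arithmetic can produce a life-shaping number that the person being scored cannot inspect or contest.

```python
# Hypothetical "black box" score, loosely inspired by Pasquale's argument.
# Every feature, weight, and range below is invented for illustration; real
# credit, tenant, or hiring scores are proprietary, which is exactly the point.

def black_box_score(profile: dict) -> int:
    """Combine broker-supplied signals into a single opaque number."""
    weights = {                          # hidden weights nobody outside can audit
        "zip_code_risk": -35,            # a proxy that can smuggle in neighborhood bias
        "late_payments": -20,
        "social_media_activity": 5,
        "shopping_history_value": 10,
    }
    base = 600
    raw = base + sum(weights[k] * profile.get(k, 0) for k in weights)
    return max(300, min(850, raw))       # clamp to a familiar-looking range

# The person being scored never sees which inputs mattered or why.
applicant = {
    "zip_code_risk": 2,
    "late_payments": 1,
    "social_media_activity": 4,
    "shopping_history_value": 3,
}
print(black_box_score(applicant))        # a single number, no explanation attached
```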

Work Cited

Pasquale, Frank. The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press, 2016.
