Week 3: Algorithms. Good? Bad? Ugly?



Hello!

As I've been thinking about social media and the internet, I started to get curious about algorithms. This word pops up often in discussions about the internet, and I do not think I had the strongest understanding of what "the algorithm" is at the beginning of this exploration. Before I started digging in, I would have described algorithms as sites recording patterns to better tailor ads and content to the user. This has helpful, benign, and also sinister implications. My understanding is that companies can easily justify tracking data and activity for the algorithm because they can claim it enhances users' experiences. I started my research with this article. It is very interesting because it looks at the basics of algorithms and also at how to manipulate them for yourself.



(Photo Credit: Invisibly )


Now, I think it would be helpful to point out some other articles that discuss this topic. I read an article called "Everything you need to know about social media algorithms," which does a good job of providing a solid foundation for understanding these topics. This article even specifies the differences between major social media platforms and how they build their algorithms. After reading it, it seems my limited understanding was not wrong. One takeaway that was new to me: an advantage of algorithms is that there is simply too much media and information to consume, and a tailored feed cuts out media that may not interest you. I also appreciated that this article gave tips on how to make the algorithm work for you: be thoughtful about what you like and interact with, and learn how to use hashtags. It also explains how to use the algorithm to promote your own content.





(Photo Credit: Stanford Online)

Next, shifting gears, I want to look at something specific: "angry reactions" on Facebook. This article shares some really interesting information about the topic. It gives a timeline of the "angry react" and the concerns it raised, and it points out that staff at Facebook had concerns and ethical qualms about promoting posts that received that reaction, since it usually coincided with upsetting material and hate. It also notes that Mark Zuckerberg once responded that "angry reactions" could be used in place of a dislike button. Part of the issue with Facebook reacts is that people view them differently. I have seen posts that are memes on social issues, and people will laugh react because they find the meme funny, but people will also laugh react to mock the meme's position. This creates a murky view of what people actually think about the content. Since it is not clear to humans, it must be hard for a computer to take into account the different meanings of reactions. This article suggests that algorithms can get messy because a human perspective is often missing from the equation.
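To make the ranking concern concrete, here is a hypothetical sketch of how weighting reactions unevenly can push a post with fewer, angrier reactions above a calmer, more popular one. The weights, function names, and numbers below are invented for illustration and are not Facebook's actual algorithm or values:

```python
# Hypothetical feed-ranking sketch. The reaction weights are made up
# for illustration; they do not reflect Facebook's real system.
REACTION_WEIGHTS = {"like": 1, "love": 2, "laugh": 2, "angry": 5}

def engagement_score(reactions):
    """Sum weighted reaction counts; heavier weights boost a post's rank."""
    return sum(REACTION_WEIGHTS.get(name, 1) * count
               for name, count in reactions.items())

calm_post = {"like": 100, "love": 20}   # widely liked, low controversy
angry_post = {"like": 10, "angry": 30}  # far fewer reactions overall

# The angry post outranks the calm one despite much less total engagement.
print(engagement_score(calm_post))   # 140
print(engagement_score(angry_post))  # 160
```

Even in this toy version, the ranking cannot tell whether a reaction meant agreement, mockery, or outrage at the post itself, which is exactly the ambiguity the article describes.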

(Image Credit: Rozelt)



Another article I read was this KQED article, "How Much Do Social Media Algorithms Influence Your Worldview?" It was not afraid to present some of the darker parts of this conversation. One quote from this article stuck out:

"These algorithms are designed to keep you on a given social app for as long as possible. They do that by learning what you like and showing you more of that content. "

Arguably, this matched my understanding of algorithms, but I had overlooked that this is a tool to keep you engaged and online. This opens up questions about learning how to limit social media consumption and about ethical practice (both huge topics that may be explored in future posts). The other takeaway from this article is that algorithms can quickly create an echo chamber. An echo chamber can be dangerous because we start to believe not only that we are right but also that everyone else shares our worldview. This leads to more division in an already divided world.
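The "show you more of what you like" loop from the quote above can be simulated in a few lines. This is a toy model, not any platform's real recommender: the topics, numbers, and click behavior are all invented, but it shows how a feed that reinforces past engagement narrows toward a single topic over time.

```python
import random
from collections import Counter

# Toy feedback-loop simulation (all values invented for illustration):
# the feed samples topics in proportion to past engagement, so the one
# topic the user clicks on gradually crowds out everything else.
random.seed(0)

topics = ["news", "sports", "memes", "cooking"]
engagement = Counter({t: 1 for t in topics})  # start with a flat profile

def pick_post():
    """Sample a topic weighted by past engagement (more clicks -> shown more)."""
    weights = [engagement[t] for t in topics]
    return random.choices(topics, weights=weights)[0]

shown = Counter()
for _ in range(1000):
    topic = pick_post()
    shown[topic] += 1
    if topic == "memes":        # this user only ever clicks on memes...
        engagement[topic] += 1  # ...so memes get weighted ever higher

print(shown.most_common())  # memes dominate the simulated feed
```

The narrowing happens without any intent on the user's part, which is what makes the echo chamber easy to fall into and hard to notice.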

Hopefully, you find all the linked articles interesting. I am curious how others feel about algorithms, and how much did you know before really looking into them?

Now, since this blog is a space for me to explore social media and how I can use it as a tool in my career, I thought about what all this means to me. For one, I have learned that privacy is hard to maintain online, and algorithms make it even more difficult. I thought I was careful before, but I will be making an even more conscious effort to self-monitor what I react to online and how. I want to have some control over what I see. Additionally, even though echo chambers are comfortable, I want to frequently question whether I am stuck in one. Just taking steps to expose myself to other opinions and views could break that wall.


(Image Credit: Anyaberkut)


All of this information is important to keep in mind for my future career. As an information professional I want to make sure I am aware of privacy issues and algorithms so I can teach about them. Whether I teach information literacy at a college library or hold technology workshops at a public library I want to stay on top of this information. I want to understand for myself so I can explain it to others and they can more safely and consciously use the internet.

Comments

  1. Knowledge is power! Even if we choose to keep doing what we are doing, understanding the role algorithms play and helping others to understand them is the first step to a more digitally literate society.

    1. I completely agree! Looking at this topic made me realize that we should be actively teaching things like algorithms to students or, as a librarian, to the public. People, like me at the beginning of this week, may only have a vague understanding of these topics, and it's important that people have technology and information literacy.

  2. Hi Kira! Your mention of echo chambers really is quite a fascinating point. You would think that with access to so many varying world views and mindsets, it would be easy for people to branch out in their knowledge. Yet, this is not how it works out. Both in person and on virtual platforms, we tend to gravitate towards like-minded people. Sometimes that's great, and supportive environments grow from it. However, it can also be dangerous, and that is where the virtual echo chamber comes into focus. Virtual isolation is a reality that is not discussed enough. If people are playing off certain insecurities or particular ignorance, and all that negativity is being reinforced by others with similar views, there is a belief that these negative beliefs have to be the correct beliefs because no one is speaking against them.

    1. Yes! Exactly! Learning about things like echo chambers and knowing how algorithms feed us what we want lets us make more informed decisions. It can also help us examine our relationship with social media and knowledge. Very well put.

  3. Hello, Kira! Thanks for sharing your insights. I also had a limited understanding of algorithms that only started to grow once I started to research. Excellent point that the meaning of an emoji is dependent on the context, and algorithms may not catch that context. I hadn't thought of how the laughing emoji depends on the context. I may laugh at an LGBTQ+ meme because I find it relatable, but there's no way to 100% know if someone else is laughing in a #relatable sort of way or a mocking sort of way. Context matters! I sometimes struggle with interpreting these social cues in in-person spaces, so reading them in digital spaces can be even trickier. I also find the Facebook emojis interesting to dissect because of how the angry emoji was so weighted in the algorithm. I saw an example of this in real time once on Facebook. One person had an angry reaction to an LGBTQ+ event, and that was the first emoji showcased, while everyone else had likes, hearts, etc. Even so, the positive reactions were second and third to the solo angry reaction. C'mon, Facebook!

    Great point about echo chambers as well. Sometimes I wonder how to distinguish between an echo chamber and a safe space. For example, it's affirming to have digital spaces where everyone in the LGBTQ+ community is welcome and validated, and I wouldn't want to open that space to those who choose to harass or shame LGBTQ+ people for being themselves. I also wouldn't want to ban civil discussion with folks for simply having different views on certain topics (including divisiveness that exists within the LGBTQ+ community). Perhaps it's more about behavior toward that community, and less about viewpoints of that community? Still, perceptions often influence behavior, and behavior often influences perceptions. Something for me to ponder.

    Out of curiosity, if you were to write a blog post about ethical practice, are there particular practices that you would cover? It sounds like a fascinating topic for a blog!

  4. Hello Kira,
    Just like you, my understanding of the word algorithm was not a clear one, but a self-inflicted, implied understanding. When people hear the word algorithm all the time and from multiple sources, it is easy to jump to conclusions without being totally sure of the true meaning. You quoted a section from the third article saying something along the lines of the algorithm being in place to keep people on that platform, and that was a revelation to me; something clicked in the back of my mind. Something I have noticed about Instagram Reels is that it will straight up ask people if they are interested in a video and then proceed to play videos just like it for a month straight. How people proceed has a strong impact on how they are influenced.


