One idea that stood out to me this week is the concept of the “algorithmized self.” Bhandari and Bimo argue that platforms like TikTok do more than simply host content. They shape how users present themselves by rewarding certain styles, trends, and formats with greater visibility. The lecture slides make a similar point by showing how TikTok foregrounds the algorithm, encouraging people to experiment while still pushing them toward whatever the platform is most likely to promote.
What interests me most is how this changes the idea of being authentic online. Social media no longer feels like a space where people can simply express themselves freely. Instead, identity becomes shaped by what users think the algorithm wants to see. Even when people are aware of this, they often still adjust what they post in order to gain attention and reach. That creates a strange tension between self-expression and performance, where online identity can start to feel less like a reflection of who someone is and more like a version of themselves designed for visibility.
This connects closely to Eli Pariser’s idea of the filter bubble. Algorithms shape not only what we share but also what we see. By feeding users content that aligns with their existing interests and beliefs, platforms can create echo chambers where difference is filtered out. Feng and Kim’s research on trust in AI-generated advertising adds another layer to this issue. Their work shows that people are more likely to trust personalized advertising when it appears useful, enjoyable, reliable, and worth acting on. Taken together, these readings suggest that algorithms shape online life at every level, influencing identity, attention, and trust.