Don't take away my algorithm!

Lewandowsky et al. (2023) describe how humans are entangled with algorithms, and why transparency from social media platforms and search engines is so vital.

Believe it or not, I want the internet to be a place “where everybody knows your name.”

Well, okay, maybe I don’t want everybody on the internet to know my name, but I want many websites to know me. Imagine an internet that knew nothing about you. What if Google Maps didn’t know where you were when you asked for directions, requiring you to put in your current address every time? Imagine what a mess online shopping would be if websites didn’t remember who you are and what you like and need. And I shudder to think what I’d get on Spotify if, every time I logged on, it played only the most popular songs for that day, rather than the genres and tracks I’d selected in the past. The point is, I depend upon internet “algorithms” to tailor my online experience to my preferences, and I like that. I don’t want them to go away. But that doesn’t mean they are perfect.

Clearly, algorithms can lead to very, very bad outcomes. So, we need to better conceptualize how they work, as well as how they should work. Lewandowsky and colleagues (2023) have a new article out that goes into the many ways people are entangled with the algorithms used by social media companies, search engines, and other technology tools. It offers a very helpful way of thinking about the degree of control and autonomy in those algorithms and, to me, it illustrates why some algorithms are useful whereas others are very bad.

See, I don’t mind algorithms learning my preferences and needs when it’s my choice for them to do so. And I need those algorithms to be transparent and explainable (e.g., when a social media site pushes me an ad and, below it, there’s a link that explains why I got that ad and lets me better train the algorithm and/or opt out of that specific ad). I do very, very, very much mind when companies deploy algorithms without my knowledge or consent, when they use algorithms to manipulate my engagement, and when I cannot access and control those algorithms. That lack of transparency is a big problem for individuals, society, and researchers, as Lewandowsky et al. explain. I’d support legislation to regulate how companies design and use algorithms, but I don’t want them abolished.

So, I want my internet to be like Cheers: a place where people know my name and set up my favorite drink for me when I walk in, but also where I can say when I want to drink something different. And I need to be able to choose when to walk in and when to walk out. Oh, and there’s another important point: at Cheers, I pay them to serve me, so they are incentivized to keep me happy. Imagine a Cheers where you drank for free, and the bar made its money only through advertising. That’s basically what social media is at the moment, and that’s one reason why it isn’t working so well.