Personalization: The Machine Mirrors the Ugly Us That We Don't Want to See, So We Blame the Machine

There is a fair amount of discussion that Google search, Amazon recommendations, and Facebook feeds are too personalized and are making us closed-minded, the so-called filter bubble or echo chamber effect. I've been thinking about this for a while, and I have a new take on what makes personalization and recommendation seem bad.

It is not the machine, and it is not the algorithms. It is human nature. The machine ultimately learns from human beings, and it is just augmenting human reality. My argument is that for a person with a relatively open and balanced mind, the personalized results will be fairly balanced, and the recommendations will serve as good knowledge discovery. Only for people who themselves start out with heavily biased opinions and worldviews will the personalized results be biased. I don't think the machine is making things worse; it is just reflecting reality, and it is good that the machine makes us see that reality so we can figure out a way to address it. What produces these biased and closed opinions is essentially human nature, and it is not the machine's responsibility to solve this problem.
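To make the "mirror" argument concrete, here is a minimal toy sketch of the kind of click-feedback loop this paragraph describes. It is not the algorithm of any real product; the topic names, click probabilities, and update rule are all my own hypothetical assumptions. The only point is that the same algorithm, fed balanced clicks, stays balanced, and fed one-sided clicks, concentrates.

```python
# Toy sketch (hypothetical topics, probabilities, and update rule):
# a click-feedback personalizer that re-weights topics by what the
# user actually clicks. The algorithm mirrors the behavior it is fed.
import random

TOPICS = ["politics-left", "politics-right", "sports", "science"]

def run_feed(click_preference, rounds=2000, seed=0):
    """Simulate a feed. click_preference maps topic -> probability
    that the user clicks an item of that topic when it is shown."""
    rng = random.Random(seed)
    weights = {t: 1.0 for t in TOPICS}  # start with a uniform feed
    for _ in range(rounds):
        total = sum(weights.values())
        # show an item, sampled proportionally to current topic weights
        shown = rng.choices(TOPICS, weights=[weights[t] / total for t in TOPICS])[0]
        if rng.random() < click_preference.get(shown, 0.0):
            weights[shown] += 1.0  # reinforce topics the user clicks
    total = sum(weights.values())
    return {t: round(weights[t] / total, 2) for t in TOPICS}

# A user who clicks all topics roughly equally keeps a balanced feed...
print(run_feed({t: 0.5 for t in TOPICS}))
# ...while a user who only ever clicks one topic ends up with a feed
# dominated by that topic, even though the algorithm is identical.
print(run_feed({"politics-left": 0.9, "politics-right": 0.0,
                "sports": 0.05, "science": 0.05}))
```

The identical loop produces very different feeds depending only on the clicks it receives, which is the sense in which the machine reflects, rather than creates, the bias.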

Closed-minded people have always existed, with or without Google search. Even if Google search and the rest did not exist, these people would not seek out or listen to opinions outside their chamber whatsoever. We dream that the machine could actually solve this problem by surfacing opposing and diverse opinions (efforts like Findory news), but it is not that easy; it is hard to move people out of their comfort zone, and so Findory failed.

To solve this problem, we have to open our own minds first, or at least some of us do. Then we can figure out ways to open other people's minds more effectively, and have the machine do that.

To sum up, what I am arguing is that, at this moment, machine personalization and recommendation are not doing anything bad (not anything especially good either, just stating the fact); they are simply reflecting human reality. What we have to realize is that the problem is not the machine, it's ourselves; the machine is just letting us see the flaws we sometimes don't want to face. In the future, we need to figure out ways to make the machine do good here.
