This article is a translation of the Japanese original (assisted by Google Translate).
Introduction
Internet services have come to present information tailored to each user's interests. Even two users who follow exactly the same friends on an SNS will see different information.
On the other hand, there is concern about the "filter bubble": the harm that comes from being surrounded only by information matching one's own interests.
In this article I look into the filter bubble and related topics, and report how my impressions changed after deliberately altering the filter on my own Facebook feed.
Filter Bubble
Personalization has become widespread in recent Internet services.
It presents each user with content they are likely to find interesting, inferred from their past behavior.
At first glance this seems entirely good. Users are shown one interesting item after another and enjoy themselves, while operators profit because users keep coming back, which drives sales under the operator's business model.
In online advertising, the revenue source for many Internet services, research and investment into delivering ads that users find interesting are flourishing, because a user's click translates directly into sales.
However, this personalization has the following problems.
- You see only information that suits you and are never exposed to inconvenient information (confirmation bias)
- You engage deeply only with what already interests you, so new ideas and information never reach you
This situation is called the filter bubble, and it is said to risk harming society as a whole by manipulating the information people receive.
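The feedback loop behind this can be sketched in a few lines of Python. This is an illustrative toy, not any real platform's ranking algorithm; all names here are made up:

```python
# Toy sketch of interest-based filtering (illustrative only).
# Each click raises the weight of a topic, so future rankings favor it
# more and more: the feedback loop that narrows a feed into a "bubble".
from collections import defaultdict

def rank_feed(posts, interests):
    """Order posts by the user's accumulated interest in their topic."""
    return sorted(posts, key=lambda p: interests[p["topic"]], reverse=True)

def click(post, interests):
    """Record engagement: clicking a post strengthens its topic's weight."""
    interests[post["topic"]] += 1

interests = defaultdict(int)  # topic -> learned weight
posts = [{"topic": "politics"}, {"topic": "sports"}, {"topic": "science"}]

click(posts[1], interests)          # the user clicks one sports post...
feed = rank_feed(posts, interests)  # ...and sports now leads every ranking
print([p["topic"] for p in feed])
```

After a single click, "sports" already outranks everything else, and every further click only widens the gap. Real systems use far richer signals, but the self-reinforcing dynamic is the same.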
Digital Gerrymandering
Among filter-bubble issues, the political exploitation of this kind of behavioral influence is called digital gerrymandering. It was recently the subject of a special feature in the Information Processing Society of Japan's journal and is attracting attention.
What gave birth to the term was an experiment by Facebook.
It took place on the day of the US congressional elections in November 2010.
Some users were shown a message that it was election day, together with an "I Voted" button and a list of friends who had already pressed it.
The result suggested that voting-related information on Facebook indirectly raises voter turnout.
This raised a concern for the future: if an SNS operator ran such a measure only for users matching a certain political belief, candidates or parties aligned with that belief would become more likely to win.
Will people be happy on opium?
Here is one interesting article.
[Can people become happy on opium?](http://aniram-czech.hatenablog.com/entry/2018/03/29/220000)
According to this article, humans suffer from the gap between desire and reality, but smoking opium shrinks the vessel of desire, so one becomes happy with trivial things. If opium were then infinite and practically free, would people become happy? That is the article's question. It would indeed be a happy society in which nobody suffers.
Through personalization, the Internet is coming to satisfy our desires freely; it seems to be turning into exactly this kind of opium. But is that a good thing?
The author concludes as follows:
> "All of us can become happy, and soon there will be no exceptions."
I think so too. In particular, I believe the day will come when, through the convergence of the Internet, virtual spaces, brain science, space, and so on, each of us can live a happy life as the protagonist of our own world.
However, I suspect that the people who find life worthwhile precisely because they have troubles will be the ones who leave descendants. That might amount to a spiritual update of humankind, which is too grand a subject, so let me stop this story here.
Experiment on my Facebook
At some point, Facebook began showing me information that did not fit my opinions.
To put it bluntly, some of the information flowing through Facebook is information I cannot agree with, and this kind of information came to be displayed more and more often.
I thought a little about why information differing from my opinion had increased. Is my opinion far from the general public's? The thought crossed my mind, but judging from other SNSs and real-world conversations, that did not seem to be the case. So I guessed it might be Facebook's filter.
I try to value diversity. A single phenomenon can have multiple interpretations, and I do not believe there is only one correct answer. So even when someone's opinion differs from mine, I try to understand the background from which they speak. Because of this, on Facebook I was probably unconsciously investigating what experiences and positions lay behind statements I disagreed with.
It seems this behavior was interpreted by Facebook as "this person is interested in this kind of information and these kinds of people." In other words, as "I am interested in information that does not fit my own ideas."
Information meeting that condition then came to dominate my feed, and the Facebook I saw turned into a place where people with opinions opposed to mine held forth. Read this way, the whole sequence of events is consistent.
Trying the blocking function
I had never used the blocking function before, because I felt there was no merit in the act of shutting out information just because it was inconvenient for me.
This time, however, I thought that using it would change the information I see on Facebook.
Facebook has a feature that unfollows someone for 30 days. If you use it, that person's posts are not displayed at all.
I hesitated a little, but decided to try it to see how much the information I receive would change. The change appeared immediately.
Naturally, only information matching my opinions flowed in, which felt somehow comfortable and calming. It may be an exaggeration, but I felt a stronger sense of belonging to society than ever before. In fact, there are reports that [blocking unpleasant people is effective](https://blog.tinect.jp/?p=50304).
I was also surprised to discover that people I had assumed were not updating Facebook were in fact updating it regularly.
Through this experiment, I felt that the biasing of information by filters is closer to home than I had imagined.
To avoid misunderstanding, let me add a word for the friends I may have blocked. Even when our opinions differ, I do not reject you as people, and I hope we will remain friends. I also do not intend to keep using the block function after this experiment, because I am interested in information that does not fit my own ideas.
Decentralized Internet
So what countermeasures are possible in this situation? Filter bubbles arise from the existence of an entity (an administrator) that can manipulate information at will.
The best remedy is to obtain information that is as raw as possible. However, this has problems of its own.
- Raw information is simply too voluminous to sift through (information explosion, information flooding)
- A growing number of services no longer display posts in chronological order (Facebook, Twitter, and others no longer default to newest-first)
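The second point can be illustrated with a toy comparison of the two orderings. The post data here is hypothetical; real feeds rank on far more signals than a single engagement score:

```python
# Hypothetical posts with a timestamp and an engagement score
# that the platform's model has assigned to each post.
posts = [
    {"id": 1, "time": 100, "engagement": 0.9},
    {"id": 2, "time": 200, "engagement": 0.1},
    {"id": 3, "time": 300, "engagement": 0.5},
]

# Chronological feed: newest first, no editorial judgment by the platform.
chronological = sorted(posts, key=lambda p: p["time"], reverse=True)

# Algorithmic feed: the operator's scoring decides what you see first.
algorithmic = sorted(posts, key=lambda p: p["engagement"], reverse=True)

print([p["id"] for p in chronological])  # newest-first order
print([p["id"] for p in algorithmic])    # operator-chosen order
```

With a chronological feed, the ordering rule is transparent and the same for everyone; with an algorithmic feed, the ordering depends on a score the operator controls, which is exactly the lever a filter bubble needs.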
Therefore, networks without a specific administrator, that is, "decentralized" (non-centralized) networks in the spirit of the original Internet, are attracting attention.
Although it has nothing to do with the filter bubble, the cryptocurrencies and blockchains currently in the news are also related to decentralization, so I expect the word to draw even more attention in the future. That is a long story, though, which I would like to save for another occasion.
Summary
On the Internet, information is often presented according to users' interests. While convenient, this carries the possibility that the information is arbitrarily selected and manipulated by the operator.
So, using Facebook's blocking function, I deliberately changed my apparent interests to see how much the information I receive would change. As a result, though this is a subjective impression, the information became quite different from before the experiment.
As a countermeasure to these problems, decentralized services are attracting attention. I hope to cover them in detail next time.
Finally, when my daughter starts using social networks, I would like to give her the following advice.
- Your timeline is biased toward information you are interested in
- Even the same phenomenon leaves a different impression depending on how it is expressed, so look at primary sources whenever possible
- Have a philosophy of your own as a core for sorting through information
References
- Yoichiro Itakura, "Digital Gerrymandering and Privacy / the Right to Self-Determination," Information Processing Vol.58 No.12, 2017.
- [Gerrymandering](https://en.wikipedia.org/wiki/%E3%82%B2%E3%83%AA%E3%83%9E%E3%83%B3%E3%83%80%E3%83%BC), Wikipedia.
- [Filter bubble](https://en.wikipedia.org/wiki/%E3%83%95%E3%82%A3%E3%83%AB%E3%82%BF%E3%83%BC%E3%83%90%E3%83%96%E3%83%AB), Wikipedia.
- ["Blocking unpleasant people immediately" is reasonable from the perspective of network science](https://blog.tinect.jp/?p=5030), 2018.
- [Can people become happy on opium?](http://aniram-czech.hatenablog.com/entry/2018/03/29/220000), 2018.