What kind of content could possibly be considered “dangerous” in the era of political correctness and lobbying? How can we be certain that anything we publish on the Internet will stay there? These questions and many more were addressed in the course of the very interesting and timely discussion “Social Media and Human Rights” that took place on Monday 5 March 2018, in Warehouse C, as part of the 20th anniversary edition of the Thessaloniki Documentary Festival.
The discussion, prompted by the screening of Hans Block’s documentary The Cleaners at this year’s event, was attended by the German director, along with Yannis Ioannidis, Vice President of the Hellenic League for Human Rights, Michail Bletsas, Director of Computing at the MIT Media Lab, and Aris Dimokidis, editor-in-chief of LIFO, who moderated.
Opening the event, the Director of the Thessaloniki International Film Festival, Orestis Andreadakis, welcomed the guests. Hans Block then thanked the Thessaloniki Documentary Festival and briefly described The Cleaners. “This film is the fruit of three years of thorough research into a secret industry in the Philippines, where workers, mostly young people, monitor activity on Facebook for 8-10 hours a day, rejecting any material considered inappropriate, such as terrorism-related posts, child and underage pornography/exploitation, and so on,” he said. He added that the idea for the documentary was born in 2013, after a video with inappropriate pictures of underage people was shared 16,000 times on Facebook before being removed.
“So the co-director, Moritz Riesewieck, and I wondered how material posted on social media gets approved or rejected. Is there an algorithm behind this procedure, or are there people who take on the task? We then learned that in the Philippines there is a whole team assigned the job. These people are cut off from everyone and everything, are protected by private security, sign confidentiality agreements and never talk to anyone about what they do.” Hans Block then explained the two main questions posed in the documentary. First, how these “cleaners” cope with all the repulsive things they see every day. The director said that, according to two psychiatrists he spoke to in Germany and the US, what the cleaners go through is similar to the post-traumatic stress experienced by war veterans. “The second question is how one ends up being responsible for anything we will or won’t see on our social media screens,” the director said.
Next, Yannis Ioannidis thanked TDF for hosting the discussion and spoke briefly about the Hellenic League for Human Rights. “It is the oldest human rights organization in Greece. Since its founding, in the interwar period, it has undertaken activities focusing on immigrants, prisoners, freedom of speech and more. In recent years, it was only natural for activities pertaining to the Internet and the digital world to be added to our agenda,” he said. Mr Ioannidis
also outlined the three main areas of legal concern relating to social media. “First, the digital world has grown to nearly the same dimensions as the tangible one, since it has become a place to meet, connect, do business, sell products and practice politics. Second, on social media platforms we are dealing with private companies whose decision making is opaque. It is therefore very difficult either to supervise and control their activities or to ensure that such supervision does not violate rights related to freedom of speech. Third, we have to take into account that these platforms act in a somewhat divisive, Manichaean way, grouping users by opinions and interests; as a result, we end up locked in a bubble, unable to see the wider picture and the lurking dangers.”
Taking his turn, Michail Bletsas praised the documentary The Cleaners. “This film shows how art can prompt fascinating discussions about what is important in the news.” He added: “We live in a transitional period, the era of artificial intelligence and automation. We don’t really know how this automated system works, and the real danger lies in these companies’ wish to carry out the procedure at the lowest possible cost, rather than with an educated and trained workforce. We must never forget that all data systems absorb our own polarizations, our own prejudices and obsessions. A typical example is Google’s recent fiasco, when its Photos app tagged an African-American couple as ‘gorillas’ in vacation pictures they had posted.”
For his part, Aris Dimokidis spoke about the conflicting feelings he had while watching The Cleaners. “At first you think Facebook shamelessly censors any content it comes across, violating freedom of speech and expression. Eventually, you begin to wonder why it didn’t bother blocking so many posts containing fake news, live suicides, and so on.” He then shared with the audience his own experience as a cleaner, for both the website and the Facebook page of the outlet he works for, stressing the moral dilemmas he encounters each day and the impossibility of handling such a huge volume of data. “It is practically impossible to keep yourself updated: what the new ISIS flag looks like, which coded symbols are used by criminal and terrorist groups. The job becomes even more stressful when you consider that any mistake could have disastrous consequences not in the digital world, but in the real one,” he said.
Mr Dimokidis then gave the floor to Theodoros Daniilidis, founder of the site www.ellinikahoaxes.gr, who was attending the discussion. Mr Daniilidis explained how strenuous it is to gather the evidence needed to debunk a piece of false news, even when it comes from a blatantly unreliable source, and said he has been threatened many times, whether with lawsuits or with outright threats against his life and physical safety.
The three main guests then answered questions, starting with whether the distinction between digital and tangible reality has become obsolete. “The dividing line is no longer clear, since the way we see things tends to substitute for reality itself,” Mr Ioannidis said. Hans Block then stressed a substantial difference: the safety mechanisms and institutions for the protection of rights established in the real world (laws, courts, etc.) also need to exist in the digital world, in whatever form is considered appropriate. “The digital world still lags behind the tangible one in terms of direct feelings and interaction with one’s interlocutor, but I strongly believe that this will soon change,” Mr Bletsas added.
Shortly before the conversation concluded, the question of the legal framework applying to these issues was raised. Mr Ioannidis stressed that the digital community is globalized, making it hard to establish an overall framework to deal with any issue that arises. At the same time, he underlined how hard it is to strike the balance that would enable the necessary controls without opening the way to censorship. “In Germany, a law was passed recently forcing platforms to remove, within 24 hours, any content promoting racist speech and intolerance. While the law’s intentions point in the right direction, it is troubling that it makes a private company responsible for deciding what must or must not be removed. Generally speaking, we must always remember that hate speech in the digital world is now being reflected in outbursts of violence in real life,” Hans Block added. Concluding, Mr Bletsas stressed that it is impossible and unrealistic to try to handle these issues at the national level, adding that the right formula will hopefully be found, though not yet, while we are still familiarizing ourselves, like toddlers, with a totally new world.