WASHINGTON, DC - APRIL 11: Photographers make images of Facebook co-founder, Chairman and CEO Mark Zuckerberg as he testifies before the House Energy and Commerce Committee at the Rayburn House Office Building on Capitol Hill April 11, 2018 in Washington, DC. (Photo by Chip Somodevilla/Getty Images)

Facebook, privacy and hate messages: is technology ready for us?

From the Cambridge Analytica scandal to the shooting in New Zealand, social networks are at the center of the debate. Maybe it's time to take a look at the…


In early March, Mark Zuckerberg wrote again on his Facebook account. This time it was not a brief post but a very long one, full of explanations and mea culpas about how the new integration of Facebook, Instagram, and WhatsApp would protect the privacy of the social network's users.

But does Facebook, in truth, still have a privacy problem?

"As I think about the future of the Internet, I believe a privacy-focused communications platform will become even more important than today’s open platforms," Zuckerberg wrote. “Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks.”

The CEO of Facebook uses the word freedom, a double-edged weapon that grants us free will over our actions but, at the same time, places us in a position of greater responsibility — that of taking charge of ourselves, of our information, and of what we share — while freeing his company from legal repercussions for the data circulating in private networks.

Think of the Nido case in Chile, where thousands of images of naked women, some of them victims of rape or showing mutilated parts of their bodies, spread across social networks at the end of February after being hosted on a private forum under the domain Nido.org.

According to the Chilean news site Ahora Noticias, Nido was developed in the imageboard format, a network based on anonymity, and was "a mutation of 6Chan – a site based on 4Chan where private intimate material began to appear – which mainly contained photographs of naked women, many of them obtained through purchases of 'packs' – a compressed file of photos of a sexual nature that a person sells on the internet – or through the sending of intimate material without consent."

The Nido case is the best example of why the word "privacy" is not about the number of users who see a post, but about the quality of those users, the type of content, and the ethics involved in what is being shared.

That Facebook is raising the issue of privacy at this moment is partly a public relations necessity: a maneuver to contain the serious reputational crisis the company has suffered since the Cambridge Analytica case, a crisis that has generated deep mistrust among users while the social network remains under the scrutiny of U.S. regulators.

As The New York Times explained, the Federal Trade Commission (FTC) is considering a multibillion-dollar fine against the company for violating a 2011 privacy consent decree that required Mark Zuckerberg's company to ask users' permission before sharing their data with third parties.

Some FTC officials have pushed for the maximum penalty, citing new reports of possible privacy violations that have surfaced since the beginning of the investigation. The FTC could fine Facebook up to $41,000 for each violation the agency finds.

If we remember that the Cambridge Analytica case affected some 87 million users of the social network, you can do the math. Solving the privacy problem internally is likely a far cheaper investment than paying a fine — a way to save face publicly and win back the company's reputation.
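A rough, back-of-the-envelope calculation (assuming, purely for illustration, that each of the 87 million affected accounts counted as a separate violation at the maximum rate) gives a sense of the scale: 87,000,000 violations × $41,000 per violation ≈ $3.6 trillion, a theoretical ceiling far beyond any fine the agency would realistically impose.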

Does privacy solve hate speech?

A few weeks ago, social networks faced a new problem at the intersection of terrorism and virality in their distribution channels: Facebook was the main digital vehicle during the mass shooting at two mosques in Christchurch, New Zealand.

Brenton Tarrant, the terrorist and white supremacist, did everything possible to make the event go viral. Part of his hate strategy was not only to shoot but to generate terror by broadcasting a 17-minute live video of the shooting on Facebook as it took place.

The broadcast was staged like a violent first-person shooter video game, with the weapon covered in painted memes carrying supremacist hate messages.

Tarrant broadcast the live video on Facebook, but his distribution hub was apparently 8chan, the same kind of imageboard community used in the Nido case, in this instance with content focused on promoting right-wing extremism.

The social network quickly removed the video from Facebook and Instagram. But deleting the original made no difference: the clip had already been downloaded and went viral faster than social networks and platforms could respond.

According to the WSJ, of the 1.5 million shared copies of the massacre video, filmed with a camera attached to the terrorist's body, only 1.2 million were blocked at upload.

Thwarting this type of broadcast on social networks is a great riddle, almost impossible to solve given the lack of legal oversight, the invasive nature of the technology, and users' instinctive reaction to these images.

Although Facebook uses artificial intelligence and human moderators, the virality of morbid images seems unstoppable. Now think how much harder it is to stamp out such a message in smaller, more private groups.

What happens to this type of violent content in a new, encrypted, and private Facebook? I think of the Nido case and the New Zealand gunman, who knew how to skirt the rules. Is the solution for Facebook to become a Western version of the WeChat created in China?

Perhaps the question is a different one: is technology ready for human prejudices and vices?
