The police filter bubble
We’ve seen the president retweet questionable news sources before, but a recent database featuring the Facebook rantings of some Philadelphia police officers…
In 2016, the Philadelphia Police Department, like many city police departments across the country, was coming to grips with a wave of revelations about its own racial bias and treatment of minorities.
An officer was investigated after photos appeared online of him displaying a Nazi emblem tattoo while patrolling a Black Lives Matter march, and, in a separate revelation, a first batch of racist, violent Facebook activity from other officers came to light.
The latter would be the inspiration for The Plain View Project, a database of biased Facebook activity by police officers that was released this past Saturday. It features bigoted posts and comments from 2,900 active officers and 600 retired officers at eight police departments across the country.
The project identified 1,073 officers on Facebook in Philadelphia. Roughly a third had posted content deemed racist or violent. Fifteen of them are high-ranking.
The data collected by the project spans from 2013 to earlier this year. While some original posts contain racist or violent diatribes, most are reactions to news links or posts from other pages. The posts include harsh reactions to many of Philadelphia's mainstream breaking-news outlets, but also prevalent are a slew of smaller, more slanted sources from the extreme right.
It offers insight into the filter bubble of many police officers across the country.
“Filter bubble” is a term coined by Eli Pariser, co-founder of Upworthy, in his 2011 book of the same name. In short, it is a space created on the Internet where an individual encounters only information that conforms to or reinforces their existing beliefs.
Tech companies like Google and Facebook monitor and collect data on the online consumption habits (news articles read, links clicked, posts reacted to) of everyone who uses their platforms. The data is used to personalize the content presented to every user. With news, users are presented articles they’re more likely to engage with — most likely ones they agree with.
This “filtering” by tech companies creates a bubble of content for like-minded users. The bubble is where more extreme, untrustworthy sources seep into the dialogue and influence users. During the 2016 election, this trend toward extremism was exploited by foreign agents to drive polarization and sow division in the country.
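The feedback loop described above can be sketched in a few lines of code. This is a toy illustration, not any platform's actual ranking algorithm: it simply scores each item by how much its topic tags overlap with what the user has already engaged with, which is enough to show how a feed narrows over time.

```python
# Toy sketch of engagement-based feed ranking (illustrative only;
# all item names and tags here are hypothetical, not real data).

def rank_feed(items, user_history):
    """Score each item by how closely its topic tags overlap the
    tags of items the user previously engaged with."""
    engaged_tags = [tag for item in user_history for tag in item["tags"]]

    def score(item):
        # Each matching tag in the user's history adds to the score,
        # so content similar to past engagement rises to the top.
        return sum(engaged_tags.count(tag) for tag in item["tags"])

    return sorted(items, key=score, reverse=True)

history = [{"tags": ["politics", "crime"]}, {"tags": ["politics"]}]
feed = [
    {"title": "Local charity drive", "tags": ["community"]},
    {"title": "Partisan crime story", "tags": ["politics", "crime"]},
]
ranked = rank_feed(feed, history)
# The partisan story outranks the neutral one, reinforcing the bubble.
```

Each pass through a loop like this feeds the newly engaged items back into the history, so the scoring skews further with every session; that compounding is what turns personalization into a bubble.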
In the case of the Philadelphia Police Department, the filter bubble has further divided an already distrustful community from those sworn to protect it. The department has tried to regulate what its officers can post, but The Plain View Project shows those rules have had little effect.
Philadelphia District Attorney Larry Krasner said the revelations from the project could mean the expansion of the office’s “do-not-call” list. Some of the police officers highlighted by the project have deleted their Facebook profiles or made them private.
Of the seven other police departments analyzed by the project, three are in other major U.S. cities and the rest in smaller municipalities.