I'm a pseudo-academic with a lot of feelings and an unimpressive vocabulary.
Link
A recent claim by a former Amazon employee that the company fired him for participating in legally protected activity has opened up an entirely new look into the industry giant’s practices. The employee claimed that his effort to unionize his coworkers is what resulted in his termination, but the letter Amazon submitted told a different and arguably more horrific story. The letter was obtained by The Verge under the Freedom of Information Act and has since been circulating.
In the letter, Amazon states that the employee was fired for failing to meet productivity standards, something for which more than 300 employees at a single shipping facility had been fired in just one year. The company asserts that there was no bias in the decision, since it was made by an automated system that tracks employee productivity.
The system tracks each individual employee’s output and time off task, or TOT. It can generate warnings and even terminate employees without supervisor approval. Supervisors are able to override these decisions, and an appeal process exists, but the decisions themselves are handed down by an automated system.
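To make the mechanism concrete, here is a minimal sketch of how a threshold-based productivity tracker of this kind could work. The field names, thresholds, and rules below are my own assumptions for illustration, not Amazon’s actual system.

```python
# Minimal sketch of an automated productivity tracker (illustrative only).
# Thresholds, field names, and rules are hypothetical, not Amazon's real system.
from dataclasses import dataclass

@dataclass
class EmployeeRecord:
    name: str
    units_per_hour: float         # measured output rate
    time_off_task_minutes: float  # accumulated TOT for the period
    warnings: int = 0

RATE_FLOOR = 100.0      # hypothetical minimum units per hour
TOT_CEILING = 30.0      # hypothetical maximum minutes of TOT per shift
WARNINGS_BEFORE_TERMINATION = 3

def evaluate(record: EmployeeRecord) -> str:
    """Return an action string; note that no human judgment sits in this path."""
    below_standard = (record.units_per_hour < RATE_FLOOR
                      or record.time_off_task_minutes > TOT_CEILING)
    if not below_standard:
        return "no action"
    record.warnings += 1
    if record.warnings >= WARNINGS_BEFORE_TERMINATION:
        return "generate termination notice"   # issued without supervisor approval
    return "generate warning"

# Example: a worker who misses the rate floor three periods in a row is flagged
# for termination automatically, regardless of circumstances.
worker = EmployeeRecord("example", units_per_hour=92.0, time_off_task_minutes=12.0)
for _ in range(3):
    print(evaluate(worker))
```

Even in this toy version, the path from a missed quota to a termination notice never passes through a human being, which is exactly the problem the employee’s suit raises.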
We can therefore see that the claim of an unbiased termination is false, as these automated systems were designed by the corporation that uses them. This shifting of blame from the company to the digital program itself is exactly what Noble discusses in her book in relation to Google and its algorithms. The move toward automated firing takes the humanity out of employees’ situations, with no forgiveness or consideration of circumstances. Amazon has taken this opportunity to further disconnect the corporation from the employees who make its monopoly possible. The suit was dropped, thereby legitimizing the use of a digital middleman as a scapegoat, even though these systems could not exist without human design and implementation. This spells trouble for future legal action against discrimination carried out by algorithms or automation.
Link
This article uses a lot of economic jargon to basically tell the reader that Facebook is still doing better than ever, even with the impending FTC fines. Facebook has set aside three billion dollars to prepare for the fines, although the final penalty could be anywhere from three to five billion dollars. If Facebook settles, the settlement raises major concerns for advertisers, who would no longer be able to benefit from some of the data currently collected from Facebook users.
However, even with these seemingly huge fines on the horizon, Facebook shares are trading at higher prices than ever. This only reinforces what we have discussed in class: tech corporations survive in our society only because of their ability to produce capital gains. Facebook has gained enough capital to remain useful, even as it is forced to reevaluate its privacy terms. What this says to me is that even after the potential settlement with the FTC, Facebook will continue to mine data from its users and sell it to advertisers.
Link
Ousmane Bah, an 18-year-old from New York, was recently charged with theft in four different states. He has been cleared of all charges everywhere except New Jersey. Bah and his attorneys claim that Apple’s use of facial recognition software in its stores is what led to the multitude of charges against him. The suit alleges that when the thief used an old, photo-less ID of Bah’s at an Apple store in Boston, the facial recognition software Apple uses to track suspected thieves connected the thief’s face with Bah’s name, leading to several other charges in connection with the same thief.
Apple’s official statement has been that it does not use facial recognition in its stores. However, this kind of surveillance is barely regulated, so it is difficult to verify what systems stores actually use, or for what purpose.
The ramifications of this kind of surveillance, along with the cataloging of customer information and images, have the potential to infringe on the personal privacy of current and future customers. The precedent of keeping tabs on customers labeled as “potential shoplifters” also raises the uncomfortable question of who decides who gets watched. Whether that decision is made by a human being or a program, we know from our readings and the work of Safiya Noble that it can and likely will lead to over-persecution of oppressed populations, in the same way that our social structures already over-persecute those populations.
Link
The media giant YouTube has recently rolled out algorithms intended to provide context for conspiracies and misinformation. The algorithm identifies these videos and then places third-party sources (Encyclopaedia Britannica, Wikipedia, etc.) with contextual information on the subject below the video.
However, these algorithms failed earlier today when they identified a video of Notre Dame on fire as a 9/11 conspiracy theory video. The program inserted a box with information on the 9/11 terrorist attack below the live stream of the cathedral in flames, sparking an outcry from the news station that broadcast the footage and viewers alike.
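As a rough illustration of how a misfire like this can happen, here is a minimal sketch of a naive topic matcher that attaches context panels based on keyword overlap. The topics, keywords, and matching rule are my own assumptions for illustration, not YouTube’s actual system.

```python
# Minimal sketch of a naive topic matcher that attaches "context panels" to videos.
# Topics, keywords, and the matching rule are hypothetical, not YouTube's actual system.
CONTEXT_PANELS = {
    "september 11 attacks": {"keywords": {"tower", "fire", "smoke", "collapse", "attack"},
                             "source": "Encyclopaedia Britannica: September 11 attacks"},
    "moon landing":         {"keywords": {"moon", "apollo", "nasa", "landing"},
                             "source": "Encyclopaedia Britannica: Apollo 11"},
}

def attach_panel(video_tags: set[str]) -> str | None:
    """Return a context-panel source if enough keywords overlap, else None."""
    for topic, panel in CONTEXT_PANELS.items():
        # A crude overlap threshold: two shared keywords count as a match.
        if len(video_tags & panel["keywords"]) >= 2:
            return panel["source"]
    return None

# A live stream of a burning cathedral shares "fire" and "smoke" with the 9/11 topic,
# so the matcher wrongly attaches the 9/11 panel, the kind of misfire described above.
print(attach_panel({"cathedral", "fire", "smoke", "paris"}))
```

The point is not that YouTube’s system is this simple, but that any matcher tuned on surface features will make this category of mistake.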
YouTube apologized for the misstep, but it highlights the important issue Noble raises in her book. Algorithms are not perfect, and they can only anticipate as much as the people who program them. They are a reflection of our society, and yet we treat them as infallible and objective. It is instances like this one that show how volatile these programs can be.
Link
This week the UK announced its intent to enact laws that would force tech giants such as Facebook and Twitter to monitor the content available on their sites more heavily. These new laws would require the removal of illegal material, such as terrorist and child abuse content, but also legal material like cyberbullying and disinformation.
The new laws would also create an oversight body to take the place of the self-policing that has been the practice until now. These laws rely heavily on holding the tech companies themselves responsible for the content shared through their platforms.
The article cites two cases as the motivation for these regulations: the spread of the Facebook Live video of the New Zealand shooting, and the suicide of a young girl who had viewed distressing self-harm content on Instagram. Safiya Umoja Noble complicates this approach in her book Algorithms of Oppression, where she criticizes Silicon Valley for ignoring these problems and blaming them on algorithms rather than taking full account of the human element.
The oversight body is designed to incentivize a certain amount of censorship, which has the potential to be misconstrued or misused in service of an agenda. While the propagation of these materials should be limited, censoring websites that have merely provided a space may not be the answer. People are drawn to this content and will continue to produce it, whether on mainstream platforms or in harder-to-track places.
Link
The article is short and sweet, but the essential information is that Jaden Smith, a hip-hop artist and the son of Jada Pinkett Smith and Will Smith, has worked with a church to develop a water filtration system to help take on the Flint, Michigan water crisis.
I chose this particular article because it is so concise. It’s obvious that Smith’s company put out a short press release with very few details, and it seems that several different outlets simply copied and pasted the release onto their platforms and published it. While this is a common practice, the content of this press release seems to deserve more elaboration.
The contrast between the lack of coverage of this issue by large news sources and the abundance of information I have personally consumed on social media clearly displays the disconnect I have created for myself. By tailoring my personal news feeds and timelines to issues I care about, I have essentially created a vortex of information that revolves solely around issues I personally find important. In searching for the article outside of my social bubble, it became clear that the intensity of the Flint issue was not held in as high regard as my circle of influence had led me to believe.
Link
Everything feels like a simulation. Even five years ago, a statement like this one would have never even been considered, let alone made it to press; but there is something about the twenty-four hour news cycle, internet fame, and the culture of absurdism that makes something like this almost boring.
The article is basically just Paper Magazine letting us read the ridiculous press release put out by Meghan Trainor’s PR team ahead of her new album release. The statement uses phrases such as “Billboard is ‘wet’ for these new songs” and “these lyrics will have you stanning for days,” in a blatant attempt to use popular slang as an avenue for attention to their client’s album. Several pop culture news sources have commented on how the release made them feel uncomfortable and downright confused. The full statement is worth reading just for the experience.
We operate in a constant state of consumption, ingesting media and stimulation almost without interruption. Because of this, it takes something as ridiculous-sounding as this press statement to catch the eye and draw attention away from the endless other media outlets vying for a moment of our time. However, even though the content is not what we expect, I’ve found that it isn’t quite strange enough to hold attention for much longer than a few moments.
It’s an odd and almost unbelievable statement, but aside from my fellow public relations classmates, I haven’t found many people whose interest it holds for longer than it takes to skim the letter and laugh at it. People move on, because the language is shocking but the message is not.
It reads like a joke, trying too hard to be “on-trend” or edgy. The general public gets a laugh out of it and then moves on, because we are seeking new stimuli. The message is about a newly married woman putting out an album of love songs about her husband, which, all in all, is completely socially acceptable and not very interesting. The team that put out the press release was hoping to hold people’s attention, keep them talking about their client, and make them want to listen to the “horny” songs on her new album, but they made a mistake. They tried too hard, and in this age of personalized ads based on algorithms and paid sponsorship, the only thing that is TRULY shocking is transparency.
#meghan trainor#public relations#school#academia#this is for a class#pr#pr teams#advertisements#ads#publicity stunt#paper magazine