November 12: Essay 3 is due on Turnitin.
Essay 3
Minimum of 2 sources for your MLA Works Cited page.
Choice A
Watch Hasan Minhaj's Patriot Act episode "Why the Internet Sucks" and develop an argumentative thesis that addresses the episode's main theme.
Choice B
Watch John Oliver’s YouTube presentation about medical devices and develop an argumentative thesis that addresses the alleged abuses in the medical device industry.
Choice C
Read Ibram Kendi’s “What the Believers Are Denying” and agree or disagree with his contention that racism and global warming denial are rooted in the same psychologically flawed thinking.
Choice D
Read "It's Time to Confront the Threat of Right-Wing Terrorism" by John Cassidy in The New Yorker and "Does the banning of Alex Jones signal a new era of big tech responsibility?" by Julia Carrie Wong and Olivia Solon in The Guardian and agree or disagree with the claim that big tech companies are morally obliged to censor right-wing white nationalist trolls such as Alex Jones. For another source, you can also use “Free Speech Scholars to Alex Jones: You’re Not Protected” by Alan Feuer.
Choice E
In the context of Jasmin Barmore’s essay “The Queen of Eating Shellfish Online,” develop an argumentative thesis that addresses the alleged benefits of mukbang, the glorification of binge-eating on a webcam.
Choice F
Read “Is Dentistry a Science?” by Ferris Jabr and refute or defend his claim that dentistry is rife with venality (greed) and corruption that compromises a patient’s best interests. For this assignment, you can consult “Dentists Need to Up Their Game” and “Is Your Dentist Ripping You Off?”
Choice G
Read Nick Hanauer’s “Better Schools Won’t Fix America” and refute or support the author’s contention that structural inequality, not schooling, is the root of America’s crisis.
Choice H
Read Andrew Marantz’s “Free Speech Is Killing Us” and support or refute his claim that free speech does not apply to private companies.
Choice I
Read Allison Arieff’s “Cars Are Death Machines. Self-Driving Tech Won’t Change That” and support or refute her contention that self-driving cars are not the solution to traffic dangers.
Choice J
Read Judith Shulevitz's essay "Why You Never See Your Friends Anymore" and support or refute the author's claim that the lack of regular friendship bonding is having far-reaching destructive effects on society.
Option K
See the movie Black Panther and in an argumentative essay, with a counterargument-rebuttal section, address the question: Is Erik Killmonger a villain or a hero?
Resources for Works Cited:
See: Argument about Erik Killmonger
See: Boston Review
See: "Black Panther and the Invention of Africa" by Jelani Cobb
See: Guardian
See: Washington Post
See: Forbes
See: The Ringer
Option L
Watch the movie Black Panther and address the argument that the mythical nation of Wakanda is a metaphor for the need to reclaim an African history that has been corrupted and "white-washed" over the centuries by racist, white historians who have painted an inaccurate history of Africa.
Sources:
"Black Panther and the Real Lost Wakandas" by Clive Irving
"Black Panther and the Invention of Africa" by Jelani Cobb
"Black Panther: A Conversation about Real African History" by Melvin Lars
"Black Panther is a gorgeous, groundbreaking celebration of black culture" by Tre Johnson
"The Real History Behind the Black Panther" by Ryan Mattimore
"Searching for Wakanda: The African Roots of the Black Panther Story" by Thomas F. McDrew
Option M
Watch the movie Black Panther and develop a thesis about how the film sheds light on the tensions between Africans and black Americans.
Sources:
"Black Panther: Why the relationship between Africans and black Americans is so messed up" by Larry Madowo and Karen Attiah
"Black Panther and the Invention of Africa" by Jelani Cobb
"Black Panther Forces Africans and Black Americans to Reconcile the Past" by Kovie Biakolo
Choice N
Read Tad Friend's New Yorker online article "Can a Burger Help Solve Climate Change?" and look at two opposing camps on the role of alternative protein sources as a viable replacement for meat. One camp says we face too many obstacles to accept non-animal alternative proteins: evolution, taste, and cost, to name several. An opposing camp says we have the technology and the proven product in Impossible Foods and other non-meat proteins to replace animal protein. Assessing these two opposing camps in the context of Tad Friend's essay, develop an argumentative thesis that addresses the question: How viable is the push for tech companies to help fight climate change by replacing animals with alternative proteins?
Free Speech Debate on Social Media Platforms Combines Two Essay Prompts:
Choice D
Read "It's Time to Confront the Threat of Right-Wing Terrorism" by John Cassidy in The New Yorker and "Does the banning of Alex Jones signal a new era of big tech responsibility?" by Julia Carrie Wong and Olivia Solon in The Guardian and agree or disagree with the claim that big tech companies are morally obliged to censor right-wing white nationalist trolls such as Alex Jones. For another source, you can also use “Free Speech Scholars to Alex Jones: You’re Not Protected” by Alan Feuer.
Free Speech on Social Media Is Destroying Democracy & Spreading Fascism
Focus on Choice H
Read Andrew Marantz’s “Free Speech Is Killing Us” and support or refute his claim that free speech does not apply to private companies. You can also consult Aaron Sorkin's open letter to Zuckerberg in the NYT.
Suggested Outline
Paragraph 1: Summarize the way Facebook has eroded democracy and aided fascism and foreign meddling as chronicled in the two Frontline videos, "The Facebook Dilemma," Parts 1 and 2.
Paragraph 2: In the context of the above videos, address Andrew Marantz's claim that Facebook does not have the moral and legal right to hide behind the First Amendment as it allows trolls and other bad actors to use micro-targeting on its news feeds to weaponize misinformation.
Paragraphs 3-6 give supporting reasons for your claim.
Paragraph 7 is your counterargument-rebuttal.
Paragraph 8, your conclusion, is a powerful restatement of your thesis.
Points to Consider in Your Mapping Components
One. We have a history in which the refusal to censor led to genocide.
Two. It is lazy and overly simplistic to say we can do nothing about social media promoting hate and fake news because we must defend the First Amendment.
Three. Online dog whistles on social media platforms are not so much opinion as they are incitements to violence.
Four. Weaponized misinformation radicalizes lonely malcontents so that some of them turn to violence.
Five. Social media platforms create demand for viral videos of violence, such as the footage streamed by the killer at the New Zealand mosque.
Six. Businesses such as Facebook, YouTube, and Twitter are not bound by free speech law, so they can use their discretion as they see fit.
Your Counterarguments
One. The definition of a troll may be nebulous. What distinguishes a troll who uses fake speech from an honest dealer who uses real speech? Who is the arbiter of this distinction?
Two. Slippery slope: If we deplatform Alex Jones, where does it stop?
Three. Snowflake argument: If we protect people from PTSD and the like by deplatforming offensive trolls, are we conditioning them to be snowflakes?
Summary of "The Facebook Dilemma," Part 1
One. Facebook users get a custom news feed based on algorithms ("engagement metrics") that caters to their personality profile so that they live in a censored world, a bubble where they commiserate with members of their own tribe while demonizing those outside their tribe. In this regard, FB is contributing to polarization, a liability against democracy.
Two. FB executives say censorship is not the answer because the "common decency" and "common sense" of the users will prevail even though evidence contradicts their assumption.
Three. FB created the superstar model for scaling ad revenue married to private data collection so that users are not customers but the product itself.
Four. The United States Defense Department is studying "malicious actors" who will use private data for "mass manipulation" to undermine other countries, especially democratic ones.
Five. Facebook has become a vector for an international troll factory (Russia, Macedonia) that has hacked America.
Summary of "The Facebook Dilemma," Part 2
One. FB provides 40% of people's news, so we can imagine why troll factories with malicious intent would want to exploit the FB news feeds with weaponized misinformation to exploit fault lines in democracy and intensify conflicts for the sake of undermining democracy, elections, trust in government, etc. Bad actors have learned that social media is an Information Ecosystem that can turn a democracy upside down.
Two. FB does not edit its news. It relies entirely on the business model of algorithms that optimize virality.
Three. FB has created through micro-targeting news feeds "hyper-partisan" tribalistic bubbles with one group saying, "They're terrible. We're the best." This has become part of cancel culture, a detriment to democracy and critical thinking.
Four. We don't engage in adult dialogue but in lizard-brain, reptilian dialogue based on primal fear and anger.
Five. Spreading misinformation in a social media ecosystem has led to violence and genocide, notably the killing of Muslims in Myanmar and the killing of Ukrainians after Russian trolls staged fake footage of Ukrainians crucifying babies in caves. When confronted with their responsibility in these acts of genocide, FB top executives made studied expressions of concern and offered nothing more than a "we need to have an ongoing conversation about these serious matters," which is code for "we refuse to do anything, and we are not responsible."
Six. One FB executive admits FB is ill-equipped to deal with weaponized misinformation on its platform. He says, "It's a problem you don't solve. It's a problem you contain." But even containment is doubtful. In fact, weaponized misinformation and the violence it creates are out of control.
Andrew Marantz’s “Free Speech Is Killing Us”
(Parenthetical citations my own)
(Social media platforms deny accountability for their content by hiding behind two false claims: the First Amendment right of free speech, and the idea that words can't translate into violence.)
There has never been a bright line between word and deed. Yet for years, the founders of Facebook and Twitter and 4chan and Reddit — along with the consumers obsessed with these products, and the investors who stood to profit from them — tried to pretend that the noxious speech prevalent on those platforms wouldn’t metastasize into physical violence. In the early years of this decade, back when people associated social media with Barack Obama or the Arab Spring, Twitter executives referred to their company as “the free-speech wing of the free-speech party.” Sticks and stones and assault rifles could hurt us, but the internet was surely only a force for progress.
(The idea that social media is benign and can't be held to account for violence has lost credibility in the face of recent bloodshed.)
No one believes that anymore. Not after the social-media-fueled campaigns of Narendra Modi and Rodrigo Duterte and Donald Trump; not after the murder of Heather Heyer in Charlottesville, Va.; not after the massacres in a synagogue in Pittsburgh, two mosques in Christchurch, New Zealand, and a Walmart in a majority-Hispanic part of El Paso. The Christchurch gunman, like so many of his ilk, had spent years on social media trying to advance the cause of white power. But these posts, he eventually decided, were not enough; now it was “time to make a real life effort post.” He murdered 51 people.
(In addition to violence, Alt-Right conspiracy memes can even influence national policy in favor of the bigots' views.)
Having spent the past few years embedding as a reporter with the trolls and bigots and propagandists who are experts at converting fanatical memes into national policy, I no longer have any doubt that the brutality that germinates on the internet can leap into the world of flesh and blood.
(The heart of the argument from which we derive our thesis)
The question is where this leaves us. Noxious speech is causing tangible harm. Yet this fact implies a question so uncomfortable that many of us go to great lengths to avoid asking it. Namely, what should we — the government, private companies or individual citizens — be doing about it?
(A common argument is that nothing should be done to censor social media content because such censorship is a form of fascism and "thought police.")
Nothing. Or at least that’s the answer one often hears from liberals and conservatives alike. Some speech might be bad, this line of thinking goes, but censorship is always worse. The First Amendment is first for a reason.
After one of the 8chan-inspired massacres — I can’t even remember which one, if I’m being honest — I struck up a conversation with a stranger at a coffee shop. We talked about how bewildering it was to be alive at a time when viral ideas can slide so precipitously into terror. Then I wondered what steps should be taken. Immediately, our conversation ran aground. “No steps,” he said. “What exactly do you have in mind? Thought police?” He told me that he was a leftist, but he considered his opinion about free speech to be a matter of settled bipartisan consensus.
(The author compares the refusal to address free speech in the face of violence-provoking memes with the refusal to address gun rights in the face of gun violence. Is this a fair comparison?)
I imagined the same conversation, remixed slightly. What if, instead of talking about memes, we’d been talking about guns? What if I’d invoked the ubiquity of combat weapons in civilian life and the absence of background checks, and he’d responded with a shrug? Nothing to be done. Ever heard of the Second Amendment?
(First Amendment doesn't apply to private companies.)
Using “free speech” as a cop-out is just as intellectually dishonest and just as morally bankrupt. For one thing, the First Amendment doesn’t apply to private companies. Even the most creative reader of the Constitution will not find a provision guaranteeing Richard Spencer a Twitter account.
(Even if social media platforms were government entities such as a public utility, there would be no absolute free speech.)
But even if you see social media platforms as something more akin to a public utility, not all speech is protected under the First Amendment anyway. Libel, incitement of violence and child pornography are all forms of speech. Yet we censor all of them, and no one calls it the death knell of the Enlightenment.
(Free speech exists in constant tension with public safety; the issue is balancing the two, not granting one absolute priority over the other.)
Free speech is a bedrock value in this country. But it isn’t the only one. Like all values, it must be held in tension with others, such as equality, safety and robust democratic participation. Speech should be protected, all things being equal. But what about speech that’s designed to drive a woman out of her workplace or to bully a teenager into suicide or to drive a democracy toward totalitarianism? Navigating these trade-offs is thorny, as trade-offs among core principles always are. But that doesn’t mean we can avoid navigating them at all.
(History shows that failure to censor has led to genocide.)
In 1993 and 1994, talk-radio hosts in Rwanda calling for bloodshed helped create the atmosphere that led to genocide. The Clinton administration could have jammed the radio signals and taken those broadcasts off the air, but Pentagon lawyers decided against it, citing free speech. It’s true that the propagandists’ speech would have been curtailed. It’s also possible that a genocide would have been averted.
(Marantz clarifies his thesis: censorship is a constant judgment call made in the interests of public safety, as opposed to an outright repeal of the First Amendment.)
I am not calling for repealing the First Amendment, or even for banning speech I find offensive on private platforms. What I’m arguing against is paralysis. We can protect unpopular speech from government interference while also admitting that unchecked speech can expose us to real risks. And we can take steps to mitigate those risks.
The Constitution prevents the government from using sticks, but it says nothing about carrots.
(Marantz offers solutions short of censorship, including "news literacy.")
Congress could fund, for example, a national campaign to promote news literacy, or it could invest heavily in library programming. It could build a robust public media in the mold of the BBC. It could rethink Section 230 of the Communications Decency Act — the rule that essentially allows Facebook and YouTube to get away with (glorification of) murder.
(U.S. government should create a competing public utility, a social media platform that competes with Facebook.)
If Congress wanted to get really ambitious, it could fund a rival to compete with Facebook or Google, the way the Postal Service competes with FedEx and U.P.S.
Or the private sector could pitch in on its own. Tomorrow, by fiat, Mark Zuckerberg could make Facebook slightly less profitable and enormously less immoral: He could hire thousands more content moderators and pay them fairly. Or he could replace Sheryl Sandberg with Susan Benesch, a human rights lawyer and an expert on how speech can lead to violence. Social media companies have shown how quickly they can act when under pressure. After every high-profile eruption of violence — Charlottesville, Christchurch and the like — tech companies have scrambled to ban inflammatory accounts, take down graphic videos, even rewrite their terms of service. Some of the most egregious actors, such as Alex Jones and Milo Yiannopoulos, have been permanently barred from all major platforms.
(Marantz raises this question: Should the government favor racists' free-speech rights while showing no concern for that racist speech leading to crosses burning on people's lawns? Is that connection valid? Should you raise this question in your counterargument section?)
“We need to protect the rights of speakers,” John A. Powell, a law professor at the University of California, Berkeley, told me, “but what about protecting everyone else?” Mr. Powell was the legal director of the American Civil Liberties Union in the late 1980s and early 1990s, and he represented the Ku Klux Klan in federal court. “Racists should have rights,” he explained. “I also know, being black and having black relatives, what it means to have a cross burned on your lawn. It makes no sense for the law to be concerned about one and ignore the other.”
Mr. Powell, in other words, is a free-speech advocate but not a free-speech absolutist. Shortly before his tenure as legal director, he said, “when women complained about sexual harassment in the workplace, the A.C.L.U.’s response would be, ‘Sorry, nothing we can do. Harassment is speech.’ That looks ridiculous to us now, as it should.” He thinks that some aspects of our current First Amendment jurisprudence — blanket protections of hate speech, for example — will also seem ridiculous in retrospect. “It’s simpler to think only about the First Amendment and to ignore, say, the 14th Amendment, which guarantees full citizenship and equal protection to all Americans, including those who are harmed by hate speech,” he said. “It’s simpler, but it’s also wrong.”
(To stand for absolute free speech is an easy, lazy position that refuses to acknowledge the relationship between speech and the violence it incites.)
I should confess: I used to agree with the guy I met in the coffee shop, the one who saw the First Amendment as an all-or-nothing dictate. This allowed me to reach conclusions with swift, simple authority. It also allowed me to ignore a lot, to pretend that anything that was invisible to me either wasn’t happening or didn’t matter.
(Racist speech can be compared to air pollution; it can reach lethal levels.)
In one of our conversations, Mr. Powell compared harmful speech to carbon pollution: People are allowed to drive cars. But the government can regulate greenhouse emissions, the private sector can transition to renewable energy sources, civic groups can promote public transportation and cities can build sea walls to prepare for rising ocean levels. We could choose to reduce all of that to a simple dictate: Everyone should be allowed to drive a car, and that’s that. But doing so wouldn’t stop the waters from rising around us.
Recently, Facebook included Breitbart News in its Facebook News section.
Excerpts from Charlie Warzel's "Why Will Breitbart Be Included in 'Facebook News'?"
Parenthetical citations are mine.
(Facebook is now partnering with an Alt-Right troll media outlet.)
It’s into this environment that, on Friday, Facebook announced Facebook News — a curated section on the social network that will partner with news publishers. Facebook will pay for content from dozens of partners, including The Times, The Washington Post, Business Insider and others.
But any hope that the takeaway from the announcement would be “Facebook saves the news” was quashed by the inclusion of one unpaid partner: the far-right online outlet Breitbart News.
The site, formerly run by Steve Bannon, is known for its unabashed pro-Trump activism and early embrace of toxic online politicking and trolling. Breitbart has published articles with tags like “Black Crime” and was once described by Mr. Bannon as a platform for the alt-right. A 2017 BuzzFeed News exposé detailed, via obtained emails, how Breitbart actively courted the right-wing online fringes and helped to launder white nationalist talking points into the mainstream. Since 2016, more than 4,000 advertisers have severed ties with Breitbart over its ideological bent, according to the Sleeping Giants founder Matt Rivitz.
For some, Breitbart’s inclusion among its select news publishers is proof of Mr. Zuckerberg’s, and his company’s, political biases. Judd Legum, who publishes the newsletter Popular Information, reported recently that three Republican employees in the company “call the shots at Facebook” and that the social network “has repeatedly taken actions that benefit Republicans and the right wing.” Progressive critics have suggested that Mr. Zuckerberg is a Republican and that his company’s ethos leans to the right as well. “Facebook is a conservative outlet,” Adam Serwer, a journalist at The Atlantic, tweeted last week. “When conservatives criticize, they solemnly and apologetically promise to do better. When liberals criticize, they tell them to shut up.”
(Zuckerberg isn't so much Right-Wing as he is libertarian; he sees government as a threat to running his operation.)
Facebook’s decision to include Breitbart among its select publishers is clarifying, though perhaps not in the way many critics have suggested. It’s not an indicator of secret political bias; instead, it’s a small window into how Mark Zuckerberg and Facebook see the world. Here, the realms of government and media aren’t levers to achieve some ideological goal — they’re mere petri dishes in which to grow the Facebook organism. And when it comes to Facebook and Mr. Zuckerberg’s end game, nothing is more important than growth.
(Zuckerberg isn't about public safety or public interest or sanctimonious chants about "bringing people together"; he's about unlimited growth of his company.)
Growth has always been the end game for Facebook. The company’s onetime internal credo, “Move fast and break things,” was about a need for rapid, sometimes reckless innovation in service of adding more users, market share and ad dollars, while its early mission statement, “Make the world more open and connected,” was a friendly way of expressing a desire for exponential growth. The company’s new mission statement, “Bring the world closer together,” is a friendlier way of saying the same thing — after all, you can’t bring people closer together if you don’t acquire them as active users first. Growth at any cost is a familiar mantra inside Facebook as well, as an internal memo surfaced last year by BuzzFeed News revealed; subsequent investigations by The Times detailed a company “bent on growth.”
(Facebook takes a big you know what and leaves others to deal with the mess.)
But the costs of this growth — election interference, privacy violations — are passed on to users, not absorbed by Facebook, which takes a reputational hit but generally maintains, if not increases, market share and value. The real threat to Facebook isn’t bad P.R., it’s alienating its user base.
Through this lens, it makes perfect sense that Facebook should want to publicly court conservative audiences that seethe at what they perceive as Facebook’s liberal bias. And while the outcomes of Facebook’s decisions have serious political consequences, Mr. Zuckerberg and his fellow decision makers at the company view their decision to choose both publishers and off-the-record dining partners in terms of user acquisition strategy. According to Bloomberg, publications for Facebook News were chosen after surveying users and studying news consumption habits on the platform. Breitbart’s inclusion suggests that it checked enough of Facebook’s boxes, despite its toxicity. The same goes for dinner with Mr. Carlson, who launders white nationalist talking points and speaks to a large audience on cable TV every weeknight. The pattern is clear: If an entity or individual achieves a certain level of scale and influence, then the company will engage earnestly.
(Facebook benefits from "hyperpartisan vitriol.")
It’s telling that Facebook would look to Mr. Carlson or Breitbart and interpret a large audience and influence as a stand-in for authority and credibility. What else should we really expect from a company that refuses to meaningfully distinguish those who share hyperpartisan vitriol from those joyfully sharing baby pictures? When scale is the prism through which you view the world, that world becomes flat. When everyone becomes a number, everyone starts to look the same.
Because Mr. Zuckerberg is one of the most powerful people in politics right now — and because the stakes feel so high — there’s a desire to assign him a political label. That’s understandable but largely beside the point. Mark Zuckerberg may very well have political beliefs. And his every action does have political consequences. But he is not a Republican or a Democrat in how he wields his power. Mr. Zuckerberg’s only real political affiliation is that he’s the chief executive of Facebook. His only consistent ideology is that connectivity is a universal good. And his only consistent goal is advancing that ideology, at nearly any cost.
Recommended Outline
Paragraph 1: Your introduction discusses the complexities of free speech before the age of social media (pre-2010), when incitement to violence, community standards (the Gucci "blackface" sweater), defamation (slander and libel), and what constitutes emotional damage in a work or educational environment were already contested issues.
Paragraph 2: Support or refute the claim that in a digital world (post-2010), social media platforms need to be either censored, shunned, or boycotted.