We Forgot About the Most Important Job on the Internet

Mar 13, 2020 · 92 comments
GailB (NYC)
I love the notion that moderators could rise up and balance out the meanness of social media. Thanks for this new and hopeful perspective.
JUHallCLU (San Francisco Bay Area, CA)
Content "moderators" sadly don't do very good work, generally. They haven't seemed to prevent trial by Twitter, or Facebook/Twitter mobs, from wrongfully lynching people using false information. #MeToo has also participated in some of the lower-profile "lynchings." Where were the "content" moderators then?
Candlewick (Ubiquitous Drive)
When I began commenting here seven years ago, it was a different place. Getting a comment published was like hitting the lottery. There was very little back-and-forth counter-commenting, taking others' comments out of context, or simply not comprehending them. Now, far too many sound like they are copying and pasting their comments from Twitter and Facebook. Back then, there was a full staff of human moderators and a Community Editor. In 2017, that all changed when the NYT implemented Google's AI "Perspective" software and downsized its moderator staff. The software no longer distinguishes between literal and figurative speech, or catches faulty analogies and all manner of other nuances of human speech; and it shows. The published Community Standards have also changed. Although the NYT comment threads are still a cut above the rest, the quality has been greatly diminished from what it was. Yes, the number of articles open for comment has greatly expanded and the number of commenters has greatly increased; but more is not necessarily better. Lastly, there needs to be an "edit" feature with a 2- or 3-minute allowance to allow the ability to *think* before hitting submit.
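[Editors' note: Candlewick refers to Google's "Perspective" software. As a rough illustration of why such tools struggle with nuance, here is a sketch of the request shape such a system scores, based on Perspective's publicly documented TOXICITY attribute; the helper name and sample text are invented for illustration.]

```python
import json

# Illustrative sketch only: the shape of a single-comment scoring request
# as publicly documented for Google's Perspective ("Comment Analyzer") API.
# The TOXICITY attribute name is real; the helper and example text are invented.
def build_analyze_request(comment_text: str) -> str:
    """Return the JSON body for one comments:analyze call."""
    payload = {
        "comment": {"text": comment_text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    return json.dumps(payload)

# Note: the request carries only the comment's own text -- no thread
# context -- which is one reason figurative speech is hard to score.
body = build_analyze_request("That argument is dead on arrival.")
```

Because each comment is scored in isolation, an idiom like "dead on arrival" arrives with no surrounding thread to signal that it is figurative.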
Charles Packer (Washington, D.C.)
I am so glad there is still Usenet. By virtue of being unmoderated, it's the only outpost on the internet of true freedom of speech. In fact, I recently posted there why Reddit is uncongenial: who wants to have his posting vetted by some moderator who is likely one third his age?
Paris Spleen (Left Bank)
The main problem with the moderators at the NYT is political bias. It seems pretty clear to me they have a moderate-liberal tilt; anything European Left or Hard Right is beyond their Overton window. Someone should do a post-election-season statistical analysis of how many pro-Biden comments received a NYT pick versus how many pro-Sanders comments received it. But still, this is probably quibbling on my part. The paper's comment section is still one of the best I regularly read.
Florian Marquardt (Nuremberg)
Since the New York Times comments section seems to me the best there is on the internet, I am curious to know: how many moderators need to work on this? How many comments have to be deleted, and what do they typically contain? Is a good comments section self-reinforcing, in that problematic people would find it boring to comment here?
gary e. davis (Berkeley, CA)
I get disgusted with Disqus sometimes, because an effort to give a thoughtful comment (drafted offline before posting) at a platform using the service results in "Detected as spam. Thanks, we'll work on getting this corrected." But I have no idea why a thoughtful comment is detected as spam. Was it because a word in an article about a controversial subject is one of the keywords that causes rejection of the comment when I quote from the article? One never knows. The "we" of "we'll work on…this" is somebody's algorithm boilerplate. My rejected comment remains visible for my account--visible to me only--and can't be deleted! That kind of thing causes me to just not comment. Who cares? In the first place, I have better things to do than seek attention, so only thoughtfulness matters to me. I want to be useful. If that's not reliably feasible, then good riddance to the frivolity, vanity, and impulsiveness that the algorithms welcome.
Ben (Florida)
@Tom: Great response. But you misrepresent what I said in your summary. You said no algorithm can replace a human because we will have to assume it is sentient. I said that if an algorithm is ever capable of replacing a human being, which I deemed practically impossible, we will have to assume it is sentient. I do not believe that sentience precludes artificial intelligence. I do believe that properly demonstrated intelligence implies sentience. There is something called the zombie paradox in philosophy. How do we know that other people are sentient, rather than being automatons who just pretend to be human? The philosophical answer is that we have to assume that people who act like us have the same motives and feelings as us. The religious answer is that everything which exists has its own relative share of divine intelligence. The physical answer is that we do not know that we have our own free will and sentience, and as ephemeral as it is for us, it is almost impossible to assign it to external beings.
Frances Grimble (San Francisco)
I certainly miss the pre-Facebook days, when most moderators felt a sense of duty to the whole group, and did not view the group as their personal ego project based on who they liked and disliked. These days, some actually promote bullying.
Kathleen (Michigan)
We only have so much time to read, so I'm grateful to the moderators on those few sites I frequent. Long ago I stopped reading comments in various media because the comments were uninformed, trollish, or just ads for a product or site. Even when they supported my own position it was a waste of time. The NYT has a variety of opinions and they are usually well informed. I've learned a lot from the excellence of some comments, sometimes more than from the column, even when I've disagreed. Sometimes I've changed my view based on comments. Even on hobby forums the moderators have a big job, and those are politically neutral. I know one who has volunteered his time, and it was a huge task. In the current primary I've formed ideas about the candidates' campaigns from comments sections. Repetition of the same talking points I just skip over if I see them again and again (many seem to just cut and paste the same ideas). I'm hoping after the election it will change. Thanks to The Times comment moderators. It is why I subscribe and read here.
Mike S. (Eugene, OR)
I left FB 2 years ago and don't miss it in the slightest. I find the NYT comments more interesting and thought out. Here, there are far fewer of the back-and-forth arguments between two people who are not going to change either their mind or the subject (Franklin's definition of a fanatic). What I would like to see are comments that have been edited at least once in their writing. Editing means pausing or filtering, and if there is one thing the Internet badly needs, it is "before tapping, engage brain." Having a way to delete one's comment would also be useful.
Ben (Florida)
It seems to me that the people who enforce Terms of Service have the most power of anyone on the internet. Imagine if they held Trump responsible.
Kathleen (Michigan)
On one hand, moderators need to moderate hate speech and obvious lies. But on the other hand, with lies, sometimes letting them stand shows us what kind of information is being spread. And when a bunch of people challenge them, it can be educational for others who've been hearing them anyway. With vile and blatant hate speech we need moderation. But there's a gray area. That is sometimes in the territory that people call "overly woke." Especially if it applies only to certain groups. If you are allowed to demean one group and not another, what does that leave us with? I frequently see ageist comments, and rarely does anyone object. Even wanting all the elderly to die off seems fair game. Saying this about any other group would be shocking. Is "OK, Boomer" an ageist slur? I often address these, because I think othering of any kind attacks our humanity. It's also true sometimes of younger people, calling them slackers or lacking common sense. But filtering all such comments would also be a mistake. In the current election we have the three oldest candidates in history. How do we discuss this? The electability of women seems fair game, but is it the same discussion for a gay or transgender candidate? In my opinion it's healthy to have these discussions. But where do we draw the line within a gray area? And we should have a line. That will certainly take a human moderator, and an objective one at that.
EJ (Nes Ziona)
Content moderators are important, but too often they are content censors. One need not go further than this newspaper. Please, surprise me and publish this post. Without cuts.
ChristineMcM (Massachusetts)
"To cope, some companies have tried to replace human moderators with algorithms. The results have been mixed at best." That only stands to reason. How can an algorithm discern between a typo, an intentional racist implication, or a valid, if blunt, political observation? As comments have proliferated everywhere in this nightmare political age--and are likely to do so even more with this global pandemic, when people have too much time and too few outlets to express things--the role of moderator becomes more important than ever. The Times knows its comment community is the best in the business. To keep it that way, why not hire more moderators to keep up with the flood of posts from people desperate to get their feelings out?
Tom (Washington DC)
Anonymity dis-inhibits. Public expression has never been a recognized means solely to relieve yourself. There isn't a precedent in human history where any man could unleash his most base impulses at a global audience without facing any consequences whatsoever. That is exactly what Twitter, Facebook, etc. capitalize on. On this very day, when people seek information on the coronavirus, they do not turn to Facebook for answers. Facebook is the landfill of information. As soon as your life depends on it, you quickly rediscover the value of truth (and science as the only access path to it).
Kathleen (Michigan)
@Tom And when Kushner does turn to Facebook, it's big news, as it should be.
Rhporter (Virginia)
This was a strange article. If you think for yourself you don't need someone to moderate your comments. Generally moderators exist as security guards and bouncers, not facilitators. For example, take WSJ/Fox: they block anything too anti-Trump. Anyway, it was also strange to read about moderation at The Root. The Root has some interesting pieces, but is debased by widespread juvenile profanity. It is also difficult to post comments on that site.
Tom Fiorina (France)
I joined an internet start-up in 1998 in Fremont, CA, started by Hotmail cofounder Sabeer Bhatia. Arzoo was to be a platform of expert moderators on subjects such as music, fashion, art, cinema, etc. It went bust, like many start-ups of that epoch. I agree that moderators could make a valuable contribution to improving the internet of today.
Jeffrey Herrmann (London)
What a job. It’s like having to spend your days at the town garbage dump sifting out the nastier stuff from the ordinary garbage.
Dan (California)
I wholeheartedly support your view that moderators are invaluable. You mentioned MetaFilter. It's one of the most civil, positive, and constructive online communities I've experienced, and I think that's in great part due to the diligence of its moderators.
Carol (Newburgh, NY)
I usually do not read op-eds in the NYT except for Margaret Renkl and occasionally Thomas Edsall (the rest are boring). I skimmed this one. I am not tech savvy and am not interested in social media. But I do read many comments (not the extra-long ones) on the op-eds and articles. I avoid those without a comment section. I enjoy putting comments on the Times, especially after having a couple of glasses of wine! I've been reading the Times since the '60s. In those days and for many years, there were letters and ads (I avoided them). My, how times have changed! Ads popping up/out at me; too much movement, as on this op-ed; too distracting. But thanks for the comment sections. It gives me something to do besides playing with my dog and taking him to the park, reading, reading, reading books, playing my piano, doing crosswords, going out to lunch, going to the gym/exercising, researching stuff on the internet.
Hydraulic Engineer (Seattle)
I am totally on board with this! The NY Times comments section that you are reading now is a perfect example. The difference between the comments published here and those in virtually every other online publication I have seen is night and day. Other comment sections are crammed full of angry, disrespectful comments, disinformation, ignorance, etc. People attack each other, using language that is inflammatory and just plain mean. Perhaps more important, I often look at the NY Times comment section for the valuable critiques and relevant personal experience from the readership regarding the content of the article at hand. I often find that the commenters add important new information from people of all stripes. It would not be so if it were necessary to wade through piles of comments from angry people with nothing to add but their bile. I also find it enlightening to check the number of "Recommend" clicks a particular comment gets compared to other comments. It's a quick, though non-random, poll of NY Times readers.
Bruce Michel (Dayton OH)
Perhaps a way to mitigate the commentary that never should have been written is to not allow submission by pseudonym.
Passion for Peaches (Left Coast)
@Bruce Michel, that is an old, dead point of discussion. It has been the subject of long comment threads in this paper, so I won't even bother. But I will mention that no one here knows whether you are really, legally, Bruce Michel of Dayton, Ohio. Even the newspaper doesn't know if that is your real name. So who cares what your posted handle is?
Positively (4th Street)
The NY Times moderates the bejeepers out of me!
Elle Roque (San Francisco)
Censors are moderators, too.
Stephen Merritt (Gainesville)
Mx. Newitz (whose writing I admire) is right, but I can't imagine for-profit companies working from the value system that's necessary to carry out their recommendation. I'd love to see the reorganization of the internet that would make it more humane, but unless people with the values of the Sisters of Harriet get to be decision makers, I'm afraid it won't happen.
Alexander (Toledo)
Facebook, Twitter, and the comments sections of news sites aren't comedy clubs. They're the public squares of the digital age. That means that moderation is effectively censorship -- not *technically* censorship, because it's not done by the government, but *effectively* censorship, because it stops people from saying what they think where others are likely to hear it. It even causes self-censorship, because "Facebook jail," like real jail, can mean being cut off from your family, friends, and professional activities.
McGloin (Brooklyn)
Yes, we need human moderators to keep people honest and polite. Web companies think they have no responsibility to their customers, the nation, or the people of the world. They refuse to pay for the basic necessities of social media, like moderators and fact checkers. It's not like these companies can't afford them. They would just rather make more profits, no matter the cost to society, our Constitution, or the Earth. We the People have a Constitution that says we should "regulate trade." Let's save the Constitution by implementing the Constitution.
The View From Downriver (Earth)
Moderator is not just the most important job on the Internet, it's also the most thankless. Some niche topic boards I was on were simply shut down after the moderators threw up their hands and decided it was more of a second job than anything else. One run by a small company shut down after the drain on the "day jobs" caused by constant moderation was computed.
Chris Morris (Idaho)
The AI behind the internet is much clunkier than we imagine. E.g.: weeks ago we did a search of a furniture store's inventory for TV stands to get an idea of their stock, liked what we saw, drove there that day and bought one in store. Done! To this day we continue to get push ads popping up in front of us everywhere--news sites, FB, Google, everywhere--yet we are not in the market any longer. It's like those old-days subscription renewal cards that continued to appear in your mail for months after payment was rendered. It's insanity.
McGloin (Brooklyn)
@Chris Morris Yes, I'm still getting ads based on a web search I did two years ago.
Kathleen (Michigan)
@Chris Morris Occasionally I'll get a gift for someone online, and then I'll get ads for that same type of product for a long time. AI and the algorithms are getting more and more sophisticated, as Andrew Yang discussed in his campaign. I don't know if that's better or worse. Worse, I think!
Chris M (Boston)
This is predictable coming from the NYT. It feels like a justification for censoring certain commentary. In the case of the NYT it regularly if not incessantly promotes a feminist narrative but then censors comments in any way critical of that. If content is critical of men, and that's a lot at the NYT, then you get to comment away. Comments are rarely even allowed for the more ardent feminist content the NYT promotes. Clearly the moderators are far too vested in a feminist bias. Please work on your own objectivity NYT before you allow lectures on everyone else's online responsibilities.
Jennie (WA)
Moderated comment areas are the only ones worth reading.
Mark (BVI)
What kind of skills does being a moderator require beyond the ability to read?
James Ketcham (Los Angeles)
‘NYT Picks’ often leads with a few dead rats, suspect things to hold at arm’s length. Sterling examples of bad thinking, they can be informative. Then on to the better comments. For social media, removing anonymity, verifying and confirming the source before allowing the post, liability and accountability would be a giant step forward. At this point, though, using your real name is begging for anonymous abuse.
Kathleen (Michigan)
@James Ketcham Removing anonymity would be a big mistake due to what you mention, online abuse. Let the ideas stand on their own. Or be moderated on their own.
W in the Middle (NY State)
Actually not… Most of the Internet is – and has been, for some time – made of images… And the mass-communication narrative-image phase change was presaged almost a century ago by the radio-television phase change… Let me prove it to you, right here, Annalee… Your article and headers/footers – but sans comments – less than 8000 characters… i.e. ~8 kilobytes of data/information… Go ask your graphic arts folks how much data capacity needed for your lead-in pic… Which is relatively simple, compared to an actual photo… Now, we could comment on such a photo – even if limited to 140 characters – in (at least) 3 ways: 1. Description 2. Interpretation 3. Judgment With this, there is no bright line between captions and comments, whether for image or narrative… What’s made things interesting for humans is the re-entrancy/recursiveness of #’s 1-3… i.e. I can describe, interpret, or judge these sorts of comments – and keep going… See, the punch line – in this comment – is this: None of A, F, G, or M have any problem targeting ads, based on analytics – and with very few humans in the loop… Do you mean to say you think they couldn’t aim that same sort of horsepower at semantic analysis? They don’t want to – any more than the US healthcare-combine wants to make diagnostic testing affordably and readily accessible… That’d mean the machines had won… Then – anybody could do what they’re doing…
poslug (Cambridge)
Not blocking Trump's blatant lies and attacks has the effect of validating their veracity to the morally challenged and cult minds. Therein lies the major problem: when the rules do not apply to all, including our elected but defective leadership.
Ltj (Florida)
I'm not sure comments sections add much to the world, as enlightening as some (like the NYT comments) might be. Far too many are cesspools of racism and toxicity. We'd gain more than we lose by getting rid of them entirely.
Concernicus (Hopeless, America)
@Ltj Disagree completely. I often find myself as enlightened, sometimes more so, by the comments than I do by the original Op-Ed or article. This is especially true when the commenters post an opposing viewpoint from the original piece. It really makes me look at the other side. You are correct about toxic comments which the NYT tries to minimize. One other thing that really grates on me is when people just have to inject Trump into pieces that have nothing to do with Trump. Still, on the whole, I find the comment section to be a net plus.
Kathleen (Michigan)
@Concernicus Even when people inject their chosen candidate or their candidate's talking points into something that has nothing to do with it. It reflects badly on the candidate and makes people less likely to consider them.
Passion for Peaches (Left Coast)
I love that this article on comments is open to comments. The only place where I read and contribute to comment threads is here on the Times. I enjoy a good discussion. Unfortunately, since the paper started using algorithm moderation I've noticed an increase in "why the heck did that post?" comments. I flag them when I see them, but they remain posted. I'm not surprised that the bot doesn't get it right 100% of the time, but I often question the decisions of the humans who review the flags. But what really gets me down is the "incel" language that always turns up in comment threads appended to articles or essays on women's issues. I don't know whether a bot can be programmed to recognize abusive viewpoints (minus the obvious slurs), or whether such viewpoints should even be censored. What I can tell you is that, as an older woman who has experienced misogyny and discrimination in the workplace, and who has also been the victim of sexual assault, I feel that such rhetoric is harmful. For me, it is hate speech, loosely disguised.
Gabby K (Texas)
@Passion for Peaches I have noticed this also.
Ben (Florida)
Civility is a virtue.
Andrew (USA)
Moderators are needed gatekeepers, but they are also judge, jury, appeals court, and executioner rolled into one. Who will moderate the moderator?
McGloin (Brooklyn)
@Andrew Yes, if our comments are taken down by a moderator, we should be notified with an explanation, and given a chance to appeal the decision to the management of the moderation department.
J House (NY,NY)
Platforms such as Quora are currently filled with posts from China accusing the U.S. of creating the coronavirus...a disinformation campaign by the CCP that is underway with full force. The President needs to order the U.S. intelligence community to declassify their assessment of the origin of the virus. Let's prove to the world where it originated, and how.
Nyu (PA)
I always wondered why we don't have official "internet police" even though they're everywhere on our streets. With all these IoT devices storing personal data and controlling everything in your home, you would expect someone to create this as a public service, rather than these private companies charging you to protect you. I feel like the US seems to be one step behind some Asian countries on this one.
Jonathan Katz (St. Louis)
Lots of ambiguity in any human language. For example, the quote from the article "Don't discuss this because you are black" could mean either: 1. Because you are black you don't have the right to discuss this; 2. Being black isn't a reason to discuss this. Algorithms will never resolve ambiguities like this one, and there aren't enough human moderators (especially considering automated mass distribution of comments). Better to close the internet entirely.
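[Editors' note: the ambiguity Katz describes can be made concrete with a toy keyword filter. Both of his readings share one and the same surface string, so a pattern matcher renders the same verdict on a denial of someone's right to speak and on a benign observation. Everything in this sketch--the pattern list, the function name--is invented for illustration.]

```python
# A toy keyword-based "moderator". All patterns and names are invented
# for illustration; real systems are more elaborate but face the same limit.
FLAGGED_PATTERNS = ["because you are black"]

def naive_flag(comment: str) -> bool:
    """Flag a comment if it contains any blocked surface pattern."""
    lowered = comment.lower()
    return any(p in lowered for p in FLAGGED_PATTERNS)

# Katz's two readings correspond to the identical surface string, so the
# filter cannot tell reading 1 (a denial of rights) from reading 2
# (a benign "race isn't a reason" observation): one string, one verdict.
sentence = "Don't discuss this because you are black"
verdict = naive_flag(sentence)  # the same answer for both intended meanings
```

Disambiguating the two readings requires context the string itself does not carry, which is exactly the gap Katz says algorithms will not close.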
archipelago (usa)
When I think about comments, I think most particularly about these NYTimes comments, the most successfully moderated of any I've seen on the Internet in 25 years of use. Having said that, they seem completely ephemeral. And I don't know how else they should be. The best can be more insightful than the accompanying article, but it is part of the passing avalanche of news. I think the NYTimes still publishes conventional letters to the editor as a separate channel somewhere in the paper, which probably becomes part of the official record. But that works because there are so few that get recognized that way.
Cynthia Hennecke (Albuquerque, NM)
I noticed that comments here are moderated for civility. One issue with the Wild West culture of social media is that so many people (often not actual people at all) are so hateful and ugly and divisive that it widens the gap and increases the generalizations about "that other group." Also, editors in publications serve a valuable purpose: they check the facts (for one thing), and facts are sorely lacking in the comments of many on those social platforms. Don't even try to cite the demonstrable truth; they don't want to know. It's amazing and discouraging and disgusting at the same time. We need moderators, just as our verifiable publications need editors. Yes, there will still be (sometimes shocking) bias that pops up (even in our print media), but the lies and hate will be diminished.
Cindy (Vermont, USA)
It's time for internet-based content to be held to the same standards and accountability as other forms of mass media.
old soldier (US)
Ms. Newitz, thank you for sharing your thoughts and ideas with regard to managing content on the internet. It sounds like there is a sincere effort, at least outside of the executive suite, to keep people with corrupt or criminal intent, sleazy political operatives, people with no social filter, and the mentally ill in check on the internet. That said, as I read this opinion, Bob Dylan's words jumped into my head: "It won't work because the vandals took the handle."
Linda (Georgia)
I detest reading articles that don't allow any comments to be made whatsoever! Some articles appear to be full of "facts" that are very debatable, but don't allow comments from readers in order to discuss them. Also, Snopes and other "fact-checkers" are supposed to be just that: fact checkers. But upon researching what they had to say about the once-famous Oprah comment on her show that there were "many more paths to salvation other than Jesus," I found an awful lot of opinion on the part of the "fact-checkers" of Snopes, which infuriated me, since I heard the comment she made myself the day she made it!
Kathleen (Michigan)
@Linda Snopes used to be better than it is now. For one thing, there are rumors that they don't address and it appears to be based on their political viewpoint. I've looked at the original and at their fact-checking, and the fact checking was cherry-picking. Not sure what happened to it. I'll still take a look but I'll also look at the original source and make up my own mind.
Ben (Florida)
I hate it when people mistake the enforcement of basic decency for censorship. I also hate when people mistake being ignored for censorship. These are the pitfalls of moderators, as people would rather be victims than accept responsibility when it comes to the internet.
Ben (Florida)
I am completely against genuine censorship. I just think false accusations hurt the effort to prevent the real thing.
Harvey Botzman (Rochester NY)
In the infant years of the internet I was a moderator on a BBS (Bulletin Board System). Being a moderator involves a lot of uncompensated time.
Watercannon (Sydney, Australia)
The NYT comments system is one of the best. But here are my suggested improvements:
1. Send rejection notifications, even if you don't want to spend time adding a reason. This would resolve what could otherwise be a long moderation delay, and give the commenter a copy of their submission so they can re-submit.
2. Send reply notifications, so there's a better chance of debate before comments are closed.
3. Use "NYT Replies" author comments more often. Without them, it gives the appearance of ivory-tower writers moving on to their next topic, leaving the plebs to irrelevant fights over their missives. If necessary, pay your writers to do this.
4. Eliminate the advantage of early, perhaps less thoughtful, comments by allowing comment submission from the start, but holding back publication for some hours, eventually publishing them in random order.
5. Provide a random sort option that gives every comment a better chance of exposure.
6. Include comments with fewer than two recommendations at the bottom of the "Readers Picks" tab, allowing one to read all in recommended order.
7. Allow one's comment history and the comments one has recommended to be viewed.
Passion for Peaches (Left Coast)
@Watercannon, I absolutely disagree with number 7, if you mean viewed by everyone. Many years ago I participated on a comment board that allowed one’s comment history to be viewed by all (they eventually gave users an option to make their history private). Abusive commenters would look up the history of people they were targeting, and would cut and paste their past comments into threads. It was ugly.
Kathleen (Michigan)
@Watercannon Especially like #3, 4, and 5. Great ideas.
Kathleen (Michigan)
@Passion for Peaches Definitely disagree with #7 for the reason you mention. In addition, I've been on sites where this was done and some people became superstars and their opinion carried undue weight. The way it is now, each comment stands on its own merit. Plus, I've made some stupid comments or have changed my mind later. I'm not a politician and don't want to be challenged when I change my mind. Even some politicians shouldn't be, though sometimes it's warranted.
Eli (Austin)
A huge shout out to the moderators who make the NYT such an incredibly interesting and useful site. The value of the paper is inestimably increased by the comments section, I have learned SO much from other readers, and reading their thoughtful critiques encourages me that we are not all crazy. And heck, some days it is like therapy!
Bill Decker (Iowa City, IA)
From Day 1, the profession of Library Science should have been involved or should have involved itself in these issues. Unfortunately, for many in the profession, the Internet seemed like a threat rather than an opportunity. We have long relied on our libraries and media to provide us with information and with access to information, and I choose to be among those who value the roles they have played in this regard. I also choose to believe that they are not censors. They can guide me to relevant information on all sides of an issue or question--traditionally, they have done exactly that. It is not too late for these professions to address these "most important jobs" vis-a-vis the Internet.
manfred marcus (Bolivia)
Of course we appreciate the moderators' job in moderating the News, filtering out mis- and disinformation, and making facts more 'digestible' for those of us with limited knowledge (and who doesn't have it?). Still, we all need to educate ourselves on the go, and try to gather from diverse information outlets ('trust but verify'), so any given News Media, especially one known to be Trump's propaganda boy, i.e. Fox Noise, may be led into more responsible behavior. The Internet ought to be considered a 'marvel' of technology, of course, yet know that its health depends on all of us users, to keep it as uncontaminated as possible...from Trumpian content (lies and innuendos and fake news and conspiracy theories and the allowance of foreign interference in domestic affairs).
Anthony (Western Kansas)
We need to expand the role of moderators in the political realm and keep out hate speech and obvious lies. This would help our political world. Lies that are not obvious can be vetted and taken down as soon as possible. There is no reason to allow obvious lies on a large platform.
alboyjr (NYC)
This is such a wonderful idea, but I fear it will never happen. Of course there needs to be moderation (in all things!) but the evolution of the Internet has made "information should be free" a mantra. Freedom yes, but with a dollop of regulation. Anarchy is not the answer to a great society.
jd (ga)
This article disgusts me on so many levels, as it is antithetical to one of the underlying premises of the internet, which is freedom in the absence of gatekeepers. This author is arguing for replicating the gatekeeper world in cyberspace. There needs to be a place where people can speak or type their mind without fear of consequence. This author is arguing for the regulation of ideas and how those ideas are disseminated. Big Brother, or should I say, Big Sister. This author is low-key arguing for censorship. Net neutrality will find great support in this author's words. The beauty of the internet is that it is destroying gatekeepers. I no longer have to depend on just three networks and a few news organizations to get news or programming. Moderators destroy freedom, as their bias will always pollute and water down art and expression. Think Huffington Post, which used to be a great platform for views all over the political spectrum, but then grew to screening out all comments critical of anything left of center. I absolutely detest censorship, and people who work in 9-to-5 careers with bosses who monitor their every thought, word and deed detest those who make a ton more money operating outside of that, without fear that they can be fired for something they say or post. This author should stay within her cog-in-a-wheel establishment world where everything has to be approved by someone over her. Meanwhile, for those of us who dared to unplug from the matrix, I say: leave us alone.
Jo (Right here Right now)
@jd - I disagree. Moderators are not censors; they apply rules as written by the group itself or the owner of the platform. You know who dislikes being pulled up on their ugly posts? Bullies! Trolls and bullies have been trying to destroy the internet, and have hurt real people, since the internet came into being. If you're in a group and feel the mods are picking and choosing favorites and punishing others, then you need to step up and stop it; but crying foul at the notion of having to be civil and factual etc. is out of line and implies you're one of the ones who needs muzzling.
Revelwoodie (Trenton, NJ)
There has never been, in the whole of human history, a time when people could "type their mind without fear of consequence." Nor should there be. Freedom from legal consequence? Certainly. That's what the first amendment is all about. No one wants to live in a country where people can be arrested for unpopular opinions. But human beings have always, ALWAYS, built a shared consensus around socially acceptable behavior and enforced it through social sanction. It's what separates us from animals, and from people who unironically tell others to "unplug from the Matrix." We're talking about comment moderation here. And if you find yourself in danger of being fired for something you post, then I suggest you take a serious look at what you are posting. And welcome a moderator who might give you the minor consequence of a temporary account suspension, thereby saving you from the more serious consequence of losing your job.
@jd there exist plenty of places with little or no moderation. You can even start your own!
Jo (Right here Right now)
A good moderator is so important to any group of people online. I recently left a private group, and then Facebook entirely, due to a poor moderator applying rules as she pleased rather than as they were written. It soured me on Facebook forever, because I now know there is no one else to turn to. Facebook doesn't care if groups don't follow their own rules. Rules MUST be fair and evenly enforced, and that's why I love online forums that have active, impartial mods. Thanks for all you do, moderators of the internets! And those that don't: a pox on you! (Or is it more current to wish coronavirus on baddies these days?)
FJS (Monmouth Cty NJ)
@Jo This is very common in my experience as well. I've noticed that many believe if something is good enough for them, it must be good enough for everyone. My question is: when does moderation cross over into the realm of censorship?
OldBoatMan (Rochester, MN)
Since I was not familiar with the Reddit forum r/science, I checked it out. Here is what it says about the work of its moderators: "Welcome to r/science! Our team of 1,500+ moderators will remove comments if they are jokes, anecdotes, memes, off-topic or medical advice (rules). We encourage respectful discussion about the science of the post." The Reddit forum r/science was pretty much what I expected. Threads started with a statement or question, often related to junk science, that invited discussion. There was no requirement that posts be "based on peer-reviewed scientific research".
Sándor (Bedford Falls)
@OldBoatMan Even worse, you likely spent more time fact-checking this piece than the author spent writing it. The quality of writing in The New York Times Opinion section has declined in the past decade. This piece right here is more of a glorified Wikipedia entry.
PJ (Colorado)
@OldBoatMan "Our team of 1,500+ moderators..." This illustrates the effort involved in properly moderating comments. Imagine the number of hours involved. And the cost if the moderators were paid.
Ben (Florida)
No algorithm will be able to replace a human being. If it ever does, it will have essentially passed the Turing test and we will have to assume it has become sentient. Such is the troublesome philosophical nature of AI.
tomP (eMass)
@Ben Shortened up: "No algorithm will be able to replace a human [because] we will have to assume it has become sentient." So sentient behavior is necessarily human?
Ben (Florida)
An interesting point of argument, @tom. No, sentient behavior is not necessarily human. I have assumed that human behavior is the pinnacle of sentience but perhaps that is not the case.
Lillian F. Schwartz (NYC)
Algorithms depend on linear programming; AI depends on intuitive use altered when necessary. But the US lags China and France in AI due to Trump's restrictions on Indians (from India) and Chinese. There is too much data daily for even teams of moderators to handle; AI is the only solution.
Ben (Florida)
The internet is a sort of libertarian paradise for the most part. For better or for worse. The content moderators provide a filter in spots where a bit of light might shine through.
Ben (Florida)
The freedom is necessary. I agree with the libertarian paradise. But much like life, we need lights in the darkness.
SDG (brooklyn)
Greed has much responsibility for our society committing suicide -- face it, basing society on lies is a prescription for its end. Another issue that is not being addressed is the religious-like belief in logarithms. There is no real such thing as artificial intelligence -- it is a series of programmed responses based on the biases of the programmer, some unknown person who probably is quite intelligent but whose values are unknown.
Terry (ct)
@SDG Yes, that's immediately obvious every time I look at what Netflix or YouTube algorithms have chosen for me, which are never even close to matching my interests.
Chris (Missouri)
@SDG Logarithms? Really?
Candace Kalish (Port Angeles)
@SDG Algorithms, not logarithms. You also misunderstand the source of AI bias, which is not necessarily the programmers' values and prejudices. A lot of machine learning relies on real world data sets. Unfortunately, if the data reflect bias, the machine learning program can perpetuate, and even amplify, the bias. For example, suppose you want to develop a program to identify the most promising resumes provided by job applicants. If you "teach" the program to recognize a good candidate by exposing it to resumes of existing employees, you run the risk of winnowing out anyone who doesn't have the same characteristics as those employees, even if those characteristics have nothing to do with the ability to do the job.
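The resume-screening example above can be made concrete with a toy sketch. All of the data and names here are invented for illustration: the historical "good hires" happen to share an irrelevant token ("rugby"), and a frequency-based scorer trained on them ends up weighting that token more heavily than a genuine skill.

```python
# Toy illustration of bias learned from data rather than from the programmer.
from collections import Counter

# Invented historical "good hires" -- every one happens to contain the
# token "rugby", which has nothing to do with ability to do the job.
past_hires = [
    {"python", "sql", "rugby"},
    {"python", "stats", "rugby"},
    {"java", "sql", "rugby"},
]

# "Training": count how often each token appears among past hires.
weights = Counter(tok for resume in past_hires for tok in resume)

def score(resume):
    """Score a candidate by weighted overlap with tokens seen in past hires."""
    return sum(weights[tok] for tok in resume)

# The irrelevant token outweighs a real but rarer skill, because the
# training data conflated it with success.
print(weights["rugby"], weights["stats"])          # 3 vs 1
print(score({"rugby"}) > score({"stats"}))         # True
```

No programmer here encoded a preference for rugby players; the skew comes entirely from the sample of past hires, which is exactly the amplification mechanism the comment describes.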
Revelwoodie (Trenton, NJ)
Live moderation works great for smaller communities. But for the large platforms, like YouTube, it's hard to imagine how they could hire enough people. And unfortunately, that's where it's most needed. One possible approach is community moderation. Long-time users with a positive posting history can earn moderator status. There would still need to be paid employees to whom edge cases or contested decisions could be kicked up. But with community moderation, you could very quickly scale up to an army of thousands of front-line people to handle the triage. For free.
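The two-tier scheme described above can be sketched in a few lines. The thresholds, field names, and routing rule here are all invented assumptions, not any platform's actual policy: trusted long-time users settle reports with clear consensus, and contested reports escalate to paid staff.

```python
# Minimal sketch (all thresholds invented) of tiered community moderation.
from dataclasses import dataclass

@dataclass
class User:
    name: str
    account_age_days: int
    removed_posts: int

def is_community_mod(user, min_age_days=365, max_removed=0):
    """Long-time users with a clean posting history earn moderator status."""
    return user.account_age_days >= min_age_days and user.removed_posts <= max_removed

def route_report(votes_remove, votes_keep, consensus=0.8):
    """Clear-consensus reports stay with community mods; split votes escalate."""
    total = votes_remove + votes_keep
    if total and max(votes_remove, votes_keep) / total >= consensus:
        return "community"   # front-line volunteers decide
    return "staff"           # edge case: kick up to paid employees

veteran = User("alice", account_age_days=900, removed_posts=0)
newbie = User("bob", account_age_days=30, removed_posts=2)
print(is_community_mod(veteran), is_community_mod(newbie))  # True False
print(route_report(9, 1), route_report(3, 2))               # community staff
```

The design point is the escalation path: volunteers absorb the high-volume, unambiguous triage, while the small paid staff only sees the cases where consensus breaks down.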
Concernicus (Hopeless, America)
@Revelwoodie I think Google can afford to hire enough full-time paid moderators for YouTube. We already have some forms of community moderation. You can flag or report a comment on most sites. You can "ignore user" on others. Your idea might work for some. I am fed up with giving multi-billion-dollar corporations free labor or advertising.
SB (Los Angeles)
As a YouTube creator, I moderate my own comment sections. The tools are provided. Luckily, my subscribers are almost universally nice.