Serpent » June 26th, 2018, 3:33 am wrote:As to the credibility of news outlets, I'm pessimistic. [...] when government is relatively clean, so is news reporting - and vice versa.
Where did I mention the credibility of news outlets themselves? - I didn't. I am talking about the credibility of alleged individual opinions within comments sections and forums - where political interests with multiple fake accounts may seek to derail or bury topics or ideas that they do not want others to deem popular or mainstream. Instead, such paid agents - or 'troops', as they tend to be called when they are state-sponsored - spread an alternative and seemingly dominant 'community voice' that easily passes for a democratic majority opinion.
After a quick search, here is more info:
Invasion of the troll armies: from Russian Trump supporters to Turkish state stooges - The Guardian, Sun 6 Nov 2016
We don’t know who they are, or what their mission is. We only know that there are thousands of them out there, pretending to be us. They may be at home, or in special offices, or sitting beside you on the train. They use social media, and write blogs and comments. Some of them may visit the bottom of this article.
[...]
Recent reports suggest that many of Donald Trump’s most fervent online supporters are not themselves Americans, but Russians being paid by their government to help him win. One told Samantha Bee that she pretends to be a housewife from Nebraska. Why she would confess it now is unexplained, but when you look around it begins to feel like everybody does it. It’s just that no two countries’ methods are the same.
[...]
Rather than attacking unbelievers, they focus on swamping the doubters with a flood of positive messages, or cleverly diverting the conversation. As with any job, some practitioners are laughably bad at it.
[...]
Long before Donald Trump met Twitter, Russia was famous for its troll factories – outside Russia, anyway. Allegations of covert propagandists invading chatrooms go back as far as 2003, and in 2012 the Kremlin-backed youth movement Nashi was revealed to be paying people to comment on blogs. However most of what we know now comes from a series of leaks in 2013 and 2014, most concerning a St Petersburg company called Internet Research Agency, then just “Internet Research”. It is believed to be one of several firms where trolls are trained and paid to smear Putin’s opponents both at home and internationally.
[...]
Estimated troops: Several thousand.
Favourite subjects: Putin and Trump being great, the opposition being corrupt, the Nato conspiracy against Russia, the effeminacy of Barack Obama.
Why the Right’s Dark-Web Trolls Are Taking Over YouTube - Vanity Fair, March 1, 2018
[...] As the Daily Beast reports, members of at least one far-right group crafted a strategy to game YouTube’s Trending tab into promoting false videos, seizing on a soft target with a boldness that in the past has been restricted to the darkest corners of the Internet.
[...]
At first, these efforts mostly targeted the comment sections of news stories, which “allowed them to be seen by a large number of people.” When outlets started to crack down on comments, however, these groups migrated to places like Twitter and YouTube. “Twitter was even more valuable for the far right, as it allowed anonymous users to directly interact with public figures and spontaneously launch semi-coordinated trolling campaigns,” Hawley said. YouTube, too, is important, allowing users who may not be seeking out right-wing content to stumble upon it organically.
More pernicious than the far right’s occupation of YouTube, however, is the group’s means of manipulating the platform to spread its messages. In the case of the Parkland shooting, the far-right group Reconquista Germanica used networks of fake accounts to manipulate YouTube’s algorithms, strategically downvoting and upvoting videos in an attempt to make those they like rise in the platform’s search function, while pushing down videos they disagree with so that they take longer to find or are ignored altogether. “We can push our own videos through likes and comments, through the organization we’ve created, so that they are rated more relevant by YouTube’s search algorithm,” one Reconquista Germanica member said in German, according to screenshots tweeted by the anti-far-right group Alt Right Leaks.
[...]
“Most of this is less new than people think it is,” extremism expert J.M. Berger told me, explaining how the far right has learned to emulate tactics used for years by spammers, Russian hackers, and even the Islamic State. At the same time, it is hard to deny that Donald Trump gave them a boost. “The right-wing resurgence we’re seeing now is not just astroturfing, but the result of several years of work by far-right activists, culminating in the rise of a candidate who was willing to overtly pander to white nationalists and other right-wing extremists. The election of President Trump has done more to mainstream white nationalism than anything in the last 40 years, but it’s a symbiotic arrangement. He elevates their issues, and they organize social-media campaigns to protect and elevate him.”
[...]
With their platforms under siege, tech companies are struggling to reimagine the algorithms on which they’re based.
It looks like Russia hired internet trolls to pose as pro-Trump Americans - Business Insider, Jul. 27, 2016
"A very interesting thing happened," Chen told Longform's Max Linsky in a podcast in December.
"I created this list of Russian trolls when I was researching. And I check on it once in a while, still. And a lot of them have turned into conservative accounts, like fake conservatives. I don't know what's going on, but they're all tweeting about Donald Trump and stuff," he said.
Linsky then asked Chen who he thought "was paying for that."
"I don't know," Chen replied. "I feel like it's some kind of really opaque strategy of electing Donald Trump to undermine the US or something. Like false-flag kind of thing. You know, that's how I started thinking about all this stuff after being in Russia."
In his research from St. Petersburg, Chen discovered that Russian internet trolls — paid by the Kremlin to spread false information on the internet — have been behind a number of "highly coordinated campaigns" to deceive the American public.
From his interviews with former trolls employed by Russia, Chen gathered that the point of their jobs "was to weave propaganda seamlessly into what appeared to be the nonpolitical musings of an everyday person."
"Russia's information war might be thought of as the biggest trolling operation in history," Chen wrote. "And its target is nothing less than the utility of the Internet as a democratic space."
So fake news is one thing, but a significant number of 'opinionated Americans' online are also fake.
It seems that commenters cannot be effectively verified, just as many 'alternative facts' put forth in the news cannot be. So I wonder: is the internet as a 'community space' going to dissolve into a completely untrustworthy domain of personal opinion as well as factual data? It certainly seems to be heading in that direction.
Perhaps only forums like this one - those that champion empiricism - will be left as islands, or towers of rigour rising up from within the fog of information warfare. I would say such a war is clearly visible right now on this very forum, yet it seems to be holding onto its truth quite well.
YouTube, however - well, I now think that opinion space is seriously compromised (at the very least where Trump and alt-right agendas are being discussed).
I also find it very interesting that this thread has had more than 100,000 views - a huge number compared to most other threads. Perhaps this situation regarding the Truth of facts and individual online voices is like an open wound that many people feel inclined to try to heal - like a global social emergency?