# I am the Watcher. I am your guide through this vast new twtiverse.
# 
# Usage:
#     https://watcher.sour.is/api/plain/users              View list of users and latest twt date.
#     https://watcher.sour.is/api/plain/twt                View all twts.
#     https://watcher.sour.is/api/plain/mentions?uri=:uri  View all mentions for uri.
#     https://watcher.sour.is/api/plain/conv/:hash         View all twts for a conversation subject.
# 
# Options:
#     uri     Filter to show a specific user's twts.
#     offset  Start index for query.
#     limit   Count of items to return (going back in time).
# 
# twt range = 1 3
# self = https://watcher.sour.is/conv/dngfrxq
@tkanos
> the point of debating on a social network is not to stop people from spreading bad ideas. It is to make everybody else who looks at the debate think, and not fall for those bad ideas. By hiding the bad ideas and not debating them, we may push other people to believe in them, and we may push people who already believe in them to stay in an echo chamber

No. This is a naive point of view, and it does not jibe with current research. Really. I urge you to read up on disinformation research, especially the work that came out after Facebook was called out for the Cambridge Analytica scandal. Other people *do not* look at a debate, see the bad information exposed as bad by good arguments, and change their minds. It doesn't work that way. Misinformation purposely targets people's emotions, and when the emotional appeal works, they tend to view the people debating against that view as enemies. They *reject* the good ideas even more forcefully.

Sure, there are hypothetical people who will see a debate, recognize that bad information has been exposed, and react by rejecting that bad information. Probably most of the people here fall into that group. But people like that were never the problem. The problem is the vast number of people who will react by *believing the bad information even more stubbornly*. Read the research--this is a real, documented effect I am describing.

Also, the dangers of the "echo chamber" that you evoked are very much overblown, almost surely by purveyors of disinformation because that fear helps them do their work (I'll note you raised this as a danger--an emotional appeal--instead of citing data). The echo chamber effect, to the extent it exists, is bad for people *who are already suffering from information poisoning*. People who've already bought into some piece of misinformation fall into or stay in an echo chamber. Once again, misinformation purveyors have very detailed strategies--Google, you can find them--for how to *draw unsuspecting people* into an echo chamber and keep them there.
@abucci

> Misinformation purveyors have very detailed strategies for how to draw unsuspecting people into an echo chamber and keep them there.

I'd say a pretty good way to get people into an echo chamber is to force them into their own space where their ideas get no pushback at all.
@mckinley more like they draw them into a space where their worst inclinations are reinforced. Pushback only reinforces beliefs in some people.

The notion that reasonably well-adjusted people who mostly read stuff by other reasonably well-adjusted people are somehow at risk of some ill-defined "echo chamber" effect is bunk. Folks tend to seek out information and adjust their own notions accordingly, unless they've been "info poisoned", for lack of a better term.