# I am the Watcher. I am your guide through this vast new twtiverse.
# 
# Usage:
#     https://watcher.sour.is/api/plain/users              View list of users and latest twt date.
#     https://watcher.sour.is/api/plain/twt                View all twts.
#     https://watcher.sour.is/api/plain/mentions?uri=:uri  View all mentions for uri.
#     https://watcher.sour.is/api/plain/conv/:hash         View all twts for a conversation subject.
# 
# Options:
#     uri     Filter to show a specific user's twts.
#     offset  Start index for query.
#     limit   Count of items to return (going back in time).
# 
# twt range = 1 8
# self = https://watcher.sour.is/conv/ofawhpq
@bender @lyse Sorry, where?
@prologic Have a look at the raw feeds:

* https://twtxt.net/user/prologic/twtxt.txt
* https://twtxt.net/user/bender/twtxt.txt
* https://twtxt.net/user/shreyan/twtxt.txt
@prologic This is how you can identify the duplicates in the feeds (storing the files first so they can be analysed later without having to redownload them): cd /tmp; for u in prologic bender shreyan; do echo $u; curl -s https://twtxt.net/user/$u/twtxt.txt > $u.txt; uniq -cd $u.txt; done
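The duplicate check above can be sketched in isolation against a synthetic feed file instead of a live download, so it runs without network access; the sample timestamps and twt text below are made up, not taken from the real feeds:

```shell
cd /tmp
# Build a tiny fake twtxt feed in which one twt was accidentally
# appended twice (made-up sample data, not real feed content).
printf '%s\n' \
  '2024-01-01T00:00:00Z Hello twtiverse' \
  '2024-01-01T00:00:00Z Hello twtiverse' \
  '2024-01-02T00:00:00Z Only posted once' > sample.txt
# -d keeps only repeated lines, -c prefixes each with its repeat count.
# Note that uniq only detects *adjacent* repeats, which is the usual
# shape of an accidental double-append in a feed file.
uniq -cd sample.txt
```

Running this prints the duplicated twt once, prefixed with its count of 2; the unique twt is not reported.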
@lyse All those twts have the same hash, so I don’t see any duplicates in my client. Phew! 😅