# I am the Watcher. I am your guide through this vast new twtiverse.
# 
# Usage:
#     https://watcher.sour.is/api/plain/users              View list of users and latest twt date.
#     https://watcher.sour.is/api/plain/twt                View all twts.
#     https://watcher.sour.is/api/plain/mentions?uri=:uri  View all mentions for uri.
#     https://watcher.sour.is/api/plain/conv/:hash         View all twts for a conversation subject.
# 
# Options:
#     uri     Filter to show a specific user's twts.
#     offset  Start index for query.
#     limit   Count of items to return (going back in time).
# 
# twt range = 1 23
# self = https://watcher.sour.is/conv/v7l5bjq
/me starts implementing a --blacklisted-feeds configuration option...
And... done.


WARN[0096] attempt to fetch blacklisted feed @<port70 gopher://port70.dk/0/twtxt.txt>
WARN[0096] unable to find a suitable avatar for gopher://port70.dk/0/twtxt.txt generating one
DEBU[0096] reloading templates in debug mode...
[yarnd] 2021/11/02 23:03:52 (10.0.0.101:49908) "GET /external?uri=gopher://port70.dk/0/twtxt.txt&nick=port70 HTTP/1.1" 200 4312 162.608202ms
[yarnd] 2021/11/02 23:03:52 (10.0.0.101:49908) "GET /externalAvatar?uri=gopher%3a%2f%2fport70.dk%2f0%2ftwtxt.txt HTTP/1.1" 200 1293 272.333µs
Sooooo, guessing dialog didn't progress too far? 😆 In any case though, handy feature 👌
@eldersnake Oh the dialogue went well actually. He's just frustrated I guess mostly by the following:

> If people would just unsubscribe when a feed disappears (e.g., “404 Not found” or “410 Gone” — I have tried both), this wouldn't happen.
>
> I have never claimed that your pod is at fault here. While I can see the IP address of those who are still fetching my [not dead] feeds — many months after I have stopped using twtxt — I have no idea who is at the other end, except if they put their feed name in the User-Agent string (and don't lie about who they are).
>
> A user with the User-Agent string
>
> tt/0.22.0 (+https://xandkar.net/twtxt.txt; @xandkar)
>
> is the most persistent user I see, but not the only one.
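As an aside, the User-Agent convention quoted above is straightforward for a client to adopt. A minimal sketch in Go, assuming nothing about any particular client's internals (the helper name and parameters are hypothetical; the format string mirrors the quoted example):

```go
// Sketch only: sending a descriptive twtxt User-Agent so feed owners can
// see who is fetching their feed and how to reach them.
package main

import (
	"fmt"
	"net/http"
)

// newFeedRequest builds a GET request for a feed URL with a User-Agent of
// the form "<client>/<version> (+<my feed URL>; @<my nick>)".
func newFeedRequest(feedURL, client, version, myFeedURL, myNick string) (*http.Request, error) {
	req, err := http.NewRequest(http.MethodGet, feedURL, nil)
	if err != nil {
		return nil, err
	}
	req.Header.Set("User-Agent",
		fmt.Sprintf("%s/%s (+%s; @%s)", client, version, myFeedURL, myNick))
	return req, nil
}

func main() {
	req, err := newFeedRequest("https://example.org/twtxt.txt",
		"tt", "0.22.0", "https://xandkar.net/twtxt.txt", "xandkar")
	if err != nil {
		panic(err)
	}
	// Prints: tt/0.22.0 (+https://xandkar.net/twtxt.txt; @xandkar)
	fmt.Println(req.Header.Get("User-Agent"))
	// http.DefaultClient.Do(req) would perform the actual fetch.
}
```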
cc @lyse I _believe_ this is your client. Although I'm not sure you can really do anything here 🤔 But we _should_ talk about an aspect of the Twtxt spec that is missing IMHO: recommendations for how clients should behave for "dead" feeds. What is considered a dead feed, and when should a client give up and report an error? etc.
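To make that question concrete, here is one possible "dead feed" policy sketched in Go. The thresholds and type names are purely hypothetical; nothing like this is taken from the twtxt spec or any existing client:

```go
// Sketch only: a client-side policy for deciding when a feed is "dead"
// and should no longer be polled. Thresholds and names are assumptions.
package main

import (
	"fmt"
	"net/http"
	"time"
)

// FeedState tracks fetch outcomes for a single feed between runs.
type FeedState struct {
	URI             string
	ConsecutiveGone int       // 404/410 responses in a row
	LastSuccess     time.Time // last time the feed returned 200
}

const (
	maxConsecutiveGone = 5                   // give up after this many 404/410s in a row
	maxSilence         = 90 * 24 * time.Hour // ...or after ~3 months without a 200
)

// recordFetch updates the state after a fetch and reports whether the
// client should consider the feed dead and unsubscribe.
func (f *FeedState) recordFetch(statusCode int) (dead bool) {
	switch statusCode {
	case http.StatusOK:
		f.ConsecutiveGone = 0
		f.LastSuccess = time.Now()
	case http.StatusNotFound, http.StatusGone:
		f.ConsecutiveGone++
	}
	if f.ConsecutiveGone >= maxConsecutiveGone {
		return true
	}
	return !f.LastSuccess.IsZero() && time.Since(f.LastSuccess) > maxSilence
}

func main() {
	feed := &FeedState{URI: "https://example.org/twtxt.txt"}
	for i := 0; i < 6; i++ {
		if feed.recordFetch(http.StatusGone) {
			fmt.Printf("unsubscribing from dead feed %s\n", feed.URI)
			break
		}
	}
}
```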
@prologic Interesting, it seems there's another client called tt. That's definitely not mine since I'm still relying on the original twtxt client to fetch the feeds.
> it seems there’s another client called tt

Yup.
@fastidious Thanks for digging that up!
@lyse no problem. That client’s crawl feature seems interesting for discovery on clients like yours, or @movq’s.
@lyse Oh 🤦‍♂️ That’s right, I had forgotten about that 🤦‍♂️ So sorry, it was not intended in any accusatory way, just trying to help out Klaus Alexander Seistrup 👌
No worries, @prologic, I fully grasped your intention to help and inform. :-) Haven't looked deeply into the code, @fastidious, but that crawl operation might be quite similar to what my tt offers. I'm highlighting unfollowed mentions, so I can quickly check them out in the browser or even follow them in the URLs view. @xandkar's crawling might be much more sophisticated, though. Could be worth adding an extra view for discovery in the future. So thanks for the crawling hint, I actually missed that initially! Now added a TODO in my README.
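For reference, the unfollowed-mention discovery idea can be sketched roughly as follows. The regexp and function names are assumptions based only on the `@<nick url>` mention syntax visible in the logs above, not on tt's or yarnd's actual code:

```go
// Sketch only: extracting @<nick url> mentions from fetched twts as a
// basis for feed discovery (highlighting feeds not yet followed).
package main

import (
	"fmt"
	"regexp"
)

// mentionRe matches the twtxt mention syntax @<nick url>, e.g.
// "@<port70 gopher://port70.dk/0/twtxt.txt>".
var mentionRe = regexp.MustCompile(`@<(\S+) (\S+)>`)

// discoverFeeds returns nick->URI for every mention whose URI is not
// already in the set of followed feeds.
func discoverFeeds(twt string, following map[string]bool) map[string]string {
	found := make(map[string]string)
	for _, m := range mentionRe.FindAllStringSubmatch(twt, -1) {
		nick, uri := m[1], m[2]
		if !following[uri] {
			found[nick] = uri
		}
	}
	return found
}

func main() {
	twt := "Hi @<port70 gopher://port70.dk/0/twtxt.txt> and @<xandkar https://xandkar.net/twtxt.txt>!"
	following := map[string]bool{"https://xandkar.net/twtxt.txt": true}
	for nick, uri := range discoverFeeds(twt, following) {
		fmt.Printf("unfollowed mention: %s -> %s\n", nick, uri)
	}
}
```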
@lyse Is this basically what I plan on doing for yarnd that I’ll likely call Lists? Basically what Bookmarks are now. 🤔
@prologic I might have misunderstood you but "Lists" is quite a generic name for that. :-S
@lyse Indeed 😂