# I am the Watcher. I am your guide through this vast new twtiverse.
#
# Usage:
# https://watcher.sour.is/api/plain/users View list of users and latest twt date.
# https://watcher.sour.is/api/plain/twt View all twts.
# https://watcher.sour.is/api/plain/mentions?uri=:uri View all mentions for uri.
# https://watcher.sour.is/api/plain/conv/:hash View all twts for a conversation subject.
#
# Options:
# uri Filter to show a specific user's twts.
# offset Start index for query.
# limit Count of items to return (going back in time).
#
# twt range = 1 10
# self = https://watcher.sour.is/conv/elpkq6a
Hmmm, the Who Follows Resource produces a JSON response that is not compliant with our very own specification on that matter. I will fix it and write some tests. Actually, I wanted to improve my Twtxt User Agent Parser and fetch said resource.
@lyse Oh actually I changed this recently and forgot about the spec 😂 Ooops 🤦♂️ I added a LastSeenAt field, but to be honest I don't think this is actually used by the API response, or at least we can assume `time.Now()` when requesting the resource 🤔
Actually correction. I completely messed this up here and fucked up 🤦♂️ Sorry! 😴
And finally my `useragent` parser is able to resolve the Who Follows Resources from the User-Agent headers. It supports both the official and the buggy yarnd formats.
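A minimal sketch of that resolution step (not lyse's actual parser), assuming the multi-user User-Agent shape yarnd is commonly described as sending, with a `~`-prefixed Who Follows URL inside the parenthesised part, e.g. `yarnd/0.15.1 (~https://example.com/whofollows?followers=14&token=abc; contact=...)`:

```go
package main

import (
	"fmt"
	"regexp"
)

// whoFollowsRe pulls the "~"-prefixed URL out of the parenthesised part of a
// multi-user User-Agent header. The header shape used here is an assumption
// for illustration, not a quote of the spec.
var whoFollowsRe = regexp.MustCompile(`\(~(https?://[^;)\s]+)`)

// whoFollowsURL returns the Who Follows Resource URL carried by a User-Agent
// header, or false if the header does not contain one.
func whoFollowsURL(ua string) (string, bool) {
	m := whoFollowsRe.FindStringSubmatch(ua)
	if m == nil {
		return "", false
	}
	return m[1], true
}

func main() {
	ua := "yarnd/0.15.1 (~https://example.com/whofollows?followers=14&token=abc; contact=https://example.com/support)"
	if u, ok := whoFollowsURL(ua); ok {
		fmt.Println(u) // https://example.com/whofollows?followers=14&token=abc
	}
}
```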
@lyse
> and the buggy yarnd formats
Heh sorry about that 😅 I guess they're not buggy now thanks to your PR that adds a test to ensure I don't fuck that up again 🤣 Thanks muchly! 🙇♂️
You're welcome, @prologic. :-) And I just committed another fix to my parser that also handles empty URL and filled URI fields.
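As an illustration of that fallback (the field names here are assumptions for the example, not quoted from the spec), a tolerant decoder could prefer the URL field and fall back to URI whenever URL is empty:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// follower is a tolerant view of one Who Follows entry. Field names are
// assumptions for illustration: "URL" is treated as the official field and
// "URI" as the one filled by the buggy responses mentioned above.
type follower struct {
	Nick string `json:"Nick"`
	URL  string `json:"URL"`
	URI  string `json:"URI"`
}

// feedURL prefers the official URL field and falls back to URI when URL is empty.
func (f follower) feedURL() string {
	if f.URL != "" {
		return f.URL
	}
	return f.URI
}

func main() {
	raw := `[{"Nick": "alice", "URL": "", "URI": "https://example.com/user/alice/twtxt.txt"}]`

	var followers []follower
	if err := json.Unmarshal([]byte(raw), &followers); err != nil {
		panic(err)
	}
	for _, f := range followers {
		fmt.Println(f.Nick, f.feedURL())
	}
}
```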