# I am the Watcher. I am your guide through this vast new twtiverse.
# 
# Usage:
#     https://watcher.sour.is/api/plain/users              View list of users and latest twt date.
#     https://watcher.sour.is/api/plain/twt                View all twts.
#     https://watcher.sour.is/api/plain/mentions?uri=:uri  View all mentions for uri.
#     https://watcher.sour.is/api/plain/conv/:hash         View all twts for a conversation subject.
# 
# Options:
#     uri     Filter to show a specific user's twts.
#     offset  Start index for the query.
#     limit   Count of items to return (going back in time).
# 
# twt range = 1 8
# self = https://watcher.sour.is/conv/xbiyfxq
It would appear that Google's web crawlers are ignoring the robots.txt that I have on https://git.mills.io/robots.txt with content:


User-agent: *
Disallow: /


Evidence attached (_see screenshots_) -- I _think_ it's time the Small Web community band together and file class action suit(s) against Microsoft.com, Google.com and any other assholes out there (OpenAI?) that violate our rights and ignore requests to be "polite" on the web. Thoughts? 💭
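(Editor's note: one way to check whether those hits really come from Googlebot, rather than something spoofing its User-Agent, is forward-confirmed reverse DNS, the verification method Google itself documents: the reverse lookup of the crawler IP must end in googlebot.com or google.com, and the forward lookup of that name must return the original IP. A minimal sketch in Go using only the standard library; the default IP below is just an illustrative address from Google's published crawler ranges, not taken from the screenshots.)

```go
package main

import (
	"fmt"
	"net"
	"os"
	"strings"
)

// isGooglebot performs forward-confirmed reverse DNS:
//  1. Reverse-resolve the IP; the name must end in googlebot.com or google.com.
//  2. Forward-resolve that name; the results must include the original IP.
func isGooglebot(ip string) bool {
	names, err := net.LookupAddr(ip)
	if err != nil {
		return false
	}
	for _, name := range names {
		host := strings.TrimSuffix(name, ".")
		if !strings.HasSuffix(host, ".googlebot.com") && !strings.HasSuffix(host, ".google.com") {
			continue
		}
		addrs, err := net.LookupHost(host)
		if err != nil {
			continue
		}
		for _, a := range addrs {
			if a == ip {
				return true
			}
		}
	}
	return false
}

func main() {
	ip := "66.249.66.1" // illustrative example from Google's crawler ranges
	if len(os.Args) > 1 {
		ip = os.Args[1]
	}
	fmt.Printf("%s is Googlebot: %v\n", ip, isGooglebot(ip))
}
```

(If the offending IPs fail this check, the crawler is something impersonating Googlebot, which changes who the complaint is actually about.)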
* 8a77b64 - (HEAD -> master) edge: Ban Google's ASN (21 seconds ago) <James Mills>
* 185325d - (HEAD -> master) edge: Ban Alibaba (38 seconds ago) <James Mills>

fark me 🤦‍♂️ Alibaba, CN has been hitting my Gopher proxy quite hard as well. Fuck'n hell! 🔥
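(Editor's note: the `edge` repo behind those commits isn't shown here, so this is only a sketch of one way to turn "ban an ASN" into firewall rules: pull the ASN's announced prefixes from RIPEstat's public announced-prefixes endpoint and emit them as an nftables set element list. AS15169 is Google; AS45102 is one of Alibaba's ASNs. The response shape assumed below is RIPEstat's usual `data.prefixes[].prefix` layout.)

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"os"
	"strings"
)

// Minimal slice of RIPEstat's announced-prefixes response.
type ripestat struct {
	Data struct {
		Prefixes []struct {
			Prefix string `json:"prefix"`
		} `json:"prefixes"`
	} `json:"data"`
}

func main() {
	asn := "AS15169" // Google; pass e.g. AS45102 for one of Alibaba's ASNs
	if len(os.Args) > 1 {
		asn = os.Args[1]
	}

	resp, err := http.Get("https://stat.ripe.net/data/announced-prefixes/data.json?resource=" + asn)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer resp.Body.Close()

	var r ripestat
	if err := json.NewDecoder(resp.Body).Decode(&r); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}

	// Collect the IPv4 prefixes and emit one nftables command. Assumes a
	// set already exists, e.g.:
	//   nft add set inet filter banned '{ type ipv4_addr; flags interval; }'
	// Dropping traffic from the set is then a single rule:
	//   nft add rule inet filter input ip saddr @banned drop
	var v4 []string
	for _, p := range r.Data.Prefixes {
		if !strings.Contains(p.Prefix, ":") {
			v4 = append(v4, p.Prefix)
		}
	}
	fmt.Printf("nft add element inet filter banned { %s }\n", strings.Join(v4, ", "))
}
```

(Banning whole ASNs at the edge is the blunt instrument robots.txt was supposed to make unnecessary, which is rather the point of the complaint above.)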
@prologic Have you tried Google's robots.txt report? https://support.google.com/webmasters/answer/6062598?hl=en . I would expect Google to be pretty good about this sort of thing. If you have the energy to dig into it and, for example, post on support.google.com, I'd be curious to hear what you find out.
@falsifian In the process of 🤞