robots.txt, right?
User-agent: Googlebot
Disallow: /
# I am the Watcher. I am your guide through this vast new twtiverse.
#
# Usage:
#     https://watcher.sour.is/api/plain/users              View list of users and latest twt date.
#     https://watcher.sour.is/api/plain/twt                View all twts.
#     https://watcher.sour.is/api/plain/mentions?uri=:uri  View all mentions for uri.
#     https://watcher.sour.is/api/plain/conv/:hash         View all twts for a conversation subject.
#
# Options:
#     uri     Filter to show a specific user's twts.
#     offset  Start index for query.
#     limit   Count of items to return (going back in time).
#
# twt range = 1 15647
# self = https://watcher.sour.is?uri=https://www.uninformativ.de/twtxt.txt&offset=8506
# next = https://watcher.sour.is?uri=https://www.uninformativ.de/twtxt.txt&offset=8606
# prev = https://watcher.sour.is?uri=https://www.uninformativ.de/twtxt.txt&offset=8406
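The options above can be combined into a query URL for any of the plain endpoints. A minimal Node.js sketch, assuming uri, offset, and limit are passed as ordinary query parameters to /api/plain/twt (the offset value here is simply taken from the self link above):

```javascript
// Sketch: build a Watcher API query URL from the documented options.
// The parameter names come from the header; their exact handling by
// the server is an assumption.
const params = new URLSearchParams({
  uri: "https://www.uninformativ.de/twtxt.txt", // filter to one user's twts
  offset: "8506", // start index for the query
  limit: "100",   // count of items to return, going back in time
});

const url = `https://watcher.sour.is/api/plain/twt?${params}`;
console.log(url);
```

URLSearchParams percent-encodes the feed URL in the uri parameter, so the result is safe to paste into curl or a browser.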
<a> </a> results in an empty string, others don’t. Well, .trim() it is, I guess.
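For reference, the workaround in question: an element like <a> </a> has whitespace-only text, which compares unequal to the empty string until trimmed. A minimal Node.js sketch, modeling the element's text as a plain string since no DOM is involved (which method yields the whitespace in the first place is an assumption here):

```javascript
// The text inside `<a> </a>` is a single space; modeled as a plain
// string, since the point is the string comparison, not the DOM.
const text = " ";

console.log(text === "");        // → false: whitespace is not the empty string
console.log(text.trim() === ""); // → true: .trim() strips it down to ""
```

So normalizing with .trim() before the emptiness check makes both cases behave the same.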
twtxt.org
before, either. Maybe someone is trying out something new? 🤔