Here is the very little-known Brazilian project "dshock", with the song "I Want a Pizza Party":
https://archive.org/details/Dshock-BreathingsWeak/01.Dshock-BreathingsWeak-IWantAPizzaParty.mp3
If it's a problem that ruins your experience, don't hesitate to create an issue.
#running


" " instead of "T"
https://git.mills.io/yarnsocial/go-lextwt/pulls/32
And the second? I get POST errors when I try to submit the web form.
g
twtxt---profile-layout: Wrong type argument: char-or-string-p, ("https://aelaraji.com/twtxt.txt" "gemini://box.aelaraji.com/twtxt.txt" "gopher://box.aelaraji.com/0/twtxt.txt")
User-agent: *
Disallow: /
Now I have some middleware that looks at the header, and if they are polite enough to include "bot" in the user agent, they politely get a 404 response.
#AI #Liability
https://www.euractiv.com/section/tech/news/commission-withdraws-ai-liability-directive-after-vance-attack-on-regulation/
User-agent: *
Disallow: /
seems to work. Or maybe those bastards change their user agent and claim to be someone nice. In any case, I just added a bunch of
location = /robots.txt {
add_header Content-Type text/plain;
return 200 "User-agent: *\nDisallow: /\n";
}
in my nginx config. No need for any bot to visit, crawl and index most of my sites.