bc and ibase=2/obase=2 for conversions. But your digit grouping is what I always lacked. I gotta switch.
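A quick example of the ibase/obase trick in bc, decimal to binary and back (the one gotcha is that once ibase is set, every later number, including a new obase, is read in that base):
$ echo 'obase=2; 42' | bc
101010
$ echo 'ibase=2; 101010' | bc
42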
proxy-1:~# curl -qs https://openai.com/gptbot.json | jq -r '.prefixes[].ipv4Prefix' | xargs -I{} ./block-ip.sh {}
block-ip.sh is simply:
#!/bin/sh
# Insert a deny rule at position 1 so it takes priority over any existing allow rules
ufw insert 1 deny from "$1" to any
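To sanity-check what that added (assuming ufw is already enabled), the new deny rules should show up at the top of:
$ ufw status numbered | head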
validator.twtxt.net 😅😅
"twtxtfeevalidator/0.0.1" UA about? I thought I could ask before throwing a 1000GB file at it 🪤 could it be the same 'xt' thing @lyse was talking about the other day?
"twtxtfeevalidator/0.0.1" UA about? I thought I could ask before throwing a 1000GB file at it 🪤 could it be the same 'xt' thing @lyse was talking about the other day?
proxy-1:~# ./audit-log-by-ip.sh 4.227.36.76 | coraza-log-formatter -m -
2025/01/04 23:17:04 4.227.36.76 58982 GET /external?aff-HY0BLO=&f=mediaonly&f=noreplies&nick=g1n&uri=https%3A%2F%2Fthe-president-codes.linegames.org null 0 On OWASP_CRS/4.7.0
Actionset: OWASP_CRS/4.7.0
Message: Bad User Agent
Severity: 0
Raw: SecRule REQUEST_HEADERS:User-Agent "@pmFromFile /etc/caddy/waf/bad_user_agents.txt" "id:2000,log,phase:1,deny,msg:'Bad User Agent'"
proxy-1:~# ./audit-log-by-ip.sh 4.227.36.76 | coraza-log-formatter -m -
Actionset: OWASP_CRS/4.7.0
Message: Bad User Agent
Severity: 0
Raw: SecRule REQUEST_HEADERS:User-Agent "@pmFromFile /etc/caddy/waf/bad_user_agents.txt" "id:2000,log,phase:1,deny,msg:'Bad User Agent'"
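For anyone reading along: @pmFromFile does a case-insensitive phrase match against the header, one pattern per line in the referenced file, so a bad_user_agents.txt can be as simple as this (entries below are illustrative, not the actual contents of that file):
GPTBot
ClaudeBot
CCBot
Bytespider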
proxy-1:~# jq '. | select(.request.remote_ip=="4.227.36.76")' /var/log/caddy/access/mills.io.log | jq -s '. | last' | caddy-log-formatter -
4.227.36.76 - [2025-01-05 04:05:43.971 +0000] "GET /external?aff-QNAXWV=&f=mediaonly&f=noreplies&nick=g1n&uri=https%3A%2F%2Fmy-hero-ultra-impact-codes.linegames.org HTTP/2.0" 0 0
proxy-1:~# date
Sun Jan 5 04:05:49 UTC 2025
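Related, a quick way to see who else is hammering the proxy from the same JSON access log (assuming the same .request.remote_ip field as in the select above):
$ jq -r '.request.remote_ip' /var/log/caddy/access/mills.io.log | sort | uniq -c | sort -rn | head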
base(2) or base(16) in calc to do that. That's exhausting after a while.
robots.txt files at all really, because they mostly get ignored. I don't generally mind if "normal" web crawlers crawl things. But LLM(s) can go fuck themselves 🤣
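For completeness, the advisory opt-out would be a robots.txt stanza like the one below (GPTBot is the user agent OpenAI documents for its crawler), though as said it only helps if the crawler bothers to honour it:
User-agent: GPTBot
Disallow: /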