# I am the Watcher. I am your guide through this vast new twtiverse.
# 
# Usage:
#     https://watcher.sour.is/api/plain/users              View list of users and latest twt date.
#     https://watcher.sour.is/api/plain/twt                View all twts.
#     https://watcher.sour.is/api/plain/mentions?uri=:uri  View all mentions for uri.
#     https://watcher.sour.is/api/plain/conv/:hash         View all twts for a conversation subject.
# 
# Options:
#     uri     Filter to show a specific user's twts.
#     offset  Start index for query.
#     limit   Count of items to return (going back in time).
# 
# twt range = 1 60139
# self = https://watcher.sour.is?uri=https://twtxt.net/user/prologic/twtxt.txt&offset=59891
# next = https://watcher.sour.is?uri=https://twtxt.net/user/prologic/twtxt.txt&offset=59991
# prev = https://watcher.sour.is?uri=https://twtxt.net/user/prologic/twtxt.txt&offset=59791
@bender It's the blind abiding that worries me a lot. I'm still reading his letter, plus some other similar things I've come across that I'll share later. It's all fucking horrifying just how fucking goddamn corrupted everything is lately 🤦‍♂️
@bender Not my doing. That's the Markdown parser/renderer. Not Goldmark (yet).
On my hit list of asshole tech giants that break the rules and are bad web citizens:

- Microsoft
- Google
- Alibaba
- OpenAI

- _more to come..._
@bender Not with that kind of attitude 🤣

> I don’t think it will have any impact
Can't seem to prevent "bad bots" from aggressively hitting your shit™ 🤦‍♂️
Bloody hell 🤦‍♂️🤦‍♂️

$ jq -r --arg host "gopher.mills.io" '. | select(.request.host==$host) | "\(.request.client_ip) \(.request.uri) \(.request.headers["User-Agent"])"' mills.io.log-au | while IFS=$' ' read -r ip uri ua; do asn="$(geoip -a "$ip")"; echo "$asn $ip $uri $ua"; done | grep -E '^45102.*' | sort | head
45102 47.251.70.245 /gopher.floodgap.com/0/feeds/democracynow/2015/Oct/14/0 ["Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36"]
45102 47.251.84.25 /gopher.floodgap.com/0/feeds/voaheadlines/2014/Mar/09/voanews.com-content-article-1867433.html ["Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"]
45102 47.82.10.106 /gopher.viste.fr/1/OnlineTools/hangman.cgi%3F0692937396569A52972EB2 ["Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.43"]
45102 47.82.10.106 /gopher.viste.fr/1/OnlineTools/hangman.cgi%3F9657307A96569A52974634 ["Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.43"]
45102 47.82.10.106 /gopher.viste.fr/1/OnlineTools/hangman.cgi%3FB7571C7896569A529E6603 ["Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.43"]
45102 47.82.10.106 /gopher.viste.fr/1/OnlineTools/hangman.cgi%3FB75EF81296569A529E6617 ["Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.43"]
45102 47.82.10.106 /gopher.viste.fr/1/OnlineTools/hangman.cgi%3FC6564ADB96569A5A9E660C ["Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 Edg/114.0.1823.43"]
@anth 👀
@lyse Hmmm 🧐
@movq fuck 🤣
* 185325d - (HEAD -> master) edge: Ban Alibaba (38 seconds ago) <James Mills>

fark me 🤦‍♂️ Alibaba, CN has been hitting my Gopher proxy quite hard as well. Fuck'n hell! 🔥
* 8a77b64 - (HEAD -> master) edge: Ban Google's ASN (21 seconds ago) <James Mills>
It would appear that Google's web crawlers are ignoring the robots.txt that I have on https://git.mills.io/robots.txt with content:


User-agent: *
Disallow: /


Evidence attached (_see screenshots_) -- I _think_ it's time the Small Web community banded together and filed class action suit(s) against Microsoft, Google, and any other assholes out there (OpenAI?) that violate our rights and ignore requests to be "polite" on the web. Thoughts? 💭
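For what it's worth, that two-line policy is unambiguous; a quick sketch with Python's stdlib `urllib.robotparser` (the repo path in the example is made up) shows what any compliant crawler should conclude:

```python
from urllib import robotparser

# Parse the exact rules served at https://git.mills.io/robots.txt
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# Any compliant crawler, Googlebot included, must skip every path:
print(rp.can_fetch("Googlebot", "https://git.mills.io/some/repo"))  # False
```

So there's no ambiguity a crawler could hide behind here.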
I got prompted today to try using Passkeys on Github.com. Fine 😅 I did that, but I discovered that when you use your Passkey to login, Chrome prompts you for your device's password (_i.e. the password you use to login to your macOS Desktop_). Is that intentional? Kind of defeats the point, no? I mean sure, now there's no password being transmitted, stored, or presented to Github.com, but still, all an attacker has to do is somehow be on my device and know my login password to my device, right? Is that better or worse? 🤔
Our stupid fucking 🤬 cat 🐱 keeps pissing 🚽 everywhere in our house 🏠
@lyse Ouch 🤕
@sorenpeter Nice 😊
@lyse But I can't think of what's changed to cause this? 🧐
@movq Hahahaha 🤣
@aelaraji "replies" 🤣
@lyse Duplicates again, like two days ago? I don't see this anywhere (_unless I'm blind!_)
@lyse Agree. I'm not sure we should relax the timestamp format at all IMO. What @xuu has found is kind of nuts haha 😆 However, I do think we should relax the \t separator between <timestamp> and <content>. Let users use _any_ valid whitespace here that isn't a newline or carriage return.
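A minimal sketch of that relaxed rule (hypothetical parser, not from any actual client): treat any run of horizontal whitespace after the timestamp as the separator:

```python
import re

# Split a twtxt line on any horizontal whitespace (tabs OR spaces),
# but never on \n or \r: [^\S\r\n] means "whitespace that isn't a newline".
LINE_RE = re.compile(r"^(\S+)[^\S\r\n]+(.*)$")

def parse_twt(line):
    m = LINE_RE.match(line)
    return (m.group(1), m.group(2)) if m else None

print(parse_twt("2024-08-04T11:24:34Z\thello"))   # tab separator
print(parse_twt("2024-08-04T11:24:34Z   hello"))  # spaces also accepted
```

Both forms parse to the same `(timestamp, content)` pair, and a line with no separator at all simply fails to parse.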
I _think_ the author is a bit out of their depth here. A linear feed isn't quite what the author seems to be modelling in their view of the problems they observe and describe. A linear feed has a beginning and an end. You can (_ideally client-side_) sort it by Date, or by Subject like we do with our Twtxt clients. A Tree structure isn't what the author thinks either; this is more the structure that forms after you introduce some kind of "threading model". The main problem with any kind of information system that tries to figure out algorithmically what you want to "see" is that that type of interface has no start and no end. So you end up with a "scroll of doom".
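The client-side sorting mentioned above is trivial to sketch (toy data and field layout are made up for illustration):

```python
from datetime import datetime

# Toy linear feed: (timestamp, subject, content). The field layout is
# illustrative, not any client's actual data model.
feed = [
    ("2024-08-04T11:24:34+00:00", "#abc1234", "second reply"),
    ("2024-08-01T09:00:00+00:00", "#abc1234", "first reply"),
    ("2024-08-03T18:30:00+00:00", "#zzz9999", "unrelated twt"),
]

# Sort by Date: the feed has a clear beginning and end.
by_date = sorted(feed, key=lambda t: datetime.fromisoformat(t[0]))

# Sort by Subject, then Date: twts group into conversations.
by_subject = sorted(feed, key=lambda t: (t[1], datetime.fromisoformat(t[0])))
```

Either ordering is bounded and deterministic, which is exactly what an algorithmic "for you" feed isn't.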
The article discusses the challenges posed by linear social media feeds, which often lead to disengagement and difficulty prioritizing content from friends due to constant scrolling. The author proposes an alternative: a daily feed structure that organizes posts by date, allowing easier prioritization and reducing mindless scrolling.

Key Points:

1. Linear Feed Problem: Linear feeds present a long list of posts without prioritization, forcing users to scroll endlessly to catch up on friends' content. This can lead to addiction and disengagement.

2. Proposed Alternative (Tree Structure): The daily feed structure organizes posts by day, enabling users to prioritize updates from friends who post infrequently while reducing scrolling effort.

3. Mastodon Experience: The author's experience with Mastodon highlighted its effectiveness in allowing content prioritization and managing social media usage without dependency on algorithms.

4. Challenges and Considerations:
- Implementation Challenges: Creating a daily feed system involves organizing content effectively and ensuring users can prioritize posts.
- Platform Support: Current platforms may not have APIs conducive to such changes, making it difficult to implement without significant technical changes.
- Engagement Metrics: The impact on engagement metrics needs to be considered, as traditional metrics might be misinterpreted in a tree structure.

5. Potential Applications Beyond Social Media: This approach could empower users by giving control over content consumption and aiding in balancing social media use without overwhelming them with information.

6. Future Directions: The author hopes for improvements in alternative platforms' feed systems and engagement metrics, potentially through more interactive content models or changes in APIs.

In conclusion, the article emphasizes the importance of giving users control over their content consumption, moving away from linear feeds.
What exactly is a linear feed? 🤔
@andros My first point of advice is to stop everything and measure all the critical user journeys. Design and build Service Level Objectives for each and every part of the system you can find that _any_ user cares about.
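As a rough illustration (all numbers made up), once a journey is measured, an SLO check is just arithmetic against an error budget:

```python
# Hypothetical availability SLO for one critical user journey:
# 99.9% of requests must succeed over the measurement window.
SLO_TARGET = 0.999

total_requests = 1_000_000
failed_requests = 700  # made-up figure for illustration

availability = (total_requests - failed_requests) / total_requests
error_budget = 1 - SLO_TARGET            # 0.1% of requests may fail
budget_spent = failed_requests / total_requests

print(f"availability={availability:.4%}, within SLO: {availability >= SLO_TARGET}")
```

The point is that "is the system OK?" becomes a yes/no question per journey instead of a gut feeling.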
2024-08-04T11:24:34Z was the last occurrence of this.
@bender @lyse This bug was fixed back in September last year. But the brackets still appear in my current feed. Is that what the issue is? 🤔
@lyse @andros Are we talking about yarnd here? Hmm? 🤔 I've _thought_ about a "read flag" but I just haven't bothered so far...
@johanbove And what are the results so far? 🤔
The cache is only supposed to be for 120s though, but I reckon the caching layer is just stupid? 🤔 (_and maybe buggy?_)
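For context, the intended behaviour is roughly this (a sketch of a 120s TTL cache, not yarnd's actual caching code):

```python
import time

TTL = 120  # seconds, per the post above

class TTLCache:
    """Tiny sketch of a time-bounded cache: entries older than TTL miss."""

    def __init__(self, ttl=TTL):
        self.ttl = ttl
        self._store = {}  # key -> (value, stored_at)

    def set(self, key, value, now=None):
        self._store[key] = (value, time.monotonic() if now is None else now)

    def get(self, key, now=None):
        now = time.monotonic() if now is None else now
        entry = self._store.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if now - stored_at > self.ttl:  # stale: drop it and report a miss
            del self._store[key]
            return None
        return value
```

If the edge keeps serving an entry well past the TTL, the staleness check above (or its equivalent) isn't being applied.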
I need to understand how the caching is at play here at the edge. I hit CTRL+R on @mckinley's OP to get the _right_ subject reply after poking at the underlying HTML elements on the page.
@bender To be fair, I do this in my "spare time" 😅
@mckinley Ahh. I _think_ this is some kind of weird caching issue at my edge! 😱
@mckinley I'm worried we're _really_ approaching a point where we need to adapt the hashing algorithm and expand the no. of bits. Is it at all possible something else is going on here though? 🤞
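A back-of-the-envelope birthday-bound check (assuming the current hashes carry about 35 bits, i.e. 7 base32 characters -- that figure is my assumption) shows why collisions start mattering at this scale:

```python
import math

def items_for_50pct_collision(bits):
    # Birthday bound: ~1.1774 * sqrt(2^bits) items gives a 50% chance
    # that at least two of them share a hash.
    return 1.1774 * math.sqrt(2 ** bits)

print(int(items_for_50pct_collision(35)))  # roughly a couple hundred thousand
print(int(items_for_50pct_collision(55)))  # ~1000x more headroom
```

With the feed already past 60k twts, a 35-bit hash isn't comfortably far from the birthday bound, which is the worry above.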
@lyse True, but we should also consider building tolerant "systems" 😉
@algorave.dk Hello 👋
@movq I _believe_ that's the same client I've used in the past too. Works nicely on macOS 👌👌
@xuu Err I think that's the problem somehow hmmm 🧐
@lyse Where? 🧐
@xuu d
@bmallred Similar story here 😱
@jost "right person", who's to say? And not "AI" but more-generally search. Otherwise I agree ☝️
🤣🤣🤣
Gotta get faster disks man 🤣
@xuu Why? To both issues:
@jost What is this that you maintain? 🤔
Whoohoo!


root@vz3:~# free -h
              total        used        free      shared  buff/cache   available
Mem:           62Gi        10Gi        49Gi        72Mi       2.9Gi        51Gi
Swap:            0B          0B          0B

@andros Nice! 👍
@andros What's the problem sorry? haha 🤣
@xuu da fuq?!
Look at the size of this coffee!!! 😱
💭 Remember kids 🧒

The "Cloud" is just someone else's computer(s).
hmm this isn't right...