# I am the Watcher. I am your guide through this vast new twtiverse.
# 
# Usage:
#     https://watcher.sour.is/api/plain/users              View list of users and latest twt date.
#     https://watcher.sour.is/api/plain/twt                View all twts.
#     https://watcher.sour.is/api/plain/mentions?uri=:uri  View all mentions for uri.
#     https://watcher.sour.is/api/plain/conv/:hash         View all twts for a conversation subject.
# 
# Options:
#     uri     Filter to show a specific user's twts.
#     offset  Start index for query.
#     limit   Count of items to return (going back in time).
# 
# twt range = 1 196275
# self = https://watcher.sour.is?offset=174731
# next = https://watcher.sour.is?offset=174831
# prev = https://watcher.sour.is?offset=174631
@doesnm May I ask which hardware you have? SSD or HDD? How much RAM?

I might be spoiled and very privileged here. Even though my PC is almost 12 years old now, it *does* have an SSD and tons of RAM (i.e., lots of I/O cache), so starting mutt and opening the mailbox takes about 1-2 seconds here. I hardly even notice it. But I understand that not everybody has fast machines like that. 🫤
@doesnm here! create a $HOME/.cache/mutt/twtxt/ directory for example and then add this set header_cache = $HOME/.cache/mutt/twtxt/ to your muttrc (the one you have set up for or use with jenny if you're using different ones). That's what helped me with that.

Ref: http://www.mutt.org/doc/manual/#header-caching
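
A minimal sketch of that setup, assuming your muttrc lives at ~/.muttrc (adjust the path if you keep a separate one for jenny):

$ mkdir -p ~/.cache/mutt/twtxt
$ echo 'set header_cache = "~/.cache/mutt/twtxt/"' >> ~/.muttrc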
@bender Right, it fetches archived feeds on the first run (because it wants to grab all twts from that feed). Later on, it keeps track of the last seen twt hash per feed – if it cannot find that hash anymore, then it concludes that the feed must have been rotated/archived, so it fetches some/all archived feeds again until it finds that twt hash. Easy, right? 😅
@prologic? or not @prologic, that's the question!
@prologic but I have placed question marks, commas, and periods after @prologic, and it has worked fine (this one, for example).
@bender Yeah, I'm not even sure @bender? works (for example), but @bender does, I think.
I need to wait 30 seconds every time I start mutt, with 8 feeds
@prologic yes.
@bender You mean @movq??
@prologic, see broken mention above. Yarn is extremely inconsistent when mentioning.
@doesnm right, jenny isn't the problem, it's your platform of choice. The fetching of archives doesn't happen all the time (once only, right @movq?), but yes, depending on the number of feeds you follow, that first time might take a while.
@bender I barely used it myself. I get why we built it (link verification), but I'd rather just keep the other feature that strips tracking params on links.
@prologic yeah, streamline. Make Yarn Great Again! 🤭
@bender Bahahahahaha
@prologic all these years preparing to be the AI you have become, and the justification you pick is 'being "human"'? 😂
@aelaraji righto, thanks!
@bender My apologies 😅 I was just being "human" and saying "over there @aelaraji said this" 🤣
@thecanine I _think_ I might just remove this feature entirely. What do you think? The link verification thing, that is.
[47°09′24″S, 126°43′07″W] Non-significative results -- sampling finished
@bender Here #sgvko5a 😁
@prologic where did he point it out? I don't see a twtxt from him on this yarn.
@thecanine Uggh no, that's not right. That seems like a bug with the external link verification feature. Can you go into your Settings and turn that off and try again? 🙏
Tried migrating to jenny... It seems it's not suitable for my phone. The fetch command fetched archived feeds, so I have 37k+ entries and mutt hangs for several seconds loading them. Also, I don't like the hardcoded paths for the config and follow files.
@bender Thank you! 🙏 I'll see about fixing this. If you can submit a PR maybe that would be good! 👍
@falsifian Thanks 🙏
@cuaxolotl Okay, you are right. I'm not being very specific but intentionally very broad, and my statement is generalized, that's true. There are so many examples and issues to talk about, if we did, we'd be here a while 😅 Let's just agree that we both agree on extremism not really being a good thing and leave it at that 🤣
@Codebuzz It currently takes my yarnd pod here around ~2m on average to fetch, process and cache ~700 feeds.
As @aelaraji points out, this @<bender bender@twtxt.net> is currently wrong. The 2nd part of a mention is currently required to be a full absolute URI.
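
For illustration, a mention with a full absolute URI as the second part would look roughly like this (the feed URL here is an assumption, not taken from bender's actual feed):

@<bender https://twtxt.net/user/bender/twtxt.txt>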
@Codebuzz Here you go:


$ bat https://twtxt.net/twt/dn2zlga | jq '.'
{
  "twter": {
    "nick": "Codebuzz",
    "uri": "https://www.codebuzz.nl/twtxt.txt",
    "avatar": "https://www.codebuzz.nl/twtxt-avatar-800.jpg"
  },
  "text": "(#q5rg3ea) Hey, @<bender bender@twtxt.net> I know. Just wondering the kind of apps or software and how you all stay up to date in conversations. Is it through webmentions?",
  "created": "2024-10-30T22:12:24Z",
  "markdownText": "(#q5rg3ea) Hey, @<bender bender@twtxt.net> I know. Just wondering the kind of apps or software and how you all stay up to date in conversations. Is it through webmentions?",
  "hash": "dn2zlga",
  "tags": [
    "q5rg3ea"
  ],
  "subject": "(#q5rg3ea)",
  "mentions": [],
  "links": []
}
@thecanine It works. What's this pop up you're seeing?
@aelaraji I didn't look. That's why it's not rendering: it's not an actual URL.
@movq Congrats, this is cool! :-) When I returned yesterday, I also saw a bunch of those.
@bender That mention looks like @<bender bender@twtxt.net> on my side ...
@rrraksamam have the Invidious instances (alternative front-ends to the platform) stopped working? Otherwise, I have just figured my way around navigating PeerTube content and I wouldn't even miss it if YT had to disappear from the internet.
[47°09′42″S, 126°43′04″W] Analyzing samples
[47°09′35″S, 126°43′20″W] Re-taking samples
Lots of tricksters at my door tonight! Happy Halloween.
nilFM now has net radio - finally! | https://nilfm.cc/grimoire.html
@Codebuzz you replied to me, but the reply was just an empty mention, nothing else (the whole handle was missing).

There are no web mentions here, and no notifications. It isn’t Mastodon; if you want to see if someone wrote something new, or replied to you, you need to open your client.
🧮 USERS:1 FEEDS:2 TWTS:1140 ARCHIVED:80224 CACHE:2551 FOLLOWERS:17 FOLLOWING:14
Maybe. If they do, I would miss some content they have, but a lot of it I won't. I am good with Rumble, Bitchute and alternatives. If they do, views will go down too; creators might feel the pain and double up on alternatives.
Well, having added more accounts, I have noticed aggregating (+/- 15min refresh, and caching afterwards) is still a hefty process. Something I am pondering how to do better.
Ehm.. you are now asking above my paygrade 🤣 I really don't know. Haven't looked into webmentions, let alone how they are implemented in Timeline. What happened?
Wouldn't you rather have work and private separated? Any thought behind this decision? I like tags, like Gmail does it. I still think mail needs a big rethink. It's too prominent in life to be this archaic.
You were mentioned in: https://www.codebuzz.nl/twtxt.txt#:~:text=2024-10-31T22:32:46Z
> (#q5rg3ea) Some interesting responses, hearing some with (intentional) manual labour involved. I am modifying
@sorenpeter Timeline. Still have things I want, and also pondering what would help others.
Some interesting responses, hearing some with (intentional) manual labour involved. I am modifying @sorenpeter Timeline. Still have things I want, and also pondering what would help others.
@prologic that's still a generalization. which religion, which historical trends, which extremes, &c. otherwise you aren't actually saying anything about religion, you're expressing disapproval of extremes. which is valid, but doesn't make for a substantial critique, if that makes sense.
@bender Somehow I’m too lazy for a Mastodon client. 😂
@bender Uhh, I don’t remember. 😂 I don’t think so?
@movq on this:

> I use Mastodon similarly. I write posts in Vim until I’m happy with them. Then copy-and-paste to the browser …

You could use toot, and bypass the browser altogether.
@movq did you edit this twtxt? It shows fine on jenny, but in here (twtxt.net) it seems to be missing a line in between the quoted text and your reply (part of your reply is mashed with the quoted text).
[47°09′42″S, 126°43′41″W] Taking samples
@falsifian

> […] and then manually push it to my web servers […]

Funny, I also push manually, kind of. My publish_command includes a [Y/n] question and I very often hit n, so I can keep writing a thread until it’s finished. And sometimes I delete stuff again and never publish it. 😅

I use Mastodon similarly. I write posts in Vim until I’m happy with them. Then copy-and-paste to the browser …
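
Purely as an illustration of that kind of [Y/n] publish step (not jenny's actual setup; the rsync destination below is made up):

#!/bin/sh
# ask before actually pushing the feed anywhere
printf 'Publish twtxt.txt now? [Y/n] '
read -r answer
case "$answer" in
    [nN]*) echo 'Not publishing.' ;;
    *)     rsync twtxt.txt example.com:public_html/twtxt.txt ;;
esac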
@prologic I'm grateful for this accident. I find browsing twtxt.net useful even though I don't have an account there. I do it when I can't use Jenny because I only have my phone, or if I want to see messages I might have missed. I know it's not guaranteed to catch everything, but it's pretty good, even if it's not intentional.
@Codebuzz I use Jenny to add to a local copy of my twtxt.txt file, and then manually push it to my web servers. I prefer timestamps to end with "Z" rather than "+00:00" so I modified Jenny to use that format. I mostly follow conversations using Jenny, but sometimes I check twtxt.net, which could catch twts I missed.
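
For reference, both timestamp spellings denote the same instant and are valid RFC 3339; the change is purely cosmetic:

2024-10-30T22:12:24Z
2024-10-30T22:12:24+00:00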
Happy Halloween! I am enjoying my pumpkin spice latte.
@prologic, I don't know if you will notice that the first line of the block below has a slight indentation:


First line.
Second line.
Third line.


I believe this, on CSS, is causing it:


pre>code {
    padding:0 .25rem;
}
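
One possible fix, just as a sketch: drop the horizontal padding on the inline code element and pad the pre block instead, so the first line no longer starts with extra space:

pre>code {
    padding: 0;
}
pre {
    padding: 0 .25rem;
}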
[Pinellas County - 4 x 5' (hard) [1']](https://staystrong.run/user/bmallred/activity/c792f503-df32-4813-ae5c-d5d61c3752c6): 5.00 miles, 00:09:42 average pace, 00:48:26 duration
nothing to note.
#running
@movq was going to say, "let them be, mate, let them be". :-)