I finally replaced the broken gear-shift Bowden cable of my bicycle during a longer lunch break.
@doesnm There is no real recommendation, I think. But if you hit half a MiB or so, it might be worth considering rotating in order to keep the network traffic low. People with bad connectivity might appreciate it. I want to implement HTTP range requests in my client rewrite at some point in time (but first, it has to become kinda usable).

The parchment, on the other hand, might be a bit wasteful for just temporary ideas that are not perfectly laid out yet.
I think non-breaking spaces are preferred nowadays to avoid the confusion.
* https://upload.wikimedia.org/wikipedia/commons/a/a9/Nebelbank_in_der_W%C3%BCste_Namib_bei_Aus_%282018%29.jpg
* https://upload.wikimedia.org/wikipedia/commons/1/17/Space_Shuttle_Challenger_moving_through_fog.jpg
* https://upload.wikimedia.org/wikipedia/commons/9/96/Fog_Bow_%2819440790708%29.jpg
* https://upload.wikimedia.org/wikipedia/commons/a/ac/360_degrees_fogbow.jpg
Section 3: I'm a bit on the fence regarding documenting the HTTP caching headers. It's a very general HTTP thing, so there is nothing special about them for twtxt. No need for the Twtxt Specification to actually redo it. But on the other hand, a short hint could certainly help client developers and feed authors. Maybe it's thanks to my distro's Nginx maintainer, but I did not configure anything for the `Last-Modified` and `ETag` headers to be included in the response, the web server just already did it automatically.

The more I think about it while typing this reply, the more I think your recommendation suggestion is actually really great. It will definitely be beneficial for client developers. In almost all client implementations, I'd say, one has to actually do something specific in the code to send the `If-Modified-Since` and/or `If-None-Match` request headers. There is no magic that will do it automatically, as one has to combine data from the last response with the new request.

But I also came across feeds that serve zero response headers that would make caching possible at all. So, an explicit recommendation enables feed authors to check their server setups. Yeah, let's absolutely do this! :-)
Regarding section 4 about feed discovery: Yeah, non-HTTP transport protocols are an issue as they do not have `User-Agent` headers. How exactly do you envision the `discovery_url` to work, though? I wouldn't limit the transports to HTTP(S) in the Twtxt Specification. It's up to the client to decide which protocols it wants to support.

Since I currently rely on buckket's `twtxt` client to fetch the feeds, I can only follow `http(s)://` (and `file://`) feeds. But in `tt2` I will certainly add some `gopher://` and `gemini://` support at some point in time.

Some time ago, @movq found out that some Gopher/Gemini users prefer to just get an e-mail from people following them: https://twtxt.net/twt/dikni6q So, it might not even be something to be solved, as there is no problem in the first place.
Section 5 on protocol support: You're right, announcing the different transports in the `url` metadata would certainly help. :-)

Section 7 on emojis: Your idea of TUI/CLI avatars is really intriguing, I have to say. Maybe I will pick this up in `tt2` some day. :-)

When I glued the shelf between the posts of the stand, I tightened the long clamp too hard, ripping the back panel and shelf board apart. So, I had to reglue them. :-)
Metadata on individual twts is too much for me. I do like the simplicity of the current spec. But I understand where you're coming from.
Numbering twts in a feed is basically an attempt at generating message IDs. It's an interesting idea, but I reckon it is not even needed. I'd simply use location-based addressing (feed URL + '#' + timestamp) instead of content addressing. If one really wanted to, one could hash the feed URL and timestamp, but the raw form would actually improve discoverability and would not even require a richer client. But the majority of twtxt users in the last poll wanted to stick with content addressing.
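The two addressing flavors mentioned above can be sketched in a few lines. Note that the hash variant is only an illustration using SHA-256 over the same two components; it is *not* the actual twtxt hashing scheme, and the function names are my own:

```python
import hashlib


def twt_location(feed_url, timestamp):
    """Location-based address: feed URL + '#' + timestamp.
    Human-readable and directly discoverable, no hashing required."""
    return f"{feed_url}#{timestamp}"


def twt_hash(feed_url, timestamp):
    """Hashed form of the same two components (illustrative SHA-256,
    truncated; not the real twtxt scheme)."""
    raw = f"{feed_url}\n{timestamp}".encode("utf-8")
    return hashlib.sha256(raw).hexdigest()[:8]
```

The trade-off is visible right away: the location form can be pasted into a browser or grepped in the raw feed, while the hash form is opaque and needs a client that can resolve it.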
yarnd actually sends `If-Modified-Since` request headers. Not only can I observe heaps of 304 responses for yarnds in my access log, but in `Cache.FetchFeeds(…)` we can actually see `If-Modified-Since` being deployed when the feed has been retrieved with a `Last-Modified` response header before: https://git.mills.io/yarnsocial/yarn/src/commit/98eee5124ae425deb825fb5f8788a0773ec5bdd0/internal/cache.go#L1278

Turns out ETags with `If-None-Match` are only supported when yarnd serves avatars (https://git.mills.io/yarnsocial/yarn/src/commit/98eee5124ae425deb825fb5f8788a0773ec5bdd0/internal/handlers.go#L158) and media uploads (https://git.mills.io/yarnsocial/yarn/src/commit/98eee5124ae425deb825fb5f8788a0773ec5bdd0/internal/media_handlers.go#L71). However, it ignores possible ETags when fetching feeds.

I don't understand how the discovery URLs should work to replace the `User-Agent` header in HTTP(S) requests. Do you mind elaborating? Different protocols are basically just a client thing.
I reckon it's best to just avoid mixing several languages in one feed in the first place. Personally, I find it okay to occasionally write messages in other languages, but if that happens on a more regular basis, I'd definitely create a separate feed for the other languages.
Isn't the emoji thing "just" a client feature? So, feeds do not even have to state any emojis. As a user, I'd configure my client to use a certain symbol for feed ABC. Currently, I can do a similar thing in `tt`, where I assign colors to feeds. On the other hand, what if a user wants to control which symbol should be displayed, similar to the feed's nick? Hmm. But still, my terminal font doesn't even render most emojis. So, Unicode boxes everywhere. This makes me think it should actually be only a client feature.
Btw. if you blindly run the command again in a few days, your query might match new feeds that are not included in today's list. Hence, some accounts might be dropped without a warning. But then, they probably don't care.
The `alt` choices are not the best. I should probably fix them.

This also reminds me of a JS snippet my mate wrote for navigation in browsers that don't support incrementing numbers in URLs. I'm using Tridactyl in Firefox and can `Ctrl+A`/`Ctrl+X` myself through albums with properly named files.
On the summit the view was absolutely terrible, because there were super low hanging clouds. But it still looked fairly spectacular. Very surreal, I could not make out the edge of the Swabian Alb. The haze just blended with the rest of the sky. Towards the sun it was just one giant white wall after half a kilometer or so. That doesn't happen all that often here.
After dusk I saw five deer on a meadow. Well, their outlines against the remaining backlit sky.
https://lyse.isobeef.org/waldspaziergang-2024-11-04/

3. Summer lightning.
4. Obviously aliens@11!!@1
I once saw a light show in the woods originating most likely from a disco a few kilometers away. That was also pretty crazy. There was absolutely zero sound reaching the valley I was in.
Did you manage to already hide it all in your tummy, @bender? :-)


https://lyse.isobeef.org/tmp/anschlagwinkel/
```
User-agent: *
Disallow: /
Allow: /$*
```
Recently, @bender made me finally switch to weechat in a tmux session on my server: `tmux new -s irc` and then run `weechat` inside. On my local computer I then simply attach to that session, even got an alias for that: `alias irc='ssh -t isobeef tmux attach -t irc'`. I'm now basically online 24/7 and can skip over the new messages in the backlog by hand when I start my local computer. :-D

I'm very happy with that. Can't imagine ever going back right now. I'm also wondering why it took me all those years to finally make that small step. Happy IRCing!
Oh, and the `lang` metadata field is indented with tabs, breaking the nice visual alignment.
Lol, pounding schnitzel with a metalworking hammer wrapped in a plastic bag, I had never come across that before either. :-D
"Like a true German, I'm going to open this beer with my eye socket." Hahahahahahaaaaa! :-D
`di{` some time ago but entirely forgot about it.
Read it, prologic, it's totally worth it. That's a great writeup by some very cool dude.
The PR article by the company just speaks for itself and reinforces their dick move. No more questions. https://support.zendesk.com/hc/en-us/articles/8187090244506-Email-user-verification-bug-bounty-report-retrospective

I just like to send a proper `Content-Type` stating the right encoding, to be a good web citizen. That's all. :-)
> Clients (and human readers) just assume a flat threading
> structure by default, read things in order […]
I might misunderstand this, but I slightly disagree. Personally, I like to look at the tree structure, and my client also presents the conversation as an actual tree, not a flat list. Yes, this gets messy when there are a lot of branches and long messages, but I've managed to live with that. Doesn't happen very often. Anyway, just a personal preference. Nothing to really worry about.
> The v2 spec requires each reply to re-calculate the hash
> of the specific entry I’m replying to […]
Hmmmm, where do you read that the client has to re-calculate the hash on reply? (Sorry, I'm probably just not getting your point here in the entire paragraph.)
> Clients should not be expected to track conversations back
> across forking points […]
I agree. It totally depends on the client.
`Content-Type: text/plain` might not be enough, as the HTTP spec defaults to Latin-1 or whatever, not UTF-8. So there is a gap, or room for incorrect interpretation. I could be wrong, but I understand @anth's comment to mean that he doesn't even want to have a `Content-Type` header in the first place.

I reckon it should be optional, but when deciding to send one, it should be `Content-Type: text/plain; charset=utf-8`. That also helps browsers pick the right encoding right away without guessing wrong (which basically always happens with Firefox here). That aids people who read raw feeds in browsers for debugging or whatnot. (I sometimes do that to decide if there is enough interesting content to follow the feed at hand.)
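For feed authors on Nginx (mentioned earlier regarding server setups), getting that full header is a tiny config change. A minimal sketch, assuming the feed lives at `/twtxt.txt` (adjust the location to your setup):

```nginx
# Serve the twtxt feed as UTF-8 plain text.
# "charset utf-8;" appends "; charset=utf-8" to the Content-Type;
# text/plain is in Nginx's default charset_types, so no extra tweaks needed.
location = /twtxt.txt {
    default_type text/plain;
    charset utf-8;
}
```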

just before.
`mkdir -p $dir` and just retrying the command works.