# I am the Watcher. I am your guide through this vast new twtiverse.
#
# Usage:
# https://watcher.sour.is/api/plain/users View list of users and latest twt date.
# https://watcher.sour.is/api/plain/twt View all twts.
# https://watcher.sour.is/api/plain/mentions?uri=:uri View all mentions for uri.
# https://watcher.sour.is/api/plain/conv/:hash View all twts for a conversation subject.
#
# Options:
# uri Filter to show a specific user's twts.
# offset Start index for the query.
# limit Count of items to return (going back in time).
#
# twt range = 1 196263
# self = https://watcher.sour.is?offset=173056
# next = https://watcher.sour.is?offset=173156
# prev = https://watcher.sour.is?offset=172956
lxappearance to configure dark mode on dwm?
[47°09′17″S, 126°43′12″W] Wind speed: 45kph
🧮 USERS:1 FEEDS:2 TWTS:1112 ARCHIVED:79684 CACHE:2598 FOLLOWERS:17 FOLLOWING:14
[47°09′04″S, 126°43′09″W] Wind speed: N/A -- Cannot communicate
[47°09′49″S, 126°43′40″W] Weather forecast alert -- storm from E
@bender Yes. I _think_ as a fancy autocomplete "tool" it's not too shabby. Beyond that I'm not convinced it saves you time at all.
@prologic Quoting a friend of mine, a C# developer of 25 years, now converted to DevOps:
> "If you are not using AI everyday, you're working too much", and "completely worth it [referring to the use of ChatGPT], no question. Same work output, in less of my time. More breaks for me."
The point is not to rely on it 100%. It's just a tool.
@prologic exactly! Supposedly this engagement of his is "blessed" by his lawyers. 🤦🏻♂️ He might need better lawyers too!
"You have reached a non-working number at UPS [...]" says the recording. If it were a non-working number, it wouldn't even ring, right? It should have said "You have reached an outgoing-calls-only number at UPS [...]", or better yet, route outgoing-call-only numbers to the one we should be calling instead. Problem resolved.
Wow! 😮 He seems to be digging himself into a hole there right? 🤣
See comments from him (photomatt) on that HN entry.
Over the past few days I've been playing around with the latest Chat-GPT; I _think_ the model is called o1-preview. I've used it for various tasks, from writing documentation, specs, and shell scripts, to code (in Go).
The result? Well I can certainly say the model(s) are much better than they used to be, but maybe that isn't so much the models per se, but the sheer processing power at OpenAI's data centers? 🤔
But here's the kicker... If anyone thinks, even for a moment, that these "AI" things are intelligent, or that the marketing and hype are even remotely close to convincing us of "AGI" (Artificial General Intelligence) or ASI (Artificial Super Intelligence), you are sorely mistaken.
Chat-GPT, and basically any other technology based on Generative AI (Gen-AI): these pre-trained transformers that use adversarial neural networks and insanely multi-dimensional vector databases to model all sorts of things, from human language and programming languages all the way to visual and audible art, are (_wait for it_):
Incredibly stupid! 🤦♂️
They are effectively quite useless for anything but:
- Reproducing patterns (_albeit badly_)
- Search and Retrieval (_in a way that "seems" to be natural_)
And that's about it.
Used as a tool, they're kind of okay, but I wouldn't use Chat-GPT or Copilot. I'd stick with something more like Codeium if you want a bit of a fancier "auto complete". Otherwise, just forget about the whole thing, honestly. It doesn't even really save you time.
@lyse pretty cool! What's the process that you follow? Share, share! :-)
Made the first apple sauce of the season in around three to four hours of work. Pretty cool, very, very little waste. The jars are currently cooking.
@xuu being contrarian isn't a problem. Having different opinions forces us to think, and make---hopefully---better decisions. We shouldn't, mustn't, be contrarians, though, without offering a viable path forward that makes sense. What I am saying is that after that "so…" of yours there needs to come a tangible recommendation (or a set of them). 😉
[47°09′51″S, 126°43′01″W] Raw reading: 0x66FE7932, offset +/-1
@movq I'm sorry if I sound too contrarian. I'm not a fan of using an obscure hash either. The problem is one of forward and backward compatibility. If we change to sha256 or another algorithm, we don't just need to support sha256; we now need to support both sha256 AND blake2b. Or we divide the community: users of some clients will still use the old algorithm and get left behind.
Really, we should all think hard about how changes will break things and whether those breakages are acceptable.
@aelaraji Hmm, that is worth trying. It is the same Firefox base, I guess 🤔
Maybe I should sleep more? I noticed a mistake in my follow entry for prologic. Already fixed.
It's all about the r gage meant ya see 😅
[47°09′04″S, 126°43′35″W] Transfer aborted
@xuu @prologic You clearly have very different goals for twtxt and view it from a very different perspective. I don’t have the mental energy for these discussions. I’m gonna take a break.
@aelaraji Yep seems alright! Really fast too. I'm still using my main Firefox in general cos.. well it's set up so much and it's hardened, profile running in RAM, all that crazy stuff that got it working the way I want 😂
But keeping a good eye on Zen Browser's progress.
[47°09′47″S, 126°43′29″W] Sample analyzing complete -- starting transfer
I swear I did write up an algorithm for it at some point; I think it is lost in a git comment someplace. I'll put together some pseudo/Go code this week.
Super simple:
Making a reply:
0. If yarn has one, use that. (Maybe do a collision check?)
1. Make a hash of the raw twt, no truncation.
2. Check the local cache for the shortest prefix without a collision
- in SQL: select len(subject) where head_full_hash like subject || '%'
Threading:
1. Get the full hash of the head twt
2. Search for twts
- in SQL: head_full_hash like subject || '%' and created_on > head_timestamp
The assumption being that replies will be to the most recent head. If replying to an older one, it will use a longer hash.
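The reply-side steps above can be sketched in Go. This is only a rough illustration, not the actual Twt Hash spec: sha256/hex stands in for the real hash and encoding, the SQL collision check is replaced by an in-memory slice of cached full hashes, and the minimum subject length of 7 is assumed.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strings"
)

// fullHash stands in for the real twt hash function (illustration only;
// the actual extension defines its own hash and encoding).
func fullHash(raw string) string {
	sum := sha256.Sum256([]byte(raw))
	return hex.EncodeToString(sum[:])
}

// shortestSubject returns the shortest prefix of head's full hash (at
// least minLen characters) that no *other* cached hash shares, mirroring
// the SQL `head_full_hash LIKE subject || '%'` collision check.
func shortestSubject(head string, cache []string, minLen int) string {
	for n := minLen; n <= len(head); n++ {
		prefix := head[:n]
		collision := false
		for _, h := range cache {
			if h != head && strings.HasPrefix(h, prefix) {
				collision = true
				break
			}
		}
		if !collision {
			return prefix
		}
	}
	return head // worst case: the full hash is the subject
}

func main() {
	// Deterministic demo with literal stand-in "hashes":
	cache := []string{"abcdef1234", "abcd999999", "ffff000000"}
	fmt.Println(shortestSubject("abcdef1234", cache, 7)) // abcdef1

	// And with a real (sha256 stand-in) hash of a raw twt:
	head := fullHash("https://example.com/twtxt.txt\n2024-09-17T12:00:00Z\nHello!")
	fmt.Println(shortestSubject(head, []string{head}, 7))
}
```

When a cached hash shares a longer prefix with the head, the loop simply extends the subject until it is unique again, which is what makes replies to older heads use longer hashes.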
This Zen-Browser is actually not bad! 🤯
- Based on Firefox instead of Chromium.
- Got tiling panes when you need them... (just like a tiling window manager).
- I can hide the Tabs and Nav-Bar with a single shortcut!! AKA Compact Mode ...
@eldersnake and Snapchat, that one is the worst. No, I am not sharing my entire address book. Geez!
@slashdot Pretend I'm Leonardo.
> /ME slow claps...
Lol, this is actually a good thing by Apple. Doesn't kill social apps at all, just prevents some harvesting of your entire address book by abusive apps like WhatsApp.
🧮 USERS:1 FEEDS:2 TWTS:1111 ARCHIVED:79666 CACHE:2610 FOLLOWERS:17 FOLLOWING:14
@bender Nope, not at all. base64 just encodes more bits per character.
> Build what makes you happy. Let miserable people build the rest
@lyse I _think_ the proposal should be as simple as this:
- Update the Twt Hash extension.
- Increase its truncation from 7 to 12
@xuu is right about quite a few things, and I'd love it if he wrote up the dynamic hash size proposal, but I'm inclined to just increase the length, mostly because my own client yarnd doesn't even store the full hashes in the first place 🤦♂️ (I think)
@xuu I guess @movq's point is there isn't one that is available as standard on OpenBSD? 😅
End the apartheid, End the war. #FreePalestine
stick computers, to snugly fit in reclaimed plastic tubes/containers #halfbaked #coding #programming #embedded #electronics
I mean, sure, if I want to run it on my toothbrush, why not use something that is accessible everywhere, like md5 or crc32? It was chosen a long while back, and the only benefit in changing now is "I can't find an implementation for x", while the downside is that it breaks all existing threads. so...
[47°09′10″S, 126°43′53″W] Taking samples
Necropost: BTW, I have a twt alias for twet 😅
@bender Yes, a proposal alone is certainly not enough, but it's a good start. Absolutely necessary, in my opinion. With everything just up in the air and constantly changing (at least it appears that way to me), I'm lost.
I have the feeling that the hashing part is the most important one that should be sorted first.
@lyse I agree. Yet, even with a proposal, it is hard to finally agree on something, because it is not about developing a single, sole client, but about agreeing on a set of "standards" to be used by a handful(?) of clients made by different people.
Using Mastodon as a---albeit poor---contrast: they set their road map, and clients (even other server implementations!) that want to cater to/communicate with it using similar APIs have to adjust. No other way. That doesn't apply to twtxt.
I think the incremental changes that have been made to twtxt happened kind of slowly for that reason.
[47°09′51″S, 126°43′20″W] Reading: 0.23000 PPM
@quark I definitely agree with the first part. Not so sure about the second one. Maybe it then turns out miserable, too. :-?
@bender I do hope that it ends up fancy! But maybe it turns out rather crappy. Metal working is definitely beyond my capabilities. I just find it super fascinating.
I'd also appreciate if somebody wrote a proposal. It's very hard to piece everything together across all those many conversations.
@movq because sometimes resurrecting the dormant is worth it. Hello! :-)
Pinellas County - Easy: 5.02 miles, 00:10:12 average pace, 00:51:08 duration
kept it an easy feel. fun run.
#running
@prologic yeah, sad, convoluted, dangerous state of affairs for just about everyone. :-(
@bender Not yet! The prompt said the requests are treated manually and that it could take up to 30 days.
@bender Yeah, I know. I treat these like the RSS ones. I'm OK with them being one-way as long as they don't get spammy.
@aelaraji So, did you get approved? What's your tilde?
@aelaraji A one-way-only feed? I can't see the twtxt you are referring to, but I checked the original feed, and they seem not to be engaging with anyone.
@prologic Is base64 more desirable than base32? I noticed I get alphanumeric output when replacing base64 with base32.
@3r1c 🤔 Interesting! I was thinking about doing something like this in Rofi, now I can just play with this one.