


I'm no lawyer, but my uneducated guess would be that:

A) Twts are already publicly available/public knowledge and such... just don't process children's personal data and _MAYBE_ you're good? Since there's this:

> ... an organization’s right to process someone’s data might override their right to be forgotten. Here are the reasons cited in the GDPR that trump the right to erasure:
> - The data is being used to exercise the right of freedom of expression and information.
> - The data is being used to perform a task that is being carried out in the public interest or when exercising an organization’s official authority.
> - The data represents important information that serves the public interest, scientific research, historical research, or statistical purposes and where erasure of the data would likely to impair or halt progress towards the achievement that was the goal of the processing.

B) What I love about the TWTXT sphere is its human/humane element! No deceptive algorithms, no corpo B.S., etc. Just humans. So maybe, if we thought about it this way, it wouldn't hurt to be even nicer to others and offer strangers an even safer space.

I can already imagine a couple of extreme cases where, somewhere in this _peaceful world_, one's exercise of freedom of speech could get them into *real trouble* (if not danger) if found out; it wouldn't necessarily have to involve the law or legal authorities. So, if someone asks, perhaps fearing for... let's just say 'their well-being', would it hurt if a pod just purged their content, if *it's serving it publicly* (and maybe relayed the info to other pods), and called it a day? It doesn't have to be about some law or convention somewhere... 🤷 I know, too extreme! But I've seen news of people who'd gone to jail or had their lives ruined for as little as a silly joke. And it doesn't even have to be about any of this.

P.S.: Maybe make the `X` tool check out robots.txt? Or maybe make long-term archives opt-in? Opt-out? P.P.S.: Already way too many MAYBEs in a single twt! So I'll just shut up. 😅
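On the robots.txt idea: an archiver could at least fetch the feed host's `robots.txt` before committing anything to a long-term archive. A naive sketch (the feed URL is the example one from this thread; a real implementation should use a proper robots.txt parser rather than this grep-level check):

```shell
feed="https://twtxt.net/user/prologic/twtxt.txt"

# Derive scheme://host from the feed URL
host=$(printf '%s' "$feed" | sed -E 's#^(https?://[^/]+).*#\1#')

# Fetch robots.txt; tolerate absence or network failure
curl -sf "$host/robots.txt" || echo "no robots.txt found (or fetch failed)"
```

Whether to honour `Disallow` rules for archiving (as opposed to crawling) would still be a policy choice, not a legal one.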
zq4fgq
Thanks!
```shell
~ » echo -n "https://twtxt.net/user/prologic/twtxt.txt\n2020-07-18T12:39:52Z\nHello World! 😊" | openssl dgst -blake2s256 -binary | base32 | tr -d '=' | tr 'A-Z' 'a-z' | tail -c 7
p44j3q
```
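One portability note on that command: whether `echo` expands `\n` depends on the shell (zsh's builtin does by default; bash's doesn't without `-e`). A `printf`-based variant of the same pipeline avoids the ambiguity; assuming the original shell did expand the escapes, this feeds identical bytes to the digest:

```shell
printf '%s\n%s\n%s' \
  "https://twtxt.net/user/prologic/twtxt.txt" \
  "2020-07-18T12:39:52Z" \
  "Hello World! 😊" \
  | openssl dgst -blake2s256 -binary | base32 | tr -d '=' | tr 'A-Z' 'a-z' | tail -c 7
```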
```shell
#!/bin/bash
set -e
trap 'echo "!! Something went wrong...!!"' ERR

#============= Variables ==========#
# Source files
LOCAL_DIR="$HOME/twtxt"
TWTXT="$LOCAL_DIR/twtxt.txt"
HTML="$LOCAL_DIR/log.html"
TEMPLATE="$LOCAL_DIR/template.tmpl"

# Destination
REMOTE_HOST=remoteHostName   # Host already set up in ~/.ssh/config
WEB_DIR="path/to/html/content"
GOPHER_DIR="path/to/phlog/content"
GEMINI_DIR="path/to/gemini-capsule/content"
DIST_DIRS=("$WEB_DIR" "$GOPHER_DIR" "$GEMINI_DIR")

#============ Functions ===========#
# Build log.html:
build_page() {
    twtxt2html -T "$TEMPLATE" "$TWTXT" > "$HTML"
}

# Bulk-copy files to their destinations:
copy_files() {
    for DIR in "${DIST_DIRS[@]}"; do
        # Copy both `txt` and `html` files to the Web server, and only `txt`
        # to the gemini and gopher server content folders
        if [ "$DIR" == "$WEB_DIR" ]; then
            scp -C "$TWTXT" "$HTML" "$REMOTE_HOST:$DIR/"
        else
            scp -C "$TWTXT" "$REMOTE_HOST:$DIR/"
        fi
    done
}

#========== Call to functions ==========#
build_page && copy_files
```
The `datetime="..."` attribute is in my local time (UTC+1), while the text within the tag is in UTC+0:

```html
<time class="dt-published" datetime="2024-09-17T15:05:19+01:00"> 2024-09-17 14:05:19 +0000 UTC+0000 </time>
```

The thing is, I've been poking at the template as well, but nothing changes. I literally added whole portions of lorem text just to see if it would do anything, then

```shell
twtxt2html -T ./layout.html <link to twtxt file> | less
```

shows the same thing as before! Nothing changes. LOL, I'm not sure I'm going at it the right way.
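For what it's worth, those two renderings do describe the same instant; a quick check with GNU `date` (the `-d` date-string parsing is GNU-specific) converts the attribute's local time to UTC:

```shell
# GNU date: convert the local-time (+01:00) attribute value to UTC
date -u -d "2024-09-17T15:05:19+01:00" +"%Y-%m-%d %H:%M:%S %Z"
# → 2024-09-17 14:05:19 UTC
```

So the mismatch is purely one of display timezone, not of the underlying timestamp.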
```shell
twtxt2html https://domain.tld/twtxt.txt > /path/to/static_files_dir/log.html
```

That way I could benefit from the 'relative time' portion I'm getting rid of with the `-n`...
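One shell gotcha worth flagging here: `>` needs a file target, not a directory; redirecting output into `/path/to/static_files_dir/` itself fails with "Is a directory". A minimal illustration using a temp directory (no `twtxt2html` needed):

```shell
tmp=$(mktemp -d)   # stand-in for static_files_dir

# Redirecting to the directory itself fails
if echo "demo" 2>/dev/null > "$tmp"; then
  echo "unexpected: write to a directory succeeded"
else
  echo "redirect to a directory fails; use an explicit file path"
fi

# Redirecting to a file inside it works
echo "demo" > "$tmp/log.html"
cat "$tmp/log.html"

rm -rf "$tmp"
```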
```shell
twtxt2html $HOME/path/to/local_twtxt_dir/twtxt.txt > $HOME/path/to/local_twtxt_dir/log.html && \
scp $HOME/path/to/local_twtxt_dir/log.html user@remotehost:/path/to/static_files_dir/
```

I've been too lazy to add it to my publish_command script; now I can just copy/pasta from the twt 😅
... `~/.cache/jenny` and my `maildir_target` when I tried to reset things. Still got wrecked 😅 If it's not too much to ask, could you back up or change your `maildir_target` and give it a try with an empty directory?
P.S.: I still can't get your and bender's archived twts (at least the ones I've noticed), nor can I `--fetch-context` on replies to them. Your oldest is the one from 2024-06-14 18:22... I can see lyse's tho! I doubt this is related to the edit issue, but maybe it helps with something.
- It all started with a LOT of his old twts, going back to 2020, showing up in a weird way: some were empty, others were duplicates, and a lot more got marked for deletion by neomutt with the `D` tag.
- After trying to restart things with a fresh Maildir, I couldn't fetch a lot of twts, even mine which was a reply to one of his. But then I was able to, after temporarily deleting his link from my follow file.

Then @quark and @bender pointed out the inconsistent `from:` + feed URL and the twt edit.
Is the `prev = hash twtxt.txt/n` instead of a link by design? I couldn't fetch any, nor can I do a `--fetch-context` on replies to your old twts.
> No keyboards were harmed during this experiment... yet.
I'll set up jenny and mutt on another computer and see how it goes from there.
> Also what are the chances that the same _human_ will make two different posts within the same second?!

Just out of curiosity, what would happen if I (maybe trolling) edited my twtxt.txt file manually and swapped a couple of twt timestamps, or added in 3 different twts manually with the same timestamp?
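To see what identical timestamps do to content-addressed twt hashes, here's a sketch reusing the pipeline from earlier in this thread (blake2s via openssl; the real twt-hash spec may differ, so treat it as illustrative, and the feed URL is hypothetical). Two twts in the same second but with different content still get distinct hashes, while a byte-identical (url, timestamp, content) triple always produces the same hash, so a manual duplicate is indistinguishable from the original:

```shell
# Illustrative short hash over (url, timestamp, content)
twt_hash() {
  printf '%s\n%s\n%s' "$1" "$2" "$3" \
    | openssl dgst -blake2s256 -binary | base32 | tr -d '=' | tr 'A-Z' 'a-z' | tail -c 7
}

url="https://example.com/twtxt.txt"     # hypothetical feed
ts="2024-09-17T15:05:19+01:00"          # same second for all three

a=$(twt_hash "$url" "$ts" "First twt")
b=$(twt_hash "$url" "$ts" "Second twt")
c=$(twt_hash "$url" "$ts" "First twt")

[ "$a" != "$b" ] && echo "same second, different content: distinct hashes"
[ "$a" = "$c" ] && echo "identical triple: identical hash (duplicates collide by design)"
```

So swapping timestamps changes the hashes (breaking existing reply threads), while adding several twts in the same second is fine as long as their content differs.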