> iirc in twtxt v2 it starts prohibited
This is not true. There are no issues with supporting fetching feeds via Gemini/Gopher; this is totally fine. What will likely happen is "recommendations" and "drawbacks of using Gemini/Gopher".
Another thing: it seems that you suggest we only use the domain in the hash creation and not the full path to the twtxt.txt:
$ echo -e "https://example.com 2024-09-29T13:30:00Z Hello World!" | sha256sum - | awk '{ print $1 }' | base64 | head -c 12
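For clarity, here is a rough Python equivalent of that pipeline (a sketch only, with an illustrative function name, not a normative definition); note that echo's trailing newline and awk's output newline are part of what gets encoded:
import base64
import hashlib

def twt_hash(url: str, timestamp: str, content: str) -> str:
    # Mirror the shell pipeline: echo appends a newline, sha256sum's hex
    # digest (plus awk's trailing newline) is what base64 sees, and the
    # first 12 characters of that output are the hash.
    line = f"{url} {timestamp} {content}\n"
    hex_digest = hashlib.sha256(line.encode("utf-8")).hexdigest()
    return base64.b64encode((hex_digest + "\n").encode("ascii")).decode("ascii")[:12]

print(twt_hash("https://example.com", "2024-09-29T13:30:00Z", "Hello World!"))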
Otherwise you can check if you already have the pdftotext command that comes with the poppler-utils package, try converting the PDF into a text file, and copy to your heart's content. I have just tried it myself. If you don't have it already, here's what you can do on Ubuntu or any Debian-based distribution of Linux:
- Update and upgrade your packages:
> sudo apt update && sudo apt upgrade
- Install the poppler-utils package:
> sudo apt install poppler-utils
- Now you can convert your PDF to a text file with:
> pdftotext -layout -enc UTF-8 name_of_source_file.pdf name_of_destination_file.txt
You can always run pdftotext --help to see the rest of the possible options. Hope this helps.
~/src/jenny $ git diff
diff --git a/jenny b/jenny
index ada8da2..8ae9a06 100755
--- a/jenny
+++ b/jenny
@@ -1194,7 +1194,7 @@ if __name__ == '__main__':
if args.edit:
edit_twt_file(app)
elif args.fetch:
- with DirectoryLock(f'/tmp/jenny-{getuser()}.run'):
+ with DirectoryLock(expanduser(f'~/tmp/jenny-{getuser()}.run')):
retrieve_all(app)
elif args.last_seen:
print('Feeds last seen at (times are local time), oldest first:')
And of course, make sure you mkdir ~/tmp.
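For anyone following along, a standalone sketch of what that new lock path expands to and the ~/tmp directory it needs (this is not jenny itself; DirectoryLock is jenny-internal and not shown here):
from getpass import getuser
from os import makedirs
from os.path import expanduser

# Same expression as in the patched line above.
lock_path = expanduser(f'~/tmp/jenny-{getuser()}.run')
makedirs(expanduser('~/tmp'), exist_ok=True)  # roughly the `mkdir ~/tmp` step
print(lock_path)  # e.g. /home/<user>/tmp/jenny-<user>.run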
2024-09-29T12:08:15Z (#7wdvhia) @<lyse https://lyse.isobeef.org/twtxt.txt> Love 27! Is that your town as seen from the mountain, or some other town? From 395 to 40 is quite some picking! I figure that’s the most difficult part, right?
Ah, 16°C… what dreams are made of! 😍
> #1234567, it could refer to the original or some edit of it. It is up to clients to find out what this hash could mean (by keeping a historical database of all feed versions, basically).
Isn’t this *essentially* the same as only including url and timestamp in the hash?
> Ah, 16°C… what dreams are made of! 😍
I'd like it to be a nice cool 16°C here 🤣
twet and continue to improve it. It's an "okay" Twtxt CLI client, but it needs a bit more work 👌
twet 🤦♂️
My client fetches a feed. It builds a map/hashmap/dictionary of all twts: timestamps map to twt hashes. It then stores/shows the twts. It also stores the hashmap.
On the next fetch operation, the client re-processes all twts in the feed. It must now compare each timestamp to the previously built hashmap: aha, timestamp T now has a twt hash of B instead of A, so this is an edited twt.
Did I understand that correctly so far? 🤔
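If I got that right, a rough sketch of that bookkeeping would be something like the following (names and structures are illustrative only, not any client's actual API):
from typing import Dict, List, Tuple

def detect_edits(previous: Dict[str, str],
                 fetched: Dict[str, str]) -> Tuple[Dict[str, str], List[str]]:
    # Both maps go timestamp -> twt hash. Return the freshly built map plus
    # the timestamps whose hash changed since the last fetch, i.e. edited twts.
    edited = [ts for ts, h in fetched.items()
              if ts in previous and previous[ts] != h]
    return fetched, edited

# Example: timestamp T hashed to "A" on the last fetch, now hashes to "B".
old = {"2024-09-29T13:30:00Z": "A"}
new = {"2024-09-29T13:30:00Z": "B"}
_, edits = detect_edits(old, new)
print(edits)  # ['2024-09-29T13:30:00Z'] -> this twt was edited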
Yep, these are some sick mushrooms. No idea what they are, though. Not sure if they're edible more than once or not, but I have a feeling that one should refrain from trying. The ones I photographed here were in a nature reserve. They were a bit bigger than the others we came across on meadows. Still impressive sizes nevertheless.