grep -v git at the end, so my repo is still in working order. Phew. I wish find had grep-like --exclude-dir and --exclude options (or the include variants) instead of its own weird options that I can never remember and combine properly.
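For future me, the find incantation I always forget, next to the grep flags I can actually remember ('pattern' is just a placeholder):
find . -path ./.git -prune -o -type f -print
grep -r --exclude-dir=.git 'pattern' .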
Well, breaking threads on edits is considered a feature by some people. I reckon the only reasonable way to deal with that property is to carefully review messages before publishing them, thus delaying feed updates. Any typos etc. that are discovered afterwards are just left alone. That's what I and some others do. I only risk editing if the feed was published just a few seconds earlier. More than 20 seconds and I just ignore it. Works alright for the most part.
https://github.com/processing/processing4/issues/1243
https://github.com/processing/processing4/issues/1244
#SVG #LinuxXFCE
https://movq.de/v/b24882ecb1/s.png
A lot of HN comments are like this – in general, I mean, not only regarding my blog posts.
I’m not even angry, I’m just genuinely confused. 😂 The blog post in question isn’t a rant to begin with. Are those comments bots to drive engagement? Is this humor that I don’t understand? Is the person being serious?
What motivates people to post such comments? What’s going on here?
This is very, very weird to me.
(I don’t use HN, I just notice it by the increased load on the server.)
i'm pretty sure the timezone offset is stripped correctly (2025-09-14T12:45:00+02:00 → 2025-09-14T12:45:00Z), though messing with how the hash is generated i can't get it to make one that matches... but all other hashes for all other feeds seem to be correct? does yarn use a different canonical url for lyse internally? is there a bug in the libraries i'm using? bwehhh
edit: i read the spec wrong :3 only +/-00:00 is stripped, not the entire timezone offset >.<
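so the corrected normalization, as i now read it, only touches a zero offset; roughly (just an illustration, not my actual code):
echo '2025-09-14T12:45:00+00:00' | sed 's/+00:00$/Z/; s/-00:00$/Z/'   # → 2025-09-14T12:45:00Z
echo '2025-09-14T12:45:00+02:00' | sed 's/+00:00$/Z/; s/-00:00$/Z/'   # left untouched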
i skimmed through the discussions under the other proposals. i agree humans are very bad at keeping the integrity of the web intact, but hashes done in this way make it impossible even for systems to rebuild threads if any post edits have occurred prior to their deployment
sed -i s/… $(find …). Clearly, I found too many files. That's the signal to go to bed.
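Mental note for next time (with made-up names, since the real command is elided above): let find hand the files to sed itself instead of splicing $(find …) into the command line:
find . -type f -name '*.txt' -not -path './.git/*' -exec sed -i 's/old/new/g' {} +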
I’m a fan of the EPA and all its efforts and hope that we helped in some small way for this agency to communicate within itself, to other government agencies, and with the American people. I’m very grateful and appreciative that Jesse Reed and Hamish Smyth of Standards Manual, and Julie Anixter of AIGA, brought this document to life again. Have fun revisiting.»
(from the introduction by Steff Geissbühler)
Protection Agency
Graphic Standards System
Designed by Steff Geissbühler,
Chermayeff & Geismar Associates
The EPA Graphic Standards System is one of the finest examples of a standards manual ever created. The modular and flexible system devised raised the standard for public design in the United States.
The book features a foreword by Tom Geismar, introduction by Steff Geissbühler, an essay by Christopher Bonanos, scans of the original manual (from Geissbühler’s personal copy), and 48 pages of photographs from the EPA-commissioned Documerica project (1970–1977).»
https://standardsmanual.com/products/epa
A photo showing part of a page from the EPA Graphic Standards System; it includes some nice black and white geometric patterns.
https://movq.de/v/2a7918d719/a.jpg
https://villares-shop.fourthwall.com/
https://umapenca.com/villares/
#Python #Processing #py5 #CreativeCoding #FLOSS #numpy #shapely #trimesh
Screen capture from my Fourthwall shop page with t-shirts, stickers and mugs.
yarnd was built over a weekend 😀
internally, bbycll relies on a post lookup table with post hashes as keys. this is really fast, but i knew i'd inevitably run into this issue (just not so soon), so now i have to either:
1) pick the newer post over the other
2) break from specification and not lowercase hashes
3) secretly associate canonical urls or additional entropy with post hashes in the backend without a sizeable performance impact somehow (rough sketch below)
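rough sketch of option 3 in throwaway bash (not bbycll's real code, urls made up): key the table on hash plus canonical url so colliding lowercased hashes stay distinguishable:
declare -A posts
posts["abcdefg|https://example.com/feed1.txt"]="first post"
posts["abcdefg|https://example.com/feed2.txt"]="colliding post"
echo "${posts["abcdefg|https://example.com/feed2.txt"]}"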
There is always a way out.
Festival Japão. Torna-Viagem | Parte II
"Conheça o programa de hoje e embarque connosco nesta viagem ao Japão.
A partir das 15h00, no Teatro Ribeiro Conceição - Município de Lamego."
+ info em https://museudelamego.gov.pt/festival-japao-torna-viagem-parte-ii-12-e-13-setembro/
#museudelamego #festivaljapaotornaviagem #raquelochoa
#museusemonumentosdeportugal
#teatroribeiroconceicao #portugalexpo2025 #OsakaemPortugal
#vaagostudio #TRC #Lamego
Festival poster for the 13th
A normal person is completely lost (that's why I got involved). Visiting the broken URL opens a popup dialog suggesting you deactivate script blockers. Which I had already done upfront, as a matter of prudence.
Fun bonus on top: The JWT in the link has identical iat (issued at) and exp (expiry) claims. The expiry is definitely not checked; it's well in the past. Medical software just has to be horrible. It's a law.
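One way to peek at those claims, for anyone curious ($JWT is a placeholder for the token, which stays out of this post):
payload=$(printf '%s' "$JWT" | cut -d. -f2 | tr '_-' '/+')
while [ $(( ${#payload} % 4 )) -ne 0 ]; do payload="$payload="; done   # restore base64 padding
printf '%s' "$payload" | base64 -d | jq '{iat, exp}'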
Sunset