I think I want to do something like this, but my incompetence / inexperience keeps knocking me down...
I have a geodataframe with squares and parks, and another with significant vegetation mass (which I got from GeoSampa), and I'd like to know how to calculate, in a column, how much of each square is covered by significant vegetation...
https://gis.stackexchange.com/questions/421888/getting-the-percentage-of-how-much-areas-intersects-with-another-using-geopandas
I managed to do an intersection overlay, filter out the pieces with an area under 100 m², and use .explore() to color the vegetation masses by area; I was already happy with that, but I wanted more, haha.
#python #geopandas #geoPython #GIS
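For reference, a rough, untested sketch of what that Stack Exchange approach could look like with geopandas; the file names, the praca_id / pct_vegetacao columns and the EPSG code are assumptions to swap for the actual GeoSampa layers:

import geopandas as gpd

# Placeholders: adjust file names/columns to the real GeoSampa layers.
pracas = gpd.read_file("pracas_e_parques.gpkg")            # squares and parks
vegetacao = gpd.read_file("vegetacao_significativa.gpkg")  # significant vegetation

# Reproject to a metric CRS so areas come out in m² (UTM 23S covers São Paulo).
pracas = pracas.to_crs(epsg=31983)
vegetacao = vegetacao.to_crs(epsg=31983)

# Intersect vegetation with the squares; each resulting piece keeps its square's id.
# (If vegetation polygons overlap each other, dissolve vegetacao first to avoid double counting.)
pracas["praca_id"] = range(len(pracas))
pedacos = gpd.overlay(pracas, vegetacao, how="intersection")

# Sum the intersected area per square and divide by the square's own area.
area_coberta = pedacos.geometry.area.groupby(pedacos["praca_id"]).sum()
pracas["pct_vegetacao"] = pracas["praca_id"].map(area_coberta).fillna(0) / pracas.geometry.area * 100

pracas.explore(column="pct_vegetacao")  # same .explore() trick, now colored by coverage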

I am feeling "mushy" today. Ugh, ageing sucks.

That wrote a cool script
But since then, he left GitHub
Deleted all his repos with it.
Now the cool script is gone
remaining this poem I wrote instead
Doesn't rhyme, isn't cool, sounds bad.
A simple bash script to write a post in a mktemp file, then clean it with regex. I don't even bother to hash the replies, I just open https://twtxt.net and copy the hash by hand since I'm checking the new posts from there anyway (temporarily, as I might end up DoS-ing everyone's feed in my client right now).
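The original is a bash script, but as a rough sketch of the same flow (draft in a temp file, clean it up, append to the feed), here's a hedged Python rendering; the feed path and the cleanup regex are made-up placeholders:

import os, re, subprocess, tempfile
from datetime import datetime, timezone

def write_post(feed_path="~/twtxt.txt"):
    # Draft the post in a temporary file using $EDITOR.
    with tempfile.NamedTemporaryFile("w+", suffix=".twt", delete=False) as tmp:
        path = tmp.name
    subprocess.run([os.environ.get("EDITOR", "vi"), path], check=True)
    with open(path) as fh:
        text = fh.read()
    os.unlink(path)
    # "Clean it with regex": collapse whitespace/newlines into a single line.
    text = re.sub(r"\s+", " ", text).strip()
    if text:
        # Append as a twtxt line: RFC 3339 timestamp, a tab, then the post text.
        stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
        with open(os.path.expanduser(feed_path), "a") as feed:
            feed.write(f"{stamp}\t{text}\n")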
I'm also getting angry thinking about what this Chat Control crap will escalate to.
I'm already thinking of countermeasures and self-hosted alternatives, while searching lists of affected apps and services to replace/drop in the worst scenario (and probably devices).
That's why part of my proposal was to allow custom strings and be free from a specific format that needs periodic upgrades, but it's not much of a problem in the end.
I'll adapt to what we can get out of this.
* https://www.pcp.pt/insistencia-do-governo-em-medidas-fiscais-agrava-injusticas-promove-precariedade-na-habitacao
"Até agora, a redução da taxa para 10% estava condicionada à celebração de contratos com a duração mínima de dez anos, com esta alteração, passa a ser aplicada a taxa de 10% mesmo a contratos de apenas um ano (com rendas até 2300 euros). Na verdade, o Governo incentiva a conversão de contratos de longa duração em contratos de curta duração, desprotegendo os inquilinos e contribuindo para uma nova espiral de aumento das rendas."
"quer o alargamento das deduções com as despesas de arrendamento por parte dos inquilinos em sede de IRS – medida que deixa de fora mais de 40% dos inquilinos, pois não pagam IRS – , quer em particular a descida do IVA na construção para imóveis até 648 mil euros inserem-se numa orientação que privilegia e favorece um tipo de construção cujos valores não respondem às necessidades de quem procura arrendar ou comprar habitação"
"O País precisa [...] da regulação do mercado de arrendamento que coloque limites ao valor das rendas e force a sua descida. Precisa da promoção da estabilidade dos contratos de arrendamento assumindo os dez anos como referência."
...though, the presence of the text fragments then makes reversing the replied-to twt (and therefore its hash) trivial, which could allow clients to tolerate the omission of the hash — and while it would be 'non-standard' this would be the best of both worlds; potential to *tolerate* (or pave a glacial path toward? :o) human writable replies whilst keeping a unique id for twts that is universal across all pods
i concede, it does make a lot of sense to fix up the hashing spec rather than completely supplant it at this point, just thinking about what the rewrite would be like is dreadful in and of itself x.x

* https://eco.sapo.pt/2025/09/25/rendas-de-2-300-euros-sao-moderadas-a-explicacao-do-governo/
Just look at this:
"so that they can settle down and make up the human resources that our public administration and our companies need in order to be competitive"
"a family with two or three children in Lisbon and Porto often cannot find housing at a price lower than this"
"the 2,300 euros, particularly in the highest-pressure areas, point to a household income of around 5,000 euros for a household of a father, a mother and two children"
"civil servants at the start of their careers, on 1,248 euros, are also in there"
And the reality in Portugal?
* the lowest salaries in the civil service are 878.41 euros gross per month (2025)
* the average gross household income in Portugal was €3,352/month (2022)
* a couple of civil servants each earning 1,248 euros/month doesn't make enough to pay a monthly rent of 2,300
But sure, the rents are moderate, it's the salaries that are low... good thing the Government is going to raise the national minimum wage to €1,200... Oh, it isn't? It's only €1,100, and only by 2029? Ah, but instead it gives the landlords tax breaks, maybe that's more or less the same thing...
#ptpol #rendas
After thinking about it for a while I got to two solutions:
Proposal 1: Thread syntax (using subject)
Each post has an implicit and an *optional* explicit root reference:
- Implicit (no action needed, all the required data is already there)
  - URL + timestamp
- Explicit (subject required)
  - Identity (client generated)
  - External reference
  - Random value
We then include a "root" subject in each post for generating explicit threads:
1. [ROOT_ID] (REPLY_ID): simpler, with no need for prefixes
2. (root:ROOT_ID) (reply:REPLY_ID): more complex but could allow expansions
   - (rt:ROOT_ID) (re:REPLY_ID): same, but with a compact version
   - ($ROOT_ID) (>REPLY_ID): same, but with single characters
Each post can have both references; like the current hash approach, the reference can be treated as a simple string and doesn't have a real meaning.
Using a custom reference this way allows a client to decide how to generate them:
- Identity: can be a content hash or signature or anything else; without enforcing how it is generated we can upgrade the algorithm/length freely
- External reference: can be provided by another system (e.g. 7e073bd345, the latest *yarnsocial/yarn* commit)
- Random value: like a UUID (e.g. 9a0c34ed-d11e-447e-9257-0a0f57ef6e07)
Proposal 2: Threaded mentions (featuring zvava)
Inspired by @zvava's solution, it could be simplified into:
#<nick url#timestamp> or #<url#timestamp>
It can be shown like a mention or hidden like a subject.
If we're thinking of using a counter in the client, I think there's no point in avoiding the timestamp anymore.
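Just to make Proposal 1 concrete, a tiny hedged sketch of how a client could pull these references out, assuming the (root:...)/(reply:...) variant from the list above; the regex and the function name are mine, nothing here is a settled format:

import re

# Hypothetical parser for the "(root:ROOT_ID) (reply:REPLY_ID)" variant only;
# both IDs are treated as opaque strings, as described above.
SUBJECT_RE = re.compile(r"\((root|reply):([^)\s]+)\)")

def parse_thread_refs(text):
    refs = {}
    for kind, ref in SUBJECT_RE.findall(text):
        refs[kind] = ref
    return refs

print(parse_thread_refs("(root:9a0c34ed) (reply:7e073bd345) hello there"))
# -> {'root': '9a0c34ed', 'reply': '7e073bd345'}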
Would be nice to have a fixed fee for that, a car is a car anywhere in the world...
500 unique generative covers, art by Rod Junqueira and title variations by André Burnier.
https://www.indiegogo.com/projects/variavel-magazine
If there are opposing opinions we either build a bridge or provide a new parallel road.
Also, I wouldn't call my opinion a "stance", I just wish for a better twtxt thanks to everyone's effort.
The last thing we need to do is decide a proper format for the location-based version.
My proposal is to keep the "Subject extension" unchanged and include the reference to the mention like this:
// Current hash format: starts with a '#'
(#hash) here's text
(#hash) @<nick url> here's text
// New location format: valid URL-like + '#' + TIMESTAMP (verbatim format of feed source)
(url#timestamp) here's text
(url#timestamp) @<nick url> here's text
I think the timestamp should be referenced verbatim to prevent broken references with multiple variations (especially with the many timezones out there), which would also make it even easier for everyone to implement.
I'm sure we can get @zvava, @lyse and everyone else to help on this one.
I personally think we should also consider allowing a generic format to build on custom references; this would allow creating threads from any custom source (manual, computed or externally generated), maybe using a new "Topic extension". Here are some examples.
// New format for custom references: starts with a '!' maybe?
(!custom) here's text
(!custom) @<nick url> here's text
// A possible "Topic", parsed as a thread root:
[!custom] start here
[custom] simpler format
This one is just an idea of mine, but I feel it can unleash new ways of using twtxt.
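To show how little parsing this would need on the client side, here's a hedged sketch that tells the three subject styles apart; the function name and labels are mine, not part of any extension text:

# Hypothetical helper: classify a subject as hash, location or custom reference,
# following the three formats sketched above.
def classify_reference(subject):
    ref = subject.strip("()")
    if ref.startswith("#"):
        return "hash", ref[1:]        # current Twt Hash style
    if ref.startswith("!"):
        return "custom", ref[1:]      # proposed custom/"Topic" reference
    if "://" in ref and "#" in ref:
        return "location", ref        # URL + '#' + verbatim timestamp
    return "unknown", ref

print(classify_reference("(#abcdef0)"))
print(classify_reference("(https://example.com/twtxt.txt#2025-01-01T12:00:00Z)"))
print(classify_reference("(!my-topic)"))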
https://merankorii.bandcamp.com/track/the-alchemist
https://nebula.tv/videos/maryspender-the-dire-straits-story-full-documentary
https://www.youtube.com/live/_g00S-a_0lo
yt-dlp'ed https://www.youtube.com/watch?v=OZTSIYkuMlU with -f bestvideo[height<=?1080]+bestaudio/best. It's only worth it for an experiment, no recommendation to watch.
Printi, 360imprimir, Rei do Sticker?
(in Brazil)

Press play to hear from Marina Zurkow & James Schmitz [@hx2A@mastodon.art], the artists behind ‘The River is a Circle (Times Square Edition)’ - September’s #MidnightMoment, a visual “combination of live data and a matrix of researched information about the Hudson River ecology,” says Zurkow.
https://www.instagram.com/reel/DO6jbXrEdBG
#CreativeCoding #Processing #Python #py5 #TimesSquare #NYC

In here it is all about control, and money.

https://www.catarse.me/variavel (for Brazil)
https://www.indiegogo.com/projects/variavel-magazine (for other countries)
#ProgramaçãoCriativa #CreativeCoding

The thread is defined by two parts:
1. The hash
2. The subject
The client/pod generates the *hash* and indexes it in its database/cache, then it simply queries the subject of other posts to find the related posts, right?
In my own client's current implementation (using hashes), the only calculation is in the hash generation; the rest is a verbatim copy of the subject (minus the # character). If this is the commonly implemented approach, then adding the location-based one is somewhat simple.
function setPostIndex(post) {
  // Current hash approach
  const hash = createHash(post.url, post.timestamp, post.content);
  // New location approach
  const location = post.url + '#' + post.timestamp;
  // Unchanged (probably)
  const subject = post.subject;
  // Index them all
  addToIndex(hash, post);
  addToIndex(location, post);
  addToIndex(subject, post);
}
// Both should work if the index contains both versions
getThreadBySubject('#abcdef') => [post1, post2, post3]; // Hash
getThreadBySubject('https://example.com#2025-01-01T12:00:00') => [post1, post2, post3]; // Location
As I said before, the mention is already location based (@<example https://example.com/twtxt.txt>), so I think we should keep that in consideration. Of course this will lead to a bit of fragmentation (without merging the two), but I think this can make everyone happy.
Otherwise, the only other solution I can think of is a different approach where the value doesn't matter, allowing anything to be used as a reference (hash, location, git commit) for greater flexibility and freedom of implementation (this would probably need a fixed "header" for each post, but it can be seen as a separate extension).
(WTF, asciiworld-sat-track somehow broke, but I have not changed any of the scripts at all. O_o It doesn't find the asciiworld-sat-calc anymore. How in the world!? When I use an absolute path, the .tle is empty and I get a parsing error. Gotta debug this.)
1. I don't see any difference between the two schemes regarding link rot and migration. If the URL changes, both approaches are equally terrible as the feed URL is part of the hashed value and reference of some sort in the location-based scheme. It doesn't matter.
2. The same is true for duplication and forks. Even today, the "canonical URL" has to be chosen to build the hash. That's exactly the same with location-based addressing. Why would a mirror only duplicate stuff with location- but not content-based addressing? I really fail to see that. Also, who is using mirrors or relays anyway? I don't know of any such software to be honest.
3. If there is a spam feed, I just unfollow it. Done. Not a concern for me at all. Not the slightest bit. And the byte verification is THE source of all broken threads when the start of the conversation is edited. Yes, this can be viewed as a feature, but how often was it actually a feature rather than an anti-feature in terms of user experience?
4. I don't get your argument. If the feed in question is offline, one can simply look in local caches and see if there is a message at that particular time, just like looking up a hash. Where's the difference? Except that the lookup key is longer or compound or whatever depending on the cache format.
5. Even a new hashing algorithm requires work on clients etc. It's not that you get some backwards-compatibility for free. It just cannot be backwards-compatible in my opinion, no matter which approach we take. That's why I believe some magic time for the switch causes the least amount of trouble. You leave the old world untouched and working.
If these are general concerns, I'm completely with you. But I don't think that they only apply to location-based addressing. That's how I interpreted your message. I could be wrong. Happy to read your explanations. :-)
It looks amazing from the map, you probably can't tell even by looking from space.
Registration for the tutorials opened today: free, limited spots!
(And take the chance to tell that foreign friend of yours that they can buy a student ticket as a donation for someone who otherwise couldn't attend the full event with the talks! :D)
https://pybr2025.eventbrite.com.br/
#Python
1. The current hash relies on a url field too: by specification it will use the first # url = <URL> in the feed's metadata if present, and that too can differ from the fetching source. If that field changes, it breaks the existing hashes as well. A better solution would be a non-URL key like # feed_id = <UNIQUE_RANDOM_STRING>, with the url as a fallback (a rough sketch follows below).
2. We can prevent duplication if the reference uses that same url field too, or if the client "collapses" any reference to all the URLs defined in the metadata.
3. I agree that hashing based on content is good, but we still use the URL as part of the hashing, and that is just a field in the feed, easily replicable by a bot. Note also that edits can break the hash; for this issue an alternative solution (e.g. a private key not included in the feed) should be considered.
4. For offline reading the source would already be downloaded, and fetching non-followed feeds would fill the gap the same way mentions do. Maybe I'm missing some context on this one.
5. To prevent collisions there was a discussion about extending the hash (I forgot whether that was already settled or not), but without a fallback that would break existing clients too. We should think of a parallel format that leaves current implementations unchanged; we are already backward compatible with the original clients that don't use threads at all, and a mention-style format could be even more user-friendly for those clients.
We should also keep in mind that the current mention format is already location based (@<example https://example.com/twtxt.txt>), so I'm not that worried about threads working the same way. Hope to see some other thoughts about this matter. 🤓
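As a sketch of point 1 only: how the fallback could look in practice. feed_id is the proposed key above, not an existing twtxt metadata field, and the function name is made up:

# Hypothetical: prefer "# feed_id = ..." if present, else the first "# url = ...",
# else the URL the feed was actually fetched from.
def canonical_feed_key(feed_text, fetch_url):
    first_url = None
    for line in feed_text.splitlines():
        if not line.startswith("#"):
            continue  # skip twt lines, only metadata comments matter here
        key, sep, value = line.lstrip("# ").partition("=")
        if not sep:
            continue
        key, value = key.strip(), value.strip()
        if key == "feed_id":
            return value              # unique random string wins over any URL
        if key == "url" and first_url is None:
            first_url = value         # remember only the first url field
    return first_url or fetch_url

feed = "# nick = example\n# url = https://example.com/twtxt.txt\n2025-01-01T12:00:00Z\thello"
print(canonical_feed_key(feed, "https://mirror.example.org/twtxt.txt"))
# -> https://example.com/twtxt.txt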
Looking forward to it! ✌️