ssh. https://mkws.sh is an OpenBSD VPS hosted by http://openbsd.amsterdam/, I do most of my work there. I run another small Void Linux VPS. I'm not that hardcore.
Win+;.
swallow effect. There's no need for text-mode graphics.
When calling $ /bin/sh -c "$(cat foo); foo", the $(cat foo) part is evaluated in the outer sh process, so the actual argument your sh invocation is getting is:

$ /bin/sh -c "foo() {
printf "Hello World"
}; foo"

You have the function definition there and the call, which works.

When calling $ /bin/sh -c '$(cat foo); foo', the $(cat foo) part is evaluated in your sh process, so what's actually happening is that $(cat foo) is trying to interpret the first "command" from the foo file, foo(), which is obviously not found. Then comes the foo function call, which is also not found because, again, there's no foo command anywhere; the function definition wasn't parsed correctly earlier.
$ cat foo
foo() { printf "Hello World" }
$ sh foo
foo[1]: syntax error: `{' unmatched

$ cat foo
foo() {
printf "Hello World"
}
$ /bin/sh -c "$(cat foo); foo"

Works well.

/bin/sh -c '$(cat foo); foo'

Doesn't. I guess it's how newlines are processed inside double quotes as opposed to single quotes.
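A minimal way to see that difference (exact error wording varies by shell; the printf commands here are just stand-ins for cat foo): with single quotes the inner sh expands $( ) itself and only field-splits the result into words, it never re-parses those words as shell code.

$ /bin/sh -c '$(printf "echo hi")'
hi
$ /bin/sh -c '$(printf "f() { :; }"); f'
sh: f(): not found
sh: f: not found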
Maybe run sed on those html files, create some temp files and pass those files to ./share/l.upphtml in the mkws main script?
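Something like this, maybe (a rough sketch, not actual mkws code; the substitution is only a placeholder):

for f in *.html
do
	tmp=$(mktemp) || exit 1
	sed 's/OLD/NEW/g' "$f" > "$tmp"   # placeholder sed expression
	# ...then hand "$tmp" to ./share/l.upphtml from the main mkws script
done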
t:

f() {
uname
}

t2:

#!/bin/sh

. ./t
f
ssh -T adi@REMOTE << EOF
$(cat t)
f
EOF

Since the EOF delimiter is unquoted, $(cat t) is expanded locally, so the remote shell receives the function definition followed by the call to f.
> After that, I tried to use a program bundled with someone else's shell script site generator to make my own, but I couldn't get around one of the absurd limitations of the original generator. I eventually conceded that I would have to drastically change the formatting of my website and continued working, and I very quickly ran into either a bug that I couldn't fix because I don't know C, or (more likely) an even more absurd limitation that I don't care to conform to.
Is this pp? Did you get pp: Buffer overflow?
sed 's/"/'\''/g'

For double quotes, I guess.
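Just to show what that substitution does:

$ printf '%s\n' 'say "hi"' | sed 's/"/'\''/g'
say 'hi'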
share/sitemap.uppxml:
<?xml version='1.0' encoding='UTF-8'?>
<urlset xmlns='http://www.sitemaps.org/schemas/sitemap/0.9'>
#!
for f in *.html
do
#!
<url>
<loc>$1/$(basename "$f")</loc>
<lastmod>$(lmt -f '%Y-%m-%dT%H:%M:%SZ' "$f" | cut -d' ' -f1)</lastmod>
<priority>1.0</priority>
</url>
#!
done
#!
</urlset>
lmt is scripted in there, also in ./share/l.upphtml. You can script lmt also in your Atom Feed. pp is called all over the place.
share/sitemap.uppxml:

<?xml version='1.0' encoding='UTF-8'?>
<urlset xmlns='http://www.sitemaps.org/schemas/sitemap/0.9'>
#!
find . -name '*.html' | while read f
do
#!
<url>
 <loc>$1/$(basename "$f" | pe)</loc>
 <lastmod>$(lmt -f '%Y-%m-%dT%H:%M:%SZ' "$f" | cut -d' ' -f1)</lastmod>
 <priority>1.0</priority>
</url>
#!
done
#!
</urlset>
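A possible way to render either template, assuming pp takes the template file as its first argument and passes the remaining arguments on as the template's positional parameters (so $1 becomes the base URL), run from the directory holding the generated html files:

pp share/sitemap.uppxml https://mkws.sh > sitemap.xml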
You only run mkws, the rest are dependencies. You don't run those manually and you can script them in your templates. It makes more sense to call pp $anotherfile than singlebinary render $anotherfile, I guess.
I don't want to embed entr in my static site generator, for instance; entr does its job very well. Nor another thttpd or some other webserver, or some weird websocket JavaScript live reload mechanism, or another markdown processor when we have smu, cmark, lowdown, all of these with a lengthy set of switches and knobs on a single binary.
Next you'd want Markdown support in that single binary, and live reload, and a webserver. There's Hugo for that. Maybe Hugo is better.
wget -O - https://mkws.sh/mkws@4.0.8.tgz | tar -xzvf -
Just pp and lmt. I really refuse to build another static site generator that has a webserver inside.
pp, lmt, mkws symlinks linking to one mkws main binary, or call one mkws main binary with the arguments mkws pp, mkws lmt, mkws main? Or just build a single binary with arguments like "everybody" does it.
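For the symlink idea, a minimal sketch of the dispatch (hypothetical paths, not how mkws is actually laid out):

#!/bin/sh
# Behave like pp, lmt or mkws depending on the name this script was
# invoked as, e.g. via pp and lmt symlinks pointing at it.
case "$(basename "$0")" in
pp)  exec /usr/local/libexec/mkws/pp "$@" ;;
lmt) exec /usr/local/libexec/mkws/lmt "$@" ;;
*)   exec /usr/local/libexec/mkws/mkws "$@" ;;
esac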
sh is a good fit as a templating language; it was built for massaging text, and that's what templating languages mostly do.
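A tiny illustration of the point, in plain sh rather than pp's template syntax:

#!/bin/sh
# Parameter and command substitution inside an unquoted heredoc is
# already most of what a templating language needs.
title="Hello World"
cat << EOF
<h1>${title}</h1>
<p>Generated on $(date)</p>
EOF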
I might add a tools, plugins or extensions section to the website. Also, you can check out some stuff from https://adi.tilde.institute/: linters, and cl, cbl and fl, which are CLI log "analytics" tools: uniques, visits, referers etc. Might rewrite them in C, not sure, but awk fits the bill pretty well. Some tinkering required for now.
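As a rough idea of the kind of job awk handles well here (a hypothetical one-liner, not the actual cl/cbl/fl code), counting unique client addresses in a common-log-format access log:

awk '!($1 in seen) { seen[$1] = 1; n++ } END { print n + 0, "unique visitors" }' access.log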
I don't want to add Markdown capabilities to the main binary, or add a web server, or add live reload to the main binary. I kept it as simple as possible; I don't believe I really have to combine everything in a single executable, a tree is not bad.
My ./share/l.upphtml file is https://clbin.com/phoub, where l.js is https://livejs.com/. This gives me a live reload env.
./bin/d script for development:
#!/bin/sh
export DEV=1
https &
find . -type f -name 'mkws' -o -name '*.upp*' | entr ./bin/mkws https://mkws.sh
https is https://clbin.com/tIIMk, entr is http://eradman.com/entrproject/.
There's also the main mkws script for customization. pp, lmt and mkws could be combined in a single shell file via shar, but that would give a complicated main mkws script I guess, or combine them all in a single static binary, but that would mean being unable to customize the mkws script. Anyway, give it a spin and let me know how it works.
pp and lmt, I guess?
> but it'd be cool to have just one file static linked executable that does it all.
Maybe yes, maybe no. It would complicate some stuff, like customization, you would have to add config files, and all is a broad scope. My idea was to distribute a full tree to and from which you can add/extract stuff to your preference. Read this also: https://twtxt.net/conv/hvygjbq. There are also other static site generators that deliver a single static executable: hugo, zola, saait.
They could be combined in a single sh file via https://man.openbsd.org/shar. I'll have to explore this.
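Something like this, maybe (file names are only examples):

shar mkws share/l.upphtml share/sitemap.uppxml > mkws.shar   # write a self-extracting sh archive
sh mkws.shar                                                 # running it recreates the files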
uudecode, not uuencode, this one:
ssh -T "$1" << EOF | uudecode
tar -czf - ws.sh|uuencode "\$f"
>&2 printf "Writing %s\\n" "\$f"

>&2 printf "Writing %s\\n" "\$f" goes to stderr and doesn't interfere with uudecode, which is getting its input from stdout.
4.0.9 is identical to 4.0.8. It's a bug.
ssh -T "$1" << EOF | uudecode
>&2 printf "Packing for %s\\n" "\$(uname)"
trap "rm -rf $tmp" EXIT INT HUP TERM
cd "$tmp"
...
f=mkws-"\$(uname | tr '[:upper:]' '[:lower:]')"@"$(echo "$5"| tr _ .)".tgz
tar -czf - ws.sh|uuencode "\$f"
>&2 printf "Writing %s\\n" "\$f"
EOF
}
To pack mkws's binaries on OpenBSD and Linux I run a set of commands via ssh, create a binary archive, pipe it to uuencode on the remote server and pipe it back to uudecode locally. A great use case for the uuencode and uudecode pair.
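A minimal local sketch of the same round trip, without the ssh part (assuming a ws.sh tree as in the script above):

# uuencode turns the tarball into plain text, uudecode recreates it
# under the name given as uuencode's argument.
tar -czf - ws.sh | uuencode ws.tgz | uudecode   # writes ./ws.tgz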