low key tempted to metaprogram all my metaprogramming notation cause you can infer their structure purely by parsing their names...

all the pipelining stuff fits on the screen without scrolling again which as we all know is the most important part of making something

i'm using π for confusingly many things now lmao in my defense it always means piping somehow

π the command is a boolean check for whether you are being piped to
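in plain bash terms, a hedged sketch of what that predicate could look like (assuming "being piped to" means stdin isn't a terminal; the name is_piped and the -t 0 test are my guesses at the mechanism, not the actual implementation):

```shell
# hypothetical stand-in for the π predicate: succeed when stdin is not a
# terminal, i.e. when something is (probably) piping into us
is_piped() { ! [ -t 0 ]; }

# usage: branch on whether we're mid-pipeline
echo frog | { is_piped && echo "piped" || echo "interactive"; }
# prints "piped"
```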

π-/-π are directional piping commands that i need to massively rework anyway so they don't count

π in the metaprogramming notation indicates a positional argument that can be taken from stdin instead (or regular positional if not piped to)

π the variable is used in the metaprogramming itself to store the pipeline prefix

but yeah i guess really what this is is a sort of compact syntax for capturing pipelining arity to chunk up the one big argument vector

regrettably i think i will have to properly parse the argument vector if i want the integral notation to actually be able to create a context for commands within. so i guess the next order of business is metaprogramming up some convenient parsing commands

and there's still unrealized metaprogramming that would make some things a lot simpler but i can't seem to see it

currently, writing pipelines is more ergonomic than building zsh stuff out of those pipelines because you have sort of a clash of calling conventions

it works for the simple cases but clearly i was underestimating the complexity so i need a better interface

no faster way to come up with a solution than to publicly admit defeat

the metaprogramming code is pretty gnarly but you can now compactly tell pipelines how to steal arguments from later in the argument vector and to divvy them up across the pipeline stages between you and the arguments. not terribly useful purely interactively, but now you can define shell functions in terms of pipelines with the ability to merge the calling conventions

namely because you can't get nothing out of seq like you can with most programming languages' sequence-generation facilities. it just counts backwards when the endpoints are reversed, and matching range endpoints return that number

now you can write the first version here instead of the second version, and it's more general to boot. ¿ basically lets you build pipeline stages out of whole pipelines more easily by specifying where to sneak their positional arguments into the pipeline

through the magic of cheating, i'm still going to call this a single screen's worth of implementation lmao

now that's the most unhinged shell scripting you're likely to ever come across lmao

the one bit of notation i dislike is that . as an argument to ¿:π is not the same interpretation as . as an argument to ¿ but like that's fine? cause it's a special case for ¿:π and the only other notation that would work better doesn't

i'm really proud of ¿:π actually. writing it (and ¿) really helped clarify what i'm concretely doing with the design of this language

my thesis is basically that typical shell pipelines become unergonomic to write outside some simple albeit very common cases. it's one thing to write a snappy all-pipes no-command-substitution pipeline when you wanna go through awk/sed/perl blah blah, but lots of things suddenly need to move information around a different way

let's use rev and basename as two motivating examples:

rev reverses whatever it sees on stdin, and passing it arguments treats them as filenames to read and act on, a very common genre of unix tool

basename acts on a positional argument and ignores stdin, another extremely common genre of unix tool

but if you ever want to start a pipeline with the reversal of a literal string, not a file, you need a silly little echo whatever | rev type deal.

likewise, if you want to take the basename of some slick pipeline you came up with, you usually end up doing something like basename $(that | slick | pipeline)

what i wanted was a way to capture those calling conventions in a uniform way so that i can automatically convert between them, and i have!

¿:π . rev which i call ÷ is a version of rev you can use to start a pipeline (at the cost of giving up the file opening capabilities, you'd have to open the file yourself in my scheme). these both print frog-status, but you can only substitute rev for ÷ in the first form

echo sutats-gorf | ÷
÷ sutats-gorf

¿:π π basename which i call ./ is a version of basename that you can pipe into or use normally. these both print frog-status as well, but you can only substitute basename for ./ in the second form

echo some/path/frog-status | ./
./ some/path/frog-status

so ¿:π is basically a tool to massage other shell utilities to help them conform to the most useful (imo) calling convention rather than whatever they happen to do
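as a rough plain-shell sketch of what ¿:π π basename (i.e. ./) amounts to (hedged: i'm keying on argument presence instead of the real piped-to check, and dot_slash is a made-up stand-in name):

```shell
# hedged sketch of ./: act on a positional argument when given one,
# otherwise read the path from stdin
dot_slash() {
  if [ $# -gt 0 ]; then
    basename "$1"
  else
    basename "$(cat)"
  fi
}

# both print frog-status:
echo some/path/frog-status | dot_slash
dot_slash some/path/frog-status
```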

the actual notation for ¿:π invocations works like this:

. is special, meaning "this takes no positional arguments and wants stdin, but if you don't have stdin, take a positional argument"

otherwise, the number of characters is how many positional arguments the command takes, represented by : and π characters. there can be up to one π and the corresponding positional argument to the command will be taken from stdin (if piped to, otherwise standard positional so you can start pipelines by passing all args)

on the other hand, ¿ is just for reaching ahead in the argument vector. periods represent an upcoming pipeline stage and colons between them represent where positional arguments get filled in so that a compound definition can pass its positional args to arbitrary locations in the pipeline

so ¿ .:..:. some four stage pipeline would effectively become some $1 four stage $2 pipeline when you give it two arguments
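to make the weaving concrete, here's a hedged bash sketch (weave is a hypothetical name; the real ¿ also has to actually run the result as a pipeline, which this skips — it only shows the argument placement):

```shell
# sketch of ¿'s argument weaving: dots pull the next pipeline word,
# colons pull the next positional argument
weave() {
  local spec=$1; shift
  local -a words out
  local dots=${spec//[!.]/} i w=0
  # the number of dots says how many upcoming words belong to the pipeline
  for ((i = 0; i < ${#dots}; i++)); do
    words+=("$1"); shift
  done
  # whatever's left over are the positional arguments to weave in
  for ((i = 0; i < ${#spec}; i++)); do
    if [ "${spec:i:1}" = . ]; then
      out+=("${words[w]}"); w=$((w + 1))
    else
      out+=("$1"); shift
    fi
  done
  printf '%s\n' "${out[*]}"
}

weave .:..:. some four stage pipeline A B
# → some A four stage B pipeline
```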

so, maybe a concrete example of this: real useful code i wrote, and what actually happens that makes me want to write code involving bullshit like ¿ .:...

actual real goal: move (unchunked) files around in a p2p fashion taking the fastest available peer, whether that looks like "the jellyfin client would like to switch to streaming from a local copy" or something more like lite torrenting

solution in my weird language: .@ .y ./ .!, which means download by hash (which also passes the hash down the pipeline), fetch hash info, trim to filename, register local file in the index of file hashes. that's literally what i type into my shell to download the output of stuff from another computer without necessarily having to care which of my computers it comes from.

neat, cool! very hard to have logic errors in such a short pipeline which is always good

but we run into a problem right away. fundamentally, how my little language here works is each command takes however many arguments it wants off the one big list that is The Pipeline and leaves the ones it doesn't care about on the end to be executed as the rest of the pipeline.
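roughly, a stage in plain bash terms looks something like this (hedged sketch: e is standing in for .e, and real stages would do real work instead of just emitting their argument):

```shell
# an ".e"-style stage: take one argument off the front of the argument
# vector, emit it, and run whatever's left over as the rest of the pipeline
e() {
  local val=$1; shift
  if [ $# -gt 0 ]; then
    printf '%s\n' "$val" | "$@"
  else
    printf '%s\n' "$val"
  fi
}

e frog-status tr 'a-z' 'A-Z'
# → FROG-STATUS
```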

interactively this doesn't present much of an issue; these are approximately equally ergonomic:

.e $hash .@ .y ./ .!
.@ $hash .y ./ .!

but when you go to pull .@ .y ./ .! out into a modular chunk of code, suddenly we lose the ability to start a pipeline with a literal hash without an initial echo, even though that's one of the things we were making all of these readability sacrifices for in the first place.

the issue is that .@ .y ./ .! some-argument is simply a different function than .@ some-argument .y ./ .!, but in this language, giving a name to .@ .y ./ .! as a whole means it can only ever see the former. so we have to work within that constraint somehow; there's simply no way for .@ to know how far it should look ahead for its argument unless we embed that information somehow.

enter ¿

¿ .:... means "run the following four (# of periods) element pipeline, and if you need a positional argument past that, pass it to the first element in the pipeline", so we can define .@! () { ¿ .:... .@ .y ./ .! $* } and now we have a modular version of that very short p2p filesharing operation that preserves the convenient ability to be used interactively with an initial argument while sharing its implementation directly with the pipelined version so there's no possibility of behavioral divergence
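in plain shell the same trick looks something like this (hedged: slug and the tr stages are stand-ins, and i'm checking argument count rather than doing ¿'s generic weaving):

```shell
# a named pipeline that still accepts an optional leading argument,
# mirroring what ¿ .:... buys the .@! definition
slug() {
  if [ $# -gt 0 ]; then
    printf '%s\n' "$1" | tr 'A-Z' 'a-z' | tr ' ' '-'
  else
    tr 'A-Z' 'a-z' | tr ' ' '-'
  fi
}

# both print frog-status
slug 'Frog Status'
echo 'Frog Status' | slug
```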

more generally, you'd run into this anytime you need to influence behavior at multiple stages in any pipeline you wanted to reuse in this language, so i really like that there's a mnemonic/glanceable way to check the arities and calling conventions of the various pieces. in some sense they're even usable as type annotations to check pipeline connections for sensibility

anyway these last four or five posts are perhaps slightly more approachable as far as what i've been working on lately if anyone has been holding off sharing their thoughts 👀

i guess really this could all be reasonably described as a dsl to make shell scripting concatenative

alternatively i suppose this is explicit syntax for the combinators j leaves implicit as hooks and forks, just in a shell pipeline context rather than binary operator context

also late last night i fixed the naming lol ¿ is now just the metaprogramming prefix, ¿:π is now just ¿π and ¿ is now ¿Ω (π for regular pipelining, Ω for when you need to look past the end for arguments)

that right there is another benefit to everything fitting on screen lol you can see everywhere you could possibly have to update changed names so you're like pshh easy

@wallhackio unironically my library of alexandria, right down to accidentally burning it down
