Winter MMXXII: An unintelligible tale of lost self
2022-12-24, post № 268
poetry, #winter, #short-story
2022-12-15, post № 267
software, #deprecation, #opinion, #bloat
“A hackable text editor for the 21st Century” once was the proud motto of yet another text editor’s working group [A22 [1]]; only it also served as the flagship of their shiny new platform. Web everything! went the joy-filled chants ushering in a new era of containerized Chromiums. Too elite to recognize UNIX, too elite to value Emacs. Too elite and dazzled by their own superiority stemming from one interpretation of Git, they bestowed upon us Atom.
Eleven years later [2], Atom gets slaughtered [G22 [3]]: GitHub got gobbled up and Redmond picked favorites; VS Code runs on the platform envisioned with Atom — like a mistletoe surviving the felling, it seems, for now, to be the one that stays.
I have fond memories of Atom: in early 2019, when I first dipped my toes into a crude Git workflow, necessitated by a university group project to build a Java compiler in Haskell, it was my tool. Not realizing, and at most tentatively asking myself about, the difference between a popular version control system and a no-value centralization effort built around that very system, my window into computing was a MacBook, and on it my window into conducted computing was GitHub Desktop and Atom.
Atom wasn’t too bad if one accepted the dulled GUI-y way of life [S99 [4], pp. 46–60: ‘The Interface Culture’]: its shaky foundation and an interpretation of JavaScript’s strengths whose only possible origin is besotted Web addicts aren’t apparent to the jaded mind. It felt right at home in the form-over-everything dystopia of macOS.
As such, I personally feel less a loss of software and more an anger at monopolization: as so often with this duplicitous Redmond-free software, we witnessed another instance of gulp, grief, let grieve. Sub-par software and design ethe weren’t dropped, they were set in stone, hammered shut and embossed with a mark of ownership.
2022-11-26, post № 266
opinion, #computing, #despair, #free-software
It must have been in last year’s late months when, while strolling through my library’s aisles, I glimpsed one unexpected cover which earnestly pulled me in: with bold, dark cyan letters on a once white background which by now radiated the decades it had spent shelved away, it read UNIX. One of the original works [KP84 [1]] on the approach to conducted computing now ever so dear to my heart: its authors instrumentally involved in its creation, I felt compelled to read an ad fontes account of the digital landscape of forty yesteryears.
For nearly a year now I have flipped through the yellowed pages of gorgeously typeset hardware descriptions, shell documentation, design rationale, operating system history, C listings and DSL showcases. This book accompanied me an entire year and despite hiatuses of several weeks at a time, when I did pick it up, it pulled me into a magical world of optimism and, dare I say, naïve exploration of live symbol shifters, their capabilities and aesthetic merits. I wistfully bethink myself of it: reading a story of thought, interleaved with prose of C. Slightly smirking at the supposed limitations of terse call stacks, notwithstanding true marvel at what had then been achieved.
I seem to have lost this spark. The raw awe of feeling text come alive on its own. The artful contention with the impossibility of an oracle but the dullness of an echo. The joy of playing this game: talking to a single dollar sign on a black void and therein watching my thought act, be, talk back.
Nowadays, the air of digitality becomes further and further intertwined with a frightened, unsmotherable and ever-present anticipation of existential demise. [F22 [2]]
I have furthermore come to the realisation that all I do is long for the origins of our stagnant over-technologized world. When at a rasterized photon spewer, I do everything to make it be a terminal. If I want my letters to have non-informational entropy attached to them, I incorporate calligraphy into my handwriting and enjoy the flowed ink. I see nothing more than byte transformers in these machines and for that the simple read and write syscalls are sufficient. All advances made in the field of hardware are about dubiously motivated reality approximation unit density and time warping, yet can still be described by ps. I don’t care for the web, for the obsessive skeuomorphic recontextualization of reality GUIs make one believe in. [S99 [3], pp. 46–60: ‘The Interface Culture’]
I even wonder if Stallman’s legacy is indeed a cult, its members only those wise enough to abstain from befouling their mouths with ignorance the likes of ‘FOSS’. A cult which has failed, its one and only message kneaded into obscurity through conformance.
But where to then? With churches crumbling and analogue life waning? With every last cranny infested by fakery? — Maybe the right choice is to hide in local woodland; all efforts to think otherwise seem so unbearably inane.
2022-10-29, post № 265
opinion, software-design, #package-manager, #dependency-management
Monolithic kernel or not, a base operating system is seldom equipped for all tasks a user expects to perform on it. As such, most OSes provide facilities to run custom executables on the machine, as well as to loosely manage them while they run. However, software acquisition is largely thought of as an extra layer on top of the base system, interwoven with it to various degrees: a userland package manager’s job.
Unfortunately, fueled by the manic lust for usability by a fabricated incompetent user, the classical package manager maxim (as acted upon by e. g. Advanced Package Tool (APT) [1], Pacman [2], Portage [3], Dandified YUM (DNF) [4], Zypper [5]) seems to fade in popularity, with large GNU/Linux distributions gravitating towards container bundles (see e. g. Flatpak [6], Snap [7], AppImage [8]): whilst their advantages of dependency encapsulation and supposed security benefits are never-endingly chanted by their proponents, in my view they are merely an ad-hoc solution. Instead of crafting solid software with thoughtful third-party inclusions and kempt versions of all parts (which would include replacing abandoned upstreams instead of freezing them together with their reason for abandonment indefinitely), a jumbled mess of any bit that has not evaded the grasp of GitHub is crammed into a bloated archive, coated in a generic runtime and made executable to fool an unsuspecting quick glance into believing one is installing a piece of software worth one’s while.
A cleaner approach which does not use the plethora of unwritten bits found on modern storage devices as a lazy cop-out to understand a package as a miniature virtual computer but nonetheless achieves the desirable property of package atomicity, solves the issue of competing library names or versions and even promises reproducible builds is to improve upon the classical maxim but not discard it (see e. g. Nix [9], Guix [10]). However, I feel like these projects get carried away by their founding principles, slowly encroaching on the entire system: understanding themselves as much more than merely a package manager, they become the distribution and even want to manage user-specific configs (see e. g. guix home [11]). As such, they fall into the trap of complexity and bloatedness, alienating those who came for a Unix experience.
Notably, Guix’s appeal does not stop at boot-level reproducible systems: it is the one entry on the FSF’s list of endorsed GNU/Linux distributions [12] which I managed to successfully use for a few days. The significance of the existence of such a distribution should not be underestimated, since — yet again unfortunately — the ideals of free software increasingly fall by the wayside with proprietary firmware, kernel blobs and spyware for the masses being accepted by many. Combined with GNU’s declining relevance [13], it might be that Stallman’s four freedoms [14] fail to capture a now relevant dimension of software; in my opinion a pertinent one is to restrict source complexity.
Filling this void, the concept of software minimalism, famously advocated for by suckless.org e.V., prohibits many of the driving forces behind abandoning free software principles: cutting-edge hardware pressure, bloat-cope and dependency trends to name a few.
If Guix has shown that a package manager can be used as a vehicle to preach a message and — for me more importantly — to comfort believers, I sense there to be an open niche for a package manager representing both software minimalism and software freedom.
Peculiarly, my proposed niche seems at first glance to be diametrically opposed to the core function of a package manager: managing complex package hierarchies. After all, many few-man-show projects choose to go with the in-memory installer route of curl shady.org | sudo sh. Leaving security concerns aside (should one have reason to assume malice, installation via less exploitable means is only a boon if one never runs what was installed), the lack of transparency makes uninstalling a daunting task and further obfuscates the user’s perception of their own system — contrary to the goals of free software.
Furthermore, the separation of program binary and run command (i. e. rc, config file) is blurring evermore in light of suckless-style architectures (see e. g. dmenu [15], dwm [16], XMonad [17], knôtM [18]) lifting configurations to the compilation stage, thus leading to a deluge of similar forks — a scenario classical package managers are not equipped to satisfactorily handle.
On the topic of architectural changes, the decrease in sheer source volume also makes dropping pre-compilation more attractive: Guix consoles the wary with the promise to only optimise reproducible building by injecting pre-compilations and cryptographically proving identity, whilst Portage bears the burden of hour-long compile times. Yet many package managers rely on binary distribution of clouded origins, which itself undermines the principles of free software: claiming to treat people in a free manner is not just about pointing at a licence file on a project’s homepage and executable poisoning is one of the least arduous vectors a miscreant could dream of.
Suffice it to say, software freedom isn’t worth a cent without the source, which is often unfairly treated like an addendum to the binary and thus forgotten about; never shipped. I think source compilation is paramount and beyond its conceptual merits, minimalistic software does not suffer from excruciating compile times, rendering compilation on delivery viable.
From a deeper architectural view, dependency management and the attitude towards dynamic libraries have also changed over the last decade: Go’s toolchain relies heavily on ad fontes source inclusion [C18 [19]] and the benefits of dynamic libraries are nowadays hard to make out [D20 [20]]. My few encounters with such system-wide dependencies of late have only been Google’s brotli wrapper for Go [21] (I am unsure if the hassle of relying on cgo is worth the presumed performance superiority) as well as the never-dying fossils Xlib [22] and ncurses [23].
Some research-stage approaches have gone even further, dissecting the concept of a source package into finely hash-indexed language-semantic atoms instead of collections of files (see e. g. Unison [24], Fragnix [25]). [26]
I surmise that the disappearance of a pressing need to save disk space, the evermore longed-for moral and security-related transparency, and the tremendous potential which e. g. dead code elimination, call-specific fusion and measurement-based custom optimization heuristics open up will help the proliferation and maturing of such heavily source-oriented software design approaches.
My own involvement in the area of package management stems from the realisation that my current dotfile management leaves a lot to be desired: fed up with the myriad of individual files I had to manually manage (a shell rc, an editor rc, color schemes for said editor, and various utilities either written in shell or C — thus also needing compilation), in 2021 I started to verbatim paste or base64-encode everything into my shell rc, effectively devising an ad-hoc archive format. [27] Whilst undoubtedly in line with the Unix tradition [KP84 [28], pp. 97–99: 3.9], it proved fiddly to manage and did not play well with version control. Thus, I began thinking about a unified framework which enables me to portably and reproducibly manage my configurations.
This ongoing effort has currently manifested itself in fleetingly [29], a light-weight package manager with the goal of un-intrusively and atomically adding functionality to a running system, journalling what it did to it. As such, fleetingly need not worry about system-integral or deeply dependent packages, nor about packages of great binary size. It represents my attempt to fill the above-described niche, paying attention to proper licensing, proper attribution, software freedom and software minimalism.
2022-10-01, post № 264
poetry, #camera
2022-09-03, post № 263
programming, #haskell, #order-theory
Task. Given a type $\alpha$, a finite array $x \in \alpha^n$ with $n \in \mathbb{N}$ together with a finite set of pairs $R \subseteq \alpha \times \alpha$, compute a permutation $\sigma \in S_n$ such that $x \circ \sigma$ respects the strict partial ordering induced by $R$, provided $R$ is loop-free.
Furthermore choose $\sigma$ to respect relative ordering of equal elements, that is $i < j \wedge x_{\sigma(i)} = x_{\sigma(j)} \implies \sigma(i) < \sigma(j)$.
An application of late is computing the execution order of inter-dependent transformers.
Possibly due to the seeming banality of the application, my many presumptuous first attempts were both of questionable algorithmic complexity and prone to failing in subtle ways, fostering bugs which only seldom showed their ugly faces. Fed up with hotfixed bypasses and manual pre-ordering, I badly longed for an implementation which indeed took some time to get right:
The heart of the approach I will herein present is computing a total ordering on all order-participating values that is compatible with $R$ and is thus an apt choice for a sorting key, attaining $\sigma$ by arbitrary extension to all of $x$.
For this, consider the digraph induced by $R$ and inductively deconstruct it to form an $R$-conforming array. Care must be taken, however, to correctly position elements left of a node of minimal degree three, as e. g. greedily seeking chains may disregard local element relations. As such, iteratively pluck an arbitrarily chosen minimal element, laying these out from left to right. Their minimality ensures that no element plucked later can contradict the ordering laid out so far.
When using a stable sorting algorithm which sorts under this total ordering, relative ordering is naturally respected as desired.
I wrote a quadratic implementation in Haskell that only stipulates Eq a: sort-via.hs. To achieve the complexity typically sought after for sorting tasks, I suspect an underlying total order on the elements is required for data management. Avoiding Haskell’s non-ADT containers and disappointing standard library support for random number generation, I wrote a C++ implementation and test suite testing both: sort-via.cpp, Makefile.
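To merely illustrate the idea rather than reproduce sort-via.hs, a minimal quadratic sketch stipulating only Eq a might look as follows; the names and the convention that a pair (x, y) demands x precede y are mine, not the post’s:

import Data.List (partition, sortOn)
import Data.Maybe (fromMaybe)

-- Repeatedly pluck an arbitrarily chosen minimal element of the digraph
-- induced by the relation, laying the plucked values out left to right.
linearise :: Eq a => [(a, a)] -> [a] -> [a]
linearise _   [] = []
linearise rel vs =
  case partition isMinimal vs of
    (m : _, _) -> m : linearise [p | p@(a, b) <- rel, a /= m, b /= m] (filter (/= m) vs)
    ([], _)    -> vs  -- the relation is not loop-free; give up gracefully
  where
    isMinimal v = null [a | (a, b) <- rel, b == v, a /= v, a `elem` vs]

-- Stable sort under an arbitrary extension of the obtained total ordering:
-- values the relation does not mention keep their mutual order and are
-- placed after the ordered ones.
sortVia :: Eq a => [(a, a)] -> [a] -> [a]
sortVia rel xs = sortOn key xs
  where participating = filter (\v -> any (\(a, b) -> a == v || b == v) rel) xs
        keys          = zip (linearise rel participating) [0 :: Int ..]
        key v         = fromMaybe maxBound (lookup v keys)

For instance, sortVia [('b','c'),('a','b')] "cabcab" evaluates to "aabbcc", equal letters keeping their relative order.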
2022-08-06, post № 262
programming, #haskell, #vim
Cognisant of the baggage associated with this timeless kids-game-turned-interview-question, sparing its hiring efficacy and undeniable tedium when implemented again and again in yet another C imitation, I believe that the lack of its demise is in part due to it being just shy of trivial with regard to all three pillars of imperative computing (control flow, i/o and data).
Contrasting a functional pearl — a dazzling S-Expression found out there in the conceptual cosmos — I want to describe fizzbuzz as an imperative nut: effortlessly consumable when bought already cracked and put into a bag on a store shelf, yet unexpectedly hard to crack into two clean halves by oneself. I feel its implementation often beckoning me, enticing me with forthcoming elegance, only to turn around and show its ugly face of cumbersomely construed i/o calls and disconcertingly intertwined execution paths.
Following an over-engineered approach in Haskell utilizing overlapping instances to not discriminate against index or periodicity (decoupled-fizzbuzz_overlapping-instances.hs), I chose to try the other extreme of thinking about both streams of different origins and only splicing them appropriately (decoupled-fizzbuzz_decoupled.hs). To further decrease apparent coupling, I finally hid the branching away inside a sort (decoupled-fizzbuzz_decoupled-ifless.hs):
import Data.List (sort)
main = mapM_ putStrLn . take 100 $ zipWith3 (\n s z -> head . reverse . sort $ [show n, s ++ z]) [1..] (cycle ["","","fizz"]) (cycle ["","","","","buzz"])
Fascinatingly, the often seen approach of leeching periodicity off of the indices’ arithmetic properties has vanished completely to the point of having two mutually oblivious data streams being merged based on their intrinsic willingness to provide non-empty data.
Thus born was the basis for a vim implementation of fizzbuzz. Not a vimscript implementation — which would presumably not bring anything new to the table — but in non-branching, linearly typed vim keystrokes (decoupled-fizzbuzz_decoupled.vim):
i1<Esc> qiyyp<C-A>q98@i qfkAfizz<Esc>kkq32@f qb4jAbuzz<Esc>jq19@b :%s/^\d*\ze[fb]<Enter>
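Read roughly: the first group seeds a lone 1; the recording and ninety-eight replays of @i (duplicate the line, increment the copy) yield the numbers 1 through 100; @f then climbs upward three lines at a time appending fizz, @b descends five lines at a time appending buzz, and the closing substitution strips the leading number from every line that gained a suffix.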
It is not yet clear to me how to transform arbitrary branching decisions into decoupled blind text manipulation tasks, where every branch has somehow become an effect of a textual arbiter introduced for a niche action. However, I currently entertain hopes of coaxing vim-y edits into a scripting language more attuned to editing tasks and with less translational friction than traditional tools including regexp+ and fully-fledged Turing complete programs can offer. The Kolmogorov problem of fizzbuzz is in my view a convincing demonstration of the presence of untapped potential in this domain.
2022-07-09, post № 261
version-control, freedom, #git, #proprietary, #seeking-refuge
As with so many naturally grown things, my tiny corner of the IT space I inhabit is, too, a local state. A maximum of sorts, it is a snapshot in time of my path meandering through this young, unexplored constructed world. Steps are often taken on a whim and thus not pondered for long, the juicy sign-on button all too elusive.
When I first signed on to the then independent Octocat service, it was with little care or need: my fellow students flocked there out of habit, yet no barrier was erected against convincing them of an alternative: our project a clean slate, and our university indeed offering a Git server. Now, nigh four years later, my public projects released and shared, the feline bought by one Big Brother and me having taken on a job centered around a repository hosted yonder, a fence has risen.
For long I dreamt of escaping, yet where to? Another bloated webby clone with all the same deceptive ties, just in a different coat of paint? No; Git’s bible [1] rightfully proclaims in its fourth chapter this proper unixy task’s ease, yet assumes a healthy management of keys, focussing on one sole project — not managing a few dozen. Coupled with the aforementioned trapping ties, leaving stayed a mere distant dream for months.
Yet dreams come true when acted upon, and action ought to be sparked. It was a fortnight past when I first read Drew DeVault’s GitHub Copilot and open source laundering [2], a text which threw me into an action frenzy: I could no longer bear to take part in this monopolistic pile of vigilantes, nor bear to help their efforts further. Though sprinting off is only half the story: all my repositories now seeking refuge, the question of where to grew louder.
With revitalized spirits, one needn’t fret: I coded up a thin SSH-Git authentication layer together with a Dumb HTTP Git protocol layer for public projects, around a thousand lines of Go strong. It is called gruau and is publicly served by itself [3], free for anyone to use or inspect and try to break into.
I was pleasantly surprised by what a profound impact the reclaiming of my Git repositories had on my connection with my data. I will surely try to never open a new repository on any of the lock-in services out there again. On the technical side, I found my own shallow plumbing solution to be around twice as fast when it comes to small exchanges, which are most likely dominated by handshake overhead. Aside from the moralistic reasons, this increase in remote Git snappiness alone would make me take on this journey again.
I wholeheartedly thank Drew DeVault for sparking the cinder.
One is invited to interpret my account of seeking refuge as a call to action. Yet, a shallow glance of introspection later, I sincerely do not aim to deflect anyone’s life’s trajectory. As such, this post should be understood as an outlet for my wretched digital encounters.