Enki – Sumerian god, Uniter of Languages.
I started using the Nix package manager and software deployment system as it was great for declaratively defining software deployments for Eilean. But I quickly ran into issues with Nix's operating-system-centric view of packages: like other system package managers (see Debian's APT, Arch's pacman, or OpenBSD's pkg_add) it maintains a coherent package set. Unlike those other package managers, it also packages language-ecosystem packages; and since it eschews the Filesystem Hierarchy Standard (FHS), if you want to depend on system packages you need to build a Nix derivation¹.
But unlike language package managers, Nix doesn't have version solving: it resolves dependencies to an exact version, and doesn't support expressing more complicated version constraints. This approach doesn't seem to scale to large, disparate open-source ecosystems; half the failures I encounter in Nixpkgs are due to incompatible versions of dependencies. As a result, a lot of Nix derivations are programmatically generated from the result of a resolution performed by language-ecosystem-specific tooling (be that with a lockfile or with Import From Derivation).
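To illustrate what those richer constraints look like, here's a small sketch using the Range type from the Rust pubgrub crate (0.2-style API), which comes up again below; the versions and the comparison to Cargo/Opam syntax are only illustrative.

```rust
use pubgrub::range::Range;
use pubgrub::version::SemanticVersion;

fn main() {
    // What a Nixpkgs-style package set effectively encodes:
    // one exact version of a dependency.
    let pinned: Range<SemanticVersion> = Range::exact((1, 2, 3));

    // The kind of constraint language package managers express: any 1.x
    // release from 1.2 onwards (roughly Cargo's "^1.2", or Opam's
    // ">= 1.2 & < 2.0"). A solver then has to pick versions satisfying
    // every such constraint in the dependency graph simultaneously.
    let compatible: Range<SemanticVersion> = Range::between((1, 2, 0), (2, 0, 0));

    let v = SemanticVersion::new(1, 4, 0);
    println!("1.4.0 in pinned:     {}", pinned.contains(&v)); // false
    println!("1.4.0 in compatible: {}", compatible.contains(&v)); // true
}
```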
I worked on a tool to generate Nix derivations from an Opam version resolution, Tweag's opam-nix, whilst building MirageOS unikernels with Nix. There are a lot of these language-ecosystem-to-Nix-derivation projects out there, with dream2nix aiming to provide a unified framework to build them.
Something that this approach doesn't work well for is multi-lingual projects. Projects have to vendor dependencies from foreign ecosystems and duplicate packaging to target other languages. This hinders visibility into dependencies and upgradeability: if there's a vulnerability in one of those dependencies, do you have to wait for upstream to re-vendor the updated version? All these package managers are functionally doing the same thing, with varying degrees of interoperability with build systems and compilers.
What if, instead of this ad-hoc and unversioned interoperability, we could resolve dependencies across ecosystems? Enki is a cross-ecosystem dependency solver using the PubGrub version-solving algorithm, which keeps track of the causality of conflicts, and is built on Rust PubGrub². We see a number of use cases for this system (a minimal sketch of a cross-ecosystem solve follows this list):
System dependencies: Language package managers have varying ways of interoperating with system package managers; Opam has the depext mechanism to express system dependencies, and Cargo has *-sys packages. Enki can add fine-grained and versioned system dependencies to language ecosystems. This enables us to, for example, solve for the smallest and most up-to-date container image that satisfies the system dependencies of a project. We can even encode the architecture in this version formula and solve for particular hardware. De-duplication of packages across ecosystems can be done with datasets such as repology-rules.
Cross-language dependencies: Instead of vendoring dependencies from other ecosystems or requiring separate solves in each, we can directly express dependencies across ecosystems and solve for the most up-to-date packages in each.
Portable lockfiles: By solving for all operating systems and architectures we can create truly portable lockfiles.
Vulnerability tracking: We can use this dependency graph to know what our dependencies are all the way down the chain, to create a complete Software Bill of Materials programmatically, and to track CVEs that appear in our dependencies. We can even envision monitoring vulnerabilities in our supply chain and dynamically solving and redeploying software to ensure continued secure operation. I'm interested in this for use in Eilean.
GPU hardware requirements: Dependencies can change depending on the hardware available for GPU workloads.
Agentic AI: Large Language Models (LLMs) that use tools often fail to interface with package managers. They fail to express version constraints on the most recent packages, or hallucinate packages which don't exist, exposing attack vectors. We've written an MCP server to make Enki available to AI agents, and plan to expand it to support a vector search across package metadata. This will enable agents to perform tasks such as resolving the system dependencies of a package to create a declarative Dockerfile, deciding on a language to use based on the packages available, and more.
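To make the idea concrete, here is a minimal sketch of a cross-ecosystem solve built directly on the pubgrub crate (0.2-style API). The "ecosystem:package" naming convention, the package names, and the version numbers are hypothetical illustrations rather than Enki's actual data model; a real solver would fetch candidate versions from each ecosystem's registry instead of an offline provider.

```rust
use pubgrub::range::Range;
use pubgrub::solver::{resolve, OfflineDependencyProvider};
use pubgrub::version::SemanticVersion;

fn main() {
    // Offline provider standing in for per-ecosystem registries
    // (crates.io, the Debian archive, ...). Packages and versions are made up.
    let mut deps = OfflineDependencyProvider::<&str, SemanticVersion>::new();

    // A Rust crate with a system dependency, expressed as an ordinary
    // versioned dependency in the same solve rather than an opaque depext.
    deps.add_dependencies(
        "cargo:openssl-sys",
        (0, 9, 0),
        vec![("debian:libssl-dev", Range::higher_than((3, 0, 0)))],
    );

    // Candidate versions of the system package offered by the distribution.
    deps.add_dependencies("debian:libssl-dev", (3, 0, 11), vec![]);
    deps.add_dependencies("debian:libssl-dev", (3, 2, 1), vec![]);

    // The project we're solving for, spanning both ecosystems.
    deps.add_dependencies(
        "project",
        (1, 0, 0),
        vec![("cargo:openssl-sys", Range::between((0, 9, 0), (0, 10, 0)))],
    );

    // PubGrub picks one version per package across ecosystems, or explains
    // the chain of conflicts when no solution exists. The resulting map is
    // the full transitive dependency closure: lockfile and SBOM material.
    let solution = resolve(&deps, "project", (1, 0, 0)).expect("no compatible versions");
    for (package, version) in &solution {
        println!("{package} {version}");
    }
}
```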
Once we have Enki resolving dependencies across ecosystems we can look at how we can provide them:
In a container: invoking ecosystem-specific tooling in a containerised environment such as Docker (a rough sketch of this follows below).
With Nix: all these ecosystem-to-Nix tools go through the Nix derivation language, but perhaps we could interface with the Nix store directly, enabled by RFC 0134, "Carve out a store-only Nix".
Docker is good for development, and Nix is good for deployment, but perhaps we could bridge the gap with Enki.
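As a rough illustration of the container route, here's a sketch that materialises a set of resolved system packages by driving apt inside a Docker container from Rust. The package list is a hypothetical output of an Enki solve, not something Enki produces today.

```rust
use std::process::Command;

fn main() -> std::io::Result<()> {
    // Hypothetical output of an Enki solve: the Debian packages a project's
    // system dependencies resolved to.
    let system_packages = ["libssl-dev", "pkg-config"];

    // Drive the ecosystem's own tooling (apt) inside a container rather than
    // reimplementing it: `docker run` a base image and install the resolved
    // packages there.
    let script = format!(
        "apt-get update && apt-get install -y {}",
        system_packages.join(" ")
    );
    let status = Command::new("docker")
        .args(["run", "--rm", "debian:bookworm", "sh", "-c"])
        .arg(&script)
        .status()?;

    if !status.success() {
        eprintln!("container provisioning failed: {status}");
    }
    Ok(())
}
```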