Debian Packages That Need Lovin' (debian.net)
201 points by spyc on Feb 20, 2021 | hide | past | favorite | 146 comments


I have been using Debian on and off since the late 90s, including some time creating packages. It was wonderful to be able to install a recent, working version of pretty much anything you wanted for the vast majority of that time.

More recently, so many things I want to use are not available as a reasonably up-to-date package. Some examples are hugo and eclipse, where the versions provided are unusably ancient.

https://lwn.net/Articles/842319/

Meanwhile, more and more software is actively hostile to packaging / distributions, and things seem to have devolved into grabbing things from random github repos, or various dedicated/language-specific package managers like npm, pip, brew, ...

It's definitely annoying and seems like a step backwards. It's not clear to me whether there's some better distro I could be using, whether some funding or volunteer time could help, or whether the world has just "moved on" (backwards...) from the idea of a Linux distribution with reasonably stable, up-to-date packages that "just work" for basic infrastructure, letting you spend your time developing your own project instead of on the tedium of fetching and installing software and managing version compatibility problems yourself.


I think the future is probably something more NixOS-like. Now, personally, I've tried it and found it a bit wanting UX-wise (and for really niche stuff), but for providing the cutting edge and the ability to roll back safely, I don't think it can be beaten. If you have databases, etc. that might need to be rolled back, things get more complicated, ofc.

Right now, I'm running Arch Linux with a small smattering of self-compiled stuff. Arch seems to actually be pretty stable, unless you're using their 'testing' repos... and it's very close to bleeding edge. Their secret, I think, is staying as-close-as-possible to upstream -- the trouble usually starts when distros start to add large patches. This has been a huge issue for me with Debian/Ubuntu.


It doesn't have to be one or the other.

I've been very happily using Guix as a package manager inside my Debian, thereby having a Debian with up-to-date versions of packages and access to all the other benefits of using a next-generation package manager (specifying release candidates, docker-like portable builds, plus emacs integration).

Just have to be slightly careful not to install packages via Guix that might cause conflicts, especially with Gnome. But if you do you can always just hop into a tty and `guix package --roll-back` to get back to a working build.

Maybe one day I'll migrate to Guix as a distro in itself rather than just a package manager within a foreign distro (Debian), but for now the setup works well, and it's also a nice way to get experience with Guix before diving into the deep end.


I have done this as well (with Nix rather than Guix, but it doesn't matter). You do run into some friction sometimes, e.g. the UI themes from your main distro not applying properly (because paths are all wrong).

It does basically work, but it can be a bit jarring, depending on exactly what you're trying to do.


> Their secret, I think, is staying as-close-as-possible to upstream

I've wondered about this. It would depend on upstream being sane about how it does things and where it puts them.

Is this always the case?


Usually, yes, it appears. Maintainers of projects are generally pretty invested in getting their software into popular distros... and so they tend to follow the patterns that would allow that. (I mean, stuff like 'where do I put the binaries' is still often configured in Arch and other distros, so we're talking about stuff outside of that kind of thing. Most packages aren't complicated enough to require anything outside of simple configure flags.)


I wish there were more standards & conventions here. It seems silly that every package needs to be manually packaged for Debian/Red Hat/Arch/etc., FreeBSD, Homebrew, and maybe Cygwin/whatever people use now. That's a whole lot of work, often repeated for every new version of the program! And all that work could vanish if there were a standard bundler.

I think modern package managers for programming languages show us the right approach here. A cargo binary crate can be built on any platform that Rust supports; it knows its dependencies, and compiler flags are passed in in a consistent way. I could imagine cargo having a one-button "publish crate to dpkg/homebrew/etc." feature. Or alternately, I could imagine Debian allowing crates.io to exist as part of the package namespace. Then you could "apt install crate/foo" and the "foo" package author wouldn't have to think about dpkg at all. (Maybe this already exists? It would be a sweet way to use the PPA system.)


> And all that work could vanish if there was a standard bundler.

The same balance of forces that mandate all of that toilsome work are exactly what would prevent any standard bundler from succeeding.

You can follow https://utcc.utoronto.ca/~cks/space/blog/linux/SnapsFlatpaks... and https://www.techrepublic.com/article/why-snap-and-flatpak-ar... and various related discussions around this and eventually I am confident you will conclude that while things may change (today there's docker in the mix as well, tomorrow there will be something else), the one-true-packaging-solution will never be invented.

There are simply far too many fundamentally conflicting interests to ever allow that level of standardization.

What you see now is more or less what you will always get. Or, perhaps, we'll see something worse (i.e. only fully locked down computing devices, only app stores with no sideloading capabilities, various other dystopian outcomes). I'm pretty sure we're not going to see something better though; we've had plenty of time...


Personally I feel like in many ways OSS got its shit together here over the past decade or so, especially when it comes to default directories and naming conventions etc., so maybe traditional distros might want to reconsider adding so many patches for everything. It's especially bad if it's security patches for unmaintained software, as they are often incomplete, and different distros all come up with their own patches whenever someone stumbles upon the bug, because why talk to others and try to prevent duplicate work...


I’ve been using Linuxbrew with Ubuntu (actually Pop!_OS). Linuxbrew is macOS’s Homebrew but for Linux, maintained by the same organization and using the same codebase.

If something needs to be installed system-wide, like dev headers for ncurses or something, I'll install it through apt. But for nearly everything else (git, Neovim, fzf, tmux, htop, ripgrep, ...) I install it through Linuxbrew.

Linuxbrew formulas are up to date, not ancient. I have yet to run into a problem where a formula only worked on macOS and not Linux. Also for homebrew core packages they publish pre-built bottles so that I don’t have to build everything from source.

It’s pretty great. Pop!_OS is pretty much zero-upkeep, and Linuxbrew has everything I need.

I’ve said it before but the only thing really missing to keep me from using Linux as my main OS with this combo is better desktop mouse sensitivity and keybindings that mimic macOS system-wide (Cmd-C, Cmd-V, ...)


With zfs and boot environments the whole rollback issue is solved. No need for nixos for that.


AFAIK, NixOS isn't just about rollbacks... Things like the ability to declare the state of your system (with all your apps) in a single file, and the possibility of having two different builds of the same program on the system, seem very futuristic and productive to me... I mean, this thing frees us from docker/vbox.

I think even the Linux kernel would be a component of NixOS and could be substituted with something else.

The whole idea is just an enterprise-proposal away from getting polished around the edges.


Definitely for most cases, yes... but 'data' is still an issue (if you care about the data). NixOS has an interesting mechanism for separating "package" upgrades from "data" upgrades, tho, so that's a slight advantage. (They basically keep around old binaries to be able to run and back up an old database and restore it on a newer version. Straightforward file system snapshots don't really do that.)

EDIT: Sorry, I feel I didn't explain this very well. The main point is: You can have multiple versions of all the software on your system, including stuff like PostgreSQL. That's actually very powerful.

Another potential problem is licensing, plus getting root-on-ZFS to work.

Personally, I'm really looking forward to an in-upstream-kernel FS like bcachefs. (I'm actually a fan of ZFS, we use it on our servers, etc. I still feel a bit uneasy about the licensing issues and the fact that it's not in-kernel.)


seems like zfs would be a low-level solution and nix would be a high-level solution.


After using Ubuntu for over a decade, I switched to Arch Linux.

If you exclude the duplicated architecture packages in the Ubuntu repos and include the community-maintained packages, Arch has more packages, and new packages commonly seem to be available within 24 hours of an upstream release.

For example, I use some utilities based on "rofi". A search for Ubuntu packages containing "rofi-" returns no results, but a search for Arch packages returns about 50 results.

https://packages.ubuntu.com/search?suite=groovy&section=all&...

https://aur.archlinux.org/packages/?O=0&SeB=n&K=rofi-&outdat...

AUR packages look easier to maintain than PPAs, so I'm more likely to get involved with packaging something on Arch than I was on Ubuntu.


Yes. However, Arch Linux means you need to spend time, probably several times per year, to fix stuff that worked yesterday but doesn't work anymore.

Arch Linux is just an entirely different thing compared to Ubuntu. I'd like someone who isn't me to make sure stuff works. Only very rarely will I be bothered to do actual work to upgrade to a new breaking version on my daily driver. Randomly breaking my shit on a Tuesday will make me install something more stable. On the rare occasion that I'd like to try something unstable, I can usually find a PPA or something. I don't like edges that make me bleed on my base OS.


Upgrading Arch Linux can be a mess... But Ubuntu is no picnic, either. I've seen plenty of glitches during Ubuntu 20.04 "LTS" upgrades. The most recent is it dropping the network in the middle of the 20.04.2 upgrade. Before that, it lost the default route (there was a bug with multiple interfaces.) There have been others.


If we're going anecdotal, Ubuntu has been nothing but great for me. I have left laptops off for months and then turned them on and update/upgrade them no problem, as well as computers that I use daily. I think the crux of the update issue is really how well package managers handle interdependencies between packages vs how many/what packages you have on your system.


These are servers with multiple Ethernet and bridge interfaces, primarily used for VM hosting. This requires no third party packages and is not anything non-standard or terribly exotic. I often wonder if Ubuntu tests configurations with anything beyond a single NIC.


Debian is more reliable than Ubuntu about upgrades between and within major versions and in configurations that aren't the most typical.

Still consult the release notes before upgrading between major versions in case there are relevant release-specific gotchas before/after/during, and for generally helpful precautionary advice, but quite often there aren't.


There's a continuum between something like Arch or Gentoo on one side and CentOS on the other. The amount of "pain" is roughly similar, but when you encounter it changes: it's usually easy to figure out WHAT broke on Gentoo, as it was probably the last thing you installed, whereas a move from CentOS 6 to 7 can be a huge undertaking (even more if you jump straight to 8).

But there definitely is something nice about general stability (until you have to install something on an old OS that requires a new package and you enter a whole new hell).


> Yes. However, Arch Linux means you need to spend time, probably several times per year, to fix stuff that worked yesterday but doesn't work anymore.

Largely a myth and config-dependent. I used to have this issue 10 years ago but now most of my troubles are self-inflicted. Things are as stable as they come when using software from the official repos.


I feel like a fairly competent Linux user and within 2 months had my Arch install broken to a point where I didn't want to invest the time to go down that "why is right click broken along with all these dependencies" rabbit hole. I went back to my mint/debian/Ubuntu safe space.


I installed Arch Linux on a ThinkPad in 2011 and it's literally running the same OS today, nearly 10 years later. I won't say I've never had anything break, but never anything major like audio or video or wifi.

In my experience, Arch is one of the most stable OS's I've seen. But I suspect a lot of it has to do with what you're doing with it. First, ThinkPads are generally well supported in Linux. Also, I've never seen any reason to use GNOME as a desktop. I used to run XFCE on Compiz, then i3 as Compiz stopped being updated. More recently, I've switched to sway, and I've finally started using PulseAudio because it's required for Zoom. Obviously, i3 and sway aren't for everyone. YMMV.


I've been thinking about leaving Fedora and your comment intrigues me. Does Zoom work well with i3? Can you share your screen? Regarding PulseAudio, I didn't realize Linux users had a choice about whether to use it - is there an alternate audio subsystem available?


You've already been told about ALSA (which Pulse uses and organizes, since ALSA is the low-level audio subsystem the kernel offers) and JACK, which specializes in low-latency audio. There's also sndio, ported from OpenBSD.

But the one you might want to consider instead is called PipeWire. It implements the ALSA, Pulse and Jack APIs with a single implementation and it's pretty much the future of audio (and video casting) on Linux.

If you're using Wayland, you're already using PipeWire, but likely not for audio. On Fedora you can follow the instructions here https://fedoraproject.org/wiki/Changes/DefaultPipeWire which basically boil down to "uninstall pulse and install these packages" and you'll be using PipeWire for your audio.

I had tried several times to switch to it and it was not ready, but about a week ago I've been able to make the move with no loss in functionality and even got a fix to an old bug that affected my sound card, so at least with version 0.3.22 I can recommend giving it a go. Switching back to Pulse if bugs arise is trivial anyway.


I never had any problem with Zoom on i3, but I never tried to share my screen.

Zoom on sway works, but since sway runs on top of Wayland and Zoom doesn't have the bindings to communicate with Wayland, I'm fairly positive screen sharing would not work. At least, not out of the box ... it might be possible to connect it to some other screen capture process if you're adventurous.

PulseAudio runs on top of ALSA. Traditionally, Linux desktop applications connect with ALSA and you can still manage ALSA directly. More recent applications, such as Zoom, expect to connect with Pulse and don't support connecting directly to ALSA.

From a user perspective, Pulse offers some advantages like being able to manage audio levels for different applications independently. In early releases, Pulse was unstable and difficult to troubleshoot, so it was easier just to use ALSA directly. Pulse has gotten a lot better in recent versions.


> Does Zoom work well with i3? Can you share your screen?

Yes and yes. Source: I run Zoom with i3.


Do you happen to know if Zoom is available on FreeBSD?


Alsa and JACK are major forces aside from PulseAudio.


I used to believe that a properly configured Arch Linux setup is super stable, until one day when I upgraded the Linux kernel and there were multiple GPU hangs and graphics glitches. It's fixed upstream now, but it does make me think again about running close to the bleeding edge.

Along the same lines, I didn't realize my Bluetooth driver was broken until I checked dmesg. This one hasn't been fixed upstream yet, but I don't need Bluetooth on my laptop anyway.


I think Arch is better for desktop sorts of things. Remember:

- desktop users want the latest thing, immediately.

- server users want nothing to change, ever.


As a desktop user, not so much, no.

I want things to work. I don’t want to spend my time arguing with my tools, just for the sake of having the latest and greatest. I might go through and upgrade everything every couple of years, at most.

Beyond that, I just want security upgrades to happen without me having to think about them.


> After using Ubuntu for over a decade, I switched to Arch Linux.

Hah! Very similar experience; see my sibling reply :)


> More recently, so many things I want to use are not available as a reasonably up-to-date package. Some examples are hugo and eclipse, where the versions provided are unusably ancient.

I understand your comment but Hugo is the wrong example, this package is actively maintained and is up to date in Debian unstable.

https://tracker.debian.org/pkg/hugo


Packaging Eclipse is something that _any_ distro will have a hard time with, and for good reasons, given the number of jars involved here.


This might just be evolution. For development, at least for me, the experience was never install a distro and some packages from the distro and start coding. There was always something that would require a bunch of bespoke installations. Libthis or libthat out of date. Files in the wrong place in the distro (or upstream developer's config). Old version of the database server.

Now, I think there's a pretty nice balance going on. Seems like the distros do a good job of getting you to a working GUI and command line, and we've had all kinds of new tools show up for getting to the latest/greatest for development. That said, I think the distros could do a much better job helping devops and admins understand how to use tools like apt and yum. Then there's the whole container thing, which really does give us a more universal way to distribute software (ok, maybe not snaps).


> things seem to have devolved into grabbing things from random github repos

Devolved from what? I believe from software not having any dependencies unless it absolutely must, because adding dependencies is such a royal pain. Software written in C and C++ has long been used to the system package manager being the language package manager, and the (UNIX) OS being its IDE.


Debian successfully packages a ton of Python software, notorious for its complicated dependency graphs. Has been for decades.

Debian packages a noticeable amount of C-based software which is just old. Emacs, for example. I suppose that skipping major versions (25 to 27) is considered unacceptable, and packaging and maintaining several versions is just proportionally harder.

(This is why my laptop now runs Void, and I get a fresh Firefox next day after the release.)


> Debian successfully packages a ton of Python software, notorious for its complicated dependency graphs.

I guess this depends on your definition of success, but from my perspective as a Python developer, it doesn't really. Most Python software I'm aware of encourages users to get it by other means (e.g. pip, Anaconda, containers), and anecdotally, the majority of users seem to do so. The occasional people who use apt cause frustration 'upstream' when they report bugs in a version that's 2 years out of date.

One package which I help maintain upstream had a security flaw with a CVE, left open in the Debian package for many months after we fixed it. No-one upstream is a Debian developer, and no-one in Debian was updating the package.

Python started out with a traditional model of applications sharing libraries, and is now moving in the direction of self-contained applications with separate dependencies. Languages like Javascript and Rust went straight for the latter model, so are probably even less amenable to Debian packaging.


> Debian successfully packages a ton of Python software, notorious for its complicated dependency graphs. Has been for decades.

But that's what I'm talking about. Suppose there is no software in Debian that uses a Python dependency you need, hence no one has packaged it. Now, instead of adding a line to your setup.py (or whatever Python devs use), you have to go through the entire adventure of cooking up a Debian package if you want to follow the Debian way. And then submitting it to the main repo is an adventure and commitment of its own, one most people are not ready to make.

Sure, that works to the benefit of everybody, not only your app. And that's the whole point of Debian distribution. Debian developers work on improving Debian. Not on publishing their software. This is the main difference between “app stores” and “package repositories” and “distribution”. Debian is a distribution, and a package repo is only a part of that.

But it kinda saddens me that every language ecosystem has their own package repository, and every distribution ends up having to do effectively busywork of adapting those packages to their distribution and its rules.


IMO, this moved forward with Linux at the OS level 20 years ago with apt and rpm.

Seems like now that hyper scale cloud vendors are displacing distros as gatekeeper, we’re going down a road towards niche packaging for languages, frameworks, etc.


Because for those languages, their runtimes are the platform, the underlying OS is irrelevant, as long as the runtime and associated libraries provide the required abstractions to deliver what is being asked for.

UNIX is C's platform, hence POSIX and its influence on C and C++ based OSes.


A clear example why platform languages always win.

C was born to make UNIX portable, and C++ was born a couple of years later in the same office rooms where UNIX came to life.

Any guest language introduces additional hurdles, debugging tools, binding libraries, editor support, and naturally OS package managers.


Maybe FreeBSD is an option?

Their packages repo has a “latest” which is very up to date. One can also use ports for more control and custom options etc


I have the same problem. While setting up home stuff recently, I was doing everything through Ansible on LXC containers and found myself doing a lot of building from source. I still continue to use it, but will explore other distros in the future.


Homebrew is meant to be language agnostic.


Packaging anything non-trivial for Debian is just too damn hard.

It took me 5 days to figure out how to package a complete web app with params, upgrades, post install transpiling, db init, etc.

And I haven't put that on a private repo yet, it's yet another annoying thing to do.

Nothing is well documented, the docs are old and confusing, the tooling is archaic and seems to want to inflict pain (debconf, anyone?), and the life cycle of a deb package is atrocious to get right.

And you have to do all that in raw bash scripts. Not that there are no alternatives; any scripting language is potentially usable, but the support is poor enough to deter you from them.

It's not that I don't want to contribute to the ecosystem, but I won't invest the colossal effort and exercise in frustration needed to overcome the barrier to entry. My packages don't even need to go to the official repo; just let me do my thing in peace.

Make a Python lib that lets you describe a package and hook into life-cycle events to run code, with clear, documented recipes for where to put which types of files, and let me run that to generate the deb. Even webpack is easier to use, for God's sake.

I'm not even touching the process of packaging something for inclusion in the Debian repositories here, which is another beast entirely.

Quit the smug act, debian packagers. You don't know better.

You do know better on how to design a distro and protect the official repository. Great. I praised you for that for decades.

But you know nothing about making your users life easy. You just don't. So ask them, and fix that, or don't complain about no contrib. This is not news, we raised our voices for years.

I contribute all the time to Foss in code and doc, I donate in mass. We ARE willing to help. And we do.

It's not us. It's you.


As for building packages for your own use, the tooling is not that difficult. The documentation is just oriented toward contributing packages to Debian itself, which implies some unwanted complexity and navigating outdated documentation. Packaging for Debian could be made simpler, but it's heavily rooted in having multiple files describing the package, using a tarball as the source package, and having each language come with its own helper and documentation.

Here is a "pragmatic" way to write Debian packages without relying on non-Debian tools: https://vincent.bernat.ch/en/blog/2019-pragmatic-debian-pack... (I am the author).

But other than that, I quite agree with your stance: we (Debian) make the life of our contributors harder than it should be. Unfortunately, there is little awareness of this in the project. People trying to change it just end up silencing themselves, like https://michael.stapelberg.ch/posts/2019-03-10-debian-windin....


I also made myself a deb packager script in 100 lines: https://github.com/qznc/simpledeb

It doesn't cover anything fancy like post-install actions though.


Thanks for this tutorial, it covers a lot.


> And you have to do all that in raw bash scripts.

If only that were true. The entrypoint in Debian packaging is a damn Makefile.
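For anyone who hasn't seen one: the Makefile in question is debian/rules, and with the modern dh sequencer it can be almost empty. A minimal sketch (the source tree name is hypothetical), written as shell here so the tab-sensitive recipe line is created exactly:

```shell
# Sketch: create a skeletal debian/rules for a hypothetical source tree.
# Note: make recipe lines must begin with a literal tab, hence printf.
mkdir -p demo-src/debian
printf '#!/usr/bin/make -f\n%%:\n\tdh $@\n' > demo-src/debian/rules
chmod 755 demo-src/debian/rules
cat demo-src/debian/rules
```

The catch-all `%:` target hands every step (configure, build, install, ...) to dh's sequence of helpers; customizations go in as override_dh_* targets.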

I've been a Debian Developer for 16 years, and creating packages for more than that. Saying that I like it or that there aren't problems would be the archetype of Stockholm Syndrome.


On the other hand, Debian is a large community of volunteers that has successfully provided a distro on democratic principles for a long time. Comparable distros are either companies (Ubuntu, Red Hat, SuSE) or the heroic deeds of one or a few people (Slackware). I find that amazing, even if it is heavily burdened by technical debt these days.


Oh, sure.

But that's beside the point: if what they really desire is contribution, which they have been claiming for years, they should make it less painful to package software for the platform, which we have been asking for years.


You have some points here, but the Bash part is _not_ the problem. Bash is everywhere, and the true fix for that bit is to learn Bash: it's not hard, and I benefit from my Bash skills every day; it's one of the best things to invest even a bit in. Just don't write "actual software" in Bash.


Oh, but bash is definitely part of the problem.

I'm pretty proficient in bash myself, and error handling, debugging, decoupling and refactoring in bash are all horrible. It's just not designed for scripts longer than a couple of lines; the fact that we do use it for that purpose doesn't make it good at it.

The fact that it has no namespaces, the lack of decent data structures, and the very limited functions all make it a poor foundation to build a lib on. Hence deb packages don't have any kind of framework for basic things: you are on your own.


I think many package managers take longer than a week to figure out. RPM is more difficult than deb, and setuptools takes months to understand fully and is changing constantly.

Meson, Conan, winget+msi, and the hundreds of other mixtures of build system + package manager also take long to understand. And you have to understand each of them.

Homebrew does not even want package authors to create packages themselves.

It may be a matter of perception: If you count all web searches, interactions with CI, conference talks that help you understand a more recent package manager, it will also add up to 5 days.

Creating a deb is more boring, it's just the machine and you.


I'm not counting the research I did prior to the 5 days to actually understand deb packaging, only the 5 days spent trying to package something.


Well, what about me just finding it a joy to work with? All the dh helpers basically do the hard work; I just need to update one or two things in the control file, and that's it.

I have packaged Clojure, Java, Python, PHP, C, and very complex pieces of software with weird version formats, and so far Debian packaging has covered all those cases for me.

I find it's pretty easy to complain about things we don't know, or to rant when frustrated, but investing a bit of time to learn and understand how it works pays off.


Back in the day, I created my packages manually (creating the TAR archives, the preinstall/postinstall scripts, and putting all of that in an AR archive, renaming it to .deb).

I never even read the documentation, I just did:

  $ file some.deb
  $ mkdir somedir && cd somedir
  $ ar x ../some.deb
  $ ls
Definitely not the recommended way, but I never spent more than 10min on a Debian package (installed with dpkg and not through repositories).
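Expanding on the trick above for the curious: a .deb really is just an ar archive with three members in a fixed order. A rough hand-rolled sketch (package name and contents are hypothetical, and this is definitely not Policy-compliant):

```shell
set -e
# Hypothetical payload: one shell script installed under /usr/local/bin
mkdir -p pkg/root/usr/local/bin pkg/DEBIAN
printf '#!/bin/sh\necho hello\n' > pkg/root/usr/local/bin/hello-demo
chmod 755 pkg/root/usr/local/bin/hello-demo

# Minimal control metadata
cat > pkg/DEBIAN/control <<'EOF'
Package: hello-demo
Version: 0.1
Architecture: all
Maintainer: nobody <nobody@example.com>
Description: demo package assembled by hand
EOF

# A .deb is an ar archive: debian-binary, control.tar.gz, data.tar.gz
printf '2.0\n' > debian-binary
tar -czf control.tar.gz -C pkg/DEBIAN ./control
tar -czf data.tar.gz -C pkg/root .
ar r hello-demo.deb debian-binary control.tar.gz data.tar.gz
ar t hello-demo.deb   # lists the three members
```

dpkg should accept the result, though real packages of course go through dpkg-deb or the dh tooling.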


I'll put it like this: Debian bureaucracy makes your usual governmental bureaucracy seem sane and sensible in comparison.

No accountability, tightly knit cabal with high barrier to entry, refractory to external opinions, etc


Yes. It’s a living breathing example of how Anarchy operates.

Which was the intention I believe - hence why https://packages.debian.org/stable/doc/anarchism is a package.


Creating arch or nix packages is a breeze in comparison. I'd argue nix is what everyone wants whether they know it or not. All this container junk is a recipe for disaster.


I had to create a custom package of some C++ header files. I failed. I was shocked at how hard it was.


For my (local) C++ projects, I use CMake to make .deb packages. It is easy enough, uninstalls cleanly, so does the job.
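For reference, the CMake route mostly boils down to a few CPack lines in CMakeLists.txt. A hedged sketch (project and package names are hypothetical), written out via shell here so the fragment can be inspected without running CMake:

```shell
# Append hypothetical CPack settings; running `cpack` in a configured
# build directory would then emit the .deb (cmake/dpkg not invoked here).
cat >> CMakeLists.txt <<'EOF'
install(DIRECTORY include/ DESTINATION include)  # header-only payload
set(CPACK_GENERATOR "DEB")
set(CPACK_PACKAGE_NAME "mylib-dev")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "nobody <nobody@example.com>")  # DEB generator needs a maintainer/contact
include(CPack)
EOF
grep CPACK CMakeLists.txt
```

The generated package uninstalls cleanly with dpkg -r, which is the main win over `make install`.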


As someone who's thinking about moving to Linux and has made many non-trivial contributions to Homebrew, the contribution process for Debian packages scares me. The official documentation seems to be more of a reference guide than a tutorial, and community blog posts always seem to start with "That other tool is outdated; here's the new way to do it," making it impossible to know which method is the right one.

IMO, what really needs some lovin' is the official onboarding process for new contributors.


Here's something to keep in mind that isn't super clear to newcomers: Linux != Debian. Linus Torvalds doesn't use Debian, and nothing Debian does really impacts him or the core of Linux. Debian is just a highly opinionated take on Linux. You can totally dive in and be super productive with Linux without ever having to know about, use, or care that Debian exists. (Oh, and Homebrew works great on Linux; give it a shot, as it's a really easy way to move over from OSX.)

Regarding the Debian package process, etc.--I'll be blunt and say that all of the pain you see is by design. It's not designed to be a super inclusive or friendly community for contributions. There are explicit gatekeeping checks in place to ensure everyone follows "the Debian way" (nevermind that there is no clear definition of "the Debian way" that anyone agrees on and it's only used to enforce power hierarchies in the project). It comes from an even older gatekeeping around "the Unix way". IMHO ignore all that rubbish.


"By Design" is a bit much. Debian loves people putting things in .deb files. There is a purposeful barrier to putting things in the main archive, but nobody wants there to be any barriers around tooling.

And I'll offer a defence of that barrier - Debian isn't about making an operating system. It is very explicitly making a free operating system. Listening to one Q&A session after a Stallman speech was more than enough to show me that there are a lot of capable developers out there who just don't get the idea of free. While those developers are a mighty force for good they need to be kept away from the Debian main archive. It isn't a place for them. They can go start their own project (eg: Ubuntu, Mint).


Debian is one of the distros that really takes package security seriously, which is a big plus for me.

> It comes from an even older gatekeeping around "the Unix way".

That's not gatekeeping, that's a very specifically-defined approach to operating systems.


I understand if you're taking issue with the term as a pejorative, but how is purposefully placing barriers for those that don't conform to the "specifically-defined approach" not definitionally gatekeeping?


I think of "gatekeeping" as blocking access, especially for something that doesn't have a defined membership, like a fandom. e.g. "you're not a real star wars fan if..."

whereas anyone is welcome to contribute to debian, they just have to follow the rules. there's kind of no meaning to the term "gatekeeping" if any organization that has rules to follow counts.


gatekeeping is used as a pejorative in cases where no gate ought to be. for example, in at least one email Linus claims that forbidding C++ in the kernel was also meant to exclude C++ developers; whether he was right or wrong, that is another case of gatekeeping


If you actually believe that, then all software specs are "gatekeeping".

But you do realize that I'm objecting to its use as a pejorative here.


I don't think it is "by design". Many Debian contributors are frustrated as well by the packaging processes and tools, but it turns out that it is not that easy to coordinate the work of thousands of people, most of whom contribute voluntarily in their free time. Debian was founded in the early '90s, developed by thousands and used by millions. It has tens of thousands of packages. You don't just migrate them to the newest packaging technology every day. Once upon a time the next big thing was Subversion. Now it is Git. What will it be next decade? It is difficult to keep up on volunteer work.

I don't mean to say that you have to accept it as it is. If you don't like it, just go somewhere else. If you are motivated to help fix stuff, that's wonderful. Just don't pretend we like to make stuff difficult for the sake of it.

Disclaimer: volunteer Debian Developer, always on short time to contribute.


It sounds like the focus on "the Debian way" is the cause for the symptom cataloged by the original article. The Homebrew maintainers are also notoriously picky (and for good reason), but I think it's the open nature of GitHub that has helped it grow so well. I wonder if opening the Debian gates a bit (at least superficially) would help encourage a healthier package ecosystem.


Like most things in software, it's ultimately just people and collaboration problems. I should be careful not to paint Debian in too bad of a light because ultimately it's what the people involved make it--good or bad. It's just that historically Debian enthusiasts have gravitated towards a lot of bad, old gatekeeping mindsets that have really made it difficult for the broader Linux community to grow. For example I think back to all the drama with the early days of Ubuntu and how many in the Debian community were vehemently against the idea that someone make a distro that's easier for people to use.


> For example I think back to all the drama with the early days of Ubuntu and how many in the Debian community were vehemently against the idea that someone make a distro that's easier for people to use.

That is disingenuous.

As much as I am annoyed by Debian gatekeeping, I am also grateful that there still exists one Linux distribution that's serious about its users' rights, aka software freedom, and that was able to properly discern, for instance, that Ubuntu was not so much going to make things easier for newbies installing it on their computers as it was going to make it easier for Ubuntu to force, say, Amazon crapware onto newbies' computers.

I hate Debian gatekeeping, yet one has to admit it's the last line of defense against app stores everywhere and the last living example of what user empowerment can be.


> the last line of defense against appstores everywhere

Package management on Debian has started to look much like an app store (in a good way) with Gnome Software[0], just without all the non-free stuff.

[0]: https://wiki.gnome.org/Apps/Software


Debian packaging being complex, there is a lot of documentation around it. Packaging practices also change, and it's difficult to know which documentation is up-to-date. However, the very official documentation from the project is kept up-to-date: https://www.debian.org/doc/manuals/maint-guide/ and https://www.debian.org/doc/manuals/developers-reference/. But, yes, they are long and only cover the basics. There is also https://www.debian.org/doc/manuals/packaging-tutorial/packag... which is kept up-to-date and is more in the tutorial style.

Every few years, there are various attempts at simplifying packaging, usually in the form of a more universal helper to build a package skeleton, but it doesn't really reduce the complexity and until now, none of these helpers replaced debmake.


Please note that maint-guide is considered deprecated by its author, who has switched to debmake-doc.


I found this article from 2 years ago a great read: https://michael.stapelberg.ch/posts/2019-03-10-debian-windin...


I once built a whole deployment system out of packaging all our services as Debian packages and running them out of our own apt repo. Once we got it working, this was a really low maintenance system and bringing new servers online was stupid easy.

Since then Debian packages have become easier to create and maintain. And it’s a great skill if you ever need to create e.g. a custom-compiled version of nginx or some such. It’s a really well thought out system and I am surprised it isn’t more widely used. By contrast Docker seems to be more portable but way more of a pain in the ass.
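For the custom-compiled case, the usual workflow is roughly this (a sketch, not a recipe: it assumes deb-src lines are enabled in your sources.list and that devscripts is installed for dch; your actual configure tweaks will differ):

```shell
# Rebuild nginx from the Debian source package with custom flags
sudo apt-get update
sudo apt-get build-dep nginx      # install the package's build dependencies
apt-get source nginx              # fetch and unpack the source package
cd nginx-*/

# Edit debian/rules to adjust ./configure flags, then record the change
# so your build gets a distinct version number:
dch --local "+custom" "Rebuilt with custom configure flags"

# Build unsigned binary packages; the .debs land in the parent directory
dpkg-buildpackage -us -uc -b
```

The resulting .debs can then go into your own apt repo, so every server gets your custom build through the normal upgrade path.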


I'm helping build that part at my current job, and the main problem I'm finding is that the documentation is not good at all. There's a lot of trial and error, outdated documentation and such, which makes it hard to get started.

I do agree that once it's setup properly with a CI pipeline, it becomes a breeze to install and maintain updated systems.


Yeah I remember the Debian docs being a little obscure. And debhelper is only so helpful.


Yeah, same here. It's really nice to be able to spin up a clean Ubuntu VM on any provider, add your repo and key, and be an apt install away from deployment. Makes it super easy for us to deploy dedicated instances for Enterprise customers as well - no exotic dependencies and no problem if they want on prem or wherever else.
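That provisioning step is small; something like the following (the URL, key path, and package name are all made up for illustration):

```shell
# Trust the repo's signing key
curl -fsSL https://apt.example.com/pubkey.gpg \
  | sudo gpg --dearmor -o /usr/share/keyrings/example-archive.gpg

# Register the repo, pinned to that key via signed-by
echo "deb [signed-by=/usr/share/keyrings/example-archive.gpg] https://apt.example.com stable main" \
  | sudo tee /etc/apt/sources.list.d/example.list

# From here, deployment is a normal package install
sudo apt update && sudo apt install example-service
```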


Being able to cleanly uninstall something that ships more than one file is reason enough for deb packaging for end-user software. Server software isn't as relevant, as the shift to containerized workloads has made server configurations "build once and toss".


I find that using it on the server is super nice. If you are using Ubuntu on the server anyways, why bother packaging a Docker service for anything else?


What do you recommend for learning that stuff?


I wish I could recommend something but I learned it years ago and I honestly don’t remember where from. I’ll try to put together a modern version of my old LUG talk on the subject to show people how to get started.


Yes plz. I'd be very interested to read about this (long time Debian user, but never packaged anything).

I know there are install, post-install, and remove scripts, but I don't have a clear idea of when exactly they will get called and what "interface" they must implement.

It would be really helpful to have a "package lifecycle" state diagram like this https://vuejs.org/v2/guide/instance.html#Lifecycle-Diagram that shows all the states and events that "fire" based on .deb scripts and metadata.


Like this? https://www.debian.org/doc/debian-policy/ap-flowcharts.html

I skimmed the New Maintainers' Guide for a rough idea of what to do and then used the Debian Policy Manual with the various debhelper manpages for specific references of what each field/script meant. I also dug into the source of some packages to see how they were packaged (for example, when I was creating an archive keyring). The two areas I had the most trouble with were deb triggers and debconf, which only seem to be documented on external sites.

New Maintainers’ Guide: https://www.debian.org/doc/manuals/maint-guide/
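For what it's worth, the script "interface" asked about above is small: dpkg invokes each maintainer script with an action name as its first argument (the full call sequences are in those Policy flowcharts). A minimal postinst sketch, with the actions hedged to the common ones:

```shell
#!/bin/sh
# debian/postinst -- dpkg calls this as: postinst <action> [<args>...]
set -e

case "$1" in
    configure)
        # Runs after the package's files have been unpacked;
        # $2 is the previously configured version (empty on first install).
        ;;
    abort-upgrade|abort-remove|abort-deconfigure)
        # Called when an upgrade/removal is rolled back partway.
        ;;
esac

# debhelper splices its generated snippets in place of this token:
#DEBHELPER#

exit 0
```

preinst, prerm, and postrm follow the same pattern with their own action names (install, upgrade, remove, purge, ...).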


If you sort by Installs this is kind of disturbing.

A lot of well known packages (Apache2 / OpenSSL / LibreOffice etc.) have no owner?

https://wnpp.debian.net/?sort=installs%2Fdesc&page=1


I think the list of packages that need lovin', needs lovin' itself. It seems this list is abandonware.

If you look into the details for some of these packages, you will see that it is a dead discussion forgotten even by the person who initiated it, a decade ago.

E.g. the discussion about Libreoffice is not about Libreoffice at all, but about Apache OpenOffice, and you can see that there are people who have offered help, but the OP never responded to their offers.

Just click on the links on the package names and read for yourself.


When I looked at the list, my first thought was amazement that LibreOffice has no maintainer.

EDIT: Previous HN discussion about LibreOffice's lack of resources: https://news.ycombinator.com/item?id=23793942


Column "owner" is the owner of the related issue (bug tracker ticket), not the owner of the package.


I think packages listed as "RFH" aren't really unmaintained, just requesting help maintaining them. Clicking through on e.g. grub2 shows a mailing list thread requesting help from…2004. grub2 has certainly received updates in Debian since then.


I was pretty shocked, given Debian is considered by some as just a bootloader for Apache.


Maintainers come and go over time and these things don't just package themselves.


On page 3 there is apache2 with more than 300k installs. The maintainer stated they have no more interest[1]. This seems like low-hanging fruit for a massive supply chain attack.

[1] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=910917


The thread shows how supply chain attacks are difficult with Debian: random people cannot really contribute to something like Apache 2 because they need to be sponsored by a Debian developer and they are being asked to do "boring" stuff before that, like triaging bugs and fixing them.

The RFH is mostly there to get more people involved, but the maintenance of Apache2 in Debian is going well and the latest version is currently packaged: https://tracker.debian.org/pkg/apache2.


That discussion is probably no longer relevant, because you can see on Salsa that people are pushing patches to the packaging: https://salsa.debian.org/apache-team/apache2


When you click on random (or not so random) packages like Libreoffice, it brings you to [0], where help was requested in 2007 and, as recently as Oct 2020, offers of help were ignored. Not sure how this 'process' works?

[0] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=419523


I'm a Debian developer. It can look a little chaotic from outside.

But there are a few simple principles operating here. Mostly they follow from Debian being a collection of 1000 developers, all equal. There is no CEO who can decide what's important and force someone to do something. Instead, Debian is a do-ocracy. If there is a dispute, it always takes the form of two Debian developers wanting to do different things. (This means users don't get a say. Not because Debian doesn't care, but because it's a do-ocracy and users can't do anything without going through the lengthy test and evaluation process that Debian insists all its developers must go through.)

Asking for help, then not taking up an offer, comes under this first rule: we don't have two Debian developers doing conflicting things, so there is no conflict. It may not be the nicest way to behave, but, shrug, no one can force someone to do something like supervise a newbie. After all, Debian developers are unpaid volunteers, and you can't force an unpaid volunteer to do something.

Let's take it further. Let's say we have two Debian developers proposing to do different things. Let's say a Debian developer wants to upload a new version of libreoffice. A second rule comes into play here: the package maintainer is king of his packages. He can resist just about anything provided he isn't breaking any policies. So provided the libreoffice package is in reasonably good condition - not too old, all CVEs fixed, yada yada - he can resist the new version. In fact I've resisted people wanting to update my package to a beta release, so I've done just that.

Now let's say the Debian packager of libreoffice has let it languish - in fact, languish so much it's accumulating CVEs. And worse, in order to resist he must actively delete the new version. So we have two people doing different things, and they conflict. At this point either developer can invoke Debian's dispute resolution mechanisms. Interestingly, the team doing the dispute resolution isn't allowed to do something themselves; they can only choose which (or perhaps neither) of the things done can proceed. In this case not updating a package accumulating CVEs breaks policy, so they would probably allow the developer uploading the new version to proceed.

This is a very different way of getting work done than people normally experience in their working life. A work environment is typically far more hierarchical and authoritarian. A boss can actually order someone to accept help, for example :). However, Debian's system is demonstrably every bit as good as Red Hat's. Security patches come out just as quickly, and it has far more packages. And clearly, as there are 1000 of us developers all pulling together, it can't be too hard a system to work in.


I understood the second rule, but could you please explain the first rule more clearly? This idea of a do-ocracy is very interesting to me.


I imagine you are asking why the grand architect included the second rule, as what it does seems clear enough.

The honest answer is I don't know. No one has explained it to me, and I wasn't there when it was born. But Debian being what it is, I'd be surprised if there was a grand architect, or if it's possible to point to a definite moment in time it became a thing. The formality Debian has acquired over the years looks to be attempts to codify existing practice after it was challenged. Codifying works to prevent recurrences of the unpleasantness associated with such challenges.

The practice has always been that someone volunteers to maintain a package; they expect nothing but the freedom to do the job in the way they see fit, without fear of interference, provided they follow policy, and no reward except the bouquets and brickbats for how it turned out. Put that way it's not much of a reward, but it's been enough to attract a lot of us to the task.

The importance of that reward can best be seen by looking at what selfish motivation is left if you took it away. Good deeds are great and all, but when they're done where nobody can see them, or where others can easily take over after you've put all the hard work in, or worse, claim credit for it, such hidden deeds are not great motivation for sticking at it for years.

We are all very aware of what binds us and we are protective of it. But human nature being what it is, it can't be surprising the second rule has been challenged on many occasions, which has led to it being codified in so many ways it's become part of Debian's DNA.


Sadly, in recent years I've seen a number of neglected Debian packages picked up by people who just wanted to pad their resume and say "I'm a Debian maintainer!". This has usually ended badly, with the new maintainer not caring about the userbase. They close all open bugs without having fixed anything, and break shit wholesale just so they can say it builds.

Please don't pick up a package just because you think it would be cool to be a maintainer. If you are not invested in the well-being of the userbase, you will get called out.


> If you are not invested in the well-being of the userbase, you will get called out.

The person doing this might get called out, but will the negative consequences of that calling out outweigh the resume padding benefit they experience?

If not, then I would expect this to keep happening.



You have a point here: I have made that^^ the new default sort order now.


Aren’t canonical and other distributions upstreaming changes to Debian?


If I want to help, what should I do?


Well, you need to get competent enough before you take on a role as package maintainer. For a start, imagine that you are one.

First, choose a package that really matters to you. On a daily basis. A package that you yourself would NEED to have up to date the day after a new version is out.

Get the source deb and try to build it. Usually it's not very hard.
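Concretely, that first step looks something like this (using GNU hello as a stand-in for whatever package you picked; assumes a deb-src line in sources.list and build-essential installed):

```shell
sudo apt-get update
sudo apt-get build-dep hello   # install the package's build dependencies
apt-get source hello           # download the .dsc + tarballs and unpack them
cd hello-*/
dpkg-buildpackage -us -uc -b   # unsigned binary build; .debs go to ../
sudo dpkg -i ../hello_*.deb    # install your own build
```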

Then, try to do the same on all the debian variants (unstable, stable, ...)

If you succeed and the package works, you have the prerequisite competencies, at least as a beginner.

Then, these are some of the things that you will need to do, quite often.

Change the configuration in the package description so that you can build it with a later version of the source - for all variants of the distro.

Then try the same with the package dependencies, where those dependencies are not needed for other packages.

Then try to push it as close to the latest source version as you can without breaking stuff.
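The devscripts tools help with that part. Assuming the package ships a debian/watch file, a sketch of the update cycle (all names and versions here are placeholders):

```shell
cd yourpackage-1.2.3/           # unpacked source of the current Debian version
uscan --verbose                 # consult debian/watch for a newer upstream tarball
uupdate ../yourpackage_1.3.0.orig.tar.gz   # graft debian/ onto the new upstream
cd ../yourpackage-1.3.0/
dpkg-buildpackage -us -uc -b    # rebuild and see what breaks
```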

Once you do all that without issues, and you are confident that you can use the version you built from source as a daily driver and are not bothered by that, contact the original maintainer directly and offer help, sharing the details of what you did as proof that you are competent enough.

If the maintainer is uncollaborative or unresponsive, put up your own repo for public use and invite the public to install the package from it, so that the public will see some benefit and you will receive a zillion bug reports to further strengthen your competence level.


That does seem like it would be a sensible way of going about things if I could manage to find a package of modest complexity that is important to me, but under-maintained.


Maintaining a package is not as hard as developing software. You don't need to know algorithms, you don't need to know programming; you just need to know how to run the compiler and packaging tools from the command line and read errors from the log.

If you are an active developer in the language that is used it will be easier, but you can do it even if you are not.

Example: today's news here is the latest release of Kodi. Debian does not have the most recent version. Some months ago I had an issue with the Kodi from the debian multimedia repo, so I decided to build it from source. It took me the better part of a day to get all the build tools installed and set everything up, starting from zero. But in the end I had the latest Kodi running, and it ran like that for 5 months; I even forgot that I built it.

I managed to do that without even looking into the source code. In fact if you ask me what programming language is used to develop Kodi - I really don't know and don't remember.

Package maintenance is more about testing than about building.

So a list like this can serve as a guide in the process of choosing where to help. Just sort by number of users that use the software and go down the list until you find something that you use daily and where you would like to have the most recent version available and would volunteer your effort for 2-3 years (it is not really helpful if maintainers change every couple of months).


Thank you. That is useful information.

I'll also have to find something my employer is happy with me working on, but I think you've pretty clearly laid out what I'd be signing up for, and I appreciate it.


A good way to find things you use that need help is the how-can-i-help tool:

https://wiki.debian.org/how-can-i-help
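It's a normal package; once installed it hooks into apt runs, and you can also invoke it directly (as I recall, --all re-lists everything rather than only items it hasn't shown before):

```shell
sudo apt install how-can-i-help
how-can-i-help --all   # orphaned/RFA/RFH and gift bugs among your installed packages
```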


Jump on the #debian-mentors IRC channel (OFTC) (also bridged to Matrix) and folks there can walk you through it.

PS: lots of different ways to help Debian:

https://www.debian.org/intro/help


Isn't it worrisome that something like openssl is listed as having no owner? Wouldn't a sneaky patch in something as low-level and widely-used as that have devastating consequences?

Is there another Linux distro that gets multiple eyeballs on (core) package changes and proper security reviews that you folks would recommend for daily driver?


Column "owner" is the owner of the related issue (bug tracker ticket), not the owner of the package.


Linux Standard Base recommends rpm support.

But the rpm package needs adoption and refers users to alien, while the alien package is orphaned.

Hah, looking at Wikipedia it seems LSB support was dropped by Ubuntu and Debian in November 2015.


From a quick scan of just the first half, it looks to me like libzstd1 might be the most deserving of help. But I could see libreoffice there.


Looking at orphans by installs and I remember using some of these 20 years ago, but do people even use libgpod and xmms anymore?


that's my first reaction to this list too - is a debian-specific package really providing more value than the effort of maintaining it for a lot of this software? even libgpod and xmms are relatively high-profile examples.

i do appreciate being able to apt-get install some obscure package and have often wondered how it's a sustainable system that maintainers are putting in effort to package these things up into a distro for me. but maybe it isn't?


I wonder if the install base is so high because 20 years ago some package maintainer decided to put XMMS in a gnome-apps (or whatever) metapackage, and no one removed it.

I don’t even know the last time I found a publicly accessible mp3 url stream, or listened to a local mp3 file. Local files had to be like 10 years ago.


Anyone else disturbed by this? These packages have root access for millions of computers and thousands of Fortune 500 companies and no one is maintaining them?


Seems like those Fortune 500 companies might think about making a donation or two?


If you aren't volunteering, don't be surprised that other people aren't either.


That's assuming only self-funded individuals can contribute in their spare time, but companies can dedicate staff too.


fair...then people shouldn't advertise debian as the most "stable" distro fwiw


why is eclipse so many versions behind the current version? I do not think it would hurt to update a little faster.


Why would you need a package for Eclipse? Download it as a ZIP, extract it in the location of your choice and run it from there. (Portable app)

I have been using it like that almost every day for 15 years now.


Eclipse isn't portable, somebody has to compile it for Linux, it just comes down to whether you want that done by the Eclipse Foundation or Debian people.


The discussion here was about why a debian package for Eclipse is not up to date.

My theory is that it is obviously because there is a way to run Eclipse without any issues and without the need for an officially tested debian package, so there is no huge demand.


That would be fine if Linux preserved backwards compatibility; the glibc version change last year meant that some binary packages no longer run.

Something like Netbeans doesn't need a debian package, but Eclipse is split into lots of jar files that each depend on some shared libraries.


Okay, so why don't you volunteer to update it?


If there's no people volunteering to update it, then there's nobody to update it a little faster.


Eclipse got removed from Debian because it became too hard to package.


holy shit, order by installs, Apache and sudo have no maintainers


Apache makes sense - it's a bit of a mammoth with complex packaging (as different components arrive in different packages). A cruise through the relevant list thread shows he's looking for a gradual handover, soliciting help, but being suitably picky about who takes over (eg, eagerness isn't the only job requirement).

Do click on the package titles to go through to the relevant thread. For most the packages you'd be worried about, what you'll see is either a well-reasoned handover of responsibilities, or a simple call for help.

(Or just look at the 'type' column - RFH is Request for Help, RFA is Request for Adoption. Important or complicated packages looking for more team members isn't a panic.)


Neither of those are listed as O(rphaned), they're RFA/RFH.



This is an off-topic question, but is Debian a good alternative to Gentoo? Over the last three years, I'm guessing, Gentoo has been so low on resources that they are just removing packages left and right. Every few months my install is broken because of this.

The main reason I'm still with Gentoo is inertia, and I really like the tools and just the way things work. The problem really is the portage tree (basically the repo), which has become more and more bare as the years go on. For example, I miss eix whenever I use debian; apt-cache just isn't as useful. Also, being able to not use systemd is a plus and a reason I won't use Arch. Any suggestions?


I think that's a trick question; Debian's outdated but stable binary software is in many ways completely antithetical to Gentoo's philosophy, but on the other hand, it's still Linux, and absolutely a viable alternative regardless. I'd love to hear some more opinions about the state of the portage tree though; as a new and naive, but dedicated, Gentoo user, I don't have a sense of its history or the state of the project's resources.


The thing I'm imagining is that if there was something I wanted the USE flags for, I'd just end up compiling it myself in a prefix. I already do this for some packages because of Gentoo removing things they consider stale, which is the problem.

I don't need USE flags all the time, or to compile everything with -O3 -march or whatever; for most things it's just a plus that it might be faster. I think the one thing that might actually matter for me is that in Gentoo fixing broken things is more direct - just compile - while I've had issues that persist on Ubuntu even after reinstalling something. It just felt like there was somehow "stale state" in my OS, which sounds ridiculous even to say. But, at this point, I can't handle breaks in portage because, while you can fix things, it requires mental effort on my part and I'm tired of it.

EDIT: reading your comment, if you're asking for my opinion I mean look at this in particular:

https://bugs.gentoo.org/420783

ioquake3 was literally removed from portage for this "bug", which, even if you accept that it's a real issue, has easy patches that were never included. It was removed anyway, essentially removing a host of open-source FPSes from portage! I even read a forum post talking about "upstream" not caring... upstream? ioquake3's release was charity; it's not like an active project.

This has been happening for a while. I've been here since 2008 or so, and the instability and removal of slots within months has gotten out of hand. Again, the tooling is great - emerge, eix, the whole system of configuration files is awesome - and I'm just used to openrc. I wish I could keep them, but the slot removals in portage have just become such a chore to deal with.


Gentoo packages, by virtue of USE flags, do seem like they'd take more effort to build and maintain, but I don't know as much about Gentoo packaging as I'd like to :p


I have packaged for both Debian and Gentoo; packaging for Gentoo is way easier and more fun, one reason being that (simplified) you're dealing with a single Bash file rather than a magic Makefile and things split into 20+ small files. The difference is huge.



