tagpy: Taking it over and doing a new release with wheels

3 Jan 2023

Python 

A couple of years ago I wrote a tool to work around issues with my DLNA player, and I intended to do a proper release of it, but there were a few blocking issues, one of which was that tagpy had become somewhat unmaintained while still being technically active. I was digging through some of my older unmerged PRs (as I do periodically) and noticed that the repo was now archived. For those who haven't seen this status on GitHub repos before, it puts the repo in a read-only but still accessible state, and is usually used as an "I'm not maintaining this anymore" marker. I emailed the owner because, although I could just do my own fork (because open source), he owned the PyPI page for it and I'd rather update things there as well. Luckily he was happy for me to take it over, and this post exists because I've now done a new release of it.

The other fun item here was trying to make the wheels for it. By default, Python packages are distributed as source, and for a package like tagpy with C++ extension work you need a compiler and all the relevant libraries set up locally every time you install it. This is slower to install and means more work for every user of the library, so instead there are Python wheels, which are precompiled versions of a library. How to make them properly is not exactly well documented (there's a 5 year old bug about fixing this), so I had to figure things out for myself. My first thought was static linking, which is something I've gotten to work before. I knew this wasn't necessarily going to be easy, as static libraries often aren't packaged and I'd probably have to build a lot of the libraries from scratch. However, even with the aid of a 3rd party action to install a precompiled Boost it was still looking pretty terrible (all the fun of errors like "relocation R_X86_64_PC32 against undefined symbol `PyList_Type' can not be used when making a shared object; recompile with -fPIC").

Maybe I was thinking about this wrongly? I decided to dig around in psycopg2-binary, as I knew it was another project with similar issues (C extensions for PostgreSQL work), so maybe they had figured out a good way to do this. It turns out the answer is "bundle shared libraries in the wheel", which sounds pretty terrible (yay vendoring). OTOH, it's the supported path: there's a series of PEPs documenting the manylinux "platform" (a set of defined symbols in C libraries that's compatible with a decent range of Linux distros), along with the auditwheel tool to both check you've built compliant packages and vendor in the shared libraries that are outside the spec. I haven't fully folded this into automatic releases, but there's a script to build the packages, run in a manylinux Docker image (currently a prebuilt one with Boost), that generates suitable wheels, which I'm uploading by hand with twine for the moment. If I was planning many releases of this I'd automate it, but I'm not expecting too many. I've only done this for manylinux_2_17, but that's the equivalent of manylinux2014, i.e. compatible with most distros released in the last 9 years.
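As a rough sketch of what that flow looks like (the container image, Python version, and paths here are assumptions for illustration, not the actual release script):

```shell
# Hypothetical sketch of the build-then-repair flow, run inside a
# manylinux2014 container; image tag and paths are assumptions.
docker run --rm -v "$PWD":/io quay.io/pypa/manylinux2014_x86_64 /bin/bash -c '
    # Build a wheel against one of the container Pythons (3.9 as an example).
    /opt/python/cp39-cp39/bin/pip wheel /io -w /tmp/raw-wheels

    # auditwheel checks the wheel against the manylinux symbol policy and
    # copies any out-of-spec shared libraries (e.g. taglib, Boost) into it,
    # retagging it as manylinux_2_17 / manylinux2014.
    auditwheel repair /tmp/raw-wheels/tagpy-*.whl -w /io/dist
'

# Then the repaired wheels get uploaded by hand.
twine upload dist/*.whl
```

The manylinux images ship with auditwheel and a range of CPython builds preinstalled, which is what makes this the path of least resistance compared to static linking.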

So, net result, if you're running Python 3.7-3.9 on x86 Linux, pip install tagpy now just works (which solves my core use case). For everyone else, you still need the libraries/compiler installed first, but then the pip install bit also still works :) This is automagically tested on Fedora, Ubuntu, Alpine and some of the manylinux baselines in CI. macOS/Windows aren't covered, but patches welcome!

Other notable new features are:
