Python packages using Nim

This blog post describes a way to create Python packages using Nim.

Build example Nim module

We start by implementing a simple compiled Python module using nimpy. First, install Nim, then install nimpy with nimble install nimpy.

Create a file named mymodule.nim. Note that this filename will match the module name that you will import from Python.

import nimpy

proc greet(name: string): string {.exportpy.} =
  return "Hello, " & name & "!"

Then compile the module.

# Compile on Windows:
nim c --threads:on --app:lib --out:mymodule.pyd mymodule
# Compile on everything else:
nim c --threads:on --app:lib mymodule

Try calling this Nim module

We can now try this module using the following Python example. Copy this text into a Python file in the same directory and run it with python:

import mymodule

print(mymodule.greet("World"))

Build Python package

Now we want to package this module, for publishing to PyPI. First, create the directory structure required for the module publishing project.

  • example_nim_pkg/
    • __init__.py, which is simply a blank file.
    • mymodule.nim
    • mymodule.so or mymodule.pyd depending on platform.
  • LICENSE
  • README.md
  • setup.py

The content of the LICENSE and README.md are up to you. Example content for setup.py is as follows:

import setuptools
from setuptools import Extension

with open("README.md", "r") as fh:
    long_description = fh.read()

setuptools.setup(
    name="example-nim-pkg-stever",
    version="0.0.1",
    author="Steven Robertson",
    description="A small example package",
    long_description=long_description,
    long_description_content_type="text/markdown",
    packages=setuptools.find_packages(),
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
    ],
    package_data={
        "": ["*.so", "*.pyd"]
    },
    ext_modules=[
        Extension(
            name="dummy",
            sources=["dummy.c"],
        )
    ],
)

Change the name, author, and description values to your own details. The package name should include your own PyPI username.

At this point we will create and activate a virtualenv environment, install the required packages for publishing, and then build the package.

Prepare your virtualenv environment:

# pip install virtualenv
virtualenv venv
source ./venv/bin/activate # on Linux and macOS etc.
# On Windows activate with 'venv\Scripts\activate'

Build the package:

python setup.py sdist bdist_wheel

The above command requires a build environment. On Windows this will probably mean Visual Studio with C build tools. The reason is that we build a dummy C extension in the process, which forces a platform-specific wheel to be created. This is important because the Nim extension is built for a single platform, and using this technique we can create and include wheels for other platforms.

Publish the Python package

It must be noted here that a package uploaded to PyPI with only a Windows wheel will only work on Windows. With Azure Artifacts you can publish packages that include an additional wheel built for Linux by following exactly what is described here, with no extra steps. For PyPI, however, packages including Linux wheels may need to use manylinux, which is out of the scope of this article; that can be done using a Docker image to prepare the Linux package, and might be covered in a later post. I initially used Azure Artifacts, then wrote these instructions with PyPI as the example package host.

Upload the package

We use twine to upload the package:

pip install twine
twine upload dist/*

The last command will prompt you for your PyPI username & password. Following this your package will be published to the public index.

Including wheels for multiple platforms

Running python setup.py sdist bdist_wheel on each platform creates a platform-specific wheel in the dist/ sub-folder. Copy these wheels into the same dist/ sub-folder on the platform from which you publish the package; the published package will then include a wheel for each platform.
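As a quick sanity check before uploading, you can list the wheels gathered in dist/ to confirm each target platform is represented. This is a small helper sketch of my own, not part of the project above:

```python
from pathlib import Path

def list_wheels(dist_dir="dist"):
    """Return the wheel filenames found in the given dist/ folder."""
    return sorted(p.name for p in Path(dist_dir).glob("*.whl"))

# Each platform should contribute its own wheel, e.g. one win_amd64
# build and one linux_x86_64 build of the same version.
print(list_wheels())
```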

Using the published package

Now, we will create a new shell window and a new virtualenv environment to install this package and try it out.

# activate virtualenv as shown above
pip install example-nim-pkg-stever

The following Python file can be used to call out to the Nim module:

from example_nim_pkg import mymodule

print(mymodule.greet("World"))


This article demonstrates a way to create multi-platform packages using Nim modules. This seems like a useful way to package Nim modules and consume them in Python scripts.

Source-code for this project is available here.

Docker on WSL 2

I had wondered whether one of the biggest motivations for WSL 2 would be improved Docker support when building Linux images on Windows. I'm glad to report that this now appears to be seamless.

When building Linux Docker images on Windows this message would appear:

SECURITY WARNING: You are building a Docker image from Windows against a non-Windows Docker host. All files and directories added to build context will have '-rwxr-xr-x' permissions. It is recommended to double check and reset permissions for sensitive files and directories.

I noted earlier that Docker for Windows automatically detected the availability of WSL 2 and configured itself accordingly. I found that the docker command was already available. Looking at the output of htop, it is clear that this runs as a proxy to the Docker for Windows install.

/mnt/wsl/docker-desktop/docker-desktop-proxy --distro-name Ubuntu-20.04 --docker-desktop-root /mnt/wsl/docker-desktop

Testing this out, I can use WSL 2 to build Linux images that I am preparing in Windows, avoiding the warning shown earlier. However, the warning would be expected to still apply when using the Windows file-system. Access to the WSL file-system from Explorer is done via \\wsl$.


Welcome to WSL 2

I’ve been waiting for Windows 10 to be ready to update, and receive WSL 2. Now it is here. Docker for Windows was nice enough to set-up with this automatically. I am setting up a fresh Ubuntu 20.04 install just now, to use with Windows Terminal.

I had enjoyed using WSL 1, and found it useful as far as it went. It was a bit slow, and not everything worked. WSL 2 takes a different approach, embedding a whole Linux kernel instead of implementing a Linux kernel interface. I expect this to be very useful when working with Linux Docker images and the like. Hopefully the experience will be frictionless.

JS GraphQL IntelliJ Plugin – Developer Guide

To use the JS GraphQL IntelliJ Plugin an important bit of info for me was creating a .graphqlconfig file at the root of the project. This is where this plugin picks up the schema.

{
  "name": "Dashboard Schema",
  "schemaPath": "schema.json",
  "extensions": {
    "endpoints": {
      "Local GraphQL Endpoint": {
        "url": "http://localhost:5004/graphql",
        "headers": {
          "user-agent": "JS GraphQL"
        },
        "introspect": true
      }
    }
  }
}

There’s also more about this configuration file here.

Fixing a PS4 controller that won’t charge

Having browsed a few articles, the wobbly USB connector and failure to charge two PS4 controllers indicated a problem that may be difficult to repair. I wasn’t keen on hard-wiring the battery to the USB cable either!

Many suggestions seemed to conclude simply having to replace the controller in this case, until I found the link above with the simplest and best solution, a charging dock that uses the connector at the other side. Excellent. That should keep these good for some time yet.

Relay, & GraphQL for .NET

I am very interested in GraphQL and I am currently working with GraphQL in ASP.NET Core 3.1 using GraphQL for .NET and Relay. It hasn’t been immediately clear what I needed to do in order to get up and running.

If you ignore Relay, things can be a lot easier. There’s a good example in the source for GraphQL.EntityFramework which includes GraphQL subscription too – which is nice. However, if you want to use Relay properly it isn’t immediately obvious that this isn’t currently possible with Entity Framework. If you are aiming to use Relay, this article provides a paging solution using SqlKata.

Getting started with Relay, the Quick Start Guide is fairly helpful, particularly Installation and Setup for the npm packages and build configuration.

Having created a GraphQL schema on the server using a code-first approach I was initially a bit confused by the requirement for a schema.graphql file. For a little time I thought I was going to have to recreate the schema manually, and might have been best going initially with a schema-first approach instead. However, reading How to Retrieve a GraphQL Schema was very helpful, once I discovered that the schema file can be a .graphql or a .json file!

To use the Relay Compiler, you need either a .graphql or .json GraphQL schema file, describing your GraphQL server’s API. Typically these files are local representations of a server source of truth and are not edited directly.

We can use a GraphQL query to get the JSON version of the schema directly from the server endpoint.
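As an illustration, the introspection request can be made with nothing but the Python standard library. This is a sketch under my own assumptions: the URL is a placeholder for your endpoint, and the query shown is a shortened form of the full introspection query that tools like relay-compiler actually expect.

```python
import json
import urllib.request

# A shortened introspection query, just to illustrate the shape of the
# request; real tooling sends the full introspection query.
INTROSPECTION_QUERY = "{ __schema { types { name } } }"

def build_request(url):
    """Prepare a POST request carrying the query as a JSON body."""
    payload = json.dumps({"query": INTROSPECTION_QUERY}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

# With the server running, the response body could be saved as schema.json:
# with urllib.request.urlopen(build_request("http://localhost:5004/graphql")) as resp:
#     with open("schema.json", "w") as out:
#         json.dump(json.load(resp), out, indent=2)
```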


Now, with the following relay-compiler command you can generate the framework that’s required for the Relay queries to work.

relay-compiler --src ./src --schema ./schema.json

This doesn’t replace a step-by-step guide, but it is a quick summary of some important details that I found were not clearly noted elsewhere.

I will write more about GraphQL again later. I think sometimes it is easier when there is a working example to learn from. I will think about providing an example to illustrate a step-by-step guide in more detail, perhaps.

It really is nice working with GraphiQL and Voyager. There are other alternative interfaces too, but I like these two.

This is the Voyager UI. It’s a pleasure to interact with. Very useful.

Introduction to computers and computer science

I was reminiscing about my early interest in computers. I remembered a book with a red cover, that I thought was about business computing. It was an interesting book and it wasn’t about specific computers but more abstract concepts of computing solutions. I was very young when I read it. It may have been given to me by my grandparents, who were both teachers, and encouraged me to follow my interest in computers and think of it as a future career, or I might have got it from the library.

Looking around online, doubtful that I could find the book with such a vague memory, I found this book, “Introduction to computers and computer science”, which was published in 1972. The book was immediately recognisable, though I had thought the title was different. To a child, this probably did seem very much like a business computing type of book, and that’s how I remembered it.

I’ve ordered a copy of this from a bookstore in the US. It will be interesting to see if this was the book I remember.

The book I had read wrote about people and their resistance to computers, for fear of losing their jobs. This can still be a problem, perhaps more so now with the prospect of AI and the perception, and some claims, about how jobs will evolve or be replaced. I can appreciate there is some truth to it, though I think people gain tools to do more, remain competitive, and develop new abilities.

Rich is a Python library for rich text

This is a screenshot of the output in Windows Terminal (PowerShell)

The output in the screenshot above is from the following commands, with Python and virtualenv already installed.

virtualenv venv
.\venv\Scripts\activate  # on Windows; use 'source ./venv/bin/activate' elsewhere
pip install rich
python -m rich

Very nice. I will use this in Python scripts.
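As a minimal sketch of using Rich from a script rather than the demo module, using Rich's Console API (the markup string below is just my own example):

```python
from rich.console import Console

console = Console()

# Console markup styles parts of the string; data structures are
# pretty-printed with highlighting.
console.print("Hello, [bold magenta]World[/bold magenta]!")
console.print({"name": "rich", "nice": True})
```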

Neat, this also works in Jupyter Notebooks. Read more about it here.