Thinking about building your first CLI Python app? Meet Clibato, my first CLI Python app, which I built to back up and restore my dotfiles. I wrote the tool to get a taste of the Python ecosystem, and to cheer myself up by writing some nice, test-driven, object-oriented code. In this article, I share my experience of building Clibato and the problems I faced along the way.

Around the beginning of 2021, I felt a strange hunger to write code. I wrote some small apps in Ruby; however, my curiosity wouldn’t leave me in peace until I had explored the territories of the serpent 🐍.

Though I still love Ruby, the Ruby job market isn’t very inviting, even to enthusiasts who would gladly embrace it.

Thus, I decided to leave my comfort zone and write some Python code. As a matter of fact, I received an invitation to Google Foobar while writing this app, but that’s a different story. So, let’s get down to business and talk about the real stuff.

Two minute version

  • Used Docker (via Lando) for the development environment.
  • The project has two main directories: clibato and test.
  • Running python -m clibato runs clibato/__main__.py.
  • Noteworthy dependencies:
    • Nose for running tests.
    • Pylint for coding standards.
    • MyPy for type-checking.
    • pathlib.Path for managing OS-agnostic paths.
  • See the Clibato source code on GitHub.
  • Install and use the app with pip install clibato.

Specifications

To write fragrant, object-oriented, well-tested Python code, I needed a project. So, I thought of building Clibato – CLI Backup Tool.

Since projects aren’t fun without deadlines, I set a deadline of 2 weeks and created a list of initial features.

It is often tempting to over-complicate pet projects; however, I strictly followed the YAGNI rule and built only the essential features.

  • Support YAML configuration files that define:
    • Contents of the backup.
    • Backup method or destination.
  • Support backing up to a directory or a git repository.
  • Support restoring the last backup.
  • Test most of the code/features; red/green/refactor.
  • Upload the package to PyPI.

Here's how Clibato looks in action.

$ clibato -c ~/.clibato.yml backup
Backed up: .clibato.yml
Backed up: .zshrc
Backed up: .vimrc
Backed up: .bash_profile
Backed up: .bashrc
Backup completed.

Docker environment

Reference: .lando.yml

First, I explored some ways of creating an isolated, replicable dev environment for my Python project. I read about venv, but being a Docker fan, I went with Docker instead. To keep things simple, I used Lando to dockerize the project and pin the correct version of Python and other dependencies.

Dependencies

requirements.txt

One thing I love about the Python ecosystem is the simplicity of its ways. It is very pythonic to have a requirements.txt file containing your project’s dependencies along with their versions. Similarly, you can have a requirements-dev.txt for your dev and test dependencies.

GitPython>=3.1.13
PyYAML>=5.4.1

With one or more such files in place, you can then run pip to install your dependencies.

pip install -r requirements.txt -r requirements-dev.txt

Pylint

It’s always a good idea to have a linter so that your code adheres to coding standards. It didn’t take long to stumble upon Pylint, and I found it very useful. You can create a config file for your project with pylint --generate-rcfile.

PyTest / Nose

Python comes with a module named unittest which can take care of most of your testing needs. However, to run the tests with ease, it’s recommended to install a test runner like Nose or pytest. I chose Nose, but I’m thinking about switching to pytest.
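
For illustration, here’s what a minimal unittest-style test looks like. This example is hypothetical and not taken from Clibato’s test suite, but any test written this way is picked up automatically by runners like Nose or pytest:

# test/test_example.py: hypothetical; runners discover files named test_*.py
# and methods named test_* automatically.
import unittest


class TestExample(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(4, 2 + 2)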

Makefile

I realized that there were specific sets of commands which I had to run with the exact same parameters in 3 different situations: during development, during Docker container setup, and during the CI workflows. So, I put these commands in a Makefile and I can now run them as make install or make test.

Project structure

Python code is organized into packages and modules. Packages are directories containing one or more modules, which are .py files. A Python module can contain one or more objects that you can import and use from other modules. It took me a while to figure out how to organize my project, but now that I know, here’s a summary.

.
├── clibato              # contains Clibato
│   ├── __init__.py      # executed when you do "import clibato"
│   ├── __main__.py      # executed when you do "python -m clibato"
│   └── *.py             # other modules internal to Clibato
├── test                 # contains tests
│   └── test_*.py        # test modules
├── dist                 # contains build files; ignored from git
├── requirements.txt
└── Makefile             # contains common commands

__init__.py

This file is executed once, when the package is first imported with import clibato. Thus, I’ve declared the Clibato class (the main class) in this file so that it can be imported easily.
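
As a rough sketch of the idea (simplified; the real file contains the full class, including the argparse handling shown later in this article):

# clibato/__init__.py: simplified sketch; the real class is much larger.
class Clibato:
    """The main class of the application."""

    def execute(self, args):
        """Parses the CLI arguments and runs the requested action."""
        ...

This is what makes a plain from clibato import Clibato possible from anywhere, including the tests.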

__main__.py

Python has a concept of executable modules and packages. A package can be executed with a command like python3 -m clibato version; when such a command is run, this __main__.py file is what gets executed.
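
A __main__.py for this layout is typically just a thin shim around the main class. The sketch below, including the exit-code handling, is my assumption of how it could look rather than a copy of Clibato’s actual file:

# clibato/__main__.py: illustrative sketch of the entry point.
import sys

from clibato import Clibato

if __name__ == '__main__':
    # Forward the CLI arguments (without the program name) to the app.
    success = Clibato().execute(sys.argv[1:])
    sys.exit(0 if success else 1)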

test

The directory containing tests can be given any standard name, and the test runner (Nose) will usually detect it. However, Pylint won’t lint your tests unless you create an __init__.py file inside it.

Argparse

Reference: clibato/__init__.py

This is where the fun began for me. I’ve used Commander (Node.js) and Thor (Ruby); however, I was surprised to see that Python has a built-in module for building CLI apps: argparse.

Main command

common_parser = argparse.ArgumentParser(add_help=False)
# ...
main_parser = argparse.ArgumentParser(
    prog='clibato',
    usage='clibato [-v] [-c ~/.clibato.yml] ACTION',
    parents=[common_parser]
)

Here, the common parser is just another ArgumentParser object containing the arguments that are common to all commands. For example, every Clibato command accepts the --config and --verbose arguments, which are defined on the common parser and inherited by the main parser through parents=[common_parser].
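
The elided part above is where those shared arguments get defined. Here’s a sketch of what it could look like; the -c and -v flags come from the usage string above, but the help texts are my assumptions:

# Shared arguments inherited by every sub-command; help texts are illustrative.
common_parser.add_argument(
    '-c', '--config',
    help='path to the configuration file, e.g. ~/.clibato.yml'
)
common_parser.add_argument(
    '-v', '--verbose',
    action='store_true',
    help='show verbose output'
)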

Sub-commands

Most CLI applications have sub-commands, e.g. clibato backup or clibato restore.

subparsers = main_parser.add_subparsers(dest='action')
subparsers.add_parser('init', parents=[common_parser])
subparsers.add_parser('backup', parents=[common_parser])
subparsers.add_parser('restore', parents=[common_parser])
subparsers.add_parser('version', parents=[common_parser])

The above code defines those sub-commands; the one chosen at runtime is stored in an attribute named action, thanks to dest='action'. At the time of execution, here’s how the right method is called depending on the sub-command.

class Clibato:
    def execute(self, args: List[str]) -> bool:
        """Executes the CLI"""
        self._args = Clibato.parse_args(args)
        # ...
        method = getattr(self, self._args.action)
        method()

Each sub-command has a corresponding method in the Clibato class, which is looked up with getattr() and executed with the parsed arguments available in self._args.

Testing

I mentioned using Nose for running my tests; however, testing this app wasn’t a walk in the park. Here are some special cases that I ran into.

Output tests

For a CLI app, most output goes to stdout. Python makes it very easy to test the output with contextlib.redirect_stdout.

from contextlib import redirect_stdout
from io import StringIO
from unittest import TestCase


class TestClibato(TestCase):
    def test_version(self):
        """Test: clibato version"""
        with redirect_stdout(StringIO()) as output:
            Clibato().execute(['version'])

        self.assertEqual('...', output.getvalue())

The contents of standard output are then available in the output variable for you to make assertions. I had to do a little extra work to handle Windows’ \r\n line breaks, but it was fun.
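
One simple way to keep such assertions platform-independent (my own illustration, not necessarily Clibato’s exact approach) is to normalize the line endings before comparing:

# Normalize Windows line endings before asserting; the expected value is illustrative.
actual = output.getvalue().replace('\r\n', '\n')
self.assertEqual('Clibato v1.0.1\n', actual)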

Logging tests

Python’s unittest.TestCase comes with an assertLogs(logger, level) method; however, in order to make use of it, you must implement your logging using the standard logging module. To complement this method, I decided to create my own assertion, as follows:

# `cm` is the context object from a `with self.assertLogs(...) as cm:` block.
self.assert_log_record(
    cm.records[0], level='ERROR',
    message=f'Configuration not found: {config_path}'
)

Additionally, WindowsPath objects contain backslashes, which get escaped twice in the log messages for some reason, so I handled that in the custom assertion as well.
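
For reference, such a helper could look roughly like this; the method body, including the backslash handling, is my assumption about the approach rather than Clibato’s actual implementation:

# Illustrative sketch of a custom log assertion helper.
def assert_log_record(self, record, level, message):
    """Asserts that a captured log record has the expected level and message."""
    self.assertEqual(level, record.levelname)
    # Collapse doubly-escaped backslashes (seen with WindowsPath) before comparing.
    self.assertEqual(message, record.getMessage().replace('\\\\', '\\'))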

Exception tests

Testing exceptions was very straightforward, but there were some gotchas.

def test_config_not_found(self):
    with self.assertRaises(ConfigError) as cm:
        app.execute()

    self.assertEqual('...', str(cm.exception).strip("'"))

I could’ve used assertRaisesRegex(Exception, message); however, it wasn’t working as expected on Windows. Also, the exception message gets wrapped in an extra set of single quotes, which I’m removing with .strip("'").

File-system tests

This was the part that took the most time, but Python’s tempfile module has some excellent classes for it.

  • NamedTemporaryFile: Creates a temporary file with a unique name.
  • TemporaryDirectory: Creates a temporary directory.
  • gettempdir(): Gets you a path to the temporary directory, no matter the OS.

Also, pathlib.Path impressed me with its simplicity. I used it to handle paths in an OS-agnostic way to be more Windows-friendly.
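
Here’s a small, self-contained illustration of how these pieces fit together in a test; the file name and assertions are hypothetical, not taken from Clibato’s test suite:

import unittest
from pathlib import Path
from tempfile import TemporaryDirectory


class TestWithTempFiles(unittest.TestCase):
    """Hypothetical example of file-system testing with tempfile and pathlib."""

    def test_write_and_read_dotfile(self):
        with TemporaryDirectory() as tmp_dir:
            # pathlib.Path joins paths with "/" regardless of the OS.
            dotfile = Path(tmp_dir) / '.vimrc'
            dotfile.write_text('set number\n')

            self.assertTrue(dotfile.exists())
            self.assertEqual('set number\n', dotfile.read_text())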

Continuous integration

I’ve recently started adding CI workflows to all my projects, and this project is no different. I used GitHub Actions for this, and it was very easy to set up. I’d like to configure things such that pushing a new tag to Git results in a new build being published to PyPI.org; however, that is not essential at the moment, so it’s out of scope.

Publishing to PyPI

PyPI.org is the hub of many Python community projects. Since my tool had a unique name and some real-life utility as well, I decided to push it to PyPI. This involved following the official tutorial on packaging Python projects. I had to create some extra files, e.g. setup.py, and now anyone can download Clibato from PyPI.org.

$ pip install clibato
$ clibato version
Clibato v1.0.1
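
For context, a minimal setup.py for such a project could look roughly like the following; the metadata and the entry-point wiring are illustrative assumptions, not a copy of Clibato’s actual setup.py:

# Illustrative setup.py sketch; metadata and entry point are assumptions.
from setuptools import find_packages, setup

setup(
    name='clibato',
    version='1.0.1',
    description='CLI Backup Tool for dotfiles and other small files',
    packages=find_packages(exclude=['test']),
    install_requires=['GitPython>=3.1.13', 'PyYAML>=5.4.1'],
    entry_points={
        'console_scripts': ['clibato = clibato:main'],
    },
)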

Conclusion

After 2 weeks of work, I called the project complete. I can’t wait to get my hands on a real Python project, or maybe even a job as a Python developer? I’d love that, but life isn’t that easy 😉. If you like what Clibato offers, i.e. backing up and restoring your dotfiles (or any other small files) to a directory or a Git repository, I welcome you to try it.
