PyBites: 5 ways I use GitHub Actions

(Syndicated from Planet Python.)


I am increasingly using GitHub Actions these days.

If you’re new to this you might want to check out our article or video.

In this article I will show you 5 cool ways I use it day to day.

Run tests and linters

The first and most obvious use case is to automate tooling. Although you probably want to run these checks locally first, for example via pre-commit, it's nice to always have that second automated check in the cloud.

Example workflow.

on: push dictates that I want to run this job upon every code push.

You can read the steps yourself, but basically we set up a container, install Python and the required project dependencies, then run pytest with coverage. One environment variable is required (EMOJI_PREFERENCES), which I set under env.
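The steps just described might be sketched like this (a minimal sketch — the action versions, Python version, and env value are illustrative, not the actual ones from the repo):

```yaml
# Sketch of a test-and-lint workflow triggered on every push.
name: test
on: push
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run tests with coverage
        run: pytest --cov
        env:
          EMOJI_PREFERENCES: "..."  # illustrative; the real value lives in the repo
```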

Here is another example of a test runner further restricting the on clause.
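A more restricted on clause could look like this (branch names are illustrative) — the job then only runs for pushes and pull requests against the listed branches:

```yaml
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
```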

Automate publishing a package to PyPI

In the same repo you can see another workflow.

This builds a new package and pushes it to PyPI. It always publishes to the test instance, but only to the “live” PyPI instance when I push a tag (using the condition if: startsWith(github.ref, 'refs/tags')).
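The shape of such a workflow might look roughly like this (a sketch — the secret names, action versions, and build tooling are assumptions, not the repo's actual config):

```yaml
# Sketch: always publish to TestPyPI, publish to PyPI only on tag pushes.
name: publish
on: push
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
      - name: Build the package
        run: pip install build && python -m build
      - name: Publish to TestPyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          password: ${{ secrets.TEST_PYPI_API_TOKEN }}  # assumed secret name
          repository_url: https://test.pypi.org/legacy/
      - name: Publish to PyPI
        if: startsWith(github.ref, 'refs/tags')
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          password: ${{ secrets.PYPI_API_TOKEN }}  # assumed secret name
```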

So now I don’t have to use flit / poetry / twine locally; I can just rely on git tags:

git tag x.y.z
git push --tags

(I have to check but I think I also need to manually update the version in __init__.py)

How does it know about my PyPI auth token?

You can use GitHub’s encrypted secrets, which I load in with the ${{ secrets.<NAME> }} syntax, e.g.:

password: ${{ secrets.PYPI_API_TOKEN }}

This is pretty slick.

Update my GitHub Readme

A few weeks ago I made a self-updating Readme on GitHub.

The build-readme.py script runs every day to pull in new content (articles, tips, toots) and updates my profile repo’s Readme with it.

Here is the workflow.

As you can see this one is slightly different from the previous ones: it uses the schedule (cron) trigger, so the script runs once a day (at 08:45 UTC):

on:
  push:
  schedule:
    - cron: "45 8 * * *"

Back up Mastodon toots

While on the topic of syncing content, I used the same cronjob-style GitHub Action to sync my Mastodon (Fosstodon) activity to a local database (repo).

Here is the job.

This runs a few times a day (for crontab syntax crontab.guru is really useful):

on:
  push:
  schedule:
    # At minute 4 past every 4th hour
    # https://crontab.guru/#4_*/4_*_*_*
    - cron:  '4 */4 * * *'

Notice how the job also commits the updated SQLite db to version control, using stefanzweifel/git-auto-commit-action@v4.
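That auto-commit step might look roughly like this (a sketch — the commit message and file pattern are assumptions):

```yaml
      - name: Commit updated database
        uses: stefanzweifel/git-auto-commit-action@v4
        with:
          commit_message: "Sync Mastodon toots"
          file_pattern: "*.db"
```

The action stages any files matching the pattern and commits them back to the repo only when something actually changed, so daily runs with no new toots stay quiet.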

Post to an API

Lastly, I made a job to sync new tips from my code tips repo to our CodeImag.es API.

Here is the job.

I run it once a day: cron: "45 16 * * *" (16:45 UTC).

It follows the usual order of setting up a container, installing Python and the requirements, and then running the script:

python sync_tips.py

Similarly to the publishing-a-package workflow (2nd tip), I use GitHub secrets to load in my API user credentials:

CODEIMAGES_USER: ${{ secrets.CODEIMAGES_USER }}
CODEIMAGES_PASSWORD: ${{ secrets.CODEIMAGES_PASSWORD }}
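Put together, the final step of that workflow might look like this (a sketch; the secret names here mirror the env var names, which is an assumption):

```yaml
      - name: Sync tips to the CodeImag.es API
        env:
          CODEIMAGES_USER: ${{ secrets.CODEIMAGES_USER }}
          CODEIMAGES_PASSWORD: ${{ secrets.CODEIMAGES_PASSWORD }}
        run: python sync_tips.py
```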

And that’s it. I hope this gave you some inspiration to leverage GitHub Actions to save time and make your life easier 🙂

– Bob


December 14, 2022 at 11:03PM