GitHub Actions Amazon ECR Integration - Part 2
This post is a continuation of part 1. Here I will show how to use a GitHub Actions workflow to automatically build and publish Docker images to an ECR repository on a push or a pull-request merge.
We need a dummy project for illustration. I have bootstrapped mine with Poetry, my favourite Python build tool. The project structure looks like this:
```
$ tree -I ".git|.terraform|terraform|tests|.pytest_cache" -a
.
├── Dockerfile
├── .github
│   └── workflows
│       └── ci.yml
├── .gitignore
├── poetry.lock
├── pyproject.toml
├── .python-version
└── src
    └── handler.py
```
The project definition metadata, pyproject.toml:
```toml
[tool.poetry]
name = "learn-gha-ecr"
version = "0.1.0"
description = ""

[tool.poetry.dependencies]
python = "^3.11"
boto3 = "^1.34.11"
jsonpickle = "^3.0.2"

[tool.poetry.group.dev.dependencies]
pytest = "^8.0.0"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```
The Lambda handler, src/handler.py:
```python
import os
import logging

import jsonpickle
import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)
client = boto3.client('lambda')


def lambda_handler(event, context):
    logger.info('## ENVIRONMENT VARIABLES\r' + jsonpickle.encode(dict(**os.environ)))
    logger.info('## EVENT\r' + jsonpickle.encode(event))
    logger.info('## CONTEXT\r' + jsonpickle.encode(context))
    response = client.get_account_settings()
    logger.info(response['AccountUsage'])
```
At the moment the project doesn't do any real work; it just logs some information when the Lambda function executes. Next, I write the GitHub Actions workflow file for the project, .github/workflows/ci.yml.

On any push to the main branch, the workflow checks out the code:
```yaml
name: Lambda Application CI workflow
on:
  push:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      id-token: write
      contents: read
    env:
      AWS_REGION: <region>
      AWS_ACCOUNT_ID: <account-id>
      BUILD_ROLE_NAME: GHA-Build-Role
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
```
Take note of the environment variable BUILD_ROLE_NAME: this is the name of the IAM role created in part 1.
Installing Python and the Poetry build tool:
```yaml
      - name: Read Python version
        id: read-python-version
        run: printf "version=$(cat .python-version)\n" >> "$GITHUB_OUTPUT"
      - name: Install Python
        id: install-python
        uses: actions/setup-python@v5
        with:
          python-version: "${{ steps.read-python-version.outputs.version }}"
      - name: Install Poetry
        uses: snok/install-poetry@v1
        with:
          version: 1.7.1
          virtualenvs-in-project: true
```
Installing the project dependencies:
```yaml
      - name: Restore Python virtual environment from cache
        uses: actions/cache@v4
        with:
          path: .venv
          key: venv-${{ runner.os }}-${{ steps.install-python.outputs.python-version }}-${{ hashFiles('poetry.lock') }}
      - name: Install project dependencies
        run: |
          poetry install --no-interaction --no-root
          poetry self add poetry-plugin-export
          poetry export --without-hashes --without dev -f requirements.txt > requirements.txt
```
Here we use GitHub Actions caching to speed things up. The cache is invalidated only when the content of poetry.lock or the Python version changes. We also export the project dependencies to a requirements.txt file for Docker to use later.
Running unit tests:
```yaml
      - name: Run tests
        run: |
          poetry run pytest
```
The first run will fail, as there aren't any tests written yet. You may want to add a dummy test to fix the build failure.
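For example, a minimal placeholder test could look like this (assuming it lives in a tests/test_handler.py file, which pytest discovers automatically):

```python
# tests/test_handler.py - a placeholder test so pytest has something to run.
# Replace with real tests for the handler as the project grows.
def test_placeholder():
    assert True
```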
Building and publishing the Docker image:
```yaml
      - name: Get temporary credentials from AWS STS
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-region: ${{ env.AWS_REGION }}
          role-to-assume: "arn:aws:iam::${{ env.AWS_ACCOUNT_ID }}:role/${{ env.BUILD_ROLE_NAME }}"
          role-session-name: run-${{ github.run_id }}
      - name: Login to Amazon ECR
        id: login-ecr
        uses: aws-actions/amazon-ecr-login@v2
      - name: Build and publish docker image
        id: build-publish-docker
        env:
          DOCKER_REGISTRY_ENDPOINT: ${{ steps.login-ecr.outputs.registry }}
        run: |
          version=$(poetry version)
          short_version=$(poetry version --short)
          IMAGE_NAME=${version% $short_version}
          IMAGE_TAG=$short_version-build$GITHUB_RUN_NUMBER
          docker build --tag "$DOCKER_REGISTRY_ENDPOINT/$IMAGE_NAME:$IMAGE_TAG" .
          docker push --all-tags "$DOCKER_REGISTRY_ENDPOINT/$IMAGE_NAME"
          printf "IMAGE_NAME=$IMAGE_NAME\nIMAGE_TAG=$IMAGE_TAG\n" >> "$GITHUB_OUTPUT"
```
It is important that you have already created the necessary AWS resources with Terraform; otherwise these steps will fail.
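The shell parameter expansion ${version% $short_version} strips the trailing version number (and the space before it) from the output of poetry version, leaving just the package name. A quick sketch of the same logic in Python, assuming poetry version prints learn-gha-ecr 0.1.0 for this project:

```python
# Reproduce the image name/tag derivation from the build step above.
version = "learn-gha-ecr 0.1.0"   # assumed output of `poetry version`
short_version = "0.1.0"           # assumed output of `poetry version --short`
run_number = 42                   # stands in for $GITHUB_RUN_NUMBER

# ${version% $short_version} removes the suffix " 0.1.0" from the string.
image_name = version.removesuffix(" " + short_version)
image_tag = f"{short_version}-build{run_number}"

print(image_name)  # learn-gha-ecr
print(image_tag)   # 0.1.0-build42
```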
Post-build steps tag the GitHub repo on a successful build:
```yaml
    outputs:
      IMAGE_NAME: ${{ steps.build-publish-docker.outputs.IMAGE_NAME }}
      IMAGE_TAG: ${{ steps.build-publish-docker.outputs.IMAGE_TAG }}

  post-build:
    needs: build
    runs-on: ubuntu-latest
    permissions:
      contents: write
    env:
      IMAGE_NAME: "${{ needs.build.outputs.IMAGE_NAME }}"
      IMAGE_TAG: "${{ needs.build.outputs.IMAGE_TAG }}"
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Tag Github repo
        run: |
          GIT_TAG="v${{ env.IMAGE_TAG }}"
          git config --global user.email "github-action@users.noreply.github.com"
          git config --global user.name "Github Action"
          git tag -a "$GIT_TAG" -m "tagged by GHA $GIT_TAG"
          git push origin "$GIT_TAG"
```
The post-build job is kept separate because it requires an elevated permission (contents: write) to push the tag.
Bonus: trigger a deployment via a workflow in another GitHub repo. You need a personal access token for this step to work.
```yaml
      - name: Invoke CD workflow
        uses: actions/github-script@v7
        with:
          github-token: ${{ secrets.PAT_WORKFLOW_DISPATCH }}
          script: |
            github.rest.actions.createWorkflowDispatch({
              owner: context.repo.owner,
              repo: 'infra_setup',
              ref: 'main',
              workflow_id: 'cd.yml',
              inputs: {
                env: 'dev',
                images: '{"${{ env.IMAGE_NAME }}": "${{ env.IMAGE_TAG }}"}'
              }
            })
```
Once this is done, commit and push the changes to the main branch to trigger the workflow. Enjoy coding!