Website migration

In my previous log, I mentioned that I wanted to move from GitHub Pages to a European service provider.

First, I tried CleverCloud, which provides Cellar, an S3-compatible object storage.

The UI is minimalist: the only way to manage a bucket is through the API (via a CLI such as aws or s3cmd).

The minimalism goes too far, though: the API support is incomplete (the CloudFront part, for instance, is missing).

The second issue, and the point at which I gave up, is the need to rely on a load balancer to get a TLS certificate.

Then I moved to Scaleway, which also provides an S3-compatible object storage.

It also provides Edge Services, which exposes and caches the bucket.

I made some latency measurements: while I was at ~580 ms on GitHub Pages, I dropped to ~380 ms with Scaleway.

I noticed the page was downloading 2.8 MB, most of which comes from Elasticlunr.js, the search index.
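
For reference, both numbers are easy to check with curl's -w option. A minimal sketch (the index file name assumes Zola's defaults and an English-only site):

# Round-trip latency of the page itself
curl -s -o /dev/null -w 'time_total=%{time_total}s\n' "https://$DOMAIN/"

# Size of the search index, the heavy part of the payload
curl -s -o /dev/null -w 'size=%{size_download} bytes\n' "https://$DOMAIN/search_index.en.js"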

I use Zola for this website, and this is the default behaviour; to fix it, I changed the following in config.toml:

[search]
index_format = "elasticlunr_json"
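
To verify the change locally, rebuild and look at the generated index (assuming the default output directory and file naming):

zola build
ls -lh public/search_index.*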

I checked again: the new size is ~550 KB, and the new latency ~290 ms.

However, this latency is only from Frankfurt; from San Francisco it goes up to ~1.8 s.

At this point, since Scaleway did not have a CDN, I looked for a European CDN; there are not many.

Most of them are expensive. The second least expensive was OVH, with a €12-per-month subscription that does not include bandwidth, not to mention its reliability issues and toxic culture.

I settled on BunnyCDN, which has two drawbacks:

  • No S3-compatible API: I have to use an FTP client, like it's 2002
  • Payment in USD, which is not great for cost predictability (exchange rates fluctuate)

I changed my CI workflow to:

- name: Build site with Zola
  run: nix shell '.#zola' -c zola build

- name: Deploy to BunnyCDN
  env:
    BUNNY_HOSTNAME: "${{ secrets.BUNNY_HOSTNAME }}"
    BUNNY_USERNAME: "${{ secrets.BUNNY_USERNAME }}"
    BUNNY_PASSWORD: "${{ secrets.BUNNY_PASSWORD }}"
  run: |
    echo "Deploying on $DOMAIN"
    # mirror --reverse uploads public/ to the remote; --delete removes stale files
    nix shell '.#lftp' -c lftp \
      -u "$BUNNY_USERNAME,$BUNNY_PASSWORD" \
      -e 'mirror -v --delete --reverse --parallel=50 public . ; quit' \
      "$BUNNY_HOSTNAME"

Note: FTP is slow, really slow. Without parallelism the upload takes 3 min 10 s; with a parallelism of 50, it still takes 24 s.
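
One caveat worth mentioning: plain FTP sends credentials in clear text. lftp can be told to refuse non-TLS connections; a sketch, assuming Bunny's storage endpoint accepts explicit FTPS (I have not verified this):

# Hypothetical hardening: force TLS for both control and data channels
nix shell '.#lftp' -c lftp \
  -u "$BUNNY_USERNAME,$BUNNY_PASSWORD" \
  -e 'set ftp:ssl-force true; set ftp:ssl-protect-data true; mirror -v --delete --reverse --parallel=50 public . ; quit' \
  "$BUNNY_HOSTNAME"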

Then I have to purge the pull zone (the CDN part), as follows:

- name: Purge cache
  env:
    BUNNY_API_KEY: ${{ secrets.BUNNY_API_KEY }}
    BUNNY_PULL_ZONE: "${{ secrets.BUNNY_PULLZONE }}"
  run: |
    # Purge the whole pull zone; --fail makes the step error on a non-2xx response
    curl --fail \
      -X POST \
      -H "AccessKey: ${BUNNY_API_KEY}" \
      -H "Content-Type: application/json" \
      "https://api.bunny.net/pullzone/${BUNNY_PULL_ZONE}/purgeCache"

- name: Refill cache
  run: |
    # Force a refill: fetch the page once so the edge re-caches it
    curl -s -o /dev/null "https://$DOMAIN"

Since the Gitea runner is on my homelab, I can push a deployment notification to my local IRC, as follows:

- name: Notify result
  if: always()
  run: |
    workflow="https://gitea.hannibal.local/${{ github.repository }}/actions"
    if [ "${{ job.status }}" == "success" ]; then
      status="done"
    else
      status="failed"
    fi

    # Alertmanager-style webhook payload, sent to the local IRC bridge
    curl -X POST http://localhost:3040/hannibal-monitoring \
      -H "Content-Type: application/json" \
      -d @- <<EOF
    {
      "receiver": "webhook",
      "alerts": [
        {
          "status": "$status ($workflow)",
          "labels": {
            "alertname": "blog published"
          },
          "annotations": {
            "summary": "https://$DOMAIN"
          }
        }
      ],
      "version": "4"
    }
    EOF

That done, my website is accessible everywhere on Earth with a latency of ~230 ms (Amazon is at ~480 ms, and Google at ~330 ms).

Unless my website becomes really popular, it should not cost me much more than $2.