From 42d1b516845f52e2d4341f231cd5ab34bca455dd Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Stephan=20D=C3=B6rfler?=
Date: Tue, 11 Feb 2020 09:20:11 +0100
Subject: [PATCH] added deployment conclusion

---
 content/blog/003-writing-articles/index.md | 28 +++++++++++++++++-----
 1 file changed, 22 insertions(+), 6 deletions(-)

diff --git a/content/blog/003-writing-articles/index.md b/content/blog/003-writing-articles/index.md
index 2e03bdc..363033a 100644
--- a/content/blog/003-writing-articles/index.md
+++ b/content/blog/003-writing-articles/index.md
@@ -4,7 +4,7 @@ date: "2020-01-27T15:38:26.882Z"
 description: How I automated everything between writing an article and updating the blog itself
 ---
 
-# Writing
+## Writing
 
 This is a blog. So first of all, to add new content to it, I write stuff. For now the topics originate from the development and deployment of the blog itself.
@@ -22,7 +22,7 @@ For now this is a manual process, which means I usually write the posts in Visua
 
 All steps on the server are handled from the terminal, and typing the same commands over and over for each new post is rather uncomfortable.
 
-# Automation - Planning
+## Automation - Planning
 
 Whenever you do the same thing often, automation comes to mind. So the steps on the server are to be automated. I have some ideas for that.
@@ -45,6 +45,8 @@ Around that time of planning I decided this isn't the way to go. At my job I rel
 [Drone](https://drone.io) caught my eye, as it has full docker support, is open source and can have multiple, distributed workers. Perfect, I finally get to use the "sandbox" VPS I rent, which otherwise just accumulates virtual dust. After reading the documentation, the setup was fairly easy.
 
+## Automation - Setting up
+
 First I spun up a docker container for drone, `drone/drone` on Docker-Hub.
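Such a server container can be started with a plain `docker run`; the sketch below is illustrative only — the hostnames, secrets, and ports are placeholders, not my actual settings:

```shell
# Sketch: Drone 1.x server container talking to a Gitea instance.
# All values below (git.example.com, drone.example.com, the secrets)
# are placeholders for illustration.
docker run -d \
  --name drone \
  --restart always \
  -p 80:80 -p 443:443 \
  -v /var/lib/drone:/data \
  -e DRONE_GITEA_SERVER=https://git.example.com \
  -e DRONE_GITEA_CLIENT_ID=<oauth-client-id> \
  -e DRONE_GITEA_CLIENT_SECRET=<oauth-client-secret> \
  -e DRONE_RPC_SECRET=<shared-secret> \
  -e DRONE_SERVER_HOST=drone.example.com \
  -e DRONE_SERVER_PROTO=https \
  drone/drone:1
```

The OAuth client id and secret come from registering drone as an OAuth2 application in gitea; the RPC secret is what the workers later use to authenticate against the server.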
It has lots of required arguments, so I will only describe some of the configuration:
 
 * The gitea server I want to depend upon (other git servers can also be used)
@@ -84,11 +86,10 @@ steps:
       - '1.3.0'
 ```
 
-For now I only require three steps:
+For now I only require two steps:
 
-1. install node.js dependencies
-2. build the gatsby project
-3. build the new docker image and push it to the registry
+1. install node.js dependencies and build the gatsby project
+2. build the new docker image and push it to the registry
 
 An additional benefit of the drone build is this beautiful badge that every project has nowadays, conveniently prepared as markdown:
 
@@ -98,3 +99,18 @@ Also, it has a nice visualization of the build process with logs for debugging in its UI:
 
 ![Drone-CI UI](./drone-ci.png)
 
 Then, the last required step is to update the running container to the new version. The event to react to would be the upload to the registry. There are ways to handle this myself using webhooks, but as with the build trigger, I decided to take a slightly more convenient route: use [Watchtower](https://containrrr.github.io/watchtower). I tried watchtower before and don't feel comfortable blindly updating every container I run, so I [configure it to just watch the one blog container and update that automatically](https://containrrr.github.io/watchtower/container-selection/). As I am the only one pushing updates of the blog image, I can take precautions when I know something will break.
+
+## Conclusion
+
+As a result, I now just happily type a new article into a new markdown file. As soon as I commit and push it (onto the `master` branch), the automation takes over:
+
+1. The git server notifies drone of the new commit
+2. drone spins up a new worker that:
+   1. pulls the latest version of the code
+   2. installs node dependencies
+   3. builds the gatsby project
+   4. builds the docker image
+   5. pushes the docker image to my repository
+3. 
watchtower sees the new version of the image in the repository and updates the running container
+
+It took me quite a few hours to get everything working. Being used to out-of-the-box CI/CD experiences like Azure DevOps, I knew it had to work, but I stumbled over the details quite a few times. It was a great opportunity to learn about all the connected parts, and it will hopefully help me in the future whenever I need to dive into the details of a pipeline.
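For reference, the watchtower part of the setup can be sketched as follows; the container name `blog` and the registry path are placeholders, not copied from my server:

```shell
# Sketch: run watchtower so it only watches one named container.
# Passing container names as arguments restricts watchtower to those
# containers (see the container-selection docs); `blog` is a placeholder.
docker run -d \
  --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower \
  blog

# The watched container itself, running the blog image from my registry
# (registry URL is a placeholder):
docker run -d \
  --name blog \
  registry.example.com/blog:latest
```

An alternative documented on the same page is label-based selection: start watchtower with `WATCHTOWER_LABEL_ENABLE=true` and add the label `com.centurylinklabs.watchtower.enable=true` to the containers that should opt in.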