I’m going to start experimenting with CI/CD Pipelines. I’m pretty sure this will be a multi-day project, if not multi-week or multi-month!
CI = Continuous Integration
CD = Continuous Deployment
Pipeline = an automated CI/CD lifecycle
You might have heard the term DevOps mentioned. CI/CD is pretty much what DevOps is about: a continuous cycle of develop/integrate/test/deploy using automated tooling. Continuous isn’t what I’m aiming for here though. I only develop when I get a new idea or something breaks, so it’s hardly continuous for me. However, a more rigidly controlled source code repository, plus a standard, automated deployment method with the ability to roll a change back, would be useful.
Coming from an IT background, including development and project management, I tend to keep source code under control and manage deployments as I did professionally, to try not to break things. However, I’m conscious that each time I move servers, or modify code that I’ve not touched for some time, I have to re-learn how some things work. Managing code and its deployment more rigorously will help me keep on top of things in the longer term.
Containerising has also given me opportunities that weren’t available before. For example, my test and live versions of scripts use different directory paths for data/config, which means I have to create config entries pointing at the paths and then make sure I use the appropriate path variables whenever I refer to files. I can now handle that outside the containers: inside the container /path/to/my/data is always the same, but outside it the dev files are in /data/dev_gfs and the live files in /data/prod_gfs, and these map to the internal path when the container is created. So I want to modify the GFS/ECMWF code in particular to handle this, and thought that trying to do it with CI/CD would be a good exercise. It’s also non-time-critical because I’m not urgently trying to fix a bug, which means there’s no pressure to implement a new way of working quickly.
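To make the mapping idea concrete, here’s a sketch of how it could look in a docker-compose file. The service and image names are hypothetical; only the host paths (/data/dev_gfs and /data/prod_gfs) and the fixed internal path are the ones described above.

```yaml
# Hypothetical compose fragment: service/image names are made up,
# but the volume-mapping idea is the one described above.
services:
  gfs-dev:
    image: gfs-scripts:dev            # test build
    volumes:
      - /data/dev_gfs:/path/to/my/data    # dev data mapped to the fixed internal path
  gfs-prod:
    image: gfs-scripts:latest         # live build
    volumes:
      - /data/prod_gfs:/path/to/my/data   # live data mapped to the same internal path
```

Inside either container the scripts always see /path/to/my/data, so the dev/prod distinction lives entirely in the deployment config rather than in the code.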
Finally, I’m conscious that I post these things in the Site Feedback category which isn’t really what I intended that category to be for. So I’m going to create a new category to hold my discussions on where I’m going and move my posts there. Not sure what to call it yet, but I’ll think after posting this. That way, anyone not interested in the IT ramblings of an ex-professional can mute the category so you don’t get notifications.
Ramblings of a Crazy Admin?
Admin’s IT Revolution?
Falling Down the Rabbit Hole with Admin?
What’s Admin Doing Now?
What’s Next from Admin?
I’ve made some progress. I’m experimenting with something called Jenkins and following a tutorial. So far I’ve got Jenkins installed and have successfully created two “Hello World” jobs. The first is just a bash shell script and the second is a (very simple) Python3 script. Those two jobs ran on the Jenkins master itself, which isn’t the recommended way to do things.
Next I worked through creating a Jenkins agent (a Docker container) that ran the bash Hello World job. This allows the job to run in an isolated environment, much as it will when deployed as a container, so that testing can run alongside the other production containers on the server. In a bigger environment you’d have dedicated servers for testing, but that’s an expense too far!
Finally for today I created an agent/container that included Python to run the Python Hello World script.
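For anyone curious what adding Python to an agent image can look like: this is a sketch, not necessarily what my tutorial did, based on the official jenkins/inbound-agent image (which is Debian-based, so Python can be added with apt).

```dockerfile
# Sketch of a Jenkins agent image with Python 3 added.
# Assumes the Debian-based jenkins/inbound-agent base image;
# your tutorial may use a different base.
FROM jenkins/inbound-agent:latest
USER root
RUN apt-get update \
 && apt-get install -y --no-install-recommends python3 \
 && rm -rf /var/lib/apt/lists/*
USER jenkins
```

The switch back to the jenkins user at the end matters: the agent process shouldn’t run as root.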
Almost all of this worked first time, with just one part requiring some extra research. I can see how Jenkins can help me with what I’m trying to do, but I still have areas to explore. Luckily one of my sons is a developer at a company that uses CI/CD, so I’ll be having a chat with him about the best way to achieve what I’m trying to do.
That’s all for today. Sleep beckons!
I haven’t had much time yesterday and today to make a huge amount of progress, but I’ve made some.
I’ve completed the first tutorial on using Jenkins for automation. I now know enough to be dangerous! It’s been relatively straightforward so far, to the extent that I’ve tried something of my own today. I failed to get it working, but not because of a gap in my understanding of Jenkins: it was my misunderstanding of how to build a Docker container image inside a Docker container.
I’ve also been doing some work that isn’t Jenkins itself, but is a very useful companion to it. You’ve probably heard of GitHub and/or GitLab. These are cloud front-ends for the version-control software called ‘Git’. You can create repositories of software source code, and other artefacts, with previous file versions retained for as long as you need them. Some examples are:
- I can create a repository for the GFS data download for WxSim. All the scripts are stored in it. Modifications are made to the scripts and stored in Git. If I make a change that goes wrong I can roll back the source code to the previous (working) version whilst I try to fix the issue with the new scripts.
- If I want to make major changes to the scripts I can create a new ‘branch’, e.g. called ‘big change’. There are now two versions of the scripts: the current live version and a duplicate copy (‘big change’) that I can work on. Any work I do on ‘big change’ doesn’t affect the live scripts, so if there’s a need to fix something in the live scripts I can do that without having to rush through the ‘big change’ to release a fixed version. When I’m happy that ‘big change’ is ready to release I can ‘merge’ the current live version with ‘big change’ to make a new live version that incorporates all the changes.
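The branch-and-merge workflow above can be tried out safely in a throwaway local repository. This sketch uses made-up file names and commit messages; it just demonstrates that work on a branch leaves the live version untouched until you merge.

```shell
#!/bin/sh
# Demonstration of the branch-and-merge workflow in a throwaway repo.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "demo@example.com"   # local identity just for these demo commits
git config user.name "Demo"
live=$(git symbolic-ref --short HEAD)      # default branch name (master or main)

echo "version 1" > script.sh
git add script.sh
git commit -q -m "Initial working version"

git checkout -q -b big-change              # duplicate line of development
echo "version 2 (big change)" > script.sh
git commit -q -am "Work in progress on the big change"

git checkout -q "$live"                    # live branch is untouched
cat script.sh                              # -> version 1

git merge -q big-change                    # fold the finished work into live
cat script.sh                              # -> version 2 (big change)
```

Until the final merge, any fix committed on the live branch would ship independently of the half-finished ‘big change’ work.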
I now have my own local instance of a cloud-style Git service, using software called Gitea. Having my own instance keeps everything nicely secure within my own servers.
Git, and Gitea, can integrate with Jenkins. For example, the Jenkins scripts that automate the software building process can download (clone) the files from Gitea and I can even configure Jenkins to automatically rebuild the software when I make certain changes to the source code scripts.
Finally, to go alongside Gitea, I’ve installed Microsoft Visual Studio Code. This is a fancy programmer’s editor which does nice things like syntax highlighting, code completion and even flagging code issues as you edit. It also integrates with Git, so it can check things out of a Gitea repository (clone/pull) and check them back in (commit/push). I believe it’s also possible to integrate it with Jenkins, but I’ve not got that far yet.
That’s as far as I’ve got for now. Despite the weather being nice for a change today, I can’t go out because we’ve got workers digging up the footpath and house paths to install a new gas service to replace the old lead pipe that’s been in the ground for years. I need to stay nearby as they need to get into the house to connect things up. Unfortunately that means I have more time to spend with Gitea, Jenkins and VSC.
I’m getting used to git, and having only really ever used it for downloading files, I wish I’d learned to use it sooner.
I’ve started the process of creating repositories that build the four docker images that I need. All the configuration files/scripts for two images are now stored in their own git repositories.
To build a new image containing all the latest updates I firstly clone the repository onto a system that has docker installed. That downloads all the scripts and config files into a new directory.
I then run the build script, which creates a new image with an incremented version number and uploads the image to the registry (so that Docker can download it later on to create a container from it). The build script also updates the stored version number and puts that information back into Git, so that the next build gets the correct number. Each time files are updated a comment is added, so that I can look back and understand why each change was made.
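The version-number handling can be sketched like this. The VERSION file name and the image/registry names are my own assumptions, not necessarily what the real build script uses, and the docker/git steps are echoed rather than executed so the sketch runs anywhere.

```shell
#!/bin/sh
# Sketch of a build script's version handling. File/image names are
# assumptions; docker/git steps are echoed rather than run.
set -e
workdir=$(mktemp -d)
cd "$workdir"

echo "1.4" > VERSION                # stand-in for the version file in the repo

old=$(cat VERSION)
major=${old%.*}
minor=${old#*.}
new="$major.$((minor + 1))"         # increment the minor version

echo "Would run: docker build -t gfs-scripts:$new ."
echo "Would run: docker push gitea.example/me/gfs-scripts:$new"

echo "$new" > VERSION               # store the new number back for the next build
echo "Would run: git commit -am \"Build $new\" && git push"
cat VERSION                         # -> 1.5
```

Committing the updated VERSION file back to the repository is what makes the numbering survive a fresh clone on the next build.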
This will hopefully help me avoid the ‘why did I change that file’ questions in future.