Part 3: Creating a CI/CD Pipeline in Azure DevOps

⏰ Previous Post

If you’re just tuning in, I would highly recommend you go and check out Parts 1 and 2 of this series:

In the last post we added in some custom CSS and HTML to our blog and started to look a bit more at the advanced options offered by Sphinx, ablog and the Pydata Sphinx Theme.

If you just want a working version, you can fork the template repo I have on my GitHub page, which will allow you to dive right into blogging without messing around with internals! 🙌

🙋🏻‍♀️ What you are going to do in this tutorial

  1. Configure a pipeline in Azure DevOps to build our project.

  2. Configure a pipeline in Azure DevOps to auto deploy our blog.

  3. Create an Azure Storage Container for our static website.

After you complete all three parts you should end up with a result that looks like this:


📓 Prerequisites

  • An Azure Account. If you don’t have an existing Azure account, they offer 12 months of popular free services, and $280 credit to explore Azure for 30 days.

  • A free Azure DevOps Account.

  • Basic knowledge of Azure, CI and CD pipelines.

If you plan to use the Azure CLI, ensure you have installed the Azure CLI tools for your operating system.

🚀 Let’s Get Started

Hopefully after setting up an Azure account, you’ve become somewhat familiar with the interface and general concepts. If not, I’d recommend at the very least checking out this Microsoft doc on core Azure architectural components, which explains management groups, subscriptions, resource groups and resources.

🐱‍💻 Creating a Resource Group


“Blades” are what Azure calls the views or panels in the portal. You’ll notice as you use and become more familiar with Azure that you can use the horizontal scroll bar to view previous blades.

You’ll want to be logged into the Azure Portal and then using the side navigations, select the Resource Groups option:


In the Resource Groups blade, select Add and a new blade will open:


Next you’ll be prompted to enter some details about your resource group:

  • Subscription: The name of your current subscription.

  • Resource group name: The name of your resource group.

  • Region: The location the resource will be created in.


Once you’ve filled in those details click the Review + Create button at the bottom of the screen, and you will be shown a summary of the resource group you are about to create, for example this is mine:


Click the Create button. It will take a few seconds for Azure to create the resource group. Once it’s done though, you will see a small dialog box in the top right corner letting you know the resource has been created like so:


If you want to find out more about your resource group, or configure additional settings you can click the Go to resource group button, but for now we’re done with the resource group.
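If you prefer working from the command line, the same resource group can be created with the Azure CLI in a couple of commands. This is just a sketch; the group name and region below are placeholder values, so substitute your own:

```shell
# Log in to Azure (opens a browser window for authentication)
az login

# Create a resource group; name and region are example values
az group create \
  --name my-blog-rg \
  --location australiaeast
```

Either way you end up with the same resource group, so use whichever you’re more comfortable with.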

📦 Creating a Storage Account

Navigate back to the Azure Portal and using the search bar up the top, we want to look for Storage Accounts:


In the Storage Accounts blade, select Add and a new blade will open:


Next you’ll be prompted to enter some details about your storage account:

  • Subscription: The subscription used in the previous step.

  • Resource group: Select the resource group you made in the previous step.

  • Storage account name: The name you pick must be unique across Azure and must be between 3 and 24 characters long, and can include only numbers and lowercase letters.

  • Region: Where you want your storage account to be located.

  • Performance: Standard.

  • Account kind: Storage V2.

  • Replication: Change to Locally-redundant storage (LRS) - the cheaper option, but one that still affords you a level of redundancy.


The remaining settings covered on the Networking, Data protection, Advanced and Tags blade can be left as is and you can click the Review + Create button at the bottom of the screen. Like the previous section you will be shown a summary of the storage account you are about to create, for example this is mine:


Click the Create button and it will take a few seconds to create the storage account. Once it’s done though, you will see a small dialog box in the top right corner letting you know the resource has been created. Click the Go to storage account button and you will be presented with the storage account blade:


In the side navigation bar you will see a search bar; in there, type Static website:


Click through, then select the Enabled button to turn on a static website for our blob service:


You’ll now be prompted to enter the index document name and the error document path. For now just set the index document to index.html like so:


Click Save, and a URL will be generated:


Browse to this URL to make sure everything is working, keep in mind there won’t be much there to see:


🙌🏼 Congrats! You’ve now set up a blog container that serves a static website! The next step is setting up a CI pipeline, using Azure DevOps.
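For the CLI-inclined, the whole storage account setup above can be sketched in two commands. The account name below is a placeholder; yours must be globally unique, 3-24 characters, lowercase letters and numbers only:

```shell
# Create the storage account (Standard performance, StorageV2, LRS)
az storage account create \
  --name myblogstorage123 \
  --resource-group my-blog-rg \
  --location australiaeast \
  --sku Standard_LRS \
  --kind StorageV2

# Enable static website hosting with index.html as the index document
az storage blob service-properties update \
  --account-name myblogstorage123 \
  --static-website \
  --index-document index.html
```

Enabling static website hosting is what creates the special $web container that our deployment will later copy files into.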

🛠 Creating Your Continuous Integration Pipeline

Welcome to the next phase! Creating your continuous integration pipeline! Start by signing in to, or creating, an Azure DevOps organisation, and create a project (which will contain your repository).

You’ll also want to commit your code to your new repository; this will allow you to test that the build process works as expected before moving onto the deployment section of this tutorial.

Click Pipeline in the side navigation, and click New Pipeline:


Select Azure Repos Git:


Next you will select the repo you want to build:


Select your blog repository, and then Starter pipeline:


This will give you a very simple starter pipeline we will work off to build our site:


The first thing we want to do is specify the version of Python we want to use. Historically I have used a matrix strategy, which generates copies of a job, each with a different input (useful if, for example, you wanted to test your deployment against different Python versions), but I have also included a simpler version where we specify the Python version in the pipeline variables:

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

strategy:
  matrix:
    Python39:
      python.version: '3.9'
      architecture: 'x64'

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use python $(python.version)'

And the simpler version, which reads the Python version from a pipeline variable:

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(pythonVersion)'
  displayName: 'Use python $(pythonVersion)'

Next you’ll need to set the $(pythonVersion) variable. You can do this by clicking the “Variables” button up in the top right corner. Click the ‘+’ button up the top, define a variable named pythonVersion, and set its value to the same Python version defined in your Pipfile!

In our case, since we’re just building a website, either approach is fine and won’t actually impact the build process.

We’ve now introduced the steps directive, which contains task entries that run specific predefined tasks, and script entries that run an arbitrary set of commands. Lucky for us, we’ll be using both!

Now that we have our Python version installed, we need to install the dependencies stored in our pipenv environment. Of course, we need pipenv itself installed before we can install those, so our first call to action is to upgrade pip and then install pipenv!

- script: |
    python -m pip install --upgrade pip
    pip install pipenv
  displayName: 'Install pipenv'


The pipe character (|) is what lets us break our commands out onto multiple lines; without it, a script step holds just a single command, like this:

- script: python -m pipenv install
  displayName: 'Install dependencies'

Now that pipenv is installed we can install our dependencies in the same way we would if we were building our application locally:

- script: |
    python -m pipenv install
  displayName: 'Install dependencies'

We’ll next add in a command line task. The @n suffix tells us the version of the task we are using, so in our case, you’ll notice we are using version 2:

- task: CmdLine@2
  inputs:
    script: pipenv run ablog clean && pipenv run ablog build
    workingDirectory: '$(Build.SourcesDirectory)'
  displayName: 'Ensure clean environment, and rebuild blog'

Once our website has built, we need to copy those files from our source directory to our artifact staging directory:

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/_website/'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
    Contents: '**'
  displayName: 'Copy website files to artifact directory'

Finally, we publish the build artifact staging directory, which allows us to push our HTML files to Azure Pipelines, Microsoft Team Foundation Server, or a file share:

- task: PublishBuildArtifacts@1
  inputs:
    # The folder or file to publish - this can be a fully qualified
    # path or path to the root of a repository (wildcards not supported)
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    # The name of the artifact you want to create, it can be whatever you
    # want!
    ArtifactName: 'drop'
    # Where we store the artifact: we can pick Azure Pipelines
    # (Container), or copy it to a file share (FilePath) that is
    # accessible from the build agent
    publishLocation: 'Container'
  displayName: 'Publish documentation as artifact'

Click the Save and Run button. From here, if there are any errors they will show up in the job logs.
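Putting the pieces together, a complete azure-pipelines.yml (using the simpler, variable-based Python version) would look roughly like this:

```yaml
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(pythonVersion)'
  displayName: 'Use python $(pythonVersion)'

- script: |
    python -m pip install --upgrade pip
    pip install pipenv
  displayName: 'Install pipenv'

- script: |
    python -m pipenv install
  displayName: 'Install dependencies'

- task: CmdLine@2
  inputs:
    script: pipenv run ablog clean && pipenv run ablog build
    workingDirectory: '$(Build.SourcesDirectory)'
  displayName: 'Ensure clean environment, and rebuild blog'

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/_website/'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
    Contents: '**'
  displayName: 'Copy website files to artifact directory'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'
  displayName: 'Publish documentation as artifact'
```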

Make a small change to your blog, such as adding a new blog post, and commit it to your repository. This lets you make sure the build process works as expected, and it will create an artifact that makes selecting what you want to release in the next phase easier.
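For example, a quick test commit to trigger the pipeline might look like this (the file path here is just a placeholder for wherever your posts live):

```shell
# Stage a new post, commit it, and push to trigger the CI pipeline
git add blog/my-new-post.md
git commit -m "Add test post to verify CI build"
git push origin master
```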

If you pop into the next section, we’ll find out how to publish your website every time you push new code to the repo.

🚢 Automating Deployment

Welcome to the last part of this blog! If you’ve stuck around congrats, you’re almost finished, so let’s get stuck into it!!

Click Pipeline in the side navigation, and click Releases:


You’ll be prompted with a large list of deployments you can pick between, but we want to create a new empty release, the option for which is tucked away up the top:


Now we have our empty template, we can select the artifact we want to release to Azure storage by clicking the + Add an artifact square:


Leave the Source Type, and Project as is and select the name of your repository in the Source (build pipeline) dropdown:


Click Add to complete setting the artifact. In our release pipeline, you will now see the artifact has been set:


Next we want to set up our pipeline to deploy our blog every time a new build is available. We do this by clicking the lightning bolt icon in the top right corner of the artifact square:


A new “blade” will pop out and you will be able to set the Continuous deployment trigger to Enabled:


The final step in configuring the automatic release of our pipeline is moving our artifacts to Azure Storage. We start by adding a stage in the main release pipeline view:


Set the name to something meaningful, in this case something like “Publish to Storage”, and make sure, if it isn’t already, that your user is set as the Stage owner:


Save this, and it will create an unpopulated stage in our release pipeline:


To add a task to our stage, click the 1 job, 0 tasks link in the stage box:


This will now allow us to configure the stage and define the actions we want to happen. Start by clicking the + next to the Agent job listing:


This will pop out a blade that lists all the tasks you can complete as part of a release pipeline. You will see there are quite a few options available, so instead of looking through all of them, use the search bar at the top to look for Azure file copy:


Click Add. This will show us more options to configure, but before we start we want to change the task version from v4.* to v3.*. This is primarily because I’ve never had any luck getting the version 4 configuration to work 😣:


Once the version has changed, a few of the options will change and the mandatory ones will be highlighted in red, below I run through how to set these:

Source: The absolute path of the source folder. Using the browse button you are able to browse the source, which allows you to select the _drop directory in your artifact registry:


If you didn’t push code to the repository after creating the build pipeline in the previous section, you may not have a _drop directory readily available to you; in this case I’d recommend closing the dialog box, committing the code, and returning to this step.


Azure Subscription: The Azure Resource Manager subscription to target for copying the files. If this is your first time using your Azure subscription, you will need to click the Authorize button to the right of the text box. Once authorised, you’ll be able to select the subscription you used in the previous step to create the resource group:


Destination Type: Either Azure Blob or Azure VMs. This should be set to Azure Blob.

RM Storage Account: Specify a pre-existing ARM storage account. This should be the name of the storage account you created in the Creating a Storage Account section. You can use the dropdown to pick the right one, so don’t worry if you don’t recall the name.

Container Name: Name of the container for uploading the files. By default when we are uploading content to a static website in a storage container, this should be set to $web.

The remaining settings can be left unconfigured, and you can click the Save button up the very top to save your configuration. You’ll be prompted to select a folder and enter a comment. The folder should be left as /, however feel free to specify anything in the comment:


Click OK and be sure you save all the changes we have made to the pipeline by clicking that Save button in the top right corner! Once you’ve done that, I’d encourage you to once again make some changes to your blog and push them to the repository so you can make sure everything is working as expected. Just keep in mind that the deployment can sometimes take up to a minute, so make sure you’ve given everything adequate time to do its thing.
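As an aside, if you ever need to push the built site to the container manually, say while debugging the release pipeline, a CLI upload along these lines works too (the account name is a placeholder, and _website is assumed to be where ablog puts the built HTML):

```shell
# Upload the built site to the $web container used by static websites
az storage blob upload-batch \
  --account-name myblogstorage123 \
  --destination '$web' \
  --source ./_website
```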

You can also keep track of how the process is going by navigating to the Pipelines > Releases view, which will tell you what stage of the build process your code is up to!

🙌🏻 Congrats! You should now have a fully functioning website that will update every time you push code to your Azure DevOps repo! 🎉

But of course you’re now probably like, “Buffy! Where do I look at my website?!” and I would say “That’s an AMAZING question!”. Do you recall a few sections back, when we set up our static website in Azure? You would have seen a screen similar to the one below:


That primary endpoint is where you can access your website once it has been successfully built! You can get back there by logging into the Azure Portal > Searching for the name of your Storage Account > Clicking through your Storage Accounts management interface, and in the left hand panel looking for “Static website”!
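If you’d rather not click through the portal, the same endpoint can be fetched with the CLI (the account name is a placeholder for your own):

```shell
# Print the static website endpoint for the storage account
az storage account show \
  --name myblogstorage123 \
  --query "primaryEndpoints.web" \
  --output tsv
```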

But before you publish that on your LinkedIn, Twitter or other social media profiles you might want to get a nicer looking URL! ☺️ There is one more part to this series about configuring Azure CDN for your web application, which should be released sometime in the new year, so hopefully I’ll see you back in 2021 for that last installment in this series ♥️.

📚 Find out More