Getting Started with Microsoft AKS – Azure PowerShell Edition

This post covers using Azure PowerShell to get a Microsoft Azure Kubernetes Service (AKS) cluster up and running in your Azure subscription.

In the previous post, we went through the same AKS cluster creation using the Azure CLI.

Which one you choose will depend on your background and usage; if you are familiar with PowerShell, you might choose this option because you are more comfortable with its object-based output. There are lots of posts already out there comparing the Azure CLI and Azure PowerShell (here is one), but I am not going to get into that here.

Install Azure PowerShell

Spoiler alert! To use Azure PowerShell, you are going to need to install it on your system. This article explains how to install Azure PowerShell. Before doing this, confirm whether you already have it installed by running the following command in your PowerShell console.

# Connect to Azure with a browser sign in token
Connect-AzAccount

With the above command, you are either going to get a wall of red text saying the module was not found, or you will be prompted to log in to your Azure account. Alternatively, you can check which modules you have installed with the Get-Module command.
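
If the module is missing, a minimal check and install looks something like this (assuming the standard Az module from the PowerShell Gallery):

#Check whether the Az module is already available
Get-Module -ListAvailable -Name Az.*

#Install the Az module from the PowerShell Gallery if it is not
Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force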


Either way, you need to connect to your Azure Account and authenticate.


Authenticate to the account you wish to use and then you will see the following in the same browser.

[Screenshot: browser confirmation after authentication]

Back in your PowerShell console (I am using Visual Studio Code to run through these commands), I now see the following:

[Screenshot: Connect-AzAccount output in Visual Studio Code]

Variables

I generally like to define some variables before we begin creating our AKS cluster. We will use these variables later in our commands, and you will find the complete script linked at the bottom.

$ResourceGroupName = "CadeAKS"
$ClusterName = "CadeAKSCluster"
$ClusterLocation = "eastus"
$NodeCount = 3

Creating the Azure Resource Group

Next, we need to create a new resource group where our AKS cluster will be hosted. Broadly speaking, an Azure resource group is a container in which resources are deployed and managed; when creating a resource group, you define a name and a location. For more information, you can find that here.

#Create a New resource group
New-AzResourceGroup -Name $ResourceGroupName -Location $ClusterLocation


Creating your AKS Cluster

For this example, I will be using Azure PowerShell to also generate a new SSH public key, but if you wish to create or use an existing key then you can see the detailed process for creating that public SSH key here. The command to create your AKS cluster with an existing SSH key is as follows, obviously pointing to the correct location of your SSH key:

New-AzAksCluster -ResourceGroupName $ResourceGroupName -Name $ClusterName -NodeCount $NodeCount -SshKeyValue 'C:\Users\micha\.ssh\id_rsa'

As I mentioned, I will be creating a new cluster and, with that, also creating new SSH keys using the following command.

#Create the AKS cluster; -GenerateSshKey creates a new key pair used to authenticate to the cluster nodes from the local machine.

New-AzAksCluster -ResourceGroupName $ResourceGroupName -Name $ClusterName -NodeCount $NodeCount -GenerateSshKey -KubernetesVersion 1.19.7


When this is complete, you will get the cluster information posted like below.

[Screenshot: New-AzAksCluster output showing the cluster details]
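
If you need to pull that cluster information back later, the Az.Aks module also has a query cmdlet; a minimal sketch, reusing the variables defined earlier:

#Retrieve the AKS cluster details at any time
Get-AzAksCluster -ResourceGroupName $ResourceGroupName -Name $ClusterName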

Accessing the Kubernetes Cluster

The first part of getting access is making sure you have kubectl available on your system; you can do this by running the command below.

#This will install kubectl; if you already have kubectl on your system this step may not be needed.


Install-AzAksKubectl

Once you have this, we can now import the AKS cluster context to our kubectl configuration to access the cluster.

#Now we need to add our AKS context so we can connect


Import-AzAksCredential -ResourceGroupName $ResourceGroupName -Name $ClusterName -Force


Now if we check the kubectl config contexts, we should see the new AKS cluster listed and set as the current context.
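
The check is the same command used in the Azure CLI post:

#Confirm kubectl has picked up the new AKS context
kubectl config get-contexts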


Deleting the AKS Cluster

When you have finished your testing and learning tasks, I would advise removing your cluster; do not just leave it running unless you really need to. By leaving it running you are going to be spending money, and potentially lots of it.

When you are finished, run the following command, based on what we have created above.

#To Delete your cluster run the following command
Remove-AzResourceGroup -Name $ResourceGroupName -force

At this stage you might also want to delete the SSH public key we created above, which can be done with the following command.

Remove-Item C:\Users\micha\.ssh\id_rsa


You might also find this repository on GitHub useful, where I store my scripts for the above as well as the Azure CLI version covered in the previous post.

Hopefully this will be useful to someone. As always, I am open to feedback, and if I am doing something not quite right then I am happy to be educated and for the community to help us all learn.

Getting started with Microsoft Azure Kubernetes Service (AKS)

In this post we will cover getting started with Microsoft Azure Kubernetes Service (AKS). Much the same as the previous post covering Amazon EKS, we will walk through getting a Kubernetes cluster up and running. We could walk through the Azure Portal, which is pretty straightforward, and if you would like to see that as a walkthrough let me know and I will cover it, but I think the most appropriate approach is gearing up for Infrastructure as Code.

I took the scientific approach over the weekend and asked whether I should use the Azure CLI or Azure PowerShell, and the Twitter-verse responded with their views and opinions on which one they use and why. Let's rewind slightly: we have four options when it comes to Microsoft AKS, namely the Azure Portal, the Azure CLI, Azure PowerShell or an ARM template. ARM templates will come later along with Terraform, but I wanted to understand a little more about the components you need to get up and running. If you didn't respond, then head over to Twitter and let me know your choice and why. Neither is the wrong answer.

[Screenshot: Twitter poll results for Azure CLI vs Azure PowerShell]

Based on the thread and feedback, I decided I should probably take both options and simply walk through the configuration of each. This post covers the Azure CLI, and I will make sure there is another post covering Azure PowerShell.

Getting Started with Azure CLI

Start by downloading the Azure CLI. I am using Windows, but as always there are options for macOS and Linux, all found in the same location. Another option is the Azure Cloud Shell, which you may already have if you are using the Windows Terminal. I believe this is effectively the same as the Azure CLI, but more details can be found here from Sarah Lean, aka Techie Lass.


Once you have the Azure CLI downloaded and installed, open your PowerShell window and you should be able to run the following command. This confirms that the Azure CLI is now installed on your system. It also means I can work with the Azure CLI from my PowerShell console rather than having to open a specific tab in Windows Terminal. If you run az version in your Azure Cloud Shell, you will get something very similar, which is why I think the two are the same.
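
That check is simply:

#Confirm the Azure CLI is installed and report its version
az version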


The first thing we need to do is connect to our Azure account. We can do this by running the following command.

#Web Browser will open to authenticate against your subscription


az login

This is going to open a new browser window in which you will need to authenticate to gain access and be able to start building things.


Once you have authenticated, your browser will report the following:

[Screenshot: browser confirmation after az login authentication]

I am using Visual Studio Code; once you have authenticated you will see your subscription ID, as per below.

[Screenshot: az login output in Visual Studio Code showing the subscription ID]

Creating the Azure Resource Group

Next, we need to create a new resource group where our AKS cluster will be hosted. Broadly speaking, an Azure resource group is a container in which resources are deployed and managed; when creating a resource group, you define a name and a location. For more information, you can find that here.

#Create a New resource group


az group create --name AKSResourceGroup --location eastus


With the above we have created a resource group in East US. By running the following command, you can output a JSON file containing the Azure region data. All geocode locations for Azure regions are estimates; they do not represent the actual physical location of specific data centres.

az account list-locations > azure_regions.json

Creating your AKS Cluster

For this example, I will be using the Azure CLI to also generate a new SSH public key, but if you wish to create or use an existing key then you can see the detailed process for creating that public SSH key here. The command to create your AKS cluster with an existing SSH key is as follows, obviously pointing to the correct location of your SSH key:

#If you would like to use existing SSH keys


az aks create --resource-group AKSResourceGroup --name myAKSCluster --node-count 3 --ssh-key-value C:\Users\micha\.ssh\id_rsa

As I mentioned, I will be creating a new cluster and, with that, also creating new SSH keys using the following command.

#If you would like to generate new SSH Keys


az aks create --resource-group AKSResourceGroup --name myAKSCluster --node-count 3 --generate-ssh-keys

Once the above command is complete, you are going to get JSON output describing what you have just built, which will look something like the following. In here you will notice some important settings that you did not define but could in the future, to tailor your cluster to your requirements.

[Screenshot: JSON output from az aks create]

We are using orchestrator (Kubernetes) version 1.18.14; you could define this explicitly with the following flag:

[--kubernetes-version]

We are using a VM size of “Standard_DS2_v2”; you could change this using the following flag:

[--node-vm-size]

There are lots of other options, as you can imagine, so it is worth spending some time here understanding the variables you could use.
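
As a sketch, pulling those two flags into the earlier create command would look something like this (the version and VM size values here are just examples):

#Create the cluster with an explicit Kubernetes version and node VM size
az aks create --resource-group AKSResourceGroup --name myAKSCluster --node-count 3 --generate-ssh-keys --kubernetes-version 1.18.14 --node-vm-size Standard_DS2_v2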

Accessing the Kubernetes Cluster

Now that we have our running AKS cluster, we want to access it to start deploying our apps. First, we need to merge the cluster credentials into our kubectl configuration, which we do with the following command. At this point, make sure you have kubectl installed on your system.

#Merge AKS Cluster with current Kubectl Configuration


az aks get-credentials --resource-group AKSResourceGroup --name myAKSCluster

You can then confirm this by running:

#Confirm kubectl has new config


kubectl config get-contexts


At this stage, we have access to our cluster.


If you are just getting started, then I have to say the Microsoft quickstarts are the place to be: super simple and easy walkthroughs, with some great examples of getting applications up and running from start to finish. You can find a specific one on the Azure CLI and AKS here.

Deleting the Kubernetes cluster

When you have finished your testing and learning tasks, I would advise removing your cluster; do not just leave it running unless you really need to. By leaving it running you are going to be spending money, and potentially lots of it.

When you are finished, run the following command, based on what we have created above.

#Delete the Cluster


az group delete --name AKSResourceGroup --yes --no-wait

For peace of mind, you can then also double-check the portal to make sure all the resources within the resource group are also being deleted.


You might also find this repository on GitHub useful, where I store my scripts for the above as well as the Azure PowerShell version, which I will cover in another post.

Hopefully this will be useful to someone. As always, I am open to feedback, and if I am doing something not quite right then I am happy to be educated and for the community to help us all learn.

Veeam Direct Restore to Microsoft Azure, It is not new but…

What if I told you that you could take any Veeam image-based backup and convert/restore it to an Azure virtual machine, without requiring any additional storage or file system within Azure other than the disks and resources needed to run that virtual machine (or machines)?

And what if I told you this has been around for years with Veeam Backup & Replication? Veeam has had this capability since 2016, in fact.

The primary use cases that we have seen are the following.

Test and development

When you have the public cloud at your fingertips, why not take advantage of it instead of having to purchase specific test and development environments? It is also a perfect option if you just want to see how certain apps and workloads are going to run in Microsoft Azure.

Data Migration

Let's say you know where you are going and that is Microsoft Azure: how are you going to get those workloads there in a fast and efficient manner? Direct Restore to Microsoft Azure provides a fast way to restore those backups to the public cloud without compromising on keeping the restore points, and, more to the point, you can still roll back to the production systems you have on premises.

Data recovery

We tend to talk about the bad failure scenarios, or we think nothing will happen to us, and we do not really touch on the in-between. What if you lost half your production virtualisation servers due to an outage of some description? What would you do? This feature within Veeam Backup & Replication enables you to restore some of your workloads from backups into Microsoft Azure; you can then use an existing VPN or other connectivity to join the environments and continue working, or you could use VeeamPN to achieve this.

Walkthrough

In this YouTube video I walk through how easy and simple it is to get those image-based backups restored into Microsoft Azure as native Azure VMs for some of those use cases mentioned above. This also ties into the Veeam Backup for Microsoft Azure that was released this week.

Where should I run the conversion process?

I ran some tests in my lab to determine where and how best to run the conversion process when restoring workloads into Microsoft Azure. Veeam offers a lot of choice when it comes to restores and how to work around environmental challenges, such as link speed to the public cloud due to location or other reasons. Since the release of this feature back in 2016 there have also been many other enhancements and features added to Veeam Backup & Replication, including the new Veeam Cloud Tier, which gives us the ability to store our backups in object storage; we can also recover from those as well. The video linked below goes into more detail around the considerations you should take into account when looking to restore workloads to the public cloud.

Cloud Tier

We have spoken about protecting native Azure VMs using Veeam Backup for Microsoft Azure, and about getting your image-based backups from virtual or physical platforms on premises, or even in other public clouds, into Microsoft Azure. So I had to mention Cloud Tier (or Capacity Tier), which lets us tier or copy our backups into Microsoft Azure Blob Storage for long-term retention or an offsite copy of your data.

Couple all these features together and we have a pretty dynamic and flexible way of moving data to, from and within the public clouds.

If you have any questions, comments or feedback on the videos, then please let me know either here in the comments, on the YouTube channel or on Twitter. A side note here: I will be creating more video content over the next few weeks whilst we are stuck at home. I for one have been consuming a lot more of my news and education through YouTube, and judging by the uptake in subscriptions I think you are too, so let me know anything you want to see or would like me to walk through.

Veeam Backup for Microsoft Azure

Last week Veeam released version 1 of Veeam Backup for Microsoft Azure.

What is Veeam Backup for Microsoft Azure?

This new product focuses on the Azure IaaS workloads you have running in the public cloud. Much like Veeam Backup for AWS, which was released earlier this year, it provides the ability to protect those Azure VMs without having to install an agent on each one. It takes a policy-driven approach, allowing both snapshots and backups to be part of your data management plan when it comes to Microsoft Azure.

The product is a standalone solution that is deployable from the Microsoft Azure Marketplace, with a very easy to use, wizard-driven approach to configuration and management. Veeam Backup for Microsoft Azure Free Edition and the subsequent editions are available within the Microsoft Azure Marketplace.


The FREE edition allows you to protect 10 Azure VMs using native snapshots and then tier those snapshots to an Azure Blob Storage repository.

Within the Azure Blob Storage repository these backups are stored in the portable data format that sets Veeam apart from the other vendors in this space. This allows the Veeam Backup & Replication External Repository feature to be leveraged, enabling further data protection or other tasks such as migrations or on-premises data recovery.

As you would expect, the offering also allows you to recover those Azure virtual machines not only back to where they initially resided but also across accounts and even across regions, as well as providing file-level recovery for a more granular option.

Another cool feature is the level of cloud cost visibility: when you create your policies through the wizard-driven approach, you start to see cost forecasting so you can make better decisions about your cloud cost consumption.

Policies, Workers & Protected Data

Those familiar with Veeam will notice a different approach to some of the key functions and naming; you can liken these new terms to those found in Veeam Backup & Replication, although they have some differences.

Those familiar with Veeam Backup & Replication will recognise Policies as something more commonly known as Backup Jobs; however, even within the Veeam Backup & Replication world we are seeing policies enter the fold, with the CDP policy coming in later releases.

Policies give you the ability to define several requirements when it comes to your cloud data management. But again, it is that same very easy to use wizard driven approach that all Veeam customers will be familiar with.

You can choose to protect everything in a region, or you can be granular about what to protect. An awesome feature here is that you can select either by instance or by tag. Tags lend themselves well to the fast-moving pace of cloud instances being spun up and spun down all the time, and the ability to use them means we can protect workloads in a more dynamic fashion. We will demonstrate how easily these tags can be created within Azure and used for your data management needs.
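
As an illustration of how lightweight tagging is on the Azure side, adding a tag to an existing VM with the Azure CLI looks roughly like this (the resource group, VM name and tag are placeholders, not anything the Veeam product requires):

#Add a tag to an existing Azure VM so a tag-based policy can pick it up
az vm update --resource-group MyResourceGroup --name MyVM --set tags.backup=daily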

I mentioned above how snapshots and backups are used together in this product to provide the best of both worlds: fast recovery points, plus an out-of-band copy of your data that is not linked to the original VM.

For some workloads you may wish to take only snapshots, for others only backups, or both. Snapshot settings allow you to define when snapshots will be taken and how many you intend to keep. Backup settings are where we define the Microsoft Azure Blob Storage repository in which we wish to store those backups; this also plays a part in making that data visible within Veeam Backup & Replication if you wish to see it there. You have the same retention setting to define here.

The workers are configured during the initial setup of Veeam Backup for Microsoft Azure. Those familiar with Veeam Backup & Replication could liken these worker nodes to the Veeam backup proxy component within VBR.

The worker is a Linux-based instance that is deployed and used when data needs to be transferred, for both backup and recovery. When the policy run is complete, the workers are shut down but remain in place for the next scheduled policy.

Cost Estimations

A unique feature built into Veeam Backup for Microsoft Azure Free Edition (and of course the other editions) is the ability to estimate the cost of taking backups and storing them for the retention you have defined. This is something we go into in further detail in the video walk-through below.

As I have mentioned, this post gives a very high-level overview of what you can find in the new product, but if you would like to see more then I have created a walk-through below. For any comments, please comment here, on the YouTube video, or find me on Twitter.

Let me know what you think of the YouTube walk-throughs; it is something I intend to really increase, given that we are housebound and I have more time to create this content.

#SummerTraining – Options for data in the cloud

The next #ignitethetour training session I took was with Cecil Philip of Microsoft. Data has been of huge interest to me for my whole IT career: knowing where that data is stored, whether for production, backup or analytics, regardless of whether it is on premises, in the public cloud or hosted by a service provider.

With the options we have available today to store our personal data, our mission-critical enterprise data and everything in between, we have so much choice.

This session was focused on how the cloud could help when it comes to storing your data in Microsoft Azure.

Three key things the session enabled viewers to go away with were:

  • Understand the type of data you have
  • Azure has hosted options for databases
  • Your data solution should be able to grow with you

What is important for you – the customer

There are thousands of things that will be specific to each customer, but many will be very similar:

  • How can we make things faster?
  • Limit or mitigate risk when deploying new services
  • Putting more control in the hands of the developers in the organisation
  • Scalability
  • Using the right tool for the job and potentially being able to pivot when need be

Should we have a storage strategy?

The session then moved into why a storage strategy is important. This is something a good friend of mine, Paul Stringfellow, has been speaking about on both his blog and his podcasts, and it applies to all customers, not just large enterprises: every environment should have a storage or data strategy.

We should always be considering:

  • Maintaining Security
  • Breaking down data and storage services into manageable sets
  • The lifespan of the data, where it needs to be and for how long

Before we start

What data do you have?

  • Structured Data – data that has been organised into a formatted repository, typically a database, so that its elements can be made addressable for more effective processing and analysis. A data structure is a kind of repository that organizes information for that purpose.
  • Unstructured Data – information that either does not have a pre-defined data model or is not organized in a pre-defined manner. Unstructured information is typically text-heavy, but may contain data such as dates, numbers, and facts as well.
  • Semi Structured Data – a form of structured data that does not obey the formal structure of data models associated with relational databases or other forms of data tables, but nonetheless contains tags or other markers to separate semantic elements and enforce hierarchies of records and fields within the data.

How much data do you have?

  • Volume
  • Variety
  • Velocity

Azure Storage Services

There is quite the storage offering when it comes to Microsoft Azure and it’s important to understand the options for those data types and what should be stored where.


What is Azure Blob Storage?

Azure’s Object Storage platform used to store and serve unstructured data.

  • App and Web scale data
  • Backups and Archive
  • Big Data from IoT, Genomics, etc.

My interest here instantly went toward the backup mention above, and in particular how Veeam has been leveraging object storage for backup data across its platform: storing backups for long-term retention, direct copies, and copies of your Microsoft Office 365 backup data. Some of the characteristics that come with object storage, and especially Microsoft Azure Blob Storage, are listed below, followed by a quick example of getting started.

  • Infinite scale
  • Globally accessible
  • Cost efficient
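
As a quick illustration of how little it takes to get started, creating a storage account and a Blob container with the Azure CLI looks roughly like this (the names and region are just examples):

#Create a general-purpose v2 storage account
az storage account create --name mystorageaccount --resource-group MyResourceGroup --location eastus --sku Standard_LRS --kind StorageV2

#Create a Blob container inside it (requires data-plane permissions on the account)
az storage container create --name backups --account-name mystorageaccount --auth-mode login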

Databases – Relational Databases

Relational databases have many different options within Microsoft Azure.

The first option is to take your on-premises SQL or other relational databases and migrate those VMs or workloads to Microsoft Azure. But this is likely not going to be the best route to take because of cost and management overhead.

The more compelling route is the PaaS offerings from Azure. These could be any of the following, and I am sure new services will appear as and when the demand is great enough.

  • Azure SQL Database
  • SQL Data Warehouse
  • PostgreSQL
  • MySQL
  • MariaDB

All of these PaaS offerings still leverage the Azure compute and storage layers, but they allow many other Azure services to work with these databases.
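
For a feel of how little infrastructure you manage with the PaaS route, standing up an Azure SQL Database with the Azure CLI looks roughly like this (the server name, credentials and service tier are placeholders):

#Create a logical SQL server
az sql server create --name my-sql-server --resource-group MyResourceGroup --location eastus --admin-user sqladmin --admin-password "ChangeMe12345"

#Create a database on that server
az sql db create --resource-group MyResourceGroup --server my-sql-server --name mydb --service-objective S0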

Cosmos DB

Azure Cosmos DB – A globally distributed, massively scalable, multi-model database service

A NoSQL database is different to what we just mentioned with SQL or other relational databases.


I actually want to learn more about Azure Cosmos DB; the introduction in this session was great and opened my eyes to it being not just another flavour of NoSQL database but potentially an aggregation of existing NoSQL database models. I need to learn more on this another time.

I am really interested in the distributed nature of these databases and the ease of having a write region and then additional read regions across the world, or at least in different locations. However, you can also have multi-region writes, which will help with scale.
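
A rough sketch of that with the Azure CLI, creating a Cosmos DB account with two regions and multi-region writes enabled (the account name and regions are just examples):

#Create a Cosmos DB account with a primary write region, a second region and multi-region writes enabled
az cosmosdb create --name mycosmosaccount --resource-group MyResourceGroup --locations regionName=eastus failoverPriority=0 --locations regionName=westeurope failoverPriority=1 --enable-multiple-write-locations true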

Resources

All of this is really well covered in the Azure Documentation – https://aka.ms/apps20ignite

Another thing I had to share was the learning paths for this session alone: almost 15 hours of training! This is hands-on, interactive training with all the learning and none of the billing!

Session Resources

Session Code on GitHub including presentation

All Events Resources

#SummerTraining – Options for building and running your app in the cloud

Well, it is summer some place, but this learning curve has been going on since the summer in England, when I really wanted to take some of the pre-events-season downtime and learn something new. This has spanned a wide range of new and upcoming technologies, some of which I have not even written about yet but have been looking at, I promise.

A big focus on:

  • Containers & Kubernetes
  • Public Cloud Hyperscalers (Microsoft Azure, AWS and Google Cloud Platform)
  • Infrastructure as Code & Automation

My aim for the public cloud, and in particular Microsoft Azure, was to get a better understanding of the why. Why would some of our existing customers want to, or need to, move to Microsoft Azure, and what options do they have in doing so?

The level of education I am aiming for is a foundation that allows me to better understand, in all three of the aforementioned public cloud hyperscalers:

  • Compute
  • Storage
  • Security
  • Networking

The idea is not to sit all the certifications and become a master of any or all of them (that would be insane), but an understanding is required to be able to have those conversations in the field with our customers and prospects.

My Azure learning started with the Ignite sessions, all available online. I have to say Microsoft really do nail the production quality and how quickly this content goes online straight after the events have happened. It was the first Ignite on tour, or at least the one in London, that got me interested, and although I could not attend the live show, I was able to grab the agenda.

This first write-up will touch on the “Getting Started” track and will focus on the session delivered by Frank Boucher called “Options for Building and Running Your App in the Cloud”. The session touches on the options available, and on security, as the first steps to understanding and leveraging the public cloud for what it was built for.

Frank's first comment, and I think this is a solid way of thinking about cloud technologies: there are no bad choices… there is no bad first step. But I will add that the purpose and requirements of the data and use case have to be clear. If the data is important, make sure it is protected against failure.

There are always plenty of options; just get started, and you can (or should) always be able to move to other options or find better ways to improve your application or achieve your purpose.

Deployment Tools

Visual Studio IDE

Modular and very versatile for QA/test, with options regardless of your programming language. It is multi-platform and also available as a cloud version, Visual Studio Online.

  • Multi-Platform (Windows & Mac)
  • Customisable workloads
  • Multi-Language
  • Tons of Extensions
  • Live Share – Real time collaborative development!

Visual Studio Code

A lighter version of the previously mentioned IDE. Fewer features, but still powerful:

  • Git commands built in
  • Extensible and customisable
  • Full support for all platforms (Linux, Mac and Windows)

Terminal & CLI

  • Cloud Shell
  • Azure CLI
  • Azure PowerShell

ARM Templates

Azure Resource Manager (ARM) templates are where we meet infrastructure-as-code functionality: version control and a fast way to deploy resources in a declarative model, without having to manually deploy our infrastructure.

  • Architecture / Infrastructure as code
  • Version Control
  • Fastest Way to deploy

ARM templates might be a completely new way of working for many infrastructure administrators, but I have to say the Microsoft documentation in this area is amazing.

Aka.ms/azArm
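
Deploying one of those templates is a single CLI call; a minimal sketch, assuming a template file named azuredeploy.json in the current folder:

#Deploy an ARM template into an existing resource group
az deployment group create --resource-group MyResourceGroup --template-file azuredeploy.json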

Deployment options

Now we know some of the tools available. There are others, but I wanted to focus on the Microsoft options, as I personally believe there is a strong focus on these, especially when it comes to infrastructure as code. You may want to be agnostic about where you run your deployment, and for that something like Terraform from HashiCorp is a great option across multiple platforms.

Let’s take a website as the example of what we want to consider deploying. There are many options available.

Azure Blob Static Websites

  • Very low cost – Cheapest option
  • Fast
  • Static – although as well as plain HTML this can also host more complex options built with Angular or React (see the example after this list)
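
For context, enabling the static website feature on a storage account and uploading content with the Azure CLI looks roughly like this (the account name and paths are placeholders):

#Enable static website hosting on a storage account
az storage blob service-properties update --account-name mystorageaccount --static-website --index-document index.html --404-document error.html

#Upload the site content into the special $web container
az storage blob upload-batch --account-name mystorageaccount --source ./site --destination '$web'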

PaaS (Web Apps)

PaaS removes the requirement to manage the architecture at a deep level; scaling, backup, disaster recovery and other platform tasks are now managed by the service. A quick CLI sketch follows the list below.

  • Client Side & Server Side
  • PaaS Features
  • Windows & Linux
  • Many Languages Supported (.NET, Java, PHP, Ruby, Python… etc.)
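
A minimal sketch of creating a Web App with the Azure CLI, assuming an App Service plan and app name of your own (runtime and OS options can be added to both commands as needed):

#Create an App Service plan
az appservice plan create --name MyPlan --resource-group MyResourceGroup --sku B1

#Create the Web App on that plan
az webapp create --name my-unique-webapp-name --resource-group MyResourceGroup --plan MyPlan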

Containers

There are a couple of container options when it comes to Azure:

  • ACI – Azure Container Instance
  • AKS – Azure Kubernetes Service

There are many different use cases between the two offerings above, but also some overlap. I am not going to get into the benefits and functionality of AKS or Kubernetes in general, but if you are looking to run a very small or very simple application or service then ACI is going to be a great choice. If you require scale, deeper configuration choices and orchestration for your containers, then AKS will be the likely choice.
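
To make the ACI option concrete, running a single public container with the Azure CLI looks roughly like this (the image is Microsoft's sample hello-world container; the names are placeholders):

#Run a single container instance with a public DNS name
az container create --resource-group MyResourceGroup --name aci-demo --image mcr.microsoft.com/azuredocs/aci-helloworld --dns-name-label aci-demo-example --ports 80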

Virtual Machines

What if you already have the web server configured and working in a different location, maybe on premises running in VMware as a virtual machine? You don't have time to change this, but you want to get to Azure; that is also possible.

Veeam has the ability, even in the free version, to directly restore image-based backups to Azure.

Shared Image Gallery

There is also a gallery that contains different images: various operating systems and versions for both Windows and Linux. Some of these images also contain application deployments:

  • Databases
  • Web Servers
  • Development Tools

Basic Security Features

Security has to be considered at this stage of the project; it should not be an afterthought. You may start out as the only developer/operations engineer, but then you scale out and out and out, meaning that sharing security keys and passwords over messenger apps becomes a complete vulnerability in your process.

Azure Key Vault

Azure Key Vault is a cloud service for safeguarding encryption keys and application secrets for your cloud applications.

Azure Key Vault focuses on a clear separation of security duties, meaning that the role attributed to security can be in charge of and manage the important security aspects:

  • Encryption Keys
  • Secrets
  • Certificates

Meanwhile, app owners can consume and use the certificates in their applications, and your deployment remains secured and segregated. A few example commands follow the list below.

  • Manage all of your secrets in one place
  • Seamlessly move between Development, QA, and Production environments
  • Update credentials in one spot and update everyone’s credentials
  • Version, enable, and disable credentials as needed
  • Add a credential and it’s instantly available to all Developers in a controlled manner
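
To give a flavour of how simple the consumption side is, creating a vault and storing and retrieving a secret with the Azure CLI looks roughly like this (the vault and secret names are placeholders):

#Create a Key Vault
az keyvault create --name my-example-vault --resource-group MyResourceGroup --location eastus

#Store a secret in it
az keyvault secret set --vault-name my-example-vault --name DbPassword --value "SuperSecret123"

#Read the secret back
az keyvault secret show --vault-name my-example-vault --name DbPassword --query value -o tsv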

Managed Service Identity (MSI)

OK, so Azure Key Vault sounds great, but how do we get into it to use the secrets and certificates that have just been mentioned? How do we authenticate to AKV?

So we need credentials to get credentials…


Your deployment is registered with Azure; this can be that VM, Function or anything else we mentioned in the Deployment Options above. A local endpoint is exposed, but it is only accessible from within the host itself, and it allows the workload to obtain valid credentials for the key vault.
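
A hedged sketch of wiring that up for a VM with the Azure CLI: enable a system-assigned managed identity, then grant it access to the vault (names are placeholders, and this shows the classic access-policy model rather than Azure RBAC):

#Enable a system-assigned managed identity on the VM
az vm identity assign --resource-group MyResourceGroup --name MyVM

#Allow that identity to read secrets from the vault (use the principal ID returned by the previous command)
az keyvault set-policy --name my-example-vault --object-id <principal-id> --secret-permissions get list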

Loads more reading material at aka.ms/docAAD on Azure Active Directory.

Resources

Session Resources

Session Code on GitHub including presentation

All Events Resources
