cloud – vZilla – https://vzilla.co.uk – One Step into Kubernetes and Cloud Native at a time, not forgetting the world before

Automated deployment of Veeam in Microsoft Azure – Part 2
Thu, 27 Aug 2020 – https://vzilla.co.uk/vzilla-blog/automated-deployment-of-veeam-in-microsoft-azure-part-2

The first part of this series was aimed at getting a Veeam Backup & Replication Azure VM up and running from the Azure Marketplace using Azure PowerShell – a really quick and easy way to spin the system up.

The use case we are talking about is the ability to recover your backups, perhaps from on premises, up into Microsoft Azure.

I was asked, “what about AWS?” Yes, of course: if you are using the capacity tier option within Veeam Backup & Replication on premises, and you are using the copy mode function to land a copy of your backups on AWS S3, IBM Cloud or any S3-compatible storage, then there could be similar synergies in doing this in AWS. I chose Microsoft Azure simply because there is an Azure Marketplace offering we can take advantage of.

If you would like to see a similar series with AWS, then let me know either on Twitter or in the comments below. It would involve a different way of automating the provisioning of a Windows OS and the installation of Veeam Backup & Replication, but that is not too hard: we already have this functionality using Terraform & CHEF for vSphere, and the code can be changed to work with AWS and really any platform that requires it.

Veeam Configuration

If you followed Part 1 of this series, you will now have your Veeam server running in Azure with no Veeam configuration.

In order to automate the direct restore process we need to provide some details in the script, which I will share in stages and in full at the end of the post. At a high level we need to:

Add Azure Storage Account
Import Backups
Add Azure Compute Account

Then we will take those backups and run Direct Restore to Microsoft Azure against the appropriate ones, leaving them in a converted state ready to be powered on; alternatively, you can choose to power them on as part of this script process.

Firstly we need to add the Veeam snap-in and connect to the local Veeam Backup & Replication server. Depending on where you run this script, you will need to change "localhost" below to the relevant DNS name or IP address. My recommendation is to run it on the server itself, but I am exploring how this PowerShell script could be hosted on your network (not publicly) and used that way to fill in the secure details.


Add-PSSnapin VeeamPSSnapin

#Connects to Veeam backup server.
Connect-VBRServer -server "localhost"

Next we will add the Microsoft Azure Compute Account. This command will prompt you to log in and authenticate to Microsoft Azure. I use MFA, so this was the only way I could find to achieve this.


#Add Azure Compute Account

Add-VBRAzureAccount -Region Global

Next we will add the storage account. You will need to update the script with the details below.

Access Key – this is based on a storage account that you have already created; you will need the long access key for authentication.

Azure Blob Account – this is the name of the blob storage account you previously created. It is the same blob account and process that you used when adding Microsoft Azure Blob Storage to Veeam Backup & Replication on premises.


#Add Azure Storage Account

$accesskey = "ADD AZURE ACCESS KEY"
 
$blob1 = Add-VBRAzureBlobAccount -Name "AZUREBLOBACCOUNT" -SharedKey $accesskey
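If you would rather not leave the access key hardcoded in the script, one option is to prompt for it at runtime as a secure string. This is only a sketch of an alternative I have not made part of the main script; the `$secureKey` variable and prompt text are my own additions.

```powershell
# Hypothetical alternative: prompt for the storage access key at runtime
# instead of storing it in the script in plain text.
$secureKey = Read-Host -Prompt "Enter the Azure storage access key" -AsSecureString

# Convert the secure string back to plain text only at the point of use.
$accesskey = [System.Net.NetworkCredential]::new("", $secureKey).Password

$blob1 = Add-VBRAzureBlobAccount -Name "AZUREBLOBACCOUNT" -SharedKey $accesskey
```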

Now we need to add our capacity tier; this is where you have been sending those backups.


#Add Capacity Tier (Microsoft Azure Blob Storage) Repository

$account = Get-VBRAzureBlobAccount -Name "AZUREBLOBACCOUNT"
 
$connect = Connect-VBRAzureBlobService -Account $account -RegionType Global -ServiceType CapacityTier

$container = Get-VBRAzureBlobContainer -Connection $connect | where {$_.name -eq 'AZURECONTAINER'}

$folder = Get-VBRAzureBlobFolder -Container $container -Connection $connect

The next part of adding the capacity tier is important, and I have also called it out in the script: this repository needs to be added with exactly the same name that it has in your production Veeam Backup & Replication server.


#The name needs to be exactly the same as you find in your production Veeam Backup & Replication server
$repositoryname = "REPOSITORYNAME"

Add-VBRAzureBlobRepository -AzureBlobFolder $folder -Connection $connect -Name $repositoryname

Next we need to import and rescan those backups that are in the Azure Blob Storage.


#Import backups from Capacity Tier Repository

$repository = Get-VBRObjectStorageRepository -Name $repositoryname

Mount-VBRObjectStorageRepository -Repository $repository
Rescan-VBREntity -AllRepositories

Now if you are using encryption then you will need the following commands instead of the one above.


#if you have used an encryption key then configure this section

$key = Get-VBREncryptionKey -Description "Object Storage Key"
Mount-VBRObjectStorageRepository -Repository $repository -EncryptionKey $key

At this point, if we were to jump into the Veeam Backup & Replication console, we would see our storage and compute accounts added to the Cloud Credential Manager, the Microsoft Azure Blob Storage container added to our backup repositories, and on the home screen the object storage (imported), which is where you will also see the backups that reside there.

Next we need to create the variables in order to start our Direct Restore scenarios to Microsoft Azure.

A lot of the variables are quite self-explanatory, but as a brief overview you will need to change the following to suit your backups.

VMBACKUPNAME = which VM you want to restore.

AZURECOMPUTEACCOUNT = the Azure Compute Account you added to Veeam Backup & Replication at the beginning of the script.

SUBSCRIPTIONNAME = you may have multiple subscriptions on one Azure compute account; pick the appropriate one here.

STORAGEACCOUNTFORRESTOREDMACHINE = the Azure storage account the backup will be converted into.

REGION = which Azure region you would like this to be restored to.

$vmsize = where you define what size of Azure VM you wish to use. In this example Basic_A0 is being used; you can change this to suit your workload.

AZURENETWORK = the Azure Virtual Network you wish this converted machine to live in.

SUBNET = the subnet the machine should live in.

AZURERESOURCEGROUP = the Azure Resource Group you wish the VM to live in.

NAMEOFRESTOREDMACHINEINAZURE = possibly a different naming convention, but this is what you wish to call your machine in Azure.


 #This next section will enable you to automate the Direct Restore to Microsoft Azure

$restorepoint = Get-VBRRestorePoint -Name "VMBACKUPNAME" | Sort-Object -Property CreationTime -Descending | Select-Object -First 1

$account = Get-VBRAzureAccount -Type ResourceManager -Name "AZURECOMPUTEACCOUNT"

$subscription = Get-VBRAzureSubscription -Account $account -name "SUBSCRIPTIONNAME"

$storageaccount = Get-VBRAzureStorageAccount -Subscription $subscription -Name "STORAGEACCOUNTFORRESTOREDMACHINE"

$location = Get-VBRAzureLocation -Subscription $subscription -Name "REGION"

$vmsize = Get-VBRAzureVMSize -Subscription $subscription -Location $location -Name Basic_A0

$network = Get-VBRAzureVirtualNetwork -Subscription $subscription -Name "AZURENETWORK"

$subnet = Get-VBRAzureVirtualNetworkSubnet -Network $network -Name "SUBNET"

$resourcegroup = Get-VBRAzureResourceGroup -Subscription $subscription -Name "AZURERESOURCEGROUP"

$RestoredVMName1 = "NAMEOFRESTOREDMACHINEINAZURE"
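If you are unsure which VM sizes are available to restore to in your chosen region, the same cmdlet used above can enumerate them. A quick sketch, assuming `$subscription` and `$location` are already populated as in the script:

```powershell
# List the Azure VM sizes Veeam can restore to in the selected region,
# so you can pick an alternative to Basic_A0 if needed.
Get-VBRAzureVMSize -Subscription $subscription -Location $location
```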

Now we have everything added to Veeam Backup & Replication, and we have all the variables for the machines we wish to convert and recover to Microsoft Azure VMs. Next is to start the restore process.


Start-VBRVMRestoreToAzure -RestorePoint $restorepoint -Subscription $subscription -StorageAccount $storageaccount -VmSize $vmsize -VirtualNetwork $network -VirtualSubnet $subnet -ResourceGroup $resourcegroup -VmName $RestoredVMName1 -Reason "Automated DR to the Cloud Testing"
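As mentioned earlier, the restored machine lands in a converted state ready to be powered on. If you want the script to power it on automatically, a hedged sketch using the Az module follows; it assumes you are signed in with Connect-AzAccount and that the resource group name matches the one used in the restore above.

```powershell
# Optional: power on the restored VM once the conversion has completed.
# "AZURERESOURCEGROUP" is the same placeholder used in the variables above.
Start-AzVM -ResourceGroupName "AZURERESOURCEGROUP" -Name $RestoredVMName1
```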

The full script follows here.


#This script will automate the configuration steps of adding the following steps
#Add Azure Compute Account
#Add Azure Storage Account
#Add Capacity Tier (Microsoft Azure Blob Storage) Repository
#Import backups from Capacity Tier Repository
#This will then enable you to perform Direct Restore to Azure the image based backups you require.

Add-PSSnapin VeeamPSSnapin

#Connects to Veeam backup server.
Connect-VBRServer -server "localhost"

#Add Azure Compute Account

#Need to think of a better way to run this as this will close down PowerShell when installing
msiexec.exe /I "C:\Program Files\Veeam\Backup and Replication\Console\azure-powershell.5.1.1.msi"

Add-VBRAzureAccount -Region Global

#Add Azure Storage Account

$accesskey = "ADD AZURE ACCESS KEY"
 
$blob1 = Add-VBRAzureBlobAccount -Name "AZUREBLOBACCOUNT" -SharedKey $accesskey

#Add Capacity Tier (Microsoft Azure Blob Storage) Repository

$account = Get-VBRAzureBlobAccount -Name "AZUREBLOBACCOUNT"
 
$connect = Connect-VBRAzureBlobService -Account $account -RegionType Global -ServiceType CapacityTier

$container = Get-VBRAzureBlobContainer -Connection $connect | where {$_.name -eq 'AZURECONTAINER'}

$folder = Get-VBRAzureBlobFolder -Container $container -Connection $connect

#The name needs to be exactly the same as you find in your production Veeam Backup & Replication server
$repositoryname = "REPOSITORYNAME"

Add-VBRAzureBlobRepository -AzureBlobFolder $folder -Connection $connect -Name $repositoryname

#Import backups from Capacity Tier Repository

$repository = Get-VBRObjectStorageRepository -Name $repositoryname

Mount-VBRObjectStorageRepository -Repository $repository
Rescan-VBREntity -AllRepositories

#if you have used an encryption key then configure this section

#$key = Get-VBREncryptionKey -Description "Object Storage Key"
#Mount-VBRObjectStorageRepository -Repository $repository -EncryptionKey $key

 #This next section will enable you to automate the Direct Restore to Microsoft Azure

$restorepoint = Get-VBRRestorePoint -Name "VMBACKUPNAME" | Sort-Object -Property CreationTime -Descending | Select-Object -First 1

$account = Get-VBRAzureAccount -Type ResourceManager -Name "AZURECOMPUTEACCOUNT"

$subscription = Get-VBRAzureSubscription -Account $account -name "SUBSCRIPTIONNAME"

$storageaccount = Get-VBRAzureStorageAccount -Subscription $subscription -Name "STORAGEACCOUNTFORRESTOREDMACHINE"

$location = Get-VBRAzureLocation -Subscription $subscription -Name "REGION"

$vmsize = Get-VBRAzureVMSize -Subscription $subscription -Location $location -Name Basic_A0

$network = Get-VBRAzureVirtualNetwork -Subscription $subscription -Name "AZURENETWORK"

$subnet = Get-VBRAzureVirtualNetworkSubnet -Network $network -Name "SUBNET"

$resourcegroup = Get-VBRAzureResourceGroup -Subscription $subscription -Name "AZURERESOURCEGROUP"

$RestoredVMName1 = "NAMEOFRESTOREDMACHINEINAZURE"


Start-VBRVMRestoreToAzure -RestorePoint $restorepoint -Subscription $subscription -StorageAccount $storageaccount -VmSize $vmsize -VirtualNetwork $network -VirtualSubnet $subnet -ResourceGroup $resourcegroup -VmName $RestoredVMName1 -Reason "Automated DR to the Cloud Testing"

You will also find the most up to date and committed PowerShell script here within the GitHub repository.

Feedback is key on this one, and I would love to make this work better and faster. Feedback is welcome below in the comments, as well as by getting hold of me on Twitter.

Automated deployment of Veeam in Microsoft Azure – Part 1
Wed, 26 Aug 2020 – https://vzilla.co.uk/vzilla-blog/automated-deployment-of-veeam-in-microsoft-azure-part-1

You may have seen this post and the video demo that walks through the manual steps to get your instance of Veeam Backup & Replication running in Microsoft Azure. Although that was already quick to deploy, it can always be quicker. Following on from this, we will then look at automating the Veeam configuration, as well as the direct restore functionality from, in this instance, Microsoft Azure Blob Storage into Azure VMs.

Installing Azure PowerShell

In order to start this automated deployment we need to install the Azure PowerShell module locally on our machine.

More details of that can be found here.

Run the following code on your system.


if ($PSVersionTable.PSEdition -eq 'Desktop' -and (Get-Module -Name AzureRM -ListAvailable)) {
    Write-Warning -Message ('Az module not installed. Having both the AzureRM and ' +
      'Az modules installed at the same time is not supported.')
} else {
    Install-Module -Name Az -AllowClobber -Scope CurrentUser
}

Select either [Y] Yes or [A] Yes to All, as this is an untrusted repository. You can also change -Scope CurrentUser to AllUsers if you wish to install the module for all users on the local machine.
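Before moving on, it is worth confirming the module actually installed. A quick sanity check:

```powershell
# Confirm the Az module is now available and the sign-in cmdlet resolves.
Get-Module -Name Az -ListAvailable | Select-Object Name, Version
Get-Command -Name Connect-AzAccount
```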

Breaking down the code

This section talks through the steps taken in the code. By taking this code from the GitHub repository, you will be able to modify the variables and begin testing yourself without any actual code changes.

First we need to connect to our Azure account. This will open a web browser for you to log in to your Azure Portal; if you are using MFA, this will let you authenticate that way too.


# Connect to Azure with a browser sign in token
Connect-AzAccount

Next we want to start defining what, where and how we want this to look in our Azure accounts. The variables should be pretty straightforward to understand:

locName = the Azure location

pubName = the publisher, in this case Veeam

offerName = the particular offering we wish to deploy from the publisher; there are quite a few, so expect to see other options using this method.

skuName = the product SKU of the offering you wish to use

version = the version of the product


# Set the Marketplace image
$locName="EASTUS"
$pubName="veeam"
$offerName="veeam-backup-replication"
$skuName="veeam-backup-replication-v10"
$version = "10.0.1"
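As noted above, Veeam publishes several offerings in the Marketplace. If you want to discover the other offers and SKUs for yourself, the Az module can enumerate them; a sketch, assuming you are already signed in and the variables above are set:

```powershell
# List all offers from the Veeam publisher in the chosen region...
Get-AzVMImageOffer -Location $locName -PublisherName $pubName

# ...and the SKUs available within the chosen offer.
Get-AzVMImageSku -Location $locName -PublisherName $pubName -Offer $offerName
```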

The following are aligned to the environment.

resourceGroup = the resource group you wish to use; this can be an existing resource group or a new name.

vmName = the name you wish your Veeam Backup & Replication server to have within your Azure environment.

vmSize = the VM size that will be used; my advice is to pick one of the supported sizes. The one below is the default size used for production environments.


# Variables for common values
$resourceGroup = "CadeTestingVBR"
$vmName = "CadeVBR"
$vmSize = "Standard_F4s_v2"

Next we need to agree to the license terms of deploying from the marketplace for this specific VM Image. The following commands will do this.


Get-AzVMImage -Location $locName -PublisherName $pubName -Offer $offerName -Skus $skuName -Version $version

$agreementTerms=Get-AzMarketplaceterms -Publisher "veeam" -Product "veeam-backup-replication" -Name "10.0.1"

Set-AzMarketplaceTerms -Publisher "veeam" -Product "veeam-backup-replication" -Name "10.0.1" -Terms $agreementTerms -Accept

If you wish to review the terms, you can do so by running the following command. Spoiler alert: the command gives you a link to a txt file; to save you the hassle, the link in that file points to the Veeam EULA – https://www.veeam.com/eula.html


Get-AzMarketplaceTerms -Publisher "veeam" -Product "veeam-backup-replication" -Name "10.0.1"

Next we need to start defining how our Veeam Backup & Replication server will look with regard to network configuration, authentication and security.

I also wanted to keep this script following best practice and not containing any usernames or passwords, so the first configuration step gathers the username and password for the deployed machine as a secure string.


# Create user object
$cred = Get-Credential -Message "Enter a username and password for the virtual machine."

Create a resource group


# Create a resource group

New-AzResourceGroup -Name $resourceGroup -Location $locname -force

Create a subnet configuration


# Create a subnet configuration
$subnetConfig = New-AzVirtualNetworkSubnetConfig -Name "cadesubvbr" -AddressPrefix 10.0.0.0/24

Create a virtual network


# Create a virtual network
$vnet = New-AzVirtualNetwork -ResourceGroupName $resourceGroup -Location $locName `
  -Name CadeVBRNet -AddressPrefix 10.0.0.0/24 -Subnet $subnetConfig

Create a public IP Address


# Create a public IP address and specify a DNS name
$pip = New-AzPublicIpAddress -ResourceGroupName $resourceGroup -Location $locName `
  -Name "CadeVBR$(Get-Random)" -AllocationMethod Static -IdleTimeoutInMinutes 4

Create inbound security group rule for RDP


# Create an inbound network security group rule for port 3389
$nsgRuleRDP = New-AzNetworkSecurityRuleConfig -Name CadeVBRSecurityGroupRuleRDP  -Protocol Tcp `
  -Direction Inbound -Priority 1000 -SourceAddressPrefix * -SourcePortRange * -DestinationAddressPrefix * `
  -DestinationPortRange 3389 -Access Allow

Create network security group


# Create a network security group
$nsg = New-AzNetworkSecurityGroup -ResourceGroupName $resourceGroup -Location $locName `
  -Name CadeVBRNetSecurityGroup -SecurityRules $nsgRuleRDP

Create a virtual network card and associate it with the public IP address and NSG


# Create a virtual network card and associate with public IP address and NSG
$nic = New-AzNetworkInterface -Name CadeVBRNIC -ResourceGroupName $resourceGroup -Location $locName `
  -SubnetId $vnet.Subnets[0].Id -PublicIpAddressId $pip.Id -NetworkSecurityGroupId $nsg.Id

Next we need to define what the virtual machine configuration is going to look like, using the environment configuration above.


#Create a virtual machine configuration

$vmConfig = New-AzVMConfig -VMName "$vmName" -VMSize $vmSize
$vmConfig = Set-AzVMPlan -VM $vmConfig -Publisher $pubName -Product $offerName -Name $skuName
$vmConfig = Set-AzVMOperatingSystem -Windows -VM $vmConfig -ComputerName $vmName -Credential $cred
$vmConfig = Set-AzVMSourceImage -VM $vmConfig -PublisherName $pubName -Offer $offerName -Skus $skuName -Version $version
$vmConfig = Add-AzVMNetworkInterface -Id $nic.Id -VM $vmConfig

Now we have everything we need, we can begin deploying the machine.


# Create a virtual machine
New-AzVM -ResourceGroupName $resourceGroup -Location $locName -VM $vmConfig

If you saw the video demo you will have seen that the deployment really does not take long at all. I actually think this method is a little faster; either way, it is less than 5 minutes to deploy a Veeam Backup & Replication server in Microsoft Azure.

Now that we have our machine, there is one thing we want to do to ensure the next stages of configuration run smoothly. Out of the box, Azure PowerShell must be installed on the server to be able to use Azure Compute accounts and Direct Restore to Microsoft Azure. The installer is already on the deployed box; if you were going through manually you would just install that MSI, but in this script we remotely run a PowerShell script from GitHub that does it for you.


# Start Script installation of Azure PowerShell requirement for adding Azure Compute Account
Set-AzVMCustomScriptExtension -ResourceGroupName $resourceGroup `
    -VMName $vmName `
    -Location $locName `
    -FileUri https://raw.githubusercontent.com/MichaelCade/veeamdr/master/AzurePowerShellInstaller.ps1 `
    -Run 'AzurePowerShellInstaller.ps1' `
    -Name DemoScriptExtension

At this stage the PowerShell installation has, for me, required a reboot, but it is very fast and the machine is generally back up within 10–15 seconds. So we run the following commands to pause the script, report the public IP, and then start a Windows Remote Desktop session to that IP address.


Start-Sleep -s 15

Write-host "Your public IP address is $($pip.IpAddress)"
mstsc /v:$($pip.IpAddress)
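A fixed 15-second sleep works in my lab, but a more robust approach is to poll until RDP actually answers before launching the session. A sketch of that idea (the five-minute timeout is an arbitrary choice of mine, not from the original script):

```powershell
# Poll TCP 3389 on the public IP until it responds, or give up after 5 minutes.
$deadline = (Get-Date).AddMinutes(5)
while ((Get-Date) -lt $deadline) {
    $probe = Test-NetConnection -ComputerName $pip.IpAddress -Port 3389 -WarningAction SilentlyContinue
    if ($probe.TcpTestSucceeded) { break }
    Start-Sleep -Seconds 5
}

mstsc /v:$($pip.IpAddress)
```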

Now, this might seem like a long-winded approach to getting something up and running, but with all of this combined into one script, the ability to create everything on demand brings a powerful story to recovering workloads into Microsoft Azure.

The next part of this series will concentrate on a configuration script, where we will configure Veeam Backup & Replication to attach the Microsoft Azure Blob Storage where our backups reside, along with our Azure Compute Account. Then we can look at how to automate this process end to end, bringing your machines up in Microsoft Azure when you need them, or before you need them.

Here is the complete script:


# Connect to Azure with a browser sign in token
Connect-AzAccount

# Set the Marketplace image
$locName="EASTUS"
$pubName="veeam"
$offerName="veeam-backup-replication"
$skuName="veeam-backup-replication-v10"
$version = "10.0.1"

# Variables for common values
$resourceGroup = "CadeTestingVBR"
$vmName = "CadeVBR"
$vmSize = "Standard_F4s_v2"
$StorageSku = "Premium_LRS"
$StorageName = "cadestorage"

Get-AzVMImage -Location $locName -PublisherName $pubName -Offer $offerName -Skus $skuName -Version $version

$agreementTerms=Get-AzMarketplaceterms -Publisher "veeam" -Product "veeam-backup-replication" -Name "10.0.1"

Set-AzMarketplaceTerms -Publisher "veeam" -Product "veeam-backup-replication" -Name "10.0.1" -Terms $agreementTerms -Accept


# Create user object
$cred = Get-Credential -Message "Enter a username and password for the virtual machine."

# Create a resource group

New-AzResourceGroup -Name $resourceGroup -Location $locname -force

# Create a subnet configuration
$subnetConfig = New-AzVirtualNetworkSubnetConfig -Name "cadesubvbr" -AddressPrefix 10.0.0.0/24

# Create a virtual network
$vnet = New-AzVirtualNetwork -ResourceGroupName $resourceGroup -Location $locName `
  -Name CadeVBRNet -AddressPrefix 10.0.0.0/24 -Subnet $subnetConfig

# Create a public IP address and specify a DNS name
$pip = New-AzPublicIpAddress -ResourceGroupName $resourceGroup -Location $locName `
  -Name "CadeVBR$(Get-Random)" -AllocationMethod Static -IdleTimeoutInMinutes 4

# Create an inbound network security group rule for port 3389
$nsgRuleRDP = New-AzNetworkSecurityRuleConfig -Name CadeVBRSecurityGroupRuleRDP  -Protocol Tcp `
  -Direction Inbound -Priority 1000 -SourceAddressPrefix * -SourcePortRange * -DestinationAddressPrefix * `
  -DestinationPortRange 3389 -Access Allow

# Create a network security group
$nsg = New-AzNetworkSecurityGroup -ResourceGroupName $resourceGroup -Location $locName `
  -Name CadeVBRNetSecurityGroup -SecurityRules $nsgRuleRDP

# Create a virtual network card and associate with public IP address and NSG
$nic = New-AzNetworkInterface -Name CadeVBRNIC -ResourceGroupName $resourceGroup -Location $locName `
  -SubnetId $vnet.Subnets[0].Id -PublicIpAddressId $pip.Id -NetworkSecurityGroupId $nsg.Id

# Create a virtual machine configuration
#vmConfig = New-AzVMConfig -VMName $vmName -VMSize $vmSize | `
#Set-AzVMOperatingSystem -Windows -ComputerName $vmName -Credential $cred | `
#Set-AzVMSourceImage -VM $vmConfig -PublisherName $pubName -Offer $offerName -Skus $skuName -Version $version | `
#Add-AzVMNetworkInterface -Id $nic.Id

#Create a virtual machine configuration

$vmConfig = New-AzVMConfig -VMName "$vmName" -VMSize $vmSize
$vmConfig = Set-AzVMPlan -VM $vmConfig -Publisher $pubName -Product $offerName -Name $skuName
$vmConfig = Set-AzVMOperatingSystem -Windows -VM $vmConfig -ComputerName $vmName -Credential $cred
$vmConfig = Set-AzVMSourceImage -VM $vmConfig -PublisherName $pubName -Offer $offerName -Skus $skuName -Version $version
$vmConfig = Add-AzVMNetworkInterface -Id $nic.Id -VM $vmConfig

# Create a virtual machine
New-AzVM -ResourceGroupName $resourceGroup -Location $locName -VM $vmConfig

# Start Script installation of Azure PowerShell requirement for adding Azure Compute Account
Set-AzVMCustomScriptExtension -ResourceGroupName $resourceGroup `
    -VMName $vmName `
    -Location $locName `
    -FileUri https://raw.githubusercontent.com/MichaelCade/veeamdr/master/AzurePowerShellInstaller.ps1 `
    -Run 'AzurePowerShellInstaller.ps1' `
    -Name DemoScriptExtension

Start-Sleep -s 15

Write-host "Your public IP address is $($pip.IpAddress)"
mstsc /v:$($pip.IpAddress)

You can also find this version and updated versions of this script here in my GitHub repository.

Any comments or feedback, either down below, on Twitter or on GitHub, are welcome.

Veeam Direct Restore to Microsoft Azure, It is not new but…
Mon, 04 May 2020 – https://vzilla.co.uk/vzilla-blog/veeam-direct-restore-to-microsoft-azure-it-is-not-new-but

What if I told you that you could take any Veeam image-based backup and convert / restore it to an Azure virtual machine, without requiring any additional storage or file system within Azure other than the disks and resources needed to run that virtual machine or machines?

And what if I told you this has been around for years with Veeam Backup & Replication? Veeam has had this capability since 2016, in fact.

The primary use cases we have seen are:

Test and development

When you have the public cloud at your fingertips, why not take advantage of it instead of having to purchase specific test and development environments? It is also a perfect option if you just want to see how certain apps and workloads are going to run in Microsoft Azure.

Data Migration

Let’s say you know where you are going, and that is Microsoft Azure. How are you going to get those workloads there in a fast and efficient manner? Direct Restore to Microsoft Azure provides a fast way to restore those backups to the public cloud without compromising on keeping the restore points; more to the point, the rollback is to those production systems you still have on premises.

Data recovery

We tend to talk about the bad failure scenarios, or we think nothing will happen to us, and we do not really touch on the in-between. What if you lost half your production virtualisation servers due to an outage of some description? What would you do? This feature within Veeam Backup & Replication enables you to restore some of your workloads from backups into Microsoft Azure; you can then use an existing VPN or some other connectivity to join the environments and continue working, or you could use VeeamPN to achieve this.

Walkthrough

In this YouTube video I walk through how easy and simple it is to get those image-based backups restored into Microsoft Azure as native Azure VMs for some of those use cases mentioned above. This also ties into the Veeam Backup for Microsoft Azure that was released this week.

Where should I run the conversion process?

I ran some tests to determine, for my lab, where and what would be the best practice when it comes to restoring workloads into Microsoft Azure. Veeam offers a lot of choice around restores and how to work around environmental challenges, such as link speed to the public cloud due to location or other reasons. Since the release of this feature back in 2016 there have also been many other enhancements and features added to Veeam Backup & Replication, including the new Veeam Cloud Tier, which gives us the ability to store our backups in object storage; we can recover from those as well. The video linked below goes into more detail about the considerations you should take when looking to restore workloads to the public cloud.

Cloud Tier

Having spoken about protecting native Azure VMs using Veeam Backup for Microsoft Azure, and about getting your image-based backups from virtual or physical platforms on premises (or even in other public clouds) to Microsoft Azure, I had to mention Cloud Tier, or Capacity Tier: how we can tier or copy our backups into Microsoft Azure Blob Storage for either long-term retention or an offsite copy of your data.

Couple all these features together and we have a pretty dynamic and flexible way of moving data to, from and within the public clouds.

If you have any questions, comments or feedback on the videos then please let me know, either here in the comments, on the YouTube channel or on Twitter. As a side note, I will be creating more video content over the next few weeks whilst we are stuck at home. I for one have been consuming a lot more of my news and education through YouTube, and judging by the uptake in subscriptions I think you are too, so let me know anything you want to see or have me walk through.

Veeam Backup for Microsoft Azure
Sun, 03 May 2020 – https://vzilla.co.uk/vzilla-blog/veeam-backup-for-microsoft-azure

Last week Veeam released version 1 of Veeam Backup for Microsoft Azure.

What is Veeam Backup for Microsoft Azure?

This new product focuses on the Azure IaaS workloads you have running in the public cloud. Much like the Veeam Backup for AWS edition released earlier this year, it provides the ability to protect those Azure VMs without having to install an agent on each one. It takes a policy-driven approach, allowing both snapshots and backups to be part of your data management plan for Microsoft Azure.

The product is a standalone solution deployable from the Microsoft Azure Marketplace, with a very easy to use, wizard-driven approach to configuration and management. Veeam Backup for Microsoft Azure Free Edition and subsequent versions are available within the Microsoft Azure Marketplace.


The FREE edition allows you to protect 10 Azure VMs using native snapshots and then tier those snapshots to an Azure Blob Storage repository.

Within the Azure Blob Storage repository these backups are stored in the portable data format that sets Veeam apart from the other vendors in this space. This allows the Veeam Backup & Replication External Repository feature to be leveraged, enabling further data protection or other tasks such as migrations or on-premises data recovery.

As you would expect, the offering also allows you to recover those Azure virtual machines not only back to where they initially resided, but also across accounts and even across regions, as well as providing file-level recovery for a more granular option.

Another cool feature is the level of cloud cost visibility: when you create your policies through the wizard-driven approach, you start to see some cost forecasting, so you can make better decisions about your cloud cost consumption.

Policies, Workers & Protected Data

Those familiar with Veeam will notice a different approach to some of the key functions and naming; you can liken these new terms to those found in Veeam Backup & Replication, but they have some differences.

Those familiar with Veeam Backup & Replication will recognise Policies as something more commonly known as Backup Jobs; however, even in the Veeam Backup & Replication world we are seeing policies enter the fold, with the CDP policy coming in later releases.

Policies give you the ability to define several requirements when it comes to your cloud data management. But again, it is that same very easy to use wizard driven approach that all Veeam customers will be familiar with.

You can choose to protect everything in a region, or we can be granular about what to protect. An awesome feature here is that you can select either by Instance or by Tag. Tags lend themselves well to the fast-moving pace of cloud instances being spun up and spun down all the time; the ability to use tags means we can protect in a more dynamic fashion. We will demonstrate the ease of use and how dynamically these tags within Azure can be created and used for your data management needs.
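As an illustration of how dynamic this can be, here is a rough sketch of tagging an existing VM from the Azure CLI (the resource group and VM names are hypothetical, and the commands assume a logged-in Azure subscription, so treat this as illustrative rather than definitive):

```shell
# Tag an existing VM; a tag-based backup policy can then pick it up automatically
az vm update \
  --resource-group my-rg \
  --name my-vm \
  --set tags.backup=daily

# List every resource carrying that tag
az resource list --tag backup=daily --output table
```

Because selection happens when the policy runs, newly created instances carrying the tag are protected without touching the policy again.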

I mentioned Snapshots and Backups above, and how they are used together in this product to provide the best of both worlds: fast recovery points, plus an out-of-band copy of your data not linked to the original VM.

You may wish to protect some workloads with only snapshots, some with only backups, or with both. Snapshot settings allow you to define when snapshots are taken and how many you intend to keep. Backup settings are where we define the Microsoft Azure Blob Storage repository in which to store those backups; this repository also plays its part in making that data visible within Veeam Backup & Replication, if you wish. You have the same retention setting to define here too.

The workers are configured during the setup of Veeam Backup for Microsoft Azure. Those familiar with Veeam Backup & Replication could liken these worker nodes to the Veeam Backup Proxy component within VBR.

The worker is a Linux-based instance that is deployed and used when data needs to be transferred; workers are used for both backup and recovery. When a policy run completes, the workers are shut down but remain in place for the next scheduled run.

Cost Estimations

A unique feature built into Veeam Backup for Microsoft Azure Free Edition (and, obviously, the other versions) is the ability to estimate the cost of your backups and of storing the retention you have defined. This is something else we go into in further detail in the video walk-through below.

As I have mentioned, this post gives a very high-level overview of what you will find in the new product, but if you would like to see more, I have created a walk-through below. Please leave any comments here, on the YouTube video, or find me on Twitter.

Let me know what you think of the YouTube walk-throughs; it is something I intend to do much more of, given that we are house-bound and I have more time to create this content.

#SummerTraining – Options for data in the cloud https://vzilla.co.uk/vzilla-blog/summertraining-options-for-data-in-the-cloud https://vzilla.co.uk/vzilla-blog/summertraining-options-for-data-in-the-cloud#respond Tue, 17 Mar 2020 08:30:00 +0000 https://vzilla.co.uk/?p=2091 The next #ignitethetour training I took was with Cecil Philip of Microsoft. Data has been of huge interest to me my whole IT career: knowing where data is stored, whether for production, backup or analytics, and regardless of whether it sits on premises, in the public cloud, or with a service provider.

With the options we have available today to store our personal data, our mission-critical enterprise data, and everything in between, we have so much choice.

This session was focused on how the cloud could help when it comes to storing your data in Microsoft Azure.

Three key things the session enabled viewers to take away:

  • Understand the type of data you have
  • Azure has hosted options for databases
  • Your data solution should be able to grow with you

What is important for you – the customer

There are thousands of requirements that will be specific to each customer, but many will be very similar.

  • How can we make things faster?
  • Limit or mitigate risk when deploying new services
  • Putting more control to the developers in the organisation
  • Scalability
  • Using the right tool for the job and potentially being able to pivot when need be

Should we have a storage strategy?

The session then moved into why a storage strategy is important. This is something a good friend of mine, Paul Stringfellow, has been speaking about on both his blog and his podcast, and it relates to all customers: not just large enterprises, all environments should have a storage or data strategy.

We should always be considering:

  • Maintaining Security
  • Breaking down data and storage services into manageable sets
  • Consider the lifespan of the data: where does it need to be, and for how long?

Before we start

What data do you have?

  • Structured Data – data that has been organised into a formatted repository, typically a database, so that its elements can be made addressable for more effective processing and analysis. A data structure is a kind of repository that organizes information for that purpose.
  • Unstructured Data – information that either does not have a pre-defined data model or is not organized in a pre-defined manner. Unstructured information is typically text-heavy, but may contain data such as dates, numbers, and facts as well.
  • Semi-structured Data – a form of structured data that does not obey the formal structure of data models associated with relational databases or other forms of data tables, but nonetheless contains tags or other markers to separate semantic elements and enforce hierarchies of records and fields within the data.
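To make the distinction concrete, here is a small Python sketch (the record itself is entirely made up): a semi-structured JSON document has no fixed relational schema, yet its tags still make individual elements addressable, while any free-form text inside it remains unstructured.

```python
import json

# A hypothetical semi-structured record: no fixed relational schema,
# but keys (tags) separate the semantic elements and allow nesting.
record = json.loads("""
{
  "id": 42,
  "name": "backup-job-01",
  "tags": {"env": "production", "backup": "daily"},
  "notes": "free-form text can sit alongside the structured fields"
}
""")

# The structured elements are directly addressable for processing...
print(record["tags"]["env"])   # -> production

# ...while the unstructured part is just text we would have to parse or index.
print(record["notes"][:9])     # -> free-form
```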

How much data do you have?

  • Volume
  • Variety
  • Velocity

Azure Storage Services

There is quite the storage offering when it comes to Microsoft Azure and it’s important to understand the options for those data types and what should be stored where.


What is Azure Blob Storage?

Azure’s Object Storage platform used to store and serve unstructured data.

  • App and Web scale data
  • Backups and Archive
  • Big Data from IoT, Genomics, etc.

My interest here instantly went to the backup mention above, in particular how Veeam has been leveraging object storage for backup data across the platform: for long-term retention, for direct copies, and also for copies of your Microsoft Office 365 backup data. Object storage, and Microsoft Azure Blob Storage especially, comes with some notable characteristics:

  • Infinite scale
  • Globally accessible
  • Cost efficient

Databases – Relational Databases

Relational databases have many different options within Microsoft Azure.

The first option is by taking your on-premises SQL or relational database and migrating those VMs or workloads to Microsoft Azure. But this is likely not going to be the best route to take because of cost and management.

The more compelling route is the PaaS offerings from Azure. These could be any of the following, and I am sure new services will appear as and when the demand is great enough.

  • Azure SQL Database
  • SQL Data Warehouse
  • PostgreSQL
  • MySQL
  • MariaDB

All of these PaaS offerings still leverage the Azure Compute and Storage layer, but they offer the ability for many other Azure services to work with these databases.

Cosmos DB

Azure Cosmos DB – A globally distributed, massively scalable, multi-model database service

A NoSQL database is different to what we just mentioned with SQL or other relational databases.


I actually want to learn more about Azure Cosmos DB. The introduction in this session was great and opened my eyes to it being not just another flavour of NoSQL database but potentially an aggregation of existing NoSQL models. I need to dig into this another time.

I am really interested in the distributed nature of these databases and the ease with which you can have a write region plus additional read regions across the world, or at least in different locations. You can also enable multi-region writes, which helps with scale.

Resources

All of this is really well covered in the Azure Documentation – https://aka.ms/apps20ignite

Another thing I had to share was the learning paths for this session alone: almost 15 hours of training! This is hands-on, interactive training, all the learning without the billing!

Sessions Resources

Session Code on GitHub including presentation

All Events Resources

#SummerTraining – Options for building and running your app in the cloud https://vzilla.co.uk/vzilla-blog/summertraining-options-for-building-and-running-your-app-in-the-cloud https://vzilla.co.uk/vzilla-blog/summertraining-options-for-building-and-running-your-app-in-the-cloud#respond Mon, 16 Mar 2020 08:30:00 +0000 https://vzilla.co.uk/?p=2087 Well, it is summer some place, but this learning curve has been going on since the summer in England, when I really wanted to take some of the pre-events-season downtime and learn something new. This has spanned a wide range of new and upcoming technologies, some of which I have not even written about yet but have been looking at, I promise.

A big focus on

  • Containers & Kubernetes
  • Public Cloud Hyperscalers (Microsoft Azure, AWS and Google Cloud Platform)
  • Infrastructure as Code & Automation

My aim for the public cloud, and Microsoft Azure in particular, was to get a better understanding of the why. Why would some of our existing customers want to (or should they) move to Microsoft Azure, and what options do they have for doing so?

The level of education I am aiming for is a foundation that allows me to better understand, across all three of the aforementioned public cloud hyperscalers:

  • Compute
  • Storage
  • Security
  • Networking

The idea is not to sit all the certifications and become a master of any or all of them (that would be insane), but an understanding is required to be able to have those conversations in the field with our customers and prospects.

My Azure learning started with the Ignite sessions, all available online. I have to say Microsoft really do nail the production quality and the speed of getting this content online straight after the sessions happen. It was the first Ignite on tour, or at least the London stop, that got me interested; although I could not attend the live show, I was able to grab the agenda.

This first write-up touches on “Getting Started” and focuses on the session delivered by Frank Boucher called “Options for Building and Running Your App in the Cloud”. The session covers the options available, and security, as the first steps to understanding and leveraging the public cloud for what it was built for.

Frank's first comment, and I think this is a solid way of thinking about cloud technologies: there are no bad choices, there is no bad first step. But I will add that the purpose and the requirements of the data and use case have to be clear. If the data is important, make sure it is protected against failure.

There are always plenty of options, just get started and you can always or should always be able to move to other options or find better ways to improve your application or the purpose you are trying to achieve.

Deployment Tools

Visual Studio IDE

Modular and very versatile for QA/test, with options regardless of your programming language. Multi-platform, and also available as a cloud version: Visual Studio Online.

  • Multi-Platform (Windows & Mac)
  • Customisable workloads
  • Multi-Language
  • Tons of Extensions
  • Live Share – Real time collaborative development!

Visual Studio Code

A lighter version of the previously mentioned IDE. Fewer features, but still powerful:

  • GIT commands built in
  • Extensible and customisable
  • Full support for all platforms (Linux, Mac and Windows)

Terminal & CLI

  • Cloud Shell
  • Azure CLI
  • Azure PowerShell

ARM Templates

Azure Resource Manager (ARM) templates are where we meet infrastructure-as-code functionality: version control and a fast way to deploy resources in a declarative model, without having to manually build our infrastructure.

  • Architecture / Infrastructure as code
  • Version Control
  • Fastest Way to deploy

ARM templates might be a completely new way of working for many infrastructure administrators, but I have to say the Microsoft documentation in this area is amazing.

Aka.ms/azArm
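As a minimal sketch of what such a template looks like (the parameter name and API version here are illustrative, not taken from the session), this declares a single storage account; deploying it repeatedly gives the same result, which is the declarative model in action:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "storageName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2019-06-01",
      "name": "[parameters('storageName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "StorageV2"
    }
  ]
}
```

A file like this can live in version control alongside your code and be deployed with the Azure CLI, for example `az deployment group create --resource-group <rg> --template-file template.json`.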

Deployment options

Now we know some of the tools available; there are others, but I wanted to focus on the Microsoft options. That said, you may want to be agnostic about where you run your deployment, especially when it comes to infrastructure as code; for this, something like Terraform from HashiCorp is a great option across multiple platforms.

Let’s take a website as the example of what we want to consider deploying. There are many options available.

Azure Blob Static Websites

  • Very low cost – Cheapest option
  • Fast
  • Static – although this can be plain HTML, it can also be more complex, using frameworks such as Angular and React
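As a rough sketch of how this is enabled from the Azure CLI (the storage account name and file names are hypothetical, and the commands need an existing account and subscription, so treat this as illustrative):

```shell
# Turn on the static website endpoint for an existing storage account
az storage blob service-properties update \
  --account-name mystorageacct \
  --static-website \
  --index-document index.html \
  --404-document 404.html

# Upload the site content into the special $web container
az storage blob upload-batch \
  --account-name mystorageacct \
  --source ./site \
  --destination '$web'
```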

PaaS (Web Apps)

PaaS removes the requirement to manage the architecture at a deep level: scaling, backup, disaster recovery and other platform tasks are managed by the service.

  • Client Side & Server Side
  • PaaS Features
  • Windows & Linux
  • Many Languages Supported (.NET, Java, PHP, Ruby, Python… etc.)

Containers

A couple of container options when it comes to Azure

  • ACI – Azure Container Instance
  • AKS – Azure Kubernetes Services

There are many different use cases for the two offerings above, but also some overlap. I am not going to get into the benefits and functionality of AKS or Kubernetes in general, but if you are looking to run a very small or very simple application or service, then ACI is a great choice. If you require scale, deeper configuration choices and orchestration for your containers, then AKS will be the likely choice.
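To show how little ceremony ACI needs, here is a sketch that runs Microsoft's sample hello-world container (the resource group and DNS label are hypothetical, and this needs an Azure subscription):

```shell
# Run a single public container in seconds, no cluster required
az container create \
  --resource-group my-rg \
  --name hello-web \
  --image mcr.microsoft.com/azuredocs/aci-helloworld \
  --ports 80 \
  --dns-name-label my-hello-demo

# Check its state and public FQDN
az container show --resource-group my-rg --name hello-web \
  --query "{state:instanceView.state, fqdn:ipAddress.fqdn}" --output table
```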

Virtual Machines

What if you already have the web server configured and working in a different location, maybe on premises, for example running in VMware as a virtual machine? You don't have time to change this, but you want to get to Azure; that is also possible.

Veeam has the ability, even in the free version, to directly restore image-based backups to Azure.

Shared Image Gallery

There is also a gallery containing different images: different operating systems and versions for both Windows and Linux. Some of these images contain application deployments as well.

  • Databases
  • Web Servers
  • Development Tools

Basic Security Features

Security has to be part of this stage of the project; it should not be an afterthought. You may start out as the only developer / operations engineer, but then you scale out and out and out, meaning sharing security keys and passwords over messenger apps becomes a complete vulnerability in your process.

Azure Key Vault

Azure Key Vault is a cloud service for safeguarding encryption keys and application secrets for your cloud applications.

AKV focuses on a clear separation of security duties, meaning the role responsible for security can be in charge of and manage the important security aspects:

  • Encryption Keys
  • Secrets
  • Certificates

App owners, meanwhile, can consume and use the certificates in their applications, while your deployment remains secured and segregated.

  • Manage all of your secrets in one place
  • Seamlessly move between Development, QA, and Production environments
  • Update credentials in one place and everyone's credentials are updated
  • Version, enable, and disable credentials as needed
  • Add a credential and it’s instantly available to all Developers in a controlled manner
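A minimal sketch of that workflow from the Azure CLI (vault and secret names are hypothetical, and in practice secret values should not be typed on a shared command line):

```shell
# Create a vault, store a secret, then read it back
az keyvault create --resource-group my-rg --name my-demo-vault --location uksouth
az keyvault secret set --vault-name my-demo-vault --name db-password --value 'S3cr3tValue'
az keyvault secret show --vault-name my-demo-vault --name db-password --query value --output tsv
```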

Managed Service Identity (MSI)

OK, so Azure Key Vault sounds great, but how do we get into it to manage the security aspects just mentioned? How do we authenticate to AKV?

So we need credentials to get credentials…


Your deployment is registered with Azure; this can be the VM, the Function, or anything else from the Deployment Options above. A local endpoint is exposed, accessible only from within the host itself, which allows the workload to obtain valid credentials for the key vault.
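A sketch of what that looks like from inside an Azure VM with a managed identity assigned (this only works on the VM itself; 169.254.169.254 is Azure's instance metadata endpoint and is not reachable from outside):

```shell
# Ask the local instance metadata service for a token scoped to Key Vault.
# No credentials are stored on the VM; Azure injects the identity.
curl -s -H "Metadata: true" \
  "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net"
```

The returned access token can then be presented to Key Vault, which is how the "credentials to get credentials" loop is broken.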

Loads more reading material at aka.ms/docAAD on Azure Active Directory.

Resources

Session Resources

Session Code on GitHub including presentation

All Events Resources

Kicking Off #VeeamON 2019 https://vzilla.co.uk/vzilla-blog/kicking-off-veeamon-2019 https://vzilla.co.uk/vzilla-blog/kicking-off-veeamon-2019#respond Sat, 11 May 2019 09:31:43 +0000 https://vzilla.co.uk/?p=1639

We are heading fast toward VeeamON 2019. This year our conference goes to Miami, to none other than the famous Fontainebleau Miami Beach luxury hotel and resort. I have never been to Miami, so I am really looking forward to being there; besides, there is nothing quite like the VeeamON feeling, the feeling of your own conference.


The conference officially runs from May 20th to May 22nd, all in the same resort. But the fun actually starts on the Saturday: the die-hard techies heading out to Miami early will be taking their VMCE course, which runs from Saturday to Monday, before the conference kicks off with a welcome reception on the Monday evening.

We then have two full days of conference, with breakout sessions and general sessions, both of which I will get into in more detail later on.

Obviously, it wouldn’t be a Veeam conference if we didn’t have the famous Veeam party to close things out; I’m pretty excited to see Flo Rida perform.


My Breakout picks for the conference

We have, I think, 60+ breakout sessions this year: lots of technical content, business decision-maker content, and alliance content. It's a fully packed agenda of sessions, and I have no doubt that everyone has put an enormous effort into getting the right content for the audience.

Sessions are 45 minutes long. I think this is a great move, as it means the content needs to be on point and resonate with the audience, while not being so long that people lose interest. An attendee can hit at most 7 breakout sessions, so you have to choose carefully and plan your agenda accordingly. Below are some of my picks.

There is an app for the event you can grab that will show all the other sessions and everything else happening during the event.

1.    To download the app, search for CrowdCompass AttendeeHub in the App Store or Google Play.

2.    Once downloaded, search the app for the VeeamON 2019 Miami event.

3.    Log in to the VeeamON event with the same first and last name you used to register for VeeamON.

I am pretty sure I won't get to attend many of these sessions unless I am presenting, so to anyone reading this: I would love an overview of the sessions you attend.

Day 1

10:20 – 11:05

Cumulonimbus – Cloud Tier Deep Dive & Best Practices – I am intrigued by what Cumulonimbus means, but the level of technical differentiation we have with our Cloud Tier feature, which landed in the early part of 2019, is also well worth seeing.

11:25 – 12:10

From the Architect’s Desk: Sizing of Veeam Backup & Replication, Proxies and Repositories – I have seen the presentation for this, and I know it is going to get pretty technically deep, pretty quickly. Well worth the slot.

13:30 – 14:15

NetApp and Veeam: Deep Dive Into How Snapshots and Secondary Storage Can Help You Get More Out of Backup – This is my alliance pick for this 45-minute slot: the end-to-end capabilities we have with NetApp are truly worth seeing.

Day 2

09:15 – 10:00

Activate Your Data with Veeam DataLabs – of course I am going to kick-start day 2 with this one, and it is one I will definitely make: I will be presenting an end-to-end overview of Veeam DataLabs, what it is and how you can use it.

11:50 – 12:35

Now, at this time there is a top-secret session; I cannot divulge what it is going to be about, but it might be worth attending if there is room.

Architecture, Installation and Design for Veeam Backup for Microsoft Office 365 – This one will be a full-on deep dive into the Veeam offering for Office 365, and if this is your world, or going to be soon, then it's worth knowing how the parts fit together and how to architect VBO for your environment.

14:00 – 14:45

Veeam ONE 9.5 U4 Part 1: Monitor, Veeam Intelligent Diagnostics and Business View – Veeam ONE got a lot of love in the most recent release, but it has always been a great product, really opening up visibility into your whole environment, not just your backup infrastructure.

Also, this slot has a session on Veeam Explorers, another capability that is a powerful differentiator; it is worth picking up some of the tips and tricks there.

15:05 – 15:50

Veeam ONE 9.5 U4 Part 2: Reporter, Heatmaps and Custom Reports – this is part two of the previous session, so if that one caught your interest, carry on listening to what can be done with Veeam ONE.

How I Stopped Worrying and Loved the Tape Media – There you go, I have done it: I have suggested a tape session. And why not? Tape is still a massive part of the strategy within many IT environments, and many still require tape functionality. This session will take a look at what Veeam is doing around tape.

Technology General Session

The thing I want to close on is the Technology General Session, where we will be live-streaming beyond the audience with us in Miami and showing some technical demos. We then have an exclusive second half of the session where the stream will be turned off and we will share some of the futures Veeam has coming later in the year, or sooner. This session will be live-streamed from 15:30 Miami time.

With that, I hope to see some of you there. If you are attending, don't be shy, say hi. I will be the guy scrambling around between sessions and other matters, probably wearing a funky vZilla t-shirt, if they arrive in time before I leave.

Hope you have a great VeeamON
