Friday, December 15, 2017

How to Maintain an Azure Site-to-Site (S2S) Connection with a Dynamic IP Address

This post should not be needed for your production environment. It is for those of us who test and build development environments at home and have created hybrid environments into Azure. If you have a dynamic IP address for your business, please spend the extra bit of money on a static IP...
That being said, I have a dynamic IP address at my house, and whenever my IP address changed, it used to break my S2S connection with Azure. This post is about how I fixed the problem.
The first step is to create the service account that is going to be logging in to Azure to check and update the IP address. I will be creating an unlicensed user on the .onmicrosoft domain for this purpose.
As the Microsoft Online Services (MSOnline) module does not come pre-installed, I ran the following to get started:
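Something along these lines should do it (run from an elevated prompt; the module name is as published in the PowerShell Gallery):

```powershell
# Pull the MSOnline module down from the PowerShell Gallery
Install-Module -Name MSOnline
Import-Module -Name MSOnline
```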

Next we are going to log in and create the unlicensed service account. You will want to update the UPN and other variables accordingly:
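A sketch of that step; the UPN, display name, and password below are placeholders:

```powershell
# Placeholder values -- substitute your own tenant's details
$upn = 'svc-s2s@contoso.onmicrosoft.com'

Connect-MsolService   # prompts for tenant admin credentials

# No license is assigned, so this account carries no cost
New-MsolUser -UserPrincipalName $upn `
             -DisplayName 'S2S Update Service' `
             -Password 'P@ssw0rdPlaceholder1' `
             -PasswordNeverExpires $true `
             -ForceChangePassword $false
```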

Now that we have our service account created (an account that does not have access into our domain, O365, or Azure), it will need to be added to Access Control (IAM) for the Local Network Gateway in Azure.
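A sketch using the AzureRM module of the era; the resource group and gateway names are placeholders, and the assignment is scoped to just the one resource:

```powershell
# Log in with your admin account (not the service account)
Login-AzureRmAccount

# Grant the service account rights on only the Local Network Gateway
New-AzureRmRoleAssignment -SignInName 'svc-s2s@contoso.onmicrosoft.com' `
                          -RoleDefinitionName 'Contributor' `
                          -ResourceGroupName 'MyRG' `
                          -ResourceName 'MyLocalNetworkGateway' `
                          -ResourceType 'Microsoft.Network/localNetworkGateways'
```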

With permissions set on the Local Network Gateway, it is time to compare the IP address recorded on the gateway with the current public IP address of the local endpoint. If the two IP addresses do not match, it is time to update your Local Network Gateway (in Azure).
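A sketch of the check-and-update logic, assuming a credential file previously saved with Export-Clixml for the service account, and using a public IP echo service; all names and paths are placeholders. Set-AzureRmLocalNetworkGateway has no parameter for the gateway IP itself, so the property is changed on the retrieved object and the whole object is pushed back:

```powershell
# Log in as the unlicensed service account (stored credential; no MFA assumed)
$cred = Import-Clixml -Path 'C:\Scripts\svc-s2s.cred'   # hypothetical credential file
Login-AzureRmAccount -Credential $cred | Out-Null

# IP address currently recorded on the Local Network Gateway
$gw = Get-AzureRmLocalNetworkGateway -Name 'MyLocalNetworkGateway' -ResourceGroupName 'MyRG'

# Current public IP of the home router, via a public echo service
$currentIp = Invoke-RestMethod -Uri 'https://api.ipify.org'

if ($gw.GatewayIpAddress -ne $currentIp) {
    # Update the property on the object, then push the object back to Azure
    $gw.GatewayIpAddress = $currentIp
    Set-AzureRmLocalNetworkGateway -LocalNetworkGateway $gw `
        -AddressPrefix $gw.LocalNetworkAddressSpace.AddressPrefixes
}
```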

Next we create some logging and logging clean-up:
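Something along these lines, with a hypothetical log folder and a 30-day retention window:

```powershell
$logDir  = 'C:\Scripts\Logs'   # hypothetical log location
$logFile = Join-Path $logDir ("S2S-{0:yyyyMMdd}.log" -f (Get-Date))

if (-not (Test-Path $logDir)) { New-Item -Path $logDir -ItemType Directory | Out-Null }

# Append a timestamped entry for this run
Add-Content -Path $logFile -Value ("{0:s}  Gateway IP checked" -f (Get-Date))

# Clean-up: remove log files older than 30 days
Get-ChildItem -Path $logDir -Filter '*.log' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item
```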

And to finish off, we will connect all of our RRAS VpnS2SInterface connections.
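A sketch, run on the RRAS server itself (requires the RemoteAccess module):

```powershell
# Reconnect any S2S interface that is not currently connected
Get-VpnS2SInterface |
    Where-Object { $_.ConnectionState -ne 'Connected' } |
    ForEach-Object { Connect-VpnS2SInterface -Name $_.Name }
```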

Now let's put the whole thing together. First we create the service account and assign its permissions:
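Folding the account creation and role assignment together (placeholder names throughout):

```powershell
$upn = 'svc-s2s@contoso.onmicrosoft.com'   # placeholder UPN

# Create the unlicensed service account
Connect-MsolService
New-MsolUser -UserPrincipalName $upn -DisplayName 'S2S Update Service' `
             -Password 'P@ssw0rdPlaceholder1' `
             -PasswordNeverExpires $true -ForceChangePassword $false

# Grant it rights on just the Local Network Gateway
Login-AzureRmAccount
New-AzureRmRoleAssignment -SignInName $upn -RoleDefinitionName 'Contributor' `
    -ResourceGroupName 'MyRG' -ResourceName 'MyLocalNetworkGateway' `
    -ResourceType 'Microsoft.Network/localNetworkGateways'
```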

Next we create the Update S2S script and save it to 'C:\Scripts\Update S2S and RRAS.ps1':
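A sketch of what that file might contain, folding together the check-and-update, logging, and RRAS reconnect steps described above; the gateway name, resource group, paths, and credential file are all placeholders:

```powershell
# 'C:\Scripts\Update S2S and RRAS.ps1'
$gwName  = 'MyLocalNetworkGateway'
$rgName  = 'MyRG'
$logDir  = 'C:\Scripts\Logs'
$logFile = Join-Path $logDir ("S2S-{0:yyyyMMdd}.log" -f (Get-Date))
if (-not (Test-Path $logDir)) { New-Item -Path $logDir -ItemType Directory | Out-Null }

# Log in as the unlicensed service account (stored credential, no MFA assumed)
$cred = Import-Clixml -Path 'C:\Scripts\svc-s2s.cred'
Login-AzureRmAccount -Credential $cred | Out-Null

$gw        = Get-AzureRmLocalNetworkGateway -Name $gwName -ResourceGroupName $rgName
$currentIp = Invoke-RestMethod -Uri 'https://api.ipify.org'

if ($gw.GatewayIpAddress -ne $currentIp) {
    # Push the updated IP back to the Local Network Gateway
    $gw.GatewayIpAddress = $currentIp
    Set-AzureRmLocalNetworkGateway -LocalNetworkGateway $gw `
        -AddressPrefix $gw.LocalNetworkAddressSpace.AddressPrefixes
    Add-Content -Path $logFile -Value ("{0:s}  Updated gateway IP to {1}" -f (Get-Date), $currentIp)
}

# Reconnect any dropped RRAS S2S interfaces
Get-VpnS2SInterface | Where-Object { $_.ConnectionState -ne 'Connected' } |
    ForEach-Object { Connect-VpnS2SInterface -Name $_.Name }

# Logging clean-up: drop logs older than 30 days
Get-ChildItem -Path $logDir -Filter '*.log' |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } | Remove-Item
```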

Now that we are checking and updating our Local Network Gateway connection IP address, we need to create a timer job that will run the check on a regular basis. Below is a script that will check every hour, on the hour. Make sure that the Update S2S file path is set correctly.
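A sketch of the registration; note the task should run under the account that exported the stored credential, since DPAPI ties that credential file to its creating user:

```powershell
$scriptPath = 'C:\Scripts\Update S2S and RRAS.ps1'

$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File `"$scriptPath`""

# Fire at the next top of the hour, then repeat every hour indefinitely
$start   = (Get-Date -Minute 0 -Second 0).AddHours(1)
$trigger = New-ScheduledTaskTrigger -Once -At $start `
    -RepetitionInterval (New-TimeSpan -Hours 1) `
    -RepetitionDuration ([TimeSpan]::MaxValue)

# Registers under the current user by default -- the same user who
# exported the credential file used by the script
Register-ScheduledTask -TaskName 'Update S2S and RRAS' -Action $action -Trigger $trigger
```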

Monday, April 3, 2017

Creating File Shares in Azure using PowerShell

The other day I was helping migrate a client from another cloud provider into Azure when they ran into a problem with their file share server. It made me think about the Azure File Service that is available and how to implement it from a corporate perspective; then my mind wandered and I wanted to see how to create a file share just for my Surface Pro. There is an excellent post called Get started with Azure File storage on Windows that got me started, but I was not too happy with the implementation: I would like my file share to still be available after I reboot my Surface Pro, and I do not want anyone else logged into my Surface to have access to MY file share.
The beginning of the script is basic: creating the Resource Group and Storage Account. As this is a file share, I am using geo-redundant storage, on HDDs not SSDs.
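A sketch with placeholder names; Standard_GRS is the geo-redundant, HDD-backed SKU:

```powershell
Login-AzureRmAccount

# Placeholder resource group, account name, and region
New-AzureRmResourceGroup -Name 'FileShareRG' -Location 'eastus'

New-AzureRmStorageAccount -ResourceGroupName 'FileShareRG' `
    -Name 'myfilesharestorage' -Location 'eastus' `
    -SkuName Standard_GRS -Kind Storage
```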

Storage within Azure is context-based, so to create the Azure File Share we will need to create a storage context. Once we create the context, we can create the file share.
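A sketch, continuing with the same placeholder names:

```powershell
# Pull a key for the account, build the context from it, then create the share
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName 'FileShareRG' `
        -Name 'myfilesharestorage')[0].Value

$ctx = New-AzureStorageContext -StorageAccountName 'myfilesharestorage' `
        -StorageAccountKey $key

New-AzureStorageShare -Name 'myshare' -Context $ctx
```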

Now that we have created the Azure File Share, we want to store the login credentials locally to make mounting the drive easier.
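One way to do this is cmdkey, which stores the credentials in Windows Credential Manager so the mapped drive can authenticate without prompting (placeholder account name):

```powershell
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName 'FileShareRG' `
        -Name 'myfilesharestorage')[0].Value

# Persist the storage credentials for the file endpoint
cmdkey /add:myfilesharestorage.file.core.windows.net /user:AZURE\myfilesharestorage /pass:$key
```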

The next step is to mount the file share as a local drive (the X:\ drive in this example). I tried using the New-PSDrive cmdlet but could not get it to work consistently, so I ended up with the script below. One thing to notice is that in line 9 I am creating a script block from a string variable, instead of creating a script block with parameters and passing in values. I found this to be a very nice and easy way to deal with passing variables into a script block. Plus I need the script block in a later part of the script, so it was a win-win.
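A sketch of the mount, showing the string-to-script-block technique just described (storage account, share, and drive letter are placeholders):

```powershell
$storageAccount = 'myfilesharestorage'   # placeholder
$shareName      = 'myshare'
$driveLetter    = 'X:'

# Build the mount command as a string so the variable values are baked in,
# then convert it to a script block (reused later for the logon job)
$mountString = "net use $driveLetter \\$storageAccount.file.core.windows.net\$shareName /persistent:yes"
$mountBlock  = [ScriptBlock]::Create($mountString)

& $mountBlock
```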

Now at this point I have an X: drive mounted to my Azure File Share, but it is not persisted. To allow my drive to return after a reboot, I have to create a Scheduled Job that will run at logon for the person running this script. Notice in line 15 that I grab the Scheduled Job's path so that I can grab the underlying Scheduled Task and update the task's principal user.
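A sketch of the logon job and the principal fix-up; the job name, share, and account names are placeholders:

```powershell
# Script block that re-mounts the share (placeholder names baked in)
$mountBlock = [ScriptBlock]::Create(
    'net use X: \\myfilesharestorage.file.core.windows.net\myshare /persistent:yes')

# Register the job to fire at logon
Register-ScheduledJob -Name 'MountAzureShare' -ScriptBlock $mountBlock `
    -Trigger (New-JobTrigger -AtLogOn)

# Scheduled jobs surface as tasks under this path; re-target the task's
# principal so it runs only for the current user, only when logged on
$task = Get-ScheduledTask -TaskPath '\Microsoft\Windows\PowerShell\ScheduledJobs\' `
        -TaskName 'MountAzureShare'
$principal = New-ScheduledTaskPrincipal -UserId "$env:USERDOMAIN\$env:USERNAME" `
             -LogonType Interactive
Set-ScheduledTask -TaskName $task.TaskName -TaskPath $task.TaskPath -Principal $principal
```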
All in all, a very cool and quick way to add 5 TB of storage to my Surface Pro.
I could not think of many other reasons why an enterprise would want a file share in Azure (from a corporate perspective) given the availability of SharePoint Online and OneDrive for Business, but then a client asked me how they could send me their bloated SQL database... You would send your client something like this:
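For example (placeholder storage account, share, and file names; <key> stands in for the storage key you share out-of-band):

```powershell
# Map the share with the credentials provided, copy the backup up, disconnect
cmdkey /add:myfilesharestorage.file.core.windows.net /user:AZURE\myfilesharestorage /pass:<key>
net use Z: \\myfilesharestorage.file.core.windows.net\myshare
Copy-Item -Path 'D:\Backups\BloatedDB.bak' -Destination 'Z:\'
net use Z: /delete
```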

Once they upload the file, you would then regenerate the key for the storage account so the credentials you shared no longer work.
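With the AzureRM module, regenerating key1 looks something like this (placeholder names):

```powershell
# Regenerate key1 so the shared credentials stop working
New-AzureRmStorageAccountKey -ResourceGroupName 'FileShareRG' `
    -Name 'myfilesharestorage' -KeyName key1
```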
How would/are you using Azure File Service?
Here is the code in its entirety:
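A condensed sketch of the whole flow, with placeholder names throughout:

```powershell
# --- Variables (all placeholders) ---
$rg       = 'FileShareRG'
$sa       = 'myfilesharestorage'
$share    = 'myshare'
$drive    = 'X:'
$location = 'eastus'

Login-AzureRmAccount

# Resource group, geo-redundant (HDD) storage account, and file share
New-AzureRmResourceGroup -Name $rg -Location $location
New-AzureRmStorageAccount -ResourceGroupName $rg -Name $sa -Location $location `
    -SkuName Standard_GRS -Kind Storage
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $rg -Name $sa)[0].Value
$ctx = New-AzureStorageContext -StorageAccountName $sa -StorageAccountKey $key
New-AzureStorageShare -Name $share -Context $ctx

# Persist the credentials locally
$target = "$sa.file.core.windows.net"
cmdkey /add:$target /user:"AZURE\$sa" /pass:$key

# Mount via a script block built from a string, and run it now
$mountBlock = [ScriptBlock]::Create("net use $drive \\$target\$share /persistent:yes")
& $mountBlock

# Re-mount at logon, restricted to the current user
Register-ScheduledJob -Name 'MountAzureShare' -ScriptBlock $mountBlock `
    -Trigger (New-JobTrigger -AtLogOn)
$task = Get-ScheduledTask -TaskPath '\Microsoft\Windows\PowerShell\ScheduledJobs\' `
        -TaskName 'MountAzureShare'
Set-ScheduledTask -TaskName $task.TaskName -TaskPath $task.TaskPath `
    -Principal (New-ScheduledTaskPrincipal -UserId "$env:USERDOMAIN\$env:USERNAME" `
                -LogonType Interactive)
```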