Global Subscription and Directory Filter in Azure Portal

The Azure portal lets you apply a global subscription and directory filter that applies to your entire portal session. The selected directory and/or subscription persists throughout the session, and the portal displays only the resources from that subscription/directory. So, if you have access to several directories that in turn hold several subscriptions, you can use this global filter pane to select only the directory and subscription you're currently working on, saving time and improving productivity while using the portal.

The subscription and directory filter is available in the top right pane of the Azure portal as shown below.

(Screenshot: the global subscription and directory filter in the Azure portal)

Currently, there are 2 ways to filter on subscriptions.

  1. Filter the subscription after selecting a particular resource from “All Resources”
    (Screenshot: subscription filter at the resource level)
  2. Filter the subscription from the global subscription filter on the top pane.

Filtering Directories from global directory filter

We can also select a different directory from the same filter pane as shown in the below screenshot

(Screenshot: the global subscription and directory filter pane with the directory list)

As you can see in the above screenshot, we can also set directories as favorites which would allow us to switch directories even faster in the future.

Use this to save time in case you have multiple subscriptions and directories assigned to your ID but use only a few. 🙂

Hope this helps!

Posted in Azure

2 ways to retrieve content of Azure Automation Runbook within another runbook

In this post, we'll see how we can retrieve the content of an Azure Automation runbook from within another runbook and load it into memory. There are multiple reasons why you might want to do this; one is to create a package that holds all the data from an automation account, which can then be used to sync that data to other automation accounts.

There are 2 ways to do the same.

  1. Using the Get-Content cmdlet
  2. Using Export-AzureRmAutomationRunbook cmdlet

Using Get-Content Cmdlet

Consider, for example, that the current runbook is Runbook1.ps1. The standard way to use this cmdlet is to simply pass the path of the target runbook, as shown below:

Get-Content ./Runbook2.ps1

!!Caveat here!!

Now, for the above cmdlet to execute correctly, Runbook2 needs to be present in the sandbox environment at the path where the current runbook (Runbook1.ps1) is running. This is NOT done by default when a runbook starts: Azure Automation DOES NOT copy all the runbooks into the sandbox before running one. It only retrieves the ones that it sees being INVOKED from the parent runbook.

Azure Automation parses the parent runbook and looks for any “invocations” in the code. Since Get-Content ./Runbook2.ps1 doesn’t look like an invocation, Runbook2 won’t be available in the sandbox and the cmdlet will return an error as shown below.

Get-Content : Cannot find path 'C:\Temp\pdur5g3d.hvp\runbook2.ps1' because it does not exist.

How to fix this

In order to fix the above issue, we need to find some way to trick Azure Automation into thinking that the parent runbook is going to invoke the second runbook. We can do this using the below snippet.

if ($false)
{
    ./Runbook2.ps1
}

Get-Content ./Runbook2.ps1

The line of code inside the IF block looks like a script invocation, so Azure Automation will ensure that Runbook2 is present in the sandbox before running the parent runbook, and Get-Content will now retrieve the content of Runbook2.ps1 successfully. The parser is not intelligent enough to realize that this code will never execute.

Using Export-AzureRmAutomationRunbook cmdlet

Another way to retrieve the content of the runbook is to first export it to a local folder within the sandbox and then use the Get-Content cmdlet directly to read it. The cmdlet syntax looks like this:

$ScriptFolder = "C:\Scripts"

New-Item -itemtype Directory -Path $ScriptFolder -Force 

Export-AzureRmAutomationRunbook `
    -ResourceGroupName $AutomationResourceGroup `
    -AutomationAccountName $AutomationAccount `
    -Name "Runbook2" `
    -AzureRmContext $SubscriptionContext `
    -OutputFolder $ScriptFolder -Force

Get-Content -Path (Join-Path $ScriptFolder "Runbook2.ps1")

This will export the runbook to the given location, and the call to Get-Content will load its content into memory.

Which one to use when?

As you can see, both approaches end up using Get-Content. However, the first method looks a bit hackish, as the IF block will never be entered. So if you need to load the contents of multiple runbooks, I'd suggest using the second method instead, as the code is much cleaner and clearer.

Another point to keep in mind is that all AzureRmAutomation cmdlets need to authenticate to Azure RM first, which is an extra step and may not always be convenient. However, if you're going to authenticate elsewhere for other reasons anyway, you can use the Export-AzureRmAutomationRunbook cmdlet without any concerns.

So finally, it's up to you to decide which approach to use, depending on whether you're OK with making your code look a bit hackish. Personally, I prefer Export-AzureRmAutomationRunbook, as it is clearer and shows the person reading the code exactly what we intend to do.

Hope this helps!

Posted in PowerShell

[PowerShell Tip] – Assign array elements to separate variables in single line of code


In this post, we'll see how we can assign array elements to separate variables in a single line of code.

Consider a sample array

$TestArray = @("FirstElement", "SecondElement")

Now, to get the values of the elements into separate variables, we just declare the variables separated by commas and assign the array to them, as shown below.

$FirstValue, $SecondValue = $TestArray

When you print the $FirstValue and $SecondValue variables, you will see that they hold the values of the corresponding array element.
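One behavior worth knowing (a quick sketch, easily verified in a console): if the array has more elements than there are variables, the last variable collects all the remaining elements as an array.

```powershell
$Numbers = @(1, 2, 3, 4)

# $Head gets the first element; $Tail collects the rest as an array.
$Head, $Tail = $Numbers

$Head        # 1
$Tail.Count  # 3 -> $Tail holds @(2, 3, 4)
```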

One use case for this is when we have a string that we need to split, and we need to use all the parts of the string.

Let’s say for example we have a string as given below

Amogh Natu;28;M;Hyderabad;India

We could read the whole string, split it on the ";" character, and assign the resultant array to 5 variables, namely FullName, Age, Sex, City, and Country, to be used later in the code as needed.
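Putting this together, the split-and-assign for the sample string looks like this (the variable names are just illustrative):

```powershell
$Record = "Amogh Natu;28;M;Hyderabad;India"

# Split on ";" and assign each part to its own variable in one line.
$FullName, $Age, $Sex, $City, $Country = $Record -split ";"

Write-Output "$FullName lives in $City, $Country"
# Amogh Natu lives in Hyderabad, India
```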

One point to remember: this is feasible and advisable only when we know the array will have a limited number of elements. It doesn't make sense to use this technique when the array has many elements or its size is dynamic.

Hope this helps!

Posted in PowerShell

[PowerShell Tip] – Prevent cmdlet from printing anything to output


In this short post, we'll see how we can prevent a PowerShell cmdlet from printing anything to the standard output stream. There are two ways to do this.

  1. Piping the output of the cmdlet to Out-Null
    e.g. Set-AzureRmContext -SubscriptionId "SubId" | Out-Null
  2. Assigning the output of the cmdlet to $null
    e.g. $null = Set-AzureRmContext -SubscriptionId "SubId"

Either of these would prevent the output being printed to the output stream.
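For illustration, here is a sketch using New-Item as a stand-in for any chatty cmdlet (it normally writes the created item to the output stream):

```powershell
$Dir = Join-Path ([System.IO.Path]::GetTempPath()) "DemoDir"

# New-Item normally prints the created DirectoryInfo object.
# Both lines below create the directory but print nothing:
New-Item -ItemType Directory -Path $Dir -Force | Out-Null
$null = New-Item -ItemType Directory -Path $Dir -Force
```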

P.S. I learned this today and would love to know if there are more ways to achieve the same. 🙂

Hope this helps!

Posted in PowerShell

Analyzing PowerShell scripts with PSScriptAnalyzer


This post will show you how to use PSScriptAnalyzer to check whether your PowerShell scripts or functions conform to industry best practices.

PSScriptAnalyzer (PSSA going forward) is a static code analyzer that checks your PowerShell scripts, modules, and functions and gives a detailed report on any rules they do not conform to.

PSSA is a tool developed by Microsoft that can be downloaded and installed from the PowerShell gallery using the

Install-Module -Name PSScriptAnalyzer

cmdlet. As of this writing, there are 51 rules, created as per the best practices followed in the industry for PowerShell scripts. You can view these rules using the Get-ScriptAnalyzerRule cmdlet as shown below; Out-GridView helps display them more clearly. Note that this cmdlet is available only after you install the PSScriptAnalyzer module.

Get-ScriptAnalyzerRule | Out-GridView

You will see the output as shown below.


All these rules will be validated against the file(s) that you want to analyze.

To analyze a file or set of files, you can use the

Invoke-ScriptAnalyzer -Path [<Path(s)_to_Script>] | Out-GridView

cmdlet. I prefer to use Out-GridView just to get the output in a clearer format; you can choose to include or omit it. The output for a sample script is shown below.


As you can see, the analyzer gives out a clear report about all the violations currently present in my script with a detailed message about the issue and what I can do to resolve it. The report also shows the severity of the violation and line number in the script.

We can analyze multiple scripts at the same time by passing a folder path to the Invoke-ScriptAnalyzer cmdlet instead of the path of a single script.

Invoke-ScriptAnalyzer -Path "D:\" -Recurse

The -Recurse flag instructs the cmdlet to check and analyze scripts in sub-folders as well. You can see the complete output of all the scripts as shown below.


You can also check your scripts for a particular rule only by using the -IncludeRule parameter in the Invoke-ScriptAnalyzer cmdlet. Or you can exclude certain rules by using the -ExcludeRule parameter and passing the set of rule names to be excluded.

Invoke-ScriptAnalyzer -Path "D:\SampleScript.ps1" -IncludeRule "PSAvoidUsingWriteHost"

This would cause the script to be checked for only the PSAvoidUsingWriteHost rule. The -IncludeRule parameter accepts a string array so you can pass multiple rules to include. Separate multiple rules by a comma.

Similarly, you can pass one or more rules to exclude using the -ExcludeRule parameter which would cause the Invoke-ScriptAnalyzer cmdlet to ignore those rules.
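For example, excluding multiple rules looks like this (PSAvoidUsingCmdletAliases is another of the built-in rules; the script path is illustrative):

```powershell
Invoke-ScriptAnalyzer -Path "D:\SampleScript.ps1" `
    -ExcludeRule "PSAvoidUsingWriteHost", "PSAvoidUsingCmdletAliases"
```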

You can also create your own custom rules module (a .psm1 file) and tell the Invoke-ScriptAnalyzer cmdlet to use it.

You need to use the -CustomizedRulePath parameter, which accepts a string array as its value, so you can pass one or more custom rule files. The structure of a custom rule file and how to create one is out of scope for this post; you can refer to this for details.

There are some more parameters to the Invoke-ScriptAnalyzer cmdlet like -Severity which lets us specify which severity specific rules to validate and -LoggerPath which can be used to specify paths to custom logger assemblies.

Hope this helps!

Posted in PowerShell

[Powershell-Basics] – Looping through hash table


This post will show how to loop through a hash table in PowerShell.

Let’s say for example we have a hashTable object in PowerShell as shown below:

$SampleTable = @{}   # Syntax for creating hashtable --> @{}
$SampleTable."Name" = "Amogh"
$SampleTable."City" = "Hyderabad"
$SampleTable."Country" = "India"

For looping through the hashtable, we can use the GetEnumerator() method of the Hashtable class. This returns an enumerator that can be used in a foreach loop, and we can then use the Key and Value properties of each hash table entry.

Foreach ($Entry in $SampleTable.GetEnumerator())
{
    Write-Output "$($Entry.Key) ------ $($Entry.Value)"
}

The output of the above snippet will be as follows:

Name ------ Amogh
City ------ Hyderabad
Country ------ India
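An equivalent pattern iterates over the Keys collection and indexes into the table:

```powershell
$SampleTable = @{}
$SampleTable."Name" = "Amogh"
$SampleTable."City" = "Hyderabad"
$SampleTable."Country" = "India"

# Loop over the keys and look each value up by key.
foreach ($Key in $SampleTable.Keys) {
    Write-Output "$Key ------ $($SampleTable[$Key])"
}
```

Note that a plain @{} hashtable does not guarantee enumeration order; if order matters, create the table with [ordered]@{} instead.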

Hope this helps!

Posted in PowerShell

Save with Encoding in Visual Studio


This post shows how you can save files while retaining their encoding and line-ending format. For example, if you're writing a shell script that will run on a UNIX operating system, the script is supposed to have (LF) line endings, while Windows line endings usually follow the (CR)(LF) convention.

So to save a file with its encoding, you need to click on Save As… and then click on the arrow beside the save button as shown below and click on Save with Encoding.

After selecting the file location, VS will show the encoding window where we can select the required encoding. Select the required encoding and click on save. This will save the file with the selected encoding.


Other editors like Notepad++ and Sublime Text also provide similar options for setting line endings. Personally, I prefer Notepad++. However, the best way to prevent line endings from being changed by mistake when moving a script from a Windows machine to a Unix-like system is to write the script in the vi editor on the Unix system itself, which avoids the /bin/bash^M: bad interpreter: error message.

Hope this helps!

Posted in General

Shell script with 10+ parameters? Remember this….


This post is mainly aimed at shell script newbies like myself, with the goal that they don't end up wasting time on this as I did.

So, if you are creating a new shell script that takes 10 or more parameters, remember that you can't access the 10th and later parameters with just $10 or $11, etc.

If you simply put something like the below,

echo $10
echo $11

the output would actually be as follows (assuming the first parameter's value is "First"):

First0
First1

and NOT the actual values of the tenth and eleventh parameters.

This is because the bash interpreter first sees $1 (in the “$10”) and replaces its value immediately.

To get the value of the 10th parameter and beyond, you need to wrap the number in curly {} braces as shown below:

echo ${10}    # WORKS!
echo ${11}

This would print the actual values of the parameters.

Hope this helps!

Posted in General

Get output of script executed as part of Set-AzureRmVMCustomScriptExecution cmdlet


This post explains how to retrieve the output of the script that is executed as part of Set-AzureRmVMCustomScriptExecution cmdlet.

The cmdlet adds a custom script virtual machine extension to a virtual machine, which lets users run custom scripts on it. To retrieve the output, if any, of the script that was executed, we can use the Get-AzureRmVMDiagnosticsExtension cmdlet with the -Name and -Status parameters. Name holds the name of the custom script VM agent that was installed previously. It is important to use the -Status switch here; otherwise we won't be able to retrieve the output.

You can see that the SubStatuses property of the output of this cmdlet is of importance here and that this holds the output of the script that got executed for the above Set-AzureRmVMCustomScriptExecution cmdlet.

This property is an array which has 2 elements. First holds the output from StdOut stream and second from the StdErr stream. In this case, we’re interested in the StdOut stream.

As you can see, the “Message” property holds the output of the script that got executed on the VM as part of the CustomScript Execution cmdlet. In this case, it is just sample output.

If you’re looking for any errors, you can refer to the StdErr stream using SubStatuses[1] array element.

A Catch here! An important one too!

If you're running this code as part of an Azure Automation runbook of type "Workflow", remember that the cmdlet output is not an object; it's just the names of the types. So SubStatuses[0] will only return the name of the TYPE of the property and NOT the actual object. This is how workflows are designed. If you still want to retrieve the value, you need to execute this within an InlineScript block as shown below.
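A minimal sketch of this pattern (the resource group, VM, and extension names are hypothetical placeholders, and an authenticated AzureRM context is assumed):

```powershell
workflow Get-CustomScriptOutput
{
    # Inside InlineScript, cmdlet output is a real object again,
    # so we can safely index into SubStatuses.
    $Result = InlineScript {
        $Extension = Get-AzureRmVMDiagnosticsExtension `
            -ResourceGroupName "MyResourceGroup" `
            -VMName "MyVM" `
            -Name "CustomScriptExtension" `
            -Status

        $Extension.SubStatuses[0].Message   # StdOut of the custom script
    }

    Write-Output $Result
}
```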

After the InlineScript block completes execution, the variable $Result holds the value returned from it; in this case, the output of the script executed as part of the custom script execution cmdlet.

Hope this helps!

Posted in .NET, PowerShell

Get-AutomationVariable Vs Get-AzureRmAutomationVariable


I finally started working with PowerShell, so I thought documenting my learnings here would be useful. In this post, we will see the difference between the two cmdlets Get-AzureRmAutomationVariable and Get-AutomationVariable.

So let’s see the outputs of the two cmdlets. First, Get-AzureRmAutomationVariable

As you can see, when we run this cmdlet directly, it asks for the resource group name and automation account name. Once these are provided, it lists all the variables present in the automation account, along with the properties of each variable.

There is also an option to get the details of a particular automation variable by adding the -Name parameter and providing the variable's name. This retrieves the details of only that variable, as shown below.

Now coming to the second cmdlet, Get-AutomationVariable. This cmdlet directly gives the value of the automation variable as shown below.

This can be used when we only need the value without any other details of the variable.
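Side by side, the two calls look like this (the names below are hypothetical; the first form works only inside a runbook, while the second needs an authenticated AzureRM session):

```powershell
# Inside a runbook: returns only the value of the variable.
$Value = Get-AutomationVariable -Name "MyVariable"

# From anywhere with an AzureRM context: returns the full variable object
# (Name, Value, Encrypted, CreationTime, ...).
$Variable = Get-AzureRmAutomationVariable `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -Name "MyVariable"
$Variable.Value
```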

Hope this helps!

Posted in PowerShell